Sample records for system models-3 version

  1. Integrated Farm System Model Version 4.3 and Dairy Gas Emissions Model Version 3.3 Software development and distribution

    USDA-ARS's Scientific Manuscript database

    Modeling routines of the Integrated Farm System Model (IFSM version 4.2) and Dairy Gas Emission Model (DairyGEM version 3.2), two whole-farm simulation models developed and maintained by USDA-ARS, were revised with new components for: (1) simulation of ammonia (NH3) and greenhouse gas emissions gene...

  2. Integrated Farm System Model Version 4.1 and Dairy Gas Emissions Model Version 3.1 software release and distribution

    USDA-ARS's Scientific Manuscript database

    Animal facilities are significant contributors of gaseous emissions including ammonia (NH3) and nitrous oxide (N2O). Previous versions of the Integrated Farm System Model (IFSM version 4.0) and Dairy Gas Emissions Model (DairyGEM version 3.0), two whole-farm simulation models developed by USDA-ARS, ...

  3. 3MRA UNCERTAINTY AND SENSITIVITY ANALYSIS

    EPA Science Inventory

    This presentation discusses the Multimedia, Multipathway, Multireceptor Risk Assessment (3MRA) modeling system. The outline of the presentation is: modeling system overview - 3MRA versions; 3MRA version 1.0; national-scale assessment dimensionality; SuperMUSE: windows-based super...

  4. RELEASE NOTES FOR MODELS-3 VERSION 4.1 PATCH: SMOKE TOOL AND FILE CONVERTER

    EPA Science Inventory

    This software patch to the Models-3 system corrects minor errors in the Models-3 framework, provides substantial improvements in the ASCII to I/O API format conversion of the File Converter utility, and adds new functionalities to the SMOKE Tool. Version 4.1 of the Models-3 system...

  5. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3, Part 4.

    DTIC Science & Technology

    1983-09-01

    General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3), prepared by the BDM Corporation. Final technical report covering February 1981 - July 1983. The documentation describes, among other items, the t1 and t2 directions on the source patch and the method by which the electric field at a segment observation point due to a source patch j is computed.

  6. Documentation and Validation of the Goddard Earth Observing System (GEOS) Data Assimilation System, Version 4

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); daSilva, Arlindo; Dee, Dick; Bloom, Stephen; Bosilovich, Michael; Pawson, Steven; Schubert, Siegfried; Wu, Man-Li; Sienkiewicz, Meta; Stajner, Ivanka

    2005-01-01

    This document describes the structure and validation of a frozen version of the Goddard Earth Observing System Data Assimilation System (GEOS DAS): GEOS-4.0.3. Significant features of GEOS-4 include: version 3 of the Community Climate Model (CCM3) with the addition of a finite volume dynamical core; version two of the Community Land Model (CLM2); the Physical-space Statistical Analysis System (PSAS); and an interactive retrieval system (iRET) for assimilating TOVS radiance data. Upon completion of the GEOS-4 validation in December 2003, GEOS-4 became operational on 15 January 2004. Products from GEOS-4 have been used in supporting field campaigns and for reprocessing several years of data for CERES.

  7. ECONOMIC GROWTH ANALYSIS SYSTEM: USER'S GUIDE - VERSION 3.0

    EPA Science Inventory

    The two-volume report describes the development of, and provides information needed to operate, the Economic Growth Analysis System (E-GAS) Version 3.0 model. The model will be used to project emissions inventories of volatile organic compounds, oxides of nitrogen, and carbon mon...

  8. ECONOMIC GROWTH ANALYSIS SYSTEM: REFERENCE MANUAL VERSION 3.0

    EPA Science Inventory

    The two-volume report describes the development of, and provides information needed to operate, the Economic Growth Analysis System (E-GAS) Version 3.0 model. The model will be used to project emissions inventories of volatile organic compounds, oxides of nitrogen, and carbon mon...

  9. Performance of Versions 1,2 and 3 of the Goddard Earth Observing System (GEOS) Chemistry-Climate Model (CCM)

    NASA Technical Reports Server (NTRS)

    Pawson, Steven; Stolarski, Richard S.; Nielsen, J. Eric; Duncan, Bryan N.

    2008-01-01

    Version 1 of the Goddard Earth Observing System Chemistry-Climate Model (GEOS CCM) was used in the first CCMVal model evaluation and forms the basis for several studies of links between ozone and the circulation. That version of the CCM was based on the GEOS-4 GCM. Versions 2 and 3 of the GEOS CCM are based on the GEOS-5 GCM, which retains the "Lin-Rood" dynamical core but has a totally different set of physical parameterizations from GEOS-4. In Version 2 of the GEOS CCM the Goddard stratospheric chemistry module is retained. Differences between Versions 1 and 2 thus reflect the physics changes of the underlying GCMs. Several comparisons between these two models are made, a number of which reveal improvements in Version 2 (including a more realistic representation of the interannual variability of the Antarctic vortex). In Version 3 of the GEOS CCM, the stratospheric chemistry mechanism is replaced by the "GMI COMBO" code that includes tropospheric chemistry and different computational approaches. An advantage of this model version is the reduction of high ozone biases that prevail at low chlorine loadings in Versions 1 and 2. This poster will compare and contrast various aspects of the three model versions that are relevant for understanding interactions between ozone and climate.

  10. Building an Evaluation Framework for the VIC Model in the NLDAS Testbed

    NASA Astrophysics Data System (ADS)

    Xia, Y.; Mocko, D. M.; Wang, S.; Pan, M.; Kumar, S.; Peters-Lidard, C. D.; Wei, H.; Ek, M. B.

    2017-12-01

    Since the second phase of the North American Land Data Assimilation System (NLDAS-2) was operationally implemented at NCEP in August 2014, developing the third phase of the NLDAS system (NLDAS-3) has been a key task for the NCEP and NASA NLDAS team. The Variable Infiltration Capacity (VIC) model is one major component of the NLDAS system. The current operational NLDAS-2 uses version 4.0.3 (VIC403), research NLDAS-2 uses version 4.0.5 (VIC405), and LIS-based (Land Information System) NLDAS uses version 4.1.2 (VIC412). The purpose of this study is to comprehensively evaluate the three versions and document changes in model behavior towards VIC412 for NLDAS-3. To do that, we develop a relatively comprehensive framework including multiple variables and metrics to assess the performance of the different versions. This framework is being incorporated into the NASA Land Verification Toolkit (LVT) for evaluation of other LSMs for NLDAS-3 development. The evaluation results show that there are large and significant improvements for VIC412 in the southeastern United States when compared with VIC403 and VIC405. In the other regions, improvements are very limited and some degree of deterioration even appears. Potential reasons include: (1) few USGS streamflow observations for soil and hydrologic parameter calibration, (2) the lack of re-calibration of VIC412 in the NLDAS domain, and (3) changes in model physics from VIC403 to VIC412. Overall, the model version upgrade significantly enhances model performance and skill scores over most of the United States except for the Great Plains, suggesting that VIC model development is headed in the right direction. Further efforts are needed to improve the scientific understanding of land surface physical processes in the Great Plains, and a re-calibration of VIC412 using appropriate reference datasets is suggested.
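
    The framework above scores several VIC versions against reference data using multiple variables and metrics. As a rough, purely illustrative sketch (the metric choices, series layout, and synthetic values below are assumptions for demonstration and are not taken from the NLDAS-3 framework or the NASA LVT), such a version comparison could look like:

        # Hypothetical sketch: score several model versions against a reference series.
        # Metrics, series lengths, and values are illustrative assumptions only.
        import numpy as np

        def rmse(sim, obs):
            return float(np.sqrt(np.mean((sim - obs) ** 2)))

        def correlation(sim, obs):
            return float(np.corrcoef(sim, obs)[0, 1])

        def score_versions(reference, versions):
            """Return {version_name: {metric_name: value}} for each simulated series."""
            metrics = {"rmse": rmse, "corr": correlation}
            return {name: {m: f(sim, reference) for m, f in metrics.items()}
                    for name, sim in versions.items()}

        rng = np.random.default_rng(0)
        obs = rng.normal(size=365)                        # stand-in daily reference series
        runs = {"VIC403": obs + rng.normal(0.5, 1.0, 365),
                "VIC405": obs + rng.normal(0.3, 0.8, 365),
                "VIC412": obs + rng.normal(0.1, 0.5, 365)}
        for version, scores in score_versions(obs, runs).items():
            print(version, scores)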

  11. INM, integrated noise model, version 4.11 : user's guide, supplement

    DOT National Transportation Integrated Search

    1993-12-01

    The Volpe National Transportation Systems Center, in support of the Federal Aviation Administration, Office of Environment and Energy, has developed Version 4.11 of the Integrated Noise Model (INM). This User's Guide is a supplement to INM, Version 3...

  12. HYDROLOGIC EVALUATION OF LANDFILL PERFORMANCE (HELP) MODEL - USER'S GUIDE FOR VERSION 3

    EPA Science Inventory

    This report documents the solution methods and process descriptions used in the Version 3 of the HELP model. Program documentation including program options, system and operating requirements, file structures, program structure and variable descriptions are provided in a separat...

  13. Nuclear Engine System Simulation (NESS). Version 2.0: Program user's guide

    NASA Technical Reports Server (NTRS)

    Pelaccio, Dennis G.; Scheil, Christine M.; Petrosky, Lyman

    1993-01-01

    This Program User's Guide discusses the Nuclear Thermal Propulsion (NTP) engine system design features and capabilities modeled in the Nuclear Engine System Simulation (NESS): Version 2.0 program (referred to as NESS throughout the remainder of this document), as well as its operation. NESS was upgraded to include many new modeling capabilities not available in the original version delivered to NASA LeRC in December 1991. NESS's new features include the following: (1) an improved input format; (2) an advanced solid-core NERVA-type reactor system model (ENABLER 2); (3) a bleed-cycle engine system option; (4) an axial-turbopump design option; (5) an automated pump-out turbopump assembly sizing option; (6) an off-design gas generator engine cycle design option; (7) updated hydrogen properties; (8) an improved output format; and (9) personal computer operation capability. Sample design cases are presented in the user's guide that demonstrate many of the new features associated with this upgraded version of NESS, as well as design modeling features associated with the original version of NESS.

  14. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available as a Microsoft Visual Studio® 2013 solution, and Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.

  15. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    EPA Science Inventory

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM-KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used t...

  16. Hydrologic Evaluation of Landfill Performance (HELP) Model: B (set includes: A - User's Guide for Version 3 with disks; B - Engineering Documentation for Version 3)

    EPA Science Inventory

    The Hydrologic Evaluation of Landfill Performance (HELP) computer program is a quasi-two-dimensional hydrologic model of water movement across, into, through and out of landfills. The model accepts weather, soil and design data. Landfill systems including various combinations o...

  17. PRMS-IV, the precipitation-runoff modeling system, version 4

    USGS Publications Warehouse

    Markstrom, Steven L.; Regan, R. Steve; Hay, Lauren E.; Viger, Roland J.; Webb, Richard M.; Payn, Robert A.; LaFontaine, Jacob H.

    2015-01-01

    Computer models that simulate the hydrologic cycle at a watershed scale facilitate assessment of variability in climate, biota, geology, and human activities on water availability and flow. This report describes an updated version of the Precipitation-Runoff Modeling System. The Precipitation-Runoff Modeling System is a deterministic, distributed-parameter, physical-process-based modeling system developed to evaluate the response of various combinations of climate and land use on streamflow and general watershed hydrology. Several new model components were developed, and all existing components were updated, to enhance performance and supportability. This report describes the history, application, concepts, organization, and mathematical formulation of the Precipitation-Runoff Modeling System and its model components. This updated version provides improvements in (1) system flexibility for integrated science, (2) verification of conservation of water during simulation, (3) methods for spatial distribution of climate boundary conditions, and (4) methods for simulation of soil-water flow and storage.

  18. Representations of the Stratospheric Polar Vortices in Versions 1 and 2 of the Goddard Earth Observing System Chemistry-Climate Model (GEOS CCM)

    NASA Technical Reports Server (NTRS)

    Pawson, S.; Stolarski, R.S.; Nielsen, J.E.; Perlwitz, J.; Oman, L.; Waugh, D.

    2009-01-01

    This study will document the behavior of the polar vortices in two versions of the GEOS CCM. Both versions of the model include the same stratospheric chemistry; they differ in the underlying circulation model. Version 1 of the GEOS CCM is based on the Goddard Earth Observing System, Version 4, general circulation model, which includes the finite-volume (Lin-Rood) dynamical core and physical parameterizations from the Community Climate Model, Version 3. GEOS CCM Version 2 is based on the GEOS-5 GCM, which includes a different tropospheric physics package. Baseline simulations of both models, performed at two-degree spatial resolution, show some improvements in Version 2, but also some degradation. In the Antarctic, both models show an over-persistent stratospheric polar vortex with late breakdown, but the year-to-year variations that are overestimated in Version 1 are more realistic in Version 2. The implications of this for the interactions with tropospheric climate, the Southern Annular Mode, will be discussed. In the Arctic both model versions show dominant dynamically forced variability, but Version 2 has a persistent warm bias in the low stratosphere and there are seasonal differences in the simulations. These differences will be quantified in terms of climate change and ozone loss. Impacts of model resolution, using simulations at one-degree and half-degree, and changes in physical parameterizations (especially the gravity wave drag) will be discussed.

  19. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    EPA Science Inventory

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM-KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosen...

  20. MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)

    EPA Science Inventory

    A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...

  1. SYSTEM INSTALLATION AND OPERATION MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3) VERSION 3.0

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...

  2. MODELS-3 INSTALLATION PROCEDURES FOR A PC WITH AN NT OPERATING SYSTEM (MODELS-3 VERSION 4.0)

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of at...

  3. MODELS-3 INSTALLATION PROCEDURES FOR A PERSONAL COMPUTER WITH A NT OPERATING SYSTEM (MODELS-3 VERSION 4.1)

    EPA Science Inventory

    Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...

  4. Climate Sensitivity of the Community Climate System Model, Version 4

    DOE PAGES

    Bitz, Cecilia M.; Shell, K. M.; Gent, P. R.; ...

    2012-05-01

    Equilibrium climate sensitivity of the Community Climate System Model Version 4 (CCSM4) is 3.20°C for 1° horizontal resolution in each component. This is about a half degree Celsius higher than in the previous version (CCSM3). The transient climate sensitivity of CCSM4 at 1° resolution is 1.72°C, which is about 0.2°C higher than in CCSM3. These higher climate sensitivities in CCSM4 cannot be explained by the change to a preindustrial baseline climate. We use the radiative kernel technique to show that from CCSM3 to CCSM4, the global mean lapse-rate feedback declines in magnitude, and the shortwave cloud feedback increases. These two warming effects are partially canceled by cooling due to slight decreases in the global mean water-vapor feedback and longwave cloud feedback from CCSM3 to CCSM4. A new formulation of the mixed-layer, slab ocean model in CCSM4 attempts to reproduce the SST and sea ice climatology from an integration with a full-depth ocean, and it is integrated with a dynamic sea ice model. These new features allow an isolation of the influence of ocean dynamical changes on the climate response when comparing integrations with the slab ocean and full-depth ocean. The transient climate response of the full-depth ocean version is 0.54 of the equilibrium climate sensitivity when estimated with the new slab ocean model version for both CCSM3 and CCSM4. We argue the ratio is the same in both versions because they have about the same zonal mean pattern of change in ocean surface heat flux, which broadly resembles the zonal mean pattern of net feedback strength.
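
    The 0.54 ratio quoted above is consistent with the two CCSM4 sensitivities given earlier in the abstract; as a simple check of the stated arithmetic:

        \frac{\mathrm{TCR}}{\mathrm{ECS}} \approx \frac{1.72\,^{\circ}\mathrm{C}}{3.20\,^{\circ}\mathrm{C}} \approx 0.54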

  5. MODELS-3 INSTALLATION PROCEDURES FOR A SUN WORKSTATION WITH A UNIX-BASED OPERATING SYSTEM (MODELS-3 VERSION 4.1)

    EPA Science Inventory

    Models-3 is a flexible system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheric...

  6. Integrated Farm System Model Version 4.2 and Dairy Gas Emissions Model Version 3.2 Software development and distribution

    USDA-ARS's Scientific Manuscript database

    Emissions of ammonia (NH3) and nitrous oxide (N2O) vary among animal facilities due to differences in housing structure and associated manure management. Bedded pack barns are structures with a roof and sidewalls resulting in a lower air velocity and evaporation potential inside the structure. But s...

  7. The Multimedia Environmental Pollutant Assessment System (MEPAS){reg_sign}: Atmospheric pathway formulations. Revision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Droppo, J.G.; Buck, J.W.

    1996-03-01

    The Multimedia Environmental Pollutant Assessment System (MEPAS) is an integrated software implementation of physics-based fate and transport models for health and environmental risk assessments of both radioactive and hazardous pollutants. This atmospheric component report is one of a series of formulation reports that document the MEPAS mathematical models. MEPAS is a multimedia model; pollutant transport is modeled within, through, and between multiple media (air, soil, groundwater, and surface water). The estimated concentrations in the various media are used to compute exposures and impacts to the environment, to maximum individuals, and to populations. The MEPAS atmospheric component for the air media documented in this report includes models for emission from a source to the air, initial plume rise and dispersion, airborne pollutant transport and dispersion, and deposition to soils and crops. The material in this report is documentation for MEPAS Versions 3.0 and 3.1 and the MEPAS version used in the Remedial Action Assessment System (RAAS) Version 1.0.
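
    The report itself contains MEPAS's plume rise, dispersion, and deposition formulations, which are not reproduced in this record. For orientation only, the textbook Gaussian-plume concentration expression that atmospheric pathway models of this kind are commonly built around is (a standard form given here as background, not necessarily MEPAS's exact formulation):

        C(x, y, z) = \frac{Q}{2\pi\, u\, \sigma_y \sigma_z}
          \exp\!\left(-\frac{y^2}{2\sigma_y^2}\right)
          \left[\exp\!\left(-\frac{(z-H)^2}{2\sigma_z^2}\right)
              + \exp\!\left(-\frac{(z+H)^2}{2\sigma_z^2}\right)\right]

    where Q is the emission rate, u the wind speed, H the effective release height, and sigma_y, sigma_z the crosswind and vertical dispersion coefficients.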

  8. USER MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3 VERSION 3.0)

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...

  9. Evaluation of snow modeling with Noah and Noah-MP land surface models in NCEP GFS/CFS system

    NASA Astrophysics Data System (ADS)

    Dong, J.; Ek, M. B.; Wei, H.; Meng, J.

    2017-12-01

    The land surface serves as the lower boundary forcing in the global forecast system (GFS) and climate forecast system (CFS), simulating interactions between the land and the atmosphere. Understanding the underlying land model physics is a key to improving weather and seasonal prediction skills. Upgrades in land model physics (e.g., the release of newer versions of a land model), different land initializations, changes in the parameterization schemes used in the land model (e.g., land physical parameterization options), and changes in how the land impact is handled (e.g., a physics ensemble approach) all make it necessary to re-run climate prediction experiments to examine their impact. The current NASA LIS (version 7) integrates NOAA operational land surface and hydrological models (NCEP's Noah, versions from 2.7.1 to 3.6 and the future Noah-MP), high-resolution satellite and observational data, and land DA tools. The newer versions of the Noah LSM used in operational models have a variety of enhancements compared to older versions, while Noah-MP allows for different physics parameterization options whose choice could have a large impact on the physical processes underlying seasonal predictions. These impacts need to be reexamined before being implemented into NCEP operational systems. A set of offline numerical experiments driven by the GFS forecast forcing has been conducted to evaluate the impact of snow modeling with daily Global Historical Climatology Network (GHCN) data.

  10. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 2 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Dräger, Andreas; Hoops, Stefan; Keating, Sarah M; Le Novère, Nicolas; Myers, Chris J; Olivier, Brett G; Sahle, Sven; Schaff, James C; Smith, Lucian P; Waltemath, Dagmar; Wilkinson, Darren J

    2018-03-09

    Computational models can help researchers to interpret data, understand biological functions, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that different software systems can exchange. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 2 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML, their encoding in XML (the eXtensible Markup Language), validation rules that determine the validity of an SBML document, and examples of models in SBML form. The design of Version 2 differs from Version 1 principally in allowing new MathML constructs, making more child elements optional, and adding identifiers to all SBML elements instead of only selected elements. Other materials and software are available from the SBML project website at http://sbml.org/.
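
    As a purely schematic illustration of the declarative, XML-encoded form described above, the following Python snippet assembles a toy SBML-like Level 3 Version 2 skeleton; the namespace URI and attribute set follow the SBML Level 3 pattern but should be checked against the specification, and a real document would normally be produced and validated with an SBML-aware library rather than by hand:

        # Toy sketch of an SBML-like Level 3 Version 2 skeleton using the standard library.
        # Namespace and required attributes are assumptions; validate real documents
        # against the SBML specification or with an SBML-aware library.
        import xml.etree.ElementTree as ET

        NS = "http://www.sbml.org/sbml/level3/version2/core"  # assumed L3V2 core namespace

        sbml = ET.Element("sbml", xmlns=NS, level="3", version="2")
        model = ET.SubElement(sbml, "model", id="toy_model")

        compartments = ET.SubElement(model, "listOfCompartments")
        ET.SubElement(compartments, "compartment", id="cell", constant="true")

        species = ET.SubElement(model, "listOfSpecies")
        for sid in ("S1", "S2"):
            ET.SubElement(species, "species", id=sid, compartment="cell",
                          hasOnlySubstanceUnits="false", boundaryCondition="false",
                          constant="false")

        print(ET.tostring(sbml, encoding="unicode"))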

  11. AgroEcoSystem-Watershed (AgES-W) model evaluation for streamflow and nitrogen/sediment dynamics on a Midwest agricultural watershed

    USDA-ARS's Scientific Manuscript database

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic/water quality simulation components under the Object Modeling System Version 3 (OMS3). The AgES-W model was previously evaluated for streamflow and recently has been enhanced with the ad...

  12. SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION

    EPA Science Inventory

    The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE, (Sparse Matrix Operator Kernel Emissio...

  13. Semantic interoperability--HL7 Version 3 compared to advanced architecture standards.

    PubMed

    Blobel, B G M E; Engel, K; Pharow, P

    2006-01-01

    To meet the challenge of high-quality and efficient care, highly specialized and distributed healthcare establishments have to communicate and co-operate in a semantically interoperable way. Information and communication technology must be open, flexible, scalable, knowledge-based and service-oriented as well as secure and safe. For enabling semantic interoperability, a unified process for defining and implementing the architecture, i.e. structure and functions of the cooperating systems' components, as well as the approach for knowledge representation, i.e. the used information and its interpretation, algorithms, etc. have to be defined in a harmonized way. Deploying the Generic Component Model, systems and their components, underlying concepts and applied constraints must be formally modeled, strictly separating platform-independent from platform-specific models. As HL7 Version 3 claims to represent the most successful standard for semantic interoperability, HL7 has been analyzed regarding the requirements for model-driven, service-oriented design of semantically interoperable information systems, thereby moving from a communication to an architecture paradigm. The approach is compared with advanced architectural approaches for information systems such as OMG's CORBA 3 or EHR systems such as GEHR/openEHR and CEN EN 13606 Electronic Health Record Communication. HL7 Version 3 is maturing towards an architectural approach for semantic interoperability. Despite current differences, there is a close collaboration between the teams involved guaranteeing a convergence between competing approaches.

  14. C-Language Integrated Production System, Version 6.0

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris

    1995-01-01

    C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify a set of actions to be performed in a given situation. Object-oriented programming: modeling of complex systems composed of modular components that can easily be reused to model other systems or to create new components. Procedural programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. The version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
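
    The rule-based paradigm described above (heuristics that fire a set of actions when the current situation matches their conditions) can be illustrated with a tiny forward-matching sketch in ordinary Python; the facts and rules are invented examples, and this is not CLIPS syntax or the CLIPS 6.0 API:

        # Toy forward-matching illustration of the rule-based idea; not CLIPS itself.
        facts = {"engine-overheating", "coolant-low"}

        rules = [
            ({"engine-overheating", "coolant-low"}, "add-coolant"),
            ({"engine-overheating"}, "stop-engine"),
        ]

        def fire_rules(facts, rules):
            """Return the action of every rule whose conditions are all satisfied."""
            return [action for conditions, action in rules if conditions <= facts]

        print(fire_rules(facts, rules))   # -> ['add-coolant', 'stop-engine']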

  15. Joint Polar Satellite System (JPSS) Micrometeoroid and Orbital Debris (MMOD) Assessment

    NASA Technical Reports Server (NTRS)

    Squire, Michael D.; Cooke, William J.; Williamsen, Joel; Kessler, Donald; Vesely, William E.; Hull, Scott H.; Schonberg, William; Peterson, Glenn E.; Jenkin, Alan B.; Cornford, Steven L.

    2015-01-01

    The Joint Polar Satellite System (JPSS) Project requested the NASA Engineering and Safety Center (NESC) conduct an independent evaluation of the Micrometeoroid and Orbital Debris (MMOD) models used in the latest JPSS MMOD risk assessment. The principal focus of the assessment was to compare Orbital Debris Engineering Model version 3 (ORDEM 3.0) with the Meteoroid and Space Debris Terrestrial Environment Reference version 2009 (MASTER-2009) and Aerospace Debris Environment Projection Tool (ADEPT) and provide recommendations to the JPSS Project regarding MMOD protection. The outcome of the NESC assessment is contained in this report.

  16. Version 3 of the SMAP Level 4 Soil Moisture Product

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf; Liu, Qing; Ardizzone, Joe; Crow, Wade; De Lannoy, Gabrielle; Kolassa, Jana; Kimball, John; Koster, Randy

    2017-01-01

    The NASA Soil Moisture Active Passive (SMAP) Level 4 Soil Moisture (L4_SM) product provides 3-hourly, 9-km resolution, global estimates of surface (0-5 cm) and root zone (0-100 cm) soil moisture as well as related land surface states and fluxes from 31 March 2015 to present with a latency of 2.5 days. The ensemble-based L4_SM algorithm is a variant of the Goddard Earth Observing System version 5 (GEOS-5) land data assimilation system and ingests SMAP L-band (1.4 GHz) Level 1 brightness temperature observations into the Catchment land surface model. The soil moisture analysis is non-local (spatially distributed), performs downscaling from the 36-km resolution of the observations to that of the model, and respects the relative uncertainties of the modeled and observed brightness temperatures. Prior to assimilation, a climatological rescaling is applied to the assimilated brightness temperatures using a 6-year record of SMOS observations. A new feature in Version 3 of the L4_SM data product is the use of 2 years of SMAP observations for rescaling where SMOS observations are not available because of radio frequency interference, which expands the impact of SMAP observations on the L4_SM estimates into large regions of northern Africa and Asia. This presentation investigates the performance and data assimilation diagnostics of the Version 3 L4_SM data product. The L4_SM soil moisture estimates meet the 0.04 m3/m3 (unbiased) RMSE requirement. We further demonstrate that there is little bias in the soil moisture analysis. Finally, we illustrate where the assimilation system overestimates or underestimates the actual errors in the system.
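
    The "unbiased" RMSE requirement cited above is conventionally evaluated after removing the mean bias between the soil moisture estimates m and the reference observations o; a commonly used definition, stated here as background rather than quoted from the L4_SM documentation, is

        \mathrm{ubRMSE} = \sqrt{\overline{\left[(m - \bar{m}) - (o - \bar{o})\right]^2}} \le 0.04\ \mathrm{m^3\,m^{-3}}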

  17. Integration of a three-dimensional process-based hydrological model into the Object Modeling System

    USDA-ARS's Scientific Manuscript database

    The integration of a spatial process model into an environmental modelling framework can enhance the model’s capabilities. We present the integration of the GEOtop model into the Object Modeling System (OMS) version 3.0 and illustrate its application in a small watershed. GEOtop is a physically base...

  18. Pumping Optimization Model for Pump and Treat Systems - 15091

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baker, S.; Ivarson, Kristine A.; Karanovic, M.

    2015-01-15

    Pump and Treat systems are being utilized to remediate contaminated groundwater in the Hanford 100 Areas adjacent to the Columbia River in Eastern Washington. Design of the systems was supported by a three-dimensional (3D) fate and transport model. This model provided sophisticated simulation capabilities but requires many hours to calculate results for each simulation considered. Many simulations are required to optimize system performance, so a two-dimensional (2D) model was created to reduce run time. The 2D model was developed as an equivalent-property version of the 3D model that derives boundary conditions and aquifer properties from the 3D model. It produces predictions that are very close to the 3D model predictions, allowing it to be used for comparative remedy analyses. Any potential system modifications identified by using the 2D version are verified for use by running the 3D model to confirm performance. The 2D model was incorporated into a comprehensive analysis system (the Pumping Optimization Model, POM) to simplify analysis of multiple simulations. It allows rapid turnaround by utilizing a graphical user interface that: (1) allows operators to create hypothetical scenarios for system operation, (2) feeds the input to the 2D fate and transport model, and (3) displays the scenario results to evaluate performance improvement. All of the above is accomplished within the user interface. Complex analyses can be completed within a few hours and multiple simulations can be compared side-by-side. The POM utilizes standard office computing equipment and established groundwater modeling software.

  19. MODEL VERSION CONTROL FOR GREAT LAKES MODELS ON UNIX SYSTEMS

    EPA Science Inventory

    Scientific results of the Lake Michigan Mass Balance Project, in which atrazine was measured and modeled, were provided. The presentation also described the model version control system, which has been used for models at Grosse Ile for approximately a decade and contains various version...

  20. AN OVERVIEW OF EPANET VERSION 3.0

    EPA Science Inventory

    EPANET is a widely used public domain software package for modeling the hydraulic and water quality behavior of water distribution systems over an extended period of time. The last major update to the code was version 2.0 released in 2000 (Rossman, 2000). Since that time there ha...

  1. Major modes of short-term climate variability in the newly developed NUIST Earth System Model (NESM)

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Wang, Bin; Xiang, Baoqiang; Li, Juan; Wu, Tianjie; Fu, Xiouhua; Wu, Liguang; Min, Jinzhong

    2015-05-01

    A coupled earth system model (ESM) has been developed at the Nanjing University of Information Science and Technology (NUIST) by using version 5.3 of the European Centre Hamburg Model (ECHAM), version 3.4 of the Nucleus for European Modelling of the Ocean (NEMO), and version 4.1 of the Los Alamos sea ice model (CICE). The model is referred to as NUIST ESM1 (NESM1). Comprehensive and quantitative metrics are used to assess the model's major modes of climate variability most relevant to subseasonal-to-interannual climate prediction. The model's assessment is placed in a multi-model framework. The model yields a realistic annual mean and annual cycle of equatorial SST, and a reasonably realistic precipitation climatology, but has difficulty in capturing the spring-fall asymmetry and monsoon precipitation domains. The ENSO mode is reproduced well with respect to its spatial structure, power spectrum, phase locking to the annual cycle, and spatial structures of the central Pacific (CP)-ENSO and eastern Pacific (EP)-ENSO; however, the equatorial SST variability, biennial component of ENSO, and the amplitude of CP-ENSO are overestimated. The model captures realistic intraseasonal variability patterns, the vertical-zonal structures of the first two leading predictable modes of Madden-Julian Oscillation (MJO), and its eastward propagation; but the simulated MJO speed is significantly slower than observed. Compared with the T42 version, the high resolution version (T159) demonstrates improved simulation with respect to the climatology, interannual variance, monsoon-ENSO lead-lag correlation, spatial structures of the leading mode of the Asian-Australian monsoon rainfall variability, and the eastward propagation of the MJO.

  2. Threat Assessment & Remediation Analysis (TARA): Methodology Description Version 1.0

    DTIC Science & Technology

    2011-10-01

    collectively support this practice. Table of contents excerpts: 1 Introduction; 1.3.2.3 Common Vulnerability Scoring System (CVSS); 1.3.2.4 Microsoft Threat Modeling; 2.1.1.3 Eliminate Implausible TTPs; 2.1.1.4 Apply Scoring Model.

  3. BehavePlus fire modeling system, version 5.0: Variables

    Treesearch

    Patricia L. Andrews

    2009-01-01

    This publication has been revised to reflect updates to version 4.0 of the BehavePlus software. It was originally published as the BehavePlus fire modeling system, version 4.0: Variables in July 2008. The BehavePlus fire modeling system is a computer program based on mathematical models that describe wildland fire behavior and effects and the...

  4. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 2: User's manual (version 3.0)

    NASA Technical Reports Server (NTRS)

    Sidwell, Kenneth W.; Baruah, Pranab K.; Bussoletti, John E.; Medan, Richard T.; Conner, R. S.; Purdon, David J.

    1990-01-01

    A comprehensive description of user problem definition for the PAN AIR (Panel Aerodynamics) system is given. PAN AIR solves the 3-D linear integral equations of subsonic and supersonic flow. Influence coefficient methods are used which employ source and doublet panels as boundary surfaces. Both analysis and design boundary conditions can be used. This User's Manual describes the information needed to use the PAN AIR system. The structure and organization of PAN AIR are described, including the job control and module execution control languages for execution of the program system. The engineering input data are described, including the mathematical and physical modeling requirements. This Version 3.0 manual strictly applies only to PAN AIR Version 3.0. The major revisions include: (1) inputs and guidelines for the new FDP module (which calculates streamlines and offbody points); (2) nine new class 1 and class 2 boundary conditions to cover commonly used modeling practices, in particular the vorticity-matching Kutta condition; (3) use of the CRAY Solid-State Storage Device (SSD); and (4) incorporation of errata and typos together with additional explanation and guidelines.
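
    For context, the "3-D linear integral equations of subsonic and supersonic flow" solved by PAN AIR are boundary-integral forms of the linearized potential (Prandtl-Glauert) equation, written here in its standard form as background rather than quoted from the manual:

        (1 - M_\infty^2)\,\frac{\partial^2 \phi}{\partial x^2}
          + \frac{\partial^2 \phi}{\partial y^2}
          + \frac{\partial^2 \phi}{\partial z^2} = 0

    where phi is the perturbation velocity potential and M_infty the freestream Mach number; the source and doublet strengths on the panels are the boundary unknowns.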

  5. PRZM-3, A MODEL FOR PREDICTING PESTICIDE AND NITROGEN FATE IN THE CROP ROOT AND UNSATURATED SOIL ZONES: USER'S MANUAL FOR RELEASE 3.12.2

    EPA Science Inventory

    This publication contains documentation for the PRZM-3 model. PRZM-3 is the most recent version of a modeling system that links two subordinate models, PRZM and VADOFT, in order to predict pesticide transport and transformation down through the crop root and unsaturated soil zone...

  6. Monitoring the performance of the next Climate Forecast System version 3, throughout its development stage at EMC/NCEP

    NASA Astrophysics Data System (ADS)

    Peña, M.; Saha, S.; Wu, X.; Wang, J.; Tripp, P.; Moorthi, S.; Bhattacharjee, P.

    2016-12-01

    The next version of the operational Climate Forecast System (version 3, CFSv3) will be a fully coupled six-component system with diverse applications to earth system modeling, including weather and climate predictions. This system will couple the earth's atmosphere, land, ocean, sea-ice, waves and aerosols for both data assimilation and modeling. It will also use the NOAA Environmental Modeling System (NEMS) software superstructure to couple these components. The CFSv3 is part of the next Unified Global Coupled System (UGCS), which will unify the global prediction systems that are now operational at NCEP. The UGCS is being developed through the efforts of dedicated research and engineering teams and through coordination across many CPO/MAPP and NGGPS groups. During this development phase, the UGCS is being tested for seasonal purposes and undergoes frequent revisions. Each new revision is evaluated to quickly discover, isolate and solve problems that negatively impact its performance. In the UGCS-seasonal model, components (e.g., ocean, sea-ice, atmosphere, etc.) are coupled through a NEMS-based "mediator". In this numerical infrastructure, model diagnostics and forecast validation are carried out, both component by component and as a whole. The next stage, model optimization, will require enhanced performance diagnostics tools to help prioritize areas of numerical improvement. After the technical development of the UGCS-seasonal is completed, it will become the first realization of the CFSv3. All future development of this system will be carried out by the climate team at NCEP, in scientific collaboration with the groups that developed the individual components, as well as the climate community. A unique challenge in evaluating this unified weather-climate system is the large number of variables, which evolve over a wide range of temporal and spatial scales. A small set of performance measures and scorecard displays is being created, and collaboration and software contributions from research and operational centers are being incorporated. The status of the CFSv3/UGCS-seasonal development and examples of its performance and measuring tools will be presented.

  7. Potential System Integration Issues in the Joint Multi-Role (JMR) Joint Common Architecture (JCA) Demonstration System

    DTIC Science & Technology

    2015-12-01

    the MIS System/Subsystem Specification (SSS), and supplementary BAA document. On June 26, 2014, the SEI provided a draft interim report of the...findings and issues. The SEI team also received July 3, 2014, versions of the MIS Stakeholder Requirements, MIS SSS, and build plan and July 17, 2014...versions of the MIS SSS together with the MIS system model. On July 14–15, 2014, the SEI presented a summary of the issues at the two contractors

  8. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections

    PubMed Central

    Bailey, Stephanie L.; Bono, Rose S.; Nash, Denis; Kimmel, April D.

    2018-01-01

    Background: Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. Methods: We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. Results: We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Conclusions: Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited. PMID:29570737

  9. Implementing parallel spreadsheet models for health policy decisions: The impact of unintentional errors on model projections.

    PubMed

    Bailey, Stephanie L; Bono, Rose S; Nash, Denis; Kimmel, April D

    2018-01-01

    Spreadsheet software is increasingly used to implement systems science models informing health policy decisions, both in academia and in practice where technical capacity may be limited. However, spreadsheet models are prone to unintentional errors that may not always be identified using standard error-checking techniques. Our objective was to illustrate, through a methodologic case study analysis, the impact of unintentional errors on model projections by implementing parallel model versions. We leveraged a real-world need to revise an existing spreadsheet model designed to inform HIV policy. We developed three parallel versions of a previously validated spreadsheet-based model; versions differed by the spreadsheet cell-referencing approach (named single cells; column/row references; named matrices). For each version, we implemented three model revisions (re-entry into care; guideline-concordant treatment initiation; immediate treatment initiation). After standard error-checking, we identified unintentional errors by comparing model output across the three versions. Concordant model output across all versions was considered error-free. We calculated the impact of unintentional errors as the percentage difference in model projections between model versions with and without unintentional errors, using +/-5% difference to define a material error. We identified 58 original and 4,331 propagated unintentional errors across all model versions and revisions. Over 40% (24/58) of original unintentional errors occurred in the column/row reference model version; most (23/24) were due to incorrect cell references. Overall, >20% of model spreadsheet cells had material unintentional errors. When examining error impact along the HIV care continuum, the percentage difference between versions with and without unintentional errors ranged from +3% to +16% (named single cells), +26% to +76% (column/row reference), and 0% (named matrices). Standard error-checking techniques may not identify all errors in spreadsheet-based models. Comparing parallel model versions can aid in identifying unintentional errors and promoting reliable model projections, particularly when resources are limited.
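
    A minimal sketch of the comparison logic described above (percentage difference between parallel versions, with a +/-5% relative difference defining a material error) could look like the following; the projection names and numbers are invented for illustration and are not the study's data:

        # Hypothetical sketch: flag material differences between parallel model versions.
        # A +/-5% relative difference defines a "material" error, as in the abstract.
        MATERIAL_THRESHOLD = 0.05

        def compare_versions(reference_outputs, candidate_outputs):
            """Return {output_name: (relative_difference, is_material)}."""
            report = {}
            for name, ref in reference_outputs.items():
                diff = (candidate_outputs[name] - ref) / ref
                report[name] = (diff, abs(diff) > MATERIAL_THRESHOLD)
            return report

        # Invented example values for two projections along a care continuum.
        named_matrices = {"in_care": 1000.0, "on_treatment": 750.0}   # treated as error-free
        column_row_refs = {"in_care": 1260.0, "on_treatment": 760.0}  # version with an error

        for name, (diff, material) in compare_versions(named_matrices, column_row_refs).items():
            print(f"{name}: {diff:+.1%} material={material}")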

  10. Structural Validity of CLASS K-3 in Primary Grades: Testing Alternative Models

    ERIC Educational Resources Information Center

    Sandilos, Lia E.; Shervey, Sarah Wollersheim; DiPerna, James C.; Lei, Puiwa; Cheng, Weiyi

    2017-01-01

    This study examined the internal structure of the Classroom Assessment Scoring System (CLASS; K-3 version). The original CLASS K-3 model (Pianta, La Paro, & Hamre, 2008) and 5 alternative models were tested using confirmatory factor analysis with a sample of first- and second-grade classrooms (N = 141). Findings indicated that a slightly…

  11. Economic Impact Forecast System (EIFS). Version 2.0. Users Manual. Supplement II. European Economic Impact Forecast System (EEIFS), Phase 1, (FRG/EIFS Pilot Model).

    DTIC Science & Technology

    1982-05-01

    Champaign, IL: Construction Engineering Research Laboratory; available from NTIS, 1982. 71 p. (Technical report / Construction Engineering Research Laboratory). AD-A117 661, Construction Engineering Research Lab (Army), Champaign, IL. Economic Impact Forecast System (EIFS), Version 2.0, Users Manual. Construction Engineering Research Laboratory, 4A762720A896-C-004, P.O. Box 4005, Champaign, IL 61820.

  12. Development of Unsteady Aerodynamic and Aeroelastic Reduced-Order Models Using the FUN3D Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Vatsa, Veer N.; Biedron, Robert T.

    2009-01-01

    Recent significant improvements to the development of CFD-based unsteady aerodynamic reduced-order models (ROMs) are implemented into the FUN3D unstructured flow solver. These improvements include the simultaneous excitation of the structural modes of the CFD-based unsteady aerodynamic system via a single CFD solution, minimization of the error between the full CFD and the ROM unsteady aerodynamic solution, and computation of a root locus plot of the aeroelastic ROM. Results are presented for a viscous version of the two-dimensional Benchmark Active Controls Technology (BACT) model and an inviscid version of the AGARD 445.6 aeroelastic wing using the FUN3D code.

  13. What's new in the Atmospheric Model Evaluation Tool (AMET) version 1.3

    EPA Science Inventory

    A new version of the Atmospheric Model Evaluation Tool (AMET) has been released. The new version of AMET, version 1.3 (AMETv1.3), contains a number of updates and changes from the previous version of AMET (v1.2) released in 2012. First, the Perl scripts used in the previous ve...

  14. Implementation of the chemistry module MECCA (v2.5) in the modal aerosol version of the Community Atmosphere Model component (v3.6.33) of the Community Earth System Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, M. S.; Keene, W. C.; Easter, Richard C.

    2013-02-22

    A coupled atmospheric chemistry and climate system model was developed using the modal aerosol version of the National Center for Atmospheric Research Community Atmosphere Model (modal-CAM; v3.6.33) and the Max Planck Institute for Chemistry's Module Efficiently Calculating the Chemistry of the Atmosphere (MECCA; v2.5) to provide enhanced resolution of multiphase processes, particularly those involving inorganic halogens, and associated impacts on atmospheric composition and climate. Three Rosenbrock solvers (Ros-2, Ros-3, RODAS-3) were tested in conjunction with the basic load-balancing options available to modal-CAM (1) to establish an optimal configuration of the implicitly-solved multiphase chemistry module that maximizes both computational speed and repeatability of Ros-2 and RODAS-3 results versus Ros-3, and (2) to identify potential implementation strategies for future versions of this and similar coupled systems. RODAS-3 was faster than Ros-2 and Ros-3 with good reproduction of Ros-3 results, while Ros-2 was both slower and substantially less reproducible relative to Ros-3 results. Modal-CAM with MECCA chemistry was a factor of 15 slower than modal-CAM using standard chemistry. MECCA chemistry integration times demonstrated a systematic frequency distribution for all three solvers, and revealed that the change in run-time performance was due to a change in the frequency distribution of chemical integration times; the peak frequency was similar for all solvers. This suggests that efficient chemistry-focused load-balancing schemes can be developed that rely on the parameters of this frequency distribution.
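
    The load-balancing conclusion above rests on examining the frequency distribution of per-cell chemistry integration times. A toy diagnostic of that kind is sketched below; the timing values and bin count are synthetic placeholders, not output from the modal-CAM/MECCA runs:

        # Toy sketch: histogram synthetic per-cell chemistry solver times to inspect
        # the frequency distribution a load-balancing scheme could be built around.
        import numpy as np

        rng = np.random.default_rng(1)
        times = rng.lognormal(mean=-4.0, sigma=0.6, size=10_000)  # synthetic costs (s)

        counts, edges = np.histogram(times, bins=20)
        peak = int(np.argmax(counts))
        print(f"peak of the distribution: {edges[peak]:.4f}-{edges[peak + 1]:.4f} s")
        print(f"cells costlier than 3x the median: {np.mean(times > 3 * np.median(times)):.1%}")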

  15. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    NASA Technical Reports Server (NTRS)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternate definitions of system failure create complex analysis for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.

  16. MODFLOW-2000, the U.S. Geological Survey modular ground-water model : user guide to the LMT6 package, the linkage with MT3DMS for multi-species mass transport modeling

    USGS Publications Warehouse

    Zheng, Chunmiao; Hill, Mary Catherine; Hsieh, Paul A.

    2001-01-01

    MODFLOW-2000, the newest version of MODFLOW, is a computer program that numerically solves the three-dimensional ground-water flow equation for a porous medium using a finite-difference method. MT3DMS, the successor to MT3D, is a computer program for modeling multi-species solute transport in three-dimensional ground-water systems using multiple solution techniques, including the finite-difference method, the method of characteristics (MOC), and the total-variation-diminishing (TVD) method. This report documents a new version of the Link-MT3DMS Package, which enables MODFLOW-2000 to produce the information needed by MT3DMS, and also discusses new visualization software for MT3DMS. Unlike the Link-MT3D Packages that coordinated previous versions of MODFLOW and MT3D, the new Link-MT3DMS Package requires an input file that, among other things, provides enhanced support for additional MODFLOW sink/source packages and allows list-directed (free) format for the flow-transport link file produced by the flow model. The report contains four parts: (a) documentation of the Link-MT3DMS Package Version 6 for MODFLOW-2000; (b) discussion of several issues related to simulation setup and input data preparation for running MT3DMS with MODFLOW-2000; (c) description of two test example problems, with comparison to results obtained using another MODFLOW-based transport program; and (d) overview of post-simulation visualization and animation using the U.S. Geological Survey's Model Viewer.
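
    The three-dimensional ground-water flow equation that MODFLOW-2000 solves by finite differences is, in the standard form used in the MODFLOW documentation (reproduced here from the general literature rather than from this report):

        \frac{\partial}{\partial x}\!\left(K_{xx}\frac{\partial h}{\partial x}\right)
          + \frac{\partial}{\partial y}\!\left(K_{yy}\frac{\partial h}{\partial y}\right)
          + \frac{\partial}{\partial z}\!\left(K_{zz}\frac{\partial h}{\partial z}\right) + W
          = S_s\,\frac{\partial h}{\partial t}

    where h is hydraulic head, K_xx, K_yy, and K_zz are hydraulic conductivities along the coordinate axes, W is a volumetric source/sink term, and S_s is specific storage.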

  17. Ada Compiler Validation Summary Report: Certificate Number: 940305W1. 11335 TLD Systems, Ltd. TLD Comanche VAX/i960 Ada Compiler System, Version 4.1.1 VAX Cluster under VMS 5.5 = Tronix JIAWG Execution Vehicle (i960MX) under TLD Real Time Executive, Version 4.1.1

    DTIC Science & Technology

    1994-03-14

    TLD Comanche VAX/i960 Ada Compiler System, Version 4.1.1. Host Computer System: Digital Local Area Network VAX Cluster executing on (2) MicroVAX 3100 Model 90. Macro parameters listed in the validation summary include $MAX_DIGITS 15, $MAX_INT 2147483647, $MAX_INT_PLUS_1 2147483648, and $MIN_INT -2_147_483_648. Nested generics are supported and generics defined in library units are permitted; it is not possible to perform a macro instantiation for a generic...

  18. Solar Advisor Model User Guide for Version 2.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilman, P.; Blair, N.; Mehos, M.

    2008-08-01

    The Solar Advisor Model (SAM) provides a consistent framework for analyzing and comparing power system costs and performance across the range of solar technologies and markets, from photovoltaic systems for residential and commercial markets to concentrating solar power and large photovoltaic systems for utility markets. This manual describes Version 2.0 of the software, which can model photovoltaic and concentrating solar power technologies for electric applications for several markets. The current version of the Solar Advisor Model does not model solar heating and lighting technologies.

  19. A web-system of virtual morphometric globes

    NASA Astrophysics Data System (ADS)

    Florinsky, Igor; Garov, Andrei; Karachevtseva, Irina

    2017-04-01

    Virtual globes — programs implementing interactive three-dimensional (3D) models of planets — are increasingly used in geo- and planetary sciences. We develop a web-system of virtual morphometric globes. As the initial data, we used the following global digital elevation models (DEMs): (1) a DEM of the Earth extracted from SRTM30_PLUS database; (2) a DEM of Mars extracted from the Mars Orbiter Laser Altimeter (MOLA) gridded data record archive; and (3) A DEM of the Moon extracted from the Lunar Orbiter Laser Altimeter (LOLA) gridded data record archive. From these DEMs, we derived global digital models of the following 16 local, nonlocal, and combined morphometric variables: horizontal curvature, vertical curvature, mean curvature, Gaussian curvature, minimal curvature, maximal curvature, unsphericity curvature, difference curvature, vertical excess curvature, horizontal excess curvature, ring curvature, accumulation curvature, catchment area, dispersive area, topographic index, and stream power index (definitions, formulae, and interpretations can be found elsewhere [1]). To calculate local morphometric variables, we applied a finite-difference method intended for spheroidal equal angular grids [1]. Digital models of a nonlocal and combined morphometric variables were derived by a method of Martz and de Jong adapted to spheroidal equal angular grids [1]. DEM processing was performed in the software LandLord [1]. The calculated morphometric models were integrated into the testing version of the system. The following main functions are implemented in the system: (1) selection of a celestial body; (2) selection of a morphometric variable; (3) 2D visualization of a calculated global morphometric model (a map in equirectangular projection); (4) 3D visualization of a calculated global morphometric model on the sphere surface (a globe by itself); (5) change of a globe scale (zooming); and (6) globe rotation by an arbitrary angle. The testing version of the system represents morphometric models with the resolution of 15'. In the final version of the system, we plan to implement a multiscale 3D visualization for models of 17 morphometric variables with the resolution from 15' to 30". The web-system of virtual morphometric globes is designed as a separate unit of a 3D web GIS for storage, processing, and access to planetary data [2], which is currently developed as an extension of an existing 2D web GIS (http://cartsrv.mexlab.ru/geoportal). Free, real-time web access to the system of virtual globes will be provided. The testing version of the system is available at: http://cartsrv.mexlab.ru/virtualglobe. The study is supported by the Russian Foundation for Basic Research, grant 15-07-02484. References 1. Florinsky, I.V., 2016. Digital Terrain Analysis in Soil Science and Geology. 2nd ed. Academic Press, Amsterdam, 486 p. 2. Garov, A.S., Karachevtseva, I.P., Matveev, E.V., Zubarev, A.E., and Florinsky, I.V., 2016. Development of a heterogenic distributed environment for spatial data processing using cloud technologies. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 41(B4): 385-390.
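
    Of the morphometric variables listed above, the two combined indices have compact, widely used definitions in terms of the specific catchment area A_s and the local slope angle beta (the exact spheroidal-grid formulations are given in the cited reference [1]; the forms below are the standard ones):

        \mathrm{TI} = \ln\!\left(\frac{A_s}{\tan\beta}\right), \qquad
        \mathrm{SPI} = A_s\,\tan\beta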

  20. Cyber Selection Test Research Effort for U.S. Army New Accessions

    DTIC Science & Technology

    2017-10-12

    assessment game; 3. Develop an operational version of the STA game which incorporates assessments from phase 1 and (through game-play) examines 3 more STA abilities and 5 STA behaviors; 4. Validate the system thinking assessment (STA) game in an operational setting (research phases marked as completed or planned). The STA game assesses whether the candidate identifies elements of systems, models relationships, understands system dynamics, evaluates and revises the model, and applies understanding to the problem.

  1. FEAT - FAILURE ENVIRONMENT ANALYSIS TOOL (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Pack, G.

    1994-01-01

    The Failure Environment Analysis Tool, FEAT, enables people to see and better understand the effects of failures in a system. FEAT uses digraph models to determine what will happen to a system if a set of failure events occurs and to identify the possible causes of a selected set of failures. Failures can be user-selected from either engineering schematic or digraph model graphics, and the effects or potential causes of the failures will be color highlighted on the same schematic or model graphic. As a design tool, FEAT helps design reviewers understand exactly what redundancies have been built into a system and where weaknesses need to be protected or designed out. A properly developed digraph will reflect how a system functionally degrades as failures accumulate. FEAT is also useful in operations, where it can help identify causes of failures after they occur. Finally, FEAT is valuable both in conceptual development and as a training aid, since digraphs can identify weaknesses in scenarios as well as hardware. Digraph models for use with FEAT are generally built with the Digraph Editor, a Macintosh-based application which is distributed with FEAT. The Digraph Editor was developed specifically with the needs of FEAT users in mind and offers several time-saving features. It includes an icon toolbox of components required in a digraph model and a menu of functions for manipulating these components. It also offers FEAT users a convenient way to attach a formatted textual description to each digraph node. FEAT needs these node descriptions in order to recognize nodes and propagate failures within the digraph. FEAT users store their node descriptions in modeling tables using any word processing or spreadsheet package capable of saving data to an ASCII text file. From within the Digraph Editor they can then interactively attach a properly formatted textual description to each node in a digraph. Once descriptions are attached to them, a selected set of nodes can be saved as a library file which represents a generic digraph structure for a class of components. The Generate Model feature can then use library files to generate digraphs for every component listed in the modeling tables, and these individual digraph files can be used in a variety of ways to speed generation of complete digraph models. FEAT contains a preprocessor which performs transitive closure on the digraph. This multi-step algorithm builds a series of phantom bridges, or gates, that allow accurate bi-directional processing of digraphs. This preprocessing can be time-consuming, but once preprocessing is complete, queries can be answered and displayed within seconds. A UNIX X-Windows port of version 3.5 of FEAT, XFEAT, is also available to speed the processing of digraph models created on the Macintosh. FEAT v3.6, which is only available for the Macintosh, has some report generation capabilities which are not available in XFEAT. For very large integrated systems, FEAT can be a real cost saver in terms of design evaluation, training, and knowledge capture. The capability of loading multiple digraphs and schematics into FEAT allows modelers to build smaller, more focused digraphs. Typically, each digraph file will represent only a portion of a larger failure scenario. FEAT will combine these files and digraphs from other modelers to form a continuous mathematical model of the system's failure logic.
Since multiple digraphs can be cumbersome to use, FEAT ties propagation results to schematic drawings produced using MacDraw II (v1.1v2 or later) or MacDraw Pro. This makes it easier to identify single and double point failures that may have to cross several system boundaries and multiple engineering disciplines before creating a hazardous condition. FEAT v3.6 for the Macintosh is written in C-language using Macintosh Programmer's Workshop C v3.2. It requires at least a Mac II series computer running System 7 or System 6.0.8 and 32 Bit QuickDraw. It also requires a math coprocessor or coprocessor emulator and a color monitor (or one with 256 gray scale capability). A minimum of 4Mb of free RAM is highly recommended. The UNIX version of FEAT includes both FEAT v3.6 for the Macintosh and XFEAT. XFEAT is written in C-language for Sun series workstations running SunOS, SGI workstations running IRIX, DECstations running ULTRIX, and Intergraph workstations running CLIX version 6. It requires the MIT X Window System, Version 11 Revision 4, with OSF/Motif 1.1.3, and 16Mb of RAM. The standard distribution medium for FEAT 3.6 (Macintosh version) is a set of three 3.5 inch Macintosh format diskettes. The standard distribution package for the UNIX version includes the three FEAT 3.6 Macintosh diskettes plus a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format which contains XFEAT. Alternate distribution media and formats for XFEAT are available upon request. FEAT has been under development since 1990. Both FEAT v3.6 for the Macintosh and XFEAT v3.5 were released in 1993.
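    The propagation queries FEAT answers ultimately rest on reachability in the digraph. The sketch below shows a minimal version of that idea for a hypothetical digraph: given a set of failure events, a breadth-first pass returns every node they can affect. It is only an illustration of reachability; it does not reproduce FEAT's phantom-bridge transitive-closure preprocessing or the gate logic a real digraph model would carry.

      # Minimal sketch of failure propagation on a digraph: from a set of failed
      # nodes, find every downstream node that can be affected. Simple reachability
      # only; redundancy (AND) gates and bi-directional queries are omitted.
      from collections import deque

      def affected_nodes(edges, failures):
          """edges: dict node -> list of downstream nodes; failures: iterable of nodes."""
          reached = set(failures)
          queue = deque(failures)
          while queue:
              node = queue.popleft()
              for nxt in edges.get(node, []):
                  if nxt not in reached:
                      reached.add(nxt)
                      queue.append(nxt)
          return reached - set(failures)

      # Hypothetical digraph fragment for a small redundant supply system
      edges = {
          "bus_A": ["pump_1", "valve_ctrl"],
          "bus_B": ["pump_2", "valve_ctrl"],
          "pump_1": ["flow_ok"],
          "pump_2": ["flow_ok"],
          "valve_ctrl": ["flow_ok"],
      }
      print(affected_nodes(edges, {"bus_A"}))   # {'pump_1', 'valve_ctrl', 'flow_ok'}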

  2. Response to Nuclear Regulatory Commission's ten questions pertaining to site-specific models for use in the license termination rule: Multimedia Environmental Pollutant Assessment System (MEPAS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buck, J.W.; Whelan, G.; Strenge, D.L.

    This paper is in response to the ten questions posed by the US Nuclear Regulatory Commission (NRC) at the Modeling Workshop held November 13 and 14, 1997. The ten questions were developed in advance of the workshop to allow model developers to prepare a presentation at the Workshop. This paper is an expanded version of the Multimedia Environmental Pollutant Assessment System (MEPAS) presentation given at the Modeling Workshop by Pacific Northwest National Laboratory (PNNL) staff. The paper is organized by the ten questions asked by the NRC, with each section devoted to a single question. The current version of the methodology is MEPAS 3.2 (NRC 1997), and the discussion in this paper pertains to that version. In some cases, MEPAS 4.0, which is currently being developed under the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) (Whelan et al. 1997), will be referenced to inform the reader of potential capabilities in the near future. A separate paper is included in the document that discusses the FRAMES concept.

  3. An event-version-based spatio-temporal modeling approach and its application in the cadastral management

    NASA Astrophysics Data System (ADS)

    Li, Yangdong; Han, Zhen; Liao, Zhongping

    2009-10-01

    Spatiality, temporality, legality, accuracy, and continuity are characteristic of cadastral information, and cadastral management demands that cadastral data be accurate, integrated, and updated in a timely manner. An effective GIS-based management system is therefore well suited to handling cadastral data, which are characterized by spatiality and temporality. Because no sound spatio-temporal data model has been adopted, however, the spatio-temporal characteristics of cadastral data are not well expressed in existing cadastral management systems. This paper first proposes an event-version-based spatio-temporal modeling approach, built around the notions of event and version. An event-version-based spatio-temporal cadastral data model is then constructed with this approach to represent spatio-temporal cadastral data. Finally, the model is used in the design and implementation of a spatio-temporal cadastral management system. The application of the system shows that the event-version-based spatio-temporal data model is well suited to the representation and organization of cadastral data.
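    To make the event-version idea concrete, the sketch below represents a cadastral parcel as a chain of versions, each produced by an event (registration, subdivision, transfer) and carrying its valid-time interval. The field names are illustrative assumptions, not the schema used by the authors.

      # Minimal sketch of an event-version record structure for cadastral data.
      # All names and fields are illustrative only.
      from dataclasses import dataclass
      from datetime import date
      from typing import List, Optional

      @dataclass
      class Event:
          event_id: str
          kind: str          # e.g. "registration", "subdivision", "transfer"
          occurred: date

      @dataclass
      class ParcelVersion:
          parcel_id: str
          version: int
          geometry_wkt: str            # spatial state of this version
          owner: str                   # legal state of this version
          valid_from: date
          valid_to: Optional[date]     # None marks the current version
          created_by: Event            # the event that produced this version

      def current_version(history: List[ParcelVersion]) -> ParcelVersion:
          """Return the open (current) version in a parcel's history."""
          return next(v for v in history if v.valid_to is None)

      history = [
          ParcelVersion("P-42", 1, "POLYGON((0 0, 10 0, 10 8, 0 8, 0 0))", "Li",
                        date(2001, 1, 1), date(2008, 3, 1),
                        Event("E-01", "registration", date(2001, 1, 1))),
          ParcelVersion("P-42", 2, "POLYGON((0 0, 10 0, 10 8, 0 8, 0 0))", "Han",
                        date(2008, 3, 1), None,
                        Event("E-17", "transfer", date(2008, 3, 1))),
      ]
      print(current_version(history).owner)   # Han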

  4. The Optimal Convergence Rate of the p-Version of the Finite Element Method.

    DTIC Science & Technology

    1985-10-01

    commercial and research codes. The p-version and h-p versions are new developments. There is only one commercial code, the system PROBE (Noetic Tech, St. Louis). The theoretical aspects have been studied only recently. The first theoretical paper appeared in 1981 (see [7]). See also [1], [7], [8], [9]... The model problem (2.2), (2.3) is a classical case of the elliptic equation problem on a nonsmooth domain. The structure of this problem is well studied

  5. NETPATH-WIN: an interactive user version of the mass-balance model, NETPATH

    USGS Publications Warehouse

    El-Kadi, A. I.; Plummer, Niel; Aggarwal, P.

    2011-01-01

    NETPATH-WIN is an interactive user version of NETPATH, an inverse geochemical modeling code used to find mass-balance reaction models that are consistent with the observed chemical and isotopic composition of waters from aquatic systems. NETPATH-WIN was constructed to migrate NETPATH applications into the Microsoft WINDOWS® environment. The new version facilitates model utilization by eliminating difficulties in data preparation and results analysis of the DOS version of NETPATH, while preserving all of the capabilities of the original version. Through example applications, the note describes some of the features of NETPATH-WIN as applied to adjustment of radiocarbon data for geochemical reactions in groundwater systems.
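    The mass-balance idea behind NETPATH can be illustrated with a small linear system: given the change in dissolved element concentrations along a flow path and candidate phases with known stoichiometries, solve for the phase mole transfers that account for that change. The sketch below is an illustrative calculation with invented phases and numbers; it is not NETPATH's algorithm, and real applications must also honor isotopic and redox constraints.

      # Minimal sketch of an inverse mass-balance calculation: find mole transfers
      # of candidate phases that explain the change in water chemistry between two
      # wells. Phases, stoichiometries, and concentrations are hypothetical.
      import numpy as np

      # Rows: elements (Ca, Mg, C); columns: phases (calcite, dolomite, CO2 gas).
      # Entry = mmol of element released per mmol of phase dissolved.
      stoich = np.array([
          [1.0, 1.0, 0.0],   # Ca
          [0.0, 1.0, 0.0],   # Mg
          [1.0, 2.0, 1.0],   # C (dissolved inorganic carbon)
      ])

      # Observed change in concentration along the flow path (mmol per kg water)
      delta = np.array([0.9, 0.3, 1.8])

      # Solve stoich @ x = delta for the phase mole transfers x
      x, *_ = np.linalg.lstsq(stoich, delta, rcond=None)
      for phase, moles in zip(["calcite", "dolomite", "CO2(g)"], x):
          print(f"{phase:10s} {moles:+.3f} mmol/kg")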

  6. Review and verification of CARE 3 mathematical model and code

    NASA Technical Reports Server (NTRS)

    Rose, D. M.; Altschul, R. E.; Manke, J. W.; Nelson, D. L.

    1983-01-01

    The CARE III mathematical model and code verification performed by Boeing Computer Services is documented. The mathematical model was verified for permanent and intermittent faults; the transient fault model was not addressed. The code verification was performed on CARE III, Version 3. CARE III Version 4, which corrects deficiencies identified in Version 3, is being developed.

  7. User's guide to Model Viewer, a program for three-dimensional visualization of ground-water model results

    USGS Publications Warehouse

    Hsieh, Paul A.; Winston, Richard B.

    2002-01-01

    Model Viewer is a computer program that displays the results of three-dimensional groundwater models. Scalar data (such as hydraulic head or solute concentration) may be displayed as a solid or a set of isosurfaces, using a red-to-blue color spectrum to represent a range of scalar values. Vector data (such as velocity or specific discharge) are represented by lines oriented to the vector direction and scaled to the vector magnitude. Model Viewer can also display pathlines, cells or nodes that represent model features such as streams and wells, and auxiliary graphic objects such as grid lines and coordinate axes. Users may crop the model grid in different orientations to examine the interior structure of the data. For transient simulations, Model Viewer can animate the time evolution of the simulated quantities. The current version (1.0) of Model Viewer runs on Microsoft Windows 95, 98, NT and 2000 operating systems, and supports the following models: MODFLOW-2000, MODFLOW-2000 with the Ground-Water Transport Process, MODFLOW-96, MOC3D (Version 3.5), MODPATH, MT3DMS, and SUTRA (Version 2D3D.1). Model Viewer is designed to directly read input and output files from these models, thus minimizing the need for additional postprocessing. This report provides an overview of Model Viewer. Complete instructions on how to use the software are provided in the on-line help pages.

  8. Ada Compiler Validation Summary Report. Certificate Number: 900726W1. 11017, Verdix Corporation VADS IBM RISC System/6000, AIX 3.1, VAda-110-7171, Version 6.0 IBM RISC System/6000 Model 530 = IBM RISC System/6000 Model 530

    DTIC Science & Technology

    1991-01-22

    Customer Agreement Number: 90-05-29-VRX. See Section 3.1 for any additional information about the testing environment. As a result of this validation... 22 January 1991, 90-05-29-VRX, Ada Compiler Validation Summary Report, Certificate Number 900726W1.11017: Verdix Corporation VADS, IBM RISC System/6000.

  9. Development and Applications of the FV3 GEOS-5 Adjoint Modeling System

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Kim, Jong G.; Lin, Shian-Jiann; Errico, Ron; Gelaro, Ron; Kent, James; Coy, Larry; Doyle, Jim; Goldstein, Alex

    2017-01-01

    GMAO has developed a highly sophisticated adjoint modeling system based on the most recent version of the finite-volume cubed-sphere (FV3) dynamical core. This provides a mechanism for investigating sensitivity to initial conditions and examining observation impacts. It also allows for the computation of singular vectors and for the implementation of hybrid 4DVAR. In this work we will present the scientific assessment of the new adjoint system and show results from a number of research applications of the adjoint system.

  10. Overview and Evaluation of the Community Multiscale Air Quality (CMAQ) Modeling System Version 5.2

    EPA Science Inventory

    A new version of the Community Multiscale Air Quality (CMAQ) model, version 5.2 (CMAQv5.2), is currently being developed, with a planned release date in 2017. The new model includes numerous updates from the previous version of the model (CMAQv5.1). Specific updates include a new...

  11. COSMIC monthly progress report

    NASA Technical Reports Server (NTRS)

    1994-01-01

    Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of May 1994. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are summarized. Nine articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: (1) WFI - Windowing System for Test and Simulation; (2) HZETRN - A Free Space Radiation Transport and Shielding Program; (3) COMGEN-BEM - Composite Model Generation-Boundary Element Method; (4) IDDS - Interactive Data Display System; (5) CET93/PC - Chemical Equilibrium with Transport Properties, 1993; (6) SDVIC - Sub-pixel Digital Video Image Correlation; (7) TRASYS - Thermal Radiation Analyzer System (HP9000 Series 700/800 Version without NASADIG); (8) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (VAX VMS Version); and (9) NASADIG - NASA Device Independent Graphics Library, Version 6.0 (UNIX Version). Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and dissemination are also described along with a budget summary.

  12. A nested-grid limited-area model for short term weather forecasting

    NASA Technical Reports Server (NTRS)

    Wong, V. C.; Zack, J. W.; Kaplan, M. L.; Coats, G. D.

    1983-01-01

    The present investigation is concerned with a mesoscale atmospheric simulation system (MASS), incorporating the sigma-coordinate primitive equations. The present version of this model (MASS 3.0) has 14 vertical layers, with the upper boundary at 100 mb. There are 128 x 96 grid points in each layer. The earlier version of this model (MASS 2.0) has been described by Kaplan et al. (1982). The current investigation provides a summary of major revisions to that version and a description of the parameterization schemes which are presently included in the model. The planetary boundary layer (PBL) is considered, taking into account aspects of generalized similarity theory and free convection, the surface energy budget, the surface moisture budget, and prognostic equations for the depth h of the PBL. A cloud model is discussed, giving attention to stable precipitation, and cumulus convection.

  13. Blueprint for Acquisition Reform, Version 3.0

    DTIC Science & Technology

    2008-07-01

    ...represents a substantial and immediate step forward in establishing the Coast Guard as a model mid-sized federal agency for acquisition processes... Blueprint for Acquisition Reform in the U.S. Coast Guard: “The Coast Guard must become the model for mid-sized Federal agency acquisition in process...” ...acquisition (DoD 5000 model > CG Major Systems Acquisition Manual); Deepwater Program Executive Officer (PEO): system-of-systems, performance-based...

  14. Sensitivity of the Plume Rise Model in the estimation of biomass burning plume injection heights in South America

    NASA Astrophysics Data System (ADS)

    Ferrada, Gonzalo A.; Freitas, Saulo; Pereira, Gabriel; Paugam, Ronan

    2017-04-01

    This study aimed to evaluate new developments in the Plume Rise Model (PRM), embedded into the Brazilian developments on the Regional Atmospheric Modelling System (BRAMS). PRM computes biomass burning plume injection heights and returns that information to the host model, which then releases all fire emissions at this height. The new developments concern the initialization data used by the PRM, which now uses fire size and fire radiative power (FRP) from remote sensing. The main difference between the two new versions is the conversion parameter (β) used to convert FRP into the plume convective flux. In addition, a new scheme to generate daily fire emission fluxes is offered, using the fire radiative energy (computed from remote sensing) in the Brazilian Biomass Burning Emission Model (3BEM-FRE). Model results using the three versions of the PRM are compared with airborne CO and O3 observations from the SAMBBA campaign, which took place in southern Amazonia and Cerrado (savanna-like) regions in September 2012. Results show that the improvements in both the 3BEM-FRE and PRM models yield a better reproduction of the vertical and horizontal distributions of CO and O3 than the original versions, especially in the middle and upper troposphere, although with some difficulty reproducing the emissions toward the end of the campaign, probably due to the cumulus parameterization used, which overestimated precipitation in the region of study. Developments made in the 3BEM model also show better agreement with observed remote sensing data of daily fire emissions than the original version in the Amazon region, but with some difficulty in the Cerrado.
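    A minimal sketch of the conversion step described above, under an assumed (hypothetical) value of the conversion parameter β, which the study varies between PRM versions: the convective heat flux that initializes the plume-rise calculation is taken proportional to the observed FRP and spread over the observed fire size.

      # Minimal sketch: convert fire radiative power (FRP) into the convective heat
      # flux used to initialize a plume-rise calculation. The beta value below is
      # hypothetical; the PRM versions discussed above differ precisely in this choice.
      def convective_heat_flux(frp_mw, beta=10.0):
          """FRP in MW -> total convective heat flux in MW (illustrative only)."""
          return beta * frp_mw

      def flux_density(frp_mw, fire_area_m2, beta=10.0):
          """Heat flux per unit fire area (W m-2) from FRP and observed fire size."""
          return convective_heat_flux(frp_mw, beta) * 1.0e6 / fire_area_m2

      print(flux_density(frp_mw=120.0, fire_area_m2=2.5e5))   # 4800.0 W m-2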

  15. Effect of software version and parameter settings on the marginal and internal adaptation of crowns fabricated with the CAD/CAM system.

    PubMed

    Shim, Ji Suk; Lee, Jin Sook; Lee, Jeong Yol; Choi, Yeon Jo; Shin, Sang Wan; Ryu, Jae Jun

    2015-10-01

    This study investigated the marginal and internal adaptation of individual dental crowns fabricated using a CAD/CAM system (Sirona's BlueCam), also evaluating the effect of the software version used, and the specific parameter settings in the adaptation of crowns. Forty digital impressions of a master model previously prepared were acquired using an intraoral scanner and divided into four groups based on the software version and on the spacer settings used. The versions 3.8 and 4.2 of the software were used, and the spacer parameter was set at either 40 μm or 80 μm. The marginal and internal fit of the crowns were measured using the replica technique, which uses a low viscosity silicone material that simulates the thickness of the cement layer. The data were analyzed using a Friedman two-way analysis of variance (ANOVA) and paired t-tests with significance level set at p<0.05. The two-way ANOVA analysis showed the software version (p<0.05) and the spacer parameter (p<0.05) significantly affected the crown adaptation. The crowns designed with the version 4.2 of the software showed a better fit than those designed with the version 3.8, particularly in the axial wall and in the inner margin. The spacer parameter was more accurately represented in the version 4.2 of the software than in the version 3.8. In addition, the use of the version 4.2 of the software combined with the spacer parameter set at 80 μm showed the least variation. On the other hand, the outer margin was not affected by the variables. Compared to the version 3.8 of the software, the version 4.2 can be recommended for the fabrication of well-fitting crown restorations, and for the appropriate regulation of the spacer parameter.
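    For readers unfamiliar with the statistics reported above, the sketch below shows how a paired comparison of marginal-gap measurements between two software versions might be run in Python. The gap values are invented and the scipy calls are only an illustration of the kind of Friedman and paired t-test analysis described.

      # Illustrative comparison of marginal gap measurements (micrometres) for crowns
      # designed with two software versions. Data are invented; the study used a
      # Friedman two-way ANOVA and paired t-tests with significance at p < 0.05.
      from scipy import stats

      gap_v38 = [92, 88, 105, 97, 110, 101, 95, 99, 108, 94]   # version 3.8
      gap_v42 = [78, 74, 89, 80, 95, 83, 79, 84, 90, 77]       # version 4.2

      t_stat, p_value = stats.ttest_rel(gap_v38, gap_v42)
      print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")

      # The Friedman test needs three or more related samples, e.g. adding a third
      # hypothetical condition (version 4.2 with the 80 um spacer setting):
      gap_v42_80 = [70, 69, 82, 75, 88, 78, 73, 80, 85, 72]
      chi2, p_friedman = stats.friedmanchisquare(gap_v38, gap_v42, gap_v42_80)
      print(f"Friedman: chi2 = {chi2:.2f}, p = {p_friedman:.4f}")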

  16. Geometric Modelling of Tree Roots with Different Levels of Detail

    NASA Astrophysics Data System (ADS)

    Guerrero Iñiguez, J. I.

    2017-09-01

    This paper presents a geometric approach for modelling tree roots with different Levels of Detail, suitable for analysis of tree anchoring, potentially occupied underground space, interaction with urban elements, and damage produced and sustained in the built environment. Three types of tree roots are considered to cover several species: tap root, heart-shaped root, and lateral roots. Shrubs and smaller plants are not considered; however, a similar approach can be applied if the information is available for individual species. The geometrical approach addresses the difficulties of modelling the actual roots, which are dynamic and almost opaque to direct observation, by proposing generalized versions. For each type of root, different geometric models are considered to capture the overall shape of the root, a simplified block model, and a planar or surface-projected version. Lower-detail versions are intended as compatibility versions for 2D systems, while higher-detail models are suitable for 3D analysis and visualization. The proposed levels of detail are matched with CityGML Levels of Detail, enabling both analysis and aesthetic views for urban modelling.

  17. Architectural approaches for HL7-based health information systems implementation.

    PubMed

    López, D M; Blobel, B

    2010-01-01

    Information systems integration is hard, especially when semantic and business process interoperability requirements need to be met. To succeed, a unified methodology addressing the different aspects of systems architecture, such as the business, information, computational, engineering, and technology viewpoints, has to be considered. The paper contributes an analysis and demonstration of how the HL7 standard set can support health information systems integration. Based on the Health Information Systems Development Framework (HIS-DF), common architectural models for HIS integration are analyzed. The framework is a standard-based, consistent, comprehensive, customizable, scalable methodology that supports the design of semantically interoperable health information systems and components. Three main architectural models for system integration are analyzed: the point-to-point interface, the message server, and the mediator models. Point-to-point interface and message server models are completely supported by traditional HL7 version 2 and version 3 messaging. The HL7 v3 standard specification, combined with the service-oriented, model-driven approaches provided by HIS-DF, makes the mediator model possible. The different integration scenarios are illustrated by describing a proof-of-concept implementation of an integrated public health surveillance system based on Enterprise Java Beans technology. Selecting the appropriate integration architecture is a fundamental issue in any software development project. HIS-DF provides a unique methodological approach guiding the development of healthcare integration projects. The mediator model - offered by HIS-DF and supported by HL7 v3 artifacts - is the most promising one, promoting the development of open, reusable, flexible, semantically interoperable, platform-independent, service-oriented, and standard-based health information systems.

  18. Microscopic approach based on a multiscale algebraic version of the resonating group model for radiative capture reactions

    NASA Astrophysics Data System (ADS)

    Solovyev, Alexander S.; Igashov, Sergey Yu.

    2017-12-01

    A microscopic approach to the description of radiative capture reactions, based on a multiscale algebraic version of the resonating group model, is developed. The main idea of the approach is to expand the wave functions of the discrete spectrum and the continuum of a nuclear system over different bases of the algebraic version of the resonating group model. These bases differ from each other in the value of the oscillator radius, which plays the role of a scale parameter. This allows us, in a unified way, to calculate total and partial cross sections (astrophysical S factors) as well as the branching ratio for the radiative capture reaction, to describe phase shifts for the colliding nuclei in the initial channel of the reaction, and at the same time to reproduce the breakup thresholds of the final nucleus. The approach is applied to the theoretical study of the mirror 3H(α,γ)7Li and 3He(α,γ)7Be reactions, which are of great interest to nuclear astrophysics. The calculated results are compared with existing experimental data and with our previous calculations in the framework of the single-scale algebraic version of the resonating group model.
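    Schematically, the multiscale expansion described above amounts to using oscillator bases with different oscillator radii for the bound-state and continuum wave functions. The notation below is generic and only illustrates the structure; it is not the authors' exact formulation.

      \Psi_{\mathrm{bound}} = \sum_{n} C_{n}\,\varphi_{n}(r; r_{0}^{\mathrm{b}}), \qquad
      \Psi_{E\ell} = \sum_{n} C_{n\ell}(E)\,\varphi_{n\ell}(r; r_{0}^{\mathrm{c}}), \qquad
      r_{0}^{\mathrm{b}} \neq r_{0}^{\mathrm{c}},

    with the astrophysical S factor obtained from the capture cross section in the standard way, S(E) = \sigma(E)\,E\,\exp(2\pi\eta), where \eta is the Sommerfeld parameter of the colliding nuclei.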

  19. Time-Dependent Simulation of Incompressible Flow in a Turbopump Using Overset Grid Approach

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan

    2001-01-01

    This paper reports the progress being made towards complete unsteady turbopump simulation capability by using overset grid systems. A computational model of a turbo-pump impeller is used as a test case for the performance evaluation of the MPI, hybrid MPI/Open-MP, and MLP versions of the INS3D code. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Unsteady computations for a turbo-pump, which contains 114 zones with 34.3 Million grid points, are performed on Origin 2000 systems at NASA Ames Research Center. The approach taken for these simulations, and the performance of the parallel versions of the code are presented.

  20. Robust effects of cloud superparameterization on simulated daily rainfall intensity statistics across multiple versions of the Community Earth System Model

    DOE PAGES

    Kooperman, Gabriel J.; Pritchard, Michael S.; Burt, Melissa A.; ...

    2016-02-01

    This study evaluates several important statistics of daily rainfall based on frequency and amount distributions as simulated by a global climate model whose precipitation does not depend on convective parameterization—Super-Parameterized Community Atmosphere Model (SPCAM). Three superparameterized and conventional versions of CAM, coupled within the Community Earth System Model (CESM1 and CCSM4), are compared against two modern rainfall products (GPCP 1DD and TRMM 3B42) to discriminate robust effects of superparameterization that emerge across multiple versions. The geographic pattern of annual-mean rainfall is mostly insensitive to superparameterization, with only slight improvements in the double-ITCZ bias. However, unfolding intensity distributions reveal several improvements in the character of rainfall simulated by SPCAM. The rainfall rate that delivers the most accumulated rain (i.e., amount mode) is systematically too weak in all versions of CAM relative to TRMM 3B42 and does not improve with horizontal resolution. It is improved by superparameterization though, with higher modes in regions of tropical wave, Madden-Julian Oscillation, and monsoon activity. Superparameterization produces better representations of extreme rates compared to TRMM 3B42, without sensitivity to horizontal resolution seen in CAM. SPCAM produces more dry days over land and fewer over the ocean. Updates to CAM’s low cloud parameterizations have narrowed the frequency peak of light rain, converging toward SPCAM. Poleward of 50°, where more rainfall is produced by resolved-scale processes in CAM, few differences discriminate the rainfall properties of the two models. Lastly, these results are discussed in light of their implication for future rainfall changes in response to climate forcing.
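    The "amount mode" discussed above, the rain rate that delivers the most accumulated rain, can be computed directly from a daily rainfall series by binning rain rates and weighting each bin by the rain it contributes. The sketch below does this for synthetic data; the gamma-distributed values and thresholds are assumptions, not the GPCP or TRMM records.

      # Minimal sketch: rainfall frequency and amount distributions from daily data,
      # and the "amount mode" (the rain-rate bin contributing the most accumulated
      # rain). The daily rainfall values here are synthetic.
      import numpy as np

      rng = np.random.default_rng(0)
      daily_rain = rng.gamma(0.4, 12.0, size=3650)            # mm/day, synthetic
      daily_rain[rng.random(3650) < 0.55] = 0.0               # impose dry days

      bins = np.logspace(-1, 2.5, 40)                         # 0.1 to ~316 mm/day
      freq, _ = np.histogram(daily_rain, bins=bins)                         # frequency distribution
      amount, _ = np.histogram(daily_rain, bins=bins, weights=daily_rain)   # amount distribution

      imax = int(np.argmax(amount))
      amount_mode = 0.5 * (bins[imax] + bins[imax + 1])
      dry_day_fraction = float(np.mean(daily_rain < 0.1))
      print(f"amount mode ~ {amount_mode:.1f} mm/day, dry-day fraction = {dry_day_fraction:.2f}")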

  1. Solving a discrete model of the lac operon using Z3

    NASA Astrophysics Data System (ADS)

    Gutierrez, Natalia A.

    2014-05-01

    A discrete model of the lac operon is solved using the SMT solver Z3. Traditionally, the lac operon is formulated as a continuous mathematical model, a system of ordinary differential equations. Here it is treated as a discrete model based on a Boolean network. The biological problem of the lac operon is stated as a Boolean satisfiability problem and solved with the SMT solver Z3. Z3 is a powerful solver that allows the basic dynamics of the lac operon to be understood in an easier and more efficient way; in particular, the multistability of the lac operon can be computed easily with Z3. The code that solves the Boolean network can be written in the Python language or in the SMT-LIB language; both were used, in the local version of the program as well as in the online version of Z3. For future investigations it is proposed to solve the Boolean network of the lac operon using other SMT solvers such as CVC4, Alt-Ergo, MathSAT, and Yices.
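    As an illustration of the satisfiability encoding, the sketch below uses Z3's Python bindings to enumerate the steady states of a much simplified Boolean network for the lac operon. The update rules are a textbook-style reduction chosen for brevity; they are not the network solved in the paper.

      # Minimal sketch: steady states of a simplified Boolean lac operon network,
      # enumerated with the Z3 SMT solver (pip install z3-solver). The update rules
      # below are an illustrative reduction, not the model used in the paper.
      from z3 import Bools, Solver, And, Not, Or, is_true, sat

      # mRNA, repressor, CAP, internal lactose, external lactose, external glucose
      M, R, C, L, Le, Ge = Bools("M R C L Le Ge")

      s = Solver()
      # A steady state is an assignment where every variable equals its update rule.
      s.add(M == And(C, Not(R)))           # operon transcribed if CAP active, repressor inactive
      s.add(C == Not(Ge))                  # CAP/cAMP active when external glucose is absent
      s.add(R == Not(L))                   # repressor active when internal lactose is absent
      s.add(L == And(Le, Or(M, Not(Ge))))  # lactose enters given external lactose and permease or low glucose

      print("steady states (M, R, C, L, Le, Ge):")
      while s.check() == sat:
          m = s.model()
          values = [m.evaluate(v, model_completion=True) for v in (M, R, C, L, Le, Ge)]
          print(" ", [is_true(x) for x in values])
          # Block this assignment so the next check() finds a different steady state.
          s.add(Or([v != x for v, x in zip((M, R, C, L, Le, Ge), values)]))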

  2. Seasonal scale water deficit forecasting in Africa and the Middle East using NASA's Land Information System (LIS)

    NASA Astrophysics Data System (ADS)

    Shukla, Shraddhanand; Arsenault, Kristi R.; Getirana, Augusto; Kumar, Sujay V.; Roningen, Jeanne; Zaitchik, Ben; McNally, Amy; Koster, Randal D.; Peters-Lidard, Christa

    2017-04-01

    Drought and water scarcity are among the important issues facing several regions within Africa and the Middle East. A seamless and effective monitoring and early warning system is needed by regional/national stakeholders. Such a system should support a proactive drought management approach and mitigate socio-economic losses to the extent possible. In this presentation, we report on the ongoing development and validation of a seasonal scale water deficit forecasting system based on NASA's Land Information System (LIS) and seasonal climate forecasts. First, our presentation will focus on the implementation and validation of the LIS models used for drought and water availability monitoring in the region. The second part will focus on evaluating drought and water availability forecasts. Finally, details will be provided of our ongoing collaboration with end-user partners in the region (e.g., USAID's Famine Early Warning Systems Network, FEWS NET), on formulating meaningful early warning indicators, effective communication and seamless dissemination of the monitoring and forecasting products through NASA's web-services. The water deficit forecasting system thus far incorporates NOAA's Noah land surface model (LSM), version 3.3, the Variable Infiltration Capacity (VIC) model, version 4.12, NASA GMAO's Catchment LSM, and the Noah Multi-Physics (MP) LSM (the latter two incorporate prognostic water table schemes). In addition, the LSMs' surface and subsurface runoff are routed through the Hydrological Modeling and Analysis Platform (HyMAP) to simulate surface water dynamics. The LSMs are driven by NASA/GMAO's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and the USGS and UCSB Climate Hazards Group InfraRed Precipitation with Station (CHIRPS) daily rainfall dataset. The LIS software framework integrates these forcing datasets and drives the four LSMs and HyMAP. The Land Verification Toolkit (LVT) is used for the evaluation of the LSMs, as it provides model ensemble metrics and the ability to compare against a variety of remotely sensed measurements, like different evapotranspiration (ET) and soil moisture products, and other reanalysis datasets that are available for this region. Comparison of the models' energy and hydrological budgets will be shown for this region (and at the sub-basin level, e.g., Blue Nile River) and time period (1981-2015), along with evaluations of ET, streamflow, groundwater storage, and soil moisture using metrics such as anomaly correlation and RMSE. The system uses seasonal climate forecasts from NASA's GMAO (the Goddard Earth Observing System Model, version 5) and NCEP's Climate Forecast System, version 2, and it produces forecasts of soil moisture, ET, and streamflow out to 6 months in the future. Forecasts of those variables are formulated in terms of indicators to provide forecasts of drought and water availability in the region.

  3. Evaluating Surface Flux Results from CERES-FLASHFlux

    NASA Technical Reports Server (NTRS)

    Wilber, Anne C.; Stackhouse, Paul W., Jr.; Kratz, David P.; Gupta, Shashi K.; Sawaengphokhai, Parnchai K.

    2015-01-01

    The Fast Longwave and Shortwave Radiative Flux (FLASHFlux) data product was developed to provide a rapid release version of the Clouds and the Earth's Radiant Energy System (CERES) results, which could be made available to the research and applications communities within one week of the satellite observations by exchanging some accuracy for speed of processing. Unlike standard CERES products, FLASHFlux does not maintain a long-term consistent record; therefore, the latest algorithm changes and input data can be incorporated into processing. FLASHFlux released Version 3A (January 2013) and Version 3B (August 2014), which include the latest meteorological product from the Global Modeling and Assimilation Office (GMAO), GEOS FP-IT (5.9.1), the latest spectral response functions and gains for the CERES instruments, and an aerosol climatology based on the latest MATCH data. Version 3B included a slightly updated calibration and some changes to the surface albedo over snow/ice. Typically, FLASHFlux does not reprocess earlier versions when a new version is released. The combined record of Time Interpolated Space Averaged (TISA) surface flux results from Versions 3A and 3B for July 2012 to October 2015 has been compared to the ground-based measurements. The FLASHFlux results are also compared to two other CERES gridded products, SYN1deg and EBAF surface fluxes.

  4. GEOS S2S-2_1: GMAO's New High Resolution Seasonal Prediction System

    NASA Technical Reports Server (NTRS)

    Molod, Andrea; Akella, Santha; Andrews, Lauren; Barahona, Donifan; Borovikov, Anna; Chang, Yehui; Cullather, Richard; Hackert, Eric; Kovach, Robin; Koster, Randal; hide

    2017-01-01

    A new version of the modeling and analysis system used to produce sub-seasonal to seasonal forecasts has just been released by the NASA Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we describe the new system, along with the plans for the future version (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only trade-off is an increased double ITCZ, which is expected as we go to higher atmospheric resolution.

  5. User Guide for VISION 3.4.7 (Verifiable Fuel Cycle Simulation) Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacob J. Jacobson; Robert F. Jeffers; Gretchen E. Matthern

    2011-07-01

    The purpose of this document is to provide a guide for using the current version of the Verifiable Fuel Cycle Simulation (VISION) model. This is a complex model with many parameters and options; the user is strongly encouraged to read this user guide before attempting to run the model. This model is an R&D work in progress and may contain errors and omissions. It is based upon numerous assumptions. This model is intended to assist in evaluating 'what if' scenarios and in comparing fuel, reactor, and fuel processing alternatives at a systems level. The model is not intended as a tool for process flow and design modeling of specific facilities nor for tracking individual units of fuel or other material through the system. The model is intended to examine the interactions among the components of a fuel system as a function of time varying system parameters; this model represents a dynamic rather than steady-state approximation of the nuclear fuel system. VISION models the nuclear cycle at the system level, not individual facilities, e.g., 'reactor types' not individual reactors and 'separation types' not individual separation plants. Natural uranium can be enriched, which produces enriched uranium, which goes into fuel fabrication, and depleted uranium (DU), which goes into storage. Fuel is transformed (transmuted) in reactors and then goes into a storage buffer. Used fuel can be pulled from storage into either separation or disposal. If sent to separations, fuel is transformed (partitioned) into fuel products, recovered uranium, and various categories of waste. Recycled material is stored until used by its assigned reactor type. VISION is comprised of several Microsoft Excel input files, a Powersim Studio core, and several Microsoft Excel output files. All must be co-located in the same folder on a PC to function. You must use Powersim Studio 8 or better. We have tested VISION with the Studio 8 Expert, Executive, and Education versions. The Expert and Education versions work with the number of reactor types of 3 or less. For more reactor types, the Executive version is currently required. The input files are Excel 2003 format (xls). The output files are macro-enabled Excel 2007 format (xlsm). VISION 3.4 was designed with more flexibility than previous versions, which were structured for only three reactor types - LWRs that can use only uranium oxide (UOX) fuel, LWRs that can use multiple fuel types (LWR MF), and fast reactors. One could not have, for example, two types of fast reactors concurrently. The new version allows 10 reactor types and any user-defined uranium-plutonium fuel is allowed. (Thorium-based fuels can be input but several features of the model would not work.) The user identifies (by year) the primary fuel to be used for each reactor type. The user can identify for each primary fuel a contingent fuel to use if the primary fuel is not available, e.g., a reactor designated as using mixed oxide fuel (MOX) would have UOX as the contingent fuel. Another example is that a fast reactor using recycled transuranic (TRU) material can be designated as either having or not having appropriately enriched uranium oxide as a contingent fuel. Because of the need to study evolution in recycling and separation strategies, the user can now select the recycling strategy and separation technology, by year.

  6. Examples of Nonconservatism in the CARE 3 Program

    NASA Technical Reports Server (NTRS)

    Dotson, Kelly J.

    1988-01-01

    This paper presents parameter regions in the CARE 3 (Computer-Aided Reliability Estimation, version 3) computer program where the program overestimates the reliability of a modeled system without warning the user. Five simple models of fault-tolerant computer systems are analyzed, and the parameter regions where reliability is overestimated are given. The source of the error in the reliability estimates for models which incorporate transient fault occurrences was not readily apparent. However, the source of much of the error for models with permanent and intermittent faults can be attributed to the choice of values for the run-time parameters of the program.

  7. Development of U-Mart System with Plural Brands and Plural Markets

    NASA Astrophysics Data System (ADS)

    Akimoto, Yoshihito; Mori, Naoki; Ono, Isao; Nakajima, Yoshihiro; Kita, Hajime; Matsumoto, Keinosuke

    In this paper, we first discuss the notion that artificial market systems should meet the requirements of fidelity, transparency, reproducibility, and traceability. Next, we introduce the history of development of the artificial market system named the U-Mart system, which meets these requirements well and has been developed by the U-Mart project. We have already developed a U-Mart system called “U-Mart system version 3.0” to solve problems of the older U-Mart systems. In the version 3.0 system, the trading process is modularized and a universal market system can be easily introduced.
    However, U-Mart system version 3.0 simulates only a single-brand futures market. Simulation of plural brands and plural markets has been requested by many users. In this paper, we propose a novel U-Mart system called “U-Mart system version 4.0” to solve this problem of U-Mart system version 3.0. We improved the server system, machine agents, and GUI in order to simulate plural brands and plural markets in U-Mart system version 4.0. The effectiveness of the proposed system is confirmed by statistical analysis of the results of a spot market simulation with random agents.

  8. [3D modeling of the female pelvis by Computer-Assisted Anatomical Dissection: Applications and perspectives].

    PubMed

    Balaya, V; Uhl, J-F; Lanore, A; Salachas, C; Samoyeau, T; Ngo, C; Bensaid, C; Cornou, C; Rossi, L; Douard, R; Bats, A-S; Lecuru, F; Delmas, V

    2016-05-01

    To achieve a 3D vectorial model of the female pelvis by Computer-Assisted Anatomical Dissection and to assess its educational and surgical applications. From the "visible female" database of the Visible Human Project(®) (VHP) of the National Library of Medicine (NLM, United States), we used 739 transverse anatomical slices of 0.33 mm thickness going from L4 to the trochanters. The manual segmentation of each anatomical structure was done with the Winsurf(®) software, version 4.3. Each anatomical element was built as a separate vectorial object. The whole color-rendered vectorial model with realistic textures was exported in 3D PDF format to allow real-time interactive manipulation with the Acrobat(®) Pro version 11 software. Each element can be handled separately at any transparency, which allows anatomical learning by systems: skeleton, pelvic organs, urogenital system, arterial and venous vascularization. This 3D anatomical model can be used as a data bank for teaching fundamental anatomy. This 3D vectorial model, realistic and interactive, constitutes an efficient educational tool for teaching the anatomy of the pelvis. 3D printing of the pelvis is possible with the new printers. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  9. Evaluation of CESM1 (WACCM) with Observations of Stratospheric Composition

    NASA Astrophysics Data System (ADS)

    Kinnison, Doug; Froidevaux, Lucien; Garcia, Rolando; Fuller, Ryan

    2017-04-01

    The Community Earth System Model version 1 (CESM1) Whole Atmosphere Community Climate Model (WACCM) is used in this study. CESM1 (WACCM) includes a detailed representation of tropospheric through lower thermospheric chemistry and physical processes. Simulations for this work were based on scenarios defined by the Chemistry Climate Model Initiative (CCMI). These scenarios included both free-running (FR) and specified-dynamics versions (SD) of CESM1 (WACCM). Comparisons were made with global monthly zonal mean stratospheric data records from satellite-based remote measurements created by the Global Ozone Chemistry and Related Trace gas Data Records for the Stratosphere (GOZCARDS) project. These data records were drawn from high quality measurements of stratospheric composition starting in 1979 for ozone and in the early 1990s for other species. We discuss stratospheric variability and trends through analyses of observed time series of ozone (O3), hydrogen chloride (HCl), nitrous oxide (N2O), nitric acid (HNO3), and water vapor (H2O), and we contrast the fits from the FR and SD model versions. Conclusions from this work have aided in the development of a new version of CESM (WACCM) that will be used in the next Intergovernmental Panel on Climate Change (IPCC) Coupled Model Intercomparison Project Phase 6 (CMIP6) assessment.

  10. Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System

    NASA Astrophysics Data System (ADS)

    Roßmann, J.; Hoppen, M.; Bücken, A.

    2013-08-01

    Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.

  11. INTEGRATED AIR POLLUTION CONTROL SYSTEM VERSION 5.0 - VOLUME 3: PROGRAMMER'S MAINTENANCE MANUAL

    EPA Science Inventory

    The three volume report and two diskettes document the Integrated Air Pollution Control System (IAPCS), developed for the U.S. EPA to estimate costs and performance for emission control systems applied to coal-fired utility boilers. The model can project a material balance, an eq...

  12. INTEGRATED AIR POLLUTION CONTROL SYSTEM, VERSION 4.0 - VOLUME 3: PROGRAMMER'S MAINTENANCE MANUAL

    EPA Science Inventory

    The Integrated Air Pollution Control System (IAPCS) was developed for the U.S. EPA's Air and Energy Engineering Research Laboratory to estimate costs and performance for emission control systems applied to coal-fired utility boilers. The model can project a material balance, and ...

  13. Description and evaluation of a new four-mode version of the Modal Aerosol Module (MAM4) within version 5.3 of the Community Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Liu, X.; Ma, P.-L.; Wang, H.; Tilmes, S.; Singh, B.; Easter, R. C.; Ghan, S. J.; Rasch, P. J.

    2016-02-01

    Atmospheric carbonaceous aerosols play an important role in the climate system by influencing the Earth's radiation budgets and modifying the cloud properties. Despite the importance, their representations in large-scale atmospheric models are still crude, which can influence model simulated burden, lifetime, physical, chemical and optical properties, and the climate forcing of carbonaceous aerosols. In this study, we improve the current three-mode version of the Modal Aerosol Module (MAM3) in the Community Atmosphere Model version 5 (CAM5) by introducing an additional primary carbon mode to explicitly account for the microphysical ageing of primary carbonaceous aerosols in the atmosphere. Compared to MAM3, the four-mode version of MAM (MAM4) significantly increases the column burdens of primary particulate organic matter (POM) and black carbon (BC) by up to 40 % in many remote regions, where in-cloud scavenging plays an important role in determining the aerosol concentrations. Differences in the column burdens for other types of aerosol (e.g., sulfate, secondary organic aerosols, mineral dust, sea salt) are less than 1 %. Evaluating the MAM4 simulation against in situ surface and aircraft observations, we find that MAM4 significantly improves the simulation of seasonal variation of near-surface BC concentrations in the polar regions, by increasing the BC concentrations in all seasons and particularly in cold seasons. However, it exacerbates the overestimation of modeled BC concentrations in the upper troposphere in the Pacific regions. The comparisons suggest that, to address the remaining model POM and BC biases, future improvements are required related to (1) in-cloud scavenging and vertical transport in convective clouds and (2) emissions of anthropogenic and biomass burning aerosols.

  14. Description and evaluation of a new four-mode version of the Modal Aerosol Module (MAM4) within version 5.3 of the Community Atmosphere Model

    DOE PAGES

    Liu, X.; Ma, P. -L.; Wang, H.; ...

    2016-02-08

    Atmospheric carbonaceous aerosols play an important role in the climate system by influencing the Earth's radiation budgets and modifying the cloud properties. Despite the importance, their representations in large-scale atmospheric models are still crude, which can influence model simulated burden, lifetime, physical, chemical and optical properties, and the climate forcing of carbonaceous aerosols. In this study, we improve the current three-mode version of the Modal Aerosol Module (MAM3) in the Community Atmosphere Model version 5 (CAM5) by introducing an additional primary carbon mode to explicitly account for the microphysical ageing of primary carbonaceous aerosols in the atmosphere. Compared to MAM3, the four-mode version of MAM (MAM4) significantly increases the column burdens of primary particulate organic matter (POM) and black carbon (BC) by up to 40 % in many remote regions, where in-cloud scavenging plays an important role in determining the aerosol concentrations. Differences in the column burdens for other types of aerosol (e.g., sulfate, secondary organic aerosols, mineral dust, sea salt) are less than 1 %. Evaluating the MAM4 simulation against in situ surface and aircraft observations, we find that MAM4 significantly improves the simulation of seasonal variation of near-surface BC concentrations in the polar regions, by increasing the BC concentrations in all seasons and particularly in cold seasons. However, it exacerbates the overestimation of modeled BC concentrations in the upper troposphere in the Pacific regions. As a result, the comparisons suggest that, to address the remaining model POM and BC biases, future improvements are required related to (1) in-cloud scavenging and vertical transport in convective clouds and (2) emissions of anthropogenic and biomass burning aerosols.

  15. 2011 Version 6.3 Technical Support Document

    EPA Pesticide Factsheets

    This TSD describes how the emission inventories were prepared for air quality modeling for the years 2011, 2017, and 2025 using the 2011 version 6.2 emissions modeling platform, which is based on the 2011 National Emissions Inventory, Version 3.

  16. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core

    PubMed Central

    Hucka, Michael; Bergmann, Frank T.; Hoops, Stefan; Keating, Sarah M.; Sahle, Sven; Schaff, James C.; Smith, Lucian P.; Wilkinson, Darren J.

    2017-01-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/. PMID:26528564
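    A minimal sketch of what an SBML Level 3 Version 1 Core document looks like when generated programmatically is shown below, using only the Python standard library. Real applications would normally use libSBML or a similar library; this skeleton follows the Level 3 Core element and attribute names but is not guaranteed to satisfy every validation rule in the specification.

      # Minimal sketch: build a skeletal SBML Level 3 Version 1 Core document with
      # xml.etree. Illustrative only; not a substitute for libSBML or validation.
      import xml.etree.ElementTree as ET

      SBML_NS = "http://www.sbml.org/sbml/level3/version1/core"
      ET.register_namespace("", SBML_NS)

      sbml = ET.Element(f"{{{SBML_NS}}}sbml", {"level": "3", "version": "1"})
      model = ET.SubElement(sbml, f"{{{SBML_NS}}}model", {"id": "toy_model"})

      compartments = ET.SubElement(model, f"{{{SBML_NS}}}listOfCompartments")
      ET.SubElement(compartments, f"{{{SBML_NS}}}compartment",
                    {"id": "cell", "constant": "true"})

      species_list = ET.SubElement(model, f"{{{SBML_NS}}}listOfSpecies")
      for sid in ("S1", "S2"):
          ET.SubElement(species_list, f"{{{SBML_NS}}}species",
                        {"id": sid, "compartment": "cell",
                         "hasOnlySubstanceUnits": "false",
                         "boundaryCondition": "false", "constant": "false"})

      print(ET.tostring(sbml, encoding="unicode"))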

  17. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-09-04

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  18. The Systems Biology Markup Language (SBML): Language Specification for Level 3 Version 1 Core.

    PubMed

    Hucka, Michael; Bergmann, Frank T; Hoops, Stefan; Keating, Sarah M; Sahle, Sven; Schaff, James C; Smith, Lucian P; Wilkinson, Darren J

    2015-06-01

    Computational models can help researchers to interpret data, understand biological function, and make quantitative predictions. The Systems Biology Markup Language (SBML) is a file format for representing computational models in a declarative form that can be exchanged between different software systems. SBML is oriented towards describing biological processes of the sort common in research on a number of topics, including metabolic pathways, cell signaling pathways, and many others. By supporting SBML as an input/output format, different tools can all operate on an identical representation of a model, removing opportunities for translation errors and assuring a common starting point for analyses and simulations. This document provides the specification for Version 1 of SBML Level 3 Core. The specification defines the data structures prescribed by SBML as well as their encoding in XML, the eXtensible Markup Language. This specification also defines validation rules that determine the validity of an SBML document, and provides many examples of models in SBML form. Other materials and software are available from the SBML project web site, http://sbml.org/.

  19. Due Regard Encounter Model Version 1.0

    DTIC Science & Technology

    2013-08-19

    List of tables: Encounter model categories; Geographic domain limits; Cut points used for feature quantization; Validation results (Appendix B). ...out to the limits of radar coverage [8]. • Correlated Encounter Model of the National Airspace System (C): A correlated encounter model is used to... Note that no existing model covers encounters between two IFR aircraft in oceanic airspace. The reason for this is that one cannot observe encounters...

  20. Potential climate-induced runoff changes and associated uncertainty in four Pacific Northwest estuaries

    USGS Publications Warehouse

    Steele, Madeline O.; Chang, Heejun; Reusser, Deborah A.; Brown, Cheryl A.; Jung, Il-Won

    2012-01-01

    As part of a larger investigation into potential effects of climate change on estuarine habitats in the Pacific Northwest, we estimated changes in freshwater inputs into four estuaries: Coquille River estuary, South Slough of Coos Bay, and Yaquina Bay in Oregon, and Willapa Bay in Washington. We used the U.S. Geological Survey's Precipitation Runoff Modeling System (PRMS) to model watershed hydrological processes under current and future climatic conditions. This model allowed us to explore possible shifts in coastal hydrologic regimes at a range of spatial scales. All modeled watersheds are located in rainfall-dominated coastal areas with relatively insignificant base flow inputs, and their areas vary from 74.3 to 2,747.6 square kilometers. The watersheds also vary in mean elevation, ranging from 147 meters in the Willapa to 1,179 meters in the Coquille. The latitudes of watershed centroids range from 43.037 degrees north latitude in the Coquille River estuary to 46.629 degrees north latitude in Willapa Bay. We calibrated model parameters using historical climate grid data downscaled to one-sixteenth of a degree by the Climate Impacts Group, and historical runoff from sub-watersheds or neighboring watersheds. Nash Sutcliffe efficiency values for daily flows in calibration sub-watersheds ranged from 0.71 to 0.89. After calibration, we forced the PRMS models with four North American Regional Climate Change Assessment Program climate models: Canadian Regional Climate Model-(National Center for Atmospheric Research) Community Climate System Model version 3, Canadian Regional Climate Model-Canadian Global Climate Model version 3, Hadley Regional Model version 3-Hadley Centre Climate Model version 3, and Regional Climate Model-Canadian Global Climate Model version 3. These are global climate models (GCMs) downscaled with regional climate models that are embedded within the GCMs, and all use the A2 carbon emission scenario developed by the Intergovernmental Panel on Climate Change. With these climate-forcing outputs, we derived the mean change in flow from the period encompassing the 1980s (1971-1995) to the period encompassing the 2050s (2041-2065). Specifically, we calculated percent change in mean monthly flow rate, coefficient of variation, top 5 percent of flow, and 7-day low flow. The trends with the most agreement among climate models and among watersheds were increases in autumn mean monthly flows, especially in October and November, decreases in summer monthly mean flow, and increases in the top 5 percent of flow. We also estimated variance in PRMS outputs owing to parameter uncertainty and the selection of climate model using Latin hypercube sampling. This analysis showed that PRMS low-flow simulations are more uncertain than medium or high flow simulations, and that variation among climate models was a larger source of uncertainty than the hydrological model parameters. These results improve our understanding of how climate change may affect the saltwater-freshwater balance in Pacific Northwest estuaries, with implications for their sensitive ecosystems.
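
    The calibration and change statistics reported above follow directly from the daily and monthly flow series. As a minimal sketch (the arrays below are hypothetical, not the study's data), a Nash-Sutcliffe efficiency and a percent change in a monthly mean flow can be computed as:

        import numpy as np

        def nash_sutcliffe(observed, simulated):
            # NSE = 1 - (sum of squared errors) / (variance of the observations about their mean)
            observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
            return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

        def percent_change(historical, future):
            # percent change in a flow statistic between two climate periods
            return 100.0 * (np.mean(future) - np.mean(historical)) / np.mean(historical)

        # hypothetical daily flows (m3/s) for a calibration check, and hypothetical
        # October mean flows for the 1971-1995 and 2041-2065 periods
        obs = np.array([3.1, 2.8, 4.0, 5.2, 4.4])
        sim = np.array([3.0, 2.9, 3.8, 5.5, 4.1])
        print(nash_sutcliffe(obs, sim))
        print(percent_change([12.0, 14.0, 11.5], [15.0, 16.5, 14.0]))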

  1. Description and evaluation of the Community Multiscale Air Quality (CMAQ) modeling system version 5.1

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a comprehensive multipollutant air quality modeling system developed and maintained by the US Environmental Protection Agency's (EPA) Office of Research and Development (ORD). Recently, version 5.1 of the CMAQ model (v5.1) was ...

  2. Dynamic coupling of regional atmosphere to biosphere in the new generation regional climate system model REMO-iMOVE

    NASA Astrophysics Data System (ADS)

    Wilhelm, C.; Rechid, D.; Jacob, D.

    2013-05-01

    The main objective of this study is the coupling of the regional climate model REMO to a third-generation land surface scheme and the evaluation of the new model version, called REMO with interactive MOsaic-based VEgetation: REMO-iMOVE. Attention is paid to documenting the technical aspects of the new model constituents and the coupling mechanism. We compare simulation results of REMO-iMOVE and of the reference version REMO2009 to investigate the sensitivity of the regional model to the new land surface scheme. An 11-year climate run (1995-2005) of both model versions over Europe at 0.44° resolution, forced with ECMWF ERA-Interim lateral boundary conditions, was carried out to represent present-day European climate. The results of these experiments are compared to multiple temperature, precipitation, heat flux and leaf area index observation datasets to determine the differences between the model versions. The new model version also has the ability to model net primary productivity for the given plant functional types. This new feature is thoroughly evaluated against literature values of net primary productivity of different plant species in European climatic regions. REMO-iMOVE is able to model the European climate with the same quality as the parent model version REMO2009. The differences in the results of the two model versions stem from differences in the dynamics of vegetation cover and density and can be distinct in some regions, owing to the influence of these parameters on the surface heat and moisture fluxes. The modeled inter-annual variability in phenology as well as net primary productivity lies within the range of observations and literature values for most European regions. This study also reveals the need for a more sophisticated soil moisture representation in REMO-iMOVE to treat the differences among plant functional types, which becomes especially important if the model is to be used in dynamic vegetation studies.

  3. Early Clinical Manifestations Associated with Death from Visceral Leishmaniasis

    PubMed Central

    de Araújo, Valdelaine Etelvina Miranda; Morais, Maria Helena Franco; Reis, Ilka Afonso; Rabello, Ana; Carneiro, Mariângela

    2012-01-01

    Background: In Brazil, lethality from visceral leishmaniasis (VL) is high and few studies have addressed prognostic factors. This historical cohort study was designed to investigate the prognostic factors for death from VL in Belo Horizonte (Brazil). Methodology: The analysis was based on data of the Reportable Disease Information System-SINAN (Brazilian Ministry of Health) relating to the clinical manifestations of the disease. During the study period (2002–2009), the SINAN changed platform from a Windows to a Net version that differed with respect to some of the parameters collected. Multivariate logistic regression models were fitted to identify variables associated with death from VL, and these were included in a prognostic score. Principal Findings: Model 1 (period 2002–2009; 111 deaths from VL and 777 cured patients) included the variables present in both SINAN versions, whereas Model 2 (period 2007–2009; 49 deaths from VL and 327 cured patients) included variables common to both SINAN versions plus the additional variables included in the Net version. In Model 1, the variables significantly associated with a greater risk of death from VL were weakness (OR 2.9; 95%CI 1.3–6.4), Leishmania-HIV co-infection (OR 2.4; 95%CI 1.2–4.8) and age ≥60 years (OR 2.5; 95%CI 1.5–4.3). In Model 2, the variables were bleeding (OR 3.5; 95%CI 1.2–10.3), other associated infections (OR 3.2; 95%CI 1.3–7.8), jaundice (OR 10.1; 95%CI 3.7–27.2) and age ≥60 years (OR 3.1; 95%CI 1.4–7.1). The prognostic score was developed using the variables associated with death from VL in the latest version of the SINAN (Model 2); its predictive performance was evaluated by sensitivity (71.4%), specificity (73.7%), positive and negative predictive values (28.9% and 94.5%) and the area under the receiver operating characteristic curve (75.6%). Conclusions: Knowledge regarding the factors associated with death from VL may improve the clinical management of patients and contribute to lower mortality. PMID:22347514
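
    The reported predictive performance of the prognostic score follows from an ordinary 2x2 classification table. The counts below are an illustrative reconstruction chosen to be consistent with the 49 deaths and 327 cured patients in Model 2 and the percentages quoted above; they are not the published table.

        # deaths and cured patients split by whether the score flagged them (hypothetical reconstruction)
        tp, fn = 35, 14    # of 49 deaths: flagged / missed by the score
        tn, fp = 241, 86   # of 327 cured patients: correctly / incorrectly flagged

        sensitivity = tp / (tp + fn)   # ~71.4%
        specificity = tn / (tn + fp)   # ~73.7%
        ppv = tp / (tp + fp)           # positive predictive value, ~28.9%
        npv = tn / (tn + fn)           # negative predictive value, ~94.5%
        print(f"sens={sensitivity:.1%}  spec={specificity:.1%}  ppv={ppv:.1%}  npv={npv:.1%}")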

  4. GEOS S2S-2_1: The GMAO new high resolution Seasonal Prediction System

    NASA Astrophysics Data System (ADS)

    Molod, A.; Vikhliaev, Y. V.; Hackert, E. C.; Kovach, R. M.; Zhao, B.; Cullather, R. I.; Marshak, J.; Borovikov, A.; Li, Z.; Barahona, D.; Andrews, L. C.; Chang, Y.; Schubert, S. D.; Koster, R. D.; Suarez, M.; Akella, S.

    2017-12-01

    A new version of the modeling and analysis system used to produce subseasonal to seasonal forecasts has just been released by the NASA/Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we will describe the new system, along with the plans for the future system (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only tradeoff is an increased "double ITCZ", which is expected as we go to higher atmospheric resolution.

  5. Mineralogic Model (MM3.0) Analysis Model Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Lum

    2002-02-12

    The purpose of this report is to document the Mineralogic Model (MM), Version 3.0 (MM3.0) with regard to data input, modeling methods, assumptions, uncertainties, limitations and validation of the model results, qualification status of the model, and the differences between Version 3.0 and previous versions. A three-dimensional (3-D) Mineralogic Model was developed for Yucca Mountain to support the analyses of hydrologic properties, radionuclide transport, mineral health hazards, repository performance, and repository design. Version 3.0 of the MM was developed from mineralogic data obtained from borehole samples. It consists of matrix mineral abundances as a function of x (easting), y (northing), and z (elevation), referenced to the stratigraphic framework defined in Version 3.1 of the Geologic Framework Model (GFM). The MM was developed specifically for incorporation into the 3-D Integrated Site Model (ISM). The MM enables project personnel to obtain calculated mineral abundances at any position, within any region, or within any stratigraphic unit in the model area. The significance of the MM for key aspects of site characterization and performance assessment is explained in the following subsections. This work was conducted in accordance with the Development Plan for the MM (CRWMS M&O 2000). The planning document for this Rev. 00, ICN 02 of this AMR is Technical Work Plan, TWP-NBS-GS-000003, Technical Work Plan for the Integrated Site Model, Process Model Report, Revision 01 (CRWMS M&O 2000). The purpose of this ICN is to record changes in the classification of input status by the resolution of the use of TBV software and data in this report. Constraints and limitations of the MM are discussed in the appropriate sections that follow. The MM is one component of the ISM, which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed stratigraphy and structural features of the site into a 3-D model that will be useful in primary downstream models and repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential nuclear waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for a repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 1. The lateral boundaries of the ISM and its three component models are shown in Figure 2.

  6. Military Retirement Reform: A Review of Proposals and Options for Congress

    DTIC Science & Technology

    2011-11-17

    year of service (either High-3 or Redux), reservists are restricted to one system (a modified version of High-3), and disability retirees also have...the opportunity to choose between two systems (High-3 or Disability retirement). Many observers have agreed that a reformed retirement system could...2008 by the 10th Quadrennial Review of Military Compensation (QRMC) that further modeled, refined, and amplified on the work of the DACMC. Both efforts

  7. Automated Interactive Simulation Model (AISIM) VAX Version 5.0 Training Manual.

    DTIC Science & Technology

    1987-05-29

    action, activity, decision, etc. that consumes time. The entity is automatically created by the system when an ACTION Primitive is placed. 1.3.2.4 The... MODELED SYSTEM 1.3.2.1 The Process Entity. A Process is used to represent the operations, decisions, actions or activities that can be decomposed and... is associated with the Action entity described below, is included in Process definitions to indicate the time a certain Action (or process, decision

  8. A spectral nudging method for the ACCESS1.3 atmospheric model

    NASA Astrophysics Data System (ADS)

    Uhe, P.; Thatcher, M.

    2015-06-01

    A convolution-based method of spectral nudging of atmospheric fields is developed in the Australian Community Climate and Earth Systems Simulator (ACCESS) version 1.3, which uses the UK Met Office Unified Model version 7.3 as its atmospheric component. The use of convolutions allows for flexibility in application to different atmospheric grids. An approximation using one-dimensional convolutions is applied, making the nudging scheme 10-30 times faster than a version using a two-dimensional convolution, without measurably degrading its performance. Care needs to be taken in the order of the convolutions and the frequency of nudging to obtain the best outcome. The spectral nudging scheme is benchmarked against a Newtonian relaxation method, nudging winds and air temperature towards ERA-Interim reanalyses. We find that the convolution approach can produce results that are competitive with Newtonian relaxation in both the effectiveness and efficiency of the scheme, while giving the added flexibility of choosing which length scales to nudge.
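
    The computational saving described above comes from replacing one two-dimensional convolution with a one-dimensional convolution per axis. The sketch below uses a separable smoothing kernel, for which the two forms agree exactly (in the actual scheme the one-dimensional form is an approximation); the kernel, field and nudging weight are illustrative, not the paper's.

        import numpy as np
        from scipy.ndimage import convolve, convolve1d

        k1 = np.array([1.0, 4.0, 6.0, 4.0, 1.0]); k1 /= k1.sum()   # 1-D smoothing kernel
        k2 = np.outer(k1, k1)                                       # equivalent 2-D kernel

        def lowpass_1d(f):
            # two 1-D passes (one per axis) in place of a single 2-D convolution
            return convolve1d(convolve1d(f, k1, axis=0, mode="wrap"), k1, axis=1, mode="wrap")

        field = np.random.default_rng(0).standard_normal((90, 144))    # toy lat-lon field
        print(np.allclose(convolve(field, k2, mode="wrap"), lowpass_1d(field)))   # True for a separable kernel

        def nudge(model_field, driving_field, weight=0.1):
            # relax the model's large scales toward the large scales of the driving data
            return model_field + weight * (lowpass_1d(driving_field) - lowpass_1d(model_field))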

  9. A spectral nudging method for the ACCESS1.3 atmospheric model

    NASA Astrophysics Data System (ADS)

    Uhe, P.; Thatcher, M.

    2014-10-01

    A convolution-based method of spectral nudging of atmospheric fields is developed in the Australian Community Climate and Earth Systems Simulator (ACCESS) version 1.3, which uses the UK Met Office Unified Model version 7.3 as its atmospheric component. The use of convolutions allows flexibility in application to different atmospheric grids. An approximation using one-dimensional convolutions is applied, making the nudging scheme 10 to 30 times faster than a version using a two-dimensional convolution, without measurably degrading its performance. Care needs to be taken in the order of the convolutions and the frequency of nudging to obtain the best outcome. The spectral nudging scheme is benchmarked against a Newtonian relaxation method, nudging winds and air temperature towards ERA-Interim reanalyses. We find that the convolution approach can produce results that are competitive with Newtonian relaxation in both the effectiveness and efficiency of the scheme, while giving the added flexibility of choosing which length scales to nudge.

  10. Evaluating the Ocean Component of the US Navy Earth System Model

    NASA Astrophysics Data System (ADS)

    Zamudio, L.

    2017-12-01

    Ocean currents, temperature, and salinity observations are used to evaluate the ocean component of the US Navy Earth System Model. The ocean and atmosphere components of the system are an eddy-resolving (1/12.5° equatorial resolution) version of the HYbrid Coordinate Ocean Model (HYCOM), and a T359L50 version of the NAVy Global Environmental Model (NAVGEM), respectively. The system was integrated in hindcast mode and the ocean results are compared against unassimilated observations, a stand-alone version of HYCOM, and the Generalized Digital Environment Model ocean climatology. The different observation types used in the system evaluation are: drifting buoys, temperature profiles, salinity profiles, and acoustical proxies (mixed layer depth, sonic layer depth, below layer gradient, and acoustical trapping). To evaluate the system's performance in each different metric, a scorecard is used to translate the system's errors into scores, which provide an indication of the system's skill in both space and time.

  11. Long wavelength propagation capacity, version 1.1 (computer diskette)

    NASA Astrophysics Data System (ADS)

    1994-05-01

    File Characteristics: software and data file. (72 files); ASCII character set. Physical Description: 2 computer diskettes; 3 1/2 in.; high density; 1.44 MB. System Requirements: PC compatible; Digital Equipment Corp. VMS; PKZIP (included on diskette). This report describes a revision of the Naval Command, Control and Ocean Surveillance Center RDT&E Division's Long Wavelength Propagation Capability (LWPC). The first version of this capability was a collection of separate FORTRAN programs linked together in operation by a command procedure written in an operating system unique to the Digital Equipment Corporation (Ferguson & Snyder, 1989a, b). A FORTRAN computer program named Long Wavelength Propagation Model (LWPM) was developed to replace the VMS control system (Ferguson & Snyder, 1990; Ferguson, 1990). This was designated version 1 (LWPC-1). This program implemented all the features of the original VMS plus a number of auxiliary programs that provided summaries of the files and graphical displays of the output files. This report describes a revision of the LWPC, designated version 1.1 (LWPC-1.1)

  12. SINDA'85/FLUINT - SYSTEMS IMPROVED NUMERICAL DIFFERENCING ANALYZER AND FLUID INTEGRATOR (CONVEX VERSION)

    NASA Technical Reports Server (NTRS)

    Cullimore, B.

    1994-01-01

    SINDA, the Systems Improved Numerical Differencing Analyzer, is a software system for solving lumped parameter representations of physical problems governed by diffusion-type equations. SINDA was originally designed for analyzing thermal systems represented in electrical analog, lumped parameter form, although its use may be extended to include other classes of physical systems which can be modeled in this form. As a thermal analyzer, SINDA can handle such interrelated phenomena as sublimation, diffuse radiation within enclosures, transport delay effects, and sensitivity analysis. FLUINT, the FLUid INTegrator, is an advanced one-dimensional fluid analysis program that solves arbitrary fluid flow networks. The working fluids can be single phase vapor, single phase liquid, or two phase. The SINDA'85/FLUINT system permits the mutual influences of thermal and fluid problems to be analyzed. The SINDA system consists of a programming language, a preprocessor, and a subroutine library. The SINDA language is designed for working with lumped parameter representations and finite difference solution techniques. The preprocessor accepts programs written in the SINDA language and converts them into standard FORTRAN. The SINDA library consists of a large number of FORTRAN subroutines that perform a variety of commonly needed actions. The use of these subroutines can greatly reduce the programming effort required to solve many problems. A complete run of a SINDA'85/FLUINT model is a four step process. First, the user's desired model is run through the preprocessor which writes out data files for the processor to read and translates the user's program code. Second, the translated code is compiled. The third step requires linking the user's code with the processor library. Finally, the processor is executed. SINDA'85/FLUINT program features include 20,000 nodes, 100,000 conductors, 100 thermal submodels, and 10 fluid submodels. SINDA'85/FLUINT can also model two phase flow, capillary devices, user defined fluids, gravity and acceleration body forces on a fluid, and variable volumes. SINDA'85/FLUINT offers the following numerical solution techniques. The Finite difference formulation of the explicit method is the Forward-difference explicit approximation. The formulation of the implicit method is the Crank-Nicolson approximation. The program allows simulation of non-uniform heating and facilitates modeling thin-walled heat exchangers. The ability to model non-equilibrium behavior within two-phase volumes is included. Recent improvements to the program were made in modeling real evaporator-pumps and other capillary-assist evaporators. SINDA'85/FLUINT is available by license for a period of ten (10) years to approved licensees. The licensed program product includes the source code and one copy of the supporting documentation. Additional copies of the documentation may be purchased separately at any time. SINDA'85/FLUINT is written in FORTRAN 77. Version 2.3 has been implemented on Cray series computers running UNICOS, CONVEX computers running CONVEX OS, and DEC RISC computers running ULTRIX. Binaries are included with the Cray version only. The Cray version of SINDA'85/FLUINT also contains SINGE, an additional graphics program developed at Johnson Space Flight Center. Both source and executable code are provided for SINGE. Users wishing to create their own SINGE executable will also need the NASA Device Independent Graphics Library (NASADIG, previously known as SMDDIG; UNIX version, MSC-22001). 
The Cray and CONVEX versions of SINDA'85/FLUINT are available on 9-track 1600 BPI UNIX tar format magnetic tapes. The CONVEX version is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format. The DEC RISC ULTRIX version is available on a TK50 magnetic tape cartridge in UNIX tar format. SINDA was developed in 1971, and first had fluid capability added in 1975. SINDA'85/FLUINT version 2.3 was released in 1990.
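
    The two time-integration formulations named above can be illustrated on a single lumped thermal node. The sketch below is a textbook demonstration of the forward-difference explicit and Crank-Nicolson schemes with made-up capacitance, conductance and boundary values; it is not SINDA'85/FLUINT code.

        # one node with capacitance C [J/K] coupled by conductance G [W/K] to a boundary at T_b [K]
        C, G, T_b, dt, steps = 500.0, 2.0, 300.0, 10.0, 360
        T_explicit = T_crank = 400.0

        for _ in range(steps):
            # forward-difference explicit: new temperature from the flux at the old time level
            T_explicit = T_explicit + (dt / C) * G * (T_b - T_explicit)
            # Crank-Nicolson: average the flux over the old and new time levels
            a = dt * G / (2.0 * C)
            T_crank = ((1.0 - a) * T_crank + 2.0 * a * T_b) / (1.0 + a)

        print(T_explicit, T_crank)   # both solutions relax toward the 300 K boundary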

  13. Incremental Testing of the Community Multiscale Air Quality (CMAQ) Modeling System Version 4.7

    EPA Science Inventory

    This paper describes the scientific and structural updates to the latest release of the Community Multiscale Air Quality (CMAQ) modeling system version 4.7 (v4.7) and points the reader to additional resources for further details. The model updates were evaluated relative to obse...

  14. Designing and implementing a regional urban modeling system using the SLEUTH cellular urban model

    USGS Publications Warehouse

    Jantz, Claire A.; Goetz, Scott J.; Donato, David I.; Claggett, Peter

    2010-01-01

    This paper presents a fine-scale (30 meter resolution) regional land cover modeling system, based on the SLEUTH cellular automata model, that was developed for a 257000 km2 area comprising the Chesapeake Bay drainage basin in the eastern United States. As part of this effort, we developed a new version of the SLEUTH model (SLEUTH-3r), which introduces new functionality and fit metrics that substantially increase the performance and applicability of the model. In addition, we developed methods that expand the capability of SLEUTH to incorporate economic, cultural and policy information, opening up new avenues for the integration of SLEUTH with other land-change models. SLEUTH-3r is also more computationally efficient (by a factor of 5) and uses less memory (reduced 65%) than the original software. With the new version of SLEUTH, we were able to achieve high accuracies at both the aggregate level of 15 sub-regional modeling units and at finer scales. We present forecasts to 2030 of urban development under a current trends scenario across the entire Chesapeake Bay drainage basin, and three alternative scenarios for a sub-region within the Chesapeake Bay watershed to illustrate the new ability of SLEUTH-3r to generate forecasts across a broad range of conditions.
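
    A cellular urban model of the SLEUTH type grows urban land cell by cell on a raster. The sketch below is a strongly simplified, hypothetical spread step (the parameter names and probabilities are illustrative, not SLEUTH-3r's coefficients): cells adjacent to existing urban cells urbanize with a probability damped by slope and blocked on excluded land.

        import numpy as np

        def spread_step(urban, slope, excluded, dispersion=0.05, slope_weight=0.5, rng=None):
            # urban: 0/1 grid; slope: 0..1 grid; excluded: boolean mask of protected land
            if rng is None:
                rng = np.random.default_rng(0)
            padded = np.pad(urban, 1)
            shifts = [np.roll(np.roll(padded, di, 0), dj, 1)
                      for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            neighbours = sum(shifts)[1:-1, 1:-1] - urban        # urban cells in the 3x3 ring
            p = dispersion * neighbours * (1.0 - slope_weight * np.clip(slope, 0.0, 1.0))
            p = np.where((urban == 1) | excluded, 0.0, np.clip(p, 0.0, 1.0))
            return np.where(rng.random(urban.shape) < p, 1, urban)

        # toy 5x5 grid: one urban seed, flat terrain, nothing excluded
        seed = np.zeros((5, 5), dtype=int); seed[2, 2] = 1
        print(spread_step(seed, slope=np.zeros((5, 5)), excluded=np.zeros((5, 5), dtype=bool)))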

  15. Standard Port-Visit Cost Forecasting Model for U.S. Navy Husbanding Contracts

    DTIC Science & Technology

    2009-12-01

    Protocol (HTTP) server. 2. MySQL, an open-source database. 3. PHP, a common scripting language used for Web development. E. IMPLEMENTATION OF... Inc. (2009). MySQL Community Server (Version 5.1) [Software]. Available from http://dev.mysql.com/downloads/ The PHP Group (2009). PHP (Version... [acronym list fragment: Logistics Services; MySQL, My Structured Query Language; NAVSUP, Navy Supply Systems Command; NC, Non-Contract Items; NPS, Naval Postgraduate]

  16. National Centers for Environmental Prediction

    Science.gov Websites

    [Web page navigation: Operational Forecast Graphics; Experimental Forecast Graphics; Verification and Diagnostics; Model Configuration.] The model configuration consists of the following components: the NOAA Environmental Modeling System (NEMS) version of the Non... updates for the 12 km parent domain and the 3 km CONUS/Alaska nests. The non-cycled nests (Hawaii, Puerto

  17. Care 3 phase 2 report, maintenance manual

    NASA Technical Reports Server (NTRS)

    Bryant, L. A.; Stiffler, J. J.

    1982-01-01

    CARE 3 (Computer-Aided Reliability Estimation, version three) is a computer program designed to help estimate the reliability of complex, redundant systems. Although the program can model a wide variety of redundant structures, it was developed specifically for fault-tolerant avionics systems--systems distinguished by the need for extremely reliable performance since a system failure could well result in the loss of human life. It substantially generalizes the class of redundant configurations that could be accommodated, and includes a coverage model to determine the various coverage probabilities as a function of the applicable fault recovery mechanisms (detection delay, diagnostic scheduling interval, isolation and recovery delay, etc.). CARE 3 further generalizes the class of system structures that can be modeled and greatly expands the coverage model to take into account such effects as intermittent and transient faults, latent faults, error propagation, etc.
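
    To make the role of the coverage probabilities concrete, the textbook duplex example below (in LaTeX notation) shows how imperfect coverage c, the probability that a module fault is detected, isolated and recovered from, enters the system reliability; it is an illustration of the concept, not CARE 3's actual Markov model.

        R_{sys}(t) = R_m(t)^2 + 2\,c\,R_m(t)\,\bigl(1 - R_m(t)\bigr), \qquad R_m(t) = e^{-\lambda t}

    With an illustrative \lambda t = 0.01, perfect coverage (c = 1) gives an unreliability of about 1e-4, while c = 0.95 raises it to about 1.1e-3; the coverage term, not the raw redundancy, dominates the result, which is why the coverage probabilities are computed from the fault-recovery mechanisms listed above.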

  18. Comparative Evaluation of Five Fire Emissions Datasets Using the GEOS-5 Model

    NASA Astrophysics Data System (ADS)

    Ichoku, C. M.; Pan, X.; Chin, M.; Bian, H.; Darmenov, A.; Ellison, L.; Kucsera, T. L.; da Silva, A. M., Jr.; Petrenko, M. M.; Wang, J.; Ge, C.; Wiedinmyer, C.

    2017-12-01

    Wildfires and other types of biomass burning affect most vegetated parts of the globe, contributing 40% of the annual global atmospheric loading of carbonaceous aerosols, as well as significant amounts of numerous trace gases, such as carbon dioxide, carbon monoxide, and methane. Many of these smoke constituents affect the air quality and/or the climate system directly or through their interactions with solar radiation and cloud properties. However, fire emissions are poorly constrained in global and regional models, resulting in high levels of uncertainty in understanding their real impacts. With the advent of satellite remote sensing of fires and burned areas in the last couple of decades, a number of fire emissions products have become available for use in relevant research and applications. In this study, we evaluated five global biomass burning emissions datasets, namely: (1) GFEDv3.1 (Global Fire Emissions Database version 3.1); (2) GFEDv4s (Global Fire Emissions Database version 4 with small fires); (3) FEERv1 (Fire Energetics and Emissions Research version 1.0); (4) QFEDv2.4 (Quick Fire Emissions Dataset version 2.4); and (5) Fire INventory from NCAR (FINN) version 1.5. Overall, the spatial patterns of biomass burning emissions from these inventories are similar, although the magnitudes of the emissions can be noticeably different. The inventories derived using top-down approaches (QFEDv2.4 and FEERv1) are larger than those based on bottom-up approaches. For example, global organic carbon (OC) emissions in 2008 are: QFEDv2.4 (51.93 Tg), FEERv1 (28.48 Tg), FINN v1.5 (19.48 Tg), GFEDv3.1 (15.65 Tg) and GFEDv4s (13.76 Tg); representing a factor of 3.7 difference between the largest and the least. We also used all five biomass-burning emissions datasets to conduct aerosol simulations using the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5), and compared the resulting aerosol optical depth (AOD) output to the corresponding retrievals from MODIS and AERONET. Simulated AOD based on all five emissions inventories show significant underestimation in biomass burning dominated regions. Attributions of possible factors responsible for the differences among the inventories were further explored in Southern Africa and South America, two of the major biomass burning regions of the world.

  19. Distributed Visualization Project

    NASA Technical Reports Server (NTRS)

    Craig, Douglas; Conroy, Michael; Kickbusch, Tracey; Mazone, Rebecca

    2016-01-01

    Distributed Visualization allows anyone, anywhere to see any simulation at any time. Development focuses on algorithms, software, data formats, data systems and processes to enable sharing simulation-based information across temporal and spatial boundaries without requiring stakeholders to possess highly-specialized and very expensive display systems. It also introduces abstraction between the native and shared data, which allows teams to share results without giving away proprietary or sensitive data. The initial implementation of this capability is the Distributed Observer Network (DON) version 3.1. DON 3.1 is available for public release in the NASA Software Store (https://software.nasa.gov/software/KSC-13775) and works with version 3.0 of the Model Process Control specification (an XML Simulation Data Representation and Communication Language) to display complex graphical information and associated Meta-Data.

  20. Comparison of GFED3, QFED2 and FEER1 Biomass Burning Emissions Datasets in a Global Model

    NASA Technical Reports Server (NTRS)

    Pan, Xiaohua; Ichoku, Charles; Bian, Huisheng; Chin, Mian; Ellison, Luke; da Silva, Arlindo; Darmenov, Anton

    2015-01-01

    Biomass burning contributes about 40% of the global loading of carbonaceous aerosols, significantly affecting air quality and the climate system by modulating solar radiation and cloud properties. However, fire emissions are poorly constrained in models on global and regional levels. In this study, we investigate 3 global biomass burning emission datasets in NASA GEOS5, namely: (1) GFEDv3.1 (Global Fire Emissions Database version 3.1); (2) QFEDv2.4 (Quick Fire Emissions Dataset version 2.4); (3) FEERv1 (Fire Energetics and Emissions Research version 1.0). The simulated aerosol optical depth (AOD), absorption AOD (AAOD), angstrom exponent and surface concentrations of aerosol plumes dominated by fire emissions are evaluated and compared to MODIS, OMI, AERONET, and IMPROVE data over different regions. In general, the spatial patterns of biomass burning emissions from these inventories are similar, although the strength of the emissions can be noticeably different. The emissions estimates from QFED are generally larger than those of FEER, which are in turn larger than those of GFED. AOD simulated with all these 3 databases are lower than the corresponding observations in Southern Africa and South America, two of the major biomass burning regions in the world.

  1. Latest NASA Instrument Cost Model (NICM): Version VI

    NASA Technical Reports Server (NTRS)

    Mrozinski, Joe; Habib-Agahi, Hamid; Fox, George; Ball, Gary

    2014-01-01

    The NASA Instrument Cost Model, NICM, is a suite of tools that allows for probabilistic cost estimation of NASA's space-flight instruments at both the system and subsystem level. NICM also includes the ability to perform cost estimation by analogy as well as joint confidence level (JCL) analysis. The latest version of NICM, Version VI, was released in Spring 2014. This paper will focus on the new features released with NICM VI, which include: 1) the NICM-E cost estimating relationship, which is applicable to instruments flying on Explorer-like class missions; 2) a new cluster analysis capability which, alongside the results of the parametric cost estimate for the user's instrument, also provides a visualization of that instrument's similarity to previously flown instruments; and 3) new cost estimating relationships for in-situ instruments.

  2. On the hyperbolicity and stability of 3+1 formulations of metric f(R) gravity

    NASA Astrophysics Data System (ADS)

    Mongwane, Bishop

    2016-11-01

    3+1 formulations of the Einstein field equations have become an invaluable tool in numerical relativity, having been used successfully in modeling spacetimes of black hole collisions, stellar collapse and other complex systems. It is plausible that similar considerations could prove fruitful for modified gravity theories. In this article, we pursue from a numerical relativistic viewpoint the 3+1 formulation of metric f(R) gravity as it arises from the fourth-order equations of motion, without invoking the dynamical equivalence with Brans-Dicke theories. We present the resulting system of evolution and constraint equations for a generic function f(R), subject to the usual viability conditions. We confirm that the time propagation of the f(R) Hamiltonian and momentum constraints takes the same mathematical form as in general relativity, irrespective of the f(R) model. We further recast the 3+1 system in a form akin to the BSSNOK formulation of numerical relativity. Without assuming any specific model, we show that the ADM version of f(R) is weakly hyperbolic and is plagued by similar zero speed modes as in the general relativity case. On the other hand, the BSSNOK version is strongly hyperbolic and hence a promising formulation for numerical simulations in metric f(R) theories.

  3. Retrospective Analysis of Clinical Performance of an Estonian Speech Recognition System for Radiology: Effects of Different Acoustic and Language Models.

    PubMed

    Paats, A; Alumäe, T; Meister, E; Fridolin, I

    2018-04-30

    The aim of this study was to analyze retrospectively the influence of different acoustic and language models in order to determine the most important effects on the clinical performance of an Estonian language-based, non-commercial, radiology-oriented automatic speech recognition (ASR) system. An ASR system was developed for the Estonian language in the radiology domain by utilizing open-source software components (Kaldi toolkit, Thrax). The ASR system was trained with real radiology text reports and dictations collected during the development phases. The final version of the ASR system was tested by 11 radiologists who dictated 219 reports in total, in a spontaneous manner in a real clinical environment. The audio files collected in the final phase were used to measure the performance of different versions of the ASR system retrospectively. ASR system versions were evaluated by word error rate (WER) for each speaker and modality and by the WER difference between the first and the last version of the ASR system. The total average WER for the final version throughout all material improved from 18.4% for the first version (v1) to 5.8% for the last version (v8), which corresponds to a relative improvement of 68.5%. WER improvement was strongly related to modality and radiologist. In summary, the performance of the final ASR system version was close to optimal, delivering similar results for all modalities and being independent of the user, the complexity of the radiology reports, user experience, and speech characteristics.
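
    Word error rate is the word-level Levenshtein distance (substitutions + insertions + deletions) divided by the number of reference words, and the 68.5% figure is simply (18.4 - 5.8)/18.4. The sketch below uses a toy English fragment rather than actual Estonian radiology text.

        def word_error_rate(reference, hypothesis):
            ref, hyp = reference.split(), hypothesis.split()
            # standard dynamic programme for the word-level edit distance
            d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
            for i in range(len(ref) + 1):
                d[i][0] = i
            for j in range(len(hyp) + 1):
                d[0][j] = j
            for i in range(1, len(ref) + 1):
                for j in range(1, len(hyp) + 1):
                    cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,         # deletion
                                  d[i][j - 1] + 1,         # insertion
                                  d[i - 1][j - 1] + cost)  # substitution or match
            return d[-1][-1] / len(ref)

        print(word_error_rate("no focal lesion is seen", "no focal lesions seen"))  # 0.4
        print((18.4 - 5.8) / 18.4)   # ~0.685, the reported 68.5% relative improvement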

  4. COSMIC monthly progress report

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Activities of the Computer Software Management and Information Center (COSMIC) are summarized for the month of August, 1993. Tables showing the current inventory of programs available from COSMIC are presented and program processing and evaluation activities are discussed. Ten articles were prepared for publication in the NASA Tech Brief Journal. These articles (included in this report) describe the following software items: (1) MOM3D - A Method of Moments Code for Electromagnetic Scattering (UNIX Version); (2) EM-Animate - Computer Program for Displaying and Animating the Steady-State Time-Harmonic Electromagnetic Near Field and Surface-Current Solutions; (3) MOM3D - A Method of Moments Code for Electromagnetic Scattering (IBM PC Version); (4) M414 - MIL-STD-414 Variable Sampling Procedures Computer Program; (5) MEDOF - Minimum Euclidean Distance Optimal Filter; (6) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (Macintosh Version); (7) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (IBM PC Version); (8) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (UNIX Version); (9) CLIPS 6.0 - C Language Integrated Production System, Version 6.0 (DEC VAX VMS Version); and (10) TFSSRA - Thick Frequency Selective Surface with Rectangular Apertures. Activities in the areas of marketing, customer service, benefits identification, maintenance and support, and dissemination are also described along with a budget summary.

  5. ECONOMIC GROWTH ANALYSIS SYSTEM: USER'S GUIDE VERSION 2.0

    EPA Science Inventory

    The two-volume report describes the development of, and provides the information needed to operate, the Economic Growth Analysis System (E-GAS) Version 2.0 model. The model will be used to project emissions inventories of volatile organic compounds (VOCs), oxides of nitrogen (NOx), a...

  6. ECONOMIC GROWTH ANALYSIS SYSTEM: REFERENCE MANUAL VERSION 2.0

    EPA Science Inventory

    The two-volume report describes the development of, and provides the information needed to operate, the Economic Growth Analysis System (E-GAS) Version 2.0 model. The model will be used to project emissions inventories of volatile organic compounds (VOCs), oxides of nitrogen (NOx), a...

  7. Comparative Evaluation of Performances of Two Versions of NCEP Climate Forecast System in Predicting Winter Precipitation over India

    NASA Astrophysics Data System (ADS)

    Nageswararao, M. M.; Mohanty, U. C.; Nair, Archana; Ramakrishna, S. S. V. S.

    2016-06-01

    Winter (December through February) precipitation over India is highly variable in time and space. Maximum precipitation occurs over the Himalaya region, and it is important for the water resources and agriculture sectors of the region and for the economy of the country. Therefore, in the present global warming era, realistic prediction of winter precipitation over India is important for planning and implementing agriculture and water management strategies. The National Centers for Environmental Prediction (NCEP) has issued operational predictions of climatic variables on monthly to seasonal scales since 2004 using its first version of a fully coupled global climate model, the Climate Forecast System (CFSv1). In 2011, a new version of CFS (CFSv2) was introduced, incorporating significant changes to the older version. The new version of CFS needs to be compared in detail with the older version in the context of simulating winter precipitation over India. Therefore, the current study presents a detailed analysis of the performance of CFSv2 compared to CFSv1 for winter precipitation over India. Hindcast runs of both CFS versions from 1982 to 2008 with November initial conditions are used, and the models' precipitation is evaluated against that of the India Meteorological Department (IMD). The models' simulated wind and geopotential height are examined against the NCEP-National Center for Atmospheric Research (NCEP-NCAR) reanalysis-2 (NNRP2), and the remote response patterns of SST against the Extended Reconstructed Sea Surface Temperature version 3b (ERSSTv3b), for the same period. The analyses of winter precipitation reveal that both models are able to replicate the patterns of the observed climatology, interannual variability and coefficient of variation. However, the magnitudes are lower than the IMD observations, which can be attributed to the models' inability to simulate the observed remote response of sea surface temperatures to all-India winter precipitation. Of the two, CFSv1 captures year-to-year variations in observed winter precipitation appreciably well, while CFSv2 fails to do so. CFSv1 has smaller mean bias and RMSE, along with better correlations and indices of agreement, than CFSv2 for predicting winter precipitation over India. In addition, CFSv1 also has a higher probability of detection in predicting the different categories (normal, excess and deficit) of observed winter precipitation over India.
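
    The skill measures quoted above (mean bias, RMSE, correlation, index of agreement) are standard verification statistics. The sketch below computes them for a hypothetical hindcast series; the numbers are illustrative, not the study's data.

        import numpy as np

        def verification_stats(obs, fcst):
            # mean bias, RMSE, correlation and Willmott's index of agreement
            obs, fcst = np.asarray(obs, float), np.asarray(fcst, float)
            bias = np.mean(fcst - obs)
            rmse = np.sqrt(np.mean((fcst - obs) ** 2))
            corr = np.corrcoef(obs, fcst)[0, 1]
            ioa = 1.0 - np.sum((fcst - obs) ** 2) / np.sum(
                (np.abs(fcst - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
            return bias, rmse, corr, ioa

        # hypothetical all-India winter precipitation series (mm/day) for a set of hindcast years
        obs  = np.array([1.2, 0.8, 1.5, 0.6, 1.1, 0.9, 1.4])
        cfs  = np.array([1.0, 0.7, 1.3, 0.8, 1.0, 1.1, 1.2])
        print(verification_stats(obs, cfs))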

  8. UPDATES AND EVALUATION OF THE PX-LSM IN MM5

    EPA Science Inventory

    Starting with Version 3.4, there is a new land surface model known as the Pleim-Xiu LSM available in the MM5 system. Pleim and Xiu (1995) described the initial development and testing of this land surface model, and workshop proceedings provided a basic description of the model and s...

  9. Experimenting with the GMAO 4D Data Assimilation

    NASA Technical Reports Server (NTRS)

    Todling, R.; El Akkraoui, A.; Errico, R. M.; Guo, J.; Kim, J.; Kliest, D.; Parrish, D. F.; Suarez, M.; Trayanov, A.; Tremolet, Yannick

    2012-01-01

    The Global Modeling and Assimilation Office (GMAO) has been working to promote its prototype four-dimensional variational (4DVAR) system to a version that can be exercised at operationally desirable configurations. Beyond a general circulation model (GCM) and an analysis system, traditional 4DVAR requires availability of tangent linear (TL) and adjoint (AD) models of the corresponding GCM. The GMAO prototype 4DVAR uses the finite-volume-based GEOS GCM and the Grid-point Statistical Interpolation (GSI) system for the first two, and TL and AD models derived from an early version of the finite-volume hydrodynamics that is scientifically equivalent to the present GEOS nonlinear GCM but computationally rather outdated. Specifically, the TL and AD models' hydrodynamics uses a simple (1-dimensional) latitudinal MPI domain decomposition, which has consequent low scalability and prevents the prototype 4DVAR from being used in realistic applications. In the near future, GMAO will be upgrading its operational GEOS GCM (and assimilation system) to use a cubed-sphere-based hydrodynamics. This version of the dynamics scales to thousands of processes and has led to a decision to re-derive the TL and AD models for this more modern dynamics, thus taking advantage of a two-dimensional MPI decomposition and improved scalability properties. With the aid of the Transformation of Algorithms in FORTRAN (TAF) automatic adjoint generation tool and some hand-coding, a version of the cubed-sphere-based TL and AD models, with a simplified vertical diffusion scheme, is now available, enabling multiple configurations of standard implementations of 4DVAR in GEOS. Concurrent to this development, collaboration with the National Centers for Environmental Prediction (NCEP) and the Earth System Research Laboratory (ESRL) has allowed GMAO to implement a hybrid-ensemble capability within the GEOS data assimilation system. Both 3D- and 4D-ensemble capabilities are presently available, thus allowing GMAO to now evaluate the performance and benefit of various ensemble and variational assimilation strategies. This presentation will cover the most recent developments taking place at GMAO and show results from various comparisons from traditional techniques to more recent ensemble-based ones.
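
    For orientation, the role of the TL and AD models follows from the standard strong-constraint 4DVAR cost function (the textbook form, in LaTeX notation, not necessarily the exact GMAO formulation):

        J(x_0) = \tfrac{1}{2}\,(x_0 - x_b)^{T} B^{-1} (x_0 - x_b)
                 + \tfrac{1}{2}\sum_{k} \bigl(H_k(M_{0\to k}(x_0)) - y_k\bigr)^{T} R_k^{-1} \bigl(H_k(M_{0\to k}(x_0)) - y_k\bigr)

    Here x_b is the background state, B and R_k the background and observation error covariances, M_{0->k} the forecast model and H_k the observation operators. The tangent linear model provides the linearization of M_{0->k} used in the inner-loop minimization, and the adjoint model propagates the observation-term gradients back to the initial time to form the gradient of J, which is why re-deriving the TL and AD models for the cubed-sphere dynamics is a prerequisite for a scalable 4DVAR.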

  10. MULTIPLE PROJECTIONS SYSTEM (MPS): USER'S MANUAL VERSION 2.0

    EPA Science Inventory

    The document is a user's manual for Multiple Projections System (MPS) Version 2.0, based on the 3% reasonable further progress (RFP) tracking system that was developed in FY92/FY93. The 3% RFP tracking system is a Windows application, and enhancements to convert the 3% RFP track...

  11. Design and fabrication of complete dentures using CAD/CAM technology

    PubMed Central

    Han, Weili; Li, Yanfeng; Zhang, Yue; lv, Yuan; Zhang, Ying; Hu, Ping; Liu, Huanyue; Ma, Zheng; Shen, Yi

    2017-01-01

    The aim of the study was to test the feasibility of using commercially available computer-aided design and computer-aided manufacturing (CAD/CAM) technology, including the 3Shape Dental System 2013 trial version, WIELAND V2.0.049 and the WIELAND ZENOTEC T1 milling machine, to design and fabricate complete dentures. The full-denture modeling process available in the trial version of 3Shape Dental System 2013 was used to design virtual complete dentures on the basis of 3-dimensional (3D) digital edentulous models generated from the physical models. The designed virtual complete dentures were exported to the CAM software WIELAND V2.0.049. A WIELAND ZENOTEC T1 milling machine controlled by the CAM software was used to fabricate physical dentitions and baseplates by milling acrylic resin composite plates. The physical dentitions were bonded to the corresponding baseplates to form the maxillary and mandibular complete dentures. Virtual complete dentures were successfully designed using the software through several steps including generation of 3D digital edentulous models, model analysis, arrangement of artificial teeth, trimming of the relief area, and occlusal adjustment. Physical dentitions and baseplates were successfully fabricated according to the designed virtual complete dentures using the milling machine controlled by the CAM software. Bonding the physical dentitions to the corresponding baseplates generated the final physical complete dentures. Our study demonstrated that complete dentures could be successfully designed and fabricated by using CAD/CAM. PMID:28072686

  12. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.

  13. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous version of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
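
    To make the rule-based paradigm described in this record more concrete, the sketch below shows a toy forward-chaining match-act cycle: a rule fires when its conditions match facts in working memory and may assert new facts, which can in turn trigger further rules. It is written in plain Python rather than CLIPS syntax or the CLIPS embedding API, and the facts and rule are invented purely for illustration.

        # Toy forward-chaining sketch of the match-act cycle of a production
        # system; illustrative Python only, not CLIPS syntax or its C API.

        facts = {("duck", "daffy"), ("animal-sound", "daffy", "quack")}

        def identify_duck(fs):
            # IF (duck ?x) AND (animal-sound ?x quack) THEN assert (is-a-duck ?x)
            return {("is-a-duck", f[1]) for f in fs
                    if f[0] == "duck" and ("animal-sound", f[1], "quack") in fs}

        rules = [("identify-duck", identify_duck)]

        changed = True
        while changed:                       # fire rules until no new facts are asserted
            changed = False
            for name, rule in rules:
                new_facts = rule(facts) - facts
                if new_facts:
                    print("rule", name, "asserted", new_facts)
                    facts |= new_facts
                    changed = True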

  14. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases.
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.

  15. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (DEC VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Donnell, B.

    1994-01-01

    CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases.
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in MicroSoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and MicroSoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.

  16. 78 FR 32224 - Availability of Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... Version 3.1.2 of the Connect America Fund Phase II Cost Model; Additional Discussion Topics in Connect America Cost Model Virtual Workshop AGENCY: Federal Communications Commission. ACTION: Proposed rule... America Cost Model (CAM v3.1.2), which allows Commission staff and interested parties to calculate costs...

  17. The Tangent Linear and Adjoint of the FV3 Dynamical Core: Development and Applications

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel

    2018-01-01

    GMAO (NASA's Global Modeling and Assimilation Office) has developed a highly sophisticated adjoint modeling system based on the most recent version of the finite-volume cubed-sphere (FV3) dynamical core. This provides a mechanism for investigating sensitivity to initial conditions and examining observation impacts. It also allows for the computation of singular vectors and for the implementation of hybrid 4DVAR (4-Dimensional Variational Assimilation). In this work we will present the scientific assessment of the new adjoint system and show results from a number of its research applications.
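
    As background for what an adjoint of a dynamical core provides (a generic statement, not a description of the FV3 implementation itself): if M denotes the tangent linear propagator of the forecast model and J a scalar forecast aspect, one backward integration of the adjoint yields the gradient of J with respect to the entire initial state.

        \[
        \delta J \approx \left\langle \frac{\partial J}{\partial \mathbf{x}_t},\, \mathbf{M}\,\delta\mathbf{x}_0 \right\rangle
                 = \left\langle \mathbf{M}^{\mathrm{T}} \frac{\partial J}{\partial \mathbf{x}_t},\, \delta\mathbf{x}_0 \right\rangle
        \qquad\Longrightarrow\qquad
        \frac{\partial J}{\partial \mathbf{x}_0} = \mathbf{M}^{\mathrm{T}}\, \frac{\partial J}{\partial \mathbf{x}_t}.
        \]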

  18. Spatio-energetic cross-talk in photon counting detectors: Numerical detector model (PcTK) and workflow for CT image quality assessment.

    PubMed

    Taguchi, Katsuyuki; Stierstorfer, Karl; Polster, Christoph; Lee, Okkyun; Kappler, Steffen

    2018-05-01

    The interpixel cross-talk of energy-sensitive photon counting x-ray detectors (PCDs) has been studied and an analytical model (version 2.1) has been developed for double-counting between neighboring pixels due to charge sharing and K-shell fluorescence x-ray emission followed by its reabsorption (Taguchi K, et al., Medical Physics 2016;43(12):6386-6404). While the model version 2.1 simulated the spectral degradation well, it had the following problems that have been found to be significant recently: (1) the spectrum is inaccurate with smaller pixel sizes; (2) the charge cloud size must be smaller than the pixel size; (3) the model underestimates the spectrum/counts for 10-40 keV; and (4) the model version 2.1 cannot handle n-tuple counting with n > 2 (i.e., triple-counting or higher). These problems are inherent to the design of the model version 2.1; therefore, we developed a new model and addressed these problems in this study. We propose a new PCD cross-talk model (version 3.2; PcTK for "photon counting toolkit") that is based on a completely different design concept from the previous version. It uses a numerical approach and starts with a 2-D model of charge sharing (as opposed to an analytical approach and a 1-D model with version 2.1) and addresses all four problems. The model takes the following factors into account: (1) shift-variant electron density of the charge cloud (Gaussian-distributed), (2) detection efficiency, (3) interactions between photons and PCDs via the photoelectric effect, and (4) electronic noise. Correlated noisy PCD data can be generated using either a multivariate normal random number generator or a Poisson random number generator. The effect of the two parameters, the effective charge cloud diameter (d0) and pixel size (dpix), was studied and results were compared with Monte Carlo simulations and the previous model version 2.1. Finally, a script for the workflow for CT image quality assessment has been developed, which started with a few material density images, generated material-specific sinogram (line integral) data and noisy PCD data with spectral distortion using the model version 3.2, and reconstructed PCD-CT images for four energy windows. The model version 3.2 addressed all four problems listed above. The spectra with dpix = 56-113 μm agreed qualitatively with those of the Medipix3 detector with dpix = 55-110 μm without charge summing mode. The counts for 10-40 keV were larger than those of the previous model (version 2.1) and agreed with MC simulations very well (root-mean-square difference values with model version 3.2 were decreased to 16%-67% of the values with version 2.1). There were many non-zero off-diagonal elements for n-tuple counting with n > 2 in the normalized covariance matrix of 3 × 3 neighboring pixels. Reconstructed images showed biases and artifacts attributed to the spectral distortion due to charge sharing and fluorescence x-rays. We have developed a new PCD model for spatio-energetic cross-talk and correlation between PCD pixels. The workflow demonstrated the utility of the model for general or task-specific image quality assessments for PCD-CT. Note: the program (PcTK) and the workflow scripts have been made available to academic researchers. Interested readers should visit the website (pctk.jhu.edu) or contact the corresponding author. © 2018 American Association of Physicists in Medicine.
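
    The abstract notes that correlated noisy PCD data can be generated with either a multivariate normal or a Poisson random number generator. The following numpy sketch shows the multivariate-normal route for a 3 x 3 pixel neighborhood and one energy window; the mean counts and covariance values are made up for illustration and are not the PcTK covariance model.

        import numpy as np

        # Illustrative sketch: draw correlated photon counts for a 3x3 pixel
        # neighborhood from a multivariate normal whose covariance encodes
        # cross-talk. Values below are hypothetical; the actual means and
        # covariances would come from the PcTK model (version 3.2).

        rng = np.random.default_rng(0)

        mean_counts = np.full(9, 1000.0)          # expected counts per pixel
        cov = np.diag(mean_counts)                # Poisson-like variance on the diagonal
        for i in range(9):
            for j in range(9):
                if i != j:
                    cov[i, j] = 50.0              # hypothetical positive cross-talk covariance

        samples = rng.multivariate_normal(mean_counts, cov, size=10000)
        samples = np.clip(np.rint(samples), 0, None)   # counts are non-negative integers

        print("empirical correlation, pixel 0 vs 1:",
              np.corrcoef(samples[:, 0], samples[:, 1])[0, 1])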

  19. Final Design Documentation for the Wartime Personnel Assessment Model (WARPAM) (Version 1.0)

    DTIC Science & Technology

    1991-03-25

    FIGURE 2: WARPAM OPERATIONAL ARCHITECTURE ... WARPAM is programmed in FORTRAN 77, except for the CRC model which is...to enter directly into a specific model and utilize data currently in the system. The modular architecture of WARPAM is depicted in Figure 3

  20. IWR-MAIN Water Use Forecasting System. Version 5.1. User’s Manual and System Description

    DTIC Science & Technology

    1987-12-01

    Crosschecks for Input Data ... Organization of the IWR-MAIN System ... Example of Econometric Demand Model ... Example of Unit Use Coefficient ... (1) Internal Growth Models: The IWR-MAIN program contains a subroutine called GROWTH which uses econometric growth models based on ...

  1. Research accomplished at the Knowledge Based Systems Lab: IDEF3, version 1.0

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Menzel, Christopher P.; Mayer, Paula S. D.

    1991-01-01

    An overview is presented of the foundations and content of the evolving IDEF3 process flow and object state description capture method. This method is currently in beta test. Ongoing efforts in the formulation of formal semantics models for descriptions captured in the outlined form and in the actual application of this method can be expected to cause an evolution in the method language. A language is described for the representation of process and object state centered system description. IDEF3 is a scenario driven process flow modeling methodology created specifically for these types of descriptive activities.

  2. Extended frequency turbofan model

    NASA Technical Reports Server (NTRS)

    Mason, J. R.; Park, J. W.; Jaekel, R. F.

    1980-01-01

    The fan model was developed using two dimensional modeling techniques to add dynamic radial coupling between the core stream and the bypass stream of the fan. When incorporated into a complete TF-30 engine simulation, the fan model greatly improved compression system frequency response to planar inlet pressure disturbances up to 100 Hz. The improved simulation also matched engine stability limits at 15 Hz, whereas the one dimensional fan model required twice the inlet pressure amplitude to stall the simulation. With verification of the two dimensional fan model, this program formulated a high frequency F-100(3) engine simulation using row by row compression system characteristics. In addition to the F-100(3) remote splitter fan, the program modified the model fan characteristics to simulate a proximate splitter version of the F-100(3) engine.

  3. Extension of the MIRS computer package for the modeling of molecular spectra: From effective to full ab initio ro-vibrational Hamiltonians in irreducible tensor form

    NASA Astrophysics Data System (ADS)

    Nikitin, A. V.; Rey, M.; Champion, J. P.; Tyuterev, Vl. G.

    2012-07-01

    The MIRS software for the modeling of ro-vibrational spectra of polyatomic molecules was considerably extended and improved. The original version [Nikitin AV, Champion JP, Tyuterev VlG. The MIRS computer package for modeling the rovibrational spectra of polyatomic molecules. J Quant Spectrosc Radiat Transf 2003;82:239-49.] was especially designed for separate or simultaneous treatments of complex band systems of polyatomic molecules. It was set up in the frame of effective polyad models by using algorithms based on advanced group theory algebra to take full account of symmetry properties. It has been successfully used for predictions and data fitting (positions and intensities) of numerous spectra of symmetric and spherical top molecules within the vibration extrapolation scheme. The new version offers more advanced possibilities for spectra calculations and modeling by getting rid of several previous limitations particularly for the size of polyads and the number of tensors involved. It allows dealing with overlapping polyads and includes more efficient and faster algorithms for the calculation of coefficients related to molecular symmetry properties (6C, 9C and 12C symbols for C3v, Td, and Oh point groups) and for better convergence of least-square-fit iterations as well. The new version is not limited to polyad effective models. It also allows direct predictions using full ab initio ro-vibrational normal mode Hamiltonians converted into the irreducible tensor form. Illustrative examples on CH3D, CH4, CH3Cl, CH3F and PH3 are reported reflecting the present status of data available. It is written in C++ for standard PC computer operating under Windows. The full package including on-line documentation and recent data are freely available at http://www.iao.ru/mirs/mirs.htm or http://xeon.univ-reims.fr/Mirs/ or http://icb.u-bourgogne.fr/OMR/SMA/SHTDS/MIRS.html and as supplementary data from the online version of the article.

  4. IMPLEMENTATION OF THE SMOKE EMISSION DATA PROCESSOR AND SMOKE TOOL INPUT DATA PROCESSOR IN MODELS-3

    EPA Science Inventory

    The U.S. Environmental Protection Agency has implemented Version 1.3 of SMOKE (Sparse Matrix Object Kernel Emission) processor for preparation of area, mobile, point, and biogenic sources emission data within Version 4.1 of the Models-3 air quality modeling framework. The SMOK...

  5. Landfill Gas Energy Cost Model Version 3.0 (LFGcost-Web V3 ...

    EPA Pesticide Factsheets

    To help stakeholders estimate the costs of a landfill gas (LFG) energy project, in 2002, LMOP developed a cost tool (LFGcost). Since then, LMOP has routinely updated the tool to reflect changes in the LFG energy industry. Initially the model was designed for EPA to assist landfills in evaluating the economic and financial feasibility of LFG energy project development. In 2014, LMOP developed a public version of the model, LFGcost-Web (Version 3.0), to allow landfill and industry stakeholders to evaluate project feasibility on their own. LFGcost-Web can analyze costs for 12 energy recovery project types. These project costs can be estimated with or without the costs of a gas collection and control system (GCCS). The EPA used select equations from LFGcost-Web to estimate costs of the regulatory options in the 2015 proposed revisions to the MSW Landfills Standards of Performance (also known as New Source Performance Standards) and the Emission Guidelines (herein thereafter referred to collectively as the Landfill Rules). More specifically, equations derived from LFGcost-Web were applied to each landfill expected to be impacted by the Landfill Rules to estimate annualized installed capital costs and annual O&M costs of a gas collection and control system. In addition, after applying the LFGcost-Web equations to the list of landfills expected to require a GCCS in year 2025 as a result of the proposed Landfill Rules, the regulatory analysis evaluated whether electr
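
    The annualization of installed capital costs mentioned above is typically done with a capital recovery factor. The sketch below shows that generic calculation; the discount rate, lifetime, and cost figures are placeholders, not the actual LFGcost-Web equations or inputs.

        # Generic capital-recovery-factor annualization, as commonly used when
        # converting an installed capital cost into an equivalent annual cost.
        # Numbers are placeholders, not values from LFGcost-Web.

        def capital_recovery_factor(rate: float, years: int) -> float:
            """CRF = r(1+r)^n / ((1+r)^n - 1)."""
            growth = (1.0 + rate) ** years
            return rate * growth / (growth - 1.0)

        installed_capital = 2_500_000.0     # $, hypothetical GCCS capital cost
        annual_om = 150_000.0               # $/yr, hypothetical O&M cost
        crf = capital_recovery_factor(rate=0.07, years=15)

        total_annual_cost = installed_capital * crf + annual_om
        print(f"annualized cost: ${total_annual_cost:,.0f} per year")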

  6. An implicit dispersive transport algorithm for the US Geological Survey MOC3D solute-transport model

    USGS Publications Warehouse

    Kipp, K.L.; Konikow, Leonard F.; Hornberger, G.Z.

    1998-01-01

    This report documents an extension to the U.S. Geological Survey MOC3D transport model that incorporates an implicit-in-time difference approximation for the dispersive transport equation, including source/sink terms. The original MOC3D transport model (Version 1) uses the method of characteristics to solve the transport equation on the basis of the velocity field. The original MOC3D solution algorithm incorporates particle tracking to represent advective processes and an explicit finite-difference formulation to calculate dispersive fluxes. The new implicit procedure eliminates several stability criteria required for the previous explicit formulation. This allows much larger transport time increments to be used in dispersion-dominated problems. The decoupling of advective and dispersive transport in MOC3D, however, is unchanged. With the implicit extension, the MOC3D model is upgraded to Version 2. A description of the numerical method of the implicit dispersion calculation, the data-input requirements and output options, and the results of simulator testing and evaluation are presented. Version 2 of MOC3D was evaluated for the same set of problems used for verification of Version 1. These test results indicate that the implicit calculation of Version 2 matches the accuracy of Version 1, yet is more efficient than the explicit calculation for transport problems that are characterized by a grid Peclet number less than about 1.0.
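
    The stability advantage of the implicit formulation can be illustrated with a generic one-dimensional dispersion step (backward Euler). This is a textbook sketch, not MOC3D code; the grid, coefficients, and boundary treatment are arbitrary.

        import numpy as np

        # Backward-Euler (implicit) step for 1-D dispersion dc/dt = D d2c/dx2.
        # Unlike the explicit update, the implicit solve remains stable for any
        # time step, which is the motivation cited for MOC3D Version 2.
        # Textbook sketch with arbitrary values, not MOC3D source code.

        nx, dx, D, dt = 50, 1.0, 2.0, 10.0           # dt far above the explicit limit dx^2/(2D)
        c = np.zeros(nx)
        c[nx // 2] = 1.0                             # initial concentration spike

        r = D * dt / dx**2
        A = (np.eye(nx) * (1.0 + 2.0 * r)
             - np.eye(nx, k=1) * r
             - np.eye(nx, k=-1) * r)                 # tridiagonal system matrix
        A[0, :2] = [1.0, 0.0]                        # simple Dirichlet boundaries
        A[-1, -2:] = [0.0, 1.0]

        for _ in range(5):
            c = np.linalg.solve(A, c)                # one implicit time step

        print("peak concentration after 5 implicit steps:", c.max())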

  7. The Brazilian developments on the Regional Atmospheric Modeling System (BRAMS 5.2): an integrated environmental model tuned for tropical areas

    NASA Astrophysics Data System (ADS)

    Freitas, Saulo R.; Panetta, Jairo; Longo, Karla M.; Rodrigues, Luiz F.; Moreira, Demerval S.; Rosário, Nilton E.; Silva Dias, Pedro L.; Silva Dias, Maria A. F.; Souza, Enio P.; Freitas, Edmilson D.; Longo, Marcos; Frassoni, Ariane; Fazenda, Alvaro L.; Silva, Cláudio M. Santos e.; Pavani, Cláudio A. B.; Eiras, Denis; França, Daniela A.; Massaru, Daniel; Silva, Fernanda B.; Santos, Fernando C.; Pereira, Gabriel; Camponogara, Gláuber; Ferrada, Gonzalo A.; Campos Velho, Haroldo F.; Menezes, Isilda; Freire, Julliana L.; Alonso, Marcelo F.; Gácita, Madeleine S.; Zarzur, Maurício; Fonseca, Rafael M.; Lima, Rafael S.; Siqueira, Ricardo A.; Braz, Rodrigo; Tomita, Simone; Oliveira, Valter; Martins, Leila D.

    2017-01-01

    We present a new version of the Brazilian developments on the Regional Atmospheric Modeling System (BRAMS), in which different previous versions for weather, chemistry, and carbon cycle were unified in a single integrated modeling system software. This new version also has a new set of state-of-the-art physical parameterizations and greater computational parallel and memory usage efficiency. The description of the main model features includes several examples illustrating the quality of the transport scheme for scalars, radiative fluxes on surface, and model simulation of rainfall systems over South America at different spatial resolutions using a scale aware convective parameterization. Additionally, the simulation of the diurnal cycle of the convection and carbon dioxide concentration over the Amazon Basin, as well as carbon dioxide fluxes from biogenic processes over a large portion of South America, are shown. Atmospheric chemistry examples show the model performance in simulating near-surface carbon monoxide and ozone in the Amazon Basin and the megacity of Rio de Janeiro. For tracer transport and dispersion, the model capabilities to simulate the volcanic ash 3-D redistribution associated with the eruption of a Chilean volcano are demonstrated. The gain of computational efficiency is described in some detail. BRAMS has been applied for research and operational forecasting mainly in South America. Model results from the operational weather forecast of BRAMS on 5 km grid spacing in the Center for Weather Forecasting and Climate Studies, INPE/Brazil, since 2013 are used to quantify the model skill of near-surface variables and rainfall. The scores show the reliability of BRAMS for the tropical and subtropical areas of South America. Requirements for keeping this modeling system competitive regarding both its functionalities and skills are discussed. Finally, we highlight the relevant contribution of this work to building a South American community of model developers.

  8. Overview and Evaluation of the Community Multiscale Air ...

    EPA Pesticide Factsheets

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In late 2016 or early 2017, CMAQ version 5.2 will be released. This new version of CMAQ will contain important updates from the current CMAQv5.1 modeling system, along with several instrumented versions of the model (e.g. decoupled direct method and sulfur tracking). Some specific model updates include the implementation of a new wind-blown dust treatment in CMAQv5.2, a significant improvement over the treatment in v5.1, which can severely overestimate wind-blown dust under certain conditions. Several other major updates to the modeling system include an update to the calculation of aerosols; implementation of full halogen chemistry (CMAQv5.1 contains a partial implementation of halogen chemistry); the new carbon bond 6 (CB6) chemical mechanism; updates to the cloud model in CMAQ; and a new lightning assimilation scheme for the WRF model which significantly improves the placement and timing of convective precipitation in the WRF precipitation fields. Numerous other updates to the modeling system will also be available in v5.2.

  9. The Systems Biology Markup Language (SBML) Level 3 Package: Qualitative Models, Version 1, Release 1.

    PubMed

    Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš

    2015-09-04

    Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
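
    As an illustration of the kind of model the qual package is meant to encode, the sketch below performs synchronous updates of a small, hand-written Boolean (logical) network. The species and update rules are invented, and no SBML parsing or libsbml call is involved.

        # Hand-written Boolean (logical) network of the kind the SBML Level 3
        # "qual" package encodes; illustrative only, no SBML involved.

        rules = {
            # each qualitative species gets a Boolean update function of the state
            "A": lambda s: s["C"],                 # A is activated by C
            "B": lambda s: s["A"] and not s["C"],  # B needs A and absence of C
            "C": lambda s: not s["B"],             # B represses C
        }

        state = {"A": False, "B": False, "C": True}

        for step in range(6):                      # synchronous updates
            state = {species: f(state) for species, f in rules.items()}
            print(step, state)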

  10. U.S. GODAE: Global Ocean Prediction with the HYbrid Coordinate Ocean Model

    DTIC Science & Technology

    2008-09-30

    major contributors to the strength of the Gulf Stream, (1) the wind forcing, (2) the Atlantic meridional overturning circulation (AMOC), and (3) a...convergence and sensitivity studies with North Atlantic circulation models. Part I. The western boundary current system. Ocean Model., 16, 141-159...a baroclinic version of ADvanced CIRCulation (ADCIRC), the latter an unstructured grid model for baroclinic coastal/estuarian applications. NCOM is

  11. Comparison of updates to the Molly cow model to predict methane production from dairy cows fed pasture.

    PubMed

    Gregorini, P; Beukes, P C; Hanigan, M D; Waghorn, G; Muetzel, S; McNamara, J P

    2013-08-01

    Molly is a deterministic, mechanistic, dynamic model representing the digestion, metabolism, and production of a dairy cow. This study compared the predictions of enteric methane production from the original version of Molly (MollyOrigin) and 2 new versions of Molly. Updated versions included new ruminal fiber digestive parameters and animal hormonal parameters (Molly84) and a revised version of digestive and ruminal parameters (Molly85), using 3 different ruminal volatile fatty acid (VFA) stoichiometry constructs to describe the VFA pattern and methane (CH4) production (g of CH4/d). The VFA stoichiometry constructs were the original forage and mixed-diet VFA constructs and a new VFA stoichiometry based on a more recent and larger set of data that includes lactate and valerate production, amylolytic and cellulolytic bacteria, as well as protozoal pools. The models' outputs were challenged using data from 16 dairy cattle 26 mo old [standard error of the mean (SEM)=1.7], 82 (SEM=8.7) d in milk, producing 17 (SEM=0.2) kg of milk/d, and fed fresh-cut ryegrass [dry matter intake=12.3 (SEM=0.3) kg of DM/d] in respiration chambers. Mean observed CH4 production was 266±5.6 SEM (g/d). Mean predicted values for CH4 production were 287 and 258 g/d for MollyOrigin without and with the new VFA construct. Model Molly84 predicted 295 and 288 g of CH4/d with and without the new VFA settings. Model Molly85 predicted the same CH4 production (276 g/d) with or without the new VFA construct. The incorporation of the new VFA construct did not consistently reduce the low prediction error across the versions of Molly evaluated in the present study. The improvements in the Molly versions from MollyOrigin to Molly84 to Molly85 resulted in a decrease in mean square prediction error from 8.6 to 8.3 to 4.3% using the forage diet setting. The majority of the mean square prediction error was apportioned to random bias (e.g., 43, 65, and 70% in MollyOrigin, Molly84, and Molly85, respectively, on the forage setting, showing that with the updated versions a greater proportion of error was random). The slope bias was less than 2% in all cases. We concluded that, of the versions of Molly used for pastoral systems, Molly85 has the capability to predict CH4 production from grass-fed dairy cows with the highest accuracy. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
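
    The error accounting referred to above follows the usual partition of mean square prediction error into mean bias, slope bias, and random components. The sketch below shows that generic decomposition with made-up observed and predicted values; it is not the data from the Molly evaluation.

        import numpy as np

        # Generic decomposition of mean square prediction error (MSPE) into mean
        # bias, slope bias, and random components; made-up numbers, not the
        # Molly evaluation data.

        observed  = np.array([250., 260., 270., 275., 268., 262.])
        predicted = np.array([255., 258., 280., 270., 275., 260.])

        error = predicted - observed
        mspe = np.mean(error ** 2)

        mean_bias = np.mean(error) ** 2
        slope = np.polyfit(predicted, observed, 1)[0]      # regression of observed on predicted
        slope_bias = (1.0 - slope) ** 2 * np.var(predicted)
        random_part = mspe - mean_bias - slope_bias        # remainder is the random component

        print("MSPE:", mspe)
        print("fractions  mean bias: %.2f  slope bias: %.2f  random: %.2f"
              % (mean_bias / mspe, slope_bias / mspe, random_part / mspe))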

  12. BIOSCREEN: NATURAL ATTENTUATION DECISION SUPPORT SYSTEM - USER'S MANUAL, VERSION 1.3

    EPA Science Inventory

    BIOSCREEN is an easy-to-use screening model which simulates remediation through natural attenuation (RNA) of dissolved hydrocarbons at petroleum fuel release sites. The software, programmed in the Microsoft Excel spreadsheet environment and based on the Domenico analytical solu...

  13. U. S. GODAE: Global Ocean Prediction with the HYbrid Coordinate Ocean Model

    DTIC Science & Technology

    2009-01-01

    2008). There are three major contributors to the strength of the Gulf Stream, (1) the wind forcing, (2) the Atlantic meridional overturning ...Smith, 2007. Resolution convergence and sensitivity studies with North Atlantic circulation models. Part I. The western boundary current system...σ-z coordinates, and (3) a baroclinic version of ADvanced CIRCulation (ADCIRC), the latter an unstructured grid model for baroclinic coastal

  14. Adaptive Interfaces

    DTIC Science & Technology

    1990-11-01

    to design and implement an adaptive intelligent interface for a command-and-control-style domain. The primary functionality of the resulting...technical tasks, as follows: 1. Analysis of Current Interface Technologies 2. Delineation of User Roles 3. Development of User Models 4. Design of Interface...Management Association (FEMA). In the initial version of the prototype, two distinct user models were designed. One type of user modeled by the system is

  15. Recent developments of DMI's operational system: Coupled Ecosystem-Circulation-and SPM model.

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Tian, Tian; Dobrynin, Mikhail

    2010-05-01

    ECOOP is a pan-European project with 72 partners from 29 countries around the Baltic Sea, the North Sea, the Iberia-Biscay-Ireland region, the Mediterranean Sea and the Black Sea. The project aims at the development and the integration of the different coastal and regional observation and forecasting systems. The Danish Meteorological Institute (DMI) coordinates the project and is responsible for the Baltic Sea regional forecasting system. Over the project period, the Baltic Sea system was developed from a purely hydrodynamical model (version V1), running operationally since summer 2009, to a coupled model platform (version V2), including model components for the simulation of suspended particles, data assimilation and ecosystem variables. The ECOOP V2 model is currently being tested and validated, and will replace the V1 version soon. The coupled biogeochemical and circulation model has run operationally since November 2009. The daily forecasts are presented at DMI's homepage http://ocean.dmi.dk. The presentation includes a short description of the ECOOP forecasting system, discusses the model results and shows the outcome of the model validation.

  16. Thermosphere-Ionosphere-Mesosphere Modeling Using the TIME-GCM

    DTIC Science & Technology

    2014-09-30

    respectively. The CCM3 is the NCAR Community Climate Model, Version 3.6, a GCM of the troposphere and stratosphere. All models include self-consistent...middle atmosphere version of the NCAR Community Climate Model, (2) the NCAR TIME-GCM, and (3) the Model for Ozone and Related Chemical Tracers (MOZART) ... troposphere, but the impacts of such events extend well into the mesosphere. The coupled NCAR thermosphere-ionosphere-mesosphere-electrodynamics general

  17. Implementation of Advanced Two Equation Turbulence Models in the USM3D Unstructured Flow Solver

    NASA Technical Reports Server (NTRS)

    Wang, Qun-Zhen; Massey, Steven J.; Abdol-Hamid, Khaled S.

    2000-01-01

    USM3D is a widely-used unstructured flow solver for simulating inviscid and viscous flows over complex geometries. The current version (version 5.0) of USM3D, however, does not have advanced turbulence models to accurately simulate complicated flows. We have implemented two modified versions of the original Jones and Launder k-epsilon "two-equation" turbulence model and the Girimaji algebraic Reynolds stress model in USM3D. Tests have been conducted for three flat plate boundary layer cases, an RAE2822 airfoil, and an ONERA M6 wing. The results are compared with those from direct numerical simulation, empirical formulae, theoretical results, and the existing Spalart-Allmaras one-equation model.
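
    For reference, the standard high-Reynolds-number form of a k-epsilon two-equation model solves transport equations of the following general shape (written here generically; the modified Jones-Launder versions implemented in USM3D differ in details such as damping functions and coefficient values):

        \[
        \frac{\partial(\rho k)}{\partial t} + \frac{\partial(\rho u_j k)}{\partial x_j}
          = \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_k}\right)\frac{\partial k}{\partial x_j}\right]
          + P_k - \rho\varepsilon,
        \qquad
        \mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon},
        \]
        \[
        \frac{\partial(\rho\varepsilon)}{\partial t} + \frac{\partial(\rho u_j \varepsilon)}{\partial x_j}
          = \frac{\partial}{\partial x_j}\left[\left(\mu + \frac{\mu_t}{\sigma_\varepsilon}\right)\frac{\partial\varepsilon}{\partial x_j}\right]
          + C_{\varepsilon 1}\frac{\varepsilon}{k}P_k - C_{\varepsilon 2}\,\rho\,\frac{\varepsilon^2}{k}.
        \]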

  18. A Descriptive Evaluation of Automated Software Cost-Estimation Models,

    DTIC Science & Technology

    1986-10-01

    Version 1.03D) * PCOC (Version 7.01) * PRICE S * SLIM (Version 1.1) * SoftCost (Version 5.1) * SPQR/20 (Version 1.1) * WICOMO (Version 1.3) These...produce detailed Gantt and PERT charts. SPQR/20 is based on a cost model developed at ITT. In addition to cost, schedule, and staffing estimates, it...cases and test runs required, and the effectiveness of pre-test and test activities. SPQR/20 also predicts enhancement and maintenance activities. C

  19. Computational control of flexible aerospace systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed based on several incomplete versions. The verification of the code has been conducted by comparing the results with examples for which exact theoretical solutions can be obtained. The theoretical background of the package and the verification examples have been reported in a technical paper submitted to the Joint Applied Mechanics & Materials Conference, ASME. A brief USER'S MANUAL has been compiled, which includes three parts: (1) Input data preparation; (2) Explanation of the Subroutines; and (3) Specification of control variables. Meanwhile, a theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Codes.

  20. Comparison of the results of several heat transfer computer codes when applied to a hypothetical nuclear waste repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Claiborne, H.C.; Wagner, R.S.; Just, R.A.

    1979-12-01

    A direct comparison of transient thermal calculations was made with the heat transfer codes HEATING5, THAC-SIP-3D, ADINAT, SINDA, TRUMP, and TRANCO for a hypothetical nuclear waste repository. With the exception of TRUMP and SINDA (actually closer to the earlier CINDA3G version), the other codes agreed to within ±5% for the temperature rises as a function of time. The TRUMP results agreed within ±5% up to about 50 years, where the maximum temperature occurs, and then began oscillatory behavior with up to 25% deviations at longer times. This could have resulted from time steps that were too large or from some unknown system problems. The available version of the SINDA code was not compatible with the IBM compiler without using an alternative method for handling a variable thermal conductivity. The results were about 40% low, but a reasonable agreement was obtained by assuming a uniform thermal conductivity; however, a programming error was later discovered in the alternative method. Some work is required on the IBM version to make it compatible with the system and still use the recommended method of handling variable thermal conductivity. TRANCO can only be run as a 2-D model, and TRUMP and CINDA apparently required longer running times and did not agree in the 2-D case; therefore, only HEATING5, THAC-SIP-3D, and ADINAT were used for the 3-D model calculations. The codes agreed within ±5%; at distances of about 1 ft from the waste canister edge, temperature rises were also close to those predicted by the 3-D model.

  1. A geographic data model for representing ground water systems.

    PubMed

    Strassberg, Gil; Maidment, David R; Jones, Norm L

    2007-01-01

    The Arc Hydro ground water data model is a geographic data model for representing spatial and temporal ground water information within a geographic information system (GIS). The data model is a standardized representation of ground water systems within a spatial database that provides a public domain template for GIS users to store, document, and analyze commonly used spatial and temporal ground water data sets. This paper describes the data model framework, a simplified version of the complete ground water data model that includes two-dimensional and three-dimensional (3D) object classes for representing aquifers, wells, and borehole data, and the 3D geospatial context in which these data exist. The framework data model also includes tabular objects for representing temporal information such as water levels and water quality samples that are related with spatial features.
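
    A minimal sketch of the kind of schema the framework data model describes is given below: spatial feature classes related to tabular time-series objects. The class and field names are simplified and hypothetical, not the actual Arc Hydro ground water template.

        from dataclasses import dataclass, field
        from datetime import date
        from typing import List

        # Simplified, hypothetical schema in the spirit of the framework data
        # model: spatial feature classes plus related tabular time series.
        # Names are illustrative, not the Arc Hydro ground water template.

        @dataclass
        class WaterLevel:                 # tabular (temporal) object
            measured_on: date
            depth_to_water_m: float

        @dataclass
        class Well:                       # 2-D/3-D spatial feature
            well_id: str
            x: float
            y: float
            land_surface_elev_m: float
            aquifer_code: str
            water_levels: List[WaterLevel] = field(default_factory=list)

        w = Well("W-001", 621500.0, 3349800.0, 142.7, "AQ-UPPER")
        w.water_levels.append(WaterLevel(date(2006, 5, 14), 12.3))
        print(w.well_id, "latest depth to water:", w.water_levels[-1].depth_to_water_m, "m")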

  2. Impact of Three-Dimensional Printed Pelvicaliceal System Models on Residents' Understanding of Pelvicaliceal System Anatomy Before Percutaneous Nephrolithotripsy Surgery: A Pilot Study.

    PubMed

    Atalay, Hasan Anıl; Ülker, Volkan; Alkan, İlter; Canat, Halil Lütfi; Özkuvancı, Ünsal; Altunrende, Fatih

    2016-10-01

    To investigate the impact of three-dimensional (3D) printed pelvicaliceal system models on residents' understanding of pelvicaliceal system anatomy before percutaneous nephrolithotripsy (PCNL). Patients with unilateral complex renal stones indicating PCNL were selected. Usable data of patients were obtained from CT-scans in Digital Imaging and Communications in Medicine (DICOM) format. Mimics software version 16.0 (Materialise, Belgium) was used for segmentation and extraction of pelvicaliceal systems (PCSs). All DICOM-formatted files were converted to the stereolithography file format. Finally, fused deposition modeling was used to create plasticine 3D models of PCSs. A questionnaire was designed so that residents could assess the 3D models' effects on their understanding of the anatomy of the pelvicaliceal system before PCNL (Fig. 3). Five patients' anatomically accurate models of the human renal collecting system were effectively generated (Figs. 1 and 2). After presentation of the 3D models, residents were 86% and 88% better at determining the number of anterior and posterior calices, respectively, 60% better at understanding stone location, and 64% better at determining optimal entry calix into the collecting system (Fig. 5). Generating kidney models of PCSs using 3D printing technology is feasible, and the models were accepted by residents as aids in surgical planning and understanding of pelvicaliceal system anatomy before PCNL.

  3. Industrial Waste Management Evaluation Model Version 3.1

    EPA Pesticide Factsheets

    IWEM is a screening level ground water model designed to simulate contaminant fate and transport. IWEM v3.1 is the latest version of the IWEM software, which includes additional tools to evaluate the beneficial use of industrial materials

  4. The Brazilian Developments on the Regional Atmospheric Modeling System (BRAMS 5.2): An Integrated Environmental Model Tuned for Tropical Areas

    NASA Technical Reports Server (NTRS)

    Freitas, Saulo R.; Panetta, Jairo; Longo, Karla M.; Rodrigues, Luiz F.; Moreira, Demerval S.; Rosario, Nilton E.; Silva Dias, Pedro L.; Silva Dias, Maria A. F.; Souza, Enio P.; Freitas, Edmilson D.; hide

    2017-01-01

    We present a new version of the Brazilian developments on the Regional Atmospheric Modeling System in which different previous versions for weather, chemistry and carbon cycle were unified in a single integrated software system. The new version also has a new set of state-of-the-art physical parameterizations and greater computational parallel and memory usage efficiency. The description of the main features is accompanied by examples of the quality of the transport scheme for scalars, of radiative fluxes at the surface, and of model simulations of rainfall systems over South America at different spatial resolutions using a scale-aware convective parameterization. In addition, the simulation of the diurnal cycle of convection and carbon dioxide concentration over the Amazon Basin, as well as carbon dioxide fluxes from biogenic processes over a large portion of South America, are shown. Atmospheric chemistry examples present model performance in simulating near-surface carbon monoxide and ozone in the Amazon Basin and the megacity of Rio de Janeiro. For tracer transport and dispersion, the model's capability to simulate the 3-D redistribution of volcanic ash associated with the eruption of a Chilean volcano is demonstrated. The gain in computational efficiency is then described in some detail. BRAMS has been applied for research and operational forecasting mainly in South America. Model results from the operational weather forecast of BRAMS on 5 km grid spacing in the Center for Weather Forecasting and Climate Studies, INPE/Brazil, since 2013 are used to quantify the model skill of near-surface variables and rainfall. The scores show the reliability of BRAMS for the tropical and subtropical areas of South America. Requirements for keeping this modeling system competitive regarding both its functionalities and skills are discussed. Finally, we highlight the relevant contribution of this work to building a South American community of model developers.

  5. Object-oriented analysis and design of a health care management information system.

    PubMed

    Krol, M; Reich, D L

    1999-04-01

    We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) the Functional Model represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.
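
    A minimal sketch of the Object Model idea is shown below, with a small class hierarchy and a state attribute standing in for the Dynamic Model. The classes and fields are hypothetical illustrations, not the authors' model and not HL7 version 3 class definitions.

        # Hypothetical sketch of an object hierarchy for health care participants,
        # in the spirit of the Object Model described above; not the authors'
        # model and not HL7 version 3 classes.

        class Participant:
            def __init__(self, name: str):
                self.name = name

        class Patient(Participant):
            def __init__(self, name: str, mrn: str):
                super().__init__(name)
                self.mrn = mrn
                self.state = "registered"      # simple state attribute (Dynamic Model idea)

            def admit(self):
                self.state = "admitted"

        class Provider(Participant):
            def __init__(self, name: str, specialty: str):
                super().__init__(name)
                self.specialty = specialty

        p = Patient("Jane Doe", mrn="000123")
        p.admit()
        print(p.name, p.state)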

  6. Characteristics of the ocean simulations in the Max Planck Institute Ocean Model (MPIOM) the ocean component of the MPI-Earth system model

    NASA Astrophysics Data System (ADS)

    Jungclaus, J. H.; Fischer, N.; Haak, H.; Lohmann, K.; Marotzke, J.; Matei, D.; Mikolajewicz, U.; Notz, D.; von Storch, J. S.

    2013-06-01

    MPI-ESM is a new version of the global Earth system model developed at the Max Planck Institute for Meteorology. This paper describes the ocean state and circulation as well as basic aspects of variability in simulations contributing to the fifth phase of the Coupled Model Intercomparison Project (CMIP5). The performance of the ocean/sea-ice model MPIOM, coupled to a new version of the atmosphere model ECHAM6 and modules for land surface and ocean biogeochemistry, is assessed for two model versions with different grid resolution in the ocean. The low-resolution configuration has a nominal resolution of 1.5°, whereas the higher resolution version features a quasiuniform, eddy-permitting global resolution of 0.4°. The paper focuses on important oceanic features, such as surface temperature and salinity, water mass distribution, large-scale circulation, and heat and freshwater transports. In general, these integral quantities are simulated well in comparison with observational estimates, and improvements in comparison with the predecessor system are documented; for example, for tropical variability and sea ice representation. Introducing an eddy-permitting grid configuration in the ocean leads to improvements, in particular, in the representation of interior water mass properties in the Atlantic and in the representation of important ocean currents, such as the Agulhas and Equatorial current systems. In general, however, there are more similarities than differences between the two grid configurations, and several shortcomings, known from earlier versions of the coupled model, prevail.
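
    As an illustration of one of the integral quantities mentioned above, the meridional heat transport at a latitude can be estimated by integrating rho*cp*v*T over the zonal-vertical section; this is a generic sketch with placeholder arrays, not MPIOM code or output.

    ```python
    import numpy as np

    RHO0, CP = 1025.0, 3990.0    # reference density (kg/m^3) and heat capacity (J/kg/K)

    def meridional_heat_transport(v, T, dx, dz):
        """Heat transport (W) across one latitude: integral of rho*cp*v*T over
        longitude and depth. v and T are (nz, nx) sections; dx, dz are cell sizes."""
        return RHO0 * CP * np.sum(v * T * dx * dz)

    nz, nx = 40, 360
    v = np.full((nz, nx), 0.001)         # placeholder meridional velocity (m/s)
    T = np.full((nz, nx), 15.0)          # placeholder temperature (deg C)
    print(meridional_heat_transport(v, T, dx=100e3, dz=100.0) / 1e15, "PW")
    ```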

  7. Solving Navier-Stokes Equations with Advanced Turbulence Models on Three-Dimensional Unstructured Grids

    NASA Technical Reports Server (NTRS)

    Wang, Qun-Zhen; Massey, Steven J.; Abdol-Hamid, Khaled S.; Frink, Neal T.

    1999-01-01

    USM3D is a widely-used unstructured flow solver for simulating inviscid and viscous flows over complex geometries. The current version (version 5.0) of USM3D, however, does not have advanced turbulence models to accurately simulate complicated flows. We have implemented two modified versions of the original Jones and Launder k-epsilon two-equation turbulence model and the Girimaji algebraic Reynolds stress model in USM3D. Tests have been conducted for two flat plate boundary layer cases, a RAE2822 airfoil and an ONERA M6 wing. The results are compared with those of empirical formulae, theoretical results and the existing Spalart-Allmaras one-equation model.
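
    For context, the standard k-epsilon closure converts the two transported quantities, the turbulent kinetic energy k and its dissipation rate epsilon, into an eddy viscosity. The snippet shows the textbook relation only and not the modified formulations implemented in USM3D.

    ```python
    C_MU = 0.09   # standard k-epsilon model constant

    def eddy_viscosity(rho, k, eps, eps_floor=1e-12):
        """mu_t = C_mu * rho * k^2 / eps for the standard k-epsilon closure."""
        return C_MU * rho * k * k / max(eps, eps_floor)

    # Hypothetical boundary-layer values: air density, k in m^2/s^2, eps in m^2/s^3
    print(eddy_viscosity(rho=1.2, k=0.5, eps=10.0))
    ```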

  8. Simplifying HL7 Version 3 messages.

    PubMed

    Worden, Robert; Scott, Philip

    2011-01-01

    HL7 Version 3 offers a semantically robust method for healthcare interoperability but has been criticized as overly complex to implement. This paper reviews initiatives to simplify HL7 Version 3 messaging and presents a novel approach based on semantic mapping. Based on user-defined mappings, precise transforms between simple and full messages are automatically generated. Systems can be interfaced with the simple messages and achieve interoperability with full Version 3 messages through the transforms. This reduces the costs of HL7 interfacing and will encourage better uptake of HL7 Version 3 and CDA.
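
    A toy sketch of the idea of generating transforms between a flat "simple" message and a nested "full" message from a declarative mapping; the field paths are invented and bear no relation to real HL7 Version 3 structures.

    ```python
    from functools import reduce

    # Declarative mapping: simple field name -> path in the full message
    MAPPING = {
        "patient_id": ["recordTarget", "patientRole", "id"],
        "given_name": ["recordTarget", "patientRole", "patient", "given"],
    }

    def to_full(simple):
        """Expand a flat simple message into the nested full structure."""
        full = {}
        for key, path in MAPPING.items():
            node = full
            for part in path[:-1]:
                node = node.setdefault(part, {})
            node[path[-1]] = simple[key]
        return full

    def to_simple(full):
        """Collapse a nested full message back to the simple form."""
        return {key: reduce(lambda n, p: n[p], path, full)
                for key, path in MAPPING.items()}

    msg = {"patient_id": "12345", "given_name": "Ana"}
    assert to_simple(to_full(msg)) == msg
    ```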

  9. Precipitation Processes developed during ARM (1997), TOGA COARE (1992), GATE (1974), SCSMEX (1998) and KWAJEX (1999): Consistent 2D and 3D Cloud Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, W.-K.; Shie, C.-H.; Simpson, J.; Starr, D.; Johnson, D.; Sud, Y.

    2003-01-01

    Real clouds and cloud systems are inherently three dimensional (3D). Because of the limitations in computer resources, however, most cloud-resolving models (CRMs) today are still two-dimensional (2D). A few 3D CRMs have been used to study the response of clouds to large-scale forcing. In these 3D simulations, the model domain was small, and the integration time was 6 hours. Only recently have 3D experiments been performed for multi-day periods for tropical cloud systems with large horizontal domains at the National Center for Atmospheric Research. The results indicate that surface precipitation and latent heating profiles are very similar between the 2D and 3D simulations of these same cases. The reason for the strong similarity between the 2D and 3D CRM simulations is that the observed large-scale advective tendencies of potential temperature, water vapor mixing ratio, and horizontal momentum were used as the main forcing in both the 2D and 3D models. Interestingly, the 2D and 3D versions of the CRM used at CSU and the U.K. Met Office showed significant differences in the rainfall and cloud statistics for three ARM cases. The major objectives of this project are to calculate and examine: (1) the surface energy and water budgets, (2) the precipitation processes in the convective and stratiform regions, (3) the cloud upward and downward mass fluxes in the convective and stratiform regions, (4) cloud characteristics such as size, updraft intensity and lifetime, and (5) the entrainment and detrainment rates associated with clouds and cloud systems that developed in TOGA COARE, GATE, SCSMEX, ARM and KWAJEX. Of special note is that the analyzed (model generated) data sets are all produced by the same current version of the GCE model, i.e., consistent model physics and configurations. Trajectory analyses and inert tracer calculations will be conducted to identify the differences and similarities in the organization of convection between simulated 2D and 3D cloud systems.
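
    A schematic of how observed large-scale advective tendencies are typically imposed as forcing on a CRM column (a simplified sketch, not the GCE model's code; the profiles and tendency values are placeholders).

    ```python
    import numpy as np

    def apply_forcing(theta, q_v, dtheta_ls, dq_ls, dt):
        """Add prescribed large-scale advective tendencies (per second) to the
        potential temperature and water vapor profiles over one time step."""
        return theta + dtheta_ls * dt, q_v + dq_ls * dt

    nz = 50
    theta = np.linspace(300.0, 340.0, nz)          # placeholder profile (K)
    q_v = np.linspace(0.016, 0.0001, nz)           # placeholder mixing ratio (kg/kg)
    dtheta_ls = np.full(nz, -2.0 / 86400.0)        # -2 K/day large-scale cooling
    dq_ls = np.full(nz, 1.0e-3 / 86400.0)          # moistening, kg/kg per day
    theta, q_v = apply_forcing(theta, q_v, dtheta_ls, dq_ls, dt=10.0)
    ```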

  10. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Engineering Manual (Version 3). Volume 3.

    DTIC Science & Technology

    1983-09-01

    processor. However, upon completion of the restart initialization, additional commands may be added or original commands deleted with normal input... written; IOS1 - scratch logical unit designator; IOS1SV - saved value of IOS1; IOS2 - scratch logical unit designator; IR - index pointer to upper triangular matrix

  11. Simulation of modern climate with the new version of the INM RAS climate model

    NASA Astrophysics Data System (ADS)

    Volodin, E. M.; Mortikov, E. V.; Kostrykin, S. V.; Galin, V. Ya.; Lykosov, V. N.; Gritsun, A. S.; Diansky, N. A.; Gusev, A. V.; Yakovlev, N. G.

    2017-03-01

    The INMCM5.0 numerical model of the Earth's climate system is presented, which is an evolution from the previous version, INMCM4.0. A higher vertical resolution for the stratosphere is applied in the atmospheric block. We also raised the upper boundary of the computational domain, added an aerosol block, modified the parameterization of clouds and condensation, and increased the horizontal resolution in the ocean block. The program implementation of the model was also updated. We consider the simulation of the current climate using the new version of the model. Attention is focused on reducing systematic errors relative to the previous version, on reproducing phenomena that could not be simulated correctly in the previous version, and on problems that remain unresolved.

  12. AF-GEOSPACE Version 2.1

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Tautz, M.; Roth, C.

    2004-05-01

    AF-GEOSpace is a graphics-intensive software program with space environment models and applications developed and distributed by the Space Weather Center of Excellence at AFRL. A review of current (Version 2.0) and planned (Version 2.1) AF-GEOSpace capabilities will be given. A wide range of physical domains is represented enabling the software to address such things as solar disturbance propagation, radiation belt configuration, and ionospheric auroral particle precipitation and scintillation. The software is currently being used to aid with the design, operation, and simulation of a wide variety of communications, navigation, and surveillance systems. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; scientific model validation and comparison, physics research, and education. Version 2.0 provided a simplified graphical user interface, improved science and application modules, and significantly enhanced graphical performance. Common input data archive sets, application modules, and 1-D, 2-D, and 3-D visualization tools are provided to all models. Dynamic capabilities permit multiple environments to be generated at user-specified time intervals while animation tools enable displays such as satellite orbits and environment data together as a function of time. Building on the existing Version 2.0 software architecture, AF-GEOSpace Version 2.1 is currently under development and will include a host of new modules to provide, for example, geosynchronous charged particle fluxes, neutral atmosphere densities, cosmic ray cutoff maps, low-altitude trapped proton belt specification, and meteor shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 is being developed for Windows NT/2000/XP and Linux systems.

  13. Cohesive and mixed sediment in the Regional Ocean Modeling System (ROMS v3.6) implemented in the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System (COAWST r1234)

    NASA Astrophysics Data System (ADS)

    Sherwood, Christopher R.; Aretxabaleta, Alfredo L.; Harris, Courtney K.; Rinehimer, J. Paul; Verney, Romaric; Ferré, Bénédicte

    2018-05-01

    We describe and demonstrate algorithms for treating cohesive and mixed sediment that have been added to the Regional Ocean Modeling System (ROMS version 3.6), as implemented in the Coupled Ocean-Atmosphere-Wave-Sediment Transport Modeling System (COAWST Subversion repository revision 1234). These include the following: floc dynamics (aggregation and disaggregation in the water column); changes in floc characteristics in the seabed; erosion and deposition of cohesive and mixed (combination of cohesive and non-cohesive) sediment; and biodiffusive mixing of bed sediment. These routines supplement existing non-cohesive sediment modules, thereby increasing our ability to model fine-grained and mixed-sediment environments. Additionally, we describe changes to the sediment bed layering scheme that improve the fidelity of the modeled stratigraphic record. Finally, we provide examples of these modules implemented in idealized test cases and a realistic application.
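
    The floc algorithms themselves are documented in the paper; as a rough illustration of why floc size matters for deposition, a Stokes-type settling velocity for fractal flocs can be written as follows (the coefficients and fractal dimension are hypothetical values, not the COAWST settings).

    ```python
    G, MU, RHO_W, RHO_S = 9.81, 1.0e-3, 1000.0, 2650.0   # SI units

    def floc_settling_velocity(d_floc, d_primary=4e-6, nf=2.0):
        """Stokes-type settling velocity (m/s) of a fractal floc: the excess
        density decreases as the floc grows, so w_s scales as D_f**(nf - 1)."""
        excess = (RHO_S - RHO_W) * (d_primary / d_floc) ** (3.0 - nf)
        return excess * G * d_floc ** 2 / (18.0 * MU)

    for d in (20e-6, 100e-6, 500e-6):       # floc diameters in meters
        print(d, floc_settling_velocity(d))
    ```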

  14. EVALUATION OF THE COMMUNITY MULTISCALE AIR QUALITY (CMAQ) MODEL VERSION 4.5: UNCERTAINTIES AND SENSITIVITIES IMPACTING MODEL PERFORMANCE: PART I - OZONE

    EPA Science Inventory

    This study examines ozone (O3) predictions from the Community Multiscale Air Quality (CMAQ) model version 4.5 and discusses potential factors influencing the model results. Daily maximum 8-hr average O3 levels are largely underpredicted when observed O...

  15. Computing and Visualizing the Complex Dynamics of Earthquake Fault Systems: Towards Ensemble Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Rundle, P.; Donnellan, A.; Li, P.

    2003-12-01

    We consider the problem of the complex dynamics of earthquake fault systems, and whether numerical simulations can be used to define an ensemble forecasting technology similar to that used in weather and climate research. To effectively carry out such a program, we need 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike slip faults extending throughout California, from the Mexico-California border to the Mendocino Triple Junction. We use the historic data set of earthquakes larger than magnitude M > 6 to define the frictional properties of all 654 fault segments (degrees of freedom) in the model. Previous versions of Virtual California had used only 215 fault segments to model the strike slip faults in southern California. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a small Beowulf cluster consisting of 10 CPUs. We are also planning to run the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution, and to assess scaling properties of the code. We present results of simulations both as static images and as MPEG movies, so that the dynamical aspects of the computation can be assessed by the viewer. We also compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
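
    One of the statistics mentioned, the magnitude-frequency relation, can be estimated from a simulated or observed catalog with the Aki maximum-likelihood b-value; this is a generic sketch applied to a synthetic catalog, not Virtual California output.

    ```python
    import math
    import numpy as np

    def gutenberg_richter_b(magnitudes, m_min):
        """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b-value
        for all events at or above the completeness magnitude m_min."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= m_min]
        return math.log10(math.e) / (m.mean() - m_min)

    # Synthetic catalog with a true b-value of 1 above M 6 (exponential magnitudes)
    rng = np.random.default_rng(0)
    catalog = 6.0 + rng.exponential(scale=1.0 / math.log(10.0), size=5000)
    print(gutenberg_richter_b(catalog, m_min=6.0))   # close to 1.0
    ```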

  16. An international land-biosphere model benchmarking activity for the IPCC Fifth Assessment Report (AR5)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M; Randerson, James T; Thornton, Peter E

    2009-12-01

    The need to capture important climate feedbacks in general circulation models (GCMs) has resulted in efforts to include atmospheric chemistry and land and ocean biogeochemistry into the next generation of production climate models, called Earth System Models (ESMs). While many terrestrial and ocean carbon models have been coupled to GCMs, recent work has shown that such models can yield a wide range of results (Friedlingstein et al., 2006). This work suggests that a more rigorous set of global offline and partially coupled experiments, along with detailed analyses of processes and comparisons with measurements, are needed. The Carbon-Land Model Intercomparison Project (C-LAMP) was designed to meet this need by providing a simulation protocol and model performance metrics based upon comparisons against best-available satellite- and ground-based measurements (Hoffman et al., 2007). Recently, a similar effort in Europe, called the International Land Model Benchmark (ILAMB) Project, was begun to assess the performance of European land surface models. These two projects will now serve as prototypes for a proposed international land-biosphere model benchmarking activity for those models participating in the IPCC Fifth Assessment Report (AR5). Initially used for model validation for terrestrial biogeochemistry models in the NCAR Community Land Model (CLM), C-LAMP incorporates a simulation protocol for both offline and partially coupled simulations using a prescribed historical trajectory of atmospheric CO2 concentrations. Models are confronted with data through comparisons against AmeriFlux site measurements, MODIS satellite observations, NOAA Globalview flask records, TRANSCOM inversions, and Free Air CO2 Enrichment (FACE) site measurements. Both sets of experiments have been performed using two different terrestrial biogeochemistry modules coupled to the CLM version 3 in the Community Climate System Model version 3 (CCSM3): the CASA model of Fung et al. and the carbon-nitrogen (CN) model of Thornton. Comparisons of the CLM3 offline results against observational datasets have been performed and are described in Randerson et al. (2009). CLM version 4 has been evaluated using C-LAMP, showing improvement in many of the metrics. Efforts are now underway to initiate a Nitrogen-Land Model Intercomparison Project (N-LAMP) to better constrain the effects of the nitrogen cycle in biosphere models. Presented will be new results from C-LAMP for CLM4, initial N-LAMP developments, and the proposed land-biosphere model benchmarking activity.
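
    Benchmarking of the C-LAMP type ultimately reduces to scoring model fields against observations; the sketch below shows one generic way to do that (bias, RMSE, and a normalized skill score) and is not the actual C-LAMP metric definition. The data arrays are invented.

    ```python
    import numpy as np

    def benchmark_score(model, obs):
        """Return bias, RMSE, and a 0-1 skill score based on RMSE normalized by
        the standard deviation of the observations (a common, generic choice)."""
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        bias = float(np.mean(model - obs))
        rmse = float(np.sqrt(np.mean((model - obs) ** 2)))
        skill = max(0.0, 1.0 - rmse / np.std(obs))
        return bias, rmse, skill

    obs = np.array([2.1, 3.4, 5.0, 4.2, 6.3])     # e.g., monthly flux observations
    model = np.array([2.5, 3.1, 4.6, 4.9, 5.8])   # model output at the same points
    print(benchmark_score(model, obs))
    ```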

  17. Cell illustrator 4.0: a computational platform for systems biology.

    PubMed

    Nagasaki, Masao; Saito, Ayumu; Jeong, Euna; Li, Chen; Kojima, Kaname; Ikeda, Emi; Miyano, Satoru

    2011-01-01

    Cell Illustrator is a software platform for Systems Biology that uses the concept of Petri net for modeling and simulating biopathways. It is intended for biological scientists working at bench. The latest version of Cell Illustrator 4.0 uses Java Web Start technology and is enhanced with new capabilities, including: automatic graph grid layout algorithms using ontology information; tools using Cell System Markup Language (CSML) 3.0 and Cell System Ontology 3.0; parameter search module; high-performance simulation module; CSML database management system; conversion from CSML model to programming languages (FORTRAN, C, C++, Java, Python and Perl); import from SBML, CellML, and BioPAX; and, export to SVG and HTML. Cell Illustrator employs an extension of hybrid Petri net in an object-oriented style so that biopathway models can include objects such as DNA sequence, molecular density, 3D localization information, transcription with frame-shift, translation with codon table, as well as biochemical reactions.
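
    As background, the Petri-net idea underlying the platform can be shown with a minimal discrete net in which a transition fires only when every input place holds enough tokens; this toy example is unrelated to Cell Illustrator's actual hybrid, object-oriented implementation.

    ```python
    # Toy Petri net: a transition fires when every input place has enough tokens.
    marking = {"enzyme": 1, "substrate": 3, "product": 0}
    transitions = {
        "catalysis": {"inputs": {"enzyme": 1, "substrate": 1},
                      "outputs": {"enzyme": 1, "product": 1}},
    }

    def fire(name):
        t = transitions[name]
        if all(marking[p] >= n for p, n in t["inputs"].items()):
            for p, n in t["inputs"].items():
                marking[p] -= n
            for p, n in t["outputs"].items():
                marking[p] += n
            return True
        return False

    while fire("catalysis"):      # run until the substrate is exhausted
        pass
    print(marking)                # {'enzyme': 1, 'substrate': 0, 'product': 3}
    ```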

  18. Cell Illustrator 4.0: a computational platform for systems biology.

    PubMed

    Nagasaki, Masao; Saito, Ayumu; Jeong, Euna; Li, Chen; Kojima, Kaname; Ikeda, Emi; Miyano, Satoru

    2010-01-01

    Cell Illustrator is a software platform for Systems Biology that uses the concept of Petri net for modeling and simulating biopathways. It is intended for biological scientists working at bench. The latest version of Cell Illustrator 4.0 uses Java Web Start technology and is enhanced with new capabilities, including: automatic graph grid layout algorithms using ontology information; tools using Cell System Markup Language (CSML) 3.0 and Cell System Ontology 3.0; parameter search module; high-performance simulation module; CSML database management system; conversion from CSML model to programming languages (FORTRAN, C, C++, Java, Python and Perl); import from SBML, CellML, and BioPAX; and, export to SVG and HTML. Cell Illustrator employs an extension of hybrid Petri net in an object-oriented style so that biopathway models can include objects such as DNA sequence, molecular density, 3D localization information, transcription with frame-shift, translation with codon table, as well as biochemical reactions.

  19. Algorithm To Architecture Mapping Model (ATAMM) multicomputer operating system functional specification

    NASA Technical Reports Server (NTRS)

    Mielke, R.; Stoughton, J.; Som, S.; Obando, R.; Malekpour, M.; Mandala, B.

    1990-01-01

    A functional description of the ATAMM Multicomputer Operating System is presented. ATAMM (Algorithm to Architecture Mapping Model) is a marked graph model which describes the implementation of large grained, decomposed algorithms on data flow architectures. AMOS, the ATAMM Multicomputer Operating System, is an operating system which implements the ATAMM rules. A first generation version of AMOS which was developed for the Advanced Development Module (ADM) is described. A second generation version of AMOS being developed for the Generic VHSIC Spaceborne Computer (GVSC) is also presented.

  20. Updated System-Availability and Resource-Allocation Program

    NASA Technical Reports Server (NTRS)

    Viterna, Larry

    2004-01-01

    A second version of the Availability, Cost and Resource Allocation (ACARA) computer program has become available. The first version was reported in an earlier tech brief. To recapitulate: ACARA analyzes the availability, mean-time-between-failures of components, life-cycle costs, and scheduling of resources of a complex system of equipment. ACARA uses a statistical Monte Carlo method to simulate the failure and repair of components while complying with user-specified constraints on spare parts and resources. ACARA evaluates the performance of the system on the basis of a mathematical model developed from a block-diagram representation. The previous version utilized the MS-DOS operating system and could not be run by use of the most recent versions of the Windows operating system. The current version incorporates the algorithms of the previous version but is compatible with Windows and utilizes menus and a file-management approach typical of Windows-based software.
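
    A bare-bones sketch of the Monte Carlo idea for a single repairable unit with exponential failure and repair times; this is illustrative only and is not the ACARA algorithm, which also handles spares, resources, and cost constraints.

    ```python
    import random

    def simulate_availability(mtbf, mttr, mission_time, n_runs=2000, seed=1):
        """Estimate availability of one repairable unit by alternating random
        exponential up and down intervals over the mission and averaging uptime."""
        rng = random.Random(seed)
        total_up = 0.0
        for _ in range(n_runs):
            t = up = 0.0
            while t < mission_time:
                run = rng.expovariate(1.0 / mtbf)            # time to next failure
                up += min(run, mission_time - t)
                t += run + rng.expovariate(1.0 / mttr)       # repair duration
            total_up += up
        return total_up / (n_runs * mission_time)

    print(simulate_availability(mtbf=1000.0, mttr=50.0, mission_time=8760.0))
    # Should approach the steady-state value MTBF / (MTBF + MTTR) ~= 0.952
    ```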

  1. Technical report series on global modeling and data assimilation. Volume 1: Documentation of the Goddard Earth Observing System (GEOS) General Circulation Model, version 1

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Takacs, Lawrence L.; Molod, Andrea; Wang, Tina

    1994-01-01

    This technical report documents Version 1 of the Goddard Earth Observing System (GEOS) General Circulation Model (GCM). The GEOS-1 GCM is being used by NASA's Data Assimilation Office (DAO) to produce multiyear data sets for climate research. This report provides a documentation of the model components used in the GEOS-1 GCM, a complete description of model diagnostics available, and a User's Guide to facilitate GEOS-1 GCM experiments.

  2. TSARINA: A Computer Model for Assessing Conventional and Chemical Attacks on Airbases

    DTIC Science & Technology

    1990-09-01

    IV, and has been updated to FORTRAN 77; it has been adapted to various computer systems, as was the widely used AIDA model and the previous versions of... conventional and chemical attacks on sortie generation. In the first version of TSARINA [1, 2], several key additions were made to the AIDA model so that (1)... various on-base resources, in addition to the estimates of hits and facility damage that are generated by the original AIDA model. The second version

  3. A new version of Scilab software package for the study of dynamical systems

    NASA Astrophysics Data System (ADS)

    Bordeianu, C. C.; Felea, D.; Beşliu, C.; Jipa, Al.; Grossu, I. V.

    2009-11-01

    This work presents a new version of a software package for the study of chaotic flows, maps and fractals [1]. The codes were written using Scilab, a software package for numerical computations providing a powerful open computing environment for engineering and scientific applications. It was found that Scilab provides various functions for ordinary differential equation solving, Fast Fourier Transform, autocorrelation, and excellent 2D and 3D graphical capabilities. The chaotic behaviors of the nonlinear dynamics systems were analyzed using phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropy. Various well-known examples are implemented, with the capability of the users inserting their own ODE or iterative equations. New version program summaryProgram title: Chaos v2.0 Catalogue identifier: AEAP_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEAP_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 1275 No. of bytes in distributed program, including test data, etc.: 7135 Distribution format: tar.gz Programming language: Scilab 5.1.1. Scilab 5.1.1 should be installed before running the program. Information about the installation can be found at http://wiki.scilab.org/howto/install/windows. Computer: PC-compatible running Scilab on MS Windows or Linux Operating system: Windows XP, Linux RAM: below 150 Megabytes Classification: 6.2 Catalogue identifier of previous version: AEAP_v1_0 Journal reference of previous version: Comput. Phys. Comm. 178 (2008) 788 Does the new version supersede the previous version?: Yes Nature of problem: Any physical model containing linear or nonlinear ordinary differential equations (ODE). Solution method: Numerical solving of ordinary differential equations for the study of chaotic flows. The chaotic behavior of the nonlinear dynamical system is analyzed using Poincare sections, phase-space maps, autocorrelation functions, power spectra, Lyapunov exponents and Kolmogorov-Sinai entropies. Numerical solving of iterative equations for the study of maps and fractals. Reasons for new version: The program has been updated to use the new version 5.1.1 of Scilab with new graphical capabilities [2]. Moreover, new use cases have been added which make the handling of the program easier and more efficient. Summary of revisions: A new use case concerning coupled predator-prey models has been added [3]. Three new use cases concerning fractals (Sierpinsky gasket, Barnsley's Fern and Tree) have been added [3]. The graphical user interface (GUI) of the program has been reconstructed to include the new use cases. The program has been updated to use Scilab 5.1.1 with the new graphical capabilities. Additional comments: The program package contains 12 subprograms. 
interface.sce - the graphical user interface (GUI) that permits the choice of a routine as follows: 1.sci - Lorenz dynamical system; 2.sci - Chua dynamical system; 3.sci - Rössler dynamical system; 4.sci - Henon map; 5.sci - Lyapunov exponents for the Lorenz dynamical system; 6.sci - Lyapunov exponent for the logistic map; 7.sci - Shannon entropy for the logistic map; 8.sci - Coupled predator-prey model; 1f.sci - Sierpinski gasket; 2f.sci - Barnsley's Fern; 3f.sci - Barnsley's Tree. Running time: 10 to 20 seconds for problems that do not involve Lyapunov exponent calculation; 60 to 1000 seconds for problems that involve high-order ODEs, Lyapunov exponent calculation and fractals. References: C.C. Bordeianu, C. Besliu, Al. Jipa, D. Felea, I.V. Grossu, Comput. Phys. Comm. 178 (2008) 788. S. Campbell, J.P. Chancelier, R. Nikoukhah, Modeling and Simulation in Scilab/Scicos, Springer, 2006. R.H. Landau, M.J. Paez, C.C. Bordeianu, A Survey of Computational Physics, Introductory Computational Science, Princeton University Press, 2008.
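
    One of the package's use cases, the Lyapunov exponent of the logistic map, is easy to reproduce in a few lines; the sketch below follows the standard textbook recipe rather than the Scilab routines themselves.

    ```python
    import math

    def logistic_lyapunov(r, x0=0.4, n_transient=1000, n_iter=100000):
        """Lyapunov exponent of x -> r*x*(1-x): average of log|f'(x)| along the orbit."""
        x = x0
        for _ in range(n_transient):          # discard the transient
            x = r * x * (1.0 - x)
        s = 0.0
        for _ in range(n_iter):
            x = r * x * (1.0 - x)
            s += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
        return s / n_iter

    print(logistic_lyapunov(3.2))   # negative: periodic regime
    print(logistic_lyapunov(4.0))   # close to ln 2: fully chaotic regime
    ```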

  4. A computer-based instrumentation system for measurement of breath-by-breath oxygen consumption and carbon dioxide production.

    PubMed

    Sharma, C; Gallagher, R R

    1994-01-01

    Improvements are implemented (Version 4) in a Computer-Based Respiratory Measurement System (CBRMS) previously identified as Version 3. The programming language has been changed from Pascal to C. A Gateway 2000 desktop computer with 486 DX2/50MHz CPU and a plug-in data I/O board (KEITHLEY METRABYTE/ASYST/DAC's DAS-HRES 16-bit Analog and Digital I/O board) replaces an HP 9836 system used in Version 3. The breath-by-breath system consists of a mass spectrometer for measuring fractional concentrations of oxygen and carbon dioxide and the accommodation of a turbine or pneumotachometer for measuring inspiratory and expiratory flows. The temperature of the inspiratory and expiratory gases can be monitored if temperature corrections are necessary for the flow measurement device. These signals are presented to the PC via the data acquisition module. To compare the two versions, ten significant respiratory parameters were investigated and compared for physiological resting states and steady states obtained during an exercise forcing. Both graphical and statistical (analysis of variance, regression, and correlation) tests were carried out on the data. The results from the two versions compared well for all ten parameters. Also, no evidence of a statistically significant difference was found between the resting and steady-state results of the present CBRMS (Version 4) and the previous CBRMS (Version 3). This evidence suggests that Version 3 (Pascal) has been successfully converted to Version 4 (C). Implementation of the CBRMS in C on a PC has several advantages. (ABSTRACT TRUNCATED AT 250 WORDS)
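
    A simplified sketch of the core breath-by-breath calculation (expired volume times the inspired-expired O2 fraction difference); the sampling rate and waveforms below are invented, and the real system applies temperature, alignment, and inspired/expired volume-balance corrections not shown here.

    ```python
    import numpy as np

    def breath_vo2(flow, f_o2, dt, fi_o2=0.2093):
        """Oxygen uptake for one breath (liters): integrate expired flow times
        (inspired - expired) O2 fraction. Positive flow is expiration here."""
        expired = np.clip(flow, 0.0, None)          # keep the expiratory phase only
        return float(np.sum(expired * (fi_o2 - f_o2) * dt))

    dt = 0.01                                       # 100 Hz sampling
    t = np.arange(0.0, 4.0, dt)                     # one 4-second breath
    flow = np.where(t < 2.0, -0.5, 0.5)             # L/s: inspire 2 s, expire 2 s
    f_o2 = np.full_like(t, 0.16)                    # expired O2 fraction
    print(breath_vo2(flow, f_o2, dt), "L O2 per breath")
    ```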

  5. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
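
    The reordering problem at the heart of DeMAID can be illustrated with a small design structure matrix in which entry (i, j) means module i needs output from module j: feedback links are the nonzero entries above the diagonal, and a different ordering changes how many there are. This is a hand-rolled illustration, not DeMAID's knowledge-based algorithm.

    ```python
    import numpy as np

    def feedback_links(dsm, order):
        """Count couplings that point 'backward' (above the diagonal) for a given
        module ordering; dsm[i, j] == 1 means module i requires output of module j."""
        p = np.asarray(order)
        reordered = dsm[np.ix_(p, p)]
        return int(np.triu(reordered, k=1).sum())

    # 4 modules: 0 needs 2, 1 needs 0, 2 needs nothing, 3 needs 1 and 2
    dsm = np.array([[0, 0, 1, 0],
                    [1, 0, 0, 0],
                    [0, 0, 0, 0],
                    [0, 1, 1, 0]])
    print(feedback_links(dsm, [0, 1, 2, 3]))   # original order: 1 feedback link
    print(feedback_links(dsm, [2, 0, 1, 3]))   # reordered: 0 feedback links
    ```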

  6. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (SGI IRIS VERSION)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  7. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    Many engineering systems are large and multi-disciplinary. Before the design of new complex systems such as large space platforms can begin, the possible interactions among subsystems and their parts must be determined. Once this is completed the proposed system can be decomposed to identify its hierarchical structure. DeMAID (A Design Manager's Aid for Intelligent Decomposition) is a knowledge-based system for ordering the sequence of modules and identifying a possible multilevel structure for the design problem. DeMAID displays the modules in an N x N matrix format (called a design structure matrix) where a module is any process that requires input and generates an output. (Modules which generate an output but do not require an input, such as an initialization process, are also acceptable.) Although DeMAID requires an investment of time to generate and refine the list of modules for input, it could save a considerable amount of money and time in the total design process, particularly in new design problems where the ordering of the modules has not been defined. The decomposition of a complex design system into subsystems requires the judgement of the design manager. DeMAID reorders and groups the modules based on the links (interactions) among the modules, helping the design manager make decomposition decisions early in the design cycle. The modules are grouped into circuits (the subsystems) and displayed in an N x N matrix format. Feedback links, which indicate an iterative process, are minimized and only occur within a subsystem. Since there are no feedback links among the circuits, the circuits can be displayed in a multilevel format. Thus, a large amount of information is reduced to one or two displays which are stored for later retrieval and modification. The design manager and leaders of the design teams then have a visual display of the design problem and the intricate interactions among the different modules. The design manager could save a substantial amount of time if circuits on the same level of the multilevel structure are executed in parallel. DeMAID estimates the time savings based on the number of available processors. In addition to decomposing the system into subsystems, DeMAID examines the dependencies of a problem with independent variables and dependant functions. A dependency matrix is created to show the relationship. DeMAID is based on knowledge base techniques to provide flexibility and ease in adding new capabilities. Although DeMAID was originally written for design problems, it has proven to be very general in solving any problem which contains modules (processes) which take an input and generate an output. For example, one group is applying DeMAID to gain understanding of the data flow of a very large computer program. In this example, the modules are the subroutines of the program. The design manager begins the design of a system by determining the level of modules which need to be ordered. The level is the "granularity" of the problem. For example, the design manager may wish to examine disciplines (a coarse model), analysis programs, or the data level (a fine model). Once the system is divided into these modules, each module's input and output is determined, creating a data file for input to the main program. DeMAID is executed through a system of menus. The user has the choice to plan, schedule, display the N x N matrix, display the multilevel organization, or examine the dependency matrix. 
The main program calls a subroutine which reads a rule file and a data file, asserts facts into the knowledge base, and executes the inference engine of the artificial intelligence/expert systems program, CLIPS (C Language Integrated Production System). To determine the effects of changes in the design process, DeMAID includes a trace effects feature. There are two methods available to trace the effects of a change in the design process. The first method traces forward through the outputs to determine the effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  8. Description and evaluation of the Community Multiscale Air ...

    EPA Pesticide Factsheets

    The Community Multiscale Air Quality (CMAQ) model is a comprehensive multipollutant air quality modeling system developed and maintained by the US Environmental Protection Agency's (EPA) Office of Research and Development (ORD). Recently, version 5.1 of the CMAQ model (v5.1) was released to the public, incorporating a large number of science updates and extended capabilities over the previous release version of the model (v5.0.2). These updates include the following: improvements in the meteorological calculations in both CMAQ and the Weather Research and Forecast (WRF) model used to provide meteorological fields to CMAQ, updates to the gas and aerosol chemistry, revisions to the calculations of clouds and photolysis, and improvements to the dry and wet deposition in the model. Sensitivity simulations isolating several of the major updates to the modeling system show that changes to the meteorological calculations result in enhanced afternoon and early evening mixing in the model, periods when the model historically underestimates mixing. This enhanced mixing results in higher ozone (O3) mixing ratios on average due to reduced NO titration, and lower fine particulate matter (PM2.5) concentrations due to greater dilution of primary pollutants (e.g., elemental and organic carbon). Updates to the clouds and photolysis calculations greatly improve consistency between the WRF and CMAQ models and result in generally higher O3 mixing ratios, primarily due to reduced

  9. Ada Compiler Validation Summary Report: Certificate Number: 901129W1. 11096 Verdix Corporation, VADS Sequent Balance DYNIX 3.0, VAda-110-2323, Version 6.0, Sequent Balance 8000, DYNIX Version 3.0 (Host & Target).

    DTIC Science & Technology

    1991-08-01

    90-09-25- VRX Ada COMPILER VALIDATION SUMMARY REPORT: Certificate Number: 901129W1.11096 Verdix Corporation VADS Sequent Balance DYNIX 3.0, VAda-1lO...8000, DYNIX Version 3.0 Target Computer System: Sequent Balance 8000, DYNIX Version 3.0 Customer Agreement Number: 90-09-25- VRX See Section 3.1 for any

  10. Object Toolkit Version 4.3 User’s Manual

    DTIC Science & Technology

    2016-12-31

    Object Toolkit is a finite-element model builder specifically designed for...INTRODUCTION 1 What Is Object Toolkit? Object Toolkit is a finite-element model builder specifically designed for creating representations of spacecraft...Nascap-2k and EPIC, the user is not required to purchase or learn expensive finite element generators to create system models. Second, Object Toolkit

  11. Aircraft noise synthesis system: Version 4 user instructions

    NASA Technical Reports Server (NTRS)

    Mccurdy, David A.; Sullivan, Brenda M.; Grandle, Robert E.

    1987-01-01

    A modified version of the Aircraft Noise Synthesis System with improved directivity and tonal content modeling has been developed. The synthesis system is used to provide test stimuli for studies of community annoyance to aircraft flyover noise. The computer-based system generates realistic, time-varying audio simulations of aircraft flyover noise at a specified observer location on the ground. The synthesis takes into account the time-varying aircraft position relative to the observer; specified reference spectra consisting of broadband, narrowband, and pure tone components; directivity patterns; Doppler shift; atmospheric effects; and ground effects. These parameters can be specified and controlled in such a way as to generate stimuli in which certain noise characteristics such as duration or tonal content are independently varied while the remaining characteristics such as broadband content are held constant. The modified version of the system provides improved modeling of noise directivity patterns and an increased number of pure tone components. User instructions for the modified version of the synthesis system are provided.
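
    One ingredient of the synthesis, the Doppler shift heard by a ground observer, follows directly from the aircraft-observer geometry; this is a minimal sketch with made-up flight parameters and none of the directivity, atmospheric, or ground effects the system models.

    ```python
    import math

    C_SOUND = 340.0   # speed of sound, m/s

    def doppler_frequency(f_source, aircraft_pos, aircraft_vel, observer_pos):
        """Observed frequency for a moving source and a fixed ground observer:
        f_obs = f_src / (1 - v_radial / c), with v_radial the closing speed."""
        dx = [o - a for a, o in zip(aircraft_pos, observer_pos)]
        r = math.sqrt(sum(d * d for d in dx))
        v_radial = sum(v * d for v, d in zip(aircraft_vel, dx)) / r
        return f_source / (1.0 - v_radial / C_SOUND)

    # 80 m/s level flyover at 300 m altitude, 1 kHz tone, observer at the origin
    for x in (-2000.0, 0.0, 2000.0):                 # approaching, overhead, receding
        print(doppler_frequency(1000.0, (x, 300.0), (80.0, 0.0), (0.0, 0.0)))
    ```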

  12. TESTING U.S. EPA'S ISCST -VERSION 3 MODEL ON DIOXINS: A COMPARISON OF PREDICTED AND OBSERVED AIR AND SOIL CONCENTRATIONS

    EPA Science Inventory

    The central purpose of our study was to examine the performance of the United States Environmental Protection Agency's (EPA) nonreactive Gaussian air quality dispersion model, the Industrial Source Complex Short Term Model (ISCST3) Version 98226, in predicting polychlorinated dib...

  13. An Operational Model for the Application of Planning-Programming-Budgeting Systems to Local School Districts. Post-Pilot-Test Version. Parts One and Two.

    ERIC Educational Resources Information Center

    Kiser, Chester; And Others

    This 2-part document is designed to aid school districts in the implementation of a planning programing budgeting system. The first part of the manual contains (1) statements of policy, (2) a master flowchart, (3) organization and functions of a PPBS system, (4) a flowscript of procedures, (5) job outlines, and (6) supplementary appendix material.…

  14. Integrated Noise Model (INM) version 6.0 technical manual

    DOT National Transportation Integrated Search

    2002-01-31

    The Federal Aviation Administration, Office of Environment and Energy (FAA, AEE-100) has : developed Version 6.0 of the Integrated Noise Model (INM) with support from the John A. Volpe : National Transportation Systems Center, Acoustics Facility (Vol...

  15. Integrated noise model (INM) version 7.0 technical manual

    DOT National Transportation Integrated Search

    2008-01-31

    The Federal Aviation Administration, Office of Environment and Energy (FAA, AEE-100) has developed Version 7.0 of the Integrated Noise Model (INM) with support from the John A. Volpe National Transportation Systems Center, Acoustics Facility (Volpe C...

  16. Integrated Noise Model (INM), version 5.1 : technical manual

    DOT National Transportation Integrated Search

    1997-12-01

    The Federal Aviation Administration, Office of Environment and Energy (FAA, AEE-120) : has developed Version 5.1 of the Integrated Noise Model (INM) with support from the : John A. Volpe National Transportation Systems Center, Acoustics Facility (Vol...

  17. Implementing the HL7v3 standard in Croatian primary healthcare domain.

    PubMed

    Koncar, Miroslav

    2004-01-01

    The mission of HL7 Inc. is to provide standards for the exchange, management and integration of data that supports clinical patient care and the management, delivery and evaluation of healthcare services. The scope of this work includes the specifications of flexible, cost-effective approaches, standards, guidelines, methodologies, and related services for interoperability between healthcare information systems. In the field of medical information technologies, HL7 provides the world's most advanced information standards. Versions 1 and 2 of the HL7 standard have on the one hand solved many issues, but on the other demonstrated the size and complexity of the health information sharing problem. As the solution, a complete new methodology has been adopted, which is being encompassed in version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely-coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project, we have decided to go directly to HL7v3. Implementing the HL7v3 standard in healthcare applications represents a challenging task. By using standardized refinement and localization methods we were able to define information models for Croatian primary healthcare domain. The scope of our work includes clinical, financial and administrative data management, where in some cases we were compelled to introduce new HL7v3-compliant models. All of the HL7v3 transactions are digitally signed, using the W3C XML Digital Signature standard.

  18. IMS Version 3 Student Data Base Maintenance Program.

    ERIC Educational Resources Information Center

    Brown, John R.

    Computer routines that update the Instructional Management System (IMS) Version 3 student data base which supports the Southwest Regional Laboratory's (SWRL) student monitoring system are described. Written in IBM System 360 FORTRAN IV, the program updates the data base by adding, changing and deleting records, as well as adding and deleting…

  19. A Hemispheric Version of the Community Multiscale Air Quality (CMAQ) Modeling System

    EPA Science Inventory

    This invited presentation will be given at the 4th Biannual Western Modeling Workshop in the Plenary session on Global model development, evaluation, and new source attribution tools. We describe the development and application of the hemispheric version of the CMAQ to examine th...

  20. [HL7 standard--features, principles, and methodology].

    PubMed

    Koncar, Miroslav

    2005-01-01

    The mission of the HL7 Inc. non-profit organization is to provide standards for the exchange, management and integration of data that support clinical patient care, and the management, delivery and evaluation of healthcare services. As the standards developed by HL7 Inc. represent the world's most influential standardization efforts in the field of medical informatics, the HL7 family of standards has been recognized by the technical and scientific community as the foundation for the next generation of healthcare information systems. Versions 1 and 2 of the HL7 standard have solved many issues, but also demonstrated the size and complexity of the health information sharing problem. As the solution, a complete new methodology has been adopted, which is encompassed in the HL7 Version 3 recommendations. This approach standardizes the Reference Information Model (RIM), which is the source of all derived domain models and message structures. Message design is now defined in detail, enabling interoperability between loosely coupled systems that are designed by different vendors and deployed in various environments. At the start of the Primary Healthcare Information System project in the Republic of Croatia in 2002, the decision was to go directly to Version 3. The target scope of work includes clinical, financial and administrative data management in the domain of healthcare processes. By using the HL7v3 standardized methodology we were able to completely map the Croatian primary healthcare domain to HL7v3 artefacts. Further refinement processes that are planned for the future will provide semantic interoperability and a detailed description of all elements in HL7 messages. Our HL7 Business Component is in a constant process of studying different legacy applications, making a solid foundation for their integration into an HL7-enabled communication environment.

  1. Continuous-time quantum Monte Carlo impurity solvers

    NASA Astrophysics Data System (ADS)

    Gull, Emanuel; Werner, Philipp; Fuchs, Sebastian; Surer, Brigitte; Pruschke, Thomas; Troyer, Matthias

    2011-04-01

    Continuous-time quantum Monte Carlo impurity solvers are algorithms that sample the partition function of an impurity model using diagrammatic Monte Carlo techniques. The present paper describes codes that implement the interaction expansion algorithm originally developed by Rubtsov, Savkin, and Lichtenstein, as well as the hybridization expansion method developed by Werner, Millis, Troyer, et al. These impurity solvers are part of the ALPS-DMFT application package and are accompanied by an implementation of dynamical mean-field self-consistency equations for (single orbital single site) dynamical mean-field problems with arbitrary densities of states. Program summary: Program title: dmft. Catalogue identifier: AEIL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEIL_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: ALPS LIBRARY LICENSE version 1.1. No. of lines in distributed program, including test data, etc.: 899 806. No. of bytes in distributed program, including test data, etc.: 32 153 916. Distribution format: tar.gz. Programming language: C++. Operating system: the ALPS libraries have been tested on the following platforms and compilers: Linux with GNU Compiler Collection (g++ version 3.1 and higher) and Intel C++ Compiler (icc version 7.0 and higher); MacOS X with GNU Compiler (g++ Apple-version 3.1, 3.3 and 4.0); IBM AIX with Visual Age C++ (xlC version 6.0) and GNU (g++ version 3.1 and higher) compilers; Compaq Tru64 UNIX with Compaq C++ Compiler (cxx); SGI IRIX with MIPSpro C++ Compiler (CC); HP-UX with HP C++ Compiler (aCC); Windows with Cygwin or coLinux platforms and GNU Compiler Collection (g++ version 3.1 and higher). RAM: 10 MB-1 GB. Classification: 7.3. External routines: ALPS [1], BLAS/LAPACK, HDF5. Nature of problem: (See [2].) Quantum impurity models describe an atom or molecule embedded in a host material with which it can exchange electrons. They are basic to nanoscience as representations of quantum dots and molecular conductors and play an increasingly important role in the theory of "correlated electron" materials as auxiliary problems whose solution gives the "dynamical mean field" approximation to the self-energy and local correlation functions. Solution method: Quantum impurity models require a method of solution which provides access to both high and low energy scales and is effective for wide classes of physically realistic models. The continuous-time quantum Monte Carlo algorithms for which we present implementations here meet this challenge. Continuous-time quantum impurity methods are based on partition function expansions of quantum impurity models that are stochastically sampled to all orders using diagrammatic quantum Monte Carlo techniques. For a review of quantum impurity models and their applications and of continuous-time quantum Monte Carlo methods for impurity models we refer the reader to [2]. Additional comments: Use of dmft requires citation of this paper. Use of any ALPS program requires citation of the ALPS [1] paper. Running time: 60 s-8 h per iteration.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadgu, Teklu; Appel, Gordon John

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and the knowledge capability for TSPA-type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014) and Hadgu et al. (2015). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5 and 11.1.6 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included verification of the TSPA-LA uncertainty and sensitivity analyses, and a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest version, 11.1. All of the TSPA-LA uncertainty and sensitivity analysis modeling cases were successfully tested and verified for model reproducibility on the upgraded 2014 server cluster (CL2014). The uncertainty and sensitivity analyses used TSPA-LA modeling case output generated in FY15 based on GoldSim Version 9.60.300, documented in Hadgu et al. (2015). The model upgrade task successfully converted the Nominal Modeling case to GoldSim Version 11.1. Upgrade of the remaining modeling cases and distributed processing tasks will continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  3. SecPop Version 4: Sector Population Land Fraction and Economic Estimation Program: Users' Guide, Model Manual and Verification Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Scott; Bixler, Nathan E.; McFadden, Katherine Letizia

    In 1973 the U.S. Environmental Protection Agency (EPA) developed SecPop to calculate population estimates to support a study on air quality. The Nuclear Regulatory Commission (NRC) adopted this program to support siting reviews for nuclear power plant construction and license applications. Currently SecPop is used to prepare site data input files for offsite consequence calculations with the MELCOR Accident Consequence Code System (MACCS). SecPop enables the use of site-specific population, land use, and economic data for a polar grid defined by the user. Updated versions of SecPop have been released to use U.S. decennial census population data. SECPOP90 was released in 1997 to use 1990 population and economic data. SECPOP2000 was released in 2003 to use 2000 population data and 1997 economic data. This report describes the current code version, SecPop version 4.3, which uses 2010 population data and both 2007 and 2012 economic data. It is also compatible with 2000 census and 2002 economic data. At the time of this writing, the current version of SecPop is 4.3.0, and that version is described herein. This report contains guidance for the installation and use of the code as well as a description of the theory, models, and algorithms involved. This report contains appendices which describe the development of the 2010 census file, 2007 county file, and 2012 county file. Finally, an appendix is included that describes the validation assessments performed.
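    The sector aggregation that SecPop performs can be illustrated with a small sketch: given point locations with population counts, the populations are binned into user-defined radial rings and angular sectors around a site. This is a toy illustration under simplifying assumptions, not SecPop's algorithm or data handling; the helper name, distance approximation, and grid spacing are invented for the example.

        import numpy as np

        def sector_population(lon, lat, pop, site_lon, site_lat, ring_edges_km, n_sectors=16):
            """Toy aggregation of point populations onto a polar (ring x sector) grid."""
            # small-area approximation for east/north offsets in km
            dx = (np.asarray(lon) - site_lon) * 111.32 * np.cos(np.radians(site_lat))
            dy = (np.asarray(lat) - site_lat) * 110.57
            r = np.hypot(dx, dy)
            theta = np.degrees(np.arctan2(dx, dy)) % 360.0           # bearing clockwise from north
            inside = r < ring_edges_km[-1]                           # ignore points beyond the grid
            ring = np.digitize(r[inside], ring_edges_km)             # 0 = innermost ring
            sector = (theta[inside] // (360.0 / n_sectors)).astype(int)
            grid = np.zeros((len(ring_edges_km), n_sectors))
            np.add.at(grid, (ring, sector), np.asarray(pop)[inside])
            return grid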

  4. HNM, heliport noise model : version 2.2 user's guide

    DOT National Transportation Integrated Search

    1994-02-01

    The John A. Volpe National Transportation Systems Center (Volpe Center), in support of the Federal Aviation Administration, Office of Environment and Energy, has developed Version 2.2 of the Heliport Noise Model (HNM). The HNM is a computer progr...

  5. User's Guide to the Western Root Disease Model, Version 3.0

    Treesearch

    Susan J. Frankel

    1998-01-01

    Effects of Armillaria spp., Phellinus weirii, Heterobasidion annosum, or bark beetles on stand dynamics are represented by the Western Root Disease Model, Version 3.0. This model, which operates in conjunction with the Forest Vegetation Simulator, can be used to evaluate the effects of many silvicultural practices. This guide contains instructions for use, detailed...

  6. FIAMODEL: Users Guide Version 3.0.

    Treesearch

    Scott A. Pugh; David D. Reed; Kurt S. Pregitzer; Patrick D. Miles

    2002-01-01

    FIAMODEL is a geographic information system (GIS) program used to summarize Forest Inventory and Analysis (FIA, USDA Forest Service) data such as volume. The model runs in ArcView and allows users to select FIA plots with heads-up digitizing, overlays of digital map layers, or queries based on specific plot attributes.

  7. AF-GEOSpace Version 2.0: Space Environment Software Products for 2002

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Tautz, M.

    2002-05-01

    AF-GEOSpace Version 2.0 (release 2002 on WindowsNT/2000/XP) is a graphics-intensive software program developed by AFRL with space environment models and applications. It has grown steadily to become a development tool for automated space weather visualization products and helps with a variety of tasks: orbit specification for radiation hazard avoidance; satellite design assessment and post-event analysis; solar disturbance effects forecasting; frequency and antenna management for radar and HF communications; determination of link outage regions for active ionospheric conditions; and physics research and education. The object-oriented C++ code is divided into five module classes. Science Modules control science models to give output data on user-specified grids. Application Modules manipulate these data and provide orbit generation and magnetic field line tracing capabilities. Data Modules read and assist with the analysis of user-generated data sets. Graphics Modules enable the display of features such as plane slices, magnetic field lines, line plots, axes, the Earth, stars, and satellites. Worksheet Modules provide commonly requested coordinate transformations and calendar conversion tools. Common input data archive sets, application modules, and 1-, 2-, and 3-D visualization tools are provided to all models. The code documentation includes detailed examples with click-by-click instructions for investigating phenomena that have well known effects on communications and spacecraft systems. AF-GEOSpace Version 2.0 builds on the success of its predecessors. The first release (Version 1.21, 1996/IRIX on SGI) contained radiation belt particle flux and dose models derived from CRRES satellite data, an aurora model, an ionosphere model, and ionospheric HF ray tracing capabilities. Next (Version 1.4, 1999/IRIX on SGI), science modules were added related to cosmic rays and solar protons, low-Earth orbit radiation dosages, single event effects probability maps, ionospheric scintillation, and shock propagation models. New application modules for estimating linear energy transfer (LET) and single event upset (SEU) rates in solid-state devices, and graphic modules for visualizing radar fans, communication domes, and satellite detector cones and links were added. Automated FTP scripts permitted users to update their global input parameter set directly from NOAA/SEC. What's New? Version 2.0 includes the first true dynamic run capabilities and offers new and enhanced graphical and data visualization tools such as 3-D volume rendering and eclipse umbra and penumbra determination. Animations of all model results can now be displayed together in all dimensions. There is a new realistic day-to-day ionospheric scintillation simulation generator (IONSCINT), an upgrade to the WBMOD scintillation code, a simplified HF ionospheric ray tracing module, and applications built on the NASA AE-8 and AP-8 radiation belt models. User-generated satellite data sets can now be visualized along with their orbital ephemeris. A prototype tool for visualizing MHD model results stored in structured grids provides a hint of where future space weather model development efforts are headed. A new graphical user interface (GUI) with improved module tracking and renaming features greatly simplifies software operation. AF-GEOSpace is distributed by the Space Weather Center of Excellence in the Space Vehicles Directorate of AFRL.
Recently released for WindowsNT/2000/XP, versions for UNIX and LINUX operating systems will follow shortly. To obtain AF-GEOSpace Version 2.0, please send an e-mail request to the first author.

  8. Alternative Factor Models and Heritability of the Short Leyton Obsessional Inventory--Children's Version

    ERIC Educational Resources Information Center

    Moore, Janette; Smith, Gillian W.; Shevlin, Mark; O'Neill, Francis A.

    2010-01-01

    An alternative models framework was used to test three confirmatory factor analytic models for the Short Leyton Obsessional Inventory-Children's Version (Short LOI-CV) in a general population sample of 517 young adolescent twins (11-16 years). A one-factor model as implicit in current classification systems of Obsessive-Compulsive Disorder (OCD),…

  9. CARE 3, Version 4 enhancements

    NASA Technical Reports Server (NTRS)

    Bryant, L. A.; Stiffler, J. J.

    1985-01-01

    The enhancements and error corrections to CARE III Version 4 are listed. All changes to Version 4 with the exception of the internal redundancy model were implemented in Version 5. Version 4 is the first public release version for execution on the CDC Cyber 170 series computers. Version 5 is the second release version and it is written in ANSI standard FORTRAN 77 for execution on the DEC VAX 11/700 series computers and many others.

  10. The Infobiotics Workbench: an integrated in silico modelling platform for Systems and Synthetic Biology.

    PubMed

    Blakes, Jonathan; Twycross, Jamie; Romero-Campero, Francisco Jose; Krasnogor, Natalio

    2011-12-01

    The Infobiotics Workbench is an integrated software suite incorporating model specification, simulation, parameter optimization and model checking for Systems and Synthetic Biology. A modular model specification allows for straightforward creation of large-scale models containing many compartments and reactions. Models are simulated either using stochastic simulation or numerical integration, and visualized in time and space. Model parameters and structure can be optimized with evolutionary algorithms, and model properties calculated using probabilistic model checking. Source code and binaries for Linux, Mac and Windows are available at http://www.infobiotics.org/infobiotics-workbench/; released under the GNU General Public License (GPL) version 3. Natalio.Krasnogor@nottingham.ac.uk.
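    To illustrate the stochastic simulation option mentioned above, here is a minimal Gillespie (SSA) sketch for a toy two-reaction birth-death system; it is not Infobiotics Workbench code, and the rate constants are arbitrary.

        import numpy as np

        # Toy model: production (rate k1) and degradation (rate k2 * count) of a single species.
        rng = np.random.default_rng(1)
        k1, k2 = 2.0, 0.1
        t, count, t_end = 0.0, 0, 200.0
        trajectory = [(t, count)]
        while t < t_end:
            propensities = np.array([k1, k2 * count])
            total = propensities.sum()
            if total == 0.0:
                break
            t += rng.exponential(1.0 / total)              # waiting time to the next event
            if rng.random() < propensities[0] / total:     # choose which reaction fires
                count += 1
            else:
                count -= 1
            trajectory.append((t, count))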

  11. SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)

    NASA Technical Reports Server (NTRS)

    Coe, H. H.

    1994-01-01

    The SHABERTH computer program was developed to predict operating characteristics of bearings in a multibearing load support system. Lubricated and non-lubricated bearings can be modeled. SHABERTH calculates the loads, torques, temperatures, and fatigue life for ball and/or roller bearings on a single shaft. The program also allows for an analysis of the system reaction to the termination of lubricant supply to the bearings and other lubricated mechanical elements. SHABERTH has proven to be a valuable tool in the design and analysis of shaft bearing systems. The SHABERTH program is structured with four nested calculation schemes. The thermal scheme performs steady state and transient temperature calculations which predict system temperatures for a given operating state. The bearing dimensional equilibrium scheme uses the bearing temperatures, predicted by the temperature mapping subprograms, and the rolling element raceway load distribution, predicted by the bearing subprogram, to calculate bearing diametral clearance for a given operating state. The shaft-bearing system load equilibrium scheme calculates bearing inner ring positions relative to the respective outer rings such that the external loading applied to the shaft is brought into equilibrium by the rolling element loads which develop at each bearing inner ring for a given operating state. The bearing rolling element and cage load equilibrium scheme calculates the rolling element and cage equilibrium positions and rotational speeds based on the relative inner-outer ring positions, inertia effects, and friction conditions. The ball bearing subprograms in the current SHABERTH program have several model enhancements over similar programs. These enhancements include an elastohydrodynamic (EHD) film thickness model that accounts for thermal heating in the contact area and lubricant film starvation; a new model for traction combined with an asperity load sharing model; a model for the hydrodynamic rolling and shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. 
SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. The standard distribution medium for the IBM PC version is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. The standard distribution medium for the CRAY version is also a 5.25 inch 360K MS-DOS format diskette, but alternate distribution media and formats are available upon request. The original version of SHABERTH was developed in FORTRAN IV at Lewis Research Center for use on a UNIVAC 1100 series computer. The Cray version was released in 1988, and was updated in 1990 to incorporate fluid rheological data for Rocket Propellant 1 (RP-1), thereby allowing the analysis of bearings lubricated with RP-1. The PC version is a port of the 1990 CRAY version and was developed in 1992 by SRS Technologies under contract to NASA Marshall Space Flight Center.
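    A highly simplified sketch of the nested-equilibrium idea described above is given below: an outer thermal loop and an inner clearance/load relation are iterated to a self-consistent operating state. The relations and coefficients are invented placeholders for illustration and do not represent SHABERTH's models.

        def operating_state(applied_load, speed, t_ambient=300.0,
                            alpha=2.0e-7, r_thermal=0.05, mu=0.002,
                            nominal_clearance=2.0e-5, tol=1.0e-6, max_iter=200):
            """Toy coupled thermal/clearance/load fixed-point iteration (placeholder physics)."""
            temp = t_ambient
            for _ in range(max_iter):
                clearance = max(nominal_clearance - alpha * (temp - t_ambient), 1.0e-9)
                contact_load = applied_load * (nominal_clearance / clearance)  # stiffer at small clearance
                heat = mu * contact_load * speed                               # friction heating
                temp_new = t_ambient + r_thermal * heat                        # steady-state thermal balance
                if abs(temp_new - temp) < tol:
                    return temp_new, clearance, contact_load
                temp = temp_new
            raise RuntimeError("nested iteration did not converge")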

  12. Simultaneous quantification of soil phosphorus labile pool and desorption kinetics using DGTs and 3D-DIFS

    NASA Astrophysics Data System (ADS)

    Menezes-Blackburn, Daniel; Sun, Jiahui; Lehto, Niklas; Zhang, Hao; Stutter, Marc; Giles, Courtney D.; Darch, Tegan; George, Timothy S.; Shand, Charles; Lumsdon, David; Blackwell, Martin; Wearing, Catherine; Cooper, Patricia; Wendler, Renate; Brown, Lawrie; Haygarth, Philip M.

    2017-04-01

    The phosphorus (P) labile pool and desorption kinetics were simultaneously evaluated in ten representative UK soils using the technique of diffusive gradients in thin films (DGT). The DGT-induced fluxes in soils and sediments (DIFS) model was fitted to the time series of DGT deployments (1 h to 240 h). The desorbable P concentration (labile P) was obtained by multiplying the fitted Kd by the soil solution P concentration obtained using diffusive equilibration in thin films (DET) devices. The labile P was then compared to several soil P extracts, including Olsen P, resin P, FeO-P and water-extractable P, in order to assess whether these analytical procedures can be used to represent the labile P across different soils. The Olsen P, commonly used as a representation of the soil labile P pool, overestimated the desorbable P concentration by a factor of seven. The use of this approach for the quantification of soil P desorption kinetics parameters was somewhat imprecise, showing a wide range of equally valid solutions for the system response time of P equilibration (Tc). Additionally, the performance of different DIFS model versions (1D, 2D and 3D) was compared. Although these models had a good fit to the experimental DGT time series data, the fitted parameters showed poor agreement between the different model versions. The limitations of the DIFS model family are associated with the assumptions taken in the modelling approach, and the 3D version is here considered to be the most precise among them.
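    The labile-pool calculation described above reduces to a one-line product; the sketch below uses invented values purely to show the arithmetic and the comparison with an extract such as Olsen P.

        # Hypothetical values: Kd from the DIFS fit (L kg-1) and DET soil-solution P (mg P L-1).
        kd_fitted = 85.0
        c_solution = 0.12
        labile_p = kd_fitted * c_solution          # desorbable (labile) P, mg P kg-1 soil
        olsen_p = 70.0                             # hypothetical Olsen P for the same soil, mg P kg-1
        print(f"labile P = {labile_p:.1f} mg/kg; Olsen P / labile P = {olsen_p / labile_p:.1f}")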

  13. A framework for expanding aqueous chemistry in the Community Multiscale Air Quality (CMAQ) model version 5.1

    NASA Astrophysics Data System (ADS)

    Fahey, Kathleen M.; Carlton, Annmarie G.; Pye, Havala O. T.; Baek, Jaemeen; Hutzell, William T.; Stanier, Charles O.; Baker, Kirk R.; Wyat Appel, K.; Jaoui, Mohammed; Offenberg, John H.

    2017-04-01

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM - KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM - KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from biogenic epoxides (AQCHEM - KMTI), normalized mean error and bias statistics are slightly improved for 2-methyltetrols and 2-methylglyceric acid at the Research Triangle Park measurement site in North Carolina during the Southern Oxidant and Aerosol Study (SOAS) period. The added in-cloud chemistry leads to a monthly average increase of 11-18 % in cloud SOA at the surface in the eastern United States for June 2013.
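    The role of a stiff ODE integrator in such a cloud-chemistry module can be illustrated with a toy system solved by an implicit method. The sketch below uses SciPy's Radau integrator on a made-up two-species S(IV)/H2O2 reaction; it is not the CMAQ mechanism or the KPP-generated Rodas3 solver.

        import numpy as np
        from scipy.integrate import solve_ivp

        def rhs(t, y, k=1.0e4, source=1.0e-10):
            """Toy aqueous oxidation: S(IV) + H2O2 -> product, with a constant H2O2 source."""
            siv, h2o2 = y
            rate = k * siv * h2o2
            return [-rate, -rate + source]

        sol = solve_ivp(rhs, (0.0, 600.0), [1.0e-6, 5.0e-7],
                        method="Radau", rtol=1.0e-6, atol=1.0e-12)
        print(sol.y[:, -1])   # concentrations at the end of the cloud lifetime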

  14. An ocean data assimilation system and reanalysis of the World Ocean hydrophysical fields

    NASA Astrophysics Data System (ADS)

    Zelenko, A. A.; Vil'fand, R. M.; Resnyanskii, Yu. D.; Strukov, B. S.; Tsyrulnikov, M. D.; Svirenko, P. I.

    2016-07-01

    A new version of the ocean data assimilation system (ODAS) developed at the Hydrometcentre of Russia is presented. The assimilation is performed following the sequential scheme analysis-forecast-analysis. The main components of the ODAS are procedures for operational observation data processing, a variational analysis scheme, and an ocean general circulation model used to estimate the first guess fields involved in the analysis. In situ observations of temperature and salinity in the upper 1400-m ocean layer obtained from various observational platforms are used as input data. In the new ODAS version, the horizontal resolution of the assimilating model and of the output products is increased, the previous 2D-Var analysis scheme is replaced by a more general 3D-Var scheme, and a more flexible incremental analysis updating procedure is introduced to correct the model calculations. A reanalysis of the main World Ocean hydrophysical fields over the 2005-2015 period has been performed using the updated ODAS. The reanalysis results are compared with data from independent sources.
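    A minimal sketch of the 3D-Var step referred to above: minimize the usual background-plus-observation cost function J(x) for a small toy state. The observation operator, error covariances, and dimensions are invented for illustration and are unrelated to the Hydrometcentre system.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n, p = 50, 10
        xb = np.zeros(n)                                        # background (first guess)
        H = np.zeros((p, n)); H[np.arange(p), rng.choice(n, size=p, replace=False)] = 1.0
        y = H @ rng.normal(size=n) + 0.1 * rng.normal(size=p)   # synthetic observations
        B_inv = np.eye(n) / 1.0**2                              # inverse background error covariance
        R_inv = np.eye(p) / 0.1**2                              # inverse observation error covariance

        def cost(x):
            db, do = x - xb, y - H @ x
            return 0.5 * db @ B_inv @ db + 0.5 * do @ R_inv @ do

        def grad(x):
            return B_inv @ (x - xb) - H.T @ (R_inv @ (y - H @ x))

        analysis = minimize(cost, xb, jac=grad, method="L-BFGS-B").x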

  15. Version Control in Project-Based Learning

    ERIC Educational Resources Information Center

    Milentijevic, Ivan; Ciric, Vladimir; Vojinovic, Oliver

    2008-01-01

    This paper deals with the development of a generalized model for version control systems application as a support in a range of project-based learning methods. The model is given as UML sequence diagram and described in detail. The proposed model encompasses a wide range of different project-based learning approaches by assigning a supervisory…

  16. The Global Ocean Forecast System, Version 3.0 (GOFS 3.0) or the Hybrid Coordinate Ocean Model (HYCOM)

    DTIC Science & Technology

    2012-04-10

    System (GOFS) V3.0 - 1/12 HYCOM/NCODA: Phase I" by Metzger et al., dated 26 November 2008 (NRL/MR/7320-08-9148). The HYbrid Coordinate Ocean...C. Lozano, H.L. Tolman, A. Srinivasan, S. Hankin, P. Cornillon, R. Weisberg, A. Barth, R. He, F. Werner, and J. Wilkin, 2009. U.S. GODAE: Global...E.J. Metzger, J.F. Shriver, O.M. Smedstad, A.J. Wallcraft, and C.N. Barron, 2008: Eddy-resolving global ocean prediction. In "Eddy-Resolving Ocean

  17. A SELF-CONSISTENT DEUTSCHIAN ESP MODEL

    EPA Science Inventory

    The report presents a new version of the EPA/Southern Research Institute electrostatic precipitator (ESP) model. The primary difference between this and the standard (Revision 3) version is in the treatment of the particulate space charge. Both models apply the Deutsch equatio...
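    Since the record turns on the Deutsch equation, a one-line sketch of the classical Deutsch-Anderson collection efficiency is given below; the numbers are arbitrary, and the space-charge treatment that distinguishes the two model versions is not represented.

        import math

        w = 0.08      # effective migration velocity, m/s (hypothetical)
        A = 5000.0    # total collecting plate area, m^2 (hypothetical)
        Q = 120.0     # gas volumetric flow rate, m^3/s (hypothetical)
        efficiency = 1.0 - math.exp(-w * A / Q)   # Deutsch-Anderson collection efficiency
        print(f"collection efficiency = {efficiency:.3f}")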

  18. Computing Operating Characteristics Of Bearing/Shaft Systems

    NASA Technical Reports Server (NTRS)

    Moore, James D.

    1996-01-01

    SHABERTH computer program predicts operating characteristics of bearings in multibearing load-support system. Lubricated and nonlubricated bearings modeled. Calculates loads, torques, temperatures, and fatigue lives of ball and/or roller bearings on single shaft. Provides for analysis of reaction of system to termination of supply of lubricant to bearings and other lubricated mechanical elements. Valuable in design and analysis of shaft/bearing systems. Two versions of SHABERTH available. Cray version (LEW-14860), "Computing Thermal Performances Of Shafts and Bearings". IBM PC version (MFS-28818), written for IBM PC-series and compatible computers running MS-DOS.

  19. The Emergent Executive: A Dynamic Field Theory of the Development of Executive Function

    PubMed Central

    Buss, Aaron T.; Spencer, John P.

    2015-01-01

    A dynamic neural field (DNF) model is presented which provides a process-based account of behavior and developmental change in a key task used to probe the early development of executive function—the Dimensional Change Card Sort (DCCS) task. In the DCCS, children must flexibly switch from sorting cards either by shape or color to sorting by the other dimension. Typically, 3-year-olds, but not 4-year-olds, lack the flexibility to do so and perseverate on the first set of rules when instructed to switch. In the DNF model, rule-use and behavioral flexibility come about through a form of dimensional attention which modulates activity within different cortical fields tuned to specific feature dimensions. In particular, we capture developmental change by increasing the strength of excitatory and inhibitory neural interactions in the dimensional attention system as well as refining the connectivity between this system and the feature-specific cortical fields. Note that although this enables the model to effectively switch tasks, the dimensional attention system does not ‘know’ the details of task-specific performance. Rather, correct performance emerges as a property of system-wide neural interactions. We show how this captures children's behavior in quantitative detail across 12 versions of the DCCS task. Moreover, we successfully test a set of novel predictions with 3-year-old children from a version of the task not explained by other theories. PMID:24818836

  20. Impacts of Residential Biofuel Emissions on Air Quality and Climate

    NASA Astrophysics Data System (ADS)

    Huang, Y.; Unger, N.; Harper, K.; Storelvmo, T.

    2016-12-01

    The residential biofuel sector is defined as fuelwood, agricultural residues and dung used for household cooking and heating. Aerosol emissions from this human activity play an important role affecting local, regional and global air quality, climate and public health. However, only a few studies are available that evaluate the net impacts, and large uncertainties persist. Here we use the Community Atmosphere Model version 5.3 (CAM v5.3), within the Community Earth System Model version 1.2.2, to quantify the impacts of cook-stove biofuel emissions on air quality and climate. The model incorporates a novel advanced treatment of black carbon (BC) effects on mixed-phase/ice clouds. We update the global anthropogenic emission inventory in CAM v5.3 to a state-of-the-art emission inventory from the Greenhouse Gas-Air Pollution Interactions and Synergies integrated assessment model. Global in-situ and aircraft campaign observations for BC and organic carbon are used to evaluate and validate the model performance. Sensitivity simulations are employed to assess the impacts of residential biofuel emissions on regional and global direct and indirect radiative forcings in the contemporary world. We focus the analyses on several key regions including India, China and Sub-Saharan Africa.

  1. SAM Photovoltaic Model Technical Reference 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilman, Paul; DiOrio, Nicholas A; Freeman, Janine M

    This manual describes the photovoltaic performance model in the System Advisor Model (SAM) software, Version 2016.3.14 Revision 4 (SSC Version 160). It is an update to the 2015 edition of the manual, which describes the photovoltaic model in SAM 2015.1.30 (SSC 41). This new edition includes corrections of errors in the 2015 edition and descriptions of new features introduced in SAM 2016.3.14, including: 3D shade calculator; battery storage model; DC power optimizer loss inputs; snow loss model; plane-of-array irradiance input from weather file option; support for sub-hourly simulations; self-shading that works with all four subarrays and uses the same algorithm for fixed arrays and one-axis tracking; linear self-shading algorithm for thin-film modules; loss percentages replacing derate factors. The photovoltaic performance model is one of the modules in the SAM Simulation Core (SSC), which is part of both SAM and the SAM SDK. SAM is a user-friendly desktop application for analysis of renewable energy projects. The SAM SDK (Software Development Kit) is for developers writing their own renewable energy analysis software based on SSC. This manual is written for users of both SAM and the SAM SDK wanting to learn more about the details of SAM's photovoltaic model.

  2. Revision history aware repositories of computational models of biological systems.

    PubMed

    Miller, Andrew K; Yu, Tommy; Britten, Randall; Cooling, Mike T; Lawson, James; Cowan, Dougal; Garny, Alan; Halstead, Matt D B; Hunter, Peter J; Nickerson, David P; Nunns, Geo; Wimalaratne, Sarala M; Nielsen, Poul M F

    2011-01-14

    Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model. One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file. The need for revision history is widely recognised for computer software, and a lot of work has gone into developing version control systems and distributed version control systems (DVCSs) for tracking the revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems. We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users. Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.

  3. Spacecraft Orbit Design and Analysis (SODA), version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Stallcup, Scott S.; Davis, John S.

    1989-01-01

    The Spacecraft Orbit Design and Analysis (SODA) computer program, Version 1.0 is described. SODA is a spaceflight mission planning system which consists of five program modules integrated around a common database and user interface. SODA runs on a VAX/VMS computer with an EVANS & SUTHERLAND PS300 graphics workstation. BOEING RIM-Version 7 relational database management system performs transparent database services. In the current version three program modules produce an interactive three dimensional (3D) animation of one or more satellites in planetary orbit. Satellite visibility and sensor coverage capabilities are also provided. One module produces an interactive 3D animation of the solar system. Another module calculates cumulative satellite sensor coverage and revisit time for one or more satellites. Currently Earth, Moon, and Mars systems are supported for all modules except the solar system module.

  4. TIM Version 3.0 beta - Technical Description and User's Guidance

    EPA Pesticide Factsheets

    Provides technical information on version 3.0 of the Terrestrial Investigation Model (TIM v.3.0). Describes how TIM derives joint distributions of exposure and toxicity to calculate the risk of mortality to birds.
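    A toy Monte Carlo sketch of the joint exposure/toxicity idea: sample a dose and a tolerance from assumed distributions and count how often the dose exceeds the tolerance. The distributions and parameters are invented placeholders; TIM's actual exposure and effects modules are far more detailed.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000
        dose = rng.lognormal(mean=np.log(50.0), sigma=0.6, size=n)        # hypothetical daily dose, mg/kg-bw
        tolerance = rng.lognormal(mean=np.log(120.0), sigma=0.4, size=n)  # hypothetical individual tolerance
        p_mortality = np.mean(dose > tolerance)
        print(f"estimated probability of mortality = {p_mortality:.3f}")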

  5. HANSF 1.3 Users Manual FAI/98-40-R2 Hanford Spent Nuclear Fuel (SNF) Safety Analysis Model [SEC 1 and 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DUNCAN, D.R.

    The HANSF analysis tool is an integrated model considering phenomena inside a multi-canister overpack (MCO) spent nuclear fuel container, such as fuel oxidation, convective and radiative heat transfer, and the potential for fission product release. This manual reflects HANSF version 1.3.2, a revised version of 1.3.1. HANSF 1.3.2 was written to correct minor errors and to allow modeling of condensate flow on the MCO inner surface. HANSF 1.3.2 is intended for use on personal computers such as IBM-compatible machines with Intel processors running under Lahey TI or Digital Visual Fortran, Version 6.0, but this does not preclude operation in other environments.

  6. Detection of faults and software reliability analysis

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1986-01-01

    Multiversion or N-version programming was proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. Specific topics addressed are: failure probabilities in N-version systems, consistent comparison in N-version systems, descriptions of the faults found in the Knight and Leveson experiment, analytic models of comparison testing, characteristics of the input regions that trigger faults, fault tolerance through data diversity, and the relationship between failures caused by automatically seeded faults.
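    A minimal sketch of the N-version idea under discussion: run independently written versions on the same input and vote on the result. The three versions and the seeded fault below are invented for illustration.

        from collections import Counter

        # Toy N-version execution with majority voting; version_2 contains a seeded fault.
        def version_1(x): return x * x
        def version_2(x): return x * x if x >= 0 else -(x * x)   # faulty for negative inputs
        def version_3(x): return x ** 2

        def n_version_run(x, versions=(version_1, version_2, version_3)):
            results = [v(x) for v in versions]
            value, votes = Counter(results).most_common(1)[0]
            if votes < 2:
                raise RuntimeError("no majority: versions disagree")
            return value

        print(n_version_run(-3))   # the two correct versions outvote the faulty one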

  7. Using ISBA model for partitioning evapotranspiration into soil evaporation and plant transpiration of irrigated crops under semi-arid climate

    NASA Astrophysics Data System (ADS)

    Aouade, Ghizlane; Jarlan, Lionel; Ezzahar, Jamal; Er-raki, Salah; Napoly, Adrien; Benkaddour, Abdelfettah; Khabba, Said; Boulet, Gilles; Chehbouni, Abdelghani; Boone, Aaron

    2016-04-01

    The Haouz region, typical of southern Mediterranean basins, is characterized by a semi-arid climate, with an average annual rainfall of about 250 mm, whilst evaporative demand is about 1600 mm per year. Under these conditions, crop irrigation is inevitable for growth and development. Irrigated agriculture currently consumes the majority of the total available water (up to 85%), making more efficient water use critical. Flood irrigation is widely practiced by the majority of farmers (more than 85%), with an efficiency that does not exceed 50%. In this context, a good knowledge of the partitioning of evapotranspiration (ET) into soil evaporation and plant transpiration is crucially needed for improving irrigation scheduling and thus water use efficiency. In this study, the ISBA (Interactions Soil-Biosphere-Atmosphere) model was used for estimating ET and its partition over an olive orchard and a wheat field located near the city of Marrakech (centre of Morocco). Two versions were evaluated: the standard version, which simulates a single energy balance for the soil and vegetation, and the recently developed multiple energy balance (MEB) version, which solves a separate energy balance for each of the two sources. An eddy covariance system, which provides the sensible and latent heat fluxes, and meteorological instruments were operated during 2003-2004 for the olive orchard and during 2013 for wheat. The transpiration component was measured using a sap flow system during summer over the wheat crop, and stable isotope samples were gathered over wheat. The comparison between ET estimated by the ISBA model and that measured by the eddy covariance system showed that the MEB version yielded a remarkable improvement compared to the standard version. The root mean square error (RMSE) and the correlation coefficient (R²) were about 45 W m-2 and 0.8 for the MEB version. By contrast, for the standard version, the RMSE and R² were about 60 W m-2 and 0.7, respectively. The results also showed that the MEB version simulates the crop transpiration more accurately than the standard version: the RMSE and R² were about 0.79 mm and 0.67 for MEB and 1.37 mm and 0.65 for the standard version. An in-depth analysis of the results points out: (1) a deficiency of the standard version in simulating soil evaporation, in particular after an irrigation event, which directly impacts the latent heat flux prediction because too much energy reaches the soil; (2) a significant improvement of the surface temperature predictions with the double energy balance version, an interesting feature in the context of data assimilation; and (3) a poor parameterization of the stomatal conductance in the A-gs photosynthetic module, which is corrected thanks to a stochastic parameter identification approach. These results have direct implications for the prediction of evapotranspiration and its partition over irrigated crops in semi-arid areas of the southern Mediterranean region.
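    The comparison statistics quoted above can be reproduced with a small helper; the sketch below computes RMSE and R² from paired observed/simulated series, assuming R² here denotes the squared Pearson correlation.

        import numpy as np

        def rmse_r2(observed, simulated):
            obs = np.asarray(observed, dtype=float)
            sim = np.asarray(simulated, dtype=float)
            rmse = np.sqrt(np.mean((sim - obs) ** 2))
            r2 = np.corrcoef(obs, sim)[0, 1] ** 2   # squared Pearson correlation
            return rmse, r2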

  8. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version of IMM, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event was unavailable. IMM v4 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional capability of IMM v4 is to use an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternate drug use) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
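    The timeline and partial-treatment features can be illustrated with a toy Monte Carlo sketch; the incidence rate, resource needs, and time-cost numbers below are invented and are not IMM inputs.

        import numpy as np

        rng = np.random.default_rng(3)
        mission_days = 180
        incidence_per_day = 0.01                     # hypothetical condition incidence
        n_events = rng.poisson(incidence_per_day * mission_days)
        event_days = np.sort(rng.uniform(0, mission_days, n_events))   # when events occur on the timeline
        supply = 10.0                                # hypothetical units of the primary resource
        lost_days = 0.0
        for day in event_days:
            needed = rng.uniform(0.5, 2.0)           # resource units required for full treatment
            used = min(needed, supply)
            supply -= used
            fraction_treated = used / needed         # partial-treatment outcome
            lost_days += (1.0 - fraction_treated) * rng.uniform(1.0, 5.0)   # untreated share costs mission time
        print(f"events: {n_events}, estimated mission days lost: {lost_days:.1f}")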

  9. An Overview of Tools for Creating, Validating and Using PDS Metadata

    NASA Astrophysics Data System (ADS)

    King, T. A.; Hardman, S. H.; Padams, J.; Mafi, J. N.; Cecconi, B.

    2017-12-01

    NASA's Planetary Data System (PDS) has defined information models for creating metadata to describe bundles, collections and products for all the assets acquired by planetary science projects. Version 3 of the PDS Information Model (commonly known as "PDS3") is widely used and describes most of the existing planetary archive. Recently PDS has released version 4 of the Information Model (commonly known as "PDS4"), which is designed to improve consistency, efficiency and discoverability of information. To aid in creating, validating and using PDS4 metadata, the PDS and a few associated groups have developed a variety of tools. In addition, some commercial tools, both free and for a fee, can be used to create and work with PDS4 metadata. We present an overview of these tools, describe those tools currently under development and provide guidance as to which tools may be most useful for missions, instrument teams and the individual researcher.
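    Because PDS4 labels are XML documents governed by XML Schema, one common first check is generic schema validation; the sketch below uses lxml with placeholder file names for the schema and label, and it does not replace the more thorough checks performed by the PDS-provided validation tools.

        from lxml import etree

        schema = etree.XMLSchema(etree.parse("PDS4_PDS_1B00.xsd"))   # placeholder schema path
        label = etree.parse("example_product.xml")                   # placeholder PDS4 label
        if schema.validate(label):
            print("label is schema-valid")
        else:
            for error in schema.error_log:
                print(error.line, error.message)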

  10. Development and Implementation of Dynamic Scripts to Execute Cycled GSI/WRF Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Xuanli; Watson, Leela

    2014-01-01

    The Weather Research and Forecasting (WRF) numerical weather prediction (NWP) model and Gridpoint Statistical Interpolation (GSI) data assimilation (DA) are the operational systems that make up the North American Mesoscale (NAM) model and the NAM Data Assimilation System (NDAS) analysis used by National Weather Service forecasters. The Developmental Testbed Center (DTC) manages and distributes the code for the WRF and GSI, but it is up to individual researchers to link the systems together and write scripts to run the systems, which can take considerable time for those not familiar with the code. The objective of this project is to develop and disseminate a set of dynamic scripts that mimic the unique cycling configuration of the operational NAM to enable researchers to develop new modeling and data assimilation techniques that can be easily transferred to operations. The current version of the SPoRT GSI/WRF Scripts (v3.0.1) is compatible with WRF v3.3 and GSI v3.0.
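    A schematic of the cycled analysis/forecast sequence that such scripts orchestrate is sketched below; run_gsi.sh, run_wrf.sh, and the file names are hypothetical stand-ins invented for the example and are not part of the SPoRT script package.

        import subprocess
        from datetime import datetime, timedelta

        cycle = datetime(2014, 1, 1, 0)
        background = "wrfinput_d01"                      # cold-start first guess
        for _ in range(4):                               # four 6-h cycles
            stamp = cycle.strftime("%Y%m%d%H")
            subprocess.run(["./run_gsi.sh", stamp, background], check=True)   # analysis step
            subprocess.run(["./run_wrf.sh", stamp], check=True)               # 6-h forecast step
            background = f"wrfout_d01_{stamp}"           # forecast becomes the next cycle's first guess
            cycle += timedelta(hours=6)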

  11. Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bartels, Robert E.

    2002-01-01

    A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3Dv6 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly.

  12. Developing and Testing a 3d Cadastral Data Model a Case Study in Australia

    NASA Astrophysics Data System (ADS)

    Aien, A.; Kalantari, M.; Rajabifard, A.; Williamson, I. P.; Shojaei, D.

    2012-07-01

    Population growth, urbanization and industrialization place more pressure on land use, with the need for increased space. To extend the use and functionality of the land, complex infrastructures are being built, both vertically and horizontally, layered and stacked. These three-dimensional (3D) developments affect the interests (Rights, Restrictions, and Responsibilities (RRRs)) attached to the underlying land. A 3D cadastre will assist in managing the effects of 3D development on a particular extent of land. There are many elements that contribute to developing a 3D cadastre, such as existing 3D property legislation, 3D DBMSs, and 3D visualization. However, data modelling is one of the most important elements of a successful 3D cadastre. Just as architectural models of houses and high-rise buildings help their users visualize the final product, a 3D cadastre data model helps 3D cadastre users understand the structure and behavior of the system and provides a template that guides them in constructing and implementing the 3D cadastre. Many jurisdictions, organizations and software developers have built their own cadastral data models. The Land Administration Domain Model (DIS-ISO 19152, The Netherlands) and ePlan (Intergovernmental Committee on Surveying and Mapping, Australia) are examples of existing data models. The variation between these data models is the result of different attitudes towards cadastres. However, there is a basic common thread among them all. Current cadastral data models use a 2D land-parcel concept and extend it to support 3D requirements. These data models cannot adequately manage and represent the spatial extent of 3D RRRs. Most current cadastral data models have been influenced by a very broad understanding of 3D cadastral concepts, and better clarity in what needs to be represented and analysed in the cadastre still needs to be established. This paper presents the first version of a 3D Cadastral Data Model (3DCDM_Version 1.0). 3DCDM models both the legal and physical extent of 3D properties and associated interests. The data model extends the traditional cadastral requirements to cover other applications such as urban planning and land valuation and taxation. A demonstration of a test system based on the proposed data model is also presented. The test is based on a case study in Victoria, Australia, to evaluate the effectiveness of the data model.

  13. Accelerator System Model (ASM) user manual with physics and engineering model documentation. ASM version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-07-01

    The Accelerator System Model (ASM) is a computer program developed to model proton radiofrequency accelerators and to carry out system-level trade studies. The ASM FORTRAN subroutines are incorporated into an intuitive graphical user interface which provides for the "construction" of the accelerator in a window on the computer screen. The interface is based on the Shell for Particle Accelerator Related Codes (SPARC) software technology written for the Macintosh operating system in the C programming language. This User Manual describes the operation and use of the ASM application within the SPARC interface. The Appendix provides a detailed description of the physics and engineering models used in ASM. ASM Version 1.0 is a joint project of G. H. Gillespie Associates, Inc. and the Accelerator Technology (AT) Division of the Los Alamos National Laboratory. Neither the ASM Version 1.0 software nor this ASM Documentation may be reproduced without the expressed written consent of both the Los Alamos National Laboratory and G. H. Gillespie Associates, Inc.

  14. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
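    The stochastic-strength element of such reliability analyses is commonly expressed with a two-parameter Weibull distribution; the sketch below evaluates that textbook form for a uniformly stressed component with invented parameter values, and it does not reproduce the CARES/Life formulation.

        import numpy as np

        def weibull_pf(sigma, sigma_0=300.0, m=10.0, volume=1.0):
            """Toy two-parameter Weibull failure probability for a uniformly stressed volume.

            sigma_0: characteristic strength (MPa) and m: Weibull modulus are hypothetical values.
            """
            return 1.0 - np.exp(-volume * (np.asarray(sigma) / sigma_0) ** m)

        print(weibull_pf([150.0, 250.0, 300.0]))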

  15. Update of the Polar SWIFT model for polar stratospheric ozone loss (Polar SWIFT version 2)

    NASA Astrophysics Data System (ADS)

    Wohltmann, Ingo; Lehmann, Ralph; Rex, Markus

    2017-07-01

    The Polar SWIFT model is a fast scheme for calculating the chemistry of stratospheric ozone depletion in polar winter. It is intended for use in global climate models (GCMs) and Earth system models (ESMs) to enable the simulation of mutual interactions between the ozone layer and climate. To date, climate models often use prescribed ozone fields, since a full stratospheric chemistry scheme is computationally very expensive. Polar SWIFT is based on a set of coupled differential equations, which simulate the polar vortex-averaged mixing ratios of the key species involved in polar ozone depletion on a given vertical level. These species are O3, chemically active chlorine (ClOx), HCl, ClONO2 and HNO3. The only external input parameters that drive the model are the fraction of the polar vortex in sunlight and the fraction of the polar vortex below the temperatures necessary for the formation of polar stratospheric clouds. Here, we present an update of the Polar SWIFT model introducing several improvements over the original model formulation. In particular, the model is now trained on vortex-averaged reaction rates of the ATLAS Chemistry and Transport Model, which enables a detailed look at individual processes and an independent validation of the different parameterizations contained in the differential equations. The training of the original Polar SWIFT model was based on fitting complete model runs to satellite observations and did not allow for this. A revised formulation of the system of differential equations is developed, which closely fits vortex-averaged reaction rates from ATLAS that represent the main chemical processes influencing ozone. In addition, a parameterization for the HNO3 change by denitrification is included. The rates of change of the concentrations of the chemical species of the Polar SWIFT model are purely chemical rates of change in the new version, whereas in the original Polar SWIFT model, they included a transport effect caused by the original training on satellite data. Hence, the new version allows for an implementation into climate models in combination with an existing stratospheric transport scheme. Finally, the model is now formulated on several vertical levels encompassing the vertical range in which polar ozone depletion is observed. The results of the Polar SWIFT model are validated with independent Microwave Limb Sounder (MLS) satellite observations and output from the original detailed chemistry model of ATLAS.

  16. Activation of the marine ecosystem model 3D CEMBS for the Baltic Sea in operational mode

    NASA Astrophysics Data System (ADS)

    Dzierzbicka-Glowacka, Lidia; Jakacki, Jaromir; Janecki, Maciej; Nowicki, Artur

    2013-04-01

    The paper presents a new marine ecosystem model, 3D CEMBS, designed for the Baltic Sea. The ecosystem model is incorporated into the 3D POPCICE ocean-ice model. The current Baltic Sea model is based on the Community Earth System Model (CESM, from the National Center for Atmospheric Research), which was adapted for the Baltic Sea as a coupled sea-ice model. It consists of the Community Ice Code (CICE model, version 4.0) and the Parallel Ocean Program (version 2.1). The ecosystem model is a biological submodel of 3D CEMBS. It consists of eleven mass conservation equations: eleven partial second-order differential equations of the diffusion type with an advective term, for phytoplankton, zooplankton, nutrients, dissolved oxygen, and dissolved and particulate organic matter. This model is an effective tool for solving the problem of ecosystem bioproductivity. The model is forced by 48-hour atmospheric forecasts provided by the UM model from the Interdisciplinary Centre for Mathematical and Computational Modelling of Warsaw University (ICM). The study was financially supported by the Polish State Committee for Scientific Research (grants No. N N305 111636 and N N306 353239). Partial support for this study was also provided by the project Satellite Monitoring of the Baltic Sea Environment - SatBaltyk, funded by the European Union through the European Regional Development Fund, contract no. POIG 01.01.02-22-011/09. Calculations were carried out at the Academy Computer Centre in Gdańsk.

  17. A framework for expanding aqueous chemistry in the ...

    EPA Pesticide Factsheets

    This paper describes the development and implementation of an extendable aqueous-phase chemistry option (AQCHEM − KMT(I)) for the Community Multiscale Air Quality (CMAQ) modeling system, version 5.1. Here, the Kinetic PreProcessor (KPP), version 2.2.3, is used to generate a Rosenbrock solver (Rodas3) to integrate the stiff system of ordinary differential equations (ODEs) that describe the mass transfer, chemical kinetics, and scavenging processes of CMAQ clouds. CMAQ's standard cloud chemistry module (AQCHEM) is structurally limited to the treatment of a simple chemical mechanism. This work advances our ability to test and implement more sophisticated aqueous chemical mechanisms in CMAQ and further investigate the impacts of microphysical parameters on cloud chemistry. Box model cloud chemistry simulations were performed to choose efficient solver and tolerance settings, evaluate the implementation of the KPP solver, and assess the direct impacts of alternative solver and kinetic mass transfer on predicted concentrations for a range of scenarios. Month-long CMAQ simulations for winter and summer periods over the US reveal the changes in model predictions due to these cloud module updates within the full chemical transport model. While monthly average CMAQ predictions are not drastically altered between AQCHEM and AQCHEM − KMT, hourly concentration differences can be significant. With added in-cloud secondary organic aerosol (SOA) formation from bio
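
    For readers unfamiliar with the numerical machinery mentioned above, the sketch below shows the general pattern of integrating a stiff kinetic system with an implicit solver. SciPy does not ship the Rodas3 Rosenbrock method that KPP generates, so the Radau method stands in here, and the classic Robertson kinetics serve as a placeholder rather than CMAQ's aqueous mechanism.

```python
# Sketch of integrating a stiff kinetic system with an implicit solver. The Radau
# method stands in for the KPP-generated Rodas3 Rosenbrock solver; the Robertson
# test kinetics below are illustrative, not CMAQ's aqueous-phase mechanism.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
            0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2**2,
            3.0e7 * y2**2]

sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="Radau", rtol=1.0e-6, atol=1.0e-10)
print(sol.y[:, -1])  # final concentrations of the three species
```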

  18. Quantitative prediction of cellular metabolism with constraint-based models: the COBRA Toolbox v2.0

    PubMed Central

    Schellenberger, Jan; Que, Richard; Fleming, Ronan M. T.; Thiele, Ines; Orth, Jeffrey D.; Feist, Adam M.; Zielinski, Daniel C.; Bordbar, Aarash; Lewis, Nathan E.; Rahmanian, Sorena; Kang, Joseph; Hyduke, Daniel R.; Palsson, Bernhard Ø.

    2012-01-01

    Over the past decade, a growing community of researchers has emerged around the use of COnstraint-Based Reconstruction and Analysis (COBRA) methods to simulate, analyze and predict a variety of metabolic phenotypes using genome-scale models. The COBRA Toolbox, a MATLAB package for implementing COBRA methods, was presented earlier. Here we present a significant update of this in silico ToolBox. Version 2.0 of the COBRA Toolbox expands the scope of computations by including in silico analysis methods developed since its original release. New functions include: (1) network gap filling, (2) 13C analysis, (3) metabolic engineering, (4) omics-guided analysis, and (5) visualization. As with the first version, the COBRA Toolbox reads and writes Systems Biology Markup Language formatted models. In version 2.0, we improved performance, usability, and the level of documentation. A suite of test scripts can now be used to learn the core functionality of the Toolbox and validate results. This Toolbox lowers the barrier of entry to use powerful COBRA methods. PMID:21886097
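
    The core computation behind many COBRA methods is flux balance analysis, a linear program over reaction fluxes. The fragment below is a minimal stand-alone illustration in Python (the Toolbox itself is MATLAB) on an invented three-reaction network; the stoichiometry, bounds, and objective are assumptions for demonstration only.

```python
# Flux balance analysis in miniature: maximize a "biomass" flux subject to
# steady-state mass balance S v = 0 and flux bounds. The 3-reaction network and
# bounds are invented for illustration; the COBRA Toolbox itself is MATLAB code.
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1.0, -1.0,  0.0],    # metabolite A: produced by R1, consumed by R2
              [ 0.0,  1.0, -1.0]])   # metabolite B: produced by R2, consumed by R3
bounds = [(0.0, 10.0), (0.0, 10.0), (0.0, 10.0)]
c = np.array([0.0, 0.0, -1.0])       # linprog minimizes, so negate to maximize v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun, "flux vector:", res.x)
```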

  19. Remote measurement methods for 3-D modeling purposes using BAE Systems' Software

    NASA Astrophysics Data System (ADS)

    Walker, Stewart; Pietrzak, Arleta

    2015-06-01

    Efficient, accurate data collection from imagery is the key to an economical generation of useful geospatial products. Incremental developments of traditional geospatial data collection and the arrival of new image data sources cause new software packages to be created and existing ones to be adjusted to enable such data to be processed. In the past, BAE Systems' digital photogrammetric workstation, SOCET SET®, met fin de siècle expectations in data processing and feature extraction. Its successor, SOCET GXP®, addresses today's photogrammetric requirements and new data sources. SOCET GXP is an advanced workstation for mapping and photogrammetric tasks, with automated functionality for triangulation, Digital Elevation Model (DEM) extraction, orthorectification and mosaicking, feature extraction and creation of 3-D models with texturing. BAE Systems continues to add sensor models to accommodate new image sources, in response to customer demand. New capabilities added in the latest version of SOCET GXP facilitate modeling, visualization and analysis of 3-D features.

  20. Extended-range prediction trials using the global cloud/cloud-system resolving model NICAM and its new ocean-coupled version NICOCO

    NASA Astrophysics Data System (ADS)

    Miyakawa, Tomoki

    2017-04-01

    The global cloud/cloud-system resolving model NICAM and its new fully-coupled version NICOCO are run on one of the world's top-tier supercomputers, the K computer. NICOCO couples the full-3D ocean component COCO of the general circulation model MIROC using a general-purpose coupler Jcup. We carried out multiple MJO simulations using NICAM and the new ocean-coupled version NICOCO to examine their extended-range MJO prediction skills and the impact of ocean coupling. NICAM performs excellently in terms of MJO prediction, maintaining a valid skill up to 27 days after the model is initialized (Miyakawa et al. 2014). As is the case in most global models, ocean coupling frees the model from being anchored by the observed SST and allows the model climate to drift further from reality compared to the atmospheric version of the model. Thus, it is important to evaluate the model bias, and in an initial value problem such as seasonal extended-range prediction, it is essential to be able to distinguish the actual signal from the early transition of the model from the observed state to its own climatology. Since NICAM is a highly resource-demanding model, evaluation and tuning of the model climatology (order of years) is challenging. Here we focus on the initial 100 days to estimate the early drift of the model, and subsequently evaluate the MJO prediction skill of NICOCO. Results show that in the initial 100 days, NICOCO forms a La Niña-like SST bias compared to observations, with a warmer Maritime Continent warm pool and a cooler equatorial central Pacific. The enhanced convection over the Maritime Continent associated with this bias projects onto the real-time multivariate MJO indices (RMM, Wheeler and Hendon 2004) and contaminates the MJO skill score. However, the bias does not appear to severely degrade the MJO signal. The model maintains a valid MJO prediction skill up to nearly 4 weeks when evaluated after linearly removing the early drift component estimated from the 54 simulations. Furthermore, NICOCO outperforms NICAM by far if we focus on events associated with large oceanic signals.
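
    The drift-removal step mentioned above can be illustrated with a small sketch: estimate a linear drift from the ensemble-mean departure of the coupled runs from a reference, then subtract it from each member before scoring. The synthetic data, drift rate, and ensemble size below are placeholders, not NICOCO output.

```python
# Minimal sketch of a linear drift correction: fit a linear trend to the ensemble-mean
# departure from a reference and subtract it from every member. All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(100)
reference = np.zeros(100)
ensemble = 0.02 * days + rng.normal(0.0, 0.3, size=(54, 100))  # 54 drifting runs

drift_slope, drift_intercept = np.polyfit(days, ensemble.mean(axis=0) - reference, deg=1)
corrected = ensemble - (drift_slope * days + drift_intercept)
print(corrected.mean(axis=0)[-1])  # residual bias at day 100, near zero
```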

  1. Simulations of the Mid-Pliocene Warm Period Using Two Versions of the NASA-GISS ModelE2-R Coupled Model

    NASA Technical Reports Server (NTRS)

    Chandler, M. A.; Sohl, L. E.; Jonas, J. A.; Dowsett, H. J.; Kelley, M.

    2013-01-01

    The mid-Pliocene Warm Period (mPWP) bears many similarities to aspects of future global warming as projected by the Intergovernmental Panel on Climate Change (IPCC, 2007). Both marine and terrestrial data point to high-latitude temperature amplification, including large decreases in sea ice and land ice, as well as expansion of warmer climate biomes into higher latitudes. Here we present our most recent simulations of the mid-Pliocene climate using the CMIP5 version of the NASA-GISS Earth System Model (ModelE2-R). We describe the substantial impact associated with a recent correction made in the implementation of the Gent-McWilliams ocean mixing scheme (GM), which has a large effect on the simulation of ocean surface temperatures, particularly in the North Atlantic Ocean. The effect of this correction on the Pliocene climate results would not have been easily determined from examining its impact on the preindustrial runs alone, a useful demonstration of how the consequences of code improvements as seen in modern climate control runs do not necessarily portend the impacts in extreme climates. Both the GM-corrected and GM-uncorrected simulations were contributed to the Pliocene Model Intercomparison Project (PlioMIP) Experiment 2. Many findings presented here corroborate results from other PlioMIP multi-model ensemble papers, but we also emphasize features in the ModelE2-R simulations that are unlike the ensemble means. The corrected version yields results that more closely resemble the ocean core data as well as the PRISM3D reconstructions of the mid-Pliocene, especially the dramatic warming in the North Atlantic and Greenland-Iceland-Norwegian Sea, which in the new simulation appears to be far more realistic than previously found with older versions of the GISS model. Our belief is that continued development of key physical routines in the atmospheric model, along with higher resolution and recent corrections to mixing parameterisations in the ocean model, has led to an Earth System Model that will produce more accurate projections of future climate.

  2. The effects of atmospheric chemistry on radiation budget in the Community Earth Systems Model

    NASA Astrophysics Data System (ADS)

    Choi, Y.; Czader, B.; Diao, L.; Rodriguez, J.; Jeong, G.

    2013-12-01

    The Community Earth Systems Model (CESM)-Whole Atmosphere Community Climate Model (WACCM) simulations were performed to study the impact of atmospheric chemistry on the surface radiation budget within a weather prediction time scale. A secondary goal was to identify a simplified, optimized chemistry module suitable for such short simulation periods. Three different chemistry modules were utilized to represent tropospheric and stratospheric chemistry, which differ in how their reactions and species are represented: (1) simplified tropospheric and stratospheric chemistry (approximately 30 species), (2) simplified tropospheric chemistry and comprehensive stratospheric chemistry from the Model of Ozone and Related Chemical Tracers, version 3 (MOZART-3, approximately 60 species), and (3) comprehensive tropospheric and stratospheric chemistry (MOZART-4, approximately 120 species). Our results indicate that the differing levels of detail in chemistry treatment among these model components affect the surface temperature and the radiation budget.

  3. AF-GEOSpace Version 2.1 Release

    NASA Astrophysics Data System (ADS)

    Hilmer, R. V.; Ginet, G. P.; Hall, T.; Holeman, E.; Madden, D.; Perry, K. L.; Tautz, M.; Roth, C.

    2006-05-01

    AF-GEOSpace Version 2.1 is a graphics-intensive software program with space environment models and applications developed recently by the Space Weather Center of Excellence at AFRL. A review of new and planned AF-GEOSpace capabilities will be given. The software addresses a wide range of physical domains and addresses such topics as solar disturbance propagation, geomagnetic field and radiation belt configurations, auroral particle precipitation, and ionospheric scintillation. Building on the success of previous releases, AF-GEOSpace has become a platform for the rapid prototyping of automated operational and simulation space weather visualization products and helps with a variety of tasks, including: orbit specification for radiation hazard avoidance; satellite design assessment and post-event anomaly analysis; solar disturbance effects forecasting; determination of link outage regions for active ionospheric conditions; satellite magnetic conjugate studies, scientific model validation and comparison, physics research, and education. Previously, Version 2.0 provided a simplified graphical user interface, improved science and application modules, significantly enhanced graphical performance, common input data archive sets, and 1-D, 2-D, and 3- D visualization tools for all models. Dynamic capabilities permit multiple environments to be generated at user- specified time intervals while animation tools enable the display of satellite orbits and environment data together as a function of time. Building on the Version 2.0 software architecture, AF-GEOSpace Version 2.1 includes a host of new modules providing, for example, plasma sheet charged particle fluxes, neutral atmosphere densities, 3-D cosmic ray cutoff maps, low-altitude trapped proton belt flux specification, DMSP particle data displays, satellite magnetic field footprint mapping determination, and meteor sky maps and shower/storm fluxes with spacecraft impact probabilities. AF-GEOSpace Version 2.1 was developed for Windows XP and Linux systems. To receive a copy of the AF-GEOSpace 2.1 software, please submit requests via e-mail to the first author.

  4. Super Cooled Large Droplet Analysis of Several Geometries Using LEWICE3D Version 3

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.

    2011-01-01

    Super Cooled Large Droplet (SLD) collection efficiency calculations were performed for several geometries using the LEWICE3D Version 3 software. The computations were performed using the NASA Glenn Research Center SLD splashing model which has been incorporated into the LEWICE3D Version 3 software. Comparisons to experiment were made where available. The geometries included two straight wings, a swept 64A008 wing tip, two high lift geometries, and the generic commercial transport DLR-F4 wing body configuration. In general the LEWICE3D Version 3 computations compared well with the 2D LEWICE 3.2.2 results and with experimental data where available.

  5. Technical Note: Response time evolution of XR-QA2 GafChromic™ film models.

    PubMed

    Aldelaijan, Saad; Tomic, Nada; Papaconstadopoulos, Pavlos; Schneider, James; Seuntjens, Jan; Shih, Shelley; Lewis, David; Devic, Slobodan

    2018-01-01

    To evaluate the response of the newest XR-QA2 GafChromic™ film model in terms of postexposure signal growth and energy response in comparison with the older XR-QA (Version 2) model. Pieces of film were irradiated to air kerma in air values up to 12 cGy with several beam qualities (5.3-8.25 mm Al) commonly used for CT scanning. Film response was scored in terms of net reflectance from scanned film images at various points in time postirradiation, ranging from 1 to 7 days and 5 months postexposure. To reconstruct the measurement signal changes with postirradiation delay, we irradiated one film piece and then scanned it at different time points starting from 2 min and up to 3 days postexposure. For all beam qualities and the dose range investigated, it appears that the XR-QA2 film signal completely saturated after 15 h. Compared to the 15 h postirradiation scanning time, the observed variations in net reflectance were 3%, 2%, and 1% for film scanned 2 min, 20 min, and 3 h after exposure, respectively, which is well within the measurement uncertainty of the XR-QA2 based reference radiochromic film dosimetry system. A comparison between the XR-QA (Version 2) and the XR-QA2 film response after several months (relative to their responses after 24 h) shows differences of up to 8% and 1% for the two film models, respectively. The replacement of cesium bromide in the older XR-QA (Version 2) film model with bismuth oxide in the newer XR-QA2 film, while keeping the same single-sensitive-layer structure, led to a significantly more stable postexposure response. © 2017 American Association of Physicists in Medicine.

  6. Results from CrIS/ATMS Obtained Using an "AIRS Version-6 Like" Retrieval Algorithm

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Kouvaris, Louis; Iredell, Lena; Blaisdell, John

    2015-01-01

    AIRS and CrIS Version-6.22 O3(p) and q(p) products are both superior to those of AIRS Version-6. Monthly mean August 2014 Version-6.22 AIRS and CrIS products agree reasonably well with OMPS, CERES, and with each other. JPL plans to process AIRS and CrIS for many months and compare interannual differences. Updates to the calibration of both CrIS and ATMS are still being finalized. We are also working with JPL to develop a joint AIRS/CrIS level-1 to level-3 processing system using a still-to-be-finalized Version-7 retrieval algorithm. The NASA Goddard DISC will eventually use this system to reprocess all AIRS and recalibrated CrIS/ATMS.

  7. Assessing the performance of formulations for nonlinear feedback of surface gravity waves on ocean currents over coastal waters

    NASA Astrophysics Data System (ADS)

    Wang, Pengcheng; Sheng, Jinyu; Hannah, Charles

    2017-08-01

    This study presents applications of a two-way coupled wave-circulation modelling system over coastal waters, with special emphasis on performance assessments of two different methods for nonlinear feedback of ocean surface gravity waves on three-dimensional (3D) ocean currents. These two methods are the vortex force (VF) formulation suggested by Bennis et al. (2011) and the latest version of the radiation stress (RS) formulation suggested by Mellor (2015). The coupled modelling system is first applied to two idealized test cases at surf-zone scales to validate the implementations of these two methods in the coupled wave-circulation system. Model results show that the latest version of RS has difficulties in producing the undertow over the surf zone. The coupled system is then applied to Lunenburg Bay (LB) of Nova Scotia during Hurricane Juan in 2003. The coupled system using both the VF and RS formulations generates much stronger and more realistic 3D circulation in the Bay during Hurricane Juan than the circulation-only model, demonstrating the importance of surface wave forces to the 3D ocean circulation over coastal waters. However, the RS formulation generates some weak unphysical currents outside the wave breaking zone due to a less reasonable representation of the vertical distribution of the RS gradients over a sloping bottom. These weak unphysical currents are significantly magnified in a two-way coupled system when interacting with large surface waves, degrading the model performance in simulating currents at one observation site. Our results demonstrate that the VF formulation with an appropriate parameterization of wave breaking effects is able to produce reasonable results for applications over coastal waters during extreme weather events. The RS formulation requires a more complex wave theory than the linear wave theory for the approximation of the vertical RS term to improve its performance under both breaking and non-breaking wave conditions.

  8. Effects of Nutrients and Physical Forcing on Satellite-Derived Optical Properties Near the Mississippi River Delta

    DTIC Science & Technology

    2007-07-17

    receiving system and NRL's Automated Processing System (APS) (Martinolich 2005). APS Version 3.4 utilized atmospheric correction algorithms prescribed by... Automated Processing System User's Guide Version 3.4, edited by N.R. Laboratory. Rabalais, N.N., R.E. Turner, and W.J. Wiseman, Jr. 2002. Hypoxia in the

  9. FINDS: A fault inferring nonlinear detection system programmers manual, version 3.0

    NASA Technical Reports Server (NTRS)

    Lancraft, R. E.

    1985-01-01

    Detailed software documentation of the digital computer program FINDS (Fault Inferring Nonlinear Detection System) Version 3.0 is provided. FINDS is a highly modular and extensible computer program designed to monitor and detect sensor failures, while at the same time providing reliable state estimates. In this version of the program the FINDS methodology is used to detect, isolate, and compensate for failures in simulated avionics sensors used by the Advanced Transport Operating Systems (ATOPS) Transport System Research Vehicle (TSRV) in a Microwave Landing System (MLS) environment. It is intended that this report serve as a programmers guide to aid in the maintenance, modification, and revision of the FINDS software.

  10. Upper Blue Nile basin water budget from a multi-model perspective

    NASA Astrophysics Data System (ADS)

    Jung, Hahn Chul; Getirana, Augusto; Policelli, Frederick; McNally, Amy; Arsenault, Kristi R.; Kumar, Sujay; Tadesse, Tsegaye; Peters-Lidard, Christa D.

    2017-12-01

    Improved understanding of the water balance in the Blue Nile is of critical importance because of increasingly frequent hydroclimatic extremes under a changing climate. The intercomparison and evaluation of multiple land surface models (LSMs) associated with different meteorological forcing and precipitation datasets can offer a moderate range of water budget variable estimates. In this context, two LSMs, Noah version 3.3 (Noah3.3) and Catchment LSM version Fortuna 2.5 (CLSMF2.5) coupled with the Hydrological Modeling and Analysis Platform (HyMAP) river routing scheme are used to produce hydrological estimates over the region. The two LSMs were forced with different combinations of two reanalysis-based meteorological datasets from the Modern-Era Retrospective analysis for Research and Applications datasets (i.e., MERRA-Land and MERRA-2) and three observation-based precipitation datasets, generating a total of 16 experiments. Modeled evapotranspiration (ET), streamflow, and terrestrial water storage estimates were evaluated against the Atmosphere-Land Exchange Inverse (ALEXI) ET, in-situ streamflow observations, and NASA Gravity Recovery and Climate Experiment (GRACE) products, respectively. Results show that CLSMF2.5 provided better representation of the water budget variables than Noah3.3 in terms of Nash-Sutcliffe coefficient when considering all meteorological forcing datasets and precipitation datasets. The model experiments forced with observation-based products, the Climate Hazards group Infrared Precipitation with Stations (CHIRPS) and the Tropical Rainfall Measuring Mission (TRMM) Multi-Satellite Precipitation Analysis (TMPA), outperform those run with MERRA-Land and MERRA-2 precipitation. The results presented in this paper would suggest that the Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System incorporate CLSMF2.5 and HyMAP routing scheme to better represent the water balance in this region.
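
    The Nash-Sutcliffe coefficient used to rank these experiments has a simple closed form, reproduced below for reference; the toy arrays are illustrative only.

```python
# Nash-Sutcliffe efficiency: 1 minus the ratio of the simulation error variance to
# the variance of the observations. A value of 1 is a perfect fit; values below 0
# indicate the simulation is worse than using the observed mean.
import numpy as np

def nash_sutcliffe(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

print(nash_sutcliffe([2.1, 3.0, 4.2], [2.0, 3.1, 4.0]))  # close to 1 for a good fit
```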

  11. Disk Operating System--DOS. Teacher Packet. Learning Activity Packets.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    The Learning Activity Packets (LAPs) contained in this manual are designed to assist the beginning user in understanding DOS (Disk Operating System). LAPs will not work with any version below DOS Version 3.0 and do not address the enhanced features of versions 4.0 or higher. These elementary activities cover only the DOS commands necessary to…

  12. Stochastic hyperfine interactions modeling library-Version 2

    NASA Astrophysics Data System (ADS)

    Zacate, Matthew O.; Evenson, William E.

    2016-02-01

    The stochastic hyperfine interactions modeling library (SHIML) provides a set of routines to assist in the development and application of stochastic models of hyperfine interactions. The library provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental techniques that measure hyperfine interactions can be calculated. The optimized vector and matrix operations of the BLAS and LAPACK libraries are utilized. The original version of SHIML constructed and solved Blume matrices for methods that measure hyperfine interactions of nuclear probes in a single spin state. Version 2 provides additional support for methods that measure interactions on two different spin states such as Mössbauer spectroscopy and nuclear resonant scattering of synchrotron radiation. Example codes are provided to illustrate the use of SHIML to (1) generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A22 can be neglected and (2) generate Mössbauer spectra for polycrystalline samples for pure dipole or pure quadrupole transitions.
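
    As a hedged illustration of the library's central numerical step, the snippet below diagonalizes a small, invented non-Hermitian "Blume-like" matrix with NumPy, which, like SHIML, calls LAPACK underneath. The matrix entries and fluctuation rate are placeholders, not a physically calibrated hyperfine model.

```python
# Hedged illustration of the core numerical step: diagonalizing a small
# non-Hermitian "Blume-like" relaxation matrix. The entries are invented;
# numpy.linalg.eig uses the same LAPACK routines that SHIML calls from C.
import numpy as np

fluctuation_rate = 0.5
blume_like = np.array([[-fluctuation_rate + 1.0j,  fluctuation_rate        ],
                       [ fluctuation_rate,         -fluctuation_rate - 1.0j]])

eigenvalues, eigenvectors = np.linalg.eig(blume_like)
print(eigenvalues)  # imaginary parts give oscillation frequencies,
                    # real parts give damping of the perturbation signal
```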

  13. Impact of Physics Parameterization Ordering in a Global Atmosphere Model

    DOE PAGES

    Donahue, Aaron S.; Caldwell, Peter M.

    2018-02-02

    Because weather and climate models must capture a wide variety of spatial and temporal scales, they rely heavily on parameterizations of subgrid-scale processes. The goal of this study is to demonstrate that the assumptions used to couple these parameterizations have an important effect on the climate of version 0 of the Energy Exascale Earth System Model (E3SM) General Circulation Model (GCM), a close relative of version 1 of the Community Earth System Model (CESM1). Like most GCMs, parameterizations in E3SM are sequentially split in the sense that parameterizations are called one after another with each subsequent process feeling the effect of the preceding processes. This coupling strategy is noncommutative in the sense that the order in which processes are called impacts the solution. By examining a suite of 24 simulations with deep convection, shallow convection, macrophysics/microphysics, and radiation parameterizations reordered, process order is shown to have a big impact on predicted climate. In particular, reordering of processes induces differences in net climate feedback that are as big as the intermodel spread in phase 5 of the Coupled Model Intercomparison Project. One reason why process ordering has such a large impact is that the effect of each process is influenced by the processes preceding it. Where output is written is therefore an important control on apparent model behavior. Application of k-means clustering demonstrates that the positioning of macro/microphysics and shallow convection plays a critical role on the model solution.
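
    The noncommutativity of sequential splitting described above is easy to demonstrate with a toy example: two idealized "processes" update the same state one after another, and swapping their order changes the end-of-step result. The processes below are invented stand-ins, not E3SM parameterizations.

```python
# Toy demonstration of why sequential (process) splitting is noncommutative:
# two idealized "parameterizations" update the same state in sequence, and
# swapping their order changes the end-of-step result.
def condensation(temp, moisture, dt):
    # Remove moisture above a threshold and release latent heat (illustrative).
    excess = max(moisture - 1.0, 0.0)
    return temp + 2.5 * excess * dt, moisture - excess * dt

def radiation(temp, moisture, dt):
    # Cool toward a reference temperature at a fixed rate (illustrative).
    return temp - 0.1 * (temp - 250.0) * dt, moisture

state = (260.0, 1.4)
a = radiation(*condensation(*state, 1.0), 1.0)   # condensation called first
b = condensation(*radiation(*state, 1.0), 1.0)   # radiation called first
print(a, b)  # the two orderings give different end-of-step temperatures
```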

  14. Impact of Physics Parameterization Ordering in a Global Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Donahue, Aaron S.; Caldwell, Peter M.

    2018-02-01

    Because weather and climate models must capture a wide variety of spatial and temporal scales, they rely heavily on parameterizations of subgrid-scale processes. The goal of this study is to demonstrate that the assumptions used to couple these parameterizations have an important effect on the climate of version 0 of the Energy Exascale Earth System Model (E3SM) General Circulation Model (GCM), a close relative of version 1 of the Community Earth System Model (CESM1). Like most GCMs, parameterizations in E3SM are sequentially split in the sense that parameterizations are called one after another with each subsequent process feeling the effect of the preceding processes. This coupling strategy is noncommutative in the sense that the order in which processes are called impacts the solution. By examining a suite of 24 simulations with deep convection, shallow convection, macrophysics/microphysics, and radiation parameterizations reordered, process order is shown to have a big impact on predicted climate. In particular, reordering of processes induces differences in net climate feedback that are as big as the intermodel spread in phase 5 of the Coupled Model Intercomparison Project. One reason why process ordering has such a large impact is that the effect of each process is influenced by the processes preceding it. Where output is written is therefore an important control on apparent model behavior. Application of k-means clustering demonstrates that the positioning of macro/microphysics and shallow convection plays a critical role on the model solution.

  15. Impact of Physics Parameterization Ordering in a Global Atmosphere Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donahue, Aaron S.; Caldwell, Peter M.

    Because weather and climate models must capture a wide variety of spatial and temporal scales, they rely heavily on parameterizations of subgrid-scale processes. The goal of this study is to demonstrate that the assumptions used to couple these parameterizations have an important effect on the climate of version 0 of the Energy Exascale Earth System Model (E3SM) General Circulation Model (GCM), a close relative of version 1 of the Community Earth System Model (CESM1). Like most GCMs, parameterizations in E3SM are sequentially split in the sense that parameterizations are called one after another with each subsequent process feeling the effect of the preceding processes. This coupling strategy is noncommutative in the sense that the order in which processes are called impacts the solution. By examining a suite of 24 simulations with deep convection, shallow convection, macrophysics/microphysics, and radiation parameterizations reordered, process order is shown to have a big impact on predicted climate. In particular, reordering of processes induces differences in net climate feedback that are as big as the intermodel spread in phase 5 of the Coupled Model Intercomparison Project. One reason why process ordering has such a large impact is that the effect of each process is influenced by the processes preceding it. Where output is written is therefore an important control on apparent model behavior. Application of k-means clustering demonstrates that the positioning of macro/microphysics and shallow convection plays a critical role on the model solution.

  16. Space Shuttle Orbiter flight heating rate measurement sensitivity to thermal protection system uncertainties

    NASA Technical Reports Server (NTRS)

    Bradley, P. F.; Throckmorton, D. A.

    1981-01-01

    A study was completed to determine the sensitivity of computed convective heating rates to uncertainties in the thermal protection system thermal model. The parameters considered were: density, thermal conductivity, and specific heat of both the reusable surface insulation and its coating; coating thickness and emittance; and temperature measurement uncertainty. The assessment used a modified version of the computer program that calculates heating rates from temperature time histories. The original version of the program solves the direct one-dimensional heating problem, while the modified version solves the inverse problem. The modified program was used in thermocouple data reduction for shuttle flight data. Both nominal and altered thermal models were used to determine the necessity for accurate knowledge of the thermal protection system's material thermal properties. For many thermal properties, the sensitivity (inaccuracies created in the calculation of convective heating rate by an altered property) was very low.

  17. US Navy Global and Regional Wave Modeling

    DTIC Science & Technology

    2014-09-01

    Future plans call for increasing the resolution to 0.5 degree, upgrading to WW3 version 4, and including the ...NAVOCEANO WW3 system is in the early stages, and a number of key shortcomings have been identified for future improvement. The multigrid system...J. Shriver, R. Helber, P. Spence, S. Carroll, O.M. Smedstad, and B. Lunde. 2011. Validation Test Report for the Navy Coupled Ocean

  18. Modeling North Atlantic Nor'easters With Modern Wave Forecast Models

    NASA Astrophysics Data System (ADS)

    Perrie, Will; Toulany, Bechara; Roland, Aron; Dutour-Sikiric, Mathieu; Chen, Changsheng; Beardsley, Robert C.; Qi, Jianhua; Hu, Yongcun; Casey, Michael P.; Shen, Hui

    2018-01-01

    Three state-of-the-art operational wave forecast model systems are implemented on fine-resolution grids for the Northwest Atlantic. These models are: (1) a composite model system consisting of SWAN implemented within WAVEWATCHIII® (the latter is hereafter, WW3) on a nested system of traditional structured grids, (2) an unstructured grid finite-volume wave model denoted "SWAVE," using SWAN physics, and (3) an unstructured grid finite element wind wave model denoted as "WWM" (for "wind wave model") which uses WW3 physics. Models are implemented on grid systems that include relatively large domains to capture the wave energy generated by the storms, as well as including fine-resolution nearshore regions of the southern Gulf of Maine with resolution on the scale of 25 m to simulate areas where inundation and coastal damage have occurred, due to the storms. Storm cases include three intense midlatitude cases: a spring Nor'easter storm in May 2005, the Patriot's Day storm in 2007, and the Boxing Day storm in 2010. Although these wave model systems have comparable overall properties in terms of their performance and skill, it is found that there are differences. Models that use more advanced physics, as presented in recent versions of WW3, tuned to regional characteristics, as in the Gulf of Maine and the Northwest Atlantic, can give enhanced results.

  19. Performance Assessment of New Land-Surface and Planetary Boundary Layer Physics in the WRF-ARW

    EPA Science Inventory

    The Pleim-Xiu land surface model, Pleim surface layer scheme, and Asymmetric Convective Model (version 2) are now options in version 3.0 of the Weather Research and Forecasting model (WRF) Advanced Research WRF (ARW) core. These physics parameterizations were developed for the f...

  20. SATCOM antenna siting study on P-3C aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Bensman, D. A.; Marhefka, R. J.

    1991-01-01

    The NEC-BSC (Basic Scattering Code) was used to study the performance of a SATCOM antenna on a P-3C aircraft. After plate cylinder fields are added to version 3.1 of the NEC-BSC, it is shown that the NEC-BSC can be used to accurately predict the performance of a SATCOM antenna system on a P-3C aircraft. The study illustrates that the NEC-BSC gives good results when compared with scale model measurements provided by Boeing and Lockheed.

  1. STARS Conceptual Framework for Reuse Process (CFRP). Volume 1. Definition. Version 3.0

    DTIC Science & Technology

    1993-10-25

    Command, USAF Hanscom AFB, MA 01731-5000 Prepared by: The Boeing Company, IBM, Defense & Space Group, Federal Systems... Company, Unisys Corporation, P.O. Box 3999, MS 87-37 800 N. Frederick Pike, 12010 Sunrise Valley Drive, Seattle, WA 98124-2499 Gaithersburg, MD 20879... 34 3.2.1.1 Domain Analysis and Modeling Process Category ............ 38 3.2.1.2 Domain Architecture Development Process

  2. Experiments in fault tolerant software reliability

    NASA Technical Reports Server (NTRS)

    Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.

    1987-01-01

    The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.

  3. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    USGS Publications Warehouse

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of M6.5–7 earthquake rates and also includes types of multifault ruptures seen in nature. Although UCERF3 fits the data better than UCERF2 overall, there may be areas that warrant further site-specific investigation. Supporting products may be of general interest, and we list key assumptions and avenues for future model improvements.
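
    As a schematic of the "grand inversion" idea, the sketch below uses a simple simulated annealing loop to find non-negative rates x that satisfy a small set of linear data constraints d = A x. The 2x3 system, proposal width, and cooling schedule are invented for illustration and bear no relation to the actual UCERF3 inversion setup.

```python
# Toy version of an underdetermined rate inversion solved by simulated annealing:
# find non-negative rates x minimizing the misfit to linear constraints d = A x.
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])        # invented constraint matrix
d = np.array([2.0, 3.0])               # invented target data

def misfit(x):
    return np.sum((A @ x - d) ** 2)

x = np.ones(3)                         # initial non-negative rates
energy, temperature = misfit(x), 1.0
for step in range(20000):
    candidate = np.clip(x + rng.normal(0.0, 0.1, size=3), 0.0, None)
    e_new = misfit(candidate)
    # Accept improvements always, and worse states with a temperature-dependent probability.
    if e_new < energy or rng.random() < np.exp((energy - e_new) / temperature):
        x, energy = candidate, e_new
    temperature *= 0.9995              # geometric cooling schedule
print(x, energy)                       # one of many solutions fitting d closely
```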

  4. Building Loads Analysis and System Thermodynamics (BLAST) Program Users Manual. Volume One. Supplement (Version 3.0).

    DTIC Science & Technology

    1981-03-01

    AD-A B99 054 CONSTRUCTION ENGINEERING RESEARCH LAB (ARMY) CHAMPAIGN IL F/ 9/2 BUILDING LOADS ANALYSIS AND SYSTEM THERMODYNAMICS (BLAST) PROGR...continued. systems, (11) induction unit systems, (12) direct-drive chillers, and (13) purchased steam from utilities. BLAST Version 3.0 also offers the user...their BLAST input. II UNCLASSIFIED SECURITY CLASSIFICATION OF THIS PAGE (When Data Entered) FOREWORD This report was prepared for the Air Force Systems

  5. Regional climate modeling over the Maritime Continent: Assessment of RegCM3-BATS1e and RegCM3-IBIS

    NASA Astrophysics Data System (ADS)

    Gianotti, R. L.; Zhang, D.; Eltahir, E. A.

    2010-12-01

    Despite its importance to global rainfall and circulation processes, the Maritime Continent remains a region that is poorly simulated by climate models. Relatively few studies have been undertaken using a model with fine enough resolution to capture the small-scale spatial heterogeneity of this region and associated land-atmosphere interactions. These studies have shown that even regional climate models (RCMs) struggle to reproduce the climate of this region, particularly the diurnal cycle of rainfall. This study builds on previous work by undertaking a more thorough evaluation of RCM performance in simulating the timing and intensity of rainfall over the Maritime Continent, with identification of major sources of error. An assessment was conducted of the Regional Climate Model Version 3 (RegCM3) used in a coupled system with two land surface schemes: Biosphere Atmosphere Transfer System Version 1e (BATS1e) and Integrated Biosphere Simulator (IBIS). The model’s performance in simulating precipitation was evaluated against the 3-hourly TRMM 3B42 product, with some validation provided of this TRMM product against ground station meteorological data. It is found that the model suffers from three major errors in the rainfall histogram: underestimation of the frequency of dry periods, overestimation of the frequency of low intensity rainfall, and underestimation of the frequency of high intensity rainfall. Additionally, the model shows error in the timing of the diurnal rainfall peak, particularly over land surfaces. These four errors were largely insensitive to the choice of boundary conditions, convective parameterization scheme or land surface scheme. The presence of a wet or dry bias in the simulated volumes of rainfall was, however, dependent on the choice of convection scheme and boundary conditions. This study also showed that the coupled model system has significant error in overestimation of latent heat flux and evapotranspiration from the land surface, and specifically overestimation of interception loss with concurrent underestimation of transpiration, irrespective of the land surface scheme used. Discussion of the origin of these errors is provided, with some suggestions for improvement.

  6. Design analysis and computer-aided performance evaluation of shuttle orbiter electrical power system. Volume 2: SYSTID user's guide

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The manual for the use of the computer program SYSTID under the Univac operating system is presented. The computer program is used in the simulation and evaluation of the space shuttle orbiter electric power supply. The models described in the handbook are those which were available in the original versions of SYSTID. The subjects discussed are: (1) program description, (2) input language, (3) node typing, (4) problem submission, and (5) basic and power system SYSTID libraries.

  7. Logistics Support Analysis Techniques Guide

    DTIC Science & Technology

    1985-03-15

    LANGUAGE (DATA RECORDS) FORTRAN CDC 6600 D&V FSD P/D A H REMARKS: Program consists of APPLICATIONS, approx 4000 lines of coding, 3 Safeguard, AN/FSC... FORTRAN IV REMARKS: The model consists of APPLICATIONS: approximately 367 lines of SINCGARS, PERSHING II coding. ... LSA TASK INTERFACE...system supported by Computer Systems Command. The current version of LADEN is coded totally in FORTRAN for a virtual memory operating system

  8. Price Responsiveness in the AEO2003 NEMS Residential and Commercial Buildings Sector Models

    EIA Publications

    2003-01-01

    This paper describes the demand responses to changes in energy prices in the Annual Energy Outlook 2003 versions of the Residential and Commercial Demand Modules of the National Energy Modeling System (NEMS). It updates a similar paper completed for the Annual Energy Outlook 1999 version of the NEMS.

  9. Distributed Database Control and Allocation. Volume 3. Distributed Database System Designer’s Handbook.

    DTIC Science & Technology

    1983-10-01

    Multiversion Data 2-18 2.7.1 Multiversion Timestamping 2-20 2.7.2 Multiversion Locking 2-20 2.8 Combining the Techniques 2-22 3. Database Recovery Algorithms...See [THEM79, GIFF79] for details. 2.7 Multiversion Data Let us return to a database system model where each logical data item is stored at one DM...In a multiversion database each Write wi[x] produces a new copy (or version) of x, denoted xi. Thus, the value of x is a set of versions. For each

  10. Version 4.0 of code Java for 3D simulation of the CCA model

    NASA Astrophysics Data System (ADS)

    Fan, Linyu; Liao, Jianwei; Zuo, Junsen; Zhang, Kebo; Li, Chao; Xiong, Hailing

    2018-07-01

    This paper presents a new version of the Java code for three-dimensional simulation of the Cluster-Cluster Aggregation (CCA) model, replacing the previous version. Many redundant traversals of the cluster list in the program were eliminated, so the simulation time is significantly reduced. To show the aggregation process more intuitively, different clusters are labeled with different colors. In addition, a new function outputs the particle coordinates of aggregates to a file, which facilitates coupling our model with other models.

  11. Model-Driven Development for scientific computing. An upgrade of the RHEEDGr program

    NASA Astrophysics Data System (ADS)

    Daniluk, Andrzej

    2009-11-01

    Model-Driven Engineering (MDE) is the software engineering discipline, which considers models as the most important element for software development, and for the maintenance and evolution of software, through model transformation. Model-Driven Architecture (MDA) is the approach for software development under the Model-Driven Engineering framework. This paper surveys the core MDA technology that was used to upgrade of the RHEEDGR program to C++0x language standards. New version program summaryProgram title: RHEEDGR-09 Catalogue identifier: ADUY_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADUY_v3_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 21 263 No. of bytes in distributed program, including test data, etc.: 1 266 982 Distribution format: tar.gz Programming language: Code Gear C++ Builder Computer: Intel Core Duo-based PC Operating system: Windows XP, Vista, 7 RAM: more than 1 MB Classification: 4.3, 7.2, 6.2, 8, 14 Does the new version supersede the previous version?: Yes Nature of problem: Reflection High-Energy Electron Diffraction (RHEED) is a very useful technique for studying growth and surface analysis of thin epitaxial structures prepared by the Molecular Beam Epitaxy (MBE). The RHEED technique can reveal, almost instantaneously, changes either in the coverage of the sample surface by adsorbates or in the surface structure of a thin film. Solution method: The calculations are based on the use of a dynamical diffraction theory in which the electrons are taken to be diffracted by a potential, which is periodic in the dimension perpendicular to the surface. Reasons for new version: Responding to the user feedback the graphical version of the RHEED program has been upgraded to C++0x language standards. Also, functionality and documentation of the program have been improved. Summary of revisions: Model-Driven Architecture (MDA) is the approach defined by the Object Management Group (OMG) for software development under the Model-Driven Engineering framework [1]. The MDA approach shifts the focus of software development from writing code to building models. By adapting a model-centric approach, the MDA approach hopes to automate the generation of system implementation artifacts directly from the model. The following three models are the core of the MDA: (i) the Computation Independent Model (CIM), which is focused on basic requirements of the system, (ii) the Platform Independent Model (PIM), which is used by software architects and designers, and is focused on the operational capabilities of a system outside the context of a specific platform, and (iii) the Platform Specific Model (PSM), which is used by software developers and programmers, and includes details relating to the system for a specific platform. Basic requirements for the calculation of the RHEED intensity rocking curves in the one-beam condition have been described in Ref. [2]. Fig. 1 shows the PIM for the present version of the program. Fig. 2 presents the PSM for the program. The TGraph2D.bpk package has been recompiled to Graph2D0x.bpl and upgraded according to C++0x language standards. Fig. 3 shows the PSM of the Graph2D component, which is manifested by the Graph2D0x.bpl package presently. 
This diagram is a graphic presentation of the static view, which shows a collection of declarative model elements and their relationships. Installation instructions of the Graph2D0x package can be found in the new distribution. The program requires the user to provide the appropriate parameters for the crystal structure under investigation. These parameters are loaded from the parameters.ini file at run-time. Instructions for the preparation of the .ini files can be found in the new distribution. The program enables carrying out one-dimensional dynamical calculations for the fcc lattice, with a two-atoms basis and fcc lattice, with one atom basis but yet the zeroth Fourier component of the scattering potential in the TRHEED1D::crystPotUg() function can be modified according to users' specific application requirements. A graphical user interface (GUI) for the program has been reconstructed. The program has been compiled with English/USA regional and language options. Unusual features: The program is distributed in the form of main projects RHEEDGr_09.cbproj and Graph2D0x.cbproj with associated files, and should be compiled using Code Gear C++ Builder 2009 compilers. Running time: The typical running time is machine and user-parameters dependent. References: OMG, Model Driven Architecture Guide Version 1.0.1, 2003, http://www.omg.org/cgi-bin/doc?omg/03-06-01. A. Daniluk, Comput. Phys. Comm. 166 (2005) 123.

  12. PVWatts Version 5 Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobos, A. P.

    2014-09-01

    The NREL PVWatts calculator is a web application developed by the National Renewable Energy Laboratory (NREL) that estimates the electricity production of a grid-connected photovoltaic system based on a few simple inputs. PVWatts combines a number of sub-models to predict overall system performance and includes several built-in parameters that are hidden from the user. This technical reference describes the sub-models, documents assumptions and hidden parameters, and explains the sequence of calculations that yields the final system performance estimate. This reference is applicable to the significantly revised version of PVWatts released by NREL in 2014.

  13. The CNRM-CM5.1 global climate model: description and basic evaluation

    NASA Astrophysics Data System (ADS)

    Voldoire, A.; Sanchez-Gomez, E.; Salas y Mélia, D.; Decharme, B.; Cassou, C.; Sénési, S.; Valcke, S.; Beau, I.; Alias, A.; Chevallier, M.; Déqué, M.; Deshayes, J.; Douville, H.; Fernandez, E.; Madec, G.; Maisonnave, E.; Moine, M.-P.; Planton, S.; Saint-Martin, D.; Szopa, S.; Tyteca, S.; Alkama, R.; Belamari, S.; Braun, A.; Coquart, L.; Chauvin, F.

    2013-05-01

    A new version of the general circulation model CNRM-CM has been developed jointly by CNRM-GAME (Centre National de Recherches Météorologiques—Groupe d'études de l'Atmosphère Météorologique) and Cerfacs (Centre Européen de Recherche et de Formation Avancée) in order to contribute to phase 5 of the Coupled Model Intercomparison Project (CMIP5). The purpose of the study is to describe its main features and to provide a preliminary assessment of its mean climatology. CNRM-CM5.1 includes the atmospheric model ARPEGE-Climat (v5.2), the ocean model NEMO (v3.2), the land surface scheme ISBA and the sea ice model GELATO (v5) coupled through the OASIS (v3) system. The main improvements since CMIP3 are the following. Horizontal resolution has been increased both in the atmosphere (from 2.8° to 1.4°) and in the ocean (from 2° to 1°). The dynamical core of the atmospheric component has been revised. A new radiation scheme has been introduced and the treatments of tropospheric and stratospheric aerosols have been improved. Particular care has been devoted to ensure mass/water conservation in the atmospheric component. The land surface scheme ISBA has been externalised from the atmospheric model through the SURFEX platform and includes new developments such as a parameterization of sub-grid hydrology, a new freezing scheme and a new bulk parameterisation for ocean surface fluxes. The ocean model is based on the state-of-the-art version of NEMO, which has greatly progressed since the OPA8.0 version used in the CMIP3 version of CNRM-CM. Finally, the coupling between the different components through OASIS has also received a particular attention to avoid energy loss and spurious drifts. These developments generally lead to a more realistic representation of the mean recent climate and to a reduction of drifts in a preindustrial integration. The large-scale dynamics is generally improved both in the atmosphere and in the ocean, and the bias in mean surface temperature is clearly reduced. However, some flaws remain such as significant precipitation and radiative biases in many regions, or a pronounced drift in three dimensional salinity.

  14. Inertial Survey Application to Civil Works,

    DTIC Science & Technology

    1983-01-01

    closer together the ZUPTS must be performed. ZUPTS are used by the system to provide external information to the error control system used (Kalman...the best estimate of the system states. Accuracy of the system is increased when the sensor information from the inertial platform is compared with the...building a new version of its Auto-Surveyor System to be known as LASS II. This version will be based on the production model of the PADS presently

  15. Modeling variably saturated multispecies reactive groundwater solute transport with MODFLOW-UZF and RT3D

    USGS Publications Warehouse

    Bailey, Ryan T.; Morway, Eric D.; Niswonger, Richard G.; Gates, Timothy K.

    2013-01-01

    A numerical model was developed that is capable of simulating multispecies reactive solute transport in variably saturated porous media. This model consists of a modified version of the reactive transport model RT3D (Reactive Transport in 3 Dimensions) that is linked to the Unsaturated-Zone Flow (UZF1) package and MODFLOW. Referred to as UZF-RT3D, the model is tested against published analytical benchmarks as well as other published contaminant transport models, including HYDRUS-1D, VS2DT, and SUTRA, and the coupled flow and transport modeling system of CATHY and TRAN3D. Comparisons in one-dimensional, two-dimensional, and three-dimensional variably saturated systems are explored. While several test cases are included to verify the correct implementation of variably saturated transport in UZF-RT3D, other cases are included to demonstrate the usefulness of the code in terms of model run-time and handling the reaction kinetics of multiple interacting species in variably saturated subsurface systems. As UZF1 relies on a kinematic-wave approximation for unsaturated flow that neglects the diffusive terms in Richards equation, UZF-RT3D can be used for large-scale aquifer systems for which the UZF1 formulation is reasonable, that is, capillary-pressure gradients can be neglected and soil parameters can be treated as homogeneous. Decreased model run-time and the ability to include site-specific chemical species and chemical reactions make UZF-RT3D an attractive model for efficient simulation of multispecies reactive transport in variably saturated large-scale subsurface systems.

  16. Implications of the methodological choices for hydrologic portrayals of climate change over the contiguous United States: Statistically downscaled forcing data and hydrologic models

    USGS Publications Warehouse

    Mizukami, Naoki; Clark, Martyn P.; Gutmann, Ethan D.; Mendoza, Pablo A.; Newman, Andrew J.; Nijssen, Bart; Livneh, Ben; Hay, Lauren E.; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    Continental-domain assessments of climate change impacts on water resources typically rely on statistically downscaled climate model outputs to force hydrologic models at a finer spatial resolution. This study examines the effects of four statistical downscaling methods [bias-corrected constructed analog (BCCA), bias-corrected spatial disaggregation applied at daily (BCSDd) and monthly scales (BCSDm), and asynchronous regression (AR)] on retrospective hydrologic simulations using three hydrologic models with their default parameters (the Community Land Model, version 4.0; the Variable Infiltration Capacity model, version 4.1.2; and the Precipitation–Runoff Modeling System, version 3.0.4) over the contiguous United States (CONUS). Biases of hydrologic simulations forced by statistically downscaled climate data relative to the simulation with observation-based gridded data are presented. Each statistical downscaling method produces different meteorological portrayals including precipitation amount, wet-day frequency, and the energy input (i.e., shortwave radiation), and their interplay affects estimations of precipitation partitioning between evapotranspiration and runoff, extreme runoff, and hydrologic states (i.e., snow and soil moisture). The analyses show that BCCA underestimates annual precipitation by as much as −250 mm, leading to unreasonable hydrologic portrayals over the CONUS for all models. Although the other three statistical downscaling methods produce a comparable precipitation bias ranging from −10 to 8 mm across the CONUS, BCSDd severely overestimates the wet-day fraction by up to 0.25, leading to different precipitation partitioning compared to the simulations with other downscaled data. Overall, the choice of downscaling method contributes to less spread in runoff estimates (by a factor of 1.5–3) than the choice of hydrologic model with use of the default parameters if BCCA is excluded.

  17. Dynamic Analysis of Spur Gear Transmissions (DANST). PC Version 3.00 User Manual

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Lin, Hsiang Hsi; Delgado, Irebert R.

    1996-01-01

    DANST is a FORTRAN computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the static transmission error, dynamic load, tooth bending stress and other properties of spur gears as they are influenced by operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratios ranging from one to three. It was designed to be easy to use and it is extensively documented in several previous reports and by comments in the source code. This report describes installing and using a new PC version of DANST, covers input data requirements and presents examples.

  18. Development of a system emulating the global carbon cycle in Earth system models

    NASA Astrophysics Data System (ADS)

    Tachiiri, K.; Hargreaves, J. C.; Annan, J. D.; Oka, A.; Abe-Ouchi, A.; Kawamiya, M.

    2010-08-01

    Recent studies have indicated that the uncertainty in the global carbon cycle may have a significant impact on the climate. Since state of the art models are too computationally expensive for it to be possible to explore their parametric uncertainty in anything approaching a comprehensive fashion, we have developed a simplified system for investigating this problem. By combining the strong points of general circulation models (GCMs), which contain detailed and complex processes, and Earth system models of intermediate complexity (EMICs), which are quick and capable of large ensembles, we have developed a loosely coupled model (LCM) which can represent the outputs of a GCM-based Earth system model, using much smaller computational resources. We address the problem of relatively poor representation of precipitation within our EMIC, which prevents us from directly coupling it to a vegetation model, by coupling it to a precomputed transient simulation using a full GCM. The LCM consists of three components: an EMIC (MIROC-lite) which consists of a 2-D energy balance atmosphere coupled to a low resolution 3-D GCM ocean (COCO) including an ocean carbon cycle (an NPZD-type marine ecosystem model); a state of the art vegetation model (Sim-CYCLE); and a database of daily temperature, precipitation, and other necessary climatic fields to drive Sim-CYCLE from a precomputed transient simulation from a state of the art AOGCM. The transient warming of the climate system is calculated from MIROC-lite, with the global temperature anomaly used to select the most appropriate annual climatic field from the pre-computed AOGCM simulation which, in this case, is a 1% pa increasing CO2 concentration scenario. By adjusting the effective climate sensitivity (equivalent to the equilibrium climate sensitivity for an energy balance model) of MIROC-lite, the transient warming of the LCM could be adjusted to closely follow the low sensitivity (with an equilibrium climate sensitivity of 4.0 K) version of MIROC3.2. By tuning of the physical and biogeochemical parameters it was possible to reasonably reproduce the bulk physical and biogeochemical properties of previously published CO2 stabilisation scenarios for that model. As an example of an application of the LCM, the behavior of the high sensitivity version of MIROC3.2 (with a 6.3 K equilibrium climate sensitivity) is also demonstrated. Given the highly adjustable nature of the model, we believe that the LCM should be a very useful tool for studying uncertainty in global climate change, and we have named the model, JUMP-LCM, after the name of our research group (Japan Uncertainty Modelling Project).
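
    An illustrative sketch (not JUMP-LCM code; the anomaly values are made up) of the coupling step described above: for each simulated year, select the precomputed AOGCM climate field whose global-mean temperature anomaly is closest to the EMIC's anomaly.

      import numpy as np

      # anomalies of the stored annual AOGCM fields, e.g. from a 1% per year CO2 run
      aogcm_anomalies = np.linspace(0.0, 4.0, 141)

      def select_field_index(emic_anomaly):
          """Return the index of the stored field with the nearest global-mean anomaly."""
          return int(np.argmin(np.abs(aogcm_anomalies - emic_anomaly)))

      for t_anom in (0.37, 1.82, 3.05):
          print(t_anom, "->", select_field_index(t_anom))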

  19. Integration of the predictions of two models with dose measurements in a case study of children exposed to the emissions of a lead smelter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnard, R.; McKone, T.E.

    2009-03-01

    The predictions of two source-to-dose models are systematically evaluated with observed data collected in a village polluted by a currently operating secondary lead smelter. Both models were built up from several sub-models linked together and run using Monte Carlo simulation to calculate the distribution of children's blood lead levels attributable to the emissions from the facility. The first model system is composed of the CalTOX model linked to a recoded version of the IEUBK model. This system provides the distribution of the media-specific lead concentrations (air, soil, fruit, vegetables and blood) in the whole area investigated. The second model consists of a statistical model to estimate the lead deposition on the ground, a modified version of the HHRAP model, and the same recoded version of the IEUBK model. This system provides an estimate of the exposure concentration for specific individuals living in the study area. The predictions of the first model system were improved in terms of accuracy and precision by performing a sensitivity analysis and using field data to correct the default value provided for the leaf wet density. However, in this case study, the first model system tends to overestimate the exposure due to exposed vegetables. The second model was tested for nine children with contrasting exposure conditions. It managed to capture the blood lead levels for eight of them; in the remaining case, exposure of the child through pathways not considered in the model may explain the failure. The advantage of this integrated model is that it provides outputs with lower variance than the first model system, but further tests are necessary before conclusions can be drawn about its accuracy.

  20. System of Least Prompts. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2018

    2018-01-01

    This intervention report presents findings from a systematic review of the System of Least Prompts (SLP) conducted using the WWC Procedures and Standards Handbook (version 3.0) and the Children and Students with Intellectual Disability review protocol (version 3.1). No studies of SLP that fall within the scope of the Children and Students with…

  1. Evaluation of precipitation forecasts from 3D-Var and hybrid GSI-based system during Indian summer monsoon 2015

    NASA Astrophysics Data System (ADS)

    Singh, Sanjeev Kumar; Prasad, V. S.

    2018-02-01

    This paper presents a systematic investigation of medium-range rainfall forecasts from two versions of the National Centre for Medium Range Weather Forecasting (NCMRWF) Global Forecast System, based on three-dimensional variational (3D-Var) and hybrid analysis systems and named NGFS and HNGFS, respectively, during the Indian summer monsoon (June-September) 2015. NGFS uses the gridpoint statistical interpolation (GSI) 3D-Var data assimilation system, whereas HNGFS uses a hybrid 3D ensemble-variational scheme. The analysis includes the evaluation of rainfall fields and comparisons of rainfall using statistical scores such as mean precipitation, bias, correlation coefficient, root mean square error and forecast improvement factor. In addition, categorical scores such as the Peirce skill score and bias score are computed to describe particular aspects of forecast performance. The comparison of mean precipitation reveals that both versions of the model produce similar large-scale features of Indian summer monsoon rainfall for day-1 through day-5 forecasts. The inclusion of fully flow-dependent background error covariance significantly improved the wet biases in HNGFS over the Indian Ocean. The forecast improvement factor and Peirce skill score for HNGFS are also found to be better than those for NGFS for day-1 through day-5 forecasts.
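
    A small sketch of the categorical scores mentioned above, computed from a 2x2 contingency table of forecast versus observed rain occurrence above a chosen threshold; the counts here are hypothetical.

      def peirce_skill_score(hits, false_alarms, misses, correct_negatives):
          # hit rate minus false alarm rate (also called the true skill statistic)
          return hits / (hits + misses) - false_alarms / (false_alarms + correct_negatives)

      def bias_score(hits, false_alarms, misses, correct_negatives):
          # ratio of forecast to observed event frequency
          return (hits + false_alarms) / (hits + misses)

      a, b, c, d = 420, 130, 90, 860    # hypothetical counts for a 1 mm/day threshold
      print(peirce_skill_score(a, b, c, d), bias_score(a, b, c, d))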

  2. Coupling of TRAC-PF1/MOD2, Version 5.4.25, with NESTLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knepper, P.L.; Hochreiter, L.E.; Ivanov, K.N.

    1999-09-01

    A three-dimensional (3-D) spatial kinetics capability within a thermal-hydraulics system code provides a more correct description of the core physics during reactor transients that involve significant variations in the neutron flux distribution. Coupled codes provide the ability to forecast safety margins in a best-estimate manner. The behavior of a reactor core and the feedback to the plant dynamics can be accurately simulated. For each time step, coupled codes are capable of resolving system interaction effects on neutronics feedback and are capable of describing local neutronics effects caused by the thermal hydraulics and neutronics coupling. With the improvements in computational technology, modeling complex reactor behaviors with coupled thermal hydraulics and spatial kinetics is feasible. Previously, reactor analysis codes were limited to either a detailed thermal-hydraulics model with simplified kinetics or multidimensional neutron kinetics with a simplified thermal-hydraulics model. The authors discuss the coupling of the Transient Reactor Analysis Code (TRAC)-PF1/MOD2, Version 5.4.25, with the NESTLE code.

  3. TADIR-production version: El-Op's high-resolution 480x4 TDI thermal imaging system

    NASA Astrophysics Data System (ADS)

    Sarusi, Gabby; Ziv, Natan; Zioni, O.; Gaber, J.; Shechterman, Mark S.; Lerner, M.

    1999-07-01

    Efforts invested at El-Op during the last four years have led to the development of the TADIR engineering-model thermal imager, demonstrated in 1998, and eventually to the final production version of TADIR, to be demonstrated in full operation during 1999. Both versions take advantage of the high resolution and high sensitivity obtained by the 480 X 4 TDI MCT detector, as well as many more features implemented in the system, to obtain a state-of-the-art high-end thermal imager. The production version of TADIR uses a 480 X 6 TDI HgCdTe detector made by the Israeli company SCD. In this paper, we present the main features of the production version of TADIR.

  4. A theoretical basis for the analysis of redundant software subject to coincident errors

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques (fault-tolerant software) is an understanding of the impact of multiple or joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and which permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen as a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and it is asked whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
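
    A minimal numerical sketch of the question the model addresses, under the simplest possible assumption of fully independent version failures (the paper's point is precisely that coincident errors violate this assumption): the failure probability of a majority-voting N-version system compared with a single version.

      from math import comb

      def majority_failure(p, n):
          """P(more than half of n independent versions fail on a given input)."""
          need = n // 2 + 1
          return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(need, n + 1))

      p = 0.01                          # assumed per-version failure probability
      for n in (1, 3, 5):
          print(n, majority_failure(p, n))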

  5. STEPS: Modeling and Simulating Complex Reaction-Diffusion Systems with Python

    PubMed Central

    Wils, Stefan; Schutter, Erik De

    2008-01-01

    We describe how the use of the Python language improved the user interface of the program STEPS. STEPS is a simulation platform for modeling and stochastic simulation of coupled reaction-diffusion systems with complex 3-dimensional boundary conditions. Setting up such models is a complicated process that consists of many phases. Initial versions of STEPS relied on a static input format that did not cleanly separate these phases, limiting modelers in how they could control the simulation and becoming increasingly complex as new features and new simulation algorithms were added. We solved all of these problems by tightly integrating STEPS with Python, using SWIG to expose our existing simulation code. PMID:19623245
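
    The snippet below is not the STEPS Python API; it is a generic Gillespie-style stochastic simulation of a single reversible reaction A <-> B, included only to illustrate the kind of well-mixed stochastic kinetics that reaction-diffusion simulators such as STEPS extend to 3-dimensional geometries.

      import random

      def gillespie(a0=100, b0=0, kf=0.1, kr=0.05, t_end=50.0, seed=1):
          """Stochastic simulation of A <-> B with forward rate kf and reverse rate kr."""
          random.seed(seed)
          t, a, b = 0.0, a0, b0
          while t < t_end:
              rf, rr = kf * a, kr * b           # reaction propensities
              total = rf + rr
              if total == 0.0:
                  break
              t += random.expovariate(total)    # time to next reaction event
              if random.random() < rf / total:
                  a, b = a - 1, b + 1           # A -> B
              else:
                  a, b = a + 1, b - 1           # B -> A
          return a, b

      print(gillespie())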

  6. FY17 Status Report on the Computing Systems for the Yucca Mountain Project TSPA-LA Models.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Appel, Gordon John; Hadgu, Teklu

    Sandia National Laboratories (SNL) continued evaluation of total system performance assessment (TSPA) computing systems for the previously considered Yucca Mountain Project (YMP). This was done to maintain the operational readiness of the computing infrastructure (computer hardware and software) and knowledge capability for total system performance assessment (TSPA) type analysis, as directed by the National Nuclear Security Administration (NNSA), DOE 2010. This work is a continuation of the ongoing readiness evaluation reported in Lee and Hadgu (2014), Hadgu et al. (2015) and Hadgu and Appel (2016). The TSPA computing hardware (CL2014) and storage system described in Hadgu et al. (2015) were used for the current analysis. One floating license of GoldSim with Versions 9.60.300, 10.5, 11.1 and 12.0 was installed on the cluster head node, and its distributed processing capability was mapped on the cluster processors. Other supporting software was tested and installed to support the TSPA-type analysis on the server cluster. The current tasks included a preliminary upgrade of the TSPA-LA from Version 9.60.300 to the latest version 12.0 and addressing DLL-related issues observed in the FY16 work. The model upgrade task successfully converted the Nominal Modeling case to GoldSim Versions 11.1/12. Conversions of the rest of the TSPA models were also attempted, but program and operational difficulties precluded this. Upgrade of the remaining modeling cases and distributed processing tasks is expected to continue. The 2014 server cluster and supporting software systems are fully operational to support TSPA-LA type analysis.

  7. The NASA Modern Era Reanalysis for Research and Applications, Version-2 (MERRA-2)

    NASA Astrophysics Data System (ADS)

    Gelaro, R.; McCarty, W.; Molod, A.; Suarez, M.; Takacs, L.; Todling, R.

    2014-12-01

    The NASA Modern Era Reanalysis for Research and Applications, Version-2 (MERRA-2) is a reanalysis for the satellite era using an updated version of the Goddard Earth Observing System Data Assimilation System Version-5 (GEOS-5) produced by the Global Modeling and Assimilation Office (GMAO). MERRA-2 will assimilate meteorological and aerosol observations not available to MERRA and includes improvements to the GEOS-5 model and analysis scheme so as to provide an ongoing climate analysis beyond MERRA's terminus. MERRA-2 will also serve as a development milestone for a future GMAO coupled Earth system analysis. Production of MERRA-2 began in June 2014 in four processing streams, with convergence to a single near-real time climate analysis expected by early 2015. This talk provides an overview of the MERRA-2 system developments and key science results. For example, compared with MERRA, MERRA-2 exhibits a well-balanced relationship between global precipitation and evaporation, with significantly reduced sensitivity to changes in the global observing system through time. Other notable improvements include reduced biases in the tropical middle- and upper-tropospheric wind and near-surface temperature over continents.

  8. NASA AVOSS Fast-Time Models for Aircraft Wake Prediction: User's Guide (APA3.8 and TDP2.1)

    NASA Technical Reports Server (NTRS)

    Ahmad, Nash'at N.; VanValkenburg, Randal L.; Pruis, Matthew J.; Limon Duparcmeur, Fanny M.

    2016-01-01

    NASA's current distribution of fast-time wake vortex decay and transport models includes APA (Version 3.8) and TDP (Version 2.1). This User's Guide provides detailed information on the model inputs, file formats, and model outputs. A brief description of the Memphis 1995, Dallas/Fort Worth 1997, and the Denver 2003 wake vortex datasets is given along with the evaluation of models. A detailed bibliography is provided which includes publications on model development, wake field experiment descriptions, and applications of the fast-time wake vortex models.

  9. GNAQPMS v1.1: accelerating the Global Nested Air Quality Prediction Modeling System (GNAQPMS) on Intel Xeon Phi processors

    NASA Astrophysics Data System (ADS)

    Wang, Hui; Chen, Huansheng; Wu, Qizhong; Lin, Junmin; Chen, Xueshun; Xie, Xinwei; Wang, Rongrong; Tang, Xiao; Wang, Zifa

    2017-08-01

    The Global Nested Air Quality Prediction Modeling System (GNAQPMS) is the global version of the Nested Air Quality Prediction Modeling System (NAQPMS), which is a multi-scale chemical transport model used for air quality forecast and atmospheric environmental research. In this study, we present the porting and optimisation of GNAQPMS on a second-generation Intel Xeon Phi processor, codenamed Knights Landing (KNL). Compared with the first-generation Xeon Phi coprocessor (codenamed Knights Corner, KNC), KNL has many new hardware features such as a bootable processor, high-performance in-package memory and ISA compatibility with Intel Xeon processors. In particular, we describe the five optimisations we applied to the key modules of GNAQPMS, including the CBM-Z gas-phase chemistry, advection, convection and wet deposition modules. These optimisations work well on both the KNL 7250 processor and the Intel Xeon E5-2697 V4 processor. They include (1) updating the pure Message Passing Interface (MPI) parallel mode to the hybrid parallel mode with MPI and OpenMP in the emission, advection, convection and gas-phase chemistry modules; (2) fully employing the 512 bit wide vector processing units (VPUs) on the KNL platform; (3) reducing unnecessary memory access to improve cache efficiency; (4) reducing the thread local storage (TLS) in the CBM-Z gas-phase chemistry module to improve its OpenMP performance; and (5) changing the global communication from writing/reading interface files to MPI functions to improve the performance and the parallel scalability. These optimisations greatly improved the GNAQPMS performance. The same optimisations also work well for the Intel Xeon Broadwell processor, specifically E5-2697 v4. Compared with the baseline version of GNAQPMS, the optimised version was 3.51 × faster on KNL and 2.77 × faster on the CPU. Moreover, the optimised version ran at 26 % lower average power on KNL than on the CPU. With the combined performance and energy improvement, the KNL platform was 37.5 % more efficient on power consumption compared with the CPU platform. The optimisations also enabled much further parallel scalability on both the CPU cluster and the KNL cluster scaled to 40 CPU nodes and 30 KNL nodes, with a parallel efficiency of 70.4 and 42.2 %, respectively.
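
    As a conceptual illustration only (GNAQPMS itself is Fortran, and this is not its source), optimisation (5) replaces file-based global exchange with direct MPI calls; with the mpi4py bindings, the same pattern is a single collective operation.

      import numpy as np
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      local_field = np.full(4, comm.Get_rank(), dtype=np.float64)  # this rank's slice
      global_sum = np.empty_like(local_field)
      comm.Allreduce(local_field, global_sum, op=MPI.SUM)          # no interface files needed
      if comm.Get_rank() == 0:
          print(global_sum)

    Run under an MPI launcher (for example, mpirun -np 4 python script.py) so that several ranks participate in the collective.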

  10. Multi-model perspectives and inter-comparison of soil moisture and evapotranspiration in East Africa—an application of Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System (FLDAS)

    NASA Astrophysics Data System (ADS)

    Pervez, M. S.; McNally, A.; Arsenault, K. R.

    2017-12-01

    Convergence of evidence from different agro-hydrologic sources is particularly important for drought monitoring in data-sparse regions. In Africa, a combination of remote sensing and land surface modeling experiments is used to evaluate past, present and future drought conditions. The Famine Early Warning Systems Network (FEWS NET) Land Data Assimilation System (FLDAS) routinely simulates daily soil moisture, evapotranspiration (ET) and other variables over Africa using multiple models and inputs. We found that Noah 3.3, Variable Infiltration Capacity (VIC) 4.1.2, and Catchment Land Surface Model-based FLDAS simulations of monthly soil moisture percentile maps captured concurrent drought and water surplus episodes effectively over East Africa. However, the results are sensitive to the selection of land surface model and hydrometeorological forcings. We seek to identify sources of uncertainty (input, model, parameter) to eventually improve the accuracy of FLDAS outputs. In the absence of in situ data, previous work used European Space Agency Climate Change Initiative Soil Moisture (CCI-SM) data measured from merged active-passive microwave remote sensing to evaluate FLDAS soil moisture, and found that during the high rainfall months of April-May and November-December Noah-based soil moisture correlates well with CCI-SM over the Greater Horn of Africa region. We have found good correlations (r>0.6) between FLDAS Noah 3.3 ET anomalies and Operational Simplified Surface Energy Balance (SSEBop) ET over East Africa. Recently, SSEBop ET estimates (version 4) were improved by implementing a land surface temperature correction factor. We re-evaluate the correlations between FLDAS ET and version 4 SSEBop ET. To further investigate the reasons for differences between models, we evaluate FLDAS soil moisture with Advanced Scatterometer and SMAP soil moisture, and FLDAS outputs with MODIS and AVHRR normalized difference vegetation index. By exploring longer historic time series and near-real time products, we will aid convergence of evidence for a better understanding of historic drought, improved monitoring and forecasting, and a better understanding of the uncertainties in water availability estimation over Africa.
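
    An illustrative sketch (synthetic data, assumed variable names) of the percentile calculation behind such drought maps: a month's simulated soil moisture is ranked against the same calendar month over the historical record.

      import numpy as np
      from scipy import stats

      def monthly_percentile(history, current):
          """Percentile (0-100) of this year's value within the historical record for that month."""
          return stats.percentileofscore(history, current, kind="mean")

      rng = np.random.default_rng(42)
      april_sm = rng.normal(0.25, 0.04, 30)     # synthetic 30-year April soil moisture [m3/m3]
      print(round(monthly_percentile(april_sm, 0.18), 1))   # a dry April ranks low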

  11. Developmental assessment of the Fort St. Vrain version of the Composite HTGR Analysis Program (CHAP-2)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stroh, K.R.

    1980-01-01

    The Composite HTGR Analysis Program (CHAP) consists of a model-independent systems analysis mainframe named LASAN and model-dependent linked code modules, each representing a component, subsystem, or phenomenon of an HTGR plant. The Fort St. Vrain (FSV) version (CHAP-2) includes 21 coded modules that model the neutron kinetics and thermal response of the core; the thermal-hydraulics of the reactor primary coolant system, secondary steam supply system, and balance-of-plant; the actions of the control system and plant protection system; the response of the reactor building; and the relative hazard resulting from fuel particle failure. FSV steady-state and transient plant data are being used to partially verify the component modeling and dynamic simulation techniques used to predict plant response to postulated accident sequences.

  12. UQTk Version 3.0.3 User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh

    2017-05-01

    The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
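
    A toy sketch of the non-intrusive task the toolkit addresses (this is plain Monte Carlo in NumPy, not UQTk): uncertain inputs are sampled, pushed through a black-box model, and the output uncertainty is summarized.

      import numpy as np

      def model(x1, x2):
          # hypothetical computational model treated as a black box
          return np.exp(-x1) * np.sin(3.0 * x2) + x1 * x2

      rng = np.random.default_rng(7)
      x1 = rng.normal(1.0, 0.1, 10000)       # uncertain input 1
      x2 = rng.uniform(0.0, 1.0, 10000)      # uncertain input 2
      y = model(x1, x2)
      print(y.mean(), y.std(), np.quantile(y, [0.05, 0.95]))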

  13. Automated 3D reconstruction of interiors with multiple scan views

    NASA Astrophysics Data System (ADS)

    Sequeira, Vitor; Ng, Kia C.; Wolfart, Erik; Goncalves, Joao G. M.; Hogg, David C.

    1998-12-01

    This paper presents two integrated solutions for realistic 3D model acquisition and reconstruction: an early prototype, in the form of a push trolley, and a later prototype in the form of an autonomous robot. The systems encompass all hardware and software required, from laser and video data acquisition, processing and output of texture-mapped 3D models in VRML format, to batteries for power supply and wireless network communications. The autonomous version is also equipped with a mobile platform and other sensors for the purpose of automatic navigation. The applications for such a system range from real estate and tourism (e.g., showing a 3D computer model of a property to a potential buyer or tenant) to content creation (e.g., creating 3D models of heritage buildings or producing broadcast-quality virtual studios). The system can also be used in industrial environments as a reverse engineering tool to update the design of a plant, or as a 3D photo-archive for insurance purposes. The system is Internet compatible: the photo-realistic models can be accessed via the Internet and manipulated interactively in 3D using a common Web browser with a VRML plug-in. Further information and example reconstructed models are available online via the RESOLV web-page at http://www.scs.leeds.ac.uk/resolv/.

  14. Evaluation of scenario-specific modeling approaches to predict plane of array solar irradiation

    DOE PAGES

    Moslehi, Salim; Reddy, T. Agami; Katipamula, Srinivas

    2017-12-20

    Predicting thermal or electric power output from solar collectors requires knowledge of the solar irradiance incident on the collector, known as plane of array irradiance. In the absence of such a measurement, plane of array irradiation can be predicted using relevant transposition models, which essentially require diffuse (or beam) radiation to be known along with total horizontal irradiation. The two main objectives of the current study are (1) to evaluate the extent to which the prediction of plane of array irradiance is improved when diffuse radiation is predicted using location-specific regression models developed from on-site measured data as against using generalized models; and (2) to estimate the expected uncertainties associated with plane of array irradiance predictions under different data collection scenarios likely to be encountered in practical situations. These issues have been investigated using monitored data for several U.S. locations in conjunction with the Typical Meteorological Year, version 3 database. An interesting behavior in the Typical Meteorological Year, version 3 data was also observed in correlation patterns between diffuse and total radiation taken from different years, which seems to attest to a measurement problem. Furthermore, the current study was accomplished under a broader research agenda aimed at providing energy managers with the necessary tools for predicting, scheduling, and controlling various sub-systems of an integrated energy system.
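
    A hedged sketch of one simple member of the transposition-model family discussed above, the isotropic-sky model, which combines beam, sky-diffuse, and ground-reflected components into plane-of-array irradiance; the inputs below are arbitrary example values.

      import math

      def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
          """Plane-of-array irradiance [W/m2] from an isotropic-sky transposition."""
          tilt, aoi = math.radians(tilt_deg), math.radians(aoi_deg)
          beam = dni * max(math.cos(aoi), 0.0)
          sky_diffuse = dhi * (1.0 + math.cos(tilt)) / 2.0
          ground_reflected = ghi * albedo * (1.0 - math.cos(tilt)) / 2.0
          return beam + sky_diffuse + ground_reflected

      print(round(poa_isotropic(dni=700.0, dhi=150.0, ghi=650.0,
                                tilt_deg=30.0, aoi_deg=25.0), 1))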

  15. Evaluation of scenario-specific modeling approaches to predict plane of array solar irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moslehi, Salim; Reddy, T. Agami; Katipamula, Srinivas

    Predicting thermal or electric power output from solar collectors requires knowledge of the solar irradiance incident on the collector, known as plane of array irradiance. In the absence of such a measurement, plane of array irradiation can be predicted using relevant transposition models, which essentially require diffuse (or beam) radiation to be known along with total horizontal irradiation. The two main objectives of the current study are (1) to evaluate the extent to which the prediction of plane of array irradiance is improved when diffuse radiation is predicted using location-specific regression models developed from on-site measured data as against using generalized models; and (2) to estimate the expected uncertainties associated with plane of array irradiance predictions under different data collection scenarios likely to be encountered in practical situations. These issues have been investigated using monitored data for several U.S. locations in conjunction with the Typical Meteorological Year, version 3 database. An interesting behavior in the Typical Meteorological Year, version 3 data was also observed in correlation patterns between diffuse and total radiation taken from different years, which seems to attest to a measurement problem. Furthermore, the current study was accomplished under a broader research agenda aimed at providing energy managers with the necessary tools for predicting, scheduling, and controlling various sub-systems of an integrated energy system.

  16. Thermospheric dynamics - A system theory approach

    NASA Technical Reports Server (NTRS)

    Codrescu, M.; Forbes, J. M.; Roble, R. G.

    1990-01-01

    A system theory approach to thermospheric modeling is developed, based upon a linearization method which is capable of preserving nonlinear features of a dynamical system. The method is tested using a large, nonlinear, time-varying system, namely the thermospheric general circulation model (TGCM) of the National Center for Atmospheric Research. In the linearized version an equivalent system, defined for one of the desired TGCM output variables, is characterized by a set of response functions that is constructed from corresponding quasi-steady state and unit sample response functions. The linearized version of the system runs on a personal computer and produces an approximation of the desired TGCM output field height profile at a given geographic location.
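
    A conceptual sketch of the linearized-system idea described above: one output variable is approximated as the convolution of a forcing time series with a precomputed unit-sample response (the response values here are illustrative, not TGCM results).

      import numpy as np

      h = np.array([0.5, 0.3, 0.15, 0.05])                     # assumed unit-sample response
      forcing = np.concatenate([np.zeros(5), np.ones(20)])     # step change in forcing
      response = np.convolve(forcing, h)[:forcing.size]        # linearized output variable
      print(np.round(response[:10], 3))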

  17. Extension of the CAPRAM mechanism with the improved mechanism generator GECKO-A

    NASA Astrophysics Data System (ADS)

    Bräuer, Peter; Mouchel-Vallon, Camille; Tilgner, Andreas; Wolke, Ralf; Aumont, Bernard; Herrmann, Hartmut

    2013-04-01

    Organic compounds are a ubiquitous constituent of the tropospheric multiphase system. With either biogenic or anthropogenic sources, they have a major influence on the atmospheric multiphase system and thus have become a main research topic within the last decades. Modelling can provide a useful tool to explore the tropospheric multiphase chemistry. While in the gas phase several comprehensive near-explicit mechanisms exist, in the aqueous phase those mechanisms are very limited. The current study aims to advance the currently most comprehensive aqueous phase mechanism CAPRAM 3.0 by means of automated mechanism construction. Therefore, the mechanism generator GECKO-A (Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere; see Aumont et al., 2005) has been extended to the aqueous phase. A protocol has been designed for automated mechanism construction based on reviewed experimental data and evaluated prediction methods. The generator is able to describe the oxidation of aliphatic organic compounds by OH and NO3. For the mechanism construction, mainly structure-activity relationships are used, which are complemented by Evans-Polanyi-type correlations and further suitable estimates. GECKO-A has been used to create new CAPRAM versions, where branching ratios are introduced and new chemical subsystems with species with up to 4 carbon atoms are added. The currently most comprehensive version, CAPRAM 3.7, includes about 2000 aqueous phase species and more than 3300 reactions in the aqueous phase. Box model studies have been performed using a meteorological scenario with non-permanent clouds. Besides the investigation of the concentration-time profiles, detailed time-resolved flux analyses have been performed. Several aqueous phase subsystems have been investigated, such as the formation of oxidised mono- and diacids in the aqueous phase, as well as interactions with inorganic cycles and the influence on the gas phase chemistry and composition. Results have been compared to results of previous versions and show a significant improvement in the new mechanism versions, when comparing the modelled data to field data from literature. For example, in CAPRAM 3.7 there is a malonic acid production of about 80 ng m-3 compared to a few ng m-3 in CAPRAM 3.0. The results in CAPRAM 3.7 confirm recent measurements by Bao et al. (2012), who measured up to 137 ng m-3. Moreover, several attempts have been undertaken to validate the mechanisms created by GECKO-A with our own field experiments, such as the HCCT-2010 campaign and chamber experiments in the LEAK chamber. References: Aumont, B., Szopa, S., Madronich, S.: Modelling the evolution of organic carbon during its gas-phase tropospheric oxidation: development of an explicit model based on a self generating approach. Atmos. Chem. Phys., 5, 2497-2517 (2005). Bao, L., Matsumoto, M., Kubota, T., Sekiguchi, K., Wang, Q., Sakamoto, K.: Gas/particle partitioning of low-molecular-weight dicarboxylic acids at a suburban site in Saitama, Japan. Atmos. Env., 47, 546-553 (2012).
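
    A hedged sketch of an Evans-Polanyi-type rate estimate of the kind mechanism generators fall back on when no measured rate constant is available: the activation energy is taken as a linear function of the reaction enthalpy. The coefficients and units below are placeholders, not the values used in CAPRAM or GECKO-A.

      import math

      R = 8.314                          # gas constant [J mol-1 K-1]

      def evans_polanyi_rate(delta_h_kj, e0_kj=50.0, alpha=0.4, a_factor=1.0e9, temp=298.0):
          """Arrhenius rate with Ea estimated as E0 + alpha * reaction enthalpy (placeholder values)."""
          ea = max(e0_kj + alpha * delta_h_kj, 0.0) * 1000.0   # activation energy [J/mol]
          return a_factor * math.exp(-ea / (R * temp))

      print(f"{evans_polanyi_rate(-30.0):.3e}")                # more exothermic -> faster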

  18. NOAA Fisheries Toolbox - Welcome

    Science.gov Websites

    The toolbox website lists stock assessment models and versions, including Stock Synthesis Version 3 (SS3) 3.45f (10/18/2012), Survival Estimation in Non-Equilibrium situations, Collie-Sissenwine Analysis (CSA) 4.3 (01/13/2014), Dual Zone Virtual Population Analysis (VPA-2BOX) 3.05 (8/4/2004), and Virtual Population Analysis (VPA) 3.4.4 (3/3/2014).

  19. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    NASA Astrophysics Data System (ADS)

    Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.

    2013-12-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gases and aerosols concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen climate parameters provide a good approximation for the median, and the 5th and 95th percentiles of the probability distribution of 21st century changes in global mean surface air temperature from previous work with the IGSM. Because the IGSM-CAM framework only considers one particular climate model, it cannot be used to assess the structural modeling uncertainty arising from differences in the parameterization suites of climate models. However, comparison of the IGSM-CAM projections with simulations of 31 CMIP5 models under the RCP4.5 and RCP8.5 scenarios show that the range of warming at the continental scale shows very good agreement between the two ensemble simulations, except over Antarctica, where the IGSM-CAM overestimates the warming. This demonstrates that by sampling the climate system response, the IGSM-CAM, even though it relies on one single climate model, can essentially reproduce the range of future continental warming simulated by more than 30 different models. Precipitation changes projected in the IGSM-CAM simulations and the CMIP5 multi-model ensemble both display a large uncertainty at the continental scale. The two ensemble simulations show good agreement over Asia and Europe. However, the ranges of precipitation changes do not overlap - but display similar size - over Africa and South America, two continents where models generally show little agreement in the sign of precipitation changes and where CCSM3 tends to be an outlier. 
Overall, the IGSM-CAM provides an efficient and consistent framework to explore the large uncertainty in future projections of global and regional climate change associated with uncertainty in the climate response and projected emissions.

  20. Using Intel's Knight Landing Processor to Accelerate Global Nested Air Quality Prediction Modeling System (GNAQPMS) Model

    NASA Astrophysics Data System (ADS)

    Wang, H.; Chen, H.; Chen, X.; Wu, Q.; Wang, Z.

    2016-12-01

    The Global Nested Air Quality Prediction Modeling System for Hg (GNAQPMS-Hg) is a global chemical transport model coupled with an Hg transport module to investigate mercury pollution. In this study, we present our work of porting the GNAQPMS model to the Intel Xeon Phi processor, Knights Landing (KNL), to accelerate the model. KNL is the second-generation product adopting the Many Integrated Core (MIC) architecture. Compared with the first-generation Knights Corner (KNC), KNL has new hardware features, such as the ability to be used as a standalone processor as well as a coprocessor with another CPU. Using the VTune tool, the high-overhead modules in the GNAQPMS model were identified and addressed, including the CBM-Z gas chemistry, advection and convection, and wet deposition modules. These high-overhead modules were accelerated by optimizing the code and using new KNL features. The following optimization measures were taken: (1) changing the pure MPI parallel mode to a hybrid parallel mode with MPI and OpenMP; (2) vectorizing the code to use the 512-bit wide vector computation units; (3) reducing unnecessary memory access and calculation; (4) reducing thread local storage (TLS) for common variables within each OpenMP thread in CBM-Z; and (5) changing global communication from writing and reading files to MPI functions. After optimization, the performance of GNAQPMS is greatly increased on both the CPU and KNL platforms: single-node tests showed that the optimized version has a 2.6x speedup on a two-socket CPU platform and a 3.3x speedup on a one-socket KNL platform compared with the baseline code, which means the KNL has a 1.29x speedup when compared with the two-socket CPU platform.

  1. The effects of FreeSurfer version, workstation type, and Macintosh operating system version on anatomical volume and cortical thickness measurements.

    PubMed

    Gronenschild, Ed H B M; Habets, Petra; Jacobs, Heidi I L; Mengelers, Ron; Rozendaal, Nico; van Os, Jim; Marcelis, Machteld

    2012-01-01

    FreeSurfer is a popular software package for measuring cortical thickness and the volume of neuroanatomical structures. However, little if anything is known about measurement reliability across various data processing conditions. Using a set of 30 anatomical T1-weighted 3T MRI scans, we investigated the effects of data processing variables such as FreeSurfer version (v4.3.1, v4.5.0, and v5.0.0), workstation (Macintosh and Hewlett-Packard), and Macintosh operating system version (OSX 10.5 and OSX 10.6). Significant differences were revealed between FreeSurfer version v5.0.0 and the two earlier versions. These differences were on average 8.8 ± 6.6% (range 1.3-64.0%) for volume and 2.8 ± 1.3% (1.1-7.7%) for cortical thickness. Differences about a factor of two smaller were detected between Macintosh and Hewlett-Packard workstations and between OSX 10.5 and OSX 10.6. The observed differences are similar in magnitude to effect sizes reported in accuracy evaluations and neurodegenerative studies. The main conclusion is that, in the context of an ongoing study, users are discouraged from updating to a new major release of either FreeSurfer or the operating system, or from switching to a different type of workstation, without repeating the analysis; the results thus give quantitative support to successive recommendations stated by the FreeSurfer developers over the years. Moreover, in view of the large and significant cross-version differences, it is concluded that formal assessment of the accuracy of FreeSurfer is desirable.

  2. The Effects of FreeSurfer Version, Workstation Type, and Macintosh Operating System Version on Anatomical Volume and Cortical Thickness Measurements

    PubMed Central

    Gronenschild, Ed H. B. M.; Habets, Petra; Jacobs, Heidi I. L.; Mengelers, Ron; Rozendaal, Nico; van Os, Jim; Marcelis, Machteld

    2012-01-01

    FreeSurfer is a popular software package for measuring cortical thickness and the volume of neuroanatomical structures. However, little if anything is known about measurement reliability across various data processing conditions. Using a set of 30 anatomical T1-weighted 3T MRI scans, we investigated the effects of data processing variables such as FreeSurfer version (v4.3.1, v4.5.0, and v5.0.0), workstation (Macintosh and Hewlett-Packard), and Macintosh operating system version (OSX 10.5 and OSX 10.6). Significant differences were revealed between FreeSurfer version v5.0.0 and the two earlier versions. These differences were on average 8.8±6.6% (range 1.3–64.0%) for volume and 2.8±1.3% (1.1–7.7%) for cortical thickness. Differences about a factor of two smaller were detected between Macintosh and Hewlett-Packard workstations and between OSX 10.5 and OSX 10.6. The observed differences are similar in magnitude to effect sizes reported in accuracy evaluations and neurodegenerative studies. The main conclusion is that, in the context of an ongoing study, users are discouraged from updating to a new major release of either FreeSurfer or the operating system, or from switching to a different type of workstation, without repeating the analysis; the results thus give quantitative support to successive recommendations stated by the FreeSurfer developers over the years. Moreover, in view of the large and significant cross-version differences, it is concluded that formal assessment of the accuracy of FreeSurfer is desirable. PMID:22675527

  3. Incorporating 3-dimensional models in online articles.

    PubMed

    Cevidanes, Lucia H S; Ruellas, Antonio C O; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-05-01

    The aims of this article are to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article's online version for viewing and downloading using the reader's software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader's software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. When submitting manuscripts, authors can now upload 3D models that will allow readers to interact with or download them. Such interaction with 3D models in online articles now will give readers and authors better understanding and visualization of the results. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
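
    A minimal sketch of exporting a surface model in one of the plain-text formats mentioned above (Wavefront .obj); the toy tetrahedron here stands in for a mesh constructed from a cone-beam computed tomography segmentation.

      vertices = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]   # toy tetrahedron
      faces = [(1, 2, 3), (1, 2, 4), (1, 3, 4), (2, 3, 4)]      # 1-based vertex indices

      with open("model.obj", "w") as f:
          for x, y, z in vertices:
              f.write(f"v {x} {y} {z}\n")
          for a, b, c in faces:
              f.write(f"f {a} {b} {c}\n")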

  4. Evolution of N-species Kimura/voter models towards criticality, a surrogate for general models of accidental pathogens

    NASA Astrophysics Data System (ADS)

    Ghaffari, Peyman; Stollenwerk, Nico

    2012-09-01

    In models for accidental pathogens, with bacterial meningitis as the paradigmatic epidemiological system, evolution towards states exhibiting critical fluctuations with power law behaviour has been observed [1]. This is a model with many possibly pathogenic strains essentially evolving independently towards low pathogenicity. A first and previous study had shown that in the limit of vanishing pathogenicity, critical fluctuations with power law distributions are observed already when only two strains interact [2]. This earlier version of a two-strain model was very recently reinvestigated [3] and named the Stollenwerk-Jansen (SJ) model. Muñoz et al. demonstrated that this two-strain model for accidental pathogens is in the universality class of the so-called voter model. Though this model clearly shows criticality, its control parameter, the pathogenicity, is not self-tuning towards criticality. However, the multi-strain version mentioned above [1] does evolve towards criticality, as does a spatially explicit version of it, shown in [4], p. 155. These models of multi-strain type, which explicitly include mutations of the pathogenicity, can be called SJ-models of type II [5]. Since the original epidemiological model is of SIRYX type, the evolution to zero pathogenicity is slow and perturbed by large population noise. In the present article we show, on the basis of the notion of voter-model universality classes, the evolution of n-voter models with mutation towards criticality, now much less perturbed by population noise, hence demonstrating a clear mechanism of self-organized criticality in the sense of [6, 7]. The present results have wide implications for many diseases in which a large proportion of infections is asymptomatic, meaning that the system has already evolved towards an average low pathogenicity. This holds not only for the original paradigmatic case of bacterial meningitis, but was recently also suggested, for example, for dengue fever (DENFREE project).

  5. Simulated convective systems using a cloud resolving model: Impact of large-scale temperature and moisture forcing using observations and GEOS-3 reanalysis

    NASA Technical Reports Server (NTRS)

    Shie, C.-L.; Tao, W.-K.; Hou, A.; Lin, X.

    2006-01-01

    The GCE (Goddard Cumulus Ensemble) model, which has been developed and improved at NASA Goddard Space Flight Center over the past two decades, is considered one of the finest, state-of-the-art CRMs (Cloud Resolving Models) in the research community. As the chosen CRM for a NASA Interdisciplinary Science (IDS) Project, GCE has recently been successfully upgraded to an MPI (Message Passing Interface) version, with which great improvement has been achieved in computational efficiency, scalability, and portability. By using the large-scale temperature and moisture advective forcing, as well as the temperature, water vapor and wind fields obtained from TRMM (Tropical Rainfall Measuring Mission) field experiments such as SCSMEX (South China Sea Monsoon Experiment) and KWAJEX (Kwajalein Experiment), our recent 2-D and 3-D GCE simulations were able to capture detailed convective systems typical of the targeted (simulated) regions. The GEOS-3 [Goddard EOS (Earth Observing System) Version-3] reanalysis data have also been proposed and successfully implemented for use in the GCE long-term simulations (i.e., those aiming at producing a massive simulated cloud dataset, or Cloud Library) to compensate for the scarcity of real field experimental data in both time and space (location). Preliminary 2-D and 3-D pilot results using GEOS-3 data have generally shown good qualitative agreement (yet some quantitative difference) with the respective numerical results using the SCSMEX observations. The first objective of this paper is to assess the GEOS-3 data quality by comparing the model results obtained from several pairs of simulations using the real observations and GEOS-3 reanalysis data. The different large-scale advective forcing obtained from these two sources (i.e., sounding observations and GEOS-3 reanalysis) is considered a major critical factor in producing different model results. The second objective of this paper is therefore to investigate and present the impact of large-scale forcing on various modeled quantities (such as hydrometeors and rainfall). A third objective is to validate the overall GCE 3-D model performance by comparing the numerical results with sounding observations, as well as available satellite retrievals.

  6. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.
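
    A hedged sketch (not PAWS, STEM, or SURE) of the computation those programs perform: solving a tiny continuous-time Markov reliability model through the matrix exponential; the rates and state space are invented for illustration.

      import numpy as np
      from scipy.linalg import expm

      lam, mu = 1.0e-4, 1.0e-2          # hypothetical failure and recovery rates [1/h]
      # states: 0 = both units good, 1 = one unit failed, 2 = system failed (absorbing)
      Q = np.array([[-2.0 * lam, 2.0 * lam, 0.0],
                    [mu, -(mu + lam), lam],
                    [0.0, 0.0, 0.0]])   # generator matrix, rows sum to zero

      p0 = np.array([1.0, 0.0, 0.0])    # start with both units good
      p_t = p0 @ expm(Q * 10.0)         # state probabilities after a 10-hour mission
      print(p_t[2])                     # probability the system has failed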

  7. SARA - SURE/ASSIST RELIABILITY ANALYSIS WORKSTATION (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    SARA, the SURE/ASSIST Reliability Analysis Workstation, is a bundle of programs used to solve reliability problems. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. The Systems Validation Methods group at NASA Langley Research Center has created a set of four software packages that form the basis for a reliability analysis workstation, including three for use in analyzing reconfigurable, fault-tolerant systems and one for analyzing non-reconfigurable systems. The SARA bundle includes the three for reconfigurable, fault-tolerant systems: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), and PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920). As indicated by the program numbers in parentheses, each of these three packages is also available separately in two machine versions. The fourth package, which is only available separately, is FTC, the Fault Tree Compiler (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree which describes a non-reconfigurable system. PAWS/STEM and SURE are analysis programs which utilize different solution methods, but have a common input language, the SURE language. ASSIST is a preprocessor that generates SURE language from a more abstract definition. ASSIST, SURE, and PAWS/STEM are described briefly in the following paragraphs. For additional details about the individual packages, including pricing, please refer to their respective abstracts. ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, allows a reliability engineer to describe the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. A one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. The semi-Markov model generated by ASSIST is in the format needed for input to SURE and PAWS/STEM. The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. SURE output is tabular. The PAWS/STEM package includes two programs for the creation and evaluation of pure Markov models describing the behavior of fault-tolerant reconfigurable computer systems: the Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. 
Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The programs that comprise the SARA package were originally developed for use on DEC VAX series computers running VMS and were later ported for use on Sun series computers running SunOS. They are written in C-language, Pascal, and FORTRAN 77. An ANSI compliant C compiler is required in order to compile the C portion of the Sun version source code. The Pascal and FORTRAN code can be compiled on Sun computers using Sun Pascal and Sun Fortran. For the VMS version, VAX C, VAX PASCAL, and VAX FORTRAN can be used to recompile the source code. The standard distribution medium for the VMS version of SARA (COS-10041) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of SARA (COS-10039) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the ASSIST user's manual in TeX and PostScript formats are provided on the distribution medium. DEC, VAX, VMS, and TK50 are registered trademarks of Digital Equipment Corporation. Sun, Sun3, Sun4, and SunOS are trademarks of Sun Microsystems, Inc. TeX is a trademark of the American Mathematical Society. PostScript is a registered trademark of Adobe Systems Incorporated.
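
    The kind of calculation PAWS and STEM automate can be illustrated by solving a small pure Markov model through the matrix exponential. The sketch below is not the PAWS/STEM implementation; the states, failure rate, and reconfiguration rate are hypothetical values chosen only to show the mechanics.

        # Minimal sketch: failure probability of a small pure Markov reliability
        # model via the matrix exponential (illustrative only).
        import numpy as np
        from scipy.linalg import expm

        lam   = 1e-4   # hypothetical per-hour failure rate of one unit
        delta = 3.6e3  # hypothetical per-hour reconfiguration (recovery) rate

        # States: 0 = duplex, 1 = one unit failed (not yet reconfigured),
        #         2 = simplex after reconfiguration, 3 = system failure (absorbing)
        Q = np.array([
            [-2*lam,        2*lam,    0.0,  0.0],
            [   0.0, -(delta+lam),  delta,  lam],
            [   0.0,          0.0,   -lam,  lam],
            [   0.0,          0.0,    0.0,  0.0],
        ])

        p0 = np.array([1.0, 0.0, 0.0, 0.0])   # start in the duplex state
        t = 10.0                               # mission time in hours
        p_t = p0 @ expm(Q * t)                 # state probabilities at time t
        print(f"P(system failure by {t} h) = {p_t[3]:.3e}")

    SURE, by contrast, does not integrate the model; it computes algebraic upper and lower bounds on the death-state probabilities of the more general semi-Markov model.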

  8. Evaluation of Teacher Perceptions and Potential of OpenOffice in a K-12 School District

    ERIC Educational Resources Information Center

    Vajda, James; Abbitt, Jason T.

    2011-01-01

    Through this mixed-method evaluation study the authors investigated a pilot implementation of an open-source productivity suite for teachers in a K-12 public school district. The authors evaluated OpenOffice version 3.0 using measures identified by the technology acceptance model as predictors of acceptance and use of technology systems. During a…

  9. Intercomparison of the community multiscale air quality model and CALGRID using process analysis.

    PubMed

    O'Neill, Susan M; Lamb, Brian K

    2005-08-01

    This study was designed to examine the similarities and differences between two advanced photochemical air quality modeling systems: EPA Models-3/CMAQ and CALGRID/CALMET. Both modeling systems were applied to an ozone episode that occurred along the I-5 urban corridor in western Washington and Oregon during July 11-14, 1996. Both models employed the same modeling domain and used the same detailed gridded emission inventory. The CMAQ model was run using both the CB-IV and RADM2 chemical mechanisms, while CALGRID was used with the SAPRC-97 chemical mechanism. Output from the Mesoscale Meteorological Model (MM5) employed with observational nudging was used in both models. The two modeling systems, representing three chemical mechanisms and two sets of meteorological inputs, were evaluated in terms of statistical performance measures for both 1- and 8-h average observed ozone concentrations. The results showed that the different versions of the systems were more similar than different, and all versions performed well in the Portland region and downwind of Seattle but performed poorly in the more rural region north of Seattle. Improving the meteorological input into the CALGRID/CALMET system with planetary boundary layer (PBL) parameters from the Models-3/CMAQ meteorology preprocessor (MCIP) improved the performance of the CALGRID/CALMET system. The 8-h ensemble case was often the best performer of all the cases, indicating that the models perform better over longer analysis periods. The 1-h ensemble case, derived from all runs, was not necessarily an improvement over the five individual cases, but the standard deviation about the mean provided a measure of overall modeling uncertainty. Process analysis was applied to examine the contribution of the individual processes to the species conservation equation. The process analysis results indicated that the two modeling systems arrive at similar solutions by very different means. Transport rates are faster and exhibit greater fluctuations in the CMAQ cases than in the CALGRID cases, which leads to different placement of the urban ozone plumes. The CALGRID cases, which rely on the SAPRC-97 chemical mechanism, exhibited a greater diurnal production/loss cycle of ozone concentrations per hour compared to either the RADM2 or CB-IV chemical mechanisms in the CMAQ cases. These results demonstrate the need for specialized process field measurements to confirm whether we are modeling ozone with valid processes.

  10. GEOS S2S-2_1 File Specification: GMAO Seasonal and Sub-Seasonal Forecast Output

    NASA Technical Reports Server (NTRS)

    Kovach, Robin M.; Marshak, Jelena; Molod, Andrea; Nakada, Kazumi

    2018-01-01

    The NASA GMAO seasonal (9 months) and subseasonal (45 days) forecasts are produced with the Goddard Earth Observing System (GEOS) Atmosphere-Ocean General Circulation Model and Data Assimilation System Version S2S-2_1. The new system replaces version S2S-1.0 described in Borovikov et al. (2017), and includes upgrades to many components of the system. The atmospheric model includes an upgrade from a pre-MERRA-2 version running on a latitude-longitude grid at approximately 1 degree resolution to a current version running on a cubed sphere grid at approximately 1/2 degree resolution. The important developments are related to the dynamical core (Putman et al., 2011), the moist physics ("two-moment microphysics" of Barahona et al., 2014), and the cryosphere (Cullather et al., 2014). As in the previous GMAO S2S system, the land model is that of Koster et al. (2000). GMAO S2S-2_1 now includes the Goddard Chemistry Aerosol Radiation and Transport (GOCART, Colarco et al., 2010) single-moment interactive aerosol model with predictive aerosols: dust, sea salt, and several species of carbon and sulfate. The previous version of GMAO S2S specified aerosol amounts from climatology, which were used to inform the atmospheric radiation only. The ocean model includes an upgrade from MOM4 to MOM5 (Griffies 2012), and continues to be run on the tripolar grid at approximately 1/2 degree resolution in the tropics with 40 vertical levels. As in S2S-1.0, the sea ice model is the Los Alamos Sea Ice model (CICE4, Hunke and Lipscomb 2010). The Ocean Data Assimilation System (ODAS) has been upgraded from the one described in Borovikov et al. (2017) to one that uses a modified version of the Penny (2014) Local Ensemble Transform Kalman Filter (LETKF), and now assimilates along-track altimetry. The ODAS also nudges to MERRA-2 SST and sea ice boundary conditions. The atmospheric data assimilation fields used to constrain the atmosphere in the ODAS have been upgraded from MERRA to a MERRA-2-like system. The system is initialized using a MERRA-2-like atmospheric reanalysis (Gelaro et al. 2017) and the GMAO S2S-2_1 ocean analysis. Additional ensemble members for forecasts are produced with initial states at 5-day intervals, with additional members based on perturbations of the atmospheric and ocean states. Both subseasonal and seasonal forecasts are submitted to the North American Multi-Model Ensemble (NMME) project, and are part of the US/Canada multimodel seasonal forecasts (http://www.cpc.ncep.noaa.gov/products/NMME/). A large suite of retrospective forecasts ("hindcasts") has been completed and contributes to the calculation of the model's baseline climatology and drift, anomalies from which are the basis of the seasonal forecasts.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Martin; Hastings, Alan; Smith, Matthew J.

    We develop a theory for residence times and mean ages for nonautonomous compartmental systems. Using the McKendrick–von Forster equation, we show that the mean ages of mass in a compartmental system satisfy a linear nonautonomous ordinary differential equation that is exponentially stable. We then define a nonautonomous version of residence time as the mean age of mass leaving the compartmental system at a particular time and show that our nonautonomous theory is consistent with the autonomous case. We apply these results to study a nine-dimensional nonautonomous compartmental system modeling the carbon cycle, which is a simplified version of the Carnegie–Ames–Stanford approach (CASA) model.
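
    The general setting these results assume can be written compactly. The following is a minimal sketch of a nonautonomous compartmental system and of the autonomous steady-state special case; it is not the specific nine-pool CASA simplification used by the authors.

        \frac{d}{dt}\,x(t) = u(t) + B(t)\,x(t),

    where x(t) collects the compartment contents (here, carbon stocks), u(t) >= 0 is the vector of external inputs, and B(t) is a compartmental matrix (non-negative off-diagonal entries, non-positive column sums). In the autonomous case with constant u and B, the steady state is x* = -B^{-1}u, and the mean residence (transit) time reduces to the familiar stock-over-throughput ratio

        \tau = \frac{\lVert x^{*}\rVert_{1}}{\lVert u\rVert_{1}},

    which the nonautonomous theory extends to time-varying inputs and rates.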

  12. SHABERTH - ANALYSIS OF A SHAFT BEARING SYSTEM (CRAY VERSION)

    NASA Technical Reports Server (NTRS)

    Coe, H. H.

    1994-01-01

    The SHABERTH computer program was developed to predict operating characteristics of bearings in a multibearing load support system. Lubricated and non-lubricated bearings can be modeled. SHABERTH calculates the loads, torques, temperatures, and fatigue life for ball and/or roller bearings on a single shaft. The program also allows for an analysis of the system reaction to the termination of lubricant supply to the bearings and other lubricated mechanical elements. SHABERTH has proven to be a valuable tool in the design and analysis of shaft bearing systems. The SHABERTH program is structured with four nested calculation schemes. The thermal scheme performs steady state and transient temperature calculations which predict system temperatures for a given operating state. The bearing dimensional equilibrium scheme uses the bearing temperatures, predicted by the temperature mapping subprograms, and the rolling element raceway load distribution, predicted by the bearing subprogram, to calculate bearing diametral clearance for a given operating state. The shaft-bearing system load equilibrium scheme calculates bearing inner ring positions relative to the respective outer rings such that the external loading applied to the shaft is brought into equilibrium by the rolling element loads which develop at each bearing inner ring for a given operating state. The bearing rolling element and cage load equilibrium scheme calculates the rolling element and cage equilibrium positions and rotational speeds based on the relative inner-outer ring positions, inertia effects, and friction conditions. The ball bearing subprograms in the current SHABERTH program have several model enhancements over similar programs. These enhancements include an elastohydrodynamic (EHD) film thickness model that accounts for thermal heating in the contact area and lubricant film starvation; a new model for traction combined with an asperity load sharing model; a model for the hydrodynamic rolling and shear forces in the inlet zone of lubricated contacts, which accounts for the degree of lubricant film starvation; modeling normal and friction forces between a ball and a cage pocket, which account for the transition between the hydrodynamic and elastohydrodynamic regimes of lubrication; and a model of the effect on fatigue life of the ratio of the EHD plateau film thickness to the composite surface roughness. SHABERTH is intended to be as general as possible. The models in SHABERTH allow for the complete mathematical simulation of real physical systems. Systems are limited to a maximum of five bearings supporting the shaft, a maximum of thirty rolling elements per bearing, and a maximum of one hundred temperature nodes. The SHABERTH program structure is modular and has been designed to permit refinement and replacement of various component models as the need and opportunities develop. A preprocessor is included in the IBM PC version of SHABERTH to provide a user friendly means of developing SHABERTH models and executing the resulting code. The preprocessor allows the user to create and modify data files with minimal effort and a reduced chance for errors. Data is utilized as it is entered; the preprocessor then decides what additional data is required to complete the model. Only this required information is requested. The preprocessor can accommodate data input for any SHABERTH compatible shaft bearing system model. The system may include ball bearings, roller bearings, and/or tapered roller bearings. 
SHABERTH is written in FORTRAN 77, and two machine versions are available from COSMIC. The CRAY version (LEW-14860) has a RAM requirement of 176K of 64 bit words. The IBM PC version (MFS-28818) is written for IBM PC series and compatible computers running MS-DOS, and includes a sample MS-DOS executable. For execution, the PC version requires at least 1Mb of RAM and an 80386 or 486 processor machine with an 80x87 math co-processor. The standard distribution medium for the IBM PC version is a set of two 5.25 inch 360K MS-DOS format diskettes. The contents of the diske

  13. Key Questions in Building Defect Prediction Models in Practice

    NASA Astrophysics Data System (ADS)

    Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas

    The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in the context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.
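
    As an illustration of the workflow such studies build on, the sketch below trains a classifier on module metrics from earlier versions and evaluates it on the next version. The file name, metric names, and version split are assumptions made for this example, not details of the studied project.

        # Hypothetical sketch: train on versions 1-6, predict defect-prone modules in version 7.
        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import precision_score, recall_score

        data = pd.read_csv("module_metrics.csv")    # one row per module per version (hypothetical file)
        features = ["loc", "complexity", "churn", "past_defects"]   # illustrative metrics

        train = data[data.version <= 6]
        test = data[data.version == 7]

        model = LogisticRegression(max_iter=1000).fit(train[features], train["defective"])
        pred = model.predict(test[features])
        print("precision:", precision_score(test["defective"], pred))
        print("recall:   ", recall_score(test["defective"], pred))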

  14. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS.

    PubMed

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called "PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0," which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth's atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research.

  15. Analytical Model for Estimating Terrestrial Cosmic Ray Fluxes Nearly Anytime and Anywhere in the World: Extension of PARMA/EXPACS

    PubMed Central

    Sato, Tatsuhiko

    2015-01-01

    By extending our previously established model, here we present a new model called “PHITS-based Analytical Radiation Model in the Atmosphere (PARMA) version 3.0,” which can instantaneously estimate terrestrial cosmic ray fluxes of neutrons, protons, ions with charge up to 28 (Ni), muons, electrons, positrons, and photons nearly anytime and anywhere in the Earth’s atmosphere. The model comprises numerous analytical functions with parameters whose numerical values were fitted to reproduce the results of the extensive air shower (EAS) simulation performed by Particle and Heavy Ion Transport code System (PHITS). The accuracy of the EAS simulation was well verified using various experimental data, while that of PARMA3.0 was confirmed by the high R 2 values of the fit. The models to be used for estimating radiation doses due to cosmic ray exposure, cosmic ray induced ionization rates, and count rates of neutron monitors were validated by investigating their capability to reproduce those quantities measured under various conditions. PARMA3.0 is available freely and is easy to use, as implemented in an open-access software program EXcel-based Program for Calculating Atmospheric Cosmic ray Spectrum (EXPACS). Because of these features, the new version of PARMA/EXPACS can be an important tool in various research fields such as geosciences, cosmic ray physics, and radiation research. PMID:26674183

  16. Solid waste projection model: Model version 1. 0 technical reference manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilkins, M.L.; Crow, V.L.; Buska, D.E.

    1990-11-01

    The Solid Waste Projection Model (SWPM) system is an analytical tool developed by Pacific Northwest Laboratory (PNL) for Westinghouse Hanford Company (WHC). The SWPM system provides a modeling and analysis environment that supports decisions in the process of evaluating various solid waste management alternatives. This document, one of a series describing the SWPM system, contains detailed information regarding the software utilized in developing Version 1.0 of the modeling unit of SWPM. This document is intended for use by experienced software engineers and supports programming, code maintenance, and model enhancement. Those interested in using SWPM should refer to the SWPM Model User's Guide. This document is available from either the PNL project manager (D. L. Stiles, 509-376-4154) or the WHC program monitor (B. C. Anderson, 509-373-2796). 8 figs.

  17. VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shapiro, A.; Huria, H.C.; Cho, K.W.

    1991-12-01

    VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.

  18. CERES Fast Longwave And SHortwave Radiative Flux (FLASHFlux) Version4A.

    NASA Astrophysics Data System (ADS)

    Sawaengphokhai, P.; Stackhouse, P. W., Jr.; Kratz, D. P.; Gupta, S. K.

    2017-12-01

    The agricultural, renewable energy management, and science communities need global surface and top-of-atmosphere (TOA) radiative fluxes on a low latency basis. The Clouds and the Earth's Radiant Energy System (CERES) FLASHFlux (Fast Longwave and SHortwave radiative Flux) data products address this need by enhancing the speed of CERES processing, using simplified calibration and a parameterized model of surface fluxes to provide a daily global radiative flux data set within one week of satellite observations. CERES FLASHFlux provides two data products: 1) an overpass swath Level 2 Single Scanner Footprint (SSF) data product, produced separately for Aqua and Terra observations, and 2) a daily Level 3 Time Interpolated and Spatially Averaged (TISA) 1° x 1° gridded data product that combines Aqua and Terra observations. The CERES FLASHFlux data product is being promoted to Version4A. Updates to FLASHFlux Version4A include a new cloud retrieval algorithm and an improved shortwave surface flux parameterization. We inter-compared FLASHFlux Version4A, FLASHFlux Version3C, CERES Edition 4 Syn1Deg, and, at the monthly scale, CERES Edition 4 EBAF (Energy Balanced and Filled) top-of-atmosphere and Edition 4 EBAF surface fluxes to evaluate these improvements. We also analyze the impact of the new inputs and cloud algorithm on the surface shortwave and longwave radiative fluxes using ground-site measurements provided by CAVE (CERES/ARM Validation Experiment).

  19. Groundwater model of the Great Basin carbonate and alluvial aquifer system version 3.0: Incorporating revisions in southwestern Utah and east central Nevada

    USGS Publications Warehouse

    Brooks, Lynette E.

    2017-12-01

    The groundwater model described in this report is a new version of previously published steady-state numerical groundwater flow models of the Great Basin carbonate and alluvial aquifer system, and was developed in conjunction with U.S. Geological Survey studies in Parowan, Pine, and Wah Wah Valleys, Utah. This version of the model is GBCAAS v. 3.0 and supersedes previous versions. The objectives of the model for Parowan Valley were to simulate revised conceptual estimates of recharge and discharge, to estimate simulated aquifer storage properties and the amount of reduction in storage as a result of historical groundwater withdrawals, and to assess reduction in groundwater withdrawals necessary to mitigate groundwater-level declines in the basin. The objectives of the model for the area near Pine and Wah Wah Valleys were to recalibrate the model using new observations of groundwater levels and evapotranspiration of groundwater; to provide new estimates of simulated recharge, hydraulic conductivity, and interbasin flow; and to simulate the effects of proposed groundwater withdrawals on the regional flow system. Meeting these objectives required the addition of 15 transient calibration stress periods and 14 projection stress periods, aquifer storage properties, historical withdrawals in Parowan Valley, and observations of water-level changes in Parowan Valley. Recharge in Parowan Valley and withdrawal from wells in Parowan Valley and two nearby wells in Cedar City Valley vary for each calibration stress period representing conditions from March 1940 to November 2013. Stresses, including recharge, are the same in each stress period as in the steady-state stress period for all areas outside of Parowan Valley. The model was calibrated to transient conditions only in Parowan Valley. Simulated storage properties outside of Parowan Valley were set the same as the Parowan Valley properties and are not considered calibrated. Model observations in GBCAAS v. 3.0 are groundwater levels at wells and discharge locations; water-level changes; and discharge to springs, evapotranspiration of groundwater, rivers, and lakes. All observations in the model outside of Parowan Valley are considered to represent steady-state conditions. Composite scaled sensitivities indicate the observations of discharge to rivers and springs provide more information about model parameters in the model focus area than do water-level observations. Water levels and water-level changes, however, provide the only information about specific yield and specific storage parameters and provide more information about recharge and withdrawals in Parowan Valley than any other observation group. Comparisons of simulated water levels and measured water levels in Parowan Valley indicated that the model fits the overall trend of declining water levels and provides reasonable estimates of long-term reduction in storage and of storage changes from 2012 to 2013. The conceptual and simulated groundwater budgets for Parowan Valley from November 2012 to November 2013 are similar, with recharge of about 20,000 acre-feet and discharge of about 45,000 acre-feet. In the simulation, historical withdrawals averaging about 28,000 acre-feet per year (acre-ft/yr) cause major changes in the groundwater system in Parowan Valley. These changes include the cessation of almost all natural discharge in the valley and the long-term removal of water from storage. 
Simulated recharge in Pine Valley of 11,000 acre-ft/yr and in Wah Wah Valley of 3,200 acre-ft/yr is substantially less in GBCAAS v. 3.0 than that simulated by previous model versions. In addition, the valleys have less simulated inflow from and outflow to other hydrographic areas than were simulated by previous model versions. The effects of groundwater development in these valleys, however, are independent of the amount of water recharging in and flowing through the valleys. Groundwater withdrawals in Pine and Wah Wah Valleys will decrease groundwater storage (causing drawdown) until discharge in surrounding areas and mountain springs around the two valleys is reduced by the rate of withdrawal. The model was used to estimate that reducing withdrawals in Parowan Valley from 35,000 to about 22,000 acre-ft/yr would likely stabilize groundwater levels in the valley if recharge varies as it did from about 1950 to 2012. The model was also used to demonstrate that withdrawals of 15,000 acre-ft/yr from Pine Valley and 6,500 acre-ft/yr from Wah Wah Valley could ultimately cause long-term steady-state water-level declines of about 1,900 feet near the withdrawal wells and of more than 5 feet in an area of about 10,500 square miles. The timing of drawdown and capture and the ultimate amount of drawdown are dependent on the proximity to areas of simulated natural groundwater discharge, simulated transmissivity, and simulated storage properties. The model projections are a representation of possible effects.

  20. Stirling System Modeling for Space Nuclear Power Systems

    NASA Technical Reports Server (NTRS)

    Lewandowski, Edward J.; Johnson, Paul K.

    2008-01-01

    A dynamic model of a high-power Stirling convertor has been developed for space nuclear power systems modeling. The model is based on the Component Test Power Convertor (CTPC), a 12.5-kWe free-piston Stirling convertor. The model includes the fluid heat source, the Stirling convertor, output power, and heat rejection. The Stirling convertor model includes the Stirling cycle thermodynamics, heat flow, mechanical mass-spring damper systems, and the linear alternator. The model was validated against test data. Both nonlinear and linear versions of the model were developed. The linear version algebraically couples two separate linear dynamic models; one model of the Stirling cycle and one model of the thermal system, through the pressure factors. Future possible uses of the Stirling system dynamic model are discussed. A pair of commercially available 1-kWe Stirling convertors is being purchased by NASA Glenn Research Center. The specifications of those convertors may eventually be incorporated into the dynamic model and analysis compared to the convertor test data. Subsequent potential testing could include integrating the convertors into a pumped liquid metal hot-end interface. This test would provide more data for comparison to the dynamic model analysis.

  1. 4D-Var Development at GMAO

    NASA Technical Reports Server (NTRS)

    Pelc, Joanna S.; Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office (GMAO) is currently using an IAU-based 3D-Var data assimilation system. GMAO has been experimenting with a 3D-Var-hybrid version of its data assimilation system (DAS) for over a year now, which will soon become operational, and it will rapidly progress toward a 4D-EnVar. Concurrently, the machinery to exercise traditional 4D-Var is in place, and it is desirable to compare the traditional 4D approach with the other available options and evaluate their performance in the Goddard Earth Observing System (GEOS) DAS. This work will also explore the possibility of constructing a reduced order model (ROM) to make traditional 4D-Var computationally attractive at increasing model resolutions. Part of the research on ROM will be to search for a suitable subspace in which to carry out the corresponding reduction. This poster illustrates how the IAU-based 4D-Var assimilation compares with our currently used IAU-based 3D-Var.
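
    For reference, the incremental 3D-Var cost function that the hybrid and 4D systems build on has the familiar textbook form (a schematic statement, not the GEOS-specific formulation):

        J(\delta x) = \tfrac{1}{2}\,\delta x^{\mathsf{T}} \mathbf{B}^{-1} \delta x + \tfrac{1}{2}\,(\mathbf{H}\delta x - d)^{\mathsf{T}} \mathbf{R}^{-1} (\mathbf{H}\delta x - d), \qquad d = y - H(x_b),

    where B and R are the background- and observation-error covariances and H the (linearized) observation operator. Traditional 4D-Var replaces the term H(delta x) with H M_k(delta x), propagating the increment with the tangent-linear model to each observation time in the window; that propagation is what makes the method expensive at high resolution and motivates the reduced-order-model work described above.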

  2. Geologic Framework Model Analysis Model Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Clayton

    2000-12-19

    The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM). The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the repository design. These downstream models include the hydrologic flow models and the radionuclide transport models. All the models and the repository design, in turn, will be incorporated into the Total System Performance Assessment (TSPA) of the potential radioactive waste repository block and vicinity to determine the suitability of Yucca Mountain as a host for the repository. The interrelationship of the three components of the ISM and their interface with downstream uses are illustrated in Figure 2.

  3. Selected Tether Applications Cost Model

    NASA Technical Reports Server (NTRS)

    Keeley, Michael G.

    1988-01-01

    Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.

  4. An Update on Modifications to Water Treatment Plant Model

    EPA Science Inventory

    The Water Treatment Plant (WTP) model is an EPA tool for informing regulatory options. WTP has a few versions: (1) WTP2.2 can help in regulatory analysis. An updated version (WTP3.0) will allow plant-specific analysis (WTP-ccam) and thus help meet plant-specific treatment objectives...

  5. LB3D: A parallel implementation of the Lattice-Boltzmann method for simulation of interacting amphiphilic fluids

    NASA Astrophysics Data System (ADS)

    Schmieschek, S.; Shamardin, L.; Frijters, S.; Krüger, T.; Schiller, U. D.; Harting, J.; Coveney, P. V.

    2017-08-01

    We introduce the lattice-Boltzmann code LB3D, version 7.1. Building on a parallel program and supporting tools which have enabled research utilising high performance computing resources for nearly two decades, LB3D version 7 provides a subset of the research code functionality as an open source project. Here, we describe the theoretical basis of the algorithm as well as computational aspects of the implementation. The software package is validated against simulations of meso-phases resulting from self-assembly in ternary fluid mixtures comprising immiscible and amphiphilic components such as water-oil-surfactant systems. The impact of the surfactant species on the dynamics of spinodal decomposition are tested and quantitative measurement of the permeability of a body centred cubic (BCC) model porous medium for a simple binary mixture is described. Single-core performance and scaling behaviour of the code are reported for simulations on current supercomputer architectures.
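
    The computational kernel that such codes parallelize is the collide-and-stream update of the lattice-Boltzmann method. The fragment below is a minimal single-phase D2Q9 BGK sketch included for orientation only; LB3D itself adds multicomponent and amphiphilic interactions and runs the update in parallel.

        # Minimal single-phase D2Q9 BGK lattice-Boltzmann update (illustrative sketch,
        # not the LB3D implementation).
        import numpy as np

        w = np.array([4/9] + [1/9]*4 + [1/36]*4)                 # lattice weights
        c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                      [1, 1], [-1, 1], [-1, -1], [1, -1]])       # discrete velocities
        tau = 0.8                                                # relaxation time
        nx, ny = 64, 64
        f = np.ones((9, nx, ny)) * w[:, None, None]              # start at rest with density 1

        def equilibrium(rho, ux, uy):
            cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
            usq = ux**2 + uy**2
            return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

        for step in range(100):
            rho = f.sum(axis=0)
            ux = (c[:, 0, None, None] * f).sum(axis=0) / rho
            uy = (c[:, 1, None, None] * f).sum(axis=0) / rho
            f += -(f - equilibrium(rho, ux, uy)) / tau           # BGK collision
            for i in range(9):                                   # streaming with periodic boundaries
                f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))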

  6. Conceptual modeling of coincident failures in multiversion software

    NASA Technical Reports Server (NTRS)

    Littlewood, Bev; Miller, Douglas R.

    1989-01-01

    Recent work by Eckhardt and Lee (1985) shows that independently developed program versions fail dependently (specifically, the probability that several fail simultaneously is greater than it would be under true independence). The present authors show there is a precise duality between input choice and program choice in this model and consider a generalization in which different versions can be developed using diverse methodologies. The use of diverse methodologies is shown to decrease the probability of the simultaneous failure of several versions. Indeed, it is theoretically possible to obtain versions which exhibit better than independent failure behavior. The authors try to formalize the notion of methodological diversity by considering the sequence of decision outcomes that constitute a methodology. They show that diversity of decision implies likely diversity of behavior for the different versions developed under such forced diversity. For certain one-out-of-n systems the authors obtain an optimal method for allocating diversity between versions. For two-out-of-three systems there seem to be no simple optimality results which do not depend on constraints which cannot be verified in practice.
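
    The dependence result being built on can be stated in one line. If θ(x) denotes the probability that a version produced under a given methodology fails on input x, and X is a randomly selected input, then for two independently developed versions (a schematic restatement, with notation assumed here):

        P(\text{both fail}) = E\big[\theta(X)^{2}\big] = \big(E[\theta(X)]\big)^{2} + \operatorname{Var}\,\theta(X) \;\ge\; \big(E[\theta(X)]\big)^{2},

    with equality only when θ is constant over the input space. With two distinct methodologies A and B, the joint failure probability becomes E[θ_A(X) θ_B(X)], which can fall below E[θ_A(X)] E[θ_B(X)] when the two difficulty functions are negatively correlated; this is the sense in which forced diversity can yield better-than-independent behavior.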

  7. A computationally tractable version of the collective model

    NASA Astrophysics Data System (ADS)

    Rowe, D. J.

    2004-05-01

    A computationally tractable version of the Bohr-Mottelson collective model is presented which makes it possible to diagonalize realistic collective models and obtain convergent results in relatively small appropriately chosen subspaces of the collective model Hilbert space. Special features of the proposed model are that it makes use of the beta wave functions given analytically by the softened-beta version of the Wilets-Jean model, proposed by Elliott et al., and a simple algorithm for computing SO(5)⊃SO(3) spherical harmonics. The latter has much in common with the methods of Chacon, Moshinsky, and Sharp but is conceptually and computationally simpler. Results are presented for collective models ranging from the spherical vibrator to the Wilets-Jean and axially symmetric rotor-vibrator models.

  8. JPSS-1 VIIRS Version 2 At-Launch Relative Spectral Response Characterization and Performance

    NASA Technical Reports Server (NTRS)

    Moeller, Chris; Schwarting, Thomas; McIntire, Jeff; Moyer, Dave; Zeng, Jinan

    2017-01-01

    The relative spectral response (RSR) characterization of the JPSS-1 VIIRS spectral bands has achieved at-launch status in the VIIRS Data Analysis Working Group February 2016 Version 2 RSR release. The Version 2 release improves upon the June 2015 Version 1 release by including December 2014 NIST T-SIRCUS spectral measurements of VIIRS VisNIR bands in the analysis and by correcting the CO2 influence on the band M13 RSR. The T-SIRCUS based characterization is merged with the summer 2014 SpMA based characterization of VisNIR bands (Version 1 release) to yield a fused RSR for these bands, combining the strengths of the T-SIRCUS and SpMA measurement systems. The M13 RSR is updated by applying a model-based correction to mitigate CO2 attenuation of the SpMA source signal that occurred during M13 spectral measurements. The Version 2 release carries forward the Version 1 RSR for those bands that were not updated (M8-M12, M14-M16AB, I3-I5, DNBMGS). The Version 2 release includes band-average (over all detectors and subsamples) RSR plus supporting RSR for each detector and subsample. The at-launch band-average RSR have been used to populate Look-Up Tables supporting the sensor data record and environmental data record at-launch science products. Spectral performance metrics show that the JPSS-1 VIIRS RSR are compliant with specifications, with a few minor exceptions. The Version 2 release, which replaces the Version 1 release, is currently available on the password-protected NASA JPSS-1 eRooms under EAR99 control.

  9. A Short Version of SIS (Support Intensity Scale): The Utility of the Application of Artificial Adaptive Systems

    ERIC Educational Resources Information Center

    Gomiero, Tiziano; Croce, Luigi; Grossi, Enzo; Luc, De Vreese; Buscema, Massimo; Mantesso, Ulrico; De Bastiani, Elisa

    2011-01-01

    The aim of this paper is to present a shortened version of the SIS (support intensity scale) obtained by the application of mathematical models and instruments, adopting special algorithms based on the most recent developments in artificial adaptive systems. All the variables of SIS applied to 1,052 subjects with ID (intellectual disabilities)…

  10. Optics Program Modified for Multithreaded Parallel Computing

    NASA Technical Reports Server (NTRS)

    Lou, John; Bedding, Dave; Basinger, Scott

    2006-01-01

    A powerful high-performance computer program for simulating and analyzing adaptive and controlled optical systems has been developed by modifying the serial version of the Modeling and Analysis for Controlled Optical Systems (MACOS) program to impart capabilities for multithreaded parallel processing on computing systems ranging from supercomputers down to Symmetric Multiprocessing (SMP) personal computers. The modifications included the incorporation of OpenMP, a portable and widely supported application programming interface that can be used to explicitly add multithreaded parallelism to an application program under a shared-memory programming model. OpenMP was applied to parallelize ray-tracing calculations, one of the major computing components in MACOS. Multithreading is also used in the diffraction propagation of light in MACOS, based on pthreads (POSIX Threads, where POSIX denotes the Portable Operating System Interface). In tests of the parallelized version of MACOS, the speedup in ray-tracing calculations was found to be linear, or proportional to the number of processors, while the speedup in diffraction calculations ranged from 50 to 60 percent, depending on the type and number of processors. The parallelized version of MACOS is portable, and, to the user, its interface is basically the same as that of the original serial version of MACOS.

  11. SBML Level 3 package: Hierarchical Model Composition, Version 1 Release 3

    PubMed Central

    Smith, Lucian P.; Hucka, Michael; Hoops, Stefan; Finney, Andrew; Ginkel, Martin; Myers, Chris J.; Moraru, Ion; Liebermeister, Wolfram

    2017-01-01

    Constructing a model in a hierarchical fashion is a natural approach to managing model complexity, and offers additional opportunities such as the potential to re-use model components. The SBML Level 3 Version 1 Core specification does not directly provide a mechanism for defining hierarchical models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Hierarchical Model Composition package for SBML Level 3 adds the necessary features to SBML to support hierarchical modeling. The package enables a modeler to include submodels within an enclosing SBML model, delete unneeded or redundant elements of that submodel, replace elements of that submodel with elements of the containing model, and replace elements of the containing model with elements of the submodel. In addition, the package defines an optional "port" construct, allowing a model to be defined with suggested interfaces between hierarchical components; modelers can choose to use these interfaces, but they are not required to do so and can still interact directly with model elements if they so choose. Finally, the SBML Hierarchical Model Composition package is defined in such a way that a hierarchical model can be "flattened" to an equivalent, non-hierarchical version that uses only plain SBML constructs, thus enabling software tools that do not yet support hierarchy to nevertheless work with SBML hierarchical models. PMID:26528566

  12. DFN Modeling for the Safety Case of the Final Disposal of Spent Nuclear Fuel in Olkiluoto, Finland

    NASA Astrophysics Data System (ADS)

    Vanhanarkaus, O.

    2017-12-01

    Olkiluoto Island is a site in SW Finland chosen to host a deep geological repository for high-level nuclear waste generated by nuclear power plants of power companies TVO and Fortum. Posiva, a nuclear waste management organization, submitted a construction license application for the Olkiluoto repository to the Finnish government in 2012. A key component of the license application was an integrated geological, hydrological and biological description of the Olkiluoto site. After the safety case was reviewed in 2015 by the Radiation and Nuclear Safety Authority in Finland, Posiva was granted a construction license. Posiva is now preparing an updated safety case for the operating license application to be submitted in 2022, and an update of the discrete fracture network (DFN) model used for site characterization is part of that. The first step in describing and modelling the network of fractures in the Olkiluoto bedrock was DFN model version 1 (2009), which presented an initial understanding of the relationships between rock fracturing and geology at the site and identified the important primary controls on fracturing. DFN model version 2 (2012) utilized new subsurface data from additional drillholes, tunnels and excavated underground facilities in ONKALO to better understand spatial variability of the geological controls on geological and hydrogeological fracture properties. DFN version 2 connected fracture geometric and hydraulic properties to distinct tectonic domains and to larger-scale hydraulically conductive fault zones. In the version 2 DFN model, geological and hydrogeological models were developed along separate parallel tracks. The version 3 (2017) DFN model for the Olkiluoto site integrates geological and hydrogeological elements into a single consistent model used for geological, rock mechanical, hydrogeological and hydrogeochemical studies. New elements in the version 3 DFN model include a stochastic description of fractures within Brittle Fault Zones (BFZ), integration of geological and hydrostructural interpretations of BFZ, greater use of 3D geological models to better constrain the spatial variability of fracturing, and treatment of fractures using hydromechanical principles to account for material behavior and in-situ stresses.

  13. P- and S-wave velocity models incorporating the Cascadia subduction zone for 3D earthquake ground motion simulations—Update for Open-File Report 2007–1348

    USGS Publications Warehouse

    Stephenson, William J.; Reitman, Nadine G.; Angster, Stephen J.

    2017-12-20

    In support of earthquake hazards studies and ground motion simulations in the Pacific Northwest, three-dimensional (3D) P- and S-wave velocity (VP and VS, respectively) models incorporating the Cascadia subduction zone were previously developed for the region encompassed from about 40.2°N. to 50°N. latitude, and from about 122°W. to 129°W. longitude (fig. 1). This report describes updates to the Cascadia velocity property volumes of model version 1.3 ([V1.3]; Stephenson, 2007), herein called model version 1.6 (V1.6). As in model V1.3, the updated V1.6 model volume includes depths from 0 kilometers (km) (mean sea level) to 60 km, and it is intended to be a reference for researchers who have used, or are planning to use, this model in their earth science investigations. To this end, it is intended that the VP and VS property volumes of model V1.6 will be considered a template for a community velocity model of the Cascadia region as additional results become available. With the recent and ongoing development of the National Crustal Model (NCM; Boyd and Shah, 2016), we envision that any future versions of this model will be directly integrated with that effort.

  14. Regional Energy Deployment System (ReEDS) | Energy Analysis | NREL

    Science.gov Websites

    The Regional Energy Deployment System (ReEDS) model helps the U.S. Department of Energy analyze long-term capacity expansion of the electricity sector, with an emphasis on renewable energy deployment. Resources on the NREL site include the ReEDS model documentation (Version 2016), a map of the numbered model regions, and an introductory video on future capacity expansion of renewable energy.

  15. SCIATRAN 3.1: A new radiative transfer model and retrieval package

    NASA Astrophysics Data System (ADS)

    Rozanov, Alexei; Rozanov, Vladimir; Kokhanovsky, Alexander; Burrows, John P.

    The SCIATRAN 3.1 package is a result of further development of the SCIATRAN 2.X software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. After the implementation of the vector radiative transfer model in SCIATRAN 3.0, the spectral range covered by the model has been extended into the thermal infrared, ranging to approximately 40 micrometers. Another major improvement concerns the treatment of underlying surface effects. Among other things, a sophisticated representation of the water surface with a bidirectional reflectance distribution function (BRDF) has been implemented, accounting for the Fresnel reflection of polarized light and for the effect of foam. A newly developed representation for a snow surface allows radiative transfer calculations to be performed within an unpolluted or soiled snow layer. Furthermore, a new approach has been implemented allowing radiative transfer calculations to be performed for a coupled atmosphere-ocean system. This means that the underlying ocean is no longer considered as a purely reflecting surface. Instead, full radiative transfer calculations are performed within the water, allowing the user to simulate the radiance within both the atmosphere and the ocean. Similar to previous versions, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR-TIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer location within or outside the Earth's atmosphere, including underwater observations. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new features of the radiative transfer model, is given, including remarks on the availability for the scientific community. Furthermore, some application examples of the radiative transfer model are shown.

  16. HIPPO Unit Commitment Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-01-17

    Developed for the Midcontinent Independent System Operator, Inc. (MISO), HIPPO-Unit Commitment Version 1 is a package for solving the security-constrained unit commitment problem. The model was developed to solve MISO's cases. This version of the code includes an I/O module to read in MISO's csv files, modules that create a state-based mixed-integer programming (MIP) formulation, and modules that test basic procedures for solving the MIP via HPC.
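
    A minimal version of the underlying optimization, stripped of the reserve, ramping, minimum up/down-time, and network security constraints that a production SCUC formulation adds, can be written as:

        \min_{u,\,p} \sum_{t}\sum_{g} \left( c_g\, p_{g,t} + f_g\, u_{g,t} \right)
        \quad \text{s.t.} \quad
        \sum_{g} p_{g,t} = D_t \;\; \forall t, \qquad
        P^{\min}_{g}\, u_{g,t} \le p_{g,t} \le P^{\max}_{g}\, u_{g,t}, \qquad
        u_{g,t} \in \{0,1\},

    where u_{g,t} is the commitment decision for generator g in hour t, p_{g,t} its dispatch, c_g and f_g its variable and fixed (no-load) costs, and D_t the system demand. The notation here is generic, not HIPPO's state-based formulation.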

  17. Assessment of radionuclide databases in CAP88 mainframe version 1.0 and Windows-based version 3.0.

    PubMed

    LaBone, Elizabeth D; Farfán, Eduardo B; Lee, Patricia L; Jannik, G Timothy; Donnelly, Elizabeth H; Foley, Trevor Q

    2009-09-01

    In this study the radionuclide databases for two versions of the Clean Air Act Assessment Package-1988 (CAP88) computer model were assessed in detail. CAP88 estimates radiation dose and the risk of health effects to human populations from radionuclide emissions to air. This program is used by several U.S. Department of Energy (DOE) facilities to comply with National Emission Standards for Hazardous Air Pollutants regulations. CAP88 Mainframe, referred to as version 1.0 on the U.S. Environmental Protection Agency Web site (http://www.epa.gov/radiation/assessment/CAP88/), was the very first CAP88 version released in 1988. Some DOE facilities, including the Savannah River Site, still employ this version (1.0), while others use the more user-friendly personal computer Windows-based version 3.0 released in December 2007. Version 1.0 uses the program RADRISK, based on International Commission on Radiological Protection Publication 30, as its radionuclide database. Version 3.0 uses half-life, dose, and risk factor values based on Federal Guidance Report 13. Differences in these values could cause different results for the same input exposure data (same scenario), depending on which version of CAP88 is used. Consequently, the differences between the two versions are being assessed in detail at Savannah River National Laboratory. The version 1.0 and 3.0 database files contain 496 and 838 radionuclides, respectively, and though one would expect the newer version to include all 496 of those radionuclides, 35 radionuclides are listed in version 1.0 that are not included in version 3.0. The majority of these have either extremely short or extremely long half-lives or are no longer produced; however, some of the short-lived radionuclides might produce progeny of great interest at DOE sites. In addition, 122 radionuclides were found to have different half-lives in the two versions, with 21 differing by more than 3 percent and 12 by more than 10 percent.
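
    The half-life comparison described above amounts to flagging relative differences between two lookup tables. The sketch below shows that bookkeeping with made-up placeholder values, not the actual RADRISK or Federal Guidance Report 13 entries.

        # Sketch of flagging half-life differences between two radionuclide tables
        # (values are illustrative placeholders, not the CAP88 database entries).
        half_life_v1 = {"H-3": 12.35, "Co-60": 5.271, "Cs-137": 30.00}   # years, version 1.0
        half_life_v3 = {"H-3": 12.32, "Co-60": 5.271, "Cs-137": 30.07}   # years, version 3.0

        common = sorted(set(half_life_v1) & set(half_life_v3))
        only_v1 = sorted(set(half_life_v1) - set(half_life_v3))

        for nuc in common:
            diff = abs(half_life_v1[nuc] - half_life_v3[nuc]) / half_life_v1[nuc]
            if diff > 0.03:                      # the 3 percent threshold used in the study
                print(f"{nuc}: half-lives differ by {diff:.1%}")

        print("listed only in version 1.0:", only_v1)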

  18. Technical report series on global modeling and data assimilation. Volume 4: Documentation of the Goddard Earth Observing System (GEOS) data assimilation system, version 1

    NASA Technical Reports Server (NTRS)

    Suarez, Max J. (Editor); Pfaendtner, James; Bloom, Stephen; Lamich, David; Seablom, Michael; Sienkiewicz, Meta; Stobie, James; Dasilva, Arlindo

    1995-01-01

    This report describes the analysis component of the Goddard Earth Observing System, Data Assimilation System, Version 1 (GEOS-1 DAS). The general features of the data assimilation system are outlined, followed by a thorough description of the statistical interpolation algorithm, including specification of error covariances and quality control of observations. We conclude with a discussion of the current status of development of the GEOS data assimilation system. The main components of GEOS-1 DAS are an atmospheric general circulation model and an Optimal Interpolation algorithm. The system is cycled using the Incremental Analysis Update (IAU) technique, in which analysis increments are introduced as time-independent forcing terms in a forecast model integration. The system is capable of producing dynamically balanced states without the explicit use of initialization, as well as a time-continuous representation of non-observables such as precipitation and radiational fluxes. This version of the data assimilation system was used in the five-year reanalysis project completed in April 1994 by Goddard's Data Assimilation Office (DAO). Data from this reanalysis are available from the Goddard Distributed Active Archive Center (DAAC), which is part of NASA's Earth Observing System Data and Information System (EOSDIS). For information on how to obtain these data sets, contact the Goddard DAAC at (301) 286-3209, EMAIL daac@gsfc.nasa.gov.
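
    The IAU technique can be summarized schematically: rather than replacing the model state with the analysis at the analysis time, the analysis increment is applied as a constant forcing spread over an assimilation window of length tau,

        \frac{dx}{dt} = M(x) + \frac{x_a - x_b}{\tau}, \qquad t \in [t_0,\, t_0 + \tau],

    where M denotes the model tendencies, x_b the background state, and x_a the analysis. Spreading the increment in time is what allows the system to produce dynamically balanced states without explicit initialization, as noted above. (This is a schematic statement of the technique, not the exact GEOS-1 implementation.)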

  19. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    NASA Astrophysics Data System (ADS)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulations of Mediterranean fast-responding rivers, the ISBA-TOP coupled system. The first step consists of identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters as well as initial soil moisture then allow an ensemble-based version of ISBA-TOP to be designed. The first results of this system on some real events are presented. The direct perspective of this work will be to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system to design a complete HEPS for FF forecasting.

  20. The BRIDGE HadCM3 family of climate models: HadCM3@Bristol v1.0

    NASA Astrophysics Data System (ADS)

    Valdes, Paul J.; Armstrong, Edward; Badger, Marcus P. S.; Bradshaw, Catherine D.; Bragg, Fran; Crucifix, Michel; Davies-Barnard, Taraka; Day, Jonathan J.; Farnsworth, Alex; Gordon, Chris; Hopcroft, Peter O.; Kennedy, Alan T.; Lord, Natalie S.; Lunt, Dan J.; Marzocchi, Alice; Parry, Louise M.; Pope, Vicky; Roberts, William H. G.; Stone, Emma J.; Tourte, Gregory J. L.; Williams, Jonny H. T.

    2017-10-01

    Understanding natural and anthropogenic climate change processes involves using computational models that represent the main components of the Earth system: the atmosphere, ocean, sea ice, and land surface. These models have become increasingly computationally expensive as resolution is increased and more complex process representations are included. However, to gain robust insight into how climate may respond to a given forcing, and to meaningfully quantify the associated uncertainty, it is often required to use either or both ensemble approaches and very long integrations. For this reason, more computationally efficient models can be very valuable tools. Here we provide a comprehensive overview of the suite of climate models based around the HadCM3 coupled general circulation model. This model was developed at the UK Met Office and has been heavily used during the last 15 years for a range of future (and past) climate change studies, but has now been largely superseded for many scientific studies by more recently developed models. However, it continues to be extensively used by various institutions, including the BRIDGE (Bristol Research Initiative for the Dynamic Global Environment) research group at the University of Bristol, who have made modest adaptations to the base HadCM3 model over time. These adaptations mean that the original documentation is not entirely representative, and several other relatively undocumented configurations are in use. We therefore describe the key features of a number of configurations of the HadCM3 climate model family, which together make up HadCM3@Bristol version 1.0. In order to differentiate variants that have undergone development at BRIDGE, we have introduced the letter B into the model nomenclature. We include descriptions of the atmosphere-only model (HadAM3B), the coupled model with a low-resolution ocean (HadCM3BL), the high-resolution atmosphere-only model (HadAM3BH), and the regional model (HadRM3B). These also include three versions of the land surface scheme. By comparing with observational datasets, we show that these models produce a good representation of many aspects of the climate system, including the land and sea surface temperatures, precipitation, ocean circulation, and vegetation. This evaluation, combined with the relatively fast computational speed (up to 1000 times faster than some CMIP6 models), motivates continued development and scientific use of the HadCM3B family of coupled climate models, predominantly for quantifying uncertainty and for long multi-millennial-scale simulations.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rasmussen, Martin; Hastings, Alan; Smith, Matthew J.

    In this study, we develop a theory for transit times and mean ages for nonautonomous compartmental systems. Using the McKendrick–von Förster equation, we show that the mean ages of mass in a compartmental system satisfy a linear nonautonomous ordinary differential equation that is exponentially stable. We then define a nonautonomous version of transit time as the mean age of mass leaving the compartmental system at a particular time and show that our nonautonomous theory generalises the autonomous case. We apply these results to study a nine-dimensional nonautonomous compartmental system modeling the terrestrial carbon cycle, which is a modification of the Carnegie–Ames–Stanford approach model, and we demonstrate that the nonautonomous versions of transit time and mean age differ significantly from the autonomous quantities when calculated for that model.

  2. Transit times and mean ages for nonautonomous and autonomous compartmental systems

    DOE PAGES

    Rasmussen, Martin; Hastings, Alan; Smith, Matthew J.; ...

    2016-04-01

    In this study, we develop a theory for transit times and mean ages for nonautonomous compartmental systems. Using the McKendrick–von Förster equation, we show that the mean ages of mass in a compartmental system satisfy a linear nonautonomous ordinary differential equation that is exponentially stable. We then define a nonautonomous version of transit time as the mean age of mass leaving the compartmental system at a particular time and show that our nonautonomous theory generalises the autonomous case. We apply these results to study a nine-dimensional nonautonomous compartmental system modeling the terrestrial carbon cycle, which is a modification of the Carnegie–Ames–Stanford approach model, and we demonstrate that the nonautonomous versions of transit time and mean age differ significantly from the autonomous quantities when calculated for that model.
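
    As a concrete illustration of the quantities described above, the following minimal Python sketch integrates a toy two-pool nonautonomous compartmental system (not the nine-pool Carnegie–Ames–Stanford modification used in the paper) and computes per-pool mean ages and a transit time from the first age moment; all rates, inputs, and forcings are invented for illustration.

```python
# Minimal numerical sketch (not the authors' code): mean ages and a
# nonautonomous transit time for a toy two-pool compartmental system.
# x' = B(t) x + u(t) governs the mass; integrating the McKendrick-von
# Foerster equation over age gives m' = B(t) m + x for the age-weighted
# mass m, so the mean-age vector is a = m / x.  All rates, inputs and
# the seasonal forcing below are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def B(t):
    # Nonautonomous compartmental matrix: pool 1 -> pool 2 transfer,
    # seasonally varying decomposition rates (made-up numbers).
    k1 = 0.8 + 0.3 * np.sin(2 * np.pi * t)
    k2 = 0.1
    transfer = 0.4          # fraction of pool-1 loss routed to pool 2
    return np.array([[-k1, 0.0],
                     [transfer * k1, -k2]])

def u(t):
    # External input enters pool 1 only, with age zero.
    return np.array([1.0 + 0.5 * np.cos(2 * np.pi * t), 0.0])

def rhs(t, y):
    x, m = y[:2], y[2:]
    return np.concatenate([B(t) @ x + u(t),   # mass balance
                           B(t) @ m + x])     # first age moment

y0 = np.array([1.0, 5.0, 0.5, 20.0])          # initial masses and age moments
sol = solve_ivp(rhs, (0.0, 20.0), y0, max_step=0.01)

x, m = sol.y[:2], sol.y[2:]
mean_age = m / x                               # per-pool mean age

# Transit time at time t: mean age of mass leaving the system, i.e. the
# release-rate-weighted mean age (release rates are the negative column
# sums of B, the part of the loss not transferred to another pool).
r = np.array([-B(t).sum(axis=0) for t in sol.t]).T
transit_time = (r * m).sum(axis=0) / (r * x).sum(axis=0)
print(mean_age[:, -1], transit_time[-1])
```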

  3. A user's manual for DELSOL3: A computer code for calculating the optical performance and optimal system design for solar thermal central receiver plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kistler, B.L.

    DELSOL3 is a revised and updated version of the DELSOL2 computer program (SAND81-8237) for calculating collector field performance and layout and optimal system design for solar thermal central receiver plants. The code consists of a detailed model of the optical performance, a simpler model of the non-optical performance, an algorithm for field layout, and a searching algorithm to find the best system design based on energy cost. The latter two features are coupled to a cost model of central receiver components and an economic model for calculating energy costs. The code can handle flat, focused and/or canted heliostats, and external cylindrical, multi-aperture cavity, and flat plate receivers. The program optimizes the tower height, receiver size, field layout, heliostat spacings, and tower position at user specified power levels subject to flux limits on the receiver and land constraints for field layout. DELSOL3 maintains the advantages of speed and accuracy which are characteristics of DELSOL2.

  4. Markov chains for testing redundant software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1988-01-01

    A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
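
    The following minimal Python sketch illustrates the general procedure described in the abstract, on invented data: transition counts observed during simulated testing are turned into estimated transition probabilities, a pessimistic upper confidence bound is placed on the transitions into a failure state, and the bounded chain is iterated to bound the failure probability over a mission. The states, counts, and the simple normal-approximation bound are illustrative assumptions, not the paper's experimental design.

```python
# Minimal sketch (not the paper's design): estimate Markov transition
# probabilities from observed error-state sequences of multi-version
# software, then bound the chance of reaching the failure state within
# a fixed number of control frames.  States, counts and the normal-
# approximation confidence bound are illustrative assumptions.
import numpy as np

states = ["all_agree", "one_disagrees", "failure"]   # hypothetical error states
n = len(states)

# Observed transition counts from simulated process-control testing (made up).
counts = np.array([[9800, 195, 5],
                   [900,   95, 5],
                   [0,      0, 1]], dtype=float)     # failure is absorbing

row_totals = counts.sum(axis=1, keepdims=True)
p_hat = counts / row_totals                          # point estimates

# Pessimistic (upper) bound on each transition into "failure" at ~95% confidence,
# using a normal approximation to the binomial proportion.
z = 1.96
p_fail = p_hat[:, 2]
se = np.sqrt(p_fail * (1 - p_fail) / row_totals[:, 0])
p_fail_upper = np.minimum(p_fail + z * se, 1.0)

# Build a conservative chain: inflate failure transitions, renormalise the rest.
P = p_hat.copy()
for i in range(n - 1):
    P[i, 2] = p_fail_upper[i]
    scale = (1 - P[i, 2]) / p_hat[i, :2].sum()
    P[i, :2] = p_hat[i, :2] * scale

# Probability of having hit "failure" within N frames, starting from "all_agree".
N = 3600 * 10          # e.g. ten hours at one control frame per second
dist = np.zeros(n)
dist[0] = 1.0
for _ in range(N):
    dist = dist @ P
print("upper-bound failure probability over mission:", dist[2])
```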

  5. Improving Navigation information for the Rotterdam Harbour access through a 3D Model and HF radar

    NASA Astrophysics Data System (ADS)

    Schroevers, Marinus

    2015-04-01

    The Port of Rotterdam is one of the largest harbours in the world and a gateway to Europe. For access to Rotterdam harbour, information on hydrodynamic and meteorological conditions is of vital importance for safe and swift navigation. This information focuses on the deep navigation channel in the shallow foreshore, which accommodates large seagoing vessels. Due to a large seaward extension of the Port of Rotterdam area in 2011, current patterns have changed. A re-evaluation of the information needed showed a need for improved accuracy of the cross-channel currents and swell, and an extended forecast horizon. To obtain this, a new information system was designed based on a three-dimensional hydrodynamic model that produces a 72 hour forecast. Furthermore, the system will assimilate HF radar surface currents to optimize the short-term forecast. The project started in 2013 by specifying the data needed from the HF radar. At the same time, (temporary) buoys were deployed to monitor vertical current profiles. The HF radar will be operational in July 2015, while model development starts at the beginning of 2015. A pre-operational version of the system is presently planned for the end of 2016. A fully operational version which assimilates the HF radar data is planned for 2017.

  6. INM Integrated Noise Model Version 2. Programmer’s Guide

    DTIC Science & Technology

    1979-09-01

    cost, turnaround time, and system-dependent limitations. 3.2 CONVERSION PROBLEMS (Item No. / Description / Category): 1 - BLOCK DATA Initialization - IBM Restricted; 2 - Boolean Operations - Differences; 3 - Call Statement Parameters - Extensions; 4 - Data Initialization - IBM Restricted; 5 - ENTRY - Differences; 6 - EQUIVALENCE - Machine Dependent; 7 - Format: A - CDC Extension; 8 - Hollerith Strings - IBM Restricted; 9 - Hollerith Variables - IBM Restricted; 10 - Identifier Names - CDC Extension

  7. PHAST Version 2-A Program for Simulating Groundwater Flow, Solute Transport, and Multicomponent Geochemical Reactions

    USGS Publications Warehouse

    Parkhurst, David L.; Kipp, Kenneth L.; Charlton, Scott R.

    2010-01-01

    The computer program PHAST (PHREEQC And HST3D) simulates multicomponent, reactive solute transport in three-dimensional saturated groundwater flow systems. PHAST is a versatile groundwater flow and solute-transport simulator with capabilities to model a wide range of equilibrium and kinetic geochemical reactions. The flow and transport calculations are based on a modified version of HST3D that is restricted to constant fluid density and constant temperature. The geochemical reactions are simulated with the geochemical model PHREEQC, which is embedded in PHAST. Major enhancements in PHAST Version 2 allow spatial data to be defined in a combination of map and grid coordinate systems, independent of a specific model grid (without node-by-node input). At run time, aquifer properties are interpolated from the spatial data to the model grid; regridding requires only redefinition of the grid without modification of the spatial data. PHAST is applicable to the study of natural and contaminated groundwater systems at a variety of scales ranging from laboratory experiments to local and regional field scales. PHAST can be used in studies of migration of nutrients, inorganic and organic contaminants, and radionuclides; in projects such as aquifer storage and recovery or engineered remediation; and in investigations of the natural rock/water interactions in aquifers. PHAST is not appropriate for unsaturated-zone flow, multiphase flow, or density-dependent flow. A variety of boundary conditions are available in PHAST to simulate flow and transport, including specified-head, flux (specified-flux), and leaky (head-dependent) conditions, as well as the special cases of rivers, drains, and wells. Chemical reactions in PHAST include (1) homogeneous equilibria using an ion-association or Pitzer specific interaction thermodynamic model; (2) heterogeneous equilibria between the aqueous solution and minerals, ion exchange sites, surface complexation sites, solid solutions, and gases; and (3) kinetic reactions with rates that are a function of solution composition. The aqueous model (elements, chemical reactions, and equilibrium constants), minerals, exchangers, surfaces, gases, kinetic reactants, and rate expressions may be defined or modified by the user. A number of options are available to save results of simulations to output files. The data may be saved in three formats: a format suitable for viewing with a text editor; a format suitable for exporting to spreadsheets and postprocessing programs; and in Hierarchical Data Format (HDF), which is a compressed binary format. Data in the HDF file can be visualized on Windows computers with the program Model Viewer and extracted with the utility program PHASTHDF; both programs are distributed with PHAST.
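
    PHAST couples a flow and transport solver with an embedded geochemical step; a common way to organize such a coupling is operator splitting, alternating a transport update with a cell-by-cell reaction update. The sketch below shows that generic pattern on a toy 1D problem with upwind advection and a linear kinetic exchange; it is not PHAST's numerics or PHREEQC chemistry, and all parameters are invented.

```python
# Generic operator-splitting sketch of reactive transport coupling, in the
# spirit of (but not reproducing) PHAST's pairing of a transport solver with
# an embedded geochemical step.  The upwind advection, the grid, and the toy
# linear-exchange "chemistry" are illustrative assumptions only.
import numpy as np

nx, dx, dt, velocity = 100, 1.0, 0.5, 1.0
c_aq = np.zeros(nx)            # aqueous concentration per cell
c_sorb = np.zeros(nx)          # sorbed concentration per cell
kd, kinetic_rate = 0.5, 0.2    # toy linear-exchange parameters
c_inlet = 1.0                  # fixed-concentration (specified) boundary

def transport_step(c):
    """Explicit upwind advection with a specified-concentration inlet."""
    cr = velocity * dt / dx                     # Courant number (<= 1 here)
    new = c.copy()
    new[1:] = c[1:] - cr * (c[1:] - c[:-1])
    new[0] = c_inlet
    return new

def chemistry_step(c, s):
    """Cell-by-cell kinetic exchange relaxing toward s = kd * c."""
    exchange = kinetic_rate * (kd * c - s) * dt
    return c - exchange, s + exchange

for step in range(120):
    c_aq = transport_step(c_aq)                 # transport operator
    c_aq, c_sorb = chemistry_step(c_aq, c_sorb) # reaction operator

print("aqueous front (first 10 cells):", np.round(c_aq[:10], 3))
```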

  8. Oil spill model coupled to an ultra-high-resolution circulation model: implementation for the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Korotenko, K.

    2003-04-01

    An ultra-high-resolution version of DieCAST was adjusted for the Adriatic Sea and coupled with an oil spill model. The hydrodynamic module was developed on the basis of the low-dissipative, fourth-order-accurate version of DieCAST with a resolution of ~2 km. The oil spill model was developed using a particle tracking technique. The effect of evaporation is modeled with an original method based on the pseudo-component approach. A special dialog interface of this hybrid system allows direct coupling to meteorological data collection systems and/or meteorological models. Experiments with a hypothetical oil spill are analyzed for the Northern Adriatic Sea. Results (animations) of mesoscale circulation and oil slick modeling are presented at the website http://thayer.dartmouth.edu/~cushman/adriatic/movies/
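
    Since the oil spill module is described as particle-tracking based, the following Python sketch shows the bare bones of such a scheme: particles advected by a prescribed current field, spread by a random walk standing in for horizontal diffusion, and losing mass to a crude single-rate evaporation term (a stand-in for the pseudo-component approach mentioned above). The current field, diffusivity, and rate constants are illustrative assumptions.

```python
# Minimal particle-tracking sketch of an oil slick (not the DieCAST-coupled
# model): particles are advected by a prescribed current field, spread by a
# random-walk representing horizontal diffusion, and lose mass to a crude
# single-rate evaporation term (the paper instead uses a pseudo-component
# approach).  All fields and constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_particles, dt, n_steps = 2000, 600.0, 144        # 600 s steps, 24 h total
diffusivity = 5.0                                   # m^2/s, horizontal
evap_rate = 1.0e-5                                  # 1/s, toy evaporation

x = rng.normal(0.0, 50.0, n_particles)              # initial slick (m)
y = rng.normal(0.0, 50.0, n_particles)
mass = np.full(n_particles, 1.0)

def current(x, y, t):
    """Placeholder depth-averaged current (m/s): weak shear + rotation."""
    u = 0.20 + 1.0e-5 * y
    v = 0.05 * np.sin(2 * np.pi * t / 43200.0)      # semidiurnal wobble
    return u, v

for step in range(n_steps):
    t = step * dt
    u, v = current(x, y, t)
    # Advection plus random-walk diffusion (variance 2*K*dt per axis).
    x = x + u * dt + rng.normal(0.0, np.sqrt(2 * diffusivity * dt), n_particles)
    y = y + v * dt + rng.normal(0.0, np.sqrt(2 * diffusivity * dt), n_particles)
    mass *= np.exp(-evap_rate * dt)                  # evaporation loss

print("centre of mass (km):", x.mean() / 1e3, y.mean() / 1e3)
print("remaining mass fraction:", mass.sum() / n_particles)
```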

  9. User's Guide for Evaluating Subsurface Vapor Intrusion into Buildings

    EPA Pesticide Factsheets

    This revised version of the User's Guide corresponds with the release of Version 3.1 of the Johnson and Ettinger (1991) model (J&E) spreadsheets for estimating subsurface vapor intrusion into buildings.

  10. OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.

    PubMed

    Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio

    2017-10-01

    The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, like the added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. We here present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. Secondly, we also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
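
    For readers who want to reach the same service without the Java library, the hedged Python sketch below queries the OLS REST interface directly; the endpoint path, query parameters, and response fields reflect my understanding of the public OLS API and should be verified against the current EBI documentation.

```python
# Hedged sketch of querying the Ontology Lookup Service REST interface
# directly from Python, rather than through the Java ols-client described
# above.  The endpoint path, query parameters and response fields follow my
# understanding of the public OLS API and should be checked against the
# current EBI documentation before use.
import requests

def search_ols(term, ontology=None, base="https://www.ebi.ac.uk/ols/api"):
    """Search OLS for a term label and return (label, obo_id, iri) tuples."""
    params = {"q": term}
    if ontology:
        params["ontology"] = ontology          # e.g. "go", "ms", "efo"
    resp = requests.get(f"{base}/search", params=params, timeout=30)
    resp.raise_for_status()
    docs = resp.json().get("response", {}).get("docs", [])
    return [(d.get("label"), d.get("obo_id"), d.get("iri")) for d in docs]

if __name__ == "__main__":
    for label, obo_id, iri in search_ols("electrospray ionization", ontology="ms")[:5]:
        print(obo_id, label, iri)
```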

  11. The portals 4.0.1 network programming interface.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin

    2013-04-01

    This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaptation of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.

  12. SimPackJ/S: a web-oriented toolkit for discrete event simulation

    NASA Astrophysics Data System (ADS)

    Park, Minho; Fishwick, Paul A.

    2002-07-01

    SimPackJ/S is the JavaScript and Java version of SimPack; that is, it is a collection of JavaScript and Java libraries and executable programs for computer simulation. The main purpose of creating SimPackJ/S is to allow existing SimPack users to expand their simulation areas and to provide future users with a freeware simulation toolkit for simulating and modeling systems in web environments. One goal of this paper is to introduce SimPackJ/S. The other is to propose translation rules for converting C to JavaScript and Java. Most of the paper demonstrates the translation rules with examples. In addition, we discuss a 3D dynamic system model and overview an approach to 3D dynamic systems using SimPackJ/S. We explain an interface between SimPackJ/S and the 3D language, the Virtual Reality Modeling Language (VRML). This paper documents how to translate C to JavaScript and Java and how to utilize SimPackJ/S within a 3D web environment.
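
    The following Python sketch shows the core facility that a discrete event simulation toolkit of this kind provides, namely a future event list and a scheduling loop, applied to a single-server queue; it is a generic illustration, not the SimPack or SimPackJ/S API.

```python
# Minimal discrete-event simulation sketch (Python, not the SimPackJ/S API):
# a future-event list kept in a heap, a scheduler, and a single-server queue,
# illustrating the kind of facility such toolkits provide.  Arrival/service
# rates and the run length are arbitrary.
import heapq
import random

random.seed(1)
fel = []                       # future event list: (time, sequence, kind)
seq = 0

def schedule(time, kind):
    global seq
    heapq.heappush(fel, (time, seq, kind))
    seq += 1

clock, queue_len, busy, served = 0.0, 0, False, 0
schedule(random.expovariate(1.0), "arrival")

while fel and clock < 1000.0:
    clock, _, kind = heapq.heappop(fel)
    if kind == "arrival":
        schedule(clock + random.expovariate(1.0), "arrival")   # next arrival
        if busy:
            queue_len += 1
        else:
            busy = True
            schedule(clock + random.expovariate(1.25), "departure")
    else:                       # departure
        served += 1
        if queue_len > 0:
            queue_len -= 1
            schedule(clock + random.expovariate(1.25), "departure")
        else:
            busy = False

print("customers served by t=1000:", served)
```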

  13. Software Design Description for the Polar Ice Prediction System (PIPS) Version 3.0

    DTIC Science & Technology

    2008-11-05

    Naval Research Laboratory, Stennis Space Center, MS 39529-5004. Report NRL/MR/7320--08-9150. Approved for public release; distribution is unlimited.

  14. Naval Observatory Vector Astrometry Software (NOVAS) Version 3.1:Fortran, C, and Python Editions

    NASA Astrophysics Data System (ADS)

    Kaplan, G. H.; Bangert, J. A.; Barron, E. G.; Bartlett, J. L.; Puatua, W.; Harris, W.; Barrett, P.

    2012-08-01

    The Naval Observatory Vector Astrometry Software (NOVAS) is a source-code library that provides common astrometric quantities and transformations to high precision. The library can supply, in one or two subroutine or function calls, the instantaneous celestial position of any star or planet in a variety of coordinate systems. NOVAS also provides access to all of the building blocks that go into such computations. NOVAS is used for a wide variety of applications, including the U.S. portions of The Astronomical Almanac and a number of telescope control systems. NOVAS uses IAU recommended models for Earth orientation, including the IAU 2006 precession theory, the IAU 2000A and 2000B nutation series, and diurnal rotation based on the celestial and terrestrial intermediate origins. Equinox-based quantities, such as sidereal time, are also supported. NOVAS Earth orientation calculations match those from SOFA at the sub-microarcsecond level for comparable transformations. NOVAS algorithms for aberration and gravitational light deflection are equivalent, at the microarcsecond level, to those inherent in the current consensus VLBI delay algorithm. NOVAS can be easily connected to the JPL planetary/lunar ephemerides (e.g., DE405), and connections to IMCCE and IAA planetary ephemerides are planned. NOVAS Version 3.1 introduces a Python edition alongside the Fortran and C editions. The Python edition uses the computational code from the C edition and currently mimics the function calls of the C edition. Future versions will expand the functionality of the Python edition to exploit the object-oriented features of Python. In the Version 3.1 C edition, the ephemeris-access functions have been revised for use on 64-bit systems and for improved performance in general. NOVAS source code, auxiliary files, and documentation are available from the USNO website (http://aa.usno.navy.mil/software/novas/novas_info.php).

  15. Mars Global Reference Atmospheric Model (Mars-GRAM): Release No. 2 - Overview and applications

    NASA Technical Reports Server (NTRS)

    James, B.; Johnson, D.; Tyree, L.

    1993-01-01

    The Mars Global Reference Atmospheric Model (Mars-GRAM), a science and engineering model for empirically parameterizing the temperature, pressure, density, and wind structure of the Martian atmosphere, is described with particular attention to the model's newest version, Mars-GRAM Release No. 2, and to the improvements incorporated into the Release No. 2 model as compared with the Release No. 1 version. These improvements include (1) the addition of a new capability to simulate local-scale Martian dust storms and the growth and decay of these storms; (2) the addition of the Zurek and Haberle (1988) wave perturbation model for simulating tidal perturbation effects; and (3) a new modular version of Mars-GRAM for incorporation as a subroutine into other codes.

  16. Description of the NCAR Community Climate Model (CCM3). Technical note

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiehl, J.T.; Hack, J.J.; Bonan, G.B.

    This report presents the details of the governing equations, physical parameterizations, and numerical algorithms defining the version of the NCAR Community Climate Model designated CCM3. The material provides an overview of the major model components, and the way in which they interact as the numerical integration proceeds. This version of the CCM incorporates significant improvements to the physics package, new capabilities such as the incorporation of a slab ocean component, and a number of enhancements to the implementation (e.g., the ability to integrate the model on parallel distributed-memory computational platforms).

  17. GeoSciML v3.0 - a significant upgrade of the CGI-IUGS geoscience data model

    NASA Astrophysics Data System (ADS)

    Raymond, O.; Duclaux, G.; Boisvert, E.; Cipolloni, C.; Cox, S.; Laxton, J.; Letourneau, F.; Richard, S.; Ritchie, A.; Sen, M.; Serrano, J.-J.; Simons, B.; Vuollo, J.

    2012-04-01

    GeoSciML version 3.0 (http://www.geosciml.org), released in late 2011, is the latest version of the CGI-IUGS* Interoperability Working Group geoscience data interchange standard. The new version is a significant upgrade and refactoring of GeoSciML v2 which was released in 2008. GeoSciML v3 has already been adopted by several major international interoperability initiatives, including OneGeology, the EU INSPIRE program, and the US Geoscience Information Network, as their standard data exchange format for geoscience data. GeoSciML v3 makes use of recently upgraded versions of several Open Geospatial Consortium (OGC) and ISO data transfer standards, including GML v3.2, SWE Common v2.0, and Observations and Measurements v2 (ISO 19156). The GeoSciML v3 data model has been refactored from a single large application schema with many packages, into a number of smaller, but related, application schema modules with individual namespaces. This refactoring allows the use and future development of modules of GeoSciML (eg; GeologicUnit, GeologicStructure, GeologicAge, Borehole) in smaller, more manageable units. As a result of this refactoring and the integration with new OGC and ISO standards, GeoSciML v3 is not backwardly compatible with previous GeoSciML versions. The scope of GeoSciML has been extended in version 3.0 to include new models for geomorphological data (a Geomorphology application schema), and for geological specimens, geochronological interpretations, and metadata for geochemical and geochronological analyses (a LaboratoryAnalysis-Specimen application schema). In addition, there is better support for borehole data, and the PhysicalProperties model now supports a wider range of petrophysical measurements. The previously used CGI_Value data type has been superseded in favour of externally governed data types provided by OGC's SWE Common v2 and GML v3.2 data standards. The GeoSciML v3 release includes worked examples of best practice in delivering geochemical analytical data using the Observations and Measurements (ISO19156) and SWE Common v2 models. The GeoSciML v3 data model does not include vocabularies to support the data model. However, it does provide a standard pattern to reference controlled vocabulary concepts using HTTP-URIs. The international GeoSciML community has developed distributed RDF-based geoscience vocabularies that can be accessed by GeoSciML web services using the standard pattern recommended in GeoSciML v3. GeoSciML v3 is the first version of GeoSciML that will be accompanied by web service validation tools using Schematron rules. For example, these validation tools may check for compliance of a web service to a particular profile of GeoSciML, or for logical consistency of data content that cannot be enforced by the application schemas. This validation process will support accreditation of GeoSciML services and a higher degree of semantic interoperability. * International Union of Geological Sciences Commission for Management and Application of Geoscience Information (CGI-IUGS)

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malczynski, Leonard A.

    This guide addresses software quality in the construction of Powersim® Studio 8 system dynamics simulation models. It is the result of almost ten years of experience with the Powersim suite of system dynamics modeling tools (Constructor and earlier Studio versions). It proposes a common look and feel for the construction of Powersim Studio system dynamics models.

  19. A Coupled Surface Nudging Scheme for use in Retrospective ...

    EPA Pesticide Factsheets

    A surface analysis nudging scheme coupling atmospheric and land surface thermodynamic parameters has been implemented into WRF v3.8 (latest version) for use with retrospective weather and climate simulations, as well as for applications in air quality, hydrology, and ecosystem modeling. This scheme is known as the flux-adjusting surface data assimilation system (FASDAS) developed by Alapaty et al. (2008). This scheme provides continuous adjustments for soil moisture and temperature (via indirect nudging) and for surface air temperature and water vapor mixing ratio (via direct nudging). The simultaneous application of indirect and direct nudging maintains greater consistency between the soil temperature–moisture and the atmospheric surface layer mass-field variables. The new method, FASDAS, consistently improved the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high-resolution regional climate predictions. This new capability has been released in WRF Version 3.8 as option grid_sfdda = 2. This new capability increased the accuracy of atmospheric inputs for use in air quality, hydrology, and ecosystem modeling research, improving the accuracy of the respective end-point research outcomes. IMPACT: A new method, FASDAS, was implemented into the WRF model to consistently improve the accuracy of the model simulations at weather prediction scales for different horizontal grid resolutions, as well as for high-resolution regional climate predictions.
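
    The building block behind analysis nudging schemes such as FASDAS is Newtonian relaxation of a model variable toward an analysis value. The Python sketch below shows that relaxation term on a toy scalar temperature equation; it is a generic illustration with invented tendencies and time scales, not the WRF or FASDAS implementation.

```python
# Generic Newtonian-relaxation (nudging) sketch, the building block behind
# analysis nudging schemes such as FASDAS; this is not the WRF/FASDAS code.
# The "model physics" tendency, the analysis trace and the relaxation time
# scale are illustrative assumptions.
import numpy as np

dt = 60.0                 # s
tau = 3600.0              # relaxation time scale (s)
n_steps = 24 * 60

def physics_tendency(T, t):
    # Toy surface-air temperature tendency: weak diurnal heating/cooling.
    return 2.0e-4 * np.sin(2 * np.pi * t / 86400.0)

def analysis(t):
    # Toy "observed" analysis the model is relaxed toward.
    return 288.0 + 3.0 * np.sin(2 * np.pi * (t - 21600.0) / 86400.0)

T_free, T_nudged = 285.0, 285.0
for step in range(n_steps):
    t = step * dt
    T_free += dt * physics_tendency(T_free, t)
    # Direct nudging: add a relaxation term (analysis - model) / tau.
    T_nudged += dt * (physics_tendency(T_nudged, t)
                      + (analysis(t) - T_nudged) / tau)

print("free-running vs nudged vs analysis at 24 h:",
      round(T_free, 2), round(T_nudged, 2), round(analysis(n_steps * dt), 2))
```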

  20. Validez Convergente de la Version Espanola Preliminar del Child Abuse Potential Inventory: Depresion y Aduste Marital (Convergent Validity of the Preliminary Spanish Version of the Child Abuse Potential Inventory: Depression and Marital Adjustment).

    ERIC Educational Resources Information Center

    Arruabarrena, M. Ignacia; de Paul, Joaquin

    1992-01-01

    "Convergent validity" of preliminary Spanish version of Child Abuse Potential (CAP) Inventory was studied. CAP uses ecological-systemic model of child maltreatment to evaluate individual, family, and social factors facilitating physical child abuse. Depression and marital adjustment were measured in three groups of mothers. Results found…

  1. Prospective Evaluation of PI-RADS™ Version 2 Using the International Society of Urological Pathology Prostate Cancer Grade Group System.

    PubMed

    Mehralivand, Sherif; Bednarova, Sandra; Shih, Joanna H; Mertan, Francesca V; Gaur, Sonia; Merino, Maria J; Wood, Bradford J; Pinto, Peter A; Choyke, Peter L; Turkbey, Baris

    2017-09-01

    The PI-RADS™ (Prostate Imaging Reporting and Data System), version 2 scoring system, introduced in 2015, is based on expert consensus. In the same time frame ISUP (International Society of Urological Pathology) introduced a new pathological scoring system for prostate cancer. Our goal was to prospectively evaluate the cancer detection rates for each PI-RADS, version 2 category and compare them to ISUP group scores in patients undergoing systematic biopsy and magnetic resonance imaging-transrectal ultrasound fusion guided biopsy. A total of 339 treatment naïve patients prospectively underwent multiparametric magnetic resonance imaging evaluated with PI-RADS, version 2 with subsequent systematic and fusion guided biopsy from May 2015 to May 2016. ISUP scores were applied to pathological specimens. An ISUP score of 2 or greater (ie Gleason 3 + 4 or greater) was defined as clinically significant prostate cancer. Cancer detection rates were determined for each PI-RADS, version 2 category as well as for the T2 weighted PI-RADS, version 2 categories in the peripheral zone. The cancer detection rate for PI-RADS, version 2 categories 1, 2, 3, 4 and 5 was 25%, 20.2%, 24.8%, 39.1% and 86.9% for all prostate cancer, and 0%, 9.6%, 12%, 22.1% and 72.4% for clinically significant prostate cancer, respectively. On T2-weighted magnetic resonance imaging the cancer detection rate in the peripheral zone was significantly higher for PI-RADS, version 2 category 4 than for overall PI-RADS, version 2 category 4 in the peripheral zone (all prostate cancer 36.6% vs 48.1%, p = 0.001, and clinically significant prostate cancer 22.9% vs 32.6%, p = 0.002). The cancer detection rate increases with higher PI-RADS, version 2 categories. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  2. Analytical investigation of the dynamics of tethered constellations in Earth orbit, phase 2

    NASA Technical Reports Server (NTRS)

    Lorenzini, E.; Arnold, D. A.; Grossi, M. D.; Gullahorn, G. E.

    1986-01-01

    The development of a two dimensional analytical model that describes the dynamics of an n-mass vertical tethered system is reported. Two different approaches are described: in the first one the control quantities are the independent variables while in the second one the Cartesian coordinates of each mass expressed in the orbiting reference frame are the independent variables. The latter model was used in the 3-mass version to simulate the dynamics of the tethered system in applications involving the displacement of the middle mass along the tether. In particular, issues related to reproducing predetermined acceleration profiles and g-tuning are reported.

  3. The Portals 4.0 network programming interface.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Brian W.; Brightwell, Ronald Brian; Pedretti, Kevin

    2012-11-01

    This report presents a specification for the Portals 4.0 network programming interface. Portals 4.0 is intended to allow scalable, high-performance network communication between nodes of a parallel computing system. Portals 4.0 is well suited to massively parallel processing and embedded systems. Portals 4.0 represents an adaptation of the data movement layer developed for massively parallel processing platforms, such as the 4500-node Intel TeraFLOPS machine. Sandia's Cplant cluster project motivated the development of Version 3.0, which was later extended to Version 3.3 as part of the Cray Red Storm machine and XT line. Version 4.0 is targeted to the next generation of machines employing advanced network interface architectures that support enhanced offload capabilities.

  4. Evaluating the improvements of the BOLAM meteorological model operational at ISPRA: A case study approach - preliminary results

    NASA Astrophysics Data System (ADS)

    Mariani, S.; Casaioli, M.; Lastoria, B.; Accadia, C.; Flavoni, S.

    2009-04-01

    The Institute for Environmental Protection and Research - ISPRA (formerly the Agency for Environmental Protection and Technical Services - APAT) has run operationally since 2000 an integrated meteo-marine forecasting chain, named the Hydro-Meteo-Marine Forecasting System (Sistema Idro-Meteo-Mare - SIMM), formed by a cascade of four numerical models, telescoping from the Mediterranean basin to the Venice Lagoon, and initialized by means of analyses and forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF). The operational integrated system consists of a meteorological model, the parallel version of the BOlogna Limited Area Model (BOLAM), coupled over the Mediterranean Sea with a WAve Model (WAM), a high-resolution shallow-water model of the Adriatic and Ionian Seas, namely the Princeton Ocean Model (POM), and a finite-element version of the same model (VL-FEM) on the Venice Lagoon, aimed at forecasting acqua alta events. Recently, the physically based, fully distributed, rainfall-runoff TOPographic Kinematic APproximation and Integration (TOPKAPI) model has been integrated into the system, coupled to BOLAM, over two river basins located in the central and northeastern parts of Italy, respectively. However, at present this latter part of the forecasting chain is not operational and is used in a research configuration. BOLAM was originally implemented in 2000 on the Quadrics parallel supercomputer (and for this reason is also referred to as QBOLAM), and only at the end of 2006 was it ported (together with the other operational marine models of the forecasting chain) onto the Silicon Graphics Inc. (SGI) Altix 8-processor machine. In particular, due to the Quadrics implementation, the Kuo scheme was formerly used in QBOLAM for the cumulus convection parameterization. By contrast, when porting SIMM onto the Altix Linux cluster, it became possible to implement in QBOLAM the more advanced convection parameterization of Kain and Fritsch. A fully updated serial version of the BOLAM code has recently been acquired. Code improvements include a more accurate advection scheme (Weighted Average Flux), explicit advection of five hydrometeors, and state-of-the-art parameterization schemes for radiation, convection, boundary layer turbulence, and soil processes (also with a possible choice among different available schemes). The operational implementation of the new code into the SIMM model chain, which requires the development of a parallel version, will be achieved during 2009. In view of this goal, comparative verification of the different model versions' skill represents a fundamental task. To this end, it was decided to evaluate the performance improvement of the new BOLAM code (in the available serial version, hereinafter BOLAM 2007) with respect to the version with the Kain-Fritsch scheme (hereinafter KF version) and to the older one employing the Kuo scheme (hereinafter Kuo version). In the present work, verification of precipitation forecasts from the three BOLAM versions is carried out using a case-study approach. The intense rainfall episode that occurred over Italy on 10-17 December 2008 has been considered; this event produced severe damage in Rome and its surrounding areas. Objective and subjective verification methods have been employed in order to evaluate model performance against an observational dataset including rain gauge observations and satellite imagery.
Subjective comparison of observed and forecast precipitation fields gives an overall description of the forecast quality. Spatial errors (e.g., shifting and pattern errors) and the rainfall volume error can be assessed quantitatively by means of object-oriented methods. By comparing satellite images with model forecast fields, it is possible to investigate the differences between the evolution of the observed weather system and the predicted one, and its sensitivity to the improvements in the model code. Finally, the error in forecasting the cyclone evolution can be tentatively related to the precipitation forecast error.

  5. Evaluation of NorESM-OC (versions 1 and 1.2), the ocean carbon-cycle stand-alone configuration of the Norwegian Earth System Model (NorESM1)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwinger, Jorg; Goris, Nadine; Tjiputra, Jerry F.

    Idealised and hindcast simulations performed with the stand-alone ocean carbon-cycle configuration of the Norwegian Earth System Model (NorESM-OC) are described and evaluated. We present simulation results of three different model configurations (two different model versions at different grid resolutions) using two different atmospheric forcing data sets. Model version NorESM-OC1 corresponds to the version that is included in the NorESM-ME1 fully coupled model, which participated in CMIP5. The main update between NorESM-OC1 and NorESM-OC1.2 is the addition of two new options for the treatment of sinking particles. We find that using a constant sinking speed, which has been the standard in NorESM's ocean carbon cycle module HAMOCC (HAMburg Ocean Carbon Cycle model), does not transport enough particulate organic carbon (POC) into the deep ocean below approximately 2000 m depth. The two newly implemented parameterisations, a particle aggregation scheme with prognostic sinking speed, and a simpler scheme that uses a linear increase in the sinking speed with depth, provide better agreement with observed POC fluxes. Additionally, reduced deep ocean biases of oxygen and remineralised phosphate indicate a better performance of the new parameterisations. For model version 1.2, a re-tuning of the ecosystem parameterisation has been performed, which (i) reduces previously too high primary production at high latitudes, (ii) consequently improves model results for surface nutrients, and (iii) reduces alkalinity and dissolved inorganic carbon biases at low latitudes. We use hindcast simulations with prescribed observed and constant (pre-industrial) atmospheric CO 2 concentrations to derive the past and contemporary ocean carbon sink. As a result, for the period 1990–1999 we find an average ocean carbon uptake ranging from 2.01 to 2.58 Pg C yr -1 depending on model version, grid resolution, and atmospheric forcing data set.

  6. Evaluation of NorESM-OC (versions 1 and 1.2), the ocean carbon-cycle stand-alone configuration of the Norwegian Earth System Model (NorESM1)

    DOE PAGES

    Schwinger, Jorg; Goris, Nadine; Tjiputra, Jerry F.; ...

    2016-08-02

    Idealised and hindcast simulations performed with the stand-alone ocean carbon-cycle configuration of the Norwegian Earth System Model (NorESM-OC) are described and evaluated. We present simulation results of three different model configurations (two different model versions at different grid resolutions) using two different atmospheric forcing data sets. Model version NorESM-OC1 corresponds to the version that is included in the NorESM-ME1 fully coupled model, which participated in CMIP5. The main update between NorESM-OC1 and NorESM-OC1.2 is the addition of two new options for the treatment of sinking particles. We find that using a constant sinking speed, which has been the standard in NorESM's ocean carbon cycle module HAMOCC (HAMburg Ocean Carbon Cycle model), does not transport enough particulate organic carbon (POC) into the deep ocean below approximately 2000 m depth. The two newly implemented parameterisations, a particle aggregation scheme with prognostic sinking speed, and a simpler scheme that uses a linear increase in the sinking speed with depth, provide better agreement with observed POC fluxes. Additionally, reduced deep ocean biases of oxygen and remineralised phosphate indicate a better performance of the new parameterisations. For model version 1.2, a re-tuning of the ecosystem parameterisation has been performed, which (i) reduces previously too high primary production at high latitudes, (ii) consequently improves model results for surface nutrients, and (iii) reduces alkalinity and dissolved inorganic carbon biases at low latitudes. We use hindcast simulations with prescribed observed and constant (pre-industrial) atmospheric CO 2 concentrations to derive the past and contemporary ocean carbon sink. As a result, for the period 1990–1999 we find an average ocean carbon uptake ranging from 2.01 to 2.58 Pg C yr -1 depending on model version, grid resolution, and atmospheric forcing data set.
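
    The difference between the constant and the linearly increasing sinking speed can be seen analytically: with first-order remineralisation r, the flux obeys dF/dz = -(r/w(z)) F, giving an exponential profile for constant w and a slower power-law profile for w(z) = w0 + a z. The short Python sketch below evaluates both; the rate and speed values are illustrative, not HAMOCC's tuned parameters.

```python
# Sketch of why the sinking-speed choice matters for deep POC flux: with
# first-order remineralisation r, dF/dz = -(r / w(z)) * F.  A constant speed
# gives an exponential decay; a speed increasing linearly with depth gives a
# slower, power-law-like decay that carries more carbon below 2000 m.  The
# rate and speed values are illustrative, not HAMOCC's tuned parameters.
import numpy as np

z = np.linspace(0.0, 4000.0, 401)     # depth (m)
r = 0.05                              # remineralisation rate (1/day)
w_const = 5.0                         # constant sinking speed (m/day)
w0, a = 5.0, 0.02                     # linear scheme: w(z) = w0 + a*z (m/day)

# Analytic solutions of dF/dz = -(r / w(z)) F with F(0) = 1.
flux_const = np.exp(-r * z / w_const)
flux_linear = ((w0 + a * z) / w0) ** (-r / a)

i2000 = np.argmin(np.abs(z - 2000.0))
print("flux fraction reaching 2000 m: constant w =", round(flux_const[i2000], 4),
      " linear w(z) =", round(flux_linear[i2000], 4))
```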

  7. RSM 1.0 - A RESUPPLY SCHEDULER USING INTEGER OPTIMIZATION

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    RSM, Resupply Scheduling Modeler, is a fully menu-driven program that uses integer programming techniques to determine an optimum schedule for replacing components on or before the end of a fixed replacement period. Although written to analyze the electrical power system on the Space Station Freedom, RSM is quite general and can be used to model the resupply of almost any system subject to user-defined resource constraints. RSM is based on a specific form of the general linear programming problem in which all variables in the objective function and all variables in the constraints are integers. While more computationally intensive, integer programming was required for accuracy when modeling systems with small quantities of components. Input values for component life can be real numbers; RSM converts them to integers by dividing the lifetime by the period duration, then reducing the result to the next lowest integer. For each component, there is a set of constraints that ensure it is replaced before its lifetime expires. RSM includes user-defined constraints such as transportation mass and volume limits, as well as component life, available repair crew time and assembly sequences. A weighting factor allows the program to minimize factors such as cost. The program then performs an iterative analysis, which is displayed during the processing. A message gives the first period in which resources are being exceeded on each iteration. If the scheduling problem is infeasible, the final message will also indicate the first period in which resources were exceeded. RSM is written in APL2 for IBM PC series computers and compatibles. A stand-alone executable version of RSM is provided; however, this is a "packed" version of RSM which can only utilize the memory within the 640K DOS limit. This executable requires at least 640K of memory and DOS 3.1 or higher. Source code for an APL2/PC workspace version is also provided. This version of RSM can make full use of any installed extended memory but must be run with the APL2 interpreter; and it requires an 80486 based microcomputer or an 80386 based microcomputer with an 80387 math coprocessor, at least 2Mb of extended memory, and DOS 3.3 or higher. The standard distribution medium for this package is one 5.25 inch 360K MS-DOS format diskette. RSM was developed in 1991. APL2 and IBM PC are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
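
    The lifetime-to-period truncation and the "replace before expiry" constraints described above map naturally onto a small binary integer program. The Python sketch below illustrates this on two invented components using the PuLP modelling library; RSM itself is written in APL2, so the solver, component data, and mass limit here are purely illustrative assumptions.

```python
# Toy resupply-scheduling sketch in the spirit of RSM (which is written in
# APL2): component lifetimes are converted to whole periods by truncation,
# and a small binary integer program enforces replacement before expiry under
# a per-period launch-mass limit.  PuLP/CBC, the component data and the mass
# limit are my illustrative assumptions, not part of RSM.
import math
import pulp

period_days = 90.0
components = {                       # name: (lifetime in days, mass in kg)
    "battery": (400.0, 120.0),
    "pump": (700.0, 60.0),
}
horizon = range(1, 9)                # 8 resupply periods
mass_limit = 150.0                   # kg that can be launched per period

# Lifetime in whole periods, truncated downward as RSM is described to do.
life = {c: math.floor(d / period_days) for c, (d, _) in components.items()}

prob = pulp.LpProblem("resupply", pulp.LpMinimize)
x = {(c, t): pulp.LpVariable(f"x_{c}_{t}", cat="Binary")
     for c in components for t in horizon}

# Objective: minimise total launched mass over the horizon.
prob += pulp.lpSum(components[c][1] * x[c, t] for c in components for t in horizon)

# Each component must be replaced at least once in every window of `life` periods.
for c in components:
    L = life[c]
    for start in range(1, len(horizon) - L + 2):
        prob += pulp.lpSum(x[c, t] for t in range(start, start + L)) >= 1

# Per-period launch-mass limit.
for t in horizon:
    prob += pulp.lpSum(components[c][1] * x[c, t] for c in components) <= mass_limit

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (c, t), var in sorted(x.items(), key=lambda kv: (kv[0][1], kv[0][0])):
    if var.value() == 1:
        print(f"period {t}: replace {c}")
```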

  8. SU-D-207-07: Implementation of Full/half Bowtie Filter Model in a Commercial Treatment Planning System for Kilovoltage X-Ray Imaging Dose Estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S; Alaei, P

    2015-06-15

    Purpose: To implement full/half bowtie filter models in a commercial treatment planning system (TPS) to calculate the kilovoltage (kV) x-ray imaging dose of a Varian On-Board Imager (OBI) cone beam CT (CBCT) system. Methods: Full/half bowtie filters of the Varian OBI were created as compensator models in the Pinnacle TPS (version 9.6) using Matlab software (version 2011a). The profiles of both bowtie filters were acquired from the manufacturer, imported into the Matlab system, and hard coded in binary file format. A Pinnacle script was written to import each bowtie filter's data into a Pinnacle treatment plan as a compensator. A kV x-ray beam model without the compensator model included was commissioned for each bowtie filter setting based on percent depth dose and lateral profile data acquired from Monte Carlo simulations. To validate the bowtie filter models, a rectangular water phantom was generated in the planning system and an anterior/posterior beam with each bowtie filter was created. Using the Pinnacle script, each bowtie filter compensator was added to the treatment plan. The lateral profile at a depth of 3 cm and percent depth dose were measured using an ion chamber and compared with the data extracted from the treatment plans. Results: The kV x-ray beams for both the full and half bowtie filters have been modeled in a commercial TPS. The differences in lateral and depth dose profiles between dose calculations and ion chamber measurements were within 6%. Conclusion: Both full/half bowtie filter models provide reasonable results in kV x-ray dose calculations in the water phantom. This study demonstrates the possibility of using a model-based treatment planning system to calculate the kV imaging dose for both full and half bowtie filter modes. Further study is to be performed to evaluate the models in clinical situations.

  9. Regional Precipitation Forecast with Atmospheric InfraRed Sounder (AIRS) Profile Assimilation

    NASA Technical Reports Server (NTRS)

    Chou, S.-H.; Zavodsky, B. T.; Jedloved, G. J.

    2010-01-01

    Advanced technology in hyperspectral sensors such as the Atmospheric InfraRed Sounder (AIRS; Aumann et al. 2003) on NASA's polar orbiting Aqua satellite retrieve higher vertical resolution thermodynamic profiles than their predecessors due to increased spectral resolution. Although these capabilities do not replace the robust vertical resolution provided by radiosondes, they can serve as a complement to radiosondes in both space and time. These retrieved soundings can have a significant impact on weather forecasts if properly assimilated into prediction models. Several recent studies have evaluated the performance of specific operational weather forecast models when AIRS data are included in the assimilation process. LeMarshall et al. (2006) concluded that AIRS radiances significantly improved 500 hPa anomaly correlations in medium-range forecasts of the Global Forecast System (GFS) model. McCarty et al. (2009) demonstrated similar forecast improvement in 0-48 hour forecasts in an offline version of the operational North American Mesoscale (NAM) model when AIRS radiances were assimilated at the regional scale. Reale et al. (2008) showed improvements to Northern Hemisphere 500 hPa height anomaly correlations in NASA's Goddard Earth Observing System Model, Version 5 (GEOS-5) global system with the inclusion of partly cloudy AIRS temperature profiles. Singh et al. (2008) assimilated AIRS temperature and moisture profiles into a regional modeling system for a study of a heavy rainfall event during the summer monsoon season in Mumbai, India. This paper describes an approach to assimilate AIRS temperature and moisture profiles into a regional configuration of the Advanced Research Weather Research and Forecasting (WRF-ARW) model using its three-dimensional variational (3DVAR) assimilation system (WRF-Var; Barker et al. 2004). Section 2 describes the AIRS instrument and how the quality indicators are used to intelligently select the highest-quality data for assimilation. Section 3 presents an overall precipitation improvement with AIRS assimilation during a 37-day case study period, and Section 4 focuses on a single case study to further investigate the meteorological impact of AIRS profiles on synoptic scale models. Finally, Section 5 provides a summary of the paper.
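
    At its core, the 3DVAR assimilation mentioned above minimises a cost function combining background and observation terms, J(x) = (x - xb)^T B^{-1} (x - xb) + (Hx - y)^T R^{-1} (Hx - y), which for a linear observation operator has a closed-form analysis. The toy Python sketch below applies that formula to a three-level temperature column; it is not WRF-Var, and the covariances and values are invented.

```python
# Tiny 3DVAR sketch (not WRF-Var): minimise
#   J(x) = (x - xb)^T B^{-1} (x - xb) + (H x - y)^T R^{-1} (H x - y)
# for a 3-level temperature column with a linear observation operator.  The
# background, covariances and "retrieved profile" values are made up.
import numpy as np

xb = np.array([288.0, 275.0, 250.0])          # background column (K)
B = np.array([[1.0, 0.5, 0.1],
              [0.5, 1.0, 0.5],
              [0.1, 0.5, 1.0]])                # background error covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])                # observe the two lowest levels
y = np.array([289.5, 276.8])                   # retrieved-profile "observations"
R = np.diag([0.64, 0.64])                      # observation error covariance

# Closed-form minimiser: xa = xb + K (y - H xb), K = B H^T (H B H^T + R)^{-1}.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
xa = xb + K @ (y - H @ xb)
print("analysis column:", np.round(xa, 2))
```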

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tribbia, Joseph

    NCAR brought the latest version of the Community Earth System Model (version 1, CESM1) into the mix of models in the NMME effort. This new version uses our newest atmospheric model CAM5 and produces a coupled climate and ENSO that are generally as good or better than those of the Community Climate System Model version 4 (CCSM4). Compared to CCSM4, the new coupled model has a superior climate response with respect to low clouds in both the subtropical stratus regimes and the Arctic. However, CESM1 has been run to date using a prognostic aerosol model that more than doubles its computational cost. We are currently evaluating a version of the new model using prescribed aerosols and expect it will be ready for integrations in summer 2012. Because of this, NCAR has not been able to complete the hindcast integrations using the NCAR loosely-coupled ensemble Kalman filter assimilation method, nor has it contributed to the current (Stage I) NMME operational utilization. The expectation is that this model will be included in the NMME in late 2012 or early 2013. The initialization method will utilize the Ensemble Kalman Filter Assimilation methods developed at NCAR using the Data Assimilation Research Testbed (DART) in conjunction with Jeff Anderson’s team in CISL. This methodology has been used in our decadal prediction contributions to CMIP5. During the course of this project, NCAR has set up and performed all the needed hindcast and forecast simulations and provided the requested fields to our collaborators. In addition, NCAR researchers have participated fully in research themes (i) and (ii). Specifically, i) we have begun to evaluate and optimize our system in hindcast mode, focusing on the optimal number of ensemble members, methodologies to recalibrate individual dynamical models, and assessing our forecasts across multiple time scales, i.e., beyond two weeks, and ii) we have begun investigation of the role of different ocean initial conditions in seasonal forecasts. The completion of the calibration hindcasts for Seasonal to Interannual (SI) predictions and the maintenance of the data archive associated with the NCAR portion of this effort has been the responsibility of the Project Scientist I (Alicia Karspeck), who was partially supported on this project.
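
    The initialization strategy mentioned above rests on the ensemble Kalman filter. The Python sketch below shows a minimal stochastic (perturbed-observation) EnKF update for a single scalar observation; it illustrates the general idea only and is not DART's serial assimilation algorithm, and all ensemble values and error variances are invented.

```python
# Minimal stochastic ensemble Kalman filter update for one scalar observation
# (an illustration of the general EnKF idea, not DART's serial algorithm).
# Ensemble values, observation and its error variance are made up.
import numpy as np

rng = np.random.default_rng(42)
ensemble = rng.normal(15.0, 2.0, size=(40, 3))   # 40 members, 3-variable state
obs, obs_var = 17.0, 1.0
H = np.array([1.0, 0.0, 0.0])                     # observe the first variable

hx = ensemble @ H                                 # model-predicted observation
x_mean = ensemble.mean(axis=0)
hx_mean = hx.mean()

# Sample covariances between state and predicted observation.
P_xh = ((ensemble - x_mean).T @ (hx - hx_mean)) / (len(hx) - 1)
P_hh = hx.var(ddof=1)
gain = P_xh / (P_hh + obs_var)                    # Kalman gain (vector)

# Perturbed-observation update: each member sees its own noisy observation.
perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=len(hx))
analysis = ensemble + np.outer(perturbed - hx, gain)

print("prior mean:", np.round(x_mean, 2))
print("posterior mean:", np.round(analysis.mean(axis=0), 2))
```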

  11. [Reconstruction assisted by 3D printing in maxillofacial surgery].

    PubMed

    Ernoult, C; Bouletreau, P; Meyer, C; Aubry, S; Breton, P; Bachelet, J-T

    2015-04-01

    Three-dimensional (3D) models appeared in the medical field 20 years ago. The recent development of consumer 3D printers explains the renewed interest in this technology. We describe the technical and practical modalities of this surgical tool, illustrated by concrete examples. The OsiriX® software (version 5.8.5, Geneva, Switzerland) was used for 3D surface reconstruction of the area of interest and for generating and exporting the ".stl" file. The NetFabb® software (Basic version 5.1.1, Lupburg, Germany) was used to prepare the ".stl" file. The 3D printer was an Up Plus 2 Easy 120® (PP3DP, Beijing Technology Co. TierTime Ltd., China), which uses fused deposition modeling. The Up!® software handled the 3D printing as required. The first case illustrated the value of 3D printing in the upper face (frontal sinus and orbital roof). The second case concerned the preconfiguration of the osteosynthesis material for a complex fracture of the midface through a "mirroring" system. The third case showed the conformation of a pre-reconstruction for segmental mandibulectomy. Current 3D printers are easy to use and represent a promising solution for medical prototyping. 3D printing will quickly become indispensable because of its advantages: information sharing, simulation, surgical guides, and pedagogy. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  12. SMP: A solid modeling program version 2.0

    NASA Technical Reports Server (NTRS)

    Randall, D. P.; Jones, K. H.; Vonofenheim, W. H.; Gates, R. L.; Matthews, C. G.

    1986-01-01

    The Solid Modeling Program (SMP) provides the capability to model complex solid objects through the composition of primitive geometric entities. In addition to the construction of solid models, SMP has extensive facilities for model editing, display, and analysis. The geometric model produced by the software system can be output in a format compatible with existing analysis programs such as PATRAN-G. The present version of the SMP software supports six primitives: boxes, cones, spheres, paraboloids, tori, and trusses. The details for creating each of the major primitive types are presented. The analysis capabilities of SMP, including interfaces to existing analysis programs, are discussed.

  13. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.

  14. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer-based artificial intelligence tools. CLIPS is a forward-chaining, rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network on the number and content of their fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (functions written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line-oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line-oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window-oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving, which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh, and DEC VAX series computers operating under VMS or ULTRIX. The line-oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986; Version 4.2 was released in July 1988 and Version 4.3 in June 1989.
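
    To make the assert/match/fire cycle described above easier to picture, here is a small illustrative sketch in Python. It is not the CLIPS engine (there is no Rete network, agenda, or conflict resolution), and the facts and rule are invented for the example.

      # Toy forward-chaining loop illustrating the assert/match/fire cycle.
      # This is NOT CLIPS; names, facts, and the single rule are hypothetical.
      facts = {("duck", "animal-is")}          # working memory of simple tuples

      def rule_duck_sound(fact):
          """IF (animal-is duck) THEN assert (sound-is quack)."""
          if fact == ("duck", "animal-is"):
              return ("quack", "sound-is")
          return None

      rules = [rule_duck_sound]

      changed = True
      while changed:                           # keep firing rules until quiescence
          changed = False
          for rule in rules:
              for fact in list(facts):
                  new_fact = rule(fact)
                  if new_fact and new_fact not in facts:
                      facts.add(new_fact)      # assert the derived fact
                      changed = True

      print(facts)   # {('duck', 'animal-is'), ('quack', 'sound-is')}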

  15. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer-based artificial intelligence tools. CLIPS is a forward-chaining, rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network on the number and content of their fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (functions written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line-oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line-oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window-oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving, which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh, and DEC VAX series computers operating under VMS or ULTRIX. The line-oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986; Version 4.2 was released in July 1988 and Version 4.3 in June 1989.

  16. JPSS-1 VIIRS version 2 at-launch relative spectral response characterization and performance

    NASA Astrophysics Data System (ADS)

    Moeller, Chris; Schwarting, Tom; McIntire, Jeff; Moyer, David I.; Zeng, Jinan

    2016-09-01

    The relative spectral response (RSR) characterization of the JPSS-1 VIIRS spectral bands has achieved "at launch" status in the VIIRS Data Analysis Working Group February 2016 Version 2 RSR release. The Version 2 release improves upon the June 2015 Version 1 release by including December 2014 NIST T-SIRCUS spectral measurements of VIIRS VisNIR bands in the analysis and by correcting for CO2 influence on the band M13 RSR. The T-SIRCUS-based characterization is merged with the summer 2014 SpMA-based characterization of VisNIR bands (Version 1 release) to yield a "fused" RSR for these bands, combining the strengths of the T-SIRCUS and the SpMA measurement systems. The M13 RSR is updated by applying a model-based correction to mitigate CO2 attenuation of the SpMA source signal that occurred during M13 spectral measurements. The Version 2 release carries forward the Version 1 RSR for those bands that were not updated (M8-M12, M14-M16A/B, I3-I5, DNBMGS). The Version 2 release includes band-average (over all detectors and subsamples) RSR as well as supporting RSR for each detector and subsample. The at-launch band-average RSR have been used to populate Look-Up Tables supporting the sensor data record and environmental data record at-launch science products. Spectral performance metrics show that JPSS-1 VIIRS RSR are compliant with specifications, with a few minor exceptions. The Version 2 release, which replaces the Version 1 release, is currently available on the password-protected NASA JPSS-1 eRooms under EAR99 control.
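
    As a rough illustration of the band-average product described above (an RSR averaged over all detectors and renormalized), the sketch below builds a band-average curve from hypothetical per-detector responses. The detector count, curve shapes, and wavelengths are invented; the actual Version 2 processing, which fuses T-SIRCUS and SpMA measurements, is far more involved.

      import numpy as np

      # Hypothetical per-detector relative spectral responses for one band,
      # already interpolated to a common wavelength grid (all values invented).
      wavelength_nm = np.linspace(855.0, 885.0, 301)
      n_detectors = 16
      center = 870.0 + 0.2 * np.random.default_rng(0).standard_normal(n_detectors)
      rsr_per_det = np.exp(-0.5 * ((wavelength_nm[None, :] - center[:, None]) / 5.0) ** 2)

      # Band-average RSR: mean over detectors, renormalized to unit peak.
      band_avg = rsr_per_det.mean(axis=0)
      band_avg /= band_avg.max()

      # A simple spectral metric: the RSR-weighted center wavelength.
      center_wavelength = np.trapz(wavelength_nm * band_avg, wavelength_nm) / np.trapz(band_avg, wavelength_nm)
      print(f"band-average center wavelength ~ {center_wavelength:.2f} nm")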

  17. Evaluation of a new CNRM-CM6 model version for seasonal climate predictions

    NASA Astrophysics Data System (ADS)

    Volpi, Danila; Ardilouze, Constantin; Batté, Lauriane; Dorel, Laurant; Guérémy, Jean-François; Déqué, Michel

    2017-04-01

    This work presents the quality assessment of a new version of the Météo-France coupled climate prediction system, which has been developed in the EU COPERNICUS Climate Change Services framework to carry out seasonal forecasts. The system is based on the CNRM-CM6 model, with Arpege-Surfex 6.2.2 as the atmosphere/land component and Nemo 3.2 as the ocean component, with the sea-ice component Gelato 6.0 embedded directly in the ocean model. In order to obtain a robust diagnostic, the experiment comprises 60 ensemble members generated with stochastic dynamic perturbations. The experiment has been performed over a 37-year re-forecast period from 1979 to 2015, with two start dates per year, on May 1 and November 1. The evaluation of the predictive skill of the model is presented from two perspectives. The first is the ability of the model to respond faithfully to positive or negative ENSO, NAO and QBO events, independently of the predictability of these events. This assessment is carried out through a composite analysis and shows that the model succeeds in reproducing the main patterns of 2-meter temperature, precipitation and geopotential height at 500 hPa during the winter season. The second is the model's predictive skill for the same events (positive and negative ENSO, NAO and QBO).
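
    The composite analysis mentioned above, which averages forecast anomalies over positive-event years and subtracts the average over negative-event years regardless of whether the events were predicted, can be sketched in a few lines. The index threshold, array shapes, and data below are placeholders.

      import numpy as np

      # Placeholder data: 37 winters of 2-m temperature anomalies on a lat/lon
      # grid, plus an ENSO index per winter (e.g. Nino3.4); values are random.
      rng = np.random.default_rng(1)
      t2m_anom = rng.standard_normal((37, 90, 180))   # (year, lat, lon)
      enso_index = rng.standard_normal(37)

      # Composite: mean anomaly over positive-event years minus negative-event
      # years, using a +/- 0.5 threshold (an arbitrary choice for illustration).
      pos = t2m_anom[enso_index > 0.5].mean(axis=0)
      neg = t2m_anom[enso_index < -0.5].mean(axis=0)
      composite = pos - neg                            # pattern attributed to ENSO
      print(composite.shape)                           # (90, 180)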

  18. Parallel Unsteady Turbopump Simulations for Liquid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin C.; Kwak, Dochan; Chan, William

    2000-01-01

    This paper reports the progress being made towards complete turbo-pump simulation capability for liquid rocket engines. The Space Shuttle Main Engine (SSME) turbo-pump impeller is used as a test case for the performance evaluation of the MPI and hybrid MPI/OpenMP versions of the INS3D code. A computational model of a turbo-pump has also been developed for the shuttle upgrade program. Relative motion of the grid system for rotor-stator interaction was obtained by employing overset grid techniques. Time-accuracy of the scheme has been evaluated by using simple test cases. Unsteady computations for the SSME turbo-pump, which contains 136 zones with 35 million grid points, are currently underway on Origin 2000 systems at NASA Ames Research Center. Results from time-accurate simulations with moving-boundary capability, together with the performance of the parallel versions of the code, will be presented in the final paper.
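
    The parallel evaluation described above distributes overset grid zones across processes. The following sketch shows the general idea with mpi4py in Python; it is not the INS3D implementation, and the per-zone sizes and round-robin assignment are illustrative assumptions only.

      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      size = comm.Get_size()

      n_zones = 136                       # e.g. the 136-zone turbopump grid system
      # Round-robin assignment of overset zones to MPI ranks (illustrative only;
      # a real solver would balance by grid-point count, not zone count).
      my_zones = [z for z in range(n_zones) if z % size == rank]

      local_points = sum(250_000 for _ in my_zones)    # placeholder per-zone size
      total_points = comm.allreduce(local_points, op=MPI.SUM)
      if rank == 0:
          print(f"{size} ranks, {total_points} grid points in total")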

  19. Efficient Preconditioning for the p-Version Finite Element Method in Two Dimensions

    DTIC Science & Technology

    1989-10-01

    In this paper, we study fast parallel preconditioners for systems of equations arising from the p-version finite element method. The p-version finite element...computations and the solution of a relatively small global auxiliary problem. We study two different methods. In the first (Section 3), the global...20], will be studied in the next section. Problem (3.12) is obviously much more easily solved than the original problem, and the procedure is highly

  20. Confirmatory factor analysis of 2 versions of the Brief Pain Inventory in an ambulatory population indicates that sleep interference should be interpreted separately.

    PubMed

    Walton, David M; Putos, Joseph; Beattie, Tyler; MacDermid, Joy C

    2016-07-01

    The Brief Pain Inventory (BPI-SF) is a widely used generic pain interference scale; however, its factor structure remains unclear. An expanded 10-item version of the Interference subscale has been proposed, but the additional value of the 3 extra items has not been rigorously evaluated. The purpose of this study was to evaluate and contrast the factorial and concurrent validity of the original 7-item and 10-item versions of the BPI-SF in a large heterogeneous sample of patients with chronic pain. Exploratory and confirmatory factor analyses were conducted on independent subsets of the sample, and concurrent correlations with scales capturing similar constructs were evaluated. Two independent exploratory factor analyses (n=500 each) supported a single interference factor in both the 7- and 10-item versions, while confirmatory factor analysis (N=1000) suggested that a 2-factor structure (Physical and Affective) provided better fit. A 3-factor model, where sleep interference was the third factor, improved model fit further. There was no significant difference in model fit between the 7- and 10-item versions. Concurrent associations with measures of general health, pain intensity and pain-related cognitions were all in the anticipated direction and magnitude and did not differ by version of the BPI-SF. The addition of 3 extra items to the original 7-item Interference subscale of the BPI-SF did not improve psychometric properties. The combined results lead us to endorse a 3-factor structure (Physical, Affective, and Sleep Interference) as the more statistically and conceptually sound option. Copyright © 2016 Scandinavian Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
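
    A full confirmatory factor analysis with fit indices such as CFI or RMSEA would normally be done in dedicated SEM software. As a rough analogue of the model comparison described above, the sketch below compares 1-, 2-, and 3-factor maximum-likelihood factor models by cross-validated log-likelihood with scikit-learn; the item data are random placeholders standing in for the 10 interference items.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis
      from sklearn.model_selection import cross_val_score

      # Placeholder item-level data standing in for the 10 BPI interference items
      # (1000 respondents); real use would load the questionnaire responses.
      rng = np.random.default_rng(2)
      items = rng.normal(size=(1000, 10))

      # Compare 1-, 2- and 3-factor maximum-likelihood factor models by
      # cross-validated average log-likelihood (higher is better). This is a
      # rough analogue of the CFA model comparison, not a replacement for it.
      for k in (1, 2, 3):
          fa = FactorAnalysis(n_components=k, random_state=0)
          score = cross_val_score(fa, items).mean()
          print(f"{k}-factor model: mean held-out log-likelihood = {score:.3f}")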

  1. A new version of code Java for 3D simulation of the CCA model

    NASA Astrophysics Data System (ADS)

    Zhang, Kebo; Xiong, Hailing; Li, Chao

    2016-07-01

    In this paper we present a new version of the CCA model program. In order to benefit from the latest technologies, we migrated the running environment from JDK 1.6 to JDK 1.7. The old program was also restructured into a new framework, which improves its extensibility.

  2. Mexican/Mexican American Adolescents and "Keepin' It REAL": An Evidence-Based Substance Use Prevention Program

    ERIC Educational Resources Information Center

    Kulis, Stephen; Marsiglia, Flavio F.; Elek, Elvira; Dustman, Patricia; Wagstaff, David A.; Hecht, Michael L.

    2005-01-01

    A randomized trial tested the efficacy of three curriculum versions teaching drug resistance strategies: one modeled on Mexican American culture, another modeled on European American and African American culture, and a multicultural version. Self-report data at baseline and 14 months post-intervention were obtained from 3,402 Mexican heritage…

  3. Description of input and examples for PHREEQC version 3: a computer program for speciation, batch-reaction, one-dimensional transport, and inverse geochemical calculations

    USGS Publications Warehouse

    Parkhurst, David L.; Appelo, C.A.J.

    2013-01-01

    PHREEQC version 3 is a computer program written in the C and C++ programming languages that is designed to perform a wide variety of aqueous geochemical calculations. PHREEQC implements several types of aqueous models: two ion-association aqueous models (the Lawrence Livermore National Laboratory model and WATEQ4F), a Pitzer specific-ion-interaction aqueous model, and the SIT (Specific ion Interaction Theory) aqueous model. Using any of these aqueous models, PHREEQC has capabilities for (1) speciation and saturation-index calculations; (2) batch-reaction and one-dimensional (1D) transport calculations with reversible and irreversible reactions, which include aqueous, mineral, gas, solid-solution, surface-complexation, and ion-exchange equilibria, and specified mole transfers of reactants, kinetically controlled reactions, mixing of solutions, and pressure and temperature changes; and (3) inverse modeling, which finds sets of mineral and gas mole transfers that account for differences in composition between waters within specified compositional uncertainty limits. Many new modeling features were added to PHREEQC version 3 relative to version 2. The Pitzer aqueous model (pitzer.dat database, with keyword PITZER) can be used for high-salinity waters that are beyond the range of application for the Debye-Hückel theory. The Peng-Robinson equation of state has been implemented for calculating the solubility of gases at high pressure. Specific volumes of aqueous species are calculated as a function of the dielectric properties of water and the ionic strength of the solution, which allows calculation of pressure effects on chemical reactions and the density of a solution. The specific conductance and the density of a solution are calculated and printed in the output file. In addition to Runge-Kutta integration, a stiff ordinary differential equation solver (CVODE) has been included for kinetic calculations with multiple rates that occur at widely different time scales. Surface complexation can be calculated with the CD-MUSIC (Charge Distribution MUltiSIte Complexation) triple-layer model in addition to the diffuse-layer model. The composition of the electrical double layer of a surface can be estimated by using the Donnan approach, which is more robust and faster than the alternative Borkovec-Westall integration. Multicomponent diffusion, diffusion in the electrostatic double layer on a surface, and transport of colloids with simultaneous surface complexation have been added to the transport module. A series of keyword data blocks has been added for isotope calculations—ISOTOPES, CALCULATE_VALUES, ISOTOPE_ALPHAS, ISOTOPE_RATIOS, and NAMED_EXPRESSIONS. Solution isotopic data can be input in conventional units (for example, permil, percent modern carbon, or tritium units) and the numbers are converted to moles of isotope by PHREEQC. The isotopes are treated as individual components (they must be defined as individual master species) so that each isotope has its own set of aqueous species, gases, and solids. The isotope-related keywords allow calculating equilibrium fractionation of isotopes among the species and phases of a system. The calculated isotopic compositions are printed in easily readable conventional units. New keywords and options facilitate the setup of input files and the interpretation of the results. Keyword data blocks can be copied (keyword COPY) and deleted (keyword DELETE). 
Keyword data items can be altered by using the keyword data blocks with the _MODIFY extension, and a simulation can be run with all reactants of a given index number (keyword RUN_CELLS). The definition of the complete chemical state of all reactants of PHREEQC can be saved in a file in a raw data format (DUMP and _RAW keywords). The file can be read as part of another input file with the INCLUDE$ keyword. These keywords facilitate the use of IPhreeqc, which is a module implementing all PHREEQC version 3 capabilities; the module is designed to be used in other programs that need to implement geochemical calculations, for example, transport codes. Charting capabilities have been added to Windows distributions of PHREEQC version 3. (Charting on Linux requires installation of Wine.) The keyword data block USER_GRAPH allows selection of data for plotting and manipulation of chart appearance. Almost any results from geochemical simulations (for example, concentrations, activities, or saturation indices) can be retrieved by using Basic language functions and specified as data for plotting in USER_GRAPH. Results of transport simulations can be plotted against distance or time. Data can be added to a chart from tab-separated-values files. All input for PHREEQC version 3 is defined in keyword data blocks, each of which may have a series of identifiers for specific types of data. This report provides a complete description of each keyword data block and its associated identifiers. Input files for 22 examples that demonstrate most of the capabilities of PHREEQC version 3 are described and the results of the example simulations are presented and discussed.
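
    PHREEQC input is organized in keyword data blocks such as SOLUTION and SELECTED_OUTPUT, and the IPhreeqc module mentioned above lets other programs run such input directly. The sketch below assumes a Python wrapper in the style of the phreeqpy package; the module path, the method names (load_database, run_string, get_selected_output_array), and the database filename are assumptions that may differ in a given installation.

      # Sketch of driving PHREEQC through an IPhreeqc-style Python wrapper.
      # ASSUMPTIONS: a phreeqpy-like API with load_database / run_string /
      # get_selected_output_array, and a local phreeqc.dat database file.
      from phreeqpy.iphreeqc.phreeqc_dll import IPhreeqc

      input_deck = """
      SOLUTION 1  Simple speciation example
          temp     25
          pH       7.0
          units    mmol/kgw
          Ca       1.0
          C(4)     2.0
      SELECTED_OUTPUT
          -saturation_indices  Calcite
      END
      """

      ip = IPhreeqc()
      ip.load_database("phreeqc.dat")       # assumed path to the standard database
      ip.run_string(input_deck)             # keyword data blocks as described above
      print(ip.get_selected_output_array()) # header row plus one row per step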

  4. Enhancements to the caliop aerosol subtyping and lidar ratio selection algorithms for level II version 4

    NASA Astrophysics Data System (ADS)

    Omar, A.; Tackett, J.; Kim, M.-H.; Vaughan, M.; Kar, J.; Trepte, C.; Winker, D.

    2018-04-01

    Several enhancements have been implemented for the version 4 aerosol subtyping and lidar ratio selection algorithms of Cloud Aerosol Lidar with Orthogonal Polarization (CALIOP). Version 4 eliminates the confusion between smoke and clean marine aerosols seen in version 3 by modifications to the elevated layer flag definitions used to identify smoke aerosols over the ocean. To differentiate between mixtures of dust and smoke, and dust and marine aerosols, a new aerosol type will be added in the version 4 data products. In the marine boundary layer, moderately depolarizing aerosols are no longer modeled as mixtures of dust and smoke (polluted dust) but rather as mixtures of dust and seasalt (dusty marine). Some lidar ratios have been updated in the version 4 algorithms. In particular, the dust lidar ratios have been adjusted to reflect the latest measurements and model studies.
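
    The reclassification described above, in which a moderately depolarizing layer in the marine boundary layer is typed as dusty marine rather than polluted dust, is essentially a threshold decision. The sketch below is a purely illustrative rendering of that logic; the depolarization cutoffs and lidar ratios are placeholders, not the values used in the operational version 4 algorithm.

      def classify_marine_layer_aerosol(depol_ratio, in_marine_boundary_layer):
          """Illustrative version-4-style subtyping for a layer over the ocean.

          The thresholds and lidar ratios (sr) below are placeholders, not the
          values used in the operational CALIOP version 4 algorithm.
          """
          if not in_marine_boundary_layer:
              return "elevated layer - handled by separate smoke/dust logic", None
          if depol_ratio < 0.05:
              return "clean marine", 23.0
          if depol_ratio < 0.20:
              return "dusty marine", 37.0     # dust + sea salt mixture (new type)
          return "dust", 44.0

      print(classify_marine_layer_aerosol(0.12, True))   # ('dusty marine', 37.0)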

  5. Standards for detailed clinical models as the basis for medical data exchange and decision support.

    PubMed

    Coyle, Joseph F; Mori, Angelo Rossi; Huff, Stanley M

    2003-03-01

    Detailed clinical models are necessary to exchange medical data between heterogeneous computer systems and to maintain consistency in a longitudinal electronic medical record system. At Intermountain Health Care (IHC), we have a history of designing detailed clinical models. The purpose of this paper is to share our experience and the lessons we have learned over the last 5 years. IHC's newest model is implemented using eXtensible Markup Language (XML) Schema as the formalism, and conforms to the Health Level Seven (HL7) version 3 data types. The centerpiece of the new strategy is the Clinical Event Model, which is a flexible name-value pair data structure that is tightly linked to a coded terminology. We describe IHC's third-generation strategy for representing and implementing detailed clinical models, and discuss the reasons for this design.
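
    The Clinical Event Model described above is a name-value pair structure tightly linked to a coded terminology. A minimal, hypothetical Python rendering of that idea (not IHC's XML Schema implementation and not an HL7 version 3 artifact) might look like the following; the codes shown are for illustration only.

      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class CodedConcept:
          """A term drawn from a controlled terminology (code system + code)."""
          system: str
          code: str
          display: str

      @dataclass
      class ClinicalEvent:
          """Name-value pair event: the 'name' is a coded concept, the value is
          typed, and qualifiers are themselves nested events (illustrative only)."""
          name: CodedConcept
          value: object
          units: str = ""
          qualifiers: List["ClinicalEvent"] = field(default_factory=list)

      hr = ClinicalEvent(
          name=CodedConcept("LOINC", "8867-4", "Heart rate"),
          value=72, units="beats/min",
          qualifiers=[ClinicalEvent(CodedConcept("SNOMED CT", "40199007", "Supine position"), True)],
      )
      print(hr.name.display, hr.value, hr.units)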

  6. The dynamic radiation environment assimilation model (DREAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reeves, Geoffrey D; Koller, Josef; Tokar, Robert L

    2010-01-01

    The Dynamic Radiation Environment Assimilation Model (DREAM) is a 3-year effort sponsored by the US Department of Energy to provide global, retrospective, or real-time specification of the natural and potential nuclear radiation environments. The DREAM model uses Kalman filtering techniques that combine the strengths of new physical models of the radiation belts with electron observations from long-term satellite systems such as GPS and geosynchronous systems. DREAM includes a physics model for the production and long-term evolution of artificial radiation belts from high altitude nuclear explosions. DREAM has been validated against satellites in arbitrary orbits and consistently produces more accurate results than existing models. Tools for user-specific applications and graphical displays are in beta testing and a real-time version of DREAM has been in continuous operation since November 2009.
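
    DREAM combines a physics-based forecast of the radiation belts with satellite electron observations through Kalman filtering. A generic, textbook Kalman analysis step, not DREAM's actual implementation, looks like this:

      import numpy as np

      def kalman_update(x_forecast, P_forecast, y_obs, H, R):
          """One generic Kalman analysis step: blend the forecast state with
          observations according to their error covariances (P and R)."""
          S = H @ P_forecast @ H.T + R               # innovation covariance
          K = P_forecast @ H.T @ np.linalg.inv(S)    # Kalman gain
          x_analysis = x_forecast + K @ (y_obs - H @ x_forecast)
          P_analysis = (np.eye(len(x_forecast)) - K @ H) @ P_forecast
          return x_analysis, P_analysis

      # Toy example: 3-element state (e.g. electron flux bins), 1 observation.
      x = np.array([1.0, 2.0, 3.0])
      P = np.diag([0.5, 0.5, 0.5])
      H = np.array([[0.0, 1.0, 0.0]])                # observe the second bin
      xa, Pa = kalman_update(x, P, np.array([2.4]), H, np.array([[0.1]]))
      print(xa)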

  7. MERRA-2: File Specification

    NASA Technical Reports Server (NTRS)

    Bosilovich, M. G.; Lucchesi, R.; Suarez, M.

    2015-01-01

    The second Modern-Era Retrospective analysis for Research and Applications (MERRA-2) is a NASA atmospheric reanalysis that begins in 1980. It replaces the original MERRA reanalysis (Rienecker et al., 2011) using an upgraded version of the Goddard Earth Observing System Model, Version 5 (GEOS-5) data assimilation system. The file collections for MERRA-2 are described in detail in this document, including some important changes from those of the MERRA dataset (Lucchesi, 2012).

  8. A new vector radiative transfer model as a part of SCIATRAN 3.0 software package.

    NASA Astrophysics Data System (ADS)

    Rozanov, Alexei; Rozanov, Vladimir; Burrows, John P.

    The SCIATRAN 3.0 package is a result of further development of the SCIATRAN 2.x software family which, similar to previous versions, comprises a radiative transfer model and a retrieval block. A major improvement was achieved in comparison to previous software versions by adding the vector mode to the radiative transfer model. Thus, the well-established Discrete Ordinate solver can now be run in the vector mode to calculate the scattered solar radiation including polarization, i.e., to simulate all four components of the Stokes vector. Similar to the scalar version, the simulations can be performed for any viewing geometry typical for atmospheric observations in the UV-Vis-NIR spectral range (nadir, limb, off-axis, etc.) as well as for any observer position within or outside the Earth's atmosphere. Similar to the precursor version, the new model is freely available for non-commercial use via the web page of the University of Bremen. In this presentation a short description of the software package, especially of the new vector radiative transfer model, will be given, including remarks on the availability for the scientific community. Furthermore, comparisons to other vector models will be shown and some example problems will be considered where the polarization of the observed radiation must be accounted for to obtain high-quality results.
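
    The vector mode computes all four components of the Stokes vector (I, Q, U, V). As a reminder of what those components encode, the short sketch below derives the degree and angle of linear polarization from a Stokes vector; the numbers are arbitrary.

      import numpy as np

      # Arbitrary Stokes vector (I, Q, U, V) for illustration.
      I, Q, U, V = 1.00, 0.12, 0.05, 0.00

      dolp = np.hypot(Q, U) / I                 # degree of linear polarization
      aolp = 0.5 * np.degrees(np.arctan2(U, Q)) # angle of linear polarization
      dop = np.sqrt(Q**2 + U**2 + V**2) / I     # total degree of polarization

      print(f"DoLP = {dolp:.3f}, AoLP = {aolp:.1f} deg, DoP = {dop:.3f}")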

  9. Molecular Signatures and Diagnostic Biomarkers of Cumulative, Blast-Graded Mild TBI

    DTIC Science & Technology

    2013-10-01

    Kirk, Department of Mechanical and Aerospace Engineering, Florida Institute of Technology, Melbourne, FL 32901, January 3rd, 2013 2. Prima V...induced Neurotrauma “Neuro-glial and systemic mechanisms of pathological responses in rat models of primary blast overpressure compared to "composite...inflammation biomarkers such as L-selectin and s-ICAM involved in molecular mechanisms of blast-induced injury. The FIT prototype sensor (version 1) to

  10. Railroad Performance Model

    DOT National Transportation Integrated Search

    1977-10-01

    This report describes an operational, though preliminary, version of the Railroad Performance Model, which is a computer simulation model of the nation's railroad system. The ultimate purpose of this model is to predict the effect of changes in gover...

  11. Collaborative Modelling of the Vascular System--Designing and Evaluating a New Learning Method for Secondary Students

    ERIC Educational Resources Information Center

    Haugwitz, Marion; Sandmann, Angela

    2010-01-01

    Understanding biological structures and functions is often difficult because of their complexity and micro-structure. For example, the vascular system is a complex and only partly visible system. Constructing models to better understand biological functions is seen as a suitable learning method. Models function as simplified versions of real…

  12. The Community Climate System Model Version 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gent, Peter R.; Danabasoglu, Gokhan; Donner, Leo J.

    The fourth version of the Community Climate System Model (CCSM4) was recently completed and released to the climate community. This paper describes developments to all the CCSM components, and documents fully coupled pre-industrial control runs compared to the previous version, CCSM3. Using the standard atmosphere and land resolution of 1° results in sea surface temperature biases in the major upwelling regions that are comparable to those of the 1.4° resolution CCSM3. Two changes to the deep convection scheme in the atmosphere component result in the CCSM4 producing El Nino/Southern Oscillation variability with a much more realistic frequency distribution than the CCSM3, although the amplitude is too large compared to observations. They also improve the representation of the Madden-Julian Oscillation, and the frequency distribution of tropical precipitation. A new overflow parameterization in the ocean component leads to an improved simulation of the deep ocean density structure, especially in the North Atlantic. Changes to the CCSM4 land component lead to a much improved annual cycle of water storage, especially in the tropics. The CCSM4 sea ice component uses much more realistic albedos than the CCSM3, and the Arctic sea ice concentration is improved in the CCSM4. An ensemble of 20th century simulations produces an excellent match to the observed September Arctic sea ice extent from 1979 to 2005. The CCSM4 ensemble mean increase in globally-averaged surface temperature between 1850 and 2005 is larger than the observed increase by about 0.4 °C. This is consistent with the fact that the CCSM4 does not include a representation of the indirect effects of aerosols, although other factors may come into play. The CCSM4 still has significant biases, such as the mean precipitation distribution in the tropical Pacific Ocean, too much low cloud in the Arctic, and the latitudinal distributions of short-wave and long-wave cloud forcings.

  13. Data analysis environment (DASH2000) for the Subaru telescope

    NASA Astrophysics Data System (ADS)

    Mizumoto, Yoshihiko; Yagi, Masafumi; Chikada, Yoshihiro; Ogasawara, Ryusuke; Kosugi, George; Takata, Tadafumi; Yoshida, Michitoshi; Ishihara, Yasuhide; Yanaka, Hiroshi; Yamamoto, Tadahiro; Morita, Yasuhiro; Nakamoto, Hiroyuki

    2000-06-01

    A new framework for the data analysis system (DASH) has been developed for the SUBARU Telescope. It is designed using object-oriented methodology and adopts a restaurant model. DASH shares the CPU and I/O load among distributed heterogeneous computers. The distributed object environment of the system is implemented with Java and CORBA. DASH has been evaluated through several prototypes. DASH2000 is the latest version, which will be released as the beta version of the data analysis system for the SUBARU Telescope.

  14. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development processes interact with each other at different phases throughout the development life cycle; (2) Reviews are emphasized in both system and software development; for some reviews (e.g., SRR, PDR, CDR), there are both system versions and software versions; (3) Analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; (4) Reviews are expected to use the outputs of the analysis techniques. In other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  15. System for assessing Aviation's Global Emissions (SAGE), part 1 : model description and inventory results

    DOT National Transportation Integrated Search

    2007-07-01

    In early 2001, the US Federal Aviation Administration embarked on a multi-year effort to develop a new computer model, the System for assessing Aviation's Global Emissions (SAGE). Currently at Version 1.5, the basic use of the model has centered on t...

  16. Umbra (core)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jon David; Oppel III, Fred J.; Hart, Brian E.

    Umbra is a flexible simulation framework for complex systems that can be used by itself for modeling, simulation, and analysis, or to create specific applications. It has been applied to many operations, primarily dealing with robotics and system-of-systems simulations. This version, from 4.8 to 4.8.3b, incorporates bug fixes, refactored code, and new managed C++ wrapper code that can be used to bridge new applications written in C# to the C++ libraries. The new managed C++ wrapper code includes the projects/directories BasicSimulation, CSharpUmbraInterpreter, LogFileView, UmbraAboutBox, UmbraControls, UmbraMonitor, and UmbraWrapper.

  17. Transforming reflectance spectra into Munsell color space by using prime colors.

    PubMed

    Romney, A Kimball; Fulton, James T

    2006-10-17

    Independent researchers have proved mathematically that, given a set of color-matching functions, there exists a unique set of three monochromatic spectral lights that optimizes luminous efficiency and color gamut. These lights are called prime colors. We present a method for transforming reflectance spectra into Munsell color space by using hypothetical absorbance curves based on Gaussian approximations of the prime colors and a simplified version of opponent process theory. The derived color appearance system is represented as a 3D color system that is qualitatively similar to a conceptual representation of the Munsell color system. We illustrate the application of the model and compare it with existing models by using reflectance spectra obtained from 1,269 Munsell color samples.
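
    The method projects reflectance spectra onto Gaussian approximations of the three prime colors and then applies a simplified opponent transform. The sketch below shows only that projection step with invented Gaussian centers and widths and a crude opponent combination; it is not the authors' calibrated model.

      import numpy as np

      wavelength = np.arange(400, 701, 10)               # nm, visible range

      def gaussian(center, width):
          return np.exp(-0.5 * ((wavelength - center) / width) ** 2)

      # Hypothetical absorbance curves centered near typical prime-color
      # wavelengths (roughly 450, 530 and 610 nm); the widths are invented.
      primes = np.stack([gaussian(450, 30), gaussian(530, 35), gaussian(610, 35)])

      # A flat (gray) reflectance spectrum as a stand-in for a Munsell sample.
      reflectance = np.full_like(wavelength, 0.5, dtype=float)

      # Project the spectrum onto the three curves (discrete inner products),
      # then form crude achromatic and opponent channels for illustration.
      responses = primes @ reflectance
      achromatic = responses.sum()
      red_green = responses[2] - responses[1]
      yellow_blue = 0.5 * (responses[1] + responses[2]) - responses[0]
      print(achromatic, red_green, yellow_blue)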

  18. A multi-decade record of high-quality fCO2 data in version 3 of the Surface Ocean CO2 Atlas (SOCAT)

    USGS Publications Warehouse

    Bakker, Dorothee; Landa, Camilla S.; Pfeil, Benjamin; Metzl, Nicolas; O’Brien, Kevin; Olsen, Are; Smith, Karl; Cosca, Cathy; Harasawa, Sumiko; Nakaoka, Shin-ichiro; Jones, Stephen; Nojiri, Yukihiro; Steinhoff, Tobias; Sweeney, Colm; Schuster, Ute; Takahashi, Taro; Tilbrook, Bronte; Wada, Chisato; Wanninkhof, Rik; Alin, Simone R.; Balestrini, Carlos F.; Barbero, Leticia; Bates, Nicholas; Bianchi, Alejandro A.; Bonou, Frédéric; Boutin, Jacqueline; Bozec, Yann; Burger, Eugene F.; Cai, Wei-Jun; Castle, Robert D.; Chen, Liqi; Chierici, Melissa; Currie, Kim; Evans, Wiley; Featherstone, Charles; Feely, Richard; Fransson, Agneta; Goyet, Catherine; Greenwood, Naomi; Gregor, Luke; Hankin, Steven C.; Hardman-Mountford, Nick J.; Harlay, Jérôme; Hauck, Judith; Hoppema, Mario; Humphreys, Matthew P.; Hunt, Christopher W.; Huss, Betty; Ibánhez, J. Severino P.; Johannessen, Truls; Keeling, Ralph F.; Kitidis, Vassilis; Körtzinger, Arne; Kozyr, Alex; Krasakopoulou, Evangelia; Kuwata, Akira; Landschützer, Peter; Lauvset, Siv K.; Lefèvre, Nathalie; Lo Monaco, Claire; Manke, Ansley; Mathis, Jeremy T.; Merlivat, Liliane; Millero, Frank J.; Monteiro, Pedro M. S.; Munro, David R.; Murata, Akihiko; Newberger, Timothy; Omar, Abdirahman M.; Ono, Tsuneo; Paterson, Kristina; Pearce, David; Pierrot, Denis; Robbins, Lisa L.; Saito, Shu; Salisbury, Joe; Schlitzer, Reiner; Schneider, Bernd; Schweitzer, Roland; Sieger, Rainer; Skjelvan, Ingunn; Sullivan, Kevin F.; Sutherland, Stewart C.; Sutton, Adrienne J.; Tadokoro, Kazuaki; Telszewski, Maciej; Tuma, Matthias; van Heuven, Steven M. A. C.; Vandemark, Douglas; Ward, Brian; Watson, Andrew J.; Xu, Suqing

    2016-01-01

    The Surface Ocean CO2 Atlas (SOCAT) is a synthesis of quality-controlled fCO2 (fugacity of carbon dioxide) values for the global surface oceans and coastal seas with regular updates. Version 3 of SOCAT has 14.7 million fCO2 values from 3646 data sets covering the years 1957 to 2014. This latest version has an additional 4.6 million fCO2 values relative to version 2 and extends the record from 2011 to 2014. Version 3 also significantly increases the data availability for 2005 to 2013. SOCAT has an average of approximately 1.2 million surface water fCO2 values per year for the years 2006 to 2012. Quality and documentation of the data have improved. A new feature is the data set quality control (QC) flag of E for data from alternative sensors and platforms. The accuracy of surface water fCO2 has been defined for all data set QC flags. Automated range checking has been carried out for all data sets during their upload into SOCAT. The upgrade of the interactive Data Set Viewer (previously known as the Cruise Data Viewer) allows better interrogation of the SOCAT data collection and rapid creation of high-quality figures for scientific presentations. Automated data upload has been launched for version 4 and will enable more frequent SOCAT releases in the future. High-profile scientific applications of SOCAT include quantification of the ocean sink for atmospheric carbon dioxide and its long-term variation, detection of ocean acidification, as well as evaluation of coupled-climate and ocean-only biogeochemical models. Users of SOCAT data products are urged to acknowledge the contribution of data providers, as stated in the SOCAT Fair Data Use Statement. This ESSD (Earth System Science Data) “living data” publication documents the methods and data sets used for the assembly of this new version of the SOCAT data collection and compares these with those used for earlier versions of the data collection (Pfeil et al., 2013; Sabine et al., 2013; Bakker et al., 2014).

  19. The Community Climate System Model.

    NASA Astrophysics Data System (ADS)

    Blackmon, Maurice; Boville, Byron; Bryan, Frank; Dickinson, Robert; Gent, Peter; Kiehl, Jeffrey; Moritz, Richard; Randall, David; Shukla, Jagadish; Solomon, Susan; Bonan, Gordon; Doney, Scott; Fung, Inez; Hack, James; Hunke, Elizabeth; Hurrell, James; Kutzbach, John; Meehl, Jerry; Otto-Bliesner, Bette; Saravanan, R.; Schneider, Edwin K.; Sloan, Lisa; Spall, Michael; Taylor, Karl; Tribbia, Joseph; Washington, Warren

    2001-11-01

    The Community Climate System Model (CCSM) has been created to represent the principal components of the climate system and their interactions. Development and applications of the model are carried out by the U.S. climate research community, thus taking advantage of both wide intellectual participation and computing capabilities beyond those available to most individual U.S. institutions. This article outlines the history of the CCSM, its current capabilities, and plans for its future development and applications, with the goal of providing a summary useful to present and future users. The initial version of the CCSM included atmosphere and ocean general circulation models, a land surface model that was grafted onto the atmosphere model, a sea-ice model, and a flux coupler that facilitates information exchanges among the component models with their differing grids. This version of the model produced a successful 300-yr simulation of the current climate without artificial flux adjustments. The model was then used to perform a coupled simulation in which the atmospheric CO2 concentration increased by 1% per year. In this version of the coupled model, the ocean salinity and deep-ocean temperature slowly drifted away from observed values. A subsequent correction to the roughness length used for sea ice significantly reduced these errors. An updated version of the CCSM was used to perform three simulations of the twentieth century's climate, and several projections of the climate of the twenty-first century. The CCSM's simulation of the tropical ocean circulation has been significantly improved by reducing the background vertical diffusivity and incorporating an anisotropic horizontal viscosity tensor. The meridional resolution of the ocean model was also refined near the equator. These changes have resulted in a greatly improved simulation of both the Pacific equatorial undercurrent and the surface countercurrents. The interannual variability of the sea surface temperature in the central and eastern tropical Pacific is also more realistic in simulations with the updated model. Scientific challenges to be addressed with future versions of the CCSM include realistic simulation of the whole atmosphere, including the middle and upper atmosphere, as well as the troposphere; simulation of changes in the chemical composition of the atmosphere through the incorporation of an integrated chemistry model; inclusion of global, prognostic biogeochemical components for land, ocean, and atmosphere; simulations of past climates, including times of extensive continental glaciation as well as times with little or no ice; studies of natural climate variability on seasonal-to-centennial timescales; and investigations of anthropogenic climate change. In order to make such studies possible, work is under way to improve all components of the model. Plans call for a new version of the CCSM to be released in 2002. Planned studies with the CCSM will require much more computer power than is currently available.

  20. The SRFR 5 modeling system for surface irrigation

    USDA-ARS?s Scientific Manuscript database

    The SRFR program is a modeling system for surface irrigation. It is a central component of WinSRFR, a software package for the hydraulic analysis of surface irrigation systems. SRFR solves simplified versions of the equations of unsteady open channel flow coupled to a user selected infiltration mod...

  1. OASIS: PARAMETER ESTIMATION SYSTEM FOR AQUIFER RESTORATION MODELS, USER'S MANUAL VERSION 2.0

    EPA Science Inventory

    OASIS, a decision support system for ground water contaminant modeling, has been developed for the EPA by Rice University, through the National Center for Ground Water Research. As a decision support system, OASIS was designed to provide a set of tools which will help scientists ...

  2. Phenology of forest-grassland transition zones in the Community Land Model

    NASA Astrophysics Data System (ADS)

    Dahlin, K.; Fisher, R. A.

    2013-12-01

    Forest-grassland transition zones (savannas, woodlands, wooded grasslands, and shrublands) are highly sensitive to climate and may already be changing due to warming, changes in precipitation patterns, and/or CO2 fertilization. Shifts between closed canopy forest and open grassland, as well as shifts in phenology, could have large impacts on the global carbon cycle, water balance, albedo, and on the humans and other animals that depend on these regions. From an earth system perspective these impacts may then feed back into the climate system and impact how, when, and where climate change occurs. Here we compare 29 years of monthly leaf area index (LAI) outputs from several offline versions of the Community Land Model (CLM), the land component of the Community Earth System Model, to LAI derived from the AVHRR NDVI3g product (LAI3g). Specifically, we focus on seasonal patterns in regions dominated by tropical broadleaved deciduous trees (T-BDT), broadleaved deciduous shrubs (BDS) and grasslands (C3 and C4) in CLM, all of which follow a 'stress deciduous' phenological algorithm. We consider and compare two versions of CLM (v. 4CN and v. 4.5BGC) to the satellite derived product. We found that both versions of CLM were able to capture seasonal variations in grasslands relatively well at the regional scale, but that the 'stress deciduous' phenology algorithm did not perform well in areas dominated by T-BDT or BDS. When we compared the performance of the models at single points we found slight improvements in CLM4.5BGC over CLM4CN, but generally that the magnitude of seasonality was too low in CLM as compared to the LAI3g satellite product. To explore the parameters within CLM that had the most leverage on seasonality of LAI, we used a Latin hypercube approach to vary values for critical soil water potential (threshold at which plants drop leaves), the critical number of days that soil water potential must be too low for leaves to drop, and the carbon allocation scheme. In single-point simulations we found that changing how carbon is allocated improved the 'flat-topped' nature of the CLM LAI during summer, which is not present in LAI3g, while adjustments to the soil water potential parameters allowed for less extreme and fewer switches between leaf-on and leaf-off. Future work will include applying a subset of the new parameter values to global runs of the model to assess whether the improvements to phenology at single points improve global phenological patterns and/or other components of the CLM carbon cycle.
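
    The parameter exploration described above varies three phenology controls with a Latin hypercube design, which scipy's quasi-Monte Carlo module provides directly. The bounds below are placeholders, not the values used in the study.

      import numpy as np
      from scipy.stats import qmc

      # Latin hypercube sample over three phenology parameters (bounds invented):
      # critical soil water potential (mm of suction), critical dry-day count,
      # and a carbon-allocation weighting factor.
      sampler = qmc.LatinHypercube(d=3, seed=0)
      unit_sample = sampler.random(n=20)              # 20 parameter sets in [0, 1)^3
      lower = [-275000.0, 5.0, 0.2]
      upper = [-50000.0, 30.0, 0.8]
      param_sets = qmc.scale(unit_sample, lower, upper)

      for psi_crit, ndays_crit, alloc in param_sets[:3]:
          print(f"psi_crit={psi_crit:.0f}  ndays={ndays_crit:.1f}  alloc={alloc:.2f}")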

  3. Comparison of the Tangent Linear Properties of Tracer Transport Schemes Applied to Geophysical Problems.

    NASA Technical Reports Server (NTRS)

    Kent, James; Holdaway, Daniel

    2015-01-01

    A number of geophysical applications require the use of the linearized version of the full model. One such example is in numerical weather prediction, where the tangent linear and adjoint versions of the atmospheric model are required for the 4DVAR inverse problem. The part of the model that represents the resolved scale processes of the atmosphere is known as the dynamical core. Advection, or transport, is performed by the dynamical core. It is a central process in many geophysical applications and is a process that often has a quasi-linear underlying behavior. However, over the decades since the advent of numerical modelling, significant effort has gone into developing many flavors of high-order, shape preserving, nonoscillatory, positive definite advection schemes. These schemes are excellent in terms of transporting the quantities of interest in the dynamical core, but they introduce nonlinearity through the use of nonlinear limiters. The linearity of the transport schemes used in Goddard Earth Observing System version 5 (GEOS-5), as well as a number of other schemes, is analyzed using a simple 1D setup. The linearized version of GEOS-5 is then tested using a linear third order scheme in the tangent linear version.
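
    A standard way to quantify the (non)linearity discussed above is the tangent linear test: compare the nonlinear model's response to a small perturbation with the linearized response and check that their ratio approaches one as the perturbation shrinks. The sketch below applies the test to a first-order upwind advection step, which is exactly linear, so the ratio is one to machine precision; a limiter-based scheme would not pass as cleanly.

      import numpy as np

      def upwind_step(q, c=0.5):
          """One step of first-order upwind advection (periodic), Courant number c.
          This scheme is linear in q, unlike limiter-based schemes."""
          return q - c * (q - np.roll(q, 1))

      rng = np.random.default_rng(3)
      q0 = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
      dq = rng.standard_normal(64)

      for eps in (1e-1, 1e-3, 1e-6):
          nonlinear_diff = upwind_step(q0 + eps * dq) - upwind_step(q0)
          tl_response = eps * upwind_step(dq)     # the scheme is linear, so TLM == scheme
          ratio = np.linalg.norm(nonlinear_diff) / np.linalg.norm(tl_response)
          print(f"eps={eps:.0e}  ratio={ratio:.12f}")   # ~1.0 for a linear scheme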

  4. Preliminary Thermal Modeling of Hi-Storm 100S-218 Version B Storage Modules at Hope Creek Nuclear Power Station ISFSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuta, Judith M.; Adkins, Harold E.

    2013-08-30

    This report fulfills the M3 milestone M3FT-13PN0810022, “Report on Inspection 1”, under Work Package FT-13PN081002. Thermal analysis is being undertaken at Pacific Northwest National Laboratory (PNNL) in support of inspections of selected storage modules at various locations around the United States, as part of the Used Fuel Disposition Campaign of the U.S. Department of Energy, Office of Nuclear Energy (DOE-NE) Fuel Cycle Research and Development. This report documents pre-inspection predictions of temperatures for four modules at the Hope Creek Nuclear Generating Station ISFSI that have been identified as candidates for inspection in late summer or early fall/winter of 2013. These are HI-STORM 100S-218 Version B modules storing BWR 8x8 fuel in MPC-68 canisters. The temperature predictions reported in this document were obtained with detailed COBRA-SFS models of these four storage systems, with the following boundary conditions and assumptions.

  5. Status of the NASA GMAO Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, Nikki C.; Errico, Ronald M.

    2014-01-01

    An Observing System Simulation Experiment (OSSE) is a pure modeling study used when actual observations are too expensive or difficult to obtain. OSSEs are valuable tools for determining the potential impact of new observing systems on numerical weather forecasts and for evaluation of data assimilation systems (DAS). An OSSE has been developed at the NASA Global Modeling and Assimilation Office (GMAO; Errico et al., 2013). The GMAO OSSE uses a 13-month integration of the European Centre for Medium-Range Weather Forecasts 2005 operational model at T511/L91 resolution for the Nature Run (NR). Synthetic observations have been updated so that they are based on real observations during the summer of 2013. The emulated observation types include AMSU-A, MHS, IASI, AIRS, and HIRS4 radiance data, GPS-RO, and conventional types including aircraft, rawinsonde, profiler, surface, and satellite winds. The synthetic satellite wind observations are colocated with the NR cloud fields, and the rawinsondes are advected during ascent using the NR wind fields. Data counts for the synthetic observations are matched as closely as possible to real data counts, as shown in Figure 2. Errors are added to the synthetic observations to emulate representativeness and instrument errors. The synthetic errors are calibrated so that the statistics of observation innovation and analysis increments in the OSSE are similar to the same statistics for assimilation of real observations, in an iterative method described by Errico et al. (2013). The standard deviations of observation minus forecast (xo - H(xb)) are compared for the OSSE and real data in Figure 3. The synthetic errors include both random, uncorrelated errors, and an additional correlated error component for some observational types. Vertically correlated errors are included for conventional sounding data and GPS-RO, and channel-correlated errors are introduced to AIRS and IASI (Figure 4). HIRS, AMSU-A, and MHS have a component of horizontally correlated error. The forecast model used by the GMAO OSSE is the Goddard Earth Observing System Model, Version 5 (GEOS-5) with the Gridpoint Statistical Interpolation (GSI) DAS. The model version has been updated to v. 5.13.3, corresponding to the current operational model. Forecasts are run on a cube-sphere grid with 180 points along each edge of the cube (approximately 0.5 degree horizontal resolution) with 72 vertical levels. The DAS is cycled at 6-hour intervals, with 240-hour forecasts launched daily at 0000 UTC. Evaluation of the forecasting skill for July and August is currently underway. Prior versions of the GMAO OSSE have been found to have greater forecasting skill than real-world forecasts. It is anticipated that similar forecast skill will be found in the updated OSSE.
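
    The vertically or channel-correlated synthetic errors mentioned above can be generated by drawing uncorrelated Gaussian noise and multiplying by the Cholesky factor of the desired correlation matrix. A generic sketch, with the correlation length and error standard deviation as placeholders:

      import numpy as np

      n_levels = 40
      sigma_obs = 0.8                                    # placeholder error std dev
      levels = np.arange(n_levels)

      # Target vertical correlation: exponential decay with a 5-level length scale.
      corr = np.exp(-np.abs(levels[:, None] - levels[None, :]) / 5.0)
      chol = np.linalg.cholesky(corr)

      rng = np.random.default_rng(4)
      white = rng.standard_normal(n_levels)              # uncorrelated draws
      correlated_error = sigma_obs * (chol @ white)      # vertically correlated error

      # Sample correlation of adjacent levels over many draws approaches exp(-1/5).
      draws = sigma_obs * (chol @ rng.standard_normal((n_levels, 10000)))
      print(np.corrcoef(draws[0], draws[1])[0, 1], np.exp(-1 / 5))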

  6. SMMP v. 3.0—Simulating proteins and protein interactions in Python and Fortran

    NASA Astrophysics Data System (ADS)

    Meinke, Jan H.; Mohanty, Sandipan; Eisenmenger, Frank; Hansmann, Ulrich H. E.

    2008-03-01

    We describe a revised and updated version of the program package SMMP. SMMP is an open-source FORTRAN package for molecular simulation of proteins within the standard geometry model. It is designed as a simple and inexpensive tool for researchers and students to become familiar with protein simulation techniques. SMMP 3.0 sports a revised API increasing its flexibility, an implementation of the Lund force field, multi-molecule simulations, a parallel implementation of the energy function, Python bindings, and more. Program summary: Title of program: SMMP. Catalogue identifier: ADOJ_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADOJ_v3_0.html. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. Programming language used: FORTRAN, Python. No. of lines in distributed program, including test data, etc.: 52 105. No. of bytes in distributed program, including test data, etc.: 599 150. Distribution format: tar.gz. Computer: Platform independent. Operating system: OS independent. RAM: 2 Mbytes. Classification: 3. Does the new version supersede the previous version?: Yes. Nature of problem: Molecular mechanics computations and Monte Carlo simulation of proteins. Solution method: Utilizes ECEPP2/3, FLEX, and Lund potentials. Includes Monte Carlo simulation algorithms for canonical, as well as for generalized ensembles. Reasons for new version: API changes and increased functionality. Summary of revisions: Added Lund potential; parameters used in subroutines are now passed as arguments; multi-molecule simulations; parallelized energy calculation for ECEPP; Python bindings. Restrictions: The consumed CPU time increases with the size of the protein molecule. Running time: Depends on the size of the simulated molecule.

  7. A Revised Thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM Version 3.4)

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Johnson, D. L.; James, B. F.

    1996-01-01

    This report describes the newly-revised model thermosphere for the Mars Global Reference Atmospheric Model (Mars-GRAM, Version 3.4). It also provides descriptions of other changes made to the program since publication of the programmer's guide for Mars-GRAM Version 3.34. The original Mars-GRAM model thermosphere was based on the global-mean model of Stewart. The revised thermosphere is based largely on parameterizations derived from output data from the three-dimensional Mars Thermospheric Global Circulation Model (MTGCM). The new thermospheric model includes revised dependence on the 10.7 cm solar flux for the global means of exospheric temperature, temperature of the base of the thermosphere, and scale height for the thermospheric temperature variations, as well as revised dependence on orbital position for global mean height of the base of the thermosphere. Other features of the new thermospheric model are: (1) realistic variations of temperature and density with latitude and time of day, (2) more realistic wind magnitudes, based on improved estimates of horizontal pressure gradients, and (3) allowance for user-input adjustments to the model values for mean exospheric temperature and for height and temperature at the base of the thermosphere. Other new features of Mars-GRAM 3.4 include: (1) allowance for user-input values of climatic adjustment factors for temperature profiles from the surface to 75 km, and (2) a revised method for computing the sub-solar longitude position in the 'ORBIT' subroutine.
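
    As a schematic of the kind of parameterization described, with an exospheric temperature, a base temperature, and a scale height controlling the transition between them, the sketch below evaluates a Bates-type temperature profile. The coefficient tying the exospheric temperature to the 10.7 cm solar flux is an invented placeholder, not the Mars-GRAM 3.4 fit to MTGCM output.

      import numpy as np

      def thermosphere_temperature(z_km, t_exo, t_base, z_base_km, scale_height_km):
          """Bates-type profile: temperature relaxes exponentially from the base
          of the thermosphere toward the exospheric temperature."""
          return t_exo - (t_exo - t_base) * np.exp(-(z_km - z_base_km) / scale_height_km)

      # Invented dependence of global-mean exospheric temperature on F10.7
      # (illustrative only; Mars-GRAM 3.4 uses fits to MTGCM output).
      f107 = 70.0
      t_exo = 180.0 + 0.6 * f107
      t_base, z_base, H = 125.0, 120.0, 25.0

      for z in (120.0, 150.0, 200.0, 250.0):
          print(f"z = {z:5.1f} km  T = {thermosphere_temperature(z, t_exo, t_base, z_base, H):6.1f} K")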

  8. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 154886. No. of bytes in distributed program, including test data, etc.: 1870890. Distribution format: tar.gz. Programming language: C++, Fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits. Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes. Catalogue identifier of previous version: ADPM_v3_0. Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785. Nature of problem: Calculating the supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the renormalisation group equations must be consistent with boundary conditions on supersymmetry breaking parameters, as well as on the weak-scale boundary condition on gauge couplings, Yukawa couplings and the Higgs potential parameters. Solution method: Nested iterative algorithm and numerical minimisation of the Higgs potential. Reasons for new version: Major extension to include the next-to-minimal supersymmetric standard model. Summary of revisions: Added additional supersymmetric and supersymmetry breaking parameters associated with the additional gauge singlet. Electroweak symmetry breaking conditions are significantly changed in the next-to-minimal mode, and some sparticle mixing changes. An interface to NMSSMTools has also been included. Some of the object structure has also changed, and the command line interface has been made more user friendly. Restrictions: SOFTSUSY will provide a solution only in the perturbative regime and it assumes that all couplings of the model are real (i.e. CP-conserving). If the parameter point under investigation is non-physical for some reason (for example because the electroweak potential does not have an acceptable minimum), SOFTSUSY returns an error message. Running time: A few seconds per parameter point.

  9. Brugga basin's TACD Model Adaptation to current GIS PCRaster 4.1

    NASA Astrophysics Data System (ADS)

    Lopez Rozo, Nicolas Antonio; Corzo Perez, Gerald Augusto; Santos Granados, Germán Ricardo

    2017-04-01

    The process-oriented catchment model TACD (Tracer-Aided Catchment model - Distributed) was developed in the Brugga Basin (Black Forest, Germany) with a modular structure in the Geographic Information System PCRaster Version 2, in order to dynamically model the natural processes of a complex basin, such as rainfall, air temperature, solar radiation, evapotranspiration and flow routing, among others. Further research and application of this model has been carried out, such as adapting it to other meso-scale basins and adding erosion processes to the hydrological model. However, the TACD model is computationally intensive, which has made it inefficient for large, finely discretized river basins. In addition, the current version is not compatible with the latest PCRaster Version 4.1, which offers support for 64-bit hardware architectures, improvements in hydraulic calculations and map creation, and various error and bug fixes. The current work studied and adapted the TACD model to the latest GIS PCRaster Version 4.1. This was done by editing the original scripts and replacing deprecated functionality without losing the correctness of the TACD model. The correctness of the adapted TACD model was verified by using the original study case of the Brugga Basin and comparing the adapted model results with the original model results of Stefan Roser in 2001. Small differences were found because some hydraulic and hydrological routines have been optimized since version 2 of GIS PCRaster; the hydraulic and hydrological processes remain well represented. With this new working model, further research and development on current topics such as uncertainty analysis, GCM downscaling techniques and spatio-temporal modelling is encouraged.

  10. Groundwars Version 5.0. User’s Guide

    DTIC Science & Technology

    1992-08-01

    model, Monte Carlo, land duel, heterogeneous forces, TANKWARS, target acquisition, combat survivability ...land duel between two heterogeneous forces. The model simulates individual weapon systems and employs Monte Carlo probability theory as its primary...is a weapon systems effectiveness model which provides the results of a land duel between two forces. The model simulates individual weapon systems
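
    A toy Monte Carlo duel between two heterogeneous forces, illustrating the kind of stochastic engagement loop described above; the force mixes and kill probabilities are invented placeholders, not GROUNDWARS data or logic.

        # Toy Monte Carlo land duel (illustrative only).
        import random

        def duel(blue, red, p_kill, rounds=50):
            """blue/red: lists of weapon-system types; p_kill[(shooter, target)]."""
            blue, red = list(blue), list(red)
            for _ in range(rounds):
                for force, enemy in ((blue, red), (red, blue)):
                    for shooter in list(force):
                        if not enemy:
                            return "blue" if blue else "red"
                        target = random.choice(enemy)
                        if random.random() < p_kill[(shooter, target)]:
                            enemy.remove(target)            # target attrited
            return "blue" if len(blue) > len(red) else "red"

        p_kill = {("tank", "tank"): 0.30, ("tank", "atgm"): 0.20,
                  ("atgm", "tank"): 0.40, ("atgm", "atgm"): 0.10}
        wins = sum(duel(["tank"] * 4, ["tank"] * 3 + ["atgm"] * 2, p_kill) == "blue"
                   for _ in range(5000))
        print("blue win fraction:", wins / 5000)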

  11. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    DTIC Science & Technology

    2017-03-13

    support of airborne laser designator use during test and training exercises on military ranges. The initial MATILDA tool, MATILDA PRO Version-1.6.1...was based on the 2007 PRA model developed to perform range safety clearances for the UK Thermal Imaging Airborne Laser Designator (TIALD) system...AFRL Technical Reports. This Technical Report, designated Part I, contains documentation of the computational procedures for probabilistic fault

  12. Development of a GIS-based spill management information system.

    PubMed

    Martin, Paul H; LeBoeuf, Eugene J; Daniel, Edsel B; Dobbins, James P; Abkowitz, Mark D

    2004-08-30

    Spill Management Information System (SMIS) is a geographic information system (GIS)-based decision support system designed to effectively manage the risks associated with accidental or intentional releases of a hazardous material into an inland waterway. SMIS provides critical planning and impact information to emergency responders in anticipation of, or following such an incident. SMIS couples GIS and database management systems (DBMS) with the 2-D surface water model CE-QUAL-W2 Version 3.1 and the air contaminant model Computer-Aided Management of Emergency Operations (CAMEO) while retaining full GIS risk analysis and interpretive capabilities. Live 'real-time' data links are established within the spill management software to utilize current meteorological information and flowrates within the waterway. Capabilities include rapid modification of modeling conditions to allow for immediate scenario analysis and evaluation of 'what-if' scenarios. The functionality of the model is illustrated through a case study of the Cheatham Reach of the Cumberland River near Nashville, TN.

  13. Three-dimensional computer-assisted study model analysis of long-term oral-appliance wear. Part 1: Methodology.

    PubMed

    Chen, Hui; Lowe, Alan A; de Almeida, Fernanda Riberiro; Wong, Mary; Fleetham, John A; Wang, Bangkang

    2008-09-01

    The aim of this study was to test a 3-dimensional (3D) computer-assisted dental model analysis system that uses selected landmarks to describe tooth movement during treatment with an oral appliance. Dental casts of 70 patients diagnosed with obstructive sleep apnea and treated with oral appliances for a mean time of 7 years 4 months were evaluated with a 3D digitizer (MicroScribe-3DX, Immersion, San Jose, Calif) compatible with the Rhinoceros modeling program (version 3.0 SR3c, Robert McNeel & Associates, Seattle, Wash). A total of 86 landmarks on each model were digitized, and 156 variables were calculated as either the linear distance between points or the distance from points to reference planes. Four study models for each patient (maxillary baseline, mandibular baseline, maxillary follow-up, and mandibular follow-up) were superimposed on 2 sets of reference points: 3 points on the palatal rugae for maxillary model superimposition, and 3 occlusal contact points for the same set of maxillary and mandibular model superimpositions. The patients were divided into 3 evaluation groups by 5 orthodontists based on the changes between baseline and follow-up study models. Digital dental measurements could be analyzed, including arch width, arch length, curve of Spee, overbite, overjet, and the anteroposterior relationship between the maxillary and mandibular arches. A method error within 0.23 mm in 14 selected variables was found for the 3D system. The statistical differences in the 3 evaluation groups verified the division criteria determined by the orthodontists. The system provides a method to record 3D measurements of study models that permits computer visualization of tooth position and movement from various perspectives.
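
    One of the variable types described above is the distance from a digitized landmark to a reference plane. A minimal sketch of that computation, with hypothetical coordinates in millimetres:

        # Perpendicular distance from a digitized landmark to a reference plane
        # defined by three digitized points (all coordinates are made-up values).
        import numpy as np

        def point_to_plane_distance(p, a, b, c):
            """Signed distance from point p to the plane through points a, b, c."""
            n = np.cross(b - a, c - a)          # plane normal
            n = n / np.linalg.norm(n)           # unit normal
            return float(np.dot(p - a, n))      # signed perpendicular distance

        # three palatal-rugae points defining a maxillary reference plane (hypothetical)
        a = np.array([0.0, 0.0, 0.0])
        b = np.array([10.0, 1.0, 0.5])
        c = np.array([2.0, 12.0, 0.2])
        incisal_edge = np.array([5.0, 25.0, -3.1])      # hypothetical landmark
        print(round(point_to_plane_distance(incisal_edge, a, b, c), 2), "mm")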

  14. How Well Has Global Ocean Heat Content Variability Been Measured?

    NASA Astrophysics Data System (ADS)

    Nelson, A.; Weiss, J.; Fox-Kemper, B.; Fabienne, G.

    2016-12-01

    We introduce a new strategy that uses synthetic observations of an ensemble of model simulations to test the fidelity of an observational strategy, quantifying how well it captures the statistics of variability. We apply this test to the 0-700m global ocean heat content anomaly (OHCA) as observed with in-situ measurements by the Coriolis Dataset for Reanalysis (CORA), using the Community Climate System Model (CCSM) version 3.5. One-year running mean OHCAs for the years 2005 onward are found to faithfully capture the variability. During these years, synthetic observations of the model are strongly correlated at 0.94±0.06 with the actual state of the model. Overall, sub-annual variability and data before 2005 are significantly affected by the variability of the observing system. In contrast, the sometimes-used weighted integral of observations is not a good indicator of OHCA as variability in the observing system contaminates dynamical variability.
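
    A schematic version of the synthetic-observation test: sample an ensemble-like model field at sparse profile locations, reconstruct the global mean, and correlate its one-year running mean with that of the full field. The field and sampling below are random placeholders, not CCSM or CORA data.

        # Synthetic-observation fidelity test (toy field, toy observing system).
        import numpy as np

        rng = np.random.default_rng(0)
        nt, nx = 240, 500                      # months, grid columns
        truth = np.cumsum(rng.normal(size=(nt, nx)), axis=0)   # toy OHCA field
        global_true = truth.mean(axis=1)

        n_obs = 60                             # sparse observing system
        cols = rng.choice(nx, size=n_obs, replace=False)
        global_obs = truth[:, cols].mean(axis=1)               # synthetic observations

        def running_mean(x, w=12):
            return np.convolve(x, np.ones(w) / w, mode="valid")

        r = np.corrcoef(running_mean(global_true), running_mean(global_obs))[0, 1]
        print("correlation of 1-year running means:", round(r, 3))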

  15. Factor Structure of the Hare Psychopathy Checklist: Youth Version in German Female and Male Detainees and Community Adolescents

    ERIC Educational Resources Information Center

    Sevecke, Kathrin; Pukrop, Ralf; Kosson, David S.; Krischer, Maya K.

    2009-01-01

    Substantial evidence exists for 3- and 4-factor models of psychopathy underlying patterns of covariation among the items of the Psychopathy Checklist-Revised (PCL-R) in diverse adult samples. Although initial studies conducted with the Psychopathy Checklist: Youth Version (PCL:YV) indicated reasonable fit for these models in incarcerated male…

  16. An integrated crop and hydrologic modeling system to estimate hydrologic impacts of crop irrigation demands

    Treesearch

    R.T. McNider; C. Handyside; K. Doty; W.L. Ellenburg; J.F. Cruise; J.R. Christy; D. Moss; V. Sharda; G. Hoogenboom; Peter Caldwell

    2015-01-01

    The present paper discusses a coupled gridded crop modeling and hydrologic modeling system that can examine the benefits and costs of irrigation and the coincident impact of irrigation water withdrawals on surface water hydrology. The system is applied to the Southeastern U.S. The system tools to be discussed include a gridded version (GriDSSAT) of...

  17. Performance study of LMS based adaptive algorithms for unknown system identification

    NASA Astrophysics Data System (ADS)

    Javed, Shazia; Ahmad, Noor Atinah

    2014-07-01

    Adaptive filtering techniques have gained much popularity in the modeling of the unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include the stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS) and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using an adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and their sensitivity to the spectral properties of the input signals. The main objective of this comparative study is to observe the effects of the fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment.
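
    A minimal sketch of the LMS and NLMS updates in the adaptive system identification setting described above; the unknown system, noise level and step sizes are illustrative placeholders.

        # LMS and NLMS adaptive filters for system identification (toy setting):
        # the unknown system is an FIR filter, its noisy output is the desired
        # signal, and the adaptive weights try to match it.
        import numpy as np

        rng = np.random.default_rng(1)
        h_true = np.array([0.6, -0.3, 0.1, 0.05])      # unknown system (placeholder)
        N, M = 5000, len(h_true)
        x = rng.normal(size=N)                          # random input signal
        d = np.convolve(x, h_true)[:N] + 0.01 * rng.normal(size=N)   # noisy output

        def adapt(x, d, M, mu, normalized=False, eps=1e-8):
            w = np.zeros(M)
            for n in range(M, len(x)):
                u = x[n - M + 1:n + 1][::-1]            # current input regressor
                e = d[n] - w @ u                        # a-priori error
                step = mu / (eps + u @ u) if normalized else mu
                w = w + step * e * u                    # LMS / NLMS weight update
                # (misalignment, e.g. ||w - true weights||, could be logged here)
            return w

        print("LMS :", np.round(adapt(x, d, M, mu=0.01), 3))
        print("NLMS:", np.round(adapt(x, d, M, mu=0.5, normalized=True), 3))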

  18. Performance study of LMS based adaptive algorithms for unknown system identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Javed, Shazia; Ahmad, Noor Atinah

    Adaptive filtering techniques have gained much popularity in the modeling of unknown system identification problem. These techniques can be classified as either iterative or direct. Iterative techniques include stochastic descent method and its improved versions in affine space. In this paper we present a comparative study of the least mean square (LMS) algorithm and some improved versions of LMS, more precisely the normalized LMS (NLMS), LMS-Newton, transform domain LMS (TDLMS) and affine projection algorithm (APA). The performance evaluation of these algorithms is carried out using adaptive system identification (ASI) model with random input signals, in which the unknown (measured) signal is assumed to be contaminated by output noise. Simulation results are recorded to compare the performance in terms of convergence speed, robustness, misalignment, and their sensitivity to the spectral properties of input signals. Main objective of this comparative study is to observe the effects of fast convergence rate of improved versions of LMS algorithms on their robustness and misalignment.

  19. Behavioural Regulation in Exercise Questionnaire in people with schizophrenia: construct validity of the Portuguese versions.

    PubMed

    Costa, Raquel; Probst, Michel; Bastos, Tânia; Vilhena, Estela; Seabra, André; Corredeira, Rui

    2017-06-22

    People with schizophrenia have low physical activity levels that can be explained by the restriction in motivation. The Behavioural Regulation in Exercise Questionnaire-2 is a 19-item scale commonly used to assess five different motivational subtypes for physical activity. However, there are limited psychometric analyses of this version in the schizophrenia context. Moreover, there is a lack of information related to the psychometric properties of version 3 of this questionnaire, with 24 items and six different motivational subtypes. The aim of this study was to examine the construct validity of both Portuguese versions in people with schizophrenia. A total of 118 persons with schizophrenia were included (30 women). Cronbach's alpha was used for internal consistency, Pearson's correlation for the retained motivation-types, confirmatory factor analysis for the structural validity of version 2 and exploratory factor analysis for the factor structure of version 3. Analyses of version 2 provided an adequate fit index for the structure of the five factors. Exploratory analyses suggested retaining 2 factors of version 3. The results of this study suggest that version 3 was an appropriate measure to assess controlled and autonomous motivation for physical activity in people with schizophrenia and support its use in clinical practice and research. Implications for Rehabilitation This study supports the need to identify the reasons why people with schizophrenia practice physical activity. For that purpose, it is important to use valid and cost-effective instruments. The Portuguese version of BREQ-2 confirmed a 5-factor model and showed adequate fit for the application in people with schizophrenia. However, the incremental indices values were lower than expected. The Portuguese version of BREQ-3 showed acceptable psychometric properties to assess controlled and autonomous motivation for physical activity in people with schizophrenia.

  20. Opendf - An Implementation of the Dual Fermion Method for Strongly Correlated Systems

    NASA Astrophysics Data System (ADS)

    Antipov, Andrey E.; LeBlanc, James P. F.; Gull, Emanuel

    The dual fermion method is a multiscale approach for solving lattice problems of interacting strongly correlated systems. In this paper, we present the opendf code, an open-source implementation of the dual fermion method applicable to fermionic single-orbital lattice models in dimensions D = 1, 2, 3 and 4. The method is built on a dynamical mean field starting point, which neglects all non-local correlations, and perturbatively adds spatial correlations. Our code is distributed as an open-source package under the GNU public license version 2.

  1. Spectrum orbit utilization program technical manual SOUP5 Version 3.8

    NASA Technical Reports Server (NTRS)

    Davidson, J.; Ottey, H. R.; Sawitz, P.; Zusman, F. S.

    1984-01-01

    The underlying engineering and mathematical models as well as the computational methods used by the SOUP5 analysis programs, which are part of the R2BCSAT-83 Broadcast Satellite Computational System, are described. Included are the algorithms used to calculate the technical parameters and references to the relevant technical literature. The system provides the following capabilities: requirements file maintenance, data base maintenance, elliptical satellite beam fitting to service areas, plan synthesis from specified requirements, plan analysis, and report generation/query. Each of these functions is briefly described.

  2. Computational models for the analysis/design of hypersonic scramjet components. I - Combustor and nozzle models

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Sinha, N.; Wolf, D. E.; York, B. J.

    1986-01-01

    An overview of computational models developed for the complete, design-oriented analysis of a scramjet propulsion system is provided. The modular approach taken involves the use of different PNS models to analyze the individual propulsion system components. The external compression and internal inlet flowfields are analyzed by the SCRAMP and SCRINT components discussed in Part II of this paper. The combustor is analyzed by the SCORCH code which is based upon SPLITP PNS pressure-split methodology formulated by Dash and Sinha. The nozzle is analyzed by the SCHNOZ code which is based upon SCIPVIS PNS shock-capturing methodology formulated by Dash and Wolf. The current status of these models, previous developments leading to this status, and progress towards future hybrid and 3D versions are discussed in this paper.

  3. Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0

    NASA Technical Reports Server (NTRS)

    Knox, J. C.

    1996-01-01

    The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.

  4. Montage Version 3.0

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

    The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  5. User's guide for mapIMG 3--Map image re-projection software package

    USGS Publications Warehouse

    Finn, Michael P.; Mattli, David M.

    2012-01-01

    Version 0.0 (1995), Dan Steinwand, U.S. Geological Survey (USGS)/Earth Resources Observation Systems (EROS) Data Center (EDC)--Version 0.0 was a command line version for UNIX that required four arguments: the input metadata, the output metadata, the input data file, and the output destination path. Version 1.0 (2003), Stephen Posch and Michael P. Finn, USGS/Mid-Continent Mapping Center (MCMC)--Version 1.0 added a GUI interface that was built using the Qt library for cross platform development. Version 1.01 (2004), Jason Trent and Michael P. Finn, USGS/MCMC--Version 1.01 suggested bounds for the parameters of each projection. Support was added for larger input files, storage of the last used input and output folders, and for TIFF/GeoTIFF input images. Version 2.0 (2005), Robert Buehler, Jason Trent, and Michael P. Finn, USGS/National Geospatial Technical Operations Center (NGTOC)--Version 2.0 added resampling methods (Mean, Mode, Min, Max, and Sum), updated the GUI design, and added the viewer/pre-viewer. The metadata style was changed to XML and was switched to a new naming convention. Version 3.0 (2009), David Mattli and Michael P. Finn, USGS/Center of Excellence for Geospatial Information Science (CEGIS)--Version 3.0 brings optimized resampling methods, an updated GUI, support for less-than-global datasets, and UTM support, and the whole codebase was ported to Qt4.
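
    A generic NumPy sketch of the block-aggregation statistics listed for version 2.0 (mean, mode, min, max, sum); this is not mapIMG code, and the raster and aggregation factor are toy values.

        # Aggregate a 2-D raster by an integer factor with a chosen statistic.
        import numpy as np
        from scipy import stats

        def resample_block(raster, factor, method="mean"):
            """Block-aggregate a 2-D raster by an integer factor."""
            h, w = raster.shape
            blocks = raster[:h - h % factor, :w - w % factor]
            blocks = blocks.reshape(h // factor, factor, w // factor, factor)
            blocks = blocks.swapaxes(1, 2).reshape(h // factor, w // factor, -1)
            if method == "mode":
                return stats.mode(blocks, axis=2).mode
            return {"mean": np.mean, "min": np.min, "max": np.max,
                    "sum": np.sum}[method](blocks, axis=2)

        grid = np.arange(36).reshape(6, 6)
        print(resample_block(grid, 3, "mean"))
        print(resample_block(grid, 3, "sum"))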

  6. PAN AIR: A computer program for predicting subsonic or supersonic linear potential flows about arbitrary configurations using a higher order panel method. Volume 4: Maintenance document (version 3.0)

    NASA Technical Reports Server (NTRS)

    Purdon, David J.; Baruah, Pranab K.; Bussoletti, John E.; Epton, Michael A.; Massena, William A.; Nelson, Franklin D.; Tsurusaki, Kiyoharu

    1990-01-01

    The Maintenance Document Version 3.0 is a guide to the PAN AIR software system, a system which computes the subsonic or supersonic linear potential flow about a body of nearly arbitrary shape, using a higher order panel method. The document describes the overall system and each program module of the system. Sufficient detail is given for program maintenance, updating, and modification. It is assumed that the reader is familiar with programming and CRAY computer systems. The PAN AIR system was written in FORTRAN 4 language except for a few CAL language subroutines which exist in the PAN AIR library. Structured programming techniques were used to provide code documentation and maintainability. The operating systems accommodated are COS 1.11, COS 1.12, COS 1.13, and COS 1.14 on the CRAY 1S, 1M, and X-MP computing systems. The system is comprised of a data base management system, a program library, an execution control module, and nine separate FORTRAN technical modules. Each module calculates part of the posed PAN AIR problem. The data base manager is used to communicate between modules and within modules. The technical modules must be run in a prescribed fashion for each PAN AIR problem. In order to ease the problem of supplying the many JCL cards required to execute the modules, a set of CRAY procedures (PAPROCS) was created to automatically supply most of the JCL cards. Most of this document has not changed for Version 3.0. It now, however, strictly applies only to PAN AIR version 3.0. The major changes are: (1) additional sections covering the new FDP module (which calculates streamlines and offbody points); (2) a complete rewrite of the section on the MAG module; and (3) strict applicability to CRAY computing systems.

  7. Basin-Scale Assessment of the Land Surface Water Budget in the National Centers for Environmental Prediction Operational and Research NLDAS-2 Systems

    NASA Technical Reports Server (NTRS)

    Xia, Youlong; Cosgrove, Brian A.; Mitchell, Kenneth E.; Peters-Lidard, Christa D.; Ek, Michael B.; Brewer, Michael; Mocko, David; Kumar, Sujay V.; Wei, Helin; Meng, Jesse

    2016-01-01

    The purpose of this study is to evaluate the components of the land surface water budget in the four land surface models (Noah, the Sacramento Soil Moisture Accounting model (SAC), the Variable Infiltration Capacity model (VIC), and Mosaic) applied in the newly implemented National Centers for Environmental Prediction (NCEP) operational and research versions of the North American Land Data Assimilation System version 2 (NLDAS-2). This work focuses on monthly and annual components of the water budget over 12 National Weather Service (NWS) River Forecast Centers (RFCs). Monthly gridded FLUX Network (FLUXNET) evapotranspiration (ET) from the Max-Planck Institute (MPI) of Germany, U.S. Geological Survey (USGS) total runoff (Q), changes in total water storage (dS/dt, derived as a residual by utilizing MPI ET and USGS Q in the water balance equation), and Gravity Recovery and Climate Experiment (GRACE) observed total water storage anomaly (TWSA) and change (TWSC) are used as reference data sets. Compared to these ET and Q benchmarks, Mosaic and SAC (Noah and VIC) in the operational NLDAS-2 overestimate (underestimate) mean annual reference ET and underestimate (overestimate) mean annual reference Q. The multimodel ensemble mean (MME) is closer to the mean annual reference ET and Q. An anomaly correlation (AC) analysis shows good AC values for simulated monthly mean Q and dS/dt but significantly smaller AC values for simulated ET. Upgraded versions of the models utilized in the research side of NLDAS-2 yield largely improved performance in the simulation of these mean annual and monthly water component diagnostics. These results demonstrate that the three intertwined efforts of improving (1) the scientific understanding of parameterization of land surface processes, (2) the spatial and temporal extent of systematic validation of land surface processes, and (3) the engineering-oriented aspects such as parameter calibration and optimization are key to substantially improving product quality in various land data assimilation systems.
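
    A sketch of the residual water-balance calculation described above, in which the storage change is derived as dS/dt = P - ET - Q and an anomaly correlation is computed against a model series; all monthly values are synthetic placeholders, not NLDAS-2 data.

        # Residual storage change and anomaly correlation (synthetic monthly data).
        import numpy as np

        rng = np.random.default_rng(2)
        months = 120
        season = np.sin(2 * np.pi * np.arange(months) / 12)
        P  = 80 + 30 * season + rng.normal(0, 10, months)    # precipitation, mm/month
        ET = 60 + 25 * season + rng.normal(0, 8, months)     # reference ET, mm/month
        Q  = 15 + 5 * season + rng.normal(0, 3, months)      # observed runoff, mm/month

        dS_ref = P - ET - Q                                   # reference storage change
        dS_model = dS_ref + rng.normal(0, 6, months)          # a model's estimate

        def anomaly_correlation(a, b, period=12):
            """Correlation of anomalies about the mean seasonal cycle."""
            a_anom = a - np.tile(a.reshape(-1, period).mean(axis=0), len(a) // period)
            b_anom = b - np.tile(b.reshape(-1, period).mean(axis=0), len(b) // period)
            return np.corrcoef(a_anom, b_anom)[0, 1]

        print("AC for dS/dt:", round(anomaly_correlation(dS_ref, dS_model), 2))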

  8. Carbon monoxide screen for signalized intersections : COSIM, version 4.0 - technical documentation.

    DOT National Transportation Integrated Search

    2013-06-01

    Illinois Carbon Monoxide Screen for Intersection Modeling (COSIM) Version 3.0 is a Windows-based computer : program currently used by the Illinois Department of Transportation (IDOT) to estimate worst-case carbon : monoxide (CO) concentrations near s...

  9. Comparison of Origin 2000 and Origin 3000 Using NAS Parallel Benchmarks

    NASA Technical Reports Server (NTRS)

    Turney, Raymond D.

    2001-01-01

    This report describes results of benchmark tests on the Origin 3000 system currently being installed at the NASA Ames National Advanced Supercomputing facility. This machine will ultimately contain 1024 R14K processors. The first part of the system, installed in November 2000 and named mendel, is an Origin 3000 with 128 R12K processors. For comparison purposes, the tests were also run on lomax, an Origin 2000 with R12K processors. The BT, LU, and SP application benchmarks in the NAS Parallel Benchmark Suite and the kernel benchmark FT were chosen to determine system performance and measure the impact of changes on the machine as it evolves. Having been written to measure performance on Computational Fluid Dynamics applications, these benchmarks are assumed appropriate to represent the NAS workload. Since the NAS runs both message-passing (MPI) and shared-memory (compiler-directive) codes, both MPI and OpenMP versions of the benchmarks were used. The MPI versions used were the latest official release of the NAS Parallel Benchmarks, version 2.3. The OpenMP versions used were PBN3b2, a beta version that is in the process of being released. NPB 2.3 and PBN 3b2 are technically different benchmarks, and NPB results are not directly comparable to PBN results.

  10. ElarmS Earthquake Early Warning System: 2017 Performance and New ElarmS Version 3.0 (E3)

    NASA Astrophysics Data System (ADS)

    Chung, A. I.; Henson, I. H.; Allen, R. M.; Hellweg, M.; Neuhauser, D. S.

    2017-12-01

    The ElarmS earthquake early warning (EEW) system has been successfully detecting earthquakes throughout California since 2007. ElarmS version 2.0 (E2) is one of the three algorithms contributing alerts to ShakeAlert, a public EEW system being developed by the USGS in collaboration with UC Berkeley, Caltech, University of Washington, and University of Oregon. E2 began operating in test mode in the Pacific Northwest in 2013, and since April of this year E2 has been contributing real-time alerts from Oregon and Washington to the ShakeAlert production prototype system as part of the ShakeAlert roll-out throughout the West Coast. Since it began operating west-coast-wide, E2 has correctly alerted on 5 events that matched ANSS catalog events with M≥4, missed 1 event with M≥4, and incorrectly created alerts for 5 false events with M≥4. The most recent version of the algorithm, ElarmS version 3.0 (E3), is a significant improvement over E2. It addresses some of the most problematic causes of false events for which E2 produced alerts, without impacting reliability in terms of matched and missed events. Of the 5 false events that were generated by E2 since April, 4 would have been suppressed by E3. In E3, we have added a filterbank teleseismic filter. By analyzing the amplitude of the waveform filtered in various passbands, it is possible to distinguish between local and teleseismic events. We have also added a series of checks to validate triggers and filter out spurious and S-wave triggers. Additional improvements to the waveform associator also improve detections. In this presentation, we describe the improvements and compare the performance of the current production (E2) and development (E3) versions of ElarmS over the past year. The ShakeAlert project is now working through a streamlining process to identify the best components of various algorithms and merge them. The ElarmS team is participating in this effort and we anticipate that much of E3 will continue in the final system.
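
    A schematic filterbank discriminant in the spirit of the teleseismic check added in E3: compare peak amplitudes in a high-frequency and a long-period passband, flagging events with weak high-frequency content. The passbands, threshold and synthetic waveforms are illustrative only, not ElarmS code.

        # Filterbank amplitude-ratio check (toy waveforms and thresholds).
        import numpy as np
        from scipy.signal import butter, sosfilt

        def band_peak(x, fs, lo, hi):
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            return np.max(np.abs(sosfilt(sos, x)))

        def looks_teleseismic(x, fs, ratio_threshold=0.2):
            high = band_peak(x, fs, 2.0, 8.0)       # high-frequency band
            low = band_peak(x, fs, 0.05, 0.5)       # long-period band
            return (high / low) < ratio_threshold   # weak high-frequency content

        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        local = np.sin(2 * np.pi * 4.0 * t) * np.exp(-t / 10)      # toy local P wave
        tele = np.sin(2 * np.pi * 0.2 * t) * np.exp(-t / 30)       # toy teleseismic arrival
        print(looks_teleseismic(local, fs), looks_teleseismic(tele, fs))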

  11. An evaluation of the STEMS tree growth projection system.

    Treesearch

    Margaret R. Holdaway; Gary J. Brand

    1983-01-01

    STEMS (Stand and Tree Evaluation and Modeling System) is a tree growth projection system. This paper (1) compares the performance of the current version of STEMS developed for the Lake States with that of the original model and (2) reports the results of an analysis of the current model over a wide range of conditions and identifies its main strengths and weaknesses...

  12. BehavePlus fire modeling system, version 5.0: Design and Features

    Treesearch

    Faith Ann Heinsch; Patricia L. Andrews

    2010-01-01

    The BehavePlus fire modeling system is a computer program that is based on mathematical models that describe wildland fire behavior and effects and the fire environment. It is a flexible system that produces tables, graphs, and simple diagrams. It can be used for a host of fire management applications, including projecting the behavior of an ongoing fire, planning...

  13. BehavePlus fire modeling system, version 4.0: User's Guide

    Treesearch

    Patricia L. Andrews; Collin D. Bevins; Robert C. Seli

    2005-01-01

    The BehavePlus fire modeling system is a program for personal computers that is a collection of mathematical models that describe fire and the fire environment. It is a flexible system that produces tables, graphs, and simple diagrams. It can be used for a multitude of fire management applications including projecting the behavior of an ongoing fire, planning...

  14. Earth system modelling on system-level heterogeneous architectures: EMAC (version 2.42) on the Dynamical Exascale Entry Platform (DEEP)

    NASA Astrophysics Data System (ADS)

    Christou, Michalis; Christoudias, Theodoros; Morillo, Julián; Alvarez, Damian; Merx, Hendrik

    2016-09-01

    We examine an alternative approach to heterogeneous cluster-computing in the many-core era for Earth system models, using the European Centre for Medium-Range Weather Forecasts Hamburg (ECHAM)/Modular Earth Submodel System (MESSy) Atmospheric Chemistry (EMAC) model as a pilot application on the Dynamical Exascale Entry Platform (DEEP). A set of autonomous coprocessors interconnected together, called Booster, complements a conventional HPC Cluster and increases its computing performance, offering extra flexibility to expose multiple levels of parallelism and achieve better scalability. The EMAC model atmospheric chemistry code (Module Efficiently Calculating the Chemistry of the Atmosphere (MECCA)) was taskified with an offload mechanism implemented using OmpSs directives. The model was ported to the MareNostrum 3 supercomputer to allow testing with Intel Xeon Phi accelerators on a production-size machine. The changes proposed in this paper are expected to contribute to the eventual adoption of Cluster-Booster division and Many Integrated Core (MIC) accelerated architectures in presently available implementations of Earth system models, towards exploiting the potential of a fully Exascale-capable platform.

  15. Whole-system carbon balance for a regional temperate forest in Northern Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; Gower, S. T.

    2010-12-01

    The whole-system (biological + industrial) carbon (C) balance was estimated for the Chequamegon-Nicolet National Forest (CNNF), a temperate forest covering 600,000 ha in Northern Wisconsin, USA. The biological system was modeled using a spatially-explicit version of the ecosystem process model Biome-BGC. The industrial system was modeled using life cycle inventory (LCI) models for wood and paper products. Biome-BGC was used to estimate net primary production, net ecosystem production (NEP), and timber harvest (H) over the entire CNNF. The industrial carbon budget (Ci) was estimated by applying LCI models of CO2 emissions resulting from timber harvest and production of specific wood and paper products in the CNNF region. In 2009, simulated NEP of the CNNF averaged 3.0 tC/ha and H averaged 0.1 tC/ha. Despite model uncertainty, the CNNF region is likely a carbon sink (NEP - Ci > 0), even when CO2 emissions from timber harvest and production of wood and paper products are included in the calculation of the entire forest system C budget.

  16. Status of the NASA Micro Pulse Lidar Network (MPLNET): overview of the network and future plans, new version 3 data products, and the polarized MPL

    NASA Astrophysics Data System (ADS)

    Welton, Ellsworth J.; Stewart, Sebastian A.; Lewis, Jasper R.; Belcher, Larry R.; Campbell, James R.; Lolli, Simone

    2018-04-01

    The NASA Micro Pulse Lidar Network (MPLNET) is a global federated network of Micro-Pulse Lidars (MPL) co-located with the NASA Aerosol Robotic Network (AERONET). MPLNET began in 2000, and there are currently 17 long-term sites, numerous field campaigns, and more planned sites on the way. We have developed a new Version 3 processing system including the deployment of polarized MPLs across the network. Here we provide an overview of Version 3, the polarized MPL, and current and future plans.

  17. Smart Grid Educational Series | Energy Systems Integration Facility | NREL

    Science.gov Websites

    Educational series on the smart grid, from generation through transmission all the way to the distribution infrastructure. Workshop materials include presentations, key takeaways from breakout group discussions, and a presentation on using the MultiSpeak data model standard and Essence anomaly detection for ICS.

  18. Computer Simulation Modeling: A Method for Predicting the Utilities of Alternative Computer-Aided Threat Evaluation Algorithms

    DTIC Science & Technology

    1990-09-01

    1988). Current versions of the ADATS have CATE systems installed, but the software is still under development by the radar manufacturer, Contraves ...Italiana, a subcontractor to Martin Marietta (USA). Contraves Italiana will deliver the final version of the software to Martin Marietta in 1991. Until then

  19. Analysis of Effects of Organizational Behavior on Evolving System of Systems Acquisition Programs Through Agent Based Modeling

    DTIC Science & Technology

    2013-03-01

    function is based on how individualistic or collectivistic a system is. Low individualism values mean the system is more collective and is less likely...Hofstede’s cultural dimensions, integrated with a modified version of the Bak-Sneppen biological evolutionary model, this research highlights which set...Hofstede’s Cultural Dimensions

  20. Setup and Operation of the TeleEngineering Communications Equipment - Fixed Site (TCE-F), Version II

    DTIC Science & Technology

    2004-10-01

    Overview ... ADTRAN IMUX ...the interconnections of the components. ADTRAN: The ADTRAN provided with the continental United States systems is typically a 2 x 64 version and..."FDX." After the FDX message appears and the system remains in the FDX mode, the Online button will flash, indicating the system is ready for a

  1. Documentation of model input and output values for simulation of pumping effects in Paradise Valley, a basin tributary to the Humboldt River, Humboldt County, Nevada

    USGS Publications Warehouse

    Carey, A.E.; Prudic, David E.

    1996-01-01

    Documentation is provided of model input and sample output used in a previous report for analysis of ground-water flow and simulated pumping scenarios in Paradise Valley, Humboldt County, Nevada. Documentation includes files containing input values and listings of sample output. The files, in American Standard Code for Information Interchange (ASCII) or binary format, are compressed and put on a 3-1/2-inch diskette. The decompressed files require approximately 8.4 megabytes of disk space on an International Business Machines (IBM)-compatible microcomputer using the Microsoft Disk Operating System (MS-DOS) operating system version 5.0 or greater.

  2. Modelling Middle Infrared Thermal Imagery from Observed or Simulated Active Fire

    NASA Astrophysics Data System (ADS)

    Paugam, R.; Gastellu-Etchegorry, J. P.; Mell, W.; Johnston, J.; Filippi, J. B.

    2016-12-01

    The Fire Radiative Power (FRP) is used in the atmospheric and fire communities to estimate fire emissions. For example, the current version of the emission inventory GFAS uses FRP observations from the MODIS sensors to derive the daily global distribution of fire emissions. Although the FRP product is widely accepted, most of its theoretical justification is still based on small-scale burns. When up-scaling to large fires, the effects of view angle, canopy cover, or smoke absorption are still unknown. To address these questions, we are building a system based on the DART radiative transfer model to simulate the middle infrared radiance emitted by a propagating fire front and transported through the surrounding scene made of ambient vegetation and plume aerosols. The current version of the system was applied to fires ranging from 1 m2 to 7 ha. The 3D fire scene used as input in DART is made of the flame, the vegetation (burnt and unburnt), and the plume. It can be set up either from [i] a 3D physics-based model scene (i.e. WFDS, mainly applicable to small-scale burns), [ii] coupled 2D fire spread - atmospheric model outputs (e.g. ForeFire-MesoNH), or [iii] derived from thermal imagery observations (here plume effects are not considered). In the last two cases, as the complex physical processes occurring in the flame (in particular soot formation and emission) are not resolved, the flame structures are parameterized with (a) temperature and soot concentration based on empirically derived profiles and (b) a 3D triangular hull interpolated at the fire front location. Once the 3D fire scene is set up, DART is then used to render thermal imagery in the middle infrared. Using data collected from burns conducted at different scales, the modelled thermal imagery is compared against observations, and effects of view angle are discussed.
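
    A short illustration of why the middle infrared is used for fire observation: Planck spectral radiance at flame-like temperatures dwarfs the ambient-background radiance near 3.9 micrometres. The temperatures below are illustrative.

        # Planck spectral radiance at a typical MIR channel wavelength.
        import numpy as np

        H, C, K = 6.626e-34, 2.998e8, 1.381e-23        # Planck, light speed, Boltzmann

        def planck_radiance(wavelength_m, temp_k):
            """Spectral radiance, W m^-2 sr^-1 m^-1, from Planck's law."""
            return (2 * H * C ** 2 / wavelength_m ** 5 /
                    (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0))

        wl = 3.9e-6                                     # MIR channel wavelength
        for label, T in [("ambient", 300.0), ("smouldering", 600.0), ("flaming", 1200.0)]:
            print(label, f"{planck_radiance(wl, T):.3e}")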

  3. Pesticides exposure assessment of kettleman city using the industrial source complex short-term model version 3.

    PubMed

    Tao, Jing; Barry, Terrell; Segawa, Randy; Neal, Rosemary; Tuli, Atac

    2013-01-01

    Kettleman City, California, reported a higher than expected number of birth defect cases between 2007 and 2010, raising the concern of community and government agencies. A pesticide exposure evaluation was conducted as part of a complete assessment of community chemical exposure. Nineteen pesticides that potentially cause birth defects were investigated. The Industrial Source Complex Short-Term Model Version 3 (ISCST3) was used to estimate off-site air concentrations associated with pesticide applications within 8 km of the community from late 2006 to 2009. The health screening levels were designed to indicate potential health effects and used for preliminary health evaluations of estimated air concentrations. A tiered approach was conducted. The first tier modeled simple, hypothetical worst-case situations for each of 19 pesticides. The second tier modeled specific applications of the pesticides with estimated concentrations exceeding health screening levels in the first tier. The pesticide use report database of the California Department of Pesticide Regulation provided application information. Weather input data were summarized from the measurements of a local weather station in the California Irrigation Management Information System. The ISCST3 modeling results showed that during the target period, only two application days of one pesticide (methyl isothiocyanate) produced air concentration estimates above the health screening level for developmental effects at the boundary of Kettleman City. These results suggest that the likelihood of birth defects caused by pesticide exposure was low. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  4. A multisensor evaluation of the asymmetric convective model, version 2, in southeast Texas.

    PubMed

    Kolling, Jenna S; Pleim, Jonathan E; Jeffries, Harvey E; Vizuete, William

    2013-01-01

    There currently exist a number of planetary boundary layer (PBL) schemes that can represent the effects of turbulence in daytime convective conditions, although these schemes remain a large source of uncertainty in meteorology and air quality model simulations. This study evaluates a recently developed combined local and nonlocal closure PBL scheme, the Asymmetric Convective Model, version 2 (ACM2), against PBL observations taken from radar wind profilers, a ground-based lidar, and multiple daytime radiosonde balloon launches. These observations were compared against predictions of PBLs from the Weather Research and Forecasting (WRF) model version 3.1 with the ACM2 PBL scheme option, and the Fifth-Generation Meteorological Model (MM5) version 3.7.3 with the Eta PBL scheme option that is currently being used to develop ozone control strategies in southeast Texas. MM5 and WRF predictions during the regulatory modeling episode were evaluated on their ability to predict the rise and fall of the PBL during daytime convective conditions across southeastern Texas. The MM5 predicted PBLs consistently underpredicted observations, and were also less than the WRF PBL predictions. The analysis reveals that the MM5 predicted a slower rising and shallower PBL not representative of the daytime urban boundary layer. Alternatively, the WRF model predicted a more accurate PBL evolution, improving the root mean square error (RMSE), both temporally and spatially. The WRF model also more accurately predicted vertical profiles of temperature and moisture in the lowest 3 km of the atmosphere. Inspection of median surface temperature and moisture time-series plots revealed higher predicted surface temperatures in WRF and more surface moisture in MM5. These could not be attributed to surface heat fluxes, and thus the differences in performance of the WRF and MM5 models are likely due to the PBL schemes. An accurate depiction of the diurnal evolution of the planetary boundary layer (PBL) is necessary for realistic air quality simulations, and for formulating effective policy. The meteorological model used to support the southeast Texas O3 attainment demonstration made predictions of the PBL that were consistently less than those found in observations. The use of the Asymmetric Convective Model, version 2 (ACM2), predicted taller PBL heights and improved model predictions. A lower predicted PBL height in an air quality model would increase precursor concentrations and change the chemical production of O3 and possibly the response to control strategies.
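
    A minimal sketch of the kind of verification statistic used above: root mean square error of modelled daytime PBL heights against profiler-derived heights. The hourly values are invented for illustration, not MM5, WRF or profiler data.

        # RMSE of modelled PBL heights against observed heights (toy values, metres).
        import numpy as np

        obs     = np.array([300, 500, 800, 1200, 1600, 1900, 2000, 1950, 1800, 1400, 900])
        model_a = np.array([250, 400, 600,  900, 1200, 1500, 1600, 1550, 1450, 1100, 700])
        model_b = np.array([320, 550, 850, 1250, 1650, 1850, 1950, 1900, 1750, 1350, 850])

        def rmse(pred, ref):
            return float(np.sqrt(np.mean((pred - ref) ** 2)))

        for name, pred in [("shallow-PBL model", model_a), ("deeper-PBL model", model_b)]:
            print(name, "RMSE =", round(rmse(pred, obs), 1), "m")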

  5. [Application of quantum-chemical methods to prediction of the carcinogenicity of chemical substances].

    PubMed

    Zholdikova, Z I; Kharchevnikova, N V

    2006-01-01

    A version of a logical-combinatorial JSM-type intelligent system was used to predict the presence and the degree of a carcinogenic effect. This version was based on a combined description of chemical substances including both structural and numeric parameters. The new version allows for the fact that the toxicity and danger caused by chemical substances often depend on their biological activation in the organism. The authors substantiate classifying chemicals according to their carcinogenic activity, and illustrate the use of the system to predict the carcinogenicity of polycyclic aromatic hydrocarbons using a model of bioactivation via the formation of diol epoxides, and the carcinogenicity of halogenated alkanes using a model of bioactivation via oxidative dehalogenation. The paper defined the boundary level of an energetic parameter, the exceeding of which correlated with the inhibition of halogenated alkanes' metabolism and the absence of carcinogenic activity.

  6. General Electromagnetic Model for the Analysis of Complex Systems (GEMACS) Computer Code Documentation (Version 3). Volume 3. Part 2.

    DTIC Science & Technology

    1983-09-01

    PX, REFH, REFV, RHOX, RHOY, RHOZ, SA, and SALP in COMMON block /AMPZIJ/. 6. CALLING ROUTINE: FLDDRV ... 1. NAME: PLAINT (GTD). 2. PURPOSE: To determine if a ray traveling from a given source location ... to determine if a source ray reflection from plate MP occurs. If a ray traveling from the source image location in the reflected ray direction passes through

  7. Assessing the Tangent Linear Behaviour of Common Tracer Transport Schemes and Their Use in a Linearised Atmospheric General Circulation Model

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Kent, James

    2015-01-01

    The linearity of a selection of common advection schemes is tested and examined with a view to their use in the tangent linear and adjoint versions of an atmospheric general circulation model. The schemes are tested within a simple offline one-dimensional periodic domain as well as using a simplified and complete configuration of the linearised version of NASA's Goddard Earth Observing System version 5 (GEOS-5). All schemes which prevent the development of negative values and preserve the shape of the solution are confirmed to have nonlinear behaviour. The piecewise parabolic method (PPM) with certain flux limiters, including that used by default in GEOS-5, is found to support linear growth near the shocks. This property can cause the rapid development of unrealistically large perturbations within the tangent linear and adjoint models. It is shown that these schemes with flux limiters should not be used within the linearised version of a transport scheme. The results from tests using GEOS-5 show that the current default scheme (a version of PPM) is not suitable for the tangent linear and adjoint model, and that using a linear third-order scheme for the linearised model produces better behaviour. Using the third-order scheme for the linearised model improves the correlations between the linear and non-linear perturbation trajectories for cloud liquid water and cloud liquid ice in GEOS-5.
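
    A toy version of the linearity test described above: advance a profile with a linear first-order upwind step and with the same step followed by a positivity clip (standing in here for a shape-preserving limiter), then check whether the response to a perturbation scales linearly with its amplitude. Grid, Courant number and profiles are illustrative.

        # Linearity check: a linear operator gives identical scaled responses for
        # different perturbation amplitudes; a clipped (nonlinear) one does not.
        import numpy as np

        def upwind(q, c=0.5, steps=200):
            for _ in range(steps):
                q = q - c * (q - np.roll(q, 1))      # linear advection step
            return q

        def upwind_clipped(q, c=0.5, steps=200):
            for _ in range(steps):
                q = np.maximum(q - c * (q - np.roll(q, 1)), 0.0)   # nonlinear clip
            return q

        def nonlinearity(model, q, dq, eps=1e-2):
            """Max difference between scaled responses to two perturbation sizes."""
            r1 = (model(q + eps * dq) - model(q)) / eps
            r2 = (model(q + 2 * eps * dq) - model(q)) / (2 * eps)
            return np.max(np.abs(r1 - r2))

        x = np.linspace(0, 1, 128, endpoint=False)
        q = np.exp(-200 * (x - 0.3) ** 2) - 0.05     # profile that dips below zero
        dq = np.sin(2 * np.pi * 3 * x)
        print("upwind        :", nonlinearity(upwind, q, dq))
        print("upwind + clip :", nonlinearity(upwind_clipped, q, dq))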

  8. TODS BioCast User Manual, Forecasting 3D Satellite Derived Optical Properties Using Eulerian Advection Procedure, Version 1.0

    DTIC Science & Technology

    2015-06-17

    Example 2: OpCast_cron.sh #!/bin/sh # # # # # Cron helper script This script may be called with the appropriate arguments to reproduce what the...testing. Example 3: OpCast.sh #!/bin/sh # # helper script to set up environment for call to make_merged_product.sh # # This script can be called stand...ecosystem model skill assessment, Journal of Marine Systems, 76(1-2), 64-82, doi:10.1016/j.jmarsys.2008.05.014. Jolliff, J. K., S. Ladner, R. Crout, P

  9. Optimal A-Train Data Utilization: A Use Case of Aura OMI L2G and MERRA-2 Aerosol Products

    NASA Technical Reports Server (NTRS)

    Zeng, Jian; Shen, Suhung; Wei, Jennifer; Meyer, David J.

    2017-01-01

    The Ozone Monitoring Instrument (OMI) aboard NASA's Aura mission measures ozone column and profile, aerosols, clouds, surface UV irradiance, and trace gases including NO2, SO2, HCHO, BrO, and OClO using the ultraviolet electromagnetic spectrum (280 - 400 nm) with daily global coverage and a pixel spatial resolution of 13 km × 24 km at nadir, and it has been one of the key instruments for studying the Earth's atmospheric composition and chemistry. The second Modern-Era Retrospective analysis for Research and Applications (MERRA-2) is NASA's atmospheric reanalysis using an upgraded version of the Goddard Earth Observing System Model, version 5 (GEOS-5) data assimilation system. Compared to its predecessor MERRA, MERRA-2 is enhanced in more aspects of the Earth system, among which is aerosol assimilation. When comparing satellite pixel measurements with modeled grid data, properly pairing the counterparts is critical given their spatial and temporal variations. Comparing satellite and model data simply through Level 3 (L3) products may result in biases due to the lack of detailed temporal information. It is preferable to compare satellite-derived physical quantities (i.e., Level 2 (L2) swath data) directly against model fields at as high a temporal and spatial resolution as possible, but this has posed a practical challenge for the community. Rather than handling the L2 or L3 data directly, a Level 2G (L2G) product preserves the L2 pixel scientific data quality in a gridded format with global coverage. In this presentation, we demonstrate the optimal utilization of OMI L2G daily aerosol products by comparing them with MERRA-2 hourly aerosol simulations matched in both space and time.

  10. Buffering of potassium in seawater by alteration of basalt in low-temperature, off-axis, hydrothermal systems

    NASA Astrophysics Data System (ADS)

    Laureijs, C. T.; Coogan, L. A.

    2016-12-01

    It is generally accepted that the composition of seawater has varied through the Phanerozoic and that the variation is linked to changes in the same global fluxes that control the long-term carbon cycle. However, K is observed to be stable at a value of 10 mmol/L despite variable river and hydrothermal fluxes [1]. Secondary K-bearing phases are widely observed in altered upper oceanic crust, suggesting that reactions between seawater and basalt in low-temperature, off-axis, oceanic hydrothermal systems could buffer the K concentration of seawater [2]. As K-feldspar is a common secondary K-bearing mineral in Cretaceous and rare in Cenozoic oceanic crust, the formation of K-feldspar by breakdown of plagioclase reacting with a model Cretaceous seawater was modeled at 15 ºC using the PhreeqC code (version 3.2) and the associated llnl.dat database. A fluid with a K-content of 11 mmol/L in equilibrium with K-feldspar and calcite was generated, consistent with K-feldspar acting as a buffer for the K-content in Cretaceous seawater and the production of alkalinity stabilizing atmospheric CO2 levels on long-term timescales. A compilation of the K2O content of lavas from DSDP and ODP drill cores (from: http://www.earthchem.org/petdb) shows that the average K-content of altered crust was higher in the Cretaceous than the Cenozoic. These data are inconsistent with the model for the composition of seawater presented in [2], but are consistent with an updated and modified version of this model that uses more realistic fluxes [3]. We conclude that oceanic off-axis hydrothermal systems probably do buffer the K-content of seawater. [1] Timofeeff et al. (2006), Geochim. Cosmochim. Acta 70, 1977-1994; [2] Demicco et al. (2005), Geology 33, 877-880; [3] Coogan & Dosso (2012), Earth Planet. Sci. Lett. 323-324, 92-101.

  11. On the Form of a Systemic Grammar

    ERIC Educational Resources Information Center

    McCord, Michael C.

    1975-01-01

    This paper concerns the theory of systemic grammar developed by Halliday, Hudson and others. It suggests modifications of Hudson's generative version, and the model presented resembles transformational grammar. (CHK)

  12. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.

  13. ASSIST - THE ABSTRACT SEMI-MARKOV SPECIFICATION INTERFACE TO THE SURE TOOL PROGRAM (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Johnson, S. C.

    1994-01-01

    ASSIST, the Abstract Semi-Markov Specification Interface to the SURE Tool program, is an interface that will enable reliability engineers to accurately design large semi-Markov models. The user describes the failure behavior of a fault-tolerant computer system in an abstract, high-level language. The ASSIST program then automatically generates a corresponding semi-Markov model. The abstract language allows efficient description of large, complex systems; a one-page ASSIST-language description may result in a semi-Markov model with thousands of states and transitions. The ASSIST program also includes model-reduction techniques to facilitate efficient modeling of large systems. Instead of listing the individual states of the Markov model, reliability engineers can specify the rules governing the behavior of a system, and these are used to automatically generate the model. ASSIST reads an input file describing the failure behavior of a system in an abstract language and generates a Markov model in the format needed for input to SURE, the semi-Markov Unreliability Range Evaluator program, and PAWS/STEM, the Pade Approximation with Scaling program and Scaled Taylor Exponential Matrix. A Markov model consists of a number of system states and transitions between them. Each state in the model represents a possible state of the system in terms of which components have failed, which ones have been removed, etc. Within ASSIST, each state is defined by a state vector, where each element of the vector takes on an integer value within a defined range. An element can represent any meaningful characteristic, such as the number of working components of one type in the system, or the number of faulty components of another type in use. Statements representing transitions between states in the model have three parts: a condition expression, a destination expression, and a rate expression. The first expression is a Boolean expression describing the state space variable values of states for which the transition is valid. The second expression defines the destination state for the transition in terms of state space variable values. The third expression defines the distribution of elapsed time for the transition. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. ASSIST was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. 
The VMS version (LAR14193) is written in C-language and can be compiled with the VAX C compiler. The standard distribution medium for the VMS version of ASSIST is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun version (LAR14923) is written in ANSI C-language. An ANSI compliant C compiler is required in order to compile this package. The standard distribution medium for the Sun version of ASSIST is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. Electronic copies of the documentation in PostScript, TeX, and DVI formats are provided on the distribution medium. (The VMS distribution lacks the .DVI format files, however.) ASSIST was developed in 1986 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
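
    The rule-based state-space generation described above can be illustrated with a short, self-contained sketch. The snippet below is not ASSIST syntax; it is a hypothetical Python analogue in which each transition rule carries the three parts named in the abstract (a condition expression, a destination expression, and a rate expression) and the reachable state space is expanded from an initial state vector. All names and rates are illustrative assumptions.

```python
# Illustrative sketch only (not ASSIST syntax): generate a Markov model from
# transition rules, each carrying the three parts described above -- a condition
# expression, a destination expression, and a rate expression.
from collections import deque

# Hypothetical state vector: (working processors, spare processors)
initial_state = (3, 1)

LAMBDA = 1e-4   # assumed processor failure rate (per hour)
DELTA = 3.6e3   # assumed reconfiguration (spare swap-in) rate (per hour)

rules = [
    # (condition, destination, rate)
    (lambda s: s[0] > 0,              lambda s: (s[0] - 1, s[1]),     lambda s: s[0] * LAMBDA),
    (lambda s: s[0] < 3 and s[1] > 0, lambda s: (s[0] + 1, s[1] - 1), lambda s: DELTA),
]

def generate(initial, rules):
    """Breadth-first expansion of the reachable state space."""
    transitions, seen, frontier = [], {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for cond, dest, rate in rules:
            if cond(s):
                d = dest(s)
                transitions.append((s, d, rate(s)))
                if d not in seen:
                    seen.add(d)
                    frontier.append(d)
    return seen, transitions

states, transitions = generate(initial_state, rules)
for src, dst, r in transitions:
    print(f"{src} -> {dst} at rate {r:g}")
```

    The expansion enumerates every reachable combination of working and spare units and emits one transition per applicable rule, which is the same bookkeeping a rule-based generator must perform before writing the model out in SURE format.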

  14. The Swedish version of the Acceptance of Chronic Health Conditions Scale for people with multiple sclerosis: Translation, cultural adaptation and psychometric properties.

    PubMed

    Forslin, Mia; Kottorp, Anders; Kierkegaard, Marie; Johansson, Sverker

    2016-11-11

    To translate and culturally adapt the Acceptance of Chronic Health Conditions (ACHC) Scale for people with multiple sclerosis into Swedish, and to analyse the psychometric properties of the Swedish version. Ten people with multiple sclerosis participated in translation and cultural adaptation of the ACHC Scale; 148 people with multiple sclerosis were included in evaluation of the psychometric properties of the scale. Translation and cultural adaptation were carried out through translation and back-translation, by expert committee evaluation and pre-test with cognitive interviews in people with multiple sclerosis. The psychometric properties of the Swedish version were evaluated using Rasch analysis. The Swedish version of the ACHC Scale was an acceptable equivalent to the original version. Seven of the original 10 items fitted the Rasch model and demonstrated ability to separate between groups. A 5-item version, including 2 items and 3 super-items, demonstrated better psychometric properties, but lower ability to separate between groups. The Swedish version of the ACHC Scale with the original 10 items did not fit the Rasch model. Two solutions, either with 7 items (ACHC-7) or with 2 items and 3 super-items (ACHC-5), demonstrated acceptable psychometric properties. Use of the ACHC-5 Scale with super-items is recommended, since this solution adjusts for local dependency among items.

  15. Rotary engine performance computer program (RCEMAP and RCEMAPPC): User's guide

    NASA Technical Reports Server (NTRS)

    Bartrand, Timothy A.; Willis, Edward A.

    1993-01-01

    This report is a user's guide for a computer code that simulates the performance of several rotary combustion engine configurations. It is intended to assist prospective users in getting started with RCEMAP and/or RCEMAPPC. RCEMAP (Rotary Combustion Engine performance MAP generating code) is the mainframe version, while RCEMAPPC is a simplified subset designed for the personal computer, or PC, environment. Both versions are based on an open, zero-dimensional combustion system model for the prediction of instantaneous pressures, temperature, chemical composition and other in-chamber thermodynamic properties. Both versions predict overall engine performance and thermal characteristics, including bmep, bsfc, exhaust gas temperature, average material temperatures, and turbocharger operating conditions. Required inputs include engine geometry, materials, constants for use in the combustion heat release model, and turbomachinery maps. Illustrative examples and sample input files for both versions are included.

  16. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    A multi-scale modeling system with unified physics has been developed at NASA Goddard Space Flight Center (GSFC). The system consists of an MMF, the coupled NASA Goddard finite-volume GCM (fvGCM) and Goddard Cumulus Ensemble model (GCE, a CRM); the state-of-the-art Weather Research and Forecasting model (WRF) and the stand-alone GCE. These models can share the same microphysical schemes, radiation (including explicitly calculated cloud optical properties), and surface models that have been developed, improved and tested for different environments. In this talk, I will present: (1) A brief review on GCE model and its applications on the impact of the aerosol on deep precipitation processes, (2) The Goddard MMF and the major difference between two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) A discussion on the Goddard WRF version (its developments and applications). We are also performing the inline tracer calculation to comprehend the physical processes (i.e., boundary layer and each quadrant in the boundary layer) related to the development and structure of hurricanes and mesoscale convective systems.

  17. Validation of the French version of the Acceptability E-scale (AES) for mental E-health systems.

    PubMed

    Micoulaud-Franchi, Jean-Arthur; Sauteraud, Alain; Olive, Jérôme; Sagaspe, Patricia; Bioulac, Stéphanie; Philip, Pierre

    2016-03-30

    Despite the increasing use of E-health systems for mental-health organizations, there is a lack of psychometric tools to evaluate their acceptability by patients with mental disorders. Thus, this study aimed to translate and validate a French version of the Acceptability E-scale (AES), a 6-item self-reported questionnaire that evaluates the extent to which patients find E-health systems acceptable. A forward-backward translation of the AES was performed. The psychometric properties of the French AES version, with construct validity, internal structural validity and external validity (Pearson's coefficient between AES scores and depression symptoms on the Beck Depression Inventory II) were analyzed. In a sample of 178 patients (mean age=46.51 years, SD=12.91 years), the validation process revealed satisfactory psychometric properties: factor analysis revealed two factors: "Satisfaction" (3 items) and "Usability" (3 items) and Cronbach's alpha was 0.7. No significant relation was found between AES scores and depression symptoms. The French version of the AES revealed a two-factor scale that differs from the original version. In line with the importance of acceptability in mental health and with a view to E-health systems for patients with mental disorders, the use of the AES in psychiatry may provide important information on acceptability (i.e., satisfaction and usability). Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
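
    The internal-consistency figure quoted above (Cronbach's alpha of 0.7) is a standard statistic that can be reproduced from item-level data. The sketch below is a generic illustration with made-up responses, not the study's data; the function applies the usual formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score).

```python
# Cronbach's alpha for a respondents-by-items score matrix; the demo responses
# below are made up and are not the study's data.
import numpy as np

def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses from five patients on the six AES items
demo = [[4, 5, 4, 3, 4, 5],
        [2, 3, 2, 2, 3, 2],
        [5, 5, 4, 4, 5, 5],
        [3, 3, 3, 2, 3, 3],
        [4, 4, 5, 3, 4, 4]]
print(round(cronbach_alpha(demo), 3))
```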

  18. ALSSAT Version 6.0

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsia; Brown, Cheryl; Jeng, Frank

    2012-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) at the time of this reporting has been updated to version 6.0. A previous version was described in Tool for Sizing Analysis of the Advanced Life Support System (MSC-23506), NASA Tech Briefs, Vol. 29, No. 12 (December 2005), page 43. To recapitulate: ALSSAT is a computer program for sizing and analyzing designs of environmental-control and life-support systems for spacecraft and surface habitats to be involved in exploration of Mars and the Moon. Of particular interest for analysis by ALSSAT are conceptual designs of advanced life-support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and process human wastes to reduce the need for resource resupply. ALSSAT is a means of investigating combinations of such subsystem technologies featuring various alternative conceptual designs and thereby assisting in determining which combination is most cost-effective. ALSSAT version 6.0 has been improved over previous versions in several respects, including the following additions: an interface for reading sizing data from an ALS database, computational models of redundant regenerative CO2 and Moisture Removal Amine Swing Beds (CAMRAS) for CO2 removal, upgrade of the Temperature & Humidity Control's Common Cabin Air Assembly to a detailed sizing model, and upgrade of the Food-management subsystem.

  19. Engine Load Path Calculations - Project Neo

    NASA Technical Reports Server (NTRS)

    Fisher, Joseph

    2014-01-01

    A mathematical model of the engine and actuator geometry was developed and used to perform a static force analysis of the system with the engine at different pitch and yaw angles. This analysis yielded the direction and magnitude of the reaction forces at the mounting points of the engine and actuators. These data were used to validate the selection of the actuators installed in the system and to design a new spherical joint to mount the engine on the test fixture. To illustrate the motion of the system and to further interest in the project, a functional 3D printed version of the system was made, featuring the full mobility of the real system.
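
    A static force analysis of the kind described reduces, for a given engine attitude, to a small linear system balancing forces and moments. The sketch below is a simplified two-dimensional illustration with invented geometry and loads (one actuator and a gimbal pivot), not the Project Neo configuration; it shows how the reaction forces at the mounting points fall out of a single linear solve.

```python
# Simplified 2-D static force balance for a gimballed engine held by one
# actuator; geometry and loads are invented, not the Project Neo hardware.
import numpy as np

W = 2000.0                      # engine weight (N), acting at the centre of mass
cg = np.array([0.1, -0.5])      # centre of mass, relative to the gimbal (m)
attach = np.array([0.3, -0.4])  # actuator attachment point on the engine (m)
fixture = np.array([0.8, -0.1]) # actuator fixture on the test stand (m)

u = fixture - attach
u = u / np.linalg.norm(u)       # unit vector along the actuator axis

# Unknowns: actuator axial force F and gimbal reactions Rx, Ry.
# Rows: sum of Fx = 0, sum of Fy = 0, sum of moments about the gimbal = 0.
A = np.array([[u[0], 1.0, 0.0],
              [u[1], 0.0, 1.0],
              [attach[0] * u[1] - attach[1] * u[0], 0.0, 0.0]])
b = np.array([0.0, W, cg[0] * W])

F, Rx, Ry = np.linalg.solve(A, b)
print(f"actuator force {F:.0f} N, gimbal reaction ({Rx:.0f}, {Ry:.0f}) N")
```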

  20. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Code.

  1. Evaluation of the Community Multi-scale Air Quality (CMAQ) ...

    EPA Pesticide Factsheets

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions of the model that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version of CMAQ contains important bug fixes to several issues that were identified in CMAQv5.0.2 and additionally includes updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against surface and upper-air measurements available during the time period simulated. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, proces

  2. LANL* V2.0: global modeling and validation

    NASA Astrophysics Data System (ADS)

    Koller, J.; Zaharia, S.

    2011-03-01

    We describe in this paper the new version of LANL*. Just like the previous version, this new version V2.0 of LANL* is an artificial neural network (ANN) for calculating the magnetic drift invariant, L*, that is used for modeling radiation belt dynamics and for other space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit and the model can now be used for any type of orbit. (2) The new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits with an error of typically ΔL* < 0.2, which corresponds to an error of 3% at geosynchronous orbit. This new LANL-V2.0 artificial neural network is orders of magnitude faster than traditional numerical field line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
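
    The speed-up reported above comes from replacing an expensive numerical computation with a trained surrogate. The sketch below shows that general pattern on synthetic data using scikit-learn; the inputs, the target function, and the network size are all placeholders and have no connection to the actual LANL* model or the TS05 field model.

```python
# Sketch of the surrogate idea behind LANL* on synthetic data: fit a small
# neural network to precomputed (inputs -> L*) pairs so that later evaluations
# avoid an expensive field-line integration. Not the actual LANL* or TS05 model.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical inputs: position (3 values) plus a few magnetospheric driving parameters (4 values)
X = rng.uniform(-1.0, 1.0, size=(5000, 7))
# Stand-in for L* values that a slow numerical integration would produce
y = 4.0 + X[:, 0] - 0.5 * X[:, 1] * X[:, 3] + 0.2 * np.sin(3 * X[:, 2])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
ann.fit(X_train, y_train)

err = np.abs(ann.predict(X_test) - y_test)
print("mean |delta L*| on held-out samples:", err.mean())
```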

  3. SURE - SEMI-MARKOV UNRELIABILITY RANGE EVALUATOR (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. Traditional reliability analyses are based on aggregates of fault-handling and fault-occurrence models. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. Highly reliable systems employ redundancy and reconfiguration as methods of ensuring operation. When such systems are modeled stochastically, some state transitions are orders of magnitude faster than others; that is, fault recovery is usually faster than fault arrival. SURE takes these time differences into account. Slow transitions are described by exponential functions and fast transitions are modeled by either the White or Lee theorems based on means, variances, and percentiles. The user must assign identifiers to every state in the system and define all transitions in the semi-Markov model. SURE input statements are composed of variables and constants related by FORTRAN-like operators such as =, +, *, SIN, EXP, etc. There are a dozen major commands such as READ, READO, SAVE, SHOW, PRUNE, TRUNCate, CALCulator, and RUN. Once the state transitions have been defined, SURE calculates the upper and lower probability bounds for entering specified death states within a specified mission time. SURE output is tabular. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. SURE was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The VMS version (LAR13789) is written in PASCAL, C-language, and FORTRAN 77. The standard distribution medium for the VMS version of SURE is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun UNIX version (LAR14921) is written in ANSI C-language and PASCAL. 
An ANSI compliant C compiler is required in order to compile the C portion of this package. The standard distribution medium for the Sun version of SURE is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. SURE was developed in 1988 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. TEMPLATE is a registered trademark of Template Graphics Software, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. Sun3 and Sun4 are trademarks of Sun Microsystems, Inc.
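
    SURE's White and Lee theorems give algebraic bounds rather than a numerical integration, but the quantity being bounded (the probability of reaching a death state within a mission time) can be illustrated with a generic stiff ODE solve of the Kolmogorov forward equations. The sketch below uses a hypothetical three-state model with a slow fault-arrival rate and a fast recovery rate; it is not SURE's method, only a numerical reference point for the same kind of model.

```python
# Generic numerical illustration (not SURE's bounding theorems): integrate the
# Kolmogorov forward equations dp/dt = p @ Q for a hypothetical three-state
# model and report the death-state probability at mission time.
import numpy as np
from scipy.integrate import solve_ivp

LAMBDA = 1e-4   # assumed fault-arrival rate (slow transition), per hour
DELTA = 3.6e3   # assumed recovery rate (fast transition), per hour

# States: 0 = fully operational, 1 = one active fault (recovering), 2 = death state
Q = np.array([
    [-2 * LAMBDA,        2 * LAMBDA,    0.0],
    [      DELTA, -(DELTA + LAMBDA), LAMBDA],
    [        0.0,               0.0,    0.0],   # death state is absorbing
])

T = 10.0                         # mission time (hours)
p0 = np.array([1.0, 0.0, 0.0])   # start fully operational
sol = solve_ivp(lambda t, p: p @ Q, (0.0, T), p0,
                method="Radau", rtol=1e-10, atol=1e-14)   # implicit solver for the stiff system
print("P(death state by time T) ≈", sol.y[2, -1])
```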

  4. SURE - SEMI-MARKOV UNRELIABILITY RANGE EVALUATOR (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    The Semi-Markov Unreliability Range Evaluator, SURE, is an analysis tool for reconfigurable, fault-tolerant systems. Traditional reliability analyses are based on aggregates of fault-handling and fault-occurrence models. SURE provides an efficient means for calculating accurate upper and lower bounds for the death state probabilities for a large class of semi-Markov models, not just those which can be reduced to critical-pair architectures. The calculated bounds are close enough (usually within 5 percent of each other) for use in reliability studies of ultra-reliable computer systems. The SURE bounding theorems have algebraic solutions and are consequently computationally efficient even for large and complex systems. SURE can optionally regard a specified parameter as a variable over a range of values, enabling an automatic sensitivity analysis. Highly reliable systems employ redundancy and reconfiguration as methods of ensuring operation. When such systems are modeled stochastically, some state transitions are orders of magnitude faster than others; that is, fault recovery is usually faster than fault arrival. SURE takes these time differences into account. Slow transitions are described by exponential functions and fast transitions are modeled by either the White or Lee theorems based on means, variances, and percentiles. The user must assign identifiers to every state in the system and define all transitions in the semi-Markov model. SURE input statements are composed of variables and constants related by FORTRAN-like operators such as =, +, *, SIN, EXP, etc. There are a dozen major commands such as READ, READO, SAVE, SHOW, PRUNE, TRUNCate, CALCulator, and RUN. Once the state transitions have been defined, SURE calculates the upper and lower probability bounds for entering specified death states within a specified mission time. SURE output is tabular. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized on different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs are: SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923), PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. SURE was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The VMS version (LAR13789) is written in PASCAL, C-language, and FORTRAN 77. The standard distribution medium for the VMS version of SURE is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The Sun UNIX version (LAR14921) is written in ANSI C-language and PASCAL. 
An ANSI compliant C compiler is required in order to compile the C portion of this package. The standard distribution medium for the Sun version of SURE is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. SURE was developed in 1988 and last updated in 1992. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. TEMPLATE is a registered trademark of Template Graphics Software, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. Sun3 and Sun4 are trademarks of Sun Microsystems, Inc.

  5. Ada (Trade Name) Compiler Validation Summary Report: International Business Machines Corporation. IBM Development System for the Ada Language System, Version 1.1.0, IBM 4381 under VM/SP CMS Host, IBM 4381 under MVS Target

    DTIC Science & Technology

    1988-05-20

    AVF Control Number: AVF-VSR-84.1087. Ada® COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation IBM... System, Version 1.1.0, International Business Machines Corporation, Wright-Patterson AFB. IBM 4381 under VM/SP CMS, Release 3.6 (host) and IBM 4381... an IBM 4381 operating under MVS, Release 3.8. On-site testing was performed 18 May 1987 through 20 May 1987 at International Business Machines

  6. Sensitivity of Assimilated Tropical Tropospheric Ozone to the Meteorological Analyses

    NASA Technical Reports Server (NTRS)

    Hayashi, Hiroo; Stajner, Ivanka; Pawson, Steven; Thompson, Anne M.

    2002-01-01

    Tropical tropospheric ozone fields from two different experiments performed with an off-line ozone assimilation system developed in NASA's Data Assimilation Office (DAO) are examined. Assimilated ozone fields from the two experiments are compared with the collocated ozone profiles from the Southern Hemispheric Additional Ozonesondes (SHADOZ) network. Results are presented for 1998. The ozone assimilation system includes a chemistry-transport model, which uses analyzed winds from the Goddard Earth Observing System (GEOS) Data Assimilation System (DAS). The two experiments use wind fields from different versions of GEOS DAS: an operational version of the GEOS-2 system and a prototype of the GEOS-4 system. While both versions of the DAS utilize the Physical-space Statistical Analysis System and use comparable observations, they use entirely different general circulation models and data insertion techniques. The shape of the annual-mean vertical profile of the assimilated ozone fields is sensitive to the meteorological analyses, with the GEOS-4-based ozone being closest to the observations. This indicates that the resolved transport in GEOS-4 is more realistic than in GEOS-2. Remaining uncertainties include quantification of the representation of sub-grid-scale processes in the transport calculations, which plays an important role in the locations and seasons where convection dominates the transport.

  7. Simulation of n-qubit quantum systems. III. Quantum operations

    NASA Astrophysics Data System (ADS)

    Radtke, T.; Fritzsche, S.

    2007-05-01

    During the last decade, several quantum information protocols, such as quantum key distribution, teleportation or quantum computation, have attracted a lot of interest. Despite the recent success and research efforts in quantum information processing, however, we are just at the beginning of understanding the role of entanglement and the behavior of quantum systems in noisy environments, i.e. for nonideal implementations. Therefore, in order to facilitate the investigation of entanglement and decoherence in n-qubit quantum registers, here we present a revised version of the FEYNMAN program for working with quantum operations and their associated (Jamiołkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. Apart from the implementation of different noise models, the current program extension may help investigate the fragility of many quantum states, one of the main obstacles in realizing quantum information protocols today. Program summary: Title of program: Feynman. Catalogue identifier: ADWE_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWE_v3_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Licensing provisions: None. Operating systems: Any system that supports MAPLE; tested under Microsoft Windows XP, SuSe Linux 10. Program language used: MAPLE 10. Typical time and memory requirements: Most commands that act upon quantum registers with five or fewer qubits take ⩽10 seconds of processor time (on a Pentium 4 processor with ⩾2 GHz or equivalent) and 5-20 MB of memory. Especially when working with symbolic expressions, however, the memory and time requirements critically depend on the number of qubits in the quantum registers, owing to the exponential dimension growth of the associated Hilbert space. For example, complex (symbolic) noise models (with several Kraus operators) for multi-qubit systems often result in very large symbolic expressions that dramatically slow down the evaluation of measures or other quantities. In these cases, MAPLE's assume facility sometimes helps to reduce the complexity of symbolic expressions, but often only numerical evaluation is possible. Since the complexity of the FEYNMAN commands is very different, no general scaling law for the CPU time and memory usage can be given. No. of bytes in distributed program including test data, etc.: 799 265. No. of lines in distributed program including test data, etc.: 18 589. Distribution format: tar.gz. Reasons for new version: While the previous program versions were designed mainly to create and manipulate the state of quantum registers, the present extension aims to support quantum operations as the essential ingredient for studying the effects of noisy environments. Does this version supersede the previous version: Yes. Nature of the physical problem: Today, entanglement is identified as the essential resource in virtually all aspects of quantum information theory. In most practical implementations of quantum information protocols, however, decoherence typically limits the lifetime of entanglement. It is therefore necessary and highly desirable to understand the evolution of entanglement in noisy environments.
Method of solution: Using the computer algebra system MAPLE, we have developed a set of procedures that support the definition and manipulation of n-qubit quantum registers as well as (unitary) logic gates and (nonunitary) quantum operations that act on the quantum registers. The provided hierarchy of commands can be used interactively in order to simulate and analyze the evolution of n-qubit quantum systems in ideal and nonideal quantum circuits.
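
    The quantum operations handled by the extension are conventionally written in the Kraus (operator-sum) form rho -> sum_k E_k rho E_k†. The sketch below applies a standard single-qubit depolarizing channel to a density matrix with plain numpy; it illustrates the mathematical object, not the FEYNMAN/MAPLE command set.

```python
# Minimal illustration of a Kraus-operator quantum operation (a single-qubit
# depolarizing channel) applied to a density matrix; not the FEYNMAN/MAPLE API.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarizing_kraus(p):
    """Kraus operators E_k with sum_k E_k^dagger E_k = I (completeness)."""
    return [np.sqrt(1 - 3 * p / 4) * I,
            np.sqrt(p / 4) * X,
            np.sqrt(p / 4) * Y,
            np.sqrt(p / 4) * Z]

def apply_channel(rho, kraus_ops):
    """Operator-sum form: rho -> sum_k E_k rho E_k^dagger."""
    return sum(E @ rho @ E.conj().T for E in kraus_ops)

rho = np.array([[1, 0], [0, 0]], dtype=complex)        # pure |0><0| state
rho_noisy = apply_channel(rho, depolarizing_kraus(0.1))
print(np.round(rho_noisy, 4))                           # populations drift toward the mixed state
print("trace =", np.trace(rho_noisy).real)              # completeness preserves the trace
```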

  8. A Personalization Effect in Multimedia Learning: Students Learn Better When Words Are in Conversational Style Rather Than Formal Style

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Fennell, Sherry; Farmer, Lindsay; Campbell, Julie

    2004-01-01

    Students received a personalized or nonpersonalized version of a narrated animation explaining how the human respiratory system works. The narration for the nonpersonalized version was in formal style, whereas the narration for the personalized version was in conversational style in which "the" was changed to "your" in 12 places. In 3 experiments,…

  9. The Carbon-Land Model Intercomparison Project (C-LAMP): A Model-Data Comparison System for Evaluation of Coupled Biosphere-Atmosphere Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forrest M; Randerson, Jim; Thornton, Peter E

    2009-01-01

    The need to capture important climate feedbacks in general circulation models (GCMs) has resulted in new efforts to include atmospheric chemistry and land and ocean biogeochemistry into the next generation of production climate models, now often referred to as Earth System Models (ESMs). While many terrestrial and ocean carbon models have been coupled to GCMs, recent work has shown that such models can yield a wide range of results, suggesting that a more rigorous set of offline and partially coupled experiments, along with detailed analyses of processes and comparisons with measurements, is warranted. The Carbon-Land Model Intercomparison Project (C-LAMP) provides a simulation protocol and model performance metrics based upon comparisons against best-available satellite- and ground-based measurements (Hoffman et al., 2007). C-LAMP provides feedback to the modeling community regarding model improvements and to the measurement community by suggesting new observational campaigns. C-LAMP Experiment 1 consists of a set of uncoupled simulations of terrestrial carbon models specifically designed to examine the ability of the models to reproduce surface carbon and energy fluxes at multiple sites and to exhibit the influence of climate variability, prescribed atmospheric carbon dioxide (CO2), nitrogen (N) deposition, and land cover change on projections of terrestrial carbon fluxes during the 20th century. Experiment 2 consists of partially coupled simulations of the terrestrial carbon model with an active atmosphere model exchanging energy and moisture fluxes. In all experiments, atmospheric CO2 follows the prescribed historical trajectory from C4MIP. In Experiment 2, the atmosphere model is forced with prescribed sea surface temperatures (SSTs) and corresponding sea ice concentrations from the Hadley Centre; prescribed CO2 is radiatively active; and land, fossil fuel, and ocean CO2 fluxes are advected by the model. Both sets of experiments have been performed using two different terrestrial biogeochemistry modules coupled to the Community Land Model version 3 (CLM3) in the Community Climate System Model version 3 (CCSM3): the CASA model of Fung et al. and the carbon-nitrogen (CN) model of Thornton. Comparisons against AmeriFlux site measurements, MODIS satellite observations, NOAA flask records, TRANSCOM inversions, Free Air CO2 Enrichment (FACE) site measurements, and other datasets have been performed and are described in Randerson et al. (2009). The C-LAMP diagnostics package was used to validate improvements to CASA and CN for use in the next generation model, CLM4. It is hoped that this effort will serve as a prototype for an international carbon-cycle model benchmarking activity for models being used for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report. More information about C-LAMP, the experimental protocol, performance metrics, output standards, and model-data comparisons from the CLM3-CASA and CLM3-CN models are available at http://www.climatemodeling.org/c-lamp.

  10. Data set for phylogenetic tree and RAMPAGE Ramachandran plot analysis of SODs in Gossypium raimondii and G. arboreum.

    PubMed

    Wang, Wei; Xia, Minxuan; Chen, Jie; Deng, Fenni; Yuan, Rui; Zhang, Xiaopei; Shen, Fafu

    2016-12-01

    The data presented in this paper support the research article "Genome-Wide Analysis of Superoxide Dismutase Gene Family in Gossypium raimondii and G. arboreum" [1]. In this data article, we present a phylogenetic tree showing a dichotomy with two different clusters of SODs inferred by the Bayesian method of MrBayes (version 3.2.4), "Bayesian phylogenetic inference under mixed models" [2], Ramachandran plots of G. raimondii and G. arboreum SODs, the protein sequences used to generate the 3D structures of the proteins and the template accessions obtained via the SWISS-MODEL server, "SWISS-MODEL: modelling protein tertiary and quaternary structure using evolutionary information." [3], and motif sequences of SODs identified by InterProScan (version 4.8) with the Pfam database, "Pfam: the protein families database" [4].

  11. FPL-PELPS : a price endogenous linear programming system for economic modeling, supplement to PELPS III, version 1.1.

    Treesearch

    Patricia K. Lebow; Henry Spelter; Peter J. Ince

    2003-01-01

    This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...

  12. PCACE-Personal-Computer-Aided Cabling Engineering

    NASA Technical Reports Server (NTRS)

    Billitti, Joseph W.

    1987-01-01

    The PCACE computer program was developed to provide an inexpensive, interactive system for learning and using an engineering approach to interconnection systems. It is basically a database system that stores information as files of individual connectors and handles wiring information in circuit groups stored as records. It directly emulates typical manual engineering methods of handling data, thus making the interface between user and program very natural. The Apple version is written in P-Code Pascal and the IBM PC version of PCACE is written in TURBO Pascal 3.0.

  13. AEOSS runtime manual for system analysis on Advanced Earth-Orbital Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Lee, Hwa-Ping

    1990-01-01

    The Advanced Earth-Orbital Spacecraft System (AEOSS) program enables users to project the required power, weight, and cost for a generic earth-orbital spacecraft system. These variables are calculated at the component and subsystem levels, and then at the system level. The six subsystems included are electric power, thermal control, structure, auxiliary propulsion, attitude control, and communication, command, and data handling. Costs are computed using statistically determined models derived from previously flown spacecraft, which were categorized into classes according to their functions and structural complexity. Selected design and performance analyses for essential components and subsystems are also provided. AEOSS permits a user to enter known values of these parameters, in whole or in part, at all levels. All of this information is of vital importance to project managers of subsystems or of a spacecraft system. AEOSS is specially tailored software built on ACIUS' 4th Dimension relational database program for the Macintosh. Because of the licensing agreements, two versions of the AEOSS documents were prepared. This version, the AEOSS Runtime Manual, is permitted to be distributed with a finite number of copies of the restricted 4D Runtime version. It can perform all contained applications without any programming alterations.

  14. A Global Data Assimilation System for Atmospheric Aerosol

    NASA Technical Reports Server (NTRS)

    daSilva, Arlindo

    1999-01-01

    We will give an overview of an aerosol data assimilation system which combines advances in remote sensing of atmospheric aerosols, aerosol modeling and data assimilation methodology to produce high spatial and temporal resolution 3D aerosol fields. Initially, the Goddard Aerosol Assimilation System (GAAS) will assimilate TOMS, AVHRR and AERONET observations; later we will include MODIS and MISR. This data assimilation capability will allow us to integrate complementary aerosol observations from these platforms, enabling the development of an assimilated aerosol climatology as well as a global aerosol forecasting system in support of field campaigns. Furthermore, this system provides an interactive retrieval framework for each of the aerosol-observing satellites, in particular TOMS and AVHRR. The Goddard Aerosol Assimilation System (GAAS) takes advantage of recent advances in constituent data assimilation at DAO, including flow-dependent parameterizations of error covariances and the proper consideration of model bias. For its prognostic transport model, GAAS will utilize the Goddard Ozone, Chemistry, Aerosol, Radiation and Transport (GOCART) model developed at NASA/GSFC Codes 916 and 910.3. GOCART includes the Lin-Rood flux-form, semi-Lagrangian transport model with parameterized aerosol chemistry and physical processes for absorbing (dust and black carbon) and non-absorbing aerosols (sulfate and organic carbon). Observations and model fields are combined using a constituent version of DAO's Physical-space Statistical Analysis System (PSAS), including its adaptive quality control system. In this talk we describe the main components of this assimilation system and present preliminary results obtained by assimilating TOMS data.

  15. Guide for Regional Integrated Assessments: Handbook of Methods and Procedures, Version 5.1. Appendix 1

    NASA Technical Reports Server (NTRS)

    Rosenzweig, Cynthia E.; Jones, James W.; Hatfield, Jerry; Antle, John; Ruane, Alex; Boote, Ken; Thorburn, Peter; Valdivia, Roberto; Porter, Cheryl; Janssen, Sander

    2015-01-01

    The purpose of this handbook is to describe recommended methods for a trans-disciplinary, systems-based approach for regional-scale (local to national scale) integrated assessment of agricultural systems under future climate, bio-physical and socio-economic conditions. An earlier version of this Handbook was developed and used by several AgMIP Regional Research Teams (RRTs) in Sub-Saharan Africa (SSA) and South Asia (SA)(AgMIP handbook version 4.2, www.agmip.org/regional-integrated-assessments-handbook/). In contrast to the earlier version, which was written specifically to guide a consistent set of integrated assessments across SSA and SA, this version is intended to be more generic such that the methods can be applied to any region globally. These assessments are the regional manifestation of research activities described by AgMIP in its online protocols document (available at www.agmip.org). AgMIP Protocols were created to guide climate, crop modeling, economics, and information technology components of its projects.

  16. Translation and integration of CCC nursing diagnoses into ICNP.

    PubMed

    Matney, Susan A; DaDamio, Rebecca; Couderc, Carmela; Dlugos, Mary; Evans, Jonathan; Gianonne, Gay; Haskell, Robert; Hardiker, Nicholas; Coenen, Amy; Saba, Virginia K

    2008-01-01

    The purpose of this study was to translate and integrate nursing diagnosis concepts from the Clinical Care Classification (CCC) System Version 2.0 to Diagnostic Phenomenon or nursing diagnostic statements in the International Classification for Nursing Practice (ICNP) Version 1.0. Source concepts for CCC were mapped by the project team, where possible, to pre-coordinated ICNP terms. The manual decomposition of source concepts according to the ICNP 7-Axis Model served to validate the mappings. A total of 62% of the CCC Nursing Diagnoses were a pre-coordinated match to an ICNP concept, 35% were a post-coordinated match and only 3% had no match. During the mapping process, missing CCC concepts were submitted to the ICNP Programme, with a recommendation for inclusion in future releases.

  17. The effects of the Union for International Cancer Control/American Joint Committee on Cancer Tumour, Node, Metastasis system version 8 on staging of differentiated thyroid cancer: a comparison to version 7.

    PubMed

    Verburg, Frederik A; Mäder, Uwe; Luster, Markus; Reiners, Christoph

    2018-06-01

    To assess the changes resulting from the transition from UICC/AJCC TNM version 7 to version 8 and to subsequently determine whether TNM version 8 is an improvement compared to previous iterations of the TNM system and other staging systems for differentiated thyroid cancer (DTC) with regard to prognostic power. Database study of DTC patients treated in our centre from 1978 up to and including 1 July 2014. Results were compared to our previous comparison of prognostic systems using the same data set. 2257 DTC patients. Staging in accordance with TNM 7 and TNM 8. Thyroid cancer-specific mortality; comparison was based on p-values of univariate Cox regression analyses as well as analysis of the proportion of variance explained (PVE). There is a redistribution from stage 3 to lower stages affecting 206 (9.1%) patients. DTC-related mortality according to Kaplan-Meier for younger and older patients in TNM 7 had a slightly lower prognostic power than that in accordance with TNM 8 (P = 8.0 × 10⁻¹⁶ and P = 1.5 × 10⁻²¹, respectively). Overall staging is lower in 627/2257 (27.8%) patients. PVE (TNM 7: 0.29; TNM 8: 0.28) and the P-value of Cox regressions (TNM 7: P = 7.1 × 10⁻⁵²; TNM 8: P = 3.9 × 10⁻⁴⁹) for TNM version 8 are marginally lower than those for TNM version 7, but still better than for any other DTC staging system. TNM 8 results in a marked downstaging of patients compared to TNM 7. Although some changes, like the change in age boundary, appear to be associated with an improvement in prognostic power, the overall effect of the changes does not improve the predictive power compared to TNM 7. © 2018 John Wiley & Sons Ltd.

  18. Joint inversion of 3-PG using eddy-covariance and inventory plot measurements in temperate-maritime conifer forests: Uncertainty in transient carbon-balance responses to climate change

    NASA Astrophysics Data System (ADS)

    Hember, R. A.; Kurz, W. A.; Coops, N. C.; Black, T. A.

    2010-12-01

    Temperate-maritime forests of coastal British Columbia store large amounts of carbon (C) in soil, detritus, and trees. To better understand the sensitivity of these C stocks to climate variability, simulations were conducted using a hybrid version of the model, Physiological Principles Predicting Growth (3-PG), combined with algorithms from the Carbon Budget Model of the Canadian Forest Sector - version 3 (CBM-CFS3) to account for full ecosystem C dynamics. The model was optimized based on a combination of monthly CO2 and H2O flux measurements derived from three eddy-covariance systems and multi-annual stemwood growth (Gsw) and mortality (Msw) derived from 1300 permanent sample plots by means of Markov chain Monte Carlo sampling. The calibrated model serves as an unbiased estimator of stemwood C with enhanced precision over that of strictly-empirical models, minimized reliance on local prescriptions, and the flexibility to study impacts of environmental change on regional C stocks. We report the contribution of each dataset in identifying key physiological parameters and the posterior uncertainty in predictions of net ecosystem production (NEP). The calibrated model was used to spin up pre-industrial C pools and estimate the sensitivity of regional net carbon balance to a gradient of temperature changes, λ=ΔC/ΔT, during three 62-year harvest rotations, spanning 1949-2135. Simulations suggest that regional net primary production, tree mortality, and heterotrophic respiration all began increasing, while NEP began decreasing in response to warming following the 1976 shift in northeast-Pacific climate. We quantified the uncertainty of λ and how it was mediated by initial dead C, tree mortality, precipitation change, and the time horizon in which it was calculated.
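
    The parameter estimation step described above rests on Markov chain Monte Carlo sampling of a posterior built from model-data mismatch. The sketch below shows the bare Metropolis-Hastings mechanics on a one-parameter toy model with synthetic observations; it is only a schematic of the approach, not the 3-PG/CBM-CFS3 calibration or its flux and inventory data.

```python
# Bare Metropolis-Hastings sketch of an MCMC calibration; the toy model and
# synthetic "observations" are placeholders, not 3-PG or the plot/flux data.
import numpy as np

rng = np.random.default_rng(1)

def model(theta, x):
    """Toy growth response: a one-parameter saturating curve."""
    return theta * x / (1.0 + x)

x_obs = np.linspace(0.1, 5.0, 40)
y_obs = model(2.0, x_obs) + rng.normal(0.0, 0.1, x_obs.size)  # truth: theta = 2.0

def log_posterior(theta, sigma=0.1):
    if theta <= 0:                         # flat prior on theta > 0
        return -np.inf
    resid = y_obs - model(theta, x_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta, logp, chain = 1.0, log_posterior(1.0), []
for _ in range(20000):                     # random-walk Metropolis sampling
    prop = theta + rng.normal(0.0, 0.05)
    logp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        theta, logp = prop, logp_prop
    chain.append(theta)

posterior = np.array(chain[5000:])         # discard burn-in
print("posterior mean:", posterior.mean(), "+/-", posterior.std())
```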

  19. PLOT3D/AMES, GENERIC UNIX VERSION USING DISSPLA (WITH TURB3D)

    NASA Technical Reports Server (NTRS)

    Buning, P.

    1994-01-01

    PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into five groups: 1) Grid Functions for grids, grid-checking, etc.; 2) Scalar Functions for contour or carpet plots of density, pressure, temperature, Mach number, vorticity magnitude, helicity, etc.; 3) Vector Functions for vector plots of velocity, vorticity, momentum, and density gradient, etc.; 4) Particle Trace Functions for rake-like plots of particle flow or vortex lines; and 5) Shock locations based on pressure gradient. TURB3D is a modification of PLOT3D which is used for viewing CFD simulations of incompressible turbulent flow. Input flow data consists of pressure, velocity and vorticity. Typical quantities to plot include local fluctuations in flow quantities and turbulent production terms, plotted in physical or wall units. PLOT3D/TURB3D includes both TURB3D and PLOT3D because the operation of TURB3D is identical to PLOT3D, and there is no additional sample data or printed documentation for TURB3D. Graphical capabilities of PLOT3D version 3.6b+ vary among the implementations available through COSMIC. Customers are encouraged to purchase and carefully review the PLOT3D manual before ordering the program for a specific computer and graphics library. There is only one manual for use with all implementations of PLOT3D, and although this manual generally assumes that the Silicon Graphics Iris implementation is being used, informative comments concerning other implementations appear throughout the text. 
With all implementations, the visual representation of the object and flow field created by PLOT3D consists of points, lines, and polygons. Points can be represented with dots or symbols, color can be used to denote data values, and perspective is used to show depth. Differences among implementations impact the program's ability to use graphical features that are based on 3D polygons, the user's ability to manipulate the graphical displays, and the user's ability to obtain alternate forms of output. The UNIX/DISSPLA implementation of PLOT3D supports 2-D polygons as well as 2-D and 3-D lines, but does not support graphics features requiring 3-D polygons (shading and hidden line removal, for example). Views can be manipulated using keyboard commands. This version of PLOT3D is potentially able to produce files for a variety of output devices; however, site-specific capabilities will vary depending on the device drivers supplied with the user's DISSPLA library. The version 3.6b+ UNIX/DISSPLA implementations of PLOT3D (ARC-12788) and PLOT3D/TURB3D (ARC-12778) were developed for use on computers running UNIX SYSTEM 5 with BSD 4.3 extensions. The standard distribution medium for each of these programs is a 9-track, 6250 BPI magnetic tape in TAR format. Customers purchasing one implementation version of PLOT3D or PLOT3D/TURB3D will be given a $200 discount on each additional implementation version ordered at the same time. Version 3.6b+ of PLOT3D and PLOT3D/TURB3D are also supported for the following computers and graphics libraries: (1) generic UNIX Supercomputer and IRIS, suitable for CRAY 2/UNICOS, CONVEX, Alliant with remote IRIS 2xxx/3xxx or IRIS 4D (ARC-12779, ARC-12784); (2) Silicon Graphics IRIS 2xxx/3xxx or IRIS 4D (ARC-12783, ARC-12782); (3) VAX computers running VMS Version 5.0 and DISSPLA Version 11.0 (ARC-12777, ARC-12781); and (4) Apollo computers running UNIX and GMR3D Version 2.0 (ARC-12789, ARC-12785, which have no capability to put text on plots). Silicon Graphics Iris, IRIS 4D, and IRIS 2xxx/3xxx are trademarks of Silicon Graphics Incorporated. VAX and VMS are trademarks of Digital Equipment Corporation. DISSPLA is a trademark of Computer Associates. CRAY 2 and UNICOS are trademarks of CRAY Research, Incorporated. CONVEX is a trademark of Convex Computer Corporation. Alliant is a trademark of Alliant. Apollo and GMR3D are trademarks of Hewlett-Packard, Incorporated. System 5 is a trademark of Bell Labs, Incorporated. BSD4.3 is a trademark of the University of California at Berkeley. UNIX is a registered trademark of AT&T.

  20. PLOT3D/AMES, GENERIC UNIX VERSION USING DISSPLA (WITHOUT TURB3D)

    NASA Technical Reports Server (NTRS)

    Buning, P.

    1994-01-01

    PLOT3D is an interactive graphics program designed to help scientists visualize computational fluid dynamics (CFD) grids and solutions. Today, supercomputers and CFD algorithms can provide scientists with simulations of such highly complex phenomena that obtaining an understanding of the simulations has become a major problem. Tools which help the scientist visualize the simulations can be of tremendous aid. PLOT3D/AMES offers more functions and features, and has been adapted for more types of computers than any other CFD graphics program. Version 3.6b+ is supported for five computers and graphic libraries. Using PLOT3D, CFD physicists can view their computational models from any angle, observing the physics of problems and the quality of solutions. As an aid in designing aircraft, for example, PLOT3D's interactive computer graphics can show vortices, temperature, reverse flow, pressure, and dozens of other characteristics of air flow during flight. As critical areas become obvious, they can easily be studied more closely using a finer grid. PLOT3D is part of a computational fluid dynamics software cycle. First, a program such as 3DGRAPE (ARC-12620) helps the scientist generate computational grids to model an object and its surrounding space. Once the grids have been designed and parameters such as the angle of attack, Mach number, and Reynolds number have been specified, a "flow-solver" program such as INS3D (ARC-11794 or COS-10019) solves the system of equations governing fluid flow, usually on a supercomputer. Grids sometimes have as many as two million points, and the "flow-solver" produces a solution file which contains density, x- y- and z-momentum, and stagnation energy for each grid point. With such a solution file and a grid file containing up to 50 grids as input, PLOT3D can calculate and graphically display any one of 74 functions, including shock waves, surface pressure, velocity vectors, and particle traces. PLOT3D's 74 functions are organized into five groups: 1) Grid Functions for grids, grid-checking, etc.; 2) Scalar Functions for contour or carpet plots of density, pressure, temperature, Mach number, vorticity magnitude, helicity, etc.; 3) Vector Functions for vector plots of velocity, vorticity, momentum, and density gradient, etc.; 4) Particle Trace Functions for rake-like plots of particle flow or vortex lines; and 5) Shock locations based on pressure gradient. TURB3D is a modification of PLOT3D which is used for viewing CFD simulations of incompressible turbulent flow. Input flow data consists of pressure, velocity and vorticity. Typical quantities to plot include local fluctuations in flow quantities and turbulent production terms, plotted in physical or wall units. PLOT3D/TURB3D includes both TURB3D and PLOT3D because the operation of TURB3D is identical to PLOT3D, and there is no additional sample data or printed documentation for TURB3D. Graphical capabilities of PLOT3D version 3.6b+ vary among the implementations available through COSMIC. Customers are encouraged to purchase and carefully review the PLOT3D manual before ordering the program for a specific computer and graphics library. There is only one manual for use with all implementations of PLOT3D, and although this manual generally assumes that the Silicon Graphics Iris implementation is being used, informative comments concerning other implementations appear throughout the text. 
With all implementations, the visual representation of the object and flow field created by PLOT3D consists of points, lines, and polygons. Points can be represented with dots or symbols, color can be used to denote data values, and perspective is used to show depth. Differences among implementations impact the program's ability to use graphical features that are based on 3D polygons, the user's ability to manipulate the graphical displays, and the user's ability to obtain alternate forms of output. The UNIX/DISSPLA implementation of PLOT3D supports 2-D polygons as well as 2-D and 3-D lines, but does not support graphics features requiring 3-D polygons (shading and hidden line removal, for example). Views can be manipulated using keyboard commands. This version of PLOT3D is potentially able to produce files for a variety of output devices; however, site-specific capabilities will vary depending on the device drivers supplied with the user's DISSPLA library. The version 3.6b+ UNIX/DISSPLA implementations of PLOT3D (ARC-12788) and PLOT3D/TURB3D (ARC-12778) were developed for use on computers running UNIX SYSTEM 5 with BSD 4.3 extensions. The standard distribution medium for each of these programs is a 9-track, 6250 BPI magnetic tape in TAR format. Customers purchasing one implementation version of PLOT3D or PLOT3D/TURB3D will be given a $200 discount on each additional implementation version ordered at the same time. Version 3.6b+ of PLOT3D and PLOT3D/TURB3D are also supported for the following computers and graphics libraries: (1) generic UNIX Supercomputer and IRIS, suitable for CRAY 2/UNICOS, CONVEX, Alliant with remote IRIS 2xxx/3xxx or IRIS 4D (ARC-12779, ARC-12784); (2) Silicon Graphics IRIS 2xxx/3xxx or IRIS 4D (ARC-12783, ARC-12782); (3) VAX computers running VMS Version 5.0 and DISSPLA Version 11.0 (ARC-12777, ARC-12781); and (4) Apollo computers running UNIX and GMR3D Version 2.0 (ARC-12789, ARC-12785, which have no capability to put text on plots). Silicon Graphics Iris, IRIS 4D, and IRIS 2xxx/3xxx are trademarks of Silicon Graphics Incorporated. VAX and VMS are trademarks of Digital Equipment Corporation. DISSPLA is a trademark of Computer Associates. CRAY 2 and UNICOS are trademarks of CRAY Research, Incorporated. CONVEX is a trademark of Convex Computer Corporation. Alliant is a trademark of Alliant. Apollo and GMR3D are trademarks of Hewlett-Packard, Incorporated. System 5 is a trademark of Bell Labs, Incorporated. BSD4.3 is a trademark of the University of California at Berkeley. UNIX is a registered trademark of AT&T.

  1. SBML Level 3 package: Groups, Version 1 Release 1

    PubMed Central

    Hucka, Michael; Smith, Lucian P.

    2017-01-01

    Biological models often contain components that have relationships with each other, or that modelers want to treat as belonging to groups with common characteristics or shared metadata. The SBML Level 3 Version 1 Core specification does not provide an explicit mechanism for expressing such relationships, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Groups package for SBML Level 3 adds the necessary features to SBML to allow grouping of model components to be expressed. Such groups do not affect the mathematical interpretation of a model, but they do provide a way to add information that can be useful for modelers and software tools. The SBML Groups package enables a modeler to include definitions of groups and nested groups, each of which may be annotated to convey why that group was created and what it represents. PMID:28187406
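    For orientation only, the sketch below uses Python's standard xml.etree.ElementTree to emit a fragment resembling what the Groups package adds to an SBML Level 3 Version 1 model: a listOfGroups holding one group whose members refer to existing model elements by idRef. The element names, attribute names, and namespace URI are written from memory of the specification and should be checked against the Groups package document; the group and species identifiers are invented for the example.

        import xml.etree.ElementTree as ET

        # Namespace URI assumed for Groups Version 1; verify against the specification.
        GROUPS_NS = "http://www.sbml.org/sbml/level3/version1/groups/version1"
        ET.register_namespace("groups", GROUPS_NS)

        def g(tag):
            return "{%s}%s" % (GROUPS_NS, tag)

        list_of_groups = ET.Element(g("listOfGroups"))
        group = ET.SubElement(list_of_groups, g("group"),
                              {g("id"): "glycolysis", g("kind"): "classification"})
        members = ET.SubElement(group, g("listOfMembers"))
        for species_id in ("ATP", "ADP", "glucose"):   # hypothetical model ids
            ET.SubElement(members, g("member"), {g("idRef"): species_id})

        print(ET.tostring(list_of_groups, encoding="unicode"))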

  2. CFD Modeling of Helium Pressurant Effects on Cryogenic Tank Pressure Rise Rates in Normal Gravity

    NASA Technical Reports Server (NTRS)

    Grayson, Gary; Lopez, Alfredo; Chandler, Frank; Hastings, Leon; Hedayat, Ali; Brethour, James

    2007-01-01

    A recently developed computational fluid dynamics modeling capability for cryogenic tanks is used to simulate both self-pressurization from external heating and depressurization from thermodynamic vent operation. Axisymmetric models using a modified version of the commercially available FLOW-3D software are used to simulate actual physical tests. The models assume an incompressible liquid phase with density that is a function of temperature only. A fully compressible formulation is used for the ullage gas mixture, which contains both condensable vapor and a noncondensable gas component. The tests, conducted at the NASA Marshall Space Flight Center, include both liquid hydrogen and nitrogen in tanks with ullage gas mixtures of each liquid's vapor and helium. Pressure and temperature predictions from the model are compared to sensor measurements from the tests, and good agreement is achieved. This further establishes the accuracy of the developed FLOW-3D-based modeling approach for cryogenic systems.
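    For orientation on the magnitudes involved, the following back-of-envelope sketch (Python) treats a closed, constant-volume ullage as a single ideal gas that absorbs all of the external heat leak, so that dp/dt = (gamma - 1) * Qdot / V. This lumped estimate deliberately ignores condensation, liquid heating, stratification, and the helium/vapor split that the FLOW-3D models resolve, and the numerical inputs are placeholders rather than values from the tests.

        # Lumped ideal-gas ullage: U = p*V/(gamma - 1), so at constant volume
        # dU/dt = Qdot  =>  dp/dt = (gamma - 1) * Qdot / V.
        GAMMA_ULLAGE = 1.66   # monatomic value, roughly appropriate if helium dominates (assumed)
        Q_DOT = 20.0          # W, placeholder heat leak into the ullage
        V_ULLAGE = 0.5        # m^3, placeholder ullage volume

        dp_dt = (GAMMA_ULLAGE - 1.0) * Q_DOT / V_ULLAGE   # Pa/s
        print("Pressure rise rate: %.2f Pa/s (%.1f kPa/hr)" % (dp_dt, dp_dt * 3.6))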

  3. PAWS/STEM - PADE APPROXIMATION WITH SCALING AND SCALED TAYLOR EXPONENTIAL MATRIX (VAX VMS VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    Traditional fault-tree techniques for analyzing the reliability of large, complex systems fail to model the dynamic reconfiguration capabilities of modern computer systems. Markov models, on the other hand, can describe fault-recovery (via system reconfiguration) as well as fault-occurrence. The Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs provide a flexible, user-friendly, language-based interface for the creation and evaluation of Markov models describing the behavior of fault-tolerant reconfigurable computer systems. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. The calculation of the probability of entering a death state of a Markov model (representing system failure) requires the solution of a set of coupled differential equations. Because of the large disparity between the rates of fault arrivals and system recoveries, Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized in different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs is: the SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923); the PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. PAWS/STEM was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The package is written in PASCAL, ANSI-compliant C, and FORTRAN 77. The standard distribution medium for the VMS version of PAWS/STEM (LAR-14165) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of PAWS/STEM (LAR-14920) is a 0.25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. PAWS/STEM was developed in 1989 and last updated in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
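    To illustrate the kind of calculation PAWS and STEM automate, the sketch below evaluates the death-state probability of a small, numerically stiff Markov model (fast recovery, slow fault arrival) by exponentiating the generator matrix in Python; scipy.linalg.expm uses a Padé approximation with scaling and squaring, the same family of method as PAWS. The three-state model and its rates are invented for the example and are not taken from the SURE/ASSIST documentation.

        import numpy as np
        from scipy.linalg import expm

        LAMBDA = 1e-4   # fault arrival rate, per hour (slow)
        DELTA = 3.6e3   # recovery rate, per hour (fast) -> widely separated rates, stiff model

        # States: 0 = operational, 1 = fault being handled, 2 = failed (absorbing death state).
        # Generator matrix Q: each row sums to zero.
        Q = np.array([[-LAMBDA,            LAMBDA,    0.0   ],
                      [ DELTA,   -(DELTA + LAMBDA),   LAMBDA],
                      [ 0.0,               0.0,       0.0   ]])

        p0 = np.array([1.0, 0.0, 0.0])    # start fully operational
        T = 10.0                           # mission time, hours
        p_T = p0 @ expm(Q * T)             # transient state probabilities at time T
        print("Probability of system failure by T: %.3e" % p_T[2])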

  4. PAWS/STEM - PADE APPROXIMATION WITH SCALING AND SCALED TAYLOR EXPONENTIAL MATRIX (SUN VERSION)

    NASA Technical Reports Server (NTRS)

    Butler, R. W.

    1994-01-01

    Traditional fault-tree techniques for analyzing the reliability of large, complex systems fail to model the dynamic reconfiguration capabilities of modern computer systems. Markov models, on the other hand, can describe fault-recovery (via system reconfiguration) as well as fault-occurrence. The Pade Approximation with Scaling (PAWS) and Scaled Taylor Exponential Matrix (STEM) programs provide a flexible, user-friendly, language-based interface for the creation and evaluation of Markov models describing the behavior of fault-tolerant reconfigurable computer systems. PAWS and STEM produce exact solutions for the probability of system failure and provide a conservative estimate of the number of significant digits in the solution. The calculation of the probability of entering a death state of a Markov model (representing system failure) requires the solution of a set of coupled differential equations. Because of the large disparity between the rates of fault arrivals and system recoveries, Markov models of fault-tolerant architectures inevitably lead to numerically stiff differential equations. Both PAWS and STEM have the capability to solve numerically stiff models. These complementary programs use separate methods to determine the matrix exponential in the solution of the model's system of differential equations. In general, PAWS is better suited to evaluate small and dense models. STEM operates at lower precision, but works faster than PAWS for larger models. The mathematical approach chosen to solve a reliability problem may vary with the size and nature of the problem. Although different solution techniques are utilized in different programs, it is possible to have a common input language. The Systems Validation Methods group at NASA Langley Research Center has created a set of programs that form the basis for a reliability analysis workstation. The set of programs is: the SURE reliability analysis program (COSMIC program LAR-13789, LAR-14921); the ASSIST specification interface program (LAR-14193, LAR-14923); the PAWS/STEM reliability analysis programs (LAR-14165, LAR-14920); and the FTC fault tree tool (LAR-14586, LAR-14922). FTC is used to calculate the top-event probability for a fault tree. PAWS/STEM and SURE are programs which interpret the same SURE language, but utilize different solution methods. ASSIST is a preprocessor that generates SURE language from a more abstract definition. SURE, ASSIST, and PAWS/STEM are also offered as a bundle. Please see the abstract for COS-10039/COS-10041, SARA - SURE/ASSIST Reliability Analysis Workstation, for pricing details. PAWS/STEM was originally developed for DEC VAX series computers running VMS and was later ported for use on Sun computers running SunOS. The package is written in PASCAL, ANSI-compliant C, and FORTRAN 77. The standard distribution medium for the VMS version of PAWS/STEM (LAR-14165) is a 9-track 1600 BPI magnetic tape in VMSINSTAL format. It is also available on a TK50 tape cartridge in VMSINSTAL format. Executables are included. The standard distribution medium for the Sun version of PAWS/STEM (LAR-14920) is a 0.25 inch streaming magnetic tape cartridge in UNIX tar format. Both Sun3 and Sun4 executables are included. PAWS/STEM was developed in 1989 and last updated in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation. SunOS, Sun3, and Sun4 are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.

  5. Urban weather data and building models for the inclusion of the urban heat island effect in building performance simulation.

    PubMed

    Palme, M; Inostroza, L; Villacreses, G; Lobato, A; Carrasco, C

    2017-10-01

    This data article presents files supporting calculation for urban heat island (UHI) inclusion in building performance simulation (BPS). The methodology is used in the research article "From urban climate to energy consumption. Enhancing building performance simulation by including the urban heat island effect" (Palme et al., 2017) [1]. In this research, a Geographical Information System (GIS) study is done in order to statistically represent the most important urban scenarios of four South American cities (Guayaquil, Lima, Antofagasta and Valparaíso). Then, a Principal Component Analysis (PCA) is done to obtain reference Urban Tissue Categories (UTC) to be used in urban weather simulation. The urban weather files are generated by using the Urban Weather Generator (UWG) software (version 4.1 beta). Finally, BPS is run with the Transient System Simulation (TRNSYS) software (version 17). In this data paper, four sets of data are presented: 1) PCA data (Excel) to explain how to group different urban samples in representative UTC; 2) UWG data (text) to reproduce the urban weather generation for the UTC used in the four cities (4 UTC each in Lima, Guayaquil, and Antofagasta, and 5 UTC in Valparaíso); 3) weather data (text) with the resulting rural and urban weather; 4) BPS model data (text) containing the TRNSYS models (four building models).
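    The grouping step described above can be sketched generically in Python: standardize a table of urban-morphology indicators per sample area, reduce it with PCA, and cluster the scores into a handful of representative categories. The indicator names, the random sample data, and the choice of k-means with five clusters are illustrative assumptions, not the exact procedure of Palme et al.

        import numpy as np
        import pandas as pd
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # Hypothetical morphology indicators for 200 sample areas of one city.
        df = pd.DataFrame({
            "building_height": np.random.uniform(3, 60, 200),
            "site_coverage":   np.random.uniform(0.1, 0.8, 200),
            "vegetation_frac": np.random.uniform(0.0, 0.5, 200),
            "road_frac":       np.random.uniform(0.05, 0.4, 200),
        })

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(df))
        labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
        df["UTC"] = labels                  # urban tissue category per sample area
        print(df.groupby("UTC").mean())     # category-average morphology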

  6. Evaluation of LIS-based Soil Moisture and Evapotranspiration in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Jung, H. C.; Kang, D. H.; Kim, E. J.; Yoon, Y.; Kumar, S.; Peters-Lidard, C. D.; Baeck, S. H.; Hwang, E.; Chae, H.

    2017-12-01

    K-water is the South Korean national water agency, a government-funded agency for water resource development that provides both civil and industrial water in South Korea. K-water is interested in exploring how Earth remote sensing and modeling can support its tasks. In this context, the NASA Land Information System (LIS) is implemented to simulate land surface processes in the Korean Peninsula. The Noah land surface model with Multi-Parameterization, version 3.6 (Noah-MP), is used to reproduce the water budget variables on a 1 km spatial resolution grid with a daily temporal resolution. The Modern-Era Retrospective analysis for Research and Applications, version 2 (MERRA-2) dataset is used to force the system. The rainfall data are spatially downscaled from the high-resolution WorldClim precipitation climatology. The other meteorological inputs (i.e., air temperature, humidity, pressure, winds, radiation) are also downscaled by statistical methods (i.e., lapse-rate, slope-aspect). Additional model experiments are conducted with local rainfall datasets and soil maps to replace the downscaled MERRA-2 precipitation field and the hybrid STATSGO/FAO soil texture, respectively. For the evaluation of model performance, daily soil moisture and evapotranspiration measurements at several stations are compared to the LIS-based outputs. This study demonstrates that application of NASA's LIS can enhance drought and flood prediction capabilities in South Asia and Korea.
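    A station-versus-model comparison of the kind mentioned above reduces to a few summary statistics on paired daily series. The Python sketch below computes bias, RMSE, and Pearson correlation; the file name and column names are hypothetical stand-ins for the station observations and the LIS output.

        import numpy as np
        import pandas as pd

        # Hypothetical paired daily series: observed and modeled soil moisture (m3/m3).
        df = pd.read_csv("station_vs_lis_soil_moisture.csv",
                         parse_dates=["date"]).dropna(subset=["obs", "model"])

        obs, mod = df["obs"].to_numpy(), df["model"].to_numpy()
        bias = np.mean(mod - obs)
        rmse = np.sqrt(np.mean((mod - obs) ** 2))
        corr = np.corrcoef(mod, obs)[0, 1]
        print(f"bias={bias:+.4f}  rmse={rmse:.4f}  r={corr:.3f}  n={len(obs)}")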

  7. Online Simulations of Global Aerosol Distributions in the NASA GEOS-4 Model and Comparisons to Satellite and Ground-Based Aerosol Optical Depth

    NASA Technical Reports Server (NTRS)

    Colarco, Peter; daSilva, Arlindo; Chin, Mian; Diehl, Thomas

    2010-01-01

    We have implemented a module for tropospheric aerosols (GOCART) online in the NASA Goddard Earth Observing System version 4 model and simulated global aerosol distributions for the period 2000-2006. The new online system offers several advantages over the previous offline version, providing a platform for aerosol data assimilation, aerosol-chemistry-climate interaction studies, and short-range chemical weather forecasting and climate prediction. We introduce as well a methodology for sampling model output consistently with satellite aerosol optical thickness (AOT) retrievals to facilitate model-satellite comparison. Our results are similar to those of the offline GOCART model and of the models participating in the AeroCom intercomparison. The simulated AOT has similar seasonal and regional variability and magnitude to Aerosol Robotic Network (AERONET), Moderate Resolution Imaging Spectroradiometer, and Multiangle Imaging Spectroradiometer observations. The model AOT and Angstrom parameter are consistently low relative to AERONET in biomass-burning-dominated regions, where emissions appear to be underestimated, consistent with the results of the offline GOCART model. In contrast, the model AOT is biased high in sulfate-dominated regions of North America and Europe. Our model-satellite comparison methodology shows that diurnal variability in aerosol loading is unimportant compared to sampling the model where the satellite has cloud-free observations, particularly in sulfate-dominated regions. Simulated sea salt burden and optical thickness are high by a factor of 2-3 relative to other models, and agreement between model and satellite over-ocean AOT is improved by reducing the model sea salt burden by a factor of 2. The best agreement in both AOT magnitude and variability occurs immediately downwind of the Saharan dust plume.
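    The sampling methodology mentioned above amounts to comparing the model only where and when the satellite actually retrieved AOT. The Python sketch below contrasts an all-points model mean with a mean computed after masking the model to the satellite's valid (cloud-free) retrievals; the arrays are synthetic placeholders rather than GEOS-4 or MODIS/MISR data.

        import numpy as np

        rng = np.random.default_rng(0)
        model_aot = rng.gamma(2.0, 0.1, size=(30, 180, 360))           # day, lat, lon (synthetic)
        sat_aot = model_aot * rng.normal(1.0, 0.2, model_aot.shape)     # synthetic "retrievals"
        cloud_free = rng.random(model_aot.shape) > 0.7                  # ~30% valid retrievals

        sat_aot = np.where(cloud_free, sat_aot, np.nan)
        model_sampled = np.where(cloud_free, model_aot, np.nan)

        print("model mean, all points:        %.3f" % np.nanmean(model_aot))
        print("model mean, satellite-sampled: %.3f" % np.nanmean(model_sampled))
        print("satellite mean:                %.3f" % np.nanmean(sat_aot))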

  8. Implementation of a Sage-Based Stirling Model Into a System-Level Numerical Model of the Fission Power System Technology Demonstration Unit

    NASA Technical Reports Server (NTRS)

    Briggs, Maxwell H.

    2011-01-01

    The Fission Power System (FPS) project is developing a Technology Demonstration Unit (TDU) to verify the performance and functionality of a subscale version of the FPS reference concept in a relevant environment, and to verify component and system models. As hardware is developed for the TDU, component and system models must be refined to include the details of specific component designs. This paper describes the development of a Sage-based pseudo-steady-state Stirling convertor model and its implementation into a system-level model of the TDU.

  9. Concordance of Interests in Dynamic Models of Social Partnership in the System of Continuing Professional Education

    ERIC Educational Resources Information Center

    Tarasenko, Larissa V.; Ougolnitsky, Guennady A.; Usov, Anatoly B.; Vaskov, Maksim A.; Kirik, Vladimir A.; Astoyanz, Margarita S.; Angel, Olga Y.

    2016-01-01

    A dynamic game-theoretic model of the concordance of interests in the process of social partnership in the system of continuing professional education is proposed. Non-cooperative, cooperative, and hierarchical setups are examined. An analytical solution for a linear-state version of the model is provided. Nash equilibrium algorithms (for non-cooperative…

  10. Aviation Environmental Design Tool (AEDT): technical manual, version 2b, service pack 3

    DOT National Transportation Integrated Search

    2016-05-03

    The Federal Aviation Administration, Office of Environment and Energy (FAA-AEE) has developed the Aviation Environmental Design Tool (AEDT) version 2b software system with the support of the following development team: FAA, National Aeronautics and S...

  11. Future climate change under RCP emission scenarios with GISS ModelE2

    DOE PAGES

    Nazarenko, L.; Schmidt, G. A.; Miller, R. L.; ...

    2015-02-24

    We examine the anthropogenically forced climate response for the 21st century representative concentration pathway (RCP) emission scenarios and their extensions for the period 2101–2500. The experiments were performed with ModelE2, a new version of the NASA Goddard Institute for Space Studies (GISS) coupled general circulation model that includes three different versions of the atmospheric composition components: a noninteractive version (NINT) with prescribed composition and a tuned aerosol indirect effect (AIE), the TCAD version with fully interactive aerosols, whole-atmosphere chemistry, and the tuned AIE, and the TCADI version which further includes a parameterized first indirect aerosol effect on clouds. Each atmospheric version is coupled to two different ocean general circulation models: the Russell ocean model (GISS-E2-R) and HYCOM (GISS-E2-H). By 2100, global mean warming in the RCP scenarios ranges from 1.0 to 4.5 °C relative to the 1850–1860 mean temperature in the historical simulations. In the RCP2.6 scenario, the surface warming in all simulations stays below a 2 °C threshold at the end of the 21st century. For RCP8.5, the range is 3.5–4.5 °C at 2100. Decadally averaged sea ice area changes are highly correlated to global mean surface air temperature anomalies and show steep declines in both hemispheres, with a larger sensitivity during winter months. By the year 2500, there are complete recoveries of the globally averaged surface air temperature for all versions of the GISS climate model in the low-forcing scenario RCP2.6. TCADI simulations show enhanced warming due to greater sensitivity to CO₂, aerosol effects, and greater methane feedbacks, and recovery is much slower in RCP2.6 than with the NINT and TCAD versions. All coupled models have decreases in the Atlantic overturning stream function by 2100. In RCP2.6, there is a complete recovery of the Atlantic overturning stream function by the year 2500, while with scenario RCP8.5 the E2-R climate model produces a complete shutdown of deep water formation in the North Atlantic.

  12. Accounting for observation uncertainties in an evaluation metric of low latitude turbulent air-sea fluxes: application to the comparison of a suite of IPSL model versions

    NASA Astrophysics Data System (ADS)

    Servonnat, Jérôme; Găinuşă-Bogdan, Alina; Braconnot, Pascale

    2017-09-01

    Turbulent momentum and heat (sensible heat and latent heat) fluxes at the air-sea interface are key components of the overall energetics of the Earth's climate. The evaluation of these fluxes in climate models is still difficult because of the large uncertainties associated with the reference products. In this paper we present an objective metric accounting for reference uncertainties to evaluate the annual cycle of the low latitude turbulent fluxes of a suite of IPSL climate models. This metric consists of a Hotelling T² test between the simulated and observed fields in a reduced space characterized by the dominant modes of variability that are common to both the model and the reference, taking into account the observational uncertainty. The test is thus more severe when uncertainties are small, as is the case for sea surface temperature (SST). The results of the test show that for almost all variables and all model versions the model-reference differences are not zero. It is not possible to distinguish between model versions for sensible heat and meridional wind stress, certainly due to the large observational uncertainties. All model versions share similar biases for the different variables. There is no improvement between the reference versions of the IPSL model used for CMIP3 and CMIP5. The test also reveals that the higher horizontal resolution fails to improve the representation of the turbulent surface fluxes compared to the other versions. The representation of the fluxes is further degraded in a version with improved atmospheric physics, with an amplification of some of the biases in the Indian Ocean and in the intertropical convergence zone. The ranking of the model versions for the turbulent fluxes is not correlated with the ranking found for SST. This highlights that, despite the fact that SST gradients are important for the large-scale atmospheric circulation patterns, other factors such as wind speed and air-sea temperature contrast play an important role in the representation of turbulent fluxes.
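    The statistic at the heart of such a metric can be sketched generically in Python: project the model and reference fields onto the leading modes of the reference ensemble, then form a Hotelling T² value from the model-minus-reference difference using the spread among reference products as the uncertainty covariance. This is a schematic reconstruction under simplifying assumptions, not the exact formulation of the paper; the array sizes and synthetic data are illustrative only.

        import numpy as np

        rng = np.random.default_rng(1)
        n_ref, n_space = 6, 500                  # reference products, flattened grid points (toy sizes)
        refs = rng.normal(0.0, 1.0, (n_ref, n_space))               # synthetic reference fields
        model = refs.mean(axis=0) + rng.normal(0.0, 0.3, n_space)    # synthetic model field

        # Reduced space: leading modes of the reference ensemble (via SVD of anomalies).
        ref_mean = refs.mean(axis=0)
        anom = refs - ref_mean
        _, _, vt = np.linalg.svd(anom, full_matrices=False)
        k = 3
        E = vt[:k].T                             # (n_space, k) projection basis

        ref_scores = anom @ E                    # reference spread in the reduced space
        d = (model - ref_mean) @ E               # model-minus-reference difference, reduced space
        S = np.cov(ref_scores, rowvar=False)     # uncertainty covariance among references
        t2 = float(d @ np.linalg.solve(S, d))    # Hotelling T^2 statistic
        print("T^2 = %.2f with k = %d retained modes" % (t2, k))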

  13. Simulating the 2012 High Plains Drought Using Three Single Column Models (SCM)

    NASA Astrophysics Data System (ADS)

    Medina, I. D.; Baker, I. T.; Denning, S.; Dazlich, D. A.

    2015-12-01

    The impact of changes in the frequency and severity of drought on fresh water sustainability is a great concern for many regions of the world. One such location is the High Plains, where the local economy is primarily driven by fresh water withdrawals from the Ogallala Aquifer, which accounts for approximately 30% of total irrigation withdrawals from all U.S. aquifers combined. Modeling studies that focus on the feedback mechanisms that control the climate and eco-hydrology during times of drought are limited, and have used conventional General Circulation Models (GCMs) with grid length scales ranging from one hundred to several hundred kilometers. Additionally, these models utilize crude statistical parameterizations of cloud processes for estimating sub-grid fluxes of heat and moisture and have a poor representation of land surface heterogeneity. For this research, we focus on the 2012 High Plains drought and perform numerical simulations using three single-column model (SCM) versions of BUGS5 (the Colorado State University (CSU) GCM coupled to the Simple Biosphere Model (SiB3)). In the first version of BUGS5, the model is used in its standard bulk setting (a single atmospheric column coupled to a single instance of SiB3). In the second, the Super-Parameterized Community Atmospheric Model (SP-CAM), a cloud-resolving model (CRM) consisting of 32 atmospheric columns, replaces the single CSU GCM atmospheric parameterization and is coupled to a single instance of SiB3. In the third version of BUGS5, an instance of SiB3 is coupled to each CRM column of the SP-CAM (32 CRM columns coupled to 32 instances of SiB3). To assess the physical realism of the land-atmosphere feedbacks simulated by all three versions of BUGS5, differences in simulated energy and moisture fluxes are computed between the 2011 and 2012 periods and are compared to those calculated using observational data from the AmeriFlux Tower Network for the same period at the ARM site in Lamont, OK. This research will provide a better understanding of model deficiencies in reproducing and predicting droughts in the future, which is essential to the economic, ecological, and social well-being of the High Plains.

  14. SCELib3.0: The new revision of SCELib, the parallel computational library of molecular properties in the Single Center Approach

    NASA Astrophysics Data System (ADS)

    Sanna, N.; Baccarelli, I.; Morelli, G.

    2009-12-01

    SCELib is a computer program which implements the Single Center Expansion (SCE) method to describe molecular electronic densities and the interaction potentials between a charged projectile (electron or positron) and a target molecular system. The first version (CPC Catalog identifier ADMG_v1_0) was submitted to the CPC Program Library in 2000, and version 2.0 (ADMG_v2_0) was submitted in 2004. We here announce the new release 3.0, which presents additional features with respect to the previous versions, aimed at a significant enhancement of its capabilities to deal with larger molecular systems. SCELib 3.0 allows for ab initio effective core potential (ECP) calculations of the molecular wavefunctions to be used in the SCE method, in addition to the standard all-electron description of the molecule. The list of supported architectures has been updated, the code has been ported to platforms based on accelerating coprocessors such as NVIDIA GPGPUs, and the new parallel model adopted is able to run efficiently on a mixed many-core computing system.
    Program summary. Program title: SCELib3.0. Catalogue identifier: ADMG_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMG_v3_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 2 018 862. No. of bytes in distributed program, including test data, etc.: 4 955 014. Distribution format: tar.gz. Programming language: C. Compilers used: xlc V8.x, Intel C V10.x, Portland Group V7.x, nvcc V2.x. Computer: all SMP platforms based on AIX, Linux and SunOS operating systems over SPARC, POWER, Intel Itanium2, X86, em64t and Opteron processors. Operating system: SunOS, IBM AIX, Linux RedHat (Enterprise), Linux SuSE (SLES). Has the code been vectorized or parallelized?: yes, 1 to 32 CPUs or GPUs used. RAM: up to 32 GB depending on the molecular system and runtime parameters. Classification: 16.5. Catalogue identifier of previous version: ADMG_v2_0. Journal reference of previous version: Comput. Phys. Comm. 162 (2004) 51. External routines: CUDA libraries (SDK V2.x). Does the new version supersede the previous version?: yes. Nature of problem: In this set of codes an efficient procedure is implemented to describe the wavefunction and related molecular properties of a polyatomic molecular system within the Single Center of Expansion (SCE) approximation. The resulting SCE wavefunction, electron density, electrostatic and correlation/polarization potentials can then be used in a wide variety of applications, such as electron-molecule scattering calculations, quantum chemistry studies, biomodelling and drug design. Solution method: The polycentre Hartree-Fock solution for a molecule of arbitrary geometry, based on a linear combination of Gaussian-type orbitals (GTOs), is expanded over a single center, typically the center of mass (C.O.M.), by means of a Gauss-Legendre/Chebyshev quadrature over the θ,φ angular coordinates. The resulting SCE numerical wavefunction is then used to calculate the one-particle electron density, the electrostatic potential and two different models for the correlation/polarization potentials induced by the impinging electron, which have the correct asymptotic behavior for the leading dipole molecular polarizabilities. 
    Reasons for new version: The present release of SCELib allows the study of larger molecular systems than the previous versions by means of theoretical and technological advances, with the first implementation of the code on a many-core computing system. Summary of revisions: The major features added with respect to SCELib Version 2.0 are as follows. Molecular wavefunctions obtained via the Los Alamos (Hay and Wadt) LAN ECP plus DZ description of the inner-shell electrons (for the Na-La and Hf-Bi elements) [1] can now be single-center-expanded; this addition required modifications of (i) the filtering code readgau, (ii) the main reading function setinp, (iii) the sphint code (including changes to the CalcMO code), (iv) the densty code, and (v) the vst code. The classes of platforms supported now include two more architectures based on accelerated coprocessors (Nvidia GSeries GPGPU and ClearSpeed e720; the ClearSpeed version is experimental, an initial preliminary porting of the sphint() function not intended for production runs - see the code documentation for additional detail). A single-precision representation for real numbers in the SCE mapping of the GTOs (sphint code) has been implemented in the new code. The Ih symmetry point group for the molecular systems has been added to those already allowed in the SCE procedure. The orientation of the molecular axis system for the Cs (planar) symmetry has been changed in accord with the standard orientation adopted by the latest version of the quantum chemistry code (Gaussian 03 [2]) used to generate the input multi-centre molecular wavefunctions (z-axis perpendicular to the symmetry plane), and the abelian subgroup for the Cs point group has been changed from C1 to Cs. Atomic basis functions including g-type GTOs can now be single-center-expanded. Restrictions: Depending on the molecular system under study and on the operating conditions, the program may or may not fit into available RAM. In this case a feature of the program is to memory-map a disk file in order to access the data efficiently through a disk device. The parallel GP-GPU implementation limits the number of CPU threads to the number of GPU cores present. Running time: The execution time strongly depends on the molecular target description and on the hardware/OS chosen; it is directly proportional to the (r,θ,φ) grid size and to the number of angular basis functions used. Thus, from the program printout of the main arrays' memory occupancy, the user can approximately derive the expected computer time needed for a given calculation executed in serial mode. For parallel executions the overall efficiency must further be taken into account, and this depends on the number of processors used as well as on the parallel architecture chosen, so a simple general law is at present not determinable. References: [1] P.J. Hay, W.R. Wadt, J. Chem. Phys. 82 (1985) 270; W.R. Wadt, P.J. Hay, J. Chem. Phys. 82 (1985) 284; P.J. Hay, W.R. Wadt, J. Chem. Phys. 82 (1985) 299. [2] M.J. Frisch et al., Gaussian 03, revision C.02, Gaussian, Inc., Wallingford, CT, 2004.
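    The numerical core of a single-center expansion is a quadrature of the molecular quantity against spherical harmonics on an angular grid. The Python sketch below computes expansion coefficients f_lm at one radius with a Gauss-Legendre rule in cos(theta) and a uniform rule in phi, using scipy.special.sph_harm; it is a generic illustration of the SCE quadrature, not code from SCELib, and the test function is arbitrary.

        import numpy as np
        from scipy.special import sph_harm

        def sce_coefficients(f, lmax, n_theta=64, n_phi=128):
            """Expand f(colat, azim) at fixed r: f_lm = integral of f * conj(Y_lm) dOmega."""
            # Gauss-Legendre nodes in cos(colatitude); uniform (rectangle) rule in azimuth.
            x, w = np.polynomial.legendre.leggauss(n_theta)   # x = cos(colat), weights w
            colat = np.arccos(x)
            azim = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)
            dphi = 2.0 * np.pi / n_phi
            colat2d, azim2d = np.meshgrid(colat, azim, indexing="ij")
            vals = f(colat2d, azim2d)                          # function sampled on the angular grid
            coeffs = {}
            for l in range(lmax + 1):
                for m in range(-l, l + 1):
                    # scipy's sph_harm(m, l, theta, phi) takes azimuth first, colatitude second.
                    ylm = sph_harm(m, l, azim2d, colat2d)
                    integrand = vals * np.conj(ylm)
                    coeffs[(l, m)] = dphi * np.sum(w[:, None] * integrand)
            return coeffs

        # Quick check with an arbitrary, axially symmetric test function.
        c = sce_coefficients(lambda colat, azim: np.cos(colat) ** 2, lmax=2)
        print(c[(0, 0)], c[(2, 0)])   # only the l = 0 and l = 2, m = 0 terms should be significant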

  15. PLOT3D user's manual

    NASA Technical Reports Server (NTRS)

    Walatka, Pamela P.; Buning, Pieter G.; Pierce, Larry; Elson, Patricia A.

    1990-01-01

    PLOT3D is a computer graphics program designed to visualize the grids and solutions of computational fluid dynamics. Seventy-four functions are available. Versions are available for many systems. PLOT3D can handle multiple grids with a million or more grid points, and can produce a variety of model renderings, such as wireframe or flat shaded. Output from PLOT3D can be used in animation programs. The first part of this manual is a tutorial that takes the reader, keystroke by keystroke, through a PLOT3D session. The second part of the manual contains reference chapters, including the help file, data file formats, advice on changing PLOT3D, and sample command files.

  16. Distributed Energy Resources Customer Adoption Model Plus (DER-CAM+), Version 1.0.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stadler, Michael; Cardorso, Goncalo; Mashayekh, Salman

    DER-CAM+ v1.0.0 is internally referred to as DER-CAM v5.0.0. Due to fundamental changes from previous versions, a new name (DER-CAM+) will be used for DER-CAM version 5.0.0 and above. DER-CAM+ is a Decision Support Tool for Decentralized Energy Systems that has been tailored for microgrid applications, and it now explicitly considers electrical and thermal networks within a microgrid, ancillary services, and operating reserve. DER-CAM was initially created as an exclusively economic energy model, able to find the cost-minimizing combination and operation profile of a set of DER technologies that meet the energy loads of a building or microgrid for a typical test year. The previous versions of DER-CAM were formulated without modeling the electrical/thermal networks within the microgrid and hence used aggregate single-node approaches. Furthermore, they were not able to consider operating reserve constraints and microgrid revenue streams from participating in ancillary services markets. This new version, DER-CAM+, considers these issues by including electrical power flow and thermal flow equations and constraints in the microgrid, revenues from various ancillary services markets, and operating reserve constraints.
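    As a minimal analogue of the cost-minimizing core, the Python sketch below dispatches two sources (grid purchases and an on-site generator of limited capacity) against an hourly load with scipy.optimize.linprog. It is a toy: DER-CAM+ additionally sizes technologies and models thermal and electrical networks, ancillary services, and operating reserve, none of which appear here, and the prices and load profile are invented.

        import numpy as np
        from scipy.optimize import linprog

        hours = 24
        load = 80 + 40 * np.sin(np.linspace(0, 2 * np.pi, hours))     # kW, invented profile
        price_grid = np.where((np.arange(hours) >= 17) & (np.arange(hours) <= 21),
                              0.30, 0.12)                              # $/kWh, peak vs off-peak
        price_gen = np.full(hours, 0.20)                               # $/kWh, on-site generator
        gen_cap = 60.0                                                 # kW generator limit

        # Decision vector: [grid_0..grid_23, gen_0..gen_23]; minimize total energy cost.
        c = np.concatenate([price_grid, price_gen])
        # Energy balance each hour: grid_t + gen_t = load_t.
        A_eq = np.hstack([np.eye(hours), np.eye(hours)])
        bounds = [(0, None)] * hours + [(0, gen_cap)] * hours
        res = linprog(c, A_eq=A_eq, b_eq=load, bounds=bounds, method="highs")
        print("feasible:", res.success, " total cost: $%.2f" % res.fun)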

  17. USING MM5 VERSION 2 WITH CMAQ AND MODELS-3, A USER'S GUIDE AND TUTORIAL

    EPA Science Inventory

    Meteorological data are important in many of the processes simulated in the Community Multi-Scale Air Quality (CMAQ) model and the Models-3 framework. The first meteorology model that has been selected and evaluated with CMAQ is the Fifth-Generation Pennsylvania State University...

  18. CMIP5 Historical Simulations (1850-2012) with GISS ModelE2

    NASA Technical Reports Server (NTRS)

    Miller, Ronald Lindsay; Schmidt, Gavin A.; Nazarenko, Larissa S.; Tausnev, Nick; Bauer, Susanne E.; DelGenio, Anthony D.; Kelley, Max; Lo, Ken K.; Ruedy, Reto; Shindell, Drew T.

    2014-01-01

    Observations of climate change during the CMIP5 extended historical period (1850-2012) are compared to trends simulated by six versions of the NASA Goddard Institute for Space Studies ModelE2 Earth System Model. The six models are constructed from three versions of the ModelE2 atmospheric general circulation model, distinguished by their treatment of atmospheric composition and the aerosol indirect effect, combined with two ocean general circulation models, HYCOM and Russell. Forcings that perturb the model climate during the historical period are described. Five-member ensemble averages from each of the six versions of ModelE2 simulate trends of surface air temperature, atmospheric temperature, sea ice and ocean heat content that are in general agreement with observed trends, although simulated warming is slightly excessive within the past decade. Only simulations that include increasing concentrations of long-lived greenhouse gases match the warming observed during the twentieth century. Differences in twentieth-century warming among the six model versions can be attributed to differences in climate sensitivity, aerosol and ozone forcing, and heat uptake by the deep ocean. Coupled models with HYCOM export less heat to the deep ocean, associated with reduced surface warming in regions of deepwater formation, but greater warming elsewhere at high latitudes along with reduced sea ice. All ensembles show twentieth-century annular trends toward reduced surface pressure at southern high latitudes and a poleward shift of the midlatitude westerlies, consistent with observations.

  19. Ada (Tradename) Compiler Validation Summary Report. International Business Machines Corporation. IBM Development System for the Ada Language for VM/CMS, Version 1.0. IBM 4381 (IBM System/370) under VM/CMS.

    DTIC Science & Technology

    1986-04-29

    COMPILER VALIDATION SUMMARY REPORT: International Business Machines Corporation IBM Development System for the Ada Language for VM/CMS, Version 1.0 IBM 4381...tested using command scripts provided by International Business Machines Corporation. These scripts were reviewed by the validation team. Tests were run...s): IBM 4381 (System/370) Operating System: VM/CMS, release 3.6 International Business Machines Corporation has made no deliberate extensions to the

  20. Applying an economical scale-aware PDF-based turbulence closure model in NOAA NCEP GCMs.

    NASA Astrophysics Data System (ADS)

    Belochitski, A.; Krueger, S. K.; Moorthi, S.; Bogenschutz, P.; Cheng, A.

    2017-12-01

    A novel unified representation of sub-grid scale (SGS) turbulence, cloudiness, and shallow convection is being implemented into the NOAA NCEP Global Forecast System (GFS) general circulation model. The approach, known as Simplified High Order Closure (SHOC), is based on predicting a joint PDF of SGS thermodynamic variables and vertical velocity, and using it to diagnose turbulent diffusion coefficients, SGS fluxes, condensation, and cloudiness. Unlike other similar methods, comparatively few new prognostic variables need to be introduced, making the technique computationally efficient. In the base version of SHOC this is the SGS turbulent kinetic energy (TKE); in the developmental version it is the SGS TKE plus the variances of total water and moist static energy (MSE). SHOC is now incorporated into a version of GFS that will become a part of the NOAA Next Generation Global Prediction System, based around NOAA GFDL's FV3 dynamical core, the NOAA Environmental Modeling System (NEMS) coupled modeling infrastructure software, and a set of novel physical parameterizations. Turbulent diffusion coefficients computed by SHOC are now used in place of those produced by the boundary layer turbulence and shallow convection parameterizations. The large-scale microphysics scheme is no longer used to calculate cloud fraction or the large-scale condensation/deposition; instead, SHOC provides these quantities. The radiative transfer parameterization uses the cloudiness computed by SHOC. An outstanding problem with the implementation of SHOC in the NCEP global models is excessively large high-level tropical cloudiness. Comparison of the moments of the SGS PDF diagnosed by SHOC to the moments calculated in a GigaLES simulation of a tropical deep convection case (GATE) shows that SHOC diagnoses overly narrow PDF distributions of total cloud water and MSE in the areas of deep convective detrainment. A subsequent sensitivity study of SHOC's diagnosed cloud fraction (CF) to higher-order input moments of the SGS PDF demonstrated that CF is improved if SHOC is provided with correct variances of total water and MSE. Consequently, SHOC was modified to include two new prognostic equations for the variances of total water and MSE, and coupled with the Chikira-Sugiyama parameterization of deep convection to include the effects of detrainment on the prognostic variances.
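    The diagnostic step described above, going from an assumed SGS PDF to cloud fraction and condensate, can be illustrated in Python for the simplest case of a single Gaussian distribution of total water at fixed saturation. The closed-form expressions below are the standard statistical-cloud-scheme results for a Gaussian PDF; SHOC's actual joint, multivariate PDF and its coupling to MSE and vertical velocity are considerably richer, and the example values are invented.

        import numpy as np
        from scipy.stats import norm

        def gaussian_cloud_diagnostics(qt_mean, qt_std, q_sat):
            """Cloud fraction and mean condensate for a Gaussian total-water PDF (one level)."""
            s = (qt_mean - q_sat) / qt_std            # normalized saturation excess/deficit
            cloud_fraction = norm.cdf(s)              # P(q_t > q_sat)
            condensate = qt_std * (norm.pdf(s) + s * norm.cdf(s))   # E[(q_t - q_sat)^+]
            return cloud_fraction, condensate

        # Example: mean state just below saturation with SGS variability (kg/kg, invented values).
        cf, qc = gaussian_cloud_diagnostics(qt_mean=9.5e-3, qt_std=0.5e-3, q_sat=10.0e-3)
        print("cloud fraction = %.2f   mean condensate = %.2e kg/kg" % (cf, qc))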
