Sample records for zone modeling software

  1. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The U.S. Environmental Protection Agency has a long history of both supporting plume model development and providing mixing zone modeling software. The Visual Plumes model is the most recent addition to the suite of public-domain models available through the EPA-Athens Center f...

  2. Federal Highway Administration (FHWA) work zone driver model software

    DOT National Transportation Integrated Search

    2016-11-01

    FHWA and the U.S. Department of Transportation (USDOT) Volpe Center are developing a work zone car-following model and simulation software that interfaces with existing microsimulation tools, enabling more accurate simulation of car-following through...

  3. Evaluation of work zone enhancement software programs.

    DOT National Transportation Integrated Search

    2009-09-01

    The Missouri Department of Transportation (MoDOT) is looking for software tools that can assist in developing effective plans to manage and communicate work zone activities. QuickZone, CA4PRS, VISSIM, and Spreadsheet models are the tools that MoD...

  4. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : final report, March 2009.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part, a digital computer simulation model was developed with the aim of modeling traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  5. Phase II, improved work zone design guidelines and enhanced model of traffic delays in work zones : executive summary report.

    DOT National Transportation Integrated Search

    2009-03-01

    This project contains three major parts. In the first part, a digital computer simulation model was developed with the aim of modeling traffic through a freeway work zone. The model was based on the Arena simulation software and used cumula...

  6. System Operations Studies : Feeder System Model. User's Manual.

    DOT National Transportation Integrated Search

    1982-11-01

    The Feeder System Model (FSM) is one of the analytic models included in the System Operations Studies (SOS) software package developed for urban transit systems analysis. The objective of the model is to assign a proportion of the zone-to-zone travel...

  7. The community-driven BiG CZ software system for integration and analysis of bio- and geoscience data in the critical zone

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.; Valentine, D. W., Jr.; Richard, S. M.; Cheetham, R.; Meyer, F.; Henry, C.; Berg-Cross, G.; Packman, A. I.; Aronson, E. L.

    2014-12-01

    Here we present the prototypes of a new scientific software system designed around the new Observations Data Model version 2.0 (ODM2, https://github.com/UCHIC/ODM2) to substantially enhance integration of biological and geological (BiG) data for Critical Zone (CZ) science. The CZ science community takes as its charge the effort to integrate theory, models and data from the multitude of disciplines collectively studying processes on the Earth's surface. The central scientific challenge of the CZ science community is to develop a "grand unifying theory" of the critical zone through a theory-model-data fusion approach, for which the key missing need is a cyberinfrastructure for seamless 4D visual exploration of the integrated knowledge (data, model outputs and interpolations) from all the bio and geoscience disciplines relevant to critical zone structure and function, similar to today's ability to easily explore historical satellite imagery and photographs of the Earth's surface using Google Earth. This project takes the first "BiG" steps toward answering that need. The overall goal of this project is to co-develop with the CZ science and broader community, including natural resource managers and stakeholders, a web-based integration and visualization environment for joint analysis of cross-scale bio and geoscience processes in the critical zone (BiG CZ), spanning experimental and observational designs.
We will: (1) Engage the CZ and broader community to co-develop and deploy the BiG CZ software stack; (2) Develop the BiG CZ Portal web application for intuitive, high-performance map-based discovery, visualization, access and publication of data by scientists, resource managers, educators and the general public; (3) Develop the BiG CZ Toolbox to enable cyber-savvy CZ scientists to access BiG CZ Application Programming Interfaces (APIs); and (4) Develop the BiG CZ Central software stack to bridge data systems developed for multiple critical zone domains into a single metadata catalog. The entire BiG CZ Software system is being developed on public repositories as a modular suite of open source software projects. It will be built around a new Observations Data Model Version 2.0 (ODM2) that has been developed by members of the BiG CZ project team, with community input, under separate funding.

  8. COMPILATION OF SATURATED AND UNSATURATED ZONE MODELING SOFTWARE

    EPA Science Inventory

    The full report provides readers an overview of available ground-water modeling programs and related software. It is an update of EPA/600/R-93/118 and EPA/600/R-94/028, two previous reports from the same program at the International Ground Water Modeling Center (IGWMC) in Colora...

  9. A steep peripheral ring in irregular cornea topography, real or an instrument error?

    PubMed

    Galindo-Ferreiro, Alicia; Galvez-Ruiz, Alberto; Schellini, Silvana A; Galindo-Alonso, Julio

    2016-01-01

    To demonstrate that the steep peripheral ring (red zone) on corneal topography after myopic laser in situ keratomileusis (LASIK) could be due to instrument error and not always to a real increase in corneal curvature. A spherical model of the corneal surface and modified topography software were used to analyze the cause of an error due to instrument design. This study involved modification of the software of a commercially available topographer. A small modification of the topography image results in a red zone on the corneal topography color map. Corneal modeling indicates that the red zone could be an artifact due to an instrument-induced error. The steep curvature change after LASIK signified by the red zone could thus also be an error due to the plotting algorithms of the corneal topographer, rather than only a real curvature change.

  10. EVALUATION OF VADOSE ZONE AND SOURCE MODELS FOR MULTI-MEDIA, MULTI-PATHWAY, MULTI-RECEPTOR RISK ASSESSMENT USING LARGE SOIL COLUMN EXPERIMENT DATA

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is developing a comprehensive environmental exposure and risk analysis software system for agency-wide application using the methodology of a Multi-media, Multi-pathway, Multi-receptor Risk Assessment (3MRA) model. This software sys...

  11. THE U.S. ENVIRONMENTAL PROTECTION AGENCY VISUAL PLUMES MODELING SOFTWARE

    EPA Science Inventory

    The U.S. Environmental Protection Agency's Center for Exposure Assessment Modeling (CEAM) at the Ecosystems Research Division in Athens, Georgia develops environmental exposure models, including plume models, and provides technical assistance to model users. The mixing zone and f...

  12. VISUAL PLUMES MIXING ZONE MODELING SOFTWARE

    EPA Science Inventory

    The US Environmental Protection Agency has a history of developing plume models and providing technical assistance. The Visual Plumes model (VP) is a recent addition to the public-domain models available on the EPA Center for Exposure Assessment Modeling (CEAM) web page. The Wind...

  13. COMPILATION OF SATURATED AND UNSATURATED ZONE MODELING SOFTWARE (EPA/600/SR-96/009)

    EPA Science Inventory

    The study reflects the ongoing groundwater modeling information collection and processing activities at the International Ground Water Modeling Center (IGWMC). The full report briefly discusses the information acquisition and processing procedures, the MARS information database, ...

  14. Three-Dimensional (3D) Nanometrology Based on Scanning Electron Microscope (SEM) Stereophotogrammetry.

    PubMed

    Tondare, Vipin N; Villarrubia, John S; Vladár, András E

    2017-10-01

    Three-dimensional (3D) reconstruction of a sample surface from scanning electron microscope (SEM) images taken at two perspectives has been known for decades. Nowadays, there exist several commercially available stereophotogrammetry software packages. For testing these software packages, in this study we used Monte Carlo simulated SEM images of virtual samples. A virtual sample is a model in a computer, and its true dimensions are known exactly, which is impossible for real SEM samples due to measurement uncertainty. The simulated SEM images can be used for algorithm testing, development, and validation. We tested two stereophotogrammetry software packages and compared their reconstructed 3D models with the known geometry of the virtual samples used to create the simulated SEM images. Both packages performed relatively well with simulated SEM images of a sample with a rough surface. However, in a sample containing nearly uniform and therefore low-contrast zones, the height reconstruction error was ≈46%. The present stereophotogrammetry software packages need further improvement before they can be used reliably with SEM images with uniform zones.
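
The tilt-pair reconstruction such packages perform, and the relative height error quoted above, can be sketched with the textbook small-angle parallax formula (a simplification of what commercial packages actually do; the tilt and parallax values below are illustrative):

```python
import math

def height_from_parallax(parallax_m, tilt_deg):
    """Height recovered from the lateral parallax between two SEM images
    taken at +/- tilt_deg about the eucentric axis (simple stereo formula;
    commercial packages use more elaborate reconstruction models)."""
    return parallax_m / (2.0 * math.sin(math.radians(tilt_deg)))

def relative_error(measured, true):
    """Relative reconstruction error, as reported for the low-contrast zones."""
    return abs(measured - true) / true
```

For example, a reconstructed height of 0.54 units against a true height of 1.0 gives the ≈46% error reported for the uniform, low-contrast zones.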

  15. Progress in catalytic ignition fabrication, modeling and infrastructure : (part 2) development of a multi-zone engine model simulated using MATLAB software.

    DOT National Transportation Integrated Search

    2014-02-01

    A mathematical model was developed for the purpose of providing students with data acquisition and engine modeling experience at the University of Idaho. In developing the model, multiple heat transfer and emissions models were researched and com...

  16. Data-driven traffic impact assessment tool for work zones.

    DOT National Transportation Integrated Search

    2017-03-01

    Traditionally, traffic impacts of work zones have been assessed using planning software such as QuickZone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...

  17. Integration of bio- and geoscience data with the ODM2 standards and software ecosystem for the CZOData and BiG CZ Data projects

    NASA Astrophysics Data System (ADS)

    Aufdenkampe, A. K.; Mayorga, E.; Horsburgh, J. S.; Lehnert, K. A.; Zaslavsky, I.

    2015-12-01

    We have developed a family of solutions to the challenges of integrating diverse data from the biological and geological (BiG) disciplines for Critical Zone (CZ) science. These standards and software solutions have been developed around the new Observations Data Model version 2.0 (ODM2, http://ODM2.org), which was designed as a profile of the Open Geospatial Consortium's (OGC) Observations and Measurements (O&M) standard. The ODM2 standards and software ecosystem has at its core an information model that balances specificity with flexibility to powerfully and equally serve the needs of multiple dataset types, from multivariate sensor-generated time series to geochemical measurements of specimen hierarchies to multi-dimensional spectral data to biodiversity observations. ODM2 has been adopted as the information model guiding the next generation of cyberinfrastructure development for the Interdisciplinary Earth Data Alliance (http://www.iedadata.org/) and the CUAHSI Water Data Center (https://www.cuahsi.org/wdc). Here we present several components of the ODM2 standards and software ecosystem that were developed specifically to help CZ scientists and their data managers to share and manage data through the national Critical Zone Observatory data integration project (CZOData, http://criticalzone.org/national/data/) and the bio integration with geo for critical zone science data project (BiG CZ Data, http://bigcz.org/). These include the ODM2 Controlled Vocabulary system (http://vocabulary.odm2.org), the YAML Observation Data Archive & exchange (YODA) File Format (https://github.com/ODM2/YODA-File) and the BiG CZ Toolbox, which will combine easy-to-install ODM2 databases (https://github.com/ODM2/ODM2) with a variety of graphical software packages for data management such as ODMTools (https://github.com/ODM2/ODMToolsPython) and the ODM2 Streaming Data Loader (https://github.com/ODM2/ODM2StreamingDataLoader).

  18. Application of digital human modeling and simulation for vision analysis of pilots in a jet aircraft: a case study.

    PubMed

    Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati

    2012-01-01

    Ergonomic evaluation of visual demands becomes crucial when operators/users must make rapid decisions under extreme time constraint, as in the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of a pilot's vision in a jet aircraft in a virtual environment, demonstrating how the vision analysis tools of digital human modeling software can be used effectively for such a study. Three (03) dynamic digital pilot models, representative of the smallest, average and largest Indian pilot population, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools like view cones, eye view windows, blind spot area, obscuration zone, reflection zone etc. were employed during evaluation of visual fields. The vision analysis tool was also used to study kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tools of digital human modeling software are very effective for evaluating the position and alignment of different displays and controls in a workstation, based upon their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.

  19. The site-scale saturated zone flow model for Yucca Mountain: Calibration of different conceptual models and their impact on flow paths

    USGS Publications Warehouse

    Zyvoloski, G.; Kwicklis, E.; Eddebbarh, A.-A.; Arnold, B.; Faunt, C.; Robinson, B.A.

    2003-01-01

    This paper presents several different conceptual models of the Large Hydraulic Gradient (LHG) region north of Yucca Mountain and describes the impact of those models on groundwater flow near the potential high-level repository site. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain. This model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The numerical model is calibrated by matching available water level measurements using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (the hydrologic simulation code FEHM and the parameter estimation software PEST) and model setup allow for efficient calibration of multiple conceptual models. Until now, the Large Hydraulic Gradient has been simulated using a low-permeability, east-west oriented feature, even though direct evidence for this feature is lacking. In addition to this model, we investigate and calibrate three additional conceptual models of the Large Hydraulic Gradient, all of which are based on a presumed zone of hydrothermal chemical alteration north of Yucca Mountain. After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the potential repository that record differences in the predicted groundwater flow regime. The results show that the Large Hydraulic Gradient can be represented with the alternate conceptual models that include the hydrothermally altered zone. The predicted pathways are mildly sensitive to the choice of conceptual model and more sensitive to the quality of calibration in the vicinity of the repository. These differences are most likely due to different degrees of fit of model to data, and do not represent important differences in hydrologic conditions for the different conceptual models. © 2002 Elsevier Science B.V. All rights reserved.
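
The calibration step described here, matching water-level measurements via parameter estimation, amounts to minimizing a weighted least-squares misfit. A minimal sketch of the objective function that PEST minimizes (generic form only; the project's actual weighting scheme and control settings are not reproduced):

```python
def pest_objective(observed, simulated, weights):
    """Weighted sum of squared residuals between measured and simulated
    water levels -- the quantity a parameter estimator like PEST drives
    toward a minimum during model calibration."""
    return sum((w * (o - s)) ** 2
               for o, s, w in zip(observed, simulated, weights))
```

A perfect match gives zero; larger weights make a given residual cost more, so well-measured wells dominate the fit.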

  20. Groundwater movement simulation by the software package PM5 for the Sviyaga river adjoining territory in the Republic of Tatarstan

    NASA Astrophysics Data System (ADS)

    Kosterina, E. A.; Isagadzhieva, Z. Sh

    2018-01-01

    Data from the ecological-hydrogeological fieldwork in the Predvolzhye region of the Republic of Tatarstan were analyzed. A geofiltration model of the Buinsk region area near the village of Stary Studenets in the territory of the Republic of Tatarstan was constructed with the PM5 software package. The model can be developed into a basis for estimating the groundwater reserves of the territory, modeling the operation of water intake wells, designing the location of water intake wells, evaluating their operational capabilities, and constructing sanitary protection zones.

  1. SubductionGenerator: A program to build three-dimensional plate configurations

    NASA Astrophysics Data System (ADS)

    Jadamec, M. A.; Kreylos, O.; Billen, M. I.; Turcotte, D. L.; Knepley, M.

    2016-12-01

    Geologic, geochemical, and geophysical data from subduction zones indicate that a two-dimensional paradigm for plate tectonic boundaries is no longer adequate to explain the observations. Many open source software packages exist to simulate the viscous flow of the Earth, such as the dynamics of subduction. However, there are few open source programs that generate the three-dimensional model input. We present an open source software program, SubductionGenerator, that constructs the three-dimensional initial thermal structure and plate boundary structure. A 3D model mesh and tectonic configuration are constructed based on a user specified model domain, slab surface, seafloor age grid file, and shear zone surface. The initial 3D thermal structure for the plates and mantle within the model domain is then constructed using a series of libraries within the code that use a half-space cooling model, plate cooling model, and smoothing functions. The code maps the initial 3D thermal structure and the 3D plate interface onto the mesh nodes using a series of libraries including a k-d tree to increase efficiency. In this way, complicated geometries and multiple plates with variable thickness can be built onto a multi-resolution finite element mesh with a 3D thermal structure and 3D isotropic shear zones oriented at any angle with respect to the grid. SubductionGenerator is aimed at model set-ups more representative of the Earth, which can be particularly challenging to construct. Examples include subduction zones where the physical attributes vary in space, such as slab dip and temperature, and overriding plate temperature and thickness. Thus, the program can be used to construct initial tectonic configurations for triple junctions and plate boundary corners.
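
The half-space cooling model mentioned here, which gives plate temperature as a function of depth and seafloor age, can be sketched as follows (parameter values are illustrative, not SubductionGenerator's defaults, and the program's plate-cooling model and smoothing functions are not reproduced):

```python
import math

def halfspace_temperature(depth_m, age_myr, T_surface=273.0,
                          T_mantle=1623.0, kappa=1e-6):
    """Half-space cooling model: temperature (K) at a given depth for
    oceanic lithosphere of a given age, via the error-function profile
    T = Ts + (Tm - Ts) * erf(z / (2 * sqrt(kappa * t)))."""
    age_s = age_myr * 3.15576e13  # million years -> seconds
    if age_s == 0:
        return T_mantle if depth_m > 0 else T_surface
    return T_surface + (T_mantle - T_surface) * math.erf(
        depth_m / (2.0 * math.sqrt(kappa * age_s)))
```

Temperature rises from the surface value toward the mantle value with depth, and older plates are colder at a given depth, which is how an age grid file translates into a 3D thermal structure.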

  2. Three-dimensional structure and seismicity beneath the Central Vanuatu subduction zone

    NASA Astrophysics Data System (ADS)

    Foix, Oceane; Crawford, Wayne; Pelletier, Bernard; Regnier, Marc; Garaebiti, Esline; Koulakov, Ivan

    2017-04-01

    The 1400-km long Vanuatu subduction zone results from subduction of the oceanic Australian plate (OAP) beneath the North-Fijian microplate (NFM). Seismic and volcanic activity are both high, and several morphologic features enter into subduction, affecting seismicity and probably plate coupling. The Entrecasteaux Ridge, West-Torres plateau, and Bougainville seamount currently enter into subduction below the large forearc islands of Santo and Malekula. This collision coincides with a strongly decreased local convergence velocity rate - 35 mm/yr compared to 120-160 mm/yr to the north and south - and significant uplift on the overriding plate, indicating a high degree of deformation. The close proximity of large uplifted forearc islands to the trench provides excellent coverage of the megathrust seismogenic zone for a seismological study. We used 10 months of seismological data collected using the 30-instrument land and sea ARC-VANUATU seismology network to construct a 3D velocity model — using the LOTOS joint location/model inversion software — and locate 11655 earthquakes using the NonLinLoc software suite. The 3-D model reveals low P and S velocities in the first tens of kilometers beneath both islands, probably due to water infiltration in the heavily faulted upper plate. The model also suggests the presence of a subducted seamount beneath south Santo. The earthquake locations reveal a complex interaction of faults and stress zones related to high and highly variable deformation. Both brittle deformation and the seismogenic zone depth limits vary along-slab and earthquake clusters are identified beneath central and south Santo, at about 10-30 km of depth, and southwest of Malekula island between 10-20 km depth.

  3. Evaluation of using digital gravity field models for zoning map creation

    NASA Astrophysics Data System (ADS)

    Loginov, Dmitry

    2018-05-01

    Digital cartographic models of geophysical fields are currently taking on special significance in geophysical mapping. One important application is the creation of zoning maps, which make it possible to account for the morphology of a geophysical field when automating the choice of contour intervals. The purpose of this work is a comparative evaluation of various digital models in the creation of an integrated gravity field zoning map. Two models were chosen for comparison: the digital model of the gravity field of Russia, created from the analog map at a scale of 1 : 2 500 000, and the open global model of the Earth's gravity field, WGM2012. As a result of the experimental work, four integrated gravity field zoning maps were obtained using raw and processed data from each gravity field model. The study demonstrates that open data can be used to create integrated zoning maps, provided the noise component of the model is eliminated by processing in specialized software systems. In that case, for the problem of automated choice of contour intervals, the open digital models are not inferior to regional gravity field models created for individual countries. This supports the universality and independence of integrated zoning map creation, regardless of the level of detail of the underlying digital cartographic model of geophysical fields.
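
One common automated contour-interval strategy for such zoning maps is to place levels at equal-count quantiles of the field values, so that intervals adapt to the field's morphology (a generic sketch; the rule actually used in the specialized software systems mentioned may differ):

```python
def contour_levels(values, n_zones):
    """Choose n_zones - 1 contour levels from the empirical distribution
    of field values, so each zone contains roughly the same number of
    samples (equal-count quantile classification)."""
    s = sorted(values)
    return [s[int(len(s) * k / n_zones)] for k in range(1, n_zones)]
```

For a strongly skewed gravity field this packs levels where values are dense, instead of spacing them evenly between the minimum and maximum.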

  4. New reductions of the Astrographic Catalogue. Plate adjustments of the Algiers, Oxford I and II, and Vatican Zones.

    NASA Astrophysics Data System (ADS)

    Urban, S. E.; Martin, J. C.; Jackson, E. S.; Corbin, T. E.

    1996-07-01

    The U. S. Naval Observatory is in the process of making new reductions of the Astrographic Catalogue using a modern reference catalog, the ACRS, and new data analysis and reduction software. Currently ten AC zones have been reduced. This paper discusses the reduction models and results from the Algiers, Oxford I and II, and Vatican zones (those of the Cape zone are discussed elsewhere). The resulting star positions will be combined with those of the U.S. Naval Observatory's Twin Astrograph Catalog to produce a catalog of positions and proper motions in support of the Sloan Digital Sky Survey.

  5. Exploring stop-go decision zones at rural high-speed intersections with flashing green signal and insufficient yellow time in China.

    PubMed

    Tang, Keshuang; Xu, Yanqing; Wang, Fen; Oguchi, Takashi

    2016-10-01

    The objective of this study is to empirically analyze and model the stop-go decision behavior of drivers at rural high-speed intersections in China, where a flashing green signal of 3 s followed by a yellow signal of 3 s is commonly applied to end a green phase. 1,186 high-resolution vehicle trajectories were collected at four typical high-speed intersection approaches in Shanghai and used for the identification of actual stop-go decision zones and the modeling of stop-go decision behavior. Results indicate that the presence of flashing green significantly changed the theoretical decision zones based on the conventional Dilemma Zone theory. The actual stop-go decision zones at the study intersections were thus formulated and identified based on the empirical data. A Binary Logistic model and a Fuzzy Logic model were then developed to further explore the impacts of flashing green on the stop-go behavior of drivers. It was found that the Fuzzy Logic model could produce estimation results comparably good to those of the traditional Binary Logistic models. The findings of this study could contribute to the development of effective dilemma zone protection strategies, the improvement of the stop-go decision models embedded in microscopic traffic simulation software, and the proper design of signal change and clearance intervals at high-speed intersections in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
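
The binary logistic formulation of the stop-go decision can be sketched as follows (the coefficients below are invented for illustration; the paper fits its model to the 1,186 observed trajectories):

```python
import math

def stop_probability(distance_m, speed_mps,
                     beta0=-4.0, beta_dist=0.08, beta_speed=-0.05):
    """Binary logistic model of a driver's probability of stopping at the
    onset of flashing green, as a function of distance to the stop line
    and approach speed. Coefficients are hypothetical placeholders."""
    z = beta0 + beta_dist * distance_m + beta_speed * speed_mps
    return 1.0 / (1.0 + math.exp(-z))
```

With these signs, a driver far from the stop line is more likely to stop and a faster driver is more likely to go, which is the qualitative behavior a fitted model captures.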

  6. Corridor-based forecasts of work-zone impacts for freeways.

    DOT National Transportation Integrated Search

    2011-08-09

    This project developed an analysis methodology and associated software implementation for the evaluation of significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst to predict the operational im...

  7. Calibration of work zone impact analysis software for Missouri.

    DOT National Transportation Integrated Search

    2013-12-01

    This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...

  8. Algorithms for Coastal-Zone Color-Scanner Data

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Software for Nimbus-7 Coastal-Zone Color-Scanner (CZCS) derived products consists of a set of scientific algorithms for extracting information from CZCS-gathered data. The software uses the CZCS-generated Calibrated Radiance and Temperature (CRT) tape as input and outputs a computer-compatible tape and a film product.

  9. Modeling and simulation of the debonding process of composite solid propellants

    NASA Astrophysics Data System (ADS)

    Feng, Tao; Xu, Jin-sheng; Han, Long; Chen, Xiong

    2017-07-01

    In order to study the damage evolution law of composite solid propellants, a molecular dynamics particle-filling algorithm was used to establish a mesoscopic structure model of HTPB (hydroxyl-terminated polybutadiene) propellants. The cohesive element method was employed for the adhesion interface between AP (ammonium perchlorate) particles and the HTPB matrix, and the bilinear cohesive zone model was used to describe the mechanical response of the interface elements. An inversion analysis method based on the Hooke-Jeeves optimization algorithm was employed to identify the parameters of the cohesive zone model (CZM) of the particle/binder interface. The optimized parameters were then applied in the commercial finite element software ABAQUS to simulate the damage evolution process for the AP particles and HTPB matrix, including crack initiation, development, coalescence and macroscopic cracking. Finally, the simulated stress-strain curve was compared with the experimental curves. The result shows that the bilinear cohesive zone model can accurately describe the debonding and fracture process between the AP particles and the HTPB matrix under uniaxial tension loading.
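
The bilinear traction-separation law used for the interface elements can be sketched as follows (the generic bilinear CZM form; the inverted parameter values from the paper are not reproduced, and the numbers in the usage note are illustrative):

```python
def bilinear_traction(delta, delta0, delta_f, T_max):
    """Bilinear cohesive traction-separation law: traction rises linearly
    to the peak T_max at the damage-initiation opening delta0, then
    softens linearly to zero at the failure opening delta_f."""
    if delta <= 0:
        return 0.0
    if delta < delta0:
        return T_max * delta / delta0                          # elastic loading
    if delta < delta_f:
        return T_max * (delta_f - delta) / (delta_f - delta0)  # damage softening
    return 0.0                                                 # fully debonded
```

With delta0 = 0.01, delta_f = 0.1 and T_max = 5, the traction peaks at 5 at the initiation opening and decays to zero at complete debonding, which is the interface response the finite element simulation integrates over each cohesive element.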

  10. Mobile application MDDCS for modeling the expansion dynamics of a dislocation loop in FCC metals

    NASA Astrophysics Data System (ADS)

    Kirilyuk, Vasiliy; Petelin, Alexander; Eliseev, Andrey

    2017-11-01

    A mobile version of the software package Dynamic Dislocation of Crystallographic Slip (MDDCS) designed for modeling the expansion dynamics of dislocation loops and formation of a crystallographic slip zone in FCC-metals is examined. The paper describes the possibilities for using MDDCS, the application interface, and the database scheme. The software has a simple and intuitive interface and does not require special training. The user can set the initial parameters of the experiment, carry out computational experiments, export parameters and results of the experiment into separate text files, and display the experiment results on the device screen.

  11. Multimedia Delivery of Coastal Zone Management Training.

    ERIC Educational Resources Information Center

    Clark, M. J.; And Others

    1995-01-01

    Describes Coastal Zone Management (CZM) multimedia course modules, educational software written by the GeoData Institute at the University of Southampton for an environmental management undergraduate course. Examines five elements that converge to create CZM multimedia teaching: course content, source material, a hardware/software delivery system,…

  12. Preferential Flow Paths In A Karstified Spring Catchment: A Study Of Fault Zones As Conduits To Rapid Groundwater Flow

    NASA Astrophysics Data System (ADS)

    Kordilla, J.; Terrell, A. N.; Veltri, M.; Sauter, M.; Schmidt, S.

    2017-12-01

    In this study we model saturated and unsaturated flow in the karstified Weendespring catchment, located within the Leinetal graben in Goettingen, Germany. We employ the finite element COMSOL Multiphysics modeling software to model variably saturated flow using the Richards equation with a van Genuchten type parameterization. As part of the graben structure, the Weende spring catchment is intersected by seven fault zones along the main flow path of the 7400 m cross section of the catchment. As the Weende spring is part of the drinking water supply in Goettingen, it is particularly important to understand the vulnerability of the catchment and the effect of fault zones on rapid transport of contaminants. Nitrate signals have been observed at the spring only a few days after the application of fertilizers within the catchment at a distance of approximately 2 km. As the underlying layers are known to be highly impermeable, fault zones within the area are likely to create rapid flow paths to the water table and the spring. The model conceptualizes the catchment as containing three hydrogeological limestone units with varying degrees of karstification: the lower Muschelkalk limestone as a highly conductive layer, the middle Muschelkalk as an aquitard, and the upper Muschelkalk as another conductive layer. The fault zones are parameterized based on a combination of field data from quarries, remote sensing and literature data. Each fault zone is modeled as a fault core together with its surrounding damage zone, with separate, specific hydraulic properties. The 2D conceptual model was implemented in COMSOL to study unsaturated flow at the catchment scale using van Genuchten parameters. The study demonstrates the importance of fault zones for preferential flow within the catchment and their effect on the spatial distribution of vulnerability.
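
The van Genuchten parameterization used with the Richards equation relates water content to pressure head. A minimal sketch (parameter values are illustrative, not those of the Weendespring model):

```python
def van_genuchten_theta(psi, alpha=1.0, n=2.0, theta_r=0.05, theta_s=0.40):
    """Van Genuchten water-retention curve: volumetric water content as a
    function of pressure head psi (negative in the unsaturated zone).
    alpha and n are shape parameters; theta_r/theta_s are residual and
    saturated water contents."""
    if psi >= 0:
        return theta_s  # saturated conditions
    m = 1.0 - 1.0 / n
    Se = (1.0 + (alpha * abs(psi)) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * Se
```

As the pressure head becomes more negative (drier soil), the curve drops smoothly from the saturated content toward the residual content; this nonlinearity is what makes variably saturated flow problems like this one numerically demanding.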

  13. A New Paradigm For Modeling Fault Zone Inelasticity: A Multiscale Continuum Framework Incorporating Spontaneous Localization and Grain Fragmentation.

    NASA Astrophysics Data System (ADS)

    Elbanna, A. E.

    2015-12-01

    The brittle portion of the crust contains structural features such as faults, jogs, joints, bends and cataclastic zones that span a wide range of length scales. These features may have a profound effect on earthquake nucleation, propagation and arrest. Incorporating these existing features in modeling, and the ability to spontaneously generate new ones in response to earthquake loading, is crucial for predicting seismicity patterns, distribution of aftershocks and nucleation sites, earthquake arrest mechanisms, and topological changes in the seismogenic zone structure. Here, we report on our efforts in modeling two important mechanisms contributing to the evolution of fault zone topology: (1) grain comminution at the submeter scale, and (2) secondary faulting/plasticity at the scale of a few to hundreds of meters. We use the finite element software Abaqus to model the dynamic rupture. The constitutive response of the fault zone is modeled using the Shear Transformation Zone theory, a non-equilibrium statistical thermodynamic framework for modeling plastic deformation and localization in amorphous materials such as fault gouge. The gouge layer is modeled as a 2D plane strain region with a finite thickness and a heterogeneous distribution of porosity. By coupling the amorphous gouge with the surrounding elastic bulk, the model introduces a set of novel features that go beyond the state of the art. These include: (1) self-consistent rate dependent plasticity with a physically-motivated set of internal variables, (2) non-locality that alleviates mesh dependence of shear band formation, (3) spontaneous evolution of fault roughness and its strike, which affects ground motion generation and the local stress fields, and (4) spontaneous evolution of grain size and fault zone fabric.

  14. The application of the pilot points in groundwater numerical inversion model

    NASA Astrophysics Data System (ADS)

    Hu, Bin; Teng, Yanguo; Cheng, Lirong

    2015-04-01

    Numerical inverse modeling has been widely applied in groundwater studies. Compared with traditional forward modeling, inverse modeling offers more room for investigation. Zonation and cell-by-cell inversion are the conventional methods; the pilot-point method lies between them. Traditional inverse modeling typically uses software to divide the model into several zones, so that only a few parameters need to be estimated; however, such a distribution is usually too simple, and the simulation results deviate accordingly. Cell-by-cell inversion would, in theory, yield the most realistic parameter distribution, but it demands great computational effort and a large quantity of survey data for geostatistical characterization of the simulated area. In contrast, the pilot-point method distributes a set of points throughout the model domains for parameter estimation; property values are assigned to model cells by kriging, so that the heterogeneity of parameters within geological units is honored. This reduces the geostatistical data requirements of the simulated area and bridges the gap between the two methods above. Pilot points can save calculation time and improve the goodness of fit, and they also reduce the numerical instability caused by large numbers of parameters, among other advantages. In this paper, we apply pilot points to a field site whose structural formation is heterogeneous and whose hydraulic parameters were unknown, and we compare the inversion results of the zonation and pilot-point methods; through comparative analysis we explore the characteristics of pilot points in groundwater inverse modeling. First, the modeler generates an initial spatially correlated field from a geostatistical model, based on the description of the case site, using the software Groundwater Vistas 6. Kriging is then used to obtain the field values (hydraulic conductivity) over the model domain from their values at measurement and pilot-point locations; pilot points are assigned to the interpolated field, which has been divided into four zones, and a range of disturbance values is added to the inversion targets to calculate the hydraulic conductivity. Third, after inversion calculation with PEST, the interpolated field minimizes an objective function measuring the misfit between calculated and measured data; finding the optimum parameter values is an optimization problem. From the inversion modeling, the following major conclusions can be drawn: (1) In a field with a heterogeneous structural formation, the pilot-point method gives more realistic results: a better parameter fit and a more stable numerical simulation (a stable residual distribution). Compared with zonation, it better reflects the heterogeneity of the study field. (2) The pilot-point method ensures that each parameter is sensitive and not entirely dependent on the other parameters, guaranteeing the relative independence and authenticity of the parameter estimates. However, it costs more computation time than zonation. Key words: groundwater; pilot point; inverse model; heterogeneity; hydraulic conductivity
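The interpolation step from pilot-point values to model cells can be sketched as follows. Inverse-distance weighting stands in here as a simple proxy for the kriging actually used in the Groundwater Vistas/PEST workflow, and the coordinates and log-conductivity values are invented for illustration:

```python
import math

def idw_interpolate(pilot_points, x, y, power=2.0):
    """Interpolate a value at (x, y) from pilot-point values.

    pilot_points: list of (px, py, value) tuples. Inverse-distance
    weighting is used here as a stand-in for the kriging interpolation
    performed by the real workflow.
    """
    num = den = 0.0
    for px, py, v in pilot_points:
        d = math.hypot(x - px, y - py)
        if d < 1e-12:               # exactly on a pilot point
            return v
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical log10(K) values at four pilot points (one per zone)
pilots = [(0, 0, -4.0), (10, 0, -5.0), (0, 10, -4.5), (10, 10, -6.0)]
k_mid = idw_interpolate(pilots, 5, 5)   # cell in the middle of the field
```

In a real PEST run the pilot-point values themselves are the adjustable parameters, and the interpolation is repeated each iteration to map them onto the model grid.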

  15. Computerized Workstation for Tsunami Hazard Monitoring

    NASA Astrophysics Data System (ADS)

    Lavrentiev-Jr, Mikhail; Marchuk, Andrey; Romanenko, Alexey; Simonov, Konstantin; Titov, Vasiliy

    2010-05-01

    We present the general structure and functionality of the proposed Computerized Workstation for Tsunami Hazard Monitoring (CWTHM). The tool allows interactive monitoring of hazard, tsunami risk assessment, and mitigation at all stages, from the preparation period of a strong tsunamigenic earthquake to the inundation of the defended coastal areas. CWTHM is a software-hardware complex with a set of software applications, optimized to achieve the best performance on the hardware platforms in use. The complex is calibrated for the selected tsunami source zone(s) and the coastal zone(s) to be defended. The number of zones (both source and coastal) is determined, or restricted, by the available hardware resources. The complex monitors the selected tsunami source zone via the Internet. The authors developed original algorithms that automatically detect the preparation zone of a strong underwater earthquake. For the zone so determined, the event time, magnitude and spatial location of the tsunami source are evaluated by analyzing the energy of the seismic precursors (foreshocks). All the above parameters are updated after each foreshock. Once a preparing event is detected, several scenarios are forecast for the wave amplitude parameters as well as the inundation zone. Estimates include the lowest and highest wave amplitudes and the smallest and largest inundation zones; in addition, the most probable case is calculated. In the case of multiple defended coastal zones, forecasts and estimates can be produced in parallel. Each time the simulated model wave reaches deep ocean buoys or a tidal gauge, the expected wave parameters and inundation zones are updated using historical event information and pre-calculated scenarios. The Method of Splitting Tsunami (MOST) software package is used for mathematical simulation. The authors propose code acceleration for deep water wave propagation. 
As a result, performance is 15 times faster than the original version of MOST. The performance gain is achieved through compiler options, the use of optimized libraries, and OpenMP parallel technology. Moreover, it is possible to achieve a 100-fold code acceleration by using modern Graphics Processing Units (GPUs). Parallel evaluation of inundation zones for multiple coastal zones is also available. All computer codes can be easily assembled under MS Windows and the Unix OS family. Although the software is virtually platform independent, the greatest performance gain is achieved when using the recommended hardware components. When a seismic event occurs, all relevant parameters are updated with seismic data and wave propagation monitoring is enabled. As soon as the wave passes each deep ocean tsunameter, the parameters of the initial displacement at the source are updated from direct calculations based on original algorithms. For better source reconstruction, a combination of two methods is used: an optimal linear combination of unit sources from a pre-calculated database, and direct numerical inversion along the wave ray between the real source and particular measurement buoys. A specific dissipation parameter along the wave ray is also taken into account. Throughout the wave propagation process, the expected wave parameters and inundation zone characteristics are updated with all available information. If the recommended hardware components are used, monitoring results are available in real time. The proposed version of CWTHM has been tested by analyzing seismic precursors (foreshocks) and the measured tsunami waves in the North Pacific for the Central Kuril tsunamigenic earthquake of November 15, 2006.
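The deep-water propagation being accelerated is governed by shallow-water-type equations. As a rough illustration of the physics (not MOST's actual nonlinear scheme, which uses its own splitting method), one explicit step of the 1-D linear shallow-water system can be written as:

```python
def shallow_water_step(eta, u, H, g=9.81, dx=1000.0, dt=1.0):
    """One explicit step of the 1-D linear shallow-water equations,
    d(eta)/dt = -H du/dx  and  du/dt = -g d(eta)/dx,
    with centered differences in space. Illustrative only: a stable
    production scheme (as in MOST) needs staggering and a CFL-limited
    time step dt <= dx / sqrt(g * H)."""
    n = len(eta)
    new_eta = eta[:]
    new_u = u[:]
    for i in range(1, n - 1):       # boundaries held fixed
        new_eta[i] = eta[i] - H * dt * (u[i + 1] - u[i - 1]) / (2 * dx)
        new_u[i] = u[i] - g * dt * (eta[i + 1] - eta[i - 1]) / (2 * dx)
    return new_eta, new_u

# A surface bump at rest: the first step accelerates water away from it
eta0 = [0.0, 0.0, 1.0, 0.0, 0.0]
u0 = [0.0] * 5
eta1, u1 = shallow_water_step(eta0, u0, H=4000.0)
```

Inner loops like this one are exactly what vectorization, OpenMP threading, or a GPU kernel parallelize in the accelerated codes.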

  16. Estimating ammonium and nitrate load from septic systems to surface water bodies within ArcGIS environments

    NASA Astrophysics Data System (ADS)

    Zhu, Yan; Ye, Ming; Roeder, Eberhard; Hicks, Richard W.; Shi, Liangsheng; Yang, Jinzhong

    2016-01-01

This paper presents recently developed software, the ArcGIS-based Nitrogen Load Estimation Toolkit (ArcNLET), for estimating nitrogen loading from septic systems to surface water bodies. Load estimation is important for managing nitrogen pollution, a worldwide challenge to water resources and environmental management. ArcNLET simulates the coupled transport of ammonium and nitrate in both the vadose zone and groundwater, a unique feature not found in other ArcGIS-based software for nitrogen modeling. ArcNLET is designed to be flexible, supporting four simulation scenarios: (1) nitrate transport alone in groundwater; (2) ammonium and nitrate transport in groundwater; (3) ammonium and nitrate transport in the vadose zone; and (4) ammonium and nitrate transport in both the vadose zone and groundwater. With this flexibility, ArcNLET can be used as an efficient screening tool in a wide range of management projects related to nitrogen pollution. From the modeling perspective, this paper shows that in areas with a high water table (e.g. river and lake shores), it may not be correct to assume a complete nitrification process that converts all ammonium to nitrate in the vadose zone, because observation data can indicate that a substantial amount of ammonium enters groundwater. Therefore, in areas with a high water table, simulating ammonium transport and estimating ammonium loading, in addition to nitrate transport and loading, are important for avoiding underestimation of the nitrogen loading. This is demonstrated in the Eggleston Heights neighborhood in the City of Jacksonville, FL, USA, where monitoring well observations included a well with predominant ammonium concentrations. The ammonium loading given by the calibrated ArcNLET model can be 10-18% of the total nitrogen load, depending on various factors discussed in the paper.
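The coupled ammonium-nitrate behaviour described above is, in its simplest form, a first-order reaction chain: ammonium nitrifies to nitrate, and nitrate is lost to denitrification. A closed-form sketch of that kinetics (the rates and initial concentrations below are invented; this is not ArcNLET's implementation) is:

```python
import math

def nitrogen_along_path(nh4_0, no3_0, k_nit, k_den, t):
    """Closed-form first-order chain: NH4 -> NO3 -> N2.

    nh4_0, no3_0 : initial concentrations (e.g. mg/L)
    k_nit, k_den : first-order nitrification / denitrification rates (1/day)
    t            : travel time along the flow path (days)
    Returns (NH4, NO3) at time t (standard Bateman solution).
    """
    nh4 = nh4_0 * math.exp(-k_nit * t)
    if abs(k_nit - k_den) < 1e-12:      # degenerate equal-rate case
        transfer = nh4_0 * k_nit * t * math.exp(-k_nit * t)
    else:
        transfer = nh4_0 * k_nit / (k_den - k_nit) * (
            math.exp(-k_nit * t) - math.exp(-k_den * t))
    no3 = no3_0 * math.exp(-k_den * t) + transfer
    return nh4, no3
```

With a slow nitrification rate and a short vadose-zone travel time (the high-water-table case), the NH4 term stays large at the water table, which is the paper's point about not assuming complete nitrification.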

  17. Pilot study of a novel tool for input-free automated identification of transition zone prostate tumors using T2- and diffusion-weighted signal and textural features.

    PubMed

    Stember, Joseph N; Deng, Fang-Ming; Taneja, Samir S; Rosenkrantz, Andrew B

    2014-08-01

To present results of a pilot study to develop software that identifies regions suspicious for prostate transition zone (TZ) tumor, free of user input. Eight patients with TZ tumors were used to develop the model by training a Naïve Bayes classifier to detect tumors, selecting the most accurate predictors among various signal and textural features on T2-weighted imaging (T2WI) and apparent diffusion coefficient (ADC) maps. Features tested as inputs were: average signal, signal standard deviation, energy, contrast, correlation, homogeneity and entropy (all defined on T2WI); and average ADC. In training cases, the software tiled the TZ with 4 × 4-voxel "supervoxels," 80% of which were used to train the classifier; a forward selection scheme was then used on the remaining 20% of training set supervoxels to identify the important inputs. Each of 100 iterations selected T2WI energy and average ADC, which were therefore deemed the optimal model inputs. The resulting two-feature model was applied blindly to a separate set of ten test patients, half with TZ tumors, again without operator input of suspicious foci. The software correctly predicted the presence or absence of TZ tumor in all test patients. Furthermore, the locations of predicted tumors corresponded spatially with the locations of biopsies that had confirmed their presence. Preliminary findings suggest that this tool has the potential to accurately predict TZ tumor presence and location without operator input. © 2013 Wiley Periodicals, Inc.
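The classifier at the core of this pipeline is a standard Gaussian Naïve Bayes on two features per supervoxel. A self-contained sketch (the feature values below are synthetic and the code is not the authors' implementation) looks like:

```python
import math

def train_gnb(samples):
    """Fit a two-class Gaussian Naive Bayes on (features, label) pairs.
    Mirrors the abstract's two-feature model (T2WI energy, average ADC),
    but with made-up numbers."""
    stats = {}
    for label in (0, 1):
        rows = [f for f, y in samples if y == label]
        cols = list(zip(*rows))
        n = len(rows)
        means = [sum(c) / n for c in cols]
        varis = [max(sum((v - m) ** 2 for v in c) / n, 1e-9)
                 for c, m in zip(cols, means)]
        stats[label] = (n / len(samples), means, varis)
    return stats

def predict_gnb(stats, features):
    """Return the label with the highest Gaussian log-posterior."""
    best_label, best_lp = None, -math.inf
    for label, (prior, means, varis) in stats.items():
        lp = math.log(prior)
        for x, m, v in zip(features, means, varis):
            lp += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

# Synthetic supervoxels: (T2WI energy, average ADC); label 1 = tumor
data = [((0.90, 0.70), 1), ((0.80, 0.60), 1), ((0.85, 0.65), 1),
        ((0.20, 1.40), 0), ((0.30, 1.50), 0), ((0.25, 1.30), 0)]
model = train_gnb(data)
```

In the study's workflow, every 4 × 4-voxel supervoxel in the TZ is scored this way, and clusters of tumor-labelled supervoxels mark the suspicious regions.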

  18. EVALUATION OF VADOSE ZONE AND SOURCE MODULES FOR MULTI-MEDIA, MULTI-PATHWAY, AND MULTI-RECEPTOR RISK ASSESSMENT USING LARGE-SOIL-COLUMN EXPERIMENTAL DATA

    EPA Science Inventory

    The United States Environmental Protection Agency (EPA) is developing a comprehensive environmental exposure and risk analysis software system for agency-wide application using the methodology of a Multi-media, Multi-pathway, Multi-receptor Risk Assessment (3MRA) model. This sof...

  19. Numerical Modeling and Forecasting of Strong Sumatra Earthquakes

    NASA Astrophysics Data System (ADS)

    Xing, H. L.; Yin, C.

    2007-12-01

ESyS-Crustal, a finite-element-based computational model and software, has been developed and applied to simulate complex nonlinear interacting fault systems, with the goal of accurately predicting earthquakes and tsunami generation. With the available tectonic setting and GPS data around the Sumatra region, the simulation results using the developed software clearly indicate that the shallow part of the subduction zone in the Sumatra region between latitudes 6S and 2N has been locked for a long time, and remained locked even after the northern part of the zone underwent a major slip event resulting in the infamous Boxing Day tsunami. Two strong earthquakes that occurred in this region (between 6S and 1S) in the distant past, in 1797 (M8.2) and 1833 (M9.0) respectively, indicate a high potential for very large destructive earthquakes in this region, with relatively long periods of quiescence in between. These results were presented at the 5th ACES International Workshop in 2006, before the 2007 Sumatra earthquakes occurred, which fell exactly within the predicted zone (see the ACES2006 web site and the detailed presentation file through the workshop agenda). The preliminary simulation results obtained so far show that there seem to be a few obvious events around the previously locked zone before it ruptures totally, but apparently no indication in the near future of a giant earthquake similar to the 2004 M9 event, which several earthquake scientists believe will happen. Further detailed simulations will be carried out and presented at the meeting.

  20. Construction of a 3D structural model based on balanced cross sections and borehole data to create a foundation for further geological and hydrological simulations

    NASA Astrophysics Data System (ADS)

    Donndorf, St.; Malz, A.; Kley, J.

    2012-04-01

    Cross section balancing is a generally accepted method for studying fault zone geometries. We show a method for the construction of structural 3D models of complex fault zones using a combination of gOcad modelling and balanced cross sections. In this work a 3D model of the Schlotheim graben in the Thuringian Basin was created from serial, parallel cross sections and existing borehole data. The Thuringian Basin was originally a part of the North German Basin, from which it was separated by the Harz uplift in the Late Cretaceous. It comprises several parallel NW-trending inversion structures. The Schlotheim graben is one example of these inverted graben zones, whose structure poses special challenges to 3D modelling. The fault zone extends 30 km in the NW-SE direction and 1 km in the NE-SW direction. The project was split into two parts: data management and model building. To manage the fundamental data, a central database was created in ESRI's ArcGIS, and a scripting interface was developed to handle the data exchange between the different steps of modelling. The first step is the pre-processing of the base data in ArcGIS, followed by cross section balancing with Midland Valley's Move software and finally the construction of the 3D model in Paradigm's gOcad. With the specific aim of constructing a 3D model based on cross sections, the functionality of the gOcad software had to be extended. These extensions include pre-processing functions to create a simplified and usable database for gOcad, construction functions to create surfaces based on linearly distributed data, and processing functions to create the 3D model from different surfaces. In order to use the model for further geological and hydrological simulations, special requirements apply to the surface properties. The first requirement is a quality mesh containing triangles with maximized internal angles; to achieve this, an external meshing tool was integrated into gOcad. 
The second requirement is that intersection lines between two surfaces must be included in both surfaces and share nodes with them. To finish the modelling process, 3D balancing was performed to further improve the model quality.
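The "maximized internal angles" quality criterion for the triangulated surfaces can be checked directly. A minimal 2-D sketch (the 30° threshold is an assumed value for illustration, not a requirement stated in the abstract) is:

```python
import math

def triangle_angles(a, b, c):
    """Internal angles (degrees) of a 2-D triangle with vertices a, b, c."""
    def ang(p, q, r):               # angle at vertex p
        v1 = (q[0] - p[0], q[1] - p[1])
        v2 = (r[0] - p[0], r[1] - p[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1 = math.hypot(*v1)
        n2 = math.hypot(*v2)
        return math.degrees(math.acos(dot / (n1 * n2)))
    return ang(a, b, c), ang(b, a, c), ang(c, a, b)

def min_angle_quality(tri, threshold=30.0):
    """A triangle passes the (assumed) quality criterion if its smallest
    internal angle exceeds the threshold -- the 'maximized internal
    angles' goal mentioned for the external meshing tool."""
    return min(triangle_angles(*tri)) >= threshold
```

External meshers typically enforce exactly this kind of minimum-angle bound, rejecting sliver triangles that would destabilize downstream geological or hydrological simulations.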

  1. Visualising higher order Brillouin zones with applications

    NASA Astrophysics Data System (ADS)

    Andrew, R. C.; Salagaram, T.; Chetty, N.

    2017-05-01

A key concept in materials science is the relationship between the Bravais lattice, the reciprocal lattice and the resulting Brillouin zones (BZ). These zones are often complicated shapes that are hard to construct and visualise without the use of sophisticated software, even by professional scientists. We have used a simple sorting algorithm to construct BZ of any order for a chosen Bravais lattice that is easy to implement in any scientific programming language. The resulting zones can then be visualised using freely available plotting software. This method has pedagogical value for upper-level undergraduate students since, along with other computational methods, it can be used to illustrate how constant-energy surfaces combine with these zones to create van Hove singularities in the density of states. In this paper we apply our algorithm along with the empirical pseudopotential method and the 2D equivalent of the tetrahedron method to show how they can be used in a simple software project to investigate this interaction for a 2D crystal. This project not only enhances students’ fundamental understanding of the principles involved but also improves transferable coding skills.
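The sorting idea behind such constructions is simple to state: a wavevector k lies in the nth Brillouin zone when exactly n−1 reciprocal lattice points are closer to k than the origin is, i.e. when the origin is the nth-nearest lattice point after sorting distances. A minimal 2-D square-lattice version (reciprocal lattice vectors of unit length, i.e. units of 2π/a; this is an illustrative sketch, not the authors' code) is:

```python
import math

def zone_order(kx, ky, nmax=3):
    """Return the Brillouin-zone order of wavevector (kx, ky) for a 2-D
    square reciprocal lattice with unit lattice spacing.

    The zone order is 1 plus the number of reciprocal lattice points
    (excluding the origin) that lie closer to k than the origin does;
    nmax bounds the search window and must exceed the zone orders of
    interest."""
    d0 = math.hypot(kx, ky)                 # distance to the origin (Gamma)
    closer = 0
    for i in range(-nmax, nmax + 1):
        for j in range(-nmax, nmax + 1):
            if i == 0 and j == 0:
                continue
            if math.hypot(kx - i, ky - j) < d0:
                closer += 1
    return closer + 1
```

Evaluating `zone_order` on a fine grid of k-points and colouring by the returned integer reproduces the familiar nested BZ pictures, which can then be rendered with any plotting package.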

  2. ANALYTIC ELEMENT MODELING FOR SOURCE WATER ASSESSMENTS OF PUBLIC WATER SUPPLY WELLS: CASE STUDIES IN GLACIAL OUTWASH AND BASIN-AND-RANGE

    EPA Science Inventory

    Over the last 10 years the EPA has invested in analytic elements as a computational method used in public domain software supporting capture zone delineation for source water assessments and wellhead protection. The current release is called WhAEM2000 (wellhead analytic element ...

  3. Integrated software for the detection of epileptogenic zones in refractory epilepsy.

    PubMed

    Mottini, Alejandro; Miceli, Franco; Albin, Germán; Nuñez, Margarita; Ferrándo, Rodolfo; Aguerrebere, Cecilia; Fernandez, Alicia

    2010-01-01

In this paper we present integrated software designed to help nuclear medicine physicians in the detection of epileptogenic zones (EZ) by means of ictal-interictal SPECT and MR images. The tool was designed to be flexible, friendly and efficient. A novel detection method (A-contrario) was included along with the classical detection method (subtraction analysis). The software's performance was evaluated with two separate sets of validation studies: visual interpretation of 12 patient images by an experienced observer, and objective analysis of virtual brain phantom experiments by the proposed numerical observers. Our results support the potential use of the proposed software to help nuclear medicine physicians detect EZ in clinical practice.
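The classical subtraction analysis mentioned above amounts to a voxelwise ictal-minus-interictal difference, thresholded statistically. A toy sketch on a flat list of voxel intensities (the k-sigma threshold and lack of coregistration/normalization are simplifications; this is not the paper's implementation) is:

```python
import statistics

def subtraction_map(ictal, interictal, k=2.0):
    """Flag voxels whose ictal-minus-interictal difference exceeds
    k standard deviations above the mean of the difference image.
    A sketch of 'classical' subtraction analysis; the real tool also
    coregisters and intensity-normalizes the SPECT volumes first."""
    diff = [a - b for a, b in zip(ictal, interictal)]
    mu = statistics.fmean(diff)
    sd = statistics.pstdev(diff)
    return [d > mu + k * sd for d in diff]
```

The A-contrario alternative instead asks how unlikely a cluster of high-difference voxels is under a background noise model, which makes the detection threshold less arbitrary than a fixed k.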

  4. The prediction of radiofrequency ablation zone volume using vascular indices of 3-dimensional volumetric colour Doppler ultrasound in an in vitro blood-perfused bovine liver model

    PubMed Central

    Lanctot, Anthony C; McCarter, Martin D; Roberts, Katherine M; Glueck, Deborah H; Dodd, Gerald D

    2017-01-01

    Objective: To determine the most reliable predictor of radiofrequency (RF) ablation zone volume among three-dimensional (3D) volumetric colour Doppler vascular indices in an in vitro blood-perfused bovine liver model. Methods: 3D colour Doppler volume data of the local hepatic parenchyma were acquired from 37 areas of 13 bovine livers connected to an in vitro oxygenated blood perfusion system. The Doppler vascular indices of vascularization index (VI), flow index (FI) and vascularization flow index (VFI) were obtained from the volume data using 3D volume analysis software. 37 RF ablations were performed at the same locations where the ultrasound data were obtained. The relationship between these vascular indices and the ablation zone volumes measured from gross specimens was analyzed using a general linear mixed model with a random effect for liver and backward stepwise regression analysis. Results: FI was significantly associated with ablation zone volumes measured on gross specimens (p = 0.0047), but explained little of the variance (Rβ2 = 0.21). Ablation zone volume decreased by 0.23 cm3 (95% confidence interval: −0.38, −0.08) for every one-unit increase in FI. Neither VI nor VFI was significantly associated with ablation zone volumes (p > 0.05). Conclusion: Although FI was associated with ablation zone volumes, it could not sufficiently explain their variability, limiting its clinical applicability. VI, FI and VFI are not clinically useful in the prediction of RF ablation zone volume in the liver. Advances in knowledge: Despite a significant association of FI with ablation zone volumes, VI, FI and VFI cannot be used for their prediction. Different Doppler vascular indices need to be investigated for clinical use. PMID:27925468

  5. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    PubMed Central

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, have simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building and then sampling designs and strategies could be developed based on those zones. PMID:27134782
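The final step described above, turning simulated contaminant predictions into priority zones and a sampling allocation, can be sketched as simple glue logic. The function below is purely illustrative of the idea of feeding dispersion output into a sampling plan; it is not the CONTAM or VSP API, and the zone names and budget are invented:

```python
def priority_zones(zone_conc, n_samples, top_k=3):
    """Rank building zones by simulated contaminant concentration and
    split a sampling budget across the top-k zones in proportion to
    concentration. Hypothetical glue between a dispersion model and a
    sampling-plan tool; not CONTAM/VSP code."""
    ranked = sorted(zone_conc.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    total = sum(c for _, c in ranked)
    return {zone: round(n_samples * c / total) for zone, c in ranked}

# Hypothetical zone-average concentrations from an ensemble of runs
plan = priority_zones(
    {"lobby": 1.0, "hall": 4.0, "office": 5.0, "lab": 0.1}, n_samples=20)
```

In practice VSP's statistical designs (e.g. for confidence of detection) would replace the naive proportional split, but the input it needs is exactly this kind of per-zone prediction.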

  7. ModBack - simplified contaminant source zone delineation using backtracking

    NASA Astrophysics Data System (ADS)

    Thielsch, K.; Herold, M.; Ptak, T.

    2012-12-01

    Contaminated groundwater poses a serious threat to drinking water resources all over the world. Even though contaminated water might be detected in observation wells, a proper clean-up is often only successful if the source of the contamination is detected and subsequently removed, contained or remediated. The high costs of groundwater remediation could possibly be reduced significantly if, from the outset, a focus is placed on source zone detection. ModBack combines several existing modelling tools in one easy-to-use, GIS-based interface that helps delineate potential contaminant source zones in the subsurface. The software is written in Visual Basic 3.5 and uses the ArcObjects library to implement all required GIS applications. It runs without modification on any Microsoft Windows based PC with sufficient RAM and at least Microsoft .NET Framework 3.5. Using ModBack requires the additional installation of the following software: Processing Modflow Pro 7.0, ModPath, CSTREAM (Bayer-Raich et al., 2003), Golden Software Surfer and Microsoft Excel. The graphical user interface of ModBack is separated into four blocks of procedures dealing with data input, groundwater modelling, backtracking and analyses. Geographical data input includes all georeferenced information pertaining to the study site. Information on subsurface contamination is gathered either by conventional sampling of monitoring wells or by conducting integral pumping tests at control planes with a specific sampling scheme. Hydraulic data from these pumping tests, together with all other available information, are then used to set up a groundwater flow model of the study site, which provides the flow field for transport simulations within the subsequent contamination backtracking procedures, starting from the defined control planes. The backtracking results are then analysed within ModBack. 
The potential areas of contamination source presence or absence are determined based on the procedure of Jarsjö et al. (2005). The contaminant plume length can be estimated using plume length statistics, first-order rate degradation equations or calculations based on site-specific hydraulic and chemical parameters. Furthermore, an analytical tool is included to identify the distribution of contaminants across a control plane. All relevant output can be graphically displayed and saved as vector data for later use in GIS software. ModBack has already been used to delimit the zones of source presence or absence at several test sites. With ModBack, a tool is now available that enables environmental consultants, engineers and environmental agencies to delineate possible sources of contamination already at the planning stage of site investigation and remediation measures, helping to significantly reduce the costs of contaminated site management. Bayer-Raich, M., Jarsjö, J., Holder, T. and Ptak, T. (2003): "Numerical estimations of contaminant mass flow rate based on concentration measurements in pumping wells", ModelCare 2002: A Few Steps Closer to Reality, IAHS Publication No. 277, 10-16. Jarsjö, J., Bayer-Raich, M., Ptak, T. (2005): "Monitoring groundwater contamination and delineating source zones at industrial sites: Uncertainty analyses using integral pumping tests", Journal of Contaminant Hydrology, 79, 107-134.
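The backtracking at the heart of this workflow is reverse particle tracking: particles released at the control plane are marched against the flow field to outline where a source could lie. A coarse sketch of the idea (explicit Euler steps in a steady 2-D velocity field; MODPATH uses a semi-analytical cell-by-cell scheme, and this is not the ModBack implementation) is:

```python
def backtrack(x, y, velocity, dt=1.0, steps=100):
    """Trace a particle backwards through a steady velocity field.

    velocity: function (x, y) -> (vx, vy), e.g. seepage velocities from
    a calibrated flow model. The minus sign marches against the flow,
    from the control plane toward the potential source area."""
    path = [(x, y)]
    for _ in range(steps):
        vx, vy = velocity(x, y)
        x -= vx * dt
        y -= vy * dt
        path.append((x, y))
    return path

# Toy field: uniform eastward flow of 1 m/day. A particle observed at
# x = 100 m on the control plane is traced 50 days back upstream.
uniform = lambda x, y: (1.0, 0.0)
path = backtrack(100.0, 0.0, uniform, dt=1.0, steps=50)
```

Releasing many particles across the control plane and overlaying their backward paths yields the envelope of possible source locations that ModBack-style analyses then refine with the Jarsjö et al. (2005) uncertainty procedure.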

  8. Numerical simulation of the stress distribution in a coal mine caused by a normal fault

    NASA Astrophysics Data System (ADS)

    Zhang, Hongmei; Wu, Jiwen; Zhai, Xiaorong

    2017-06-01

The Luling coal mine was studied using FLAC3D software to analyze the stress distribution characteristics on the two sides of a normal fault zone, using two different working-face models: the working faces were located on the hanging wall and on the footwall, respectively, with both mining directions oriented toward the fault. The stress distributions differed across the fault. When the working face was located on the hanging wall, stress became concentrated and the range influenced by stress gradually grew; the fault zone clearly impeded stress transmission, so stress concentrated on the fault zone and the hanging wall. In the second model, the stress on the two sides decreased at first but then increased, continuing to be transmitted to the hanging wall; the stress concentrated in the fault zone decreased and stress transmission was evident. These results could be used to minimize roadway damage and lengthen the time available for coal mining through careful design of the roadway and working face.

  9. A client-server software for the identification of groundwater vulnerability to pesticides at regional level.

    PubMed

    Di Guardo, Andrea; Finizio, Antonio

    2015-10-15

The groundwater VULnerability to PESticides software system (VULPES) is user-friendly, GIS-based, client-server software developed to identify areas vulnerable to pesticides at the regional level, making use of pesticide fate models. It is a Decision Support System aimed at assisting public policy makers in investigating areas sensitive to specific substances and in proposing limitations of use or mitigation measures. VULPES identifies so-called Uniform Geographical Units (UGUs), which are areas characterised by the same agro-environmental conditions. In each UGU it applies the PELMO model, obtaining the 80th percentile of the substance concentration at 1 metre depth; VULPES then creates a vulnerability map in shapefile format, classifying the outputs against a lower threshold set to the legal limit concentration in groundwater (0.1 μg/l). This paper describes the software structure in detail, along with a case study applying the herbicide terbuthylazine over the territory of the Lombardy region. Three zones with different degrees of vulnerability have been identified and described. Copyright © 2015 Elsevier B.V. All rights reserved.
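The classification step, from a per-UGU 80th-percentile predicted concentration to a vulnerability class, reduces to a threshold comparison against the 0.1 μg/l legal limit. In the sketch below, the three-class scheme and the 10× upper boundary are assumptions for illustration; the abstract only states that the lower threshold equals the legal limit:

```python
def classify_ugu(pec_80th, legal_limit=0.1):
    """Assign a vulnerability class to a Uniform Geographical Unit from
    the 80th-percentile predicted concentration at 1 m depth (ug/L).
    The three-class scheme and the 10x legal-limit boundary are
    hypothetical; VULPES' actual class boundaries may differ."""
    if pec_80th < legal_limit:
        return "not vulnerable"
    if pec_80th < 10 * legal_limit:
        return "vulnerable"
    return "highly vulnerable"

# Hypothetical PELMO outputs for three UGUs (ug/L)
classes = {ugu: classify_ugu(c)
           for ugu, c in {"UGU-1": 0.05, "UGU-2": 0.5, "UGU-3": 2.0}.items()}
```

Writing `classes` back to the UGU polygons as an attribute is what produces the shapefile vulnerability map described in the paper.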

  10. Numerical modeling of fracking fluid and methane migration through fault zones in shale gas reservoirs

    NASA Astrophysics Data System (ADS)

    Taherdangkoo, Reza; Tatomir, Alexandru; Sauter, Martin

    2017-04-01

    Hydraulic fracturing of shale gas reservoirs has gained growing interest over the last few years. Groundwater contamination is one of the most important environmental concerns to have emerged around shale gas development (Reagan et al., 2015). The potential impacts of hydraulic fracturing can be studied through the possible pathways for subsurface migration of contaminants towards overlying aquifers (Kissinger et al., 2013; Myers, 2012). The intent of this study is to investigate, by means of numerical simulation, two failure scenarios based on the presence of a fault zone that penetrates the full thickness of the overburden and connects the shale gas reservoir to the aquifer. Scenario 1 addresses the potential transport of fracturing fluid from the shale into the subsurface; it was modeled with the COMSOL Multiphysics software. Scenario 2 deals with the leakage of methane from the reservoir into the overburden; its numerical modeling was implemented in DuMux, a free and open-source discrete fracture model (DFM) simulator (Tatomir, 2012). The modeling results are used to evaluate the influence of several important parameters (reservoir pressure, aquifer-reservoir separation thickness, fault zone inclination, porosity, permeability, etc.) that could affect fluid transport through the fault zone. Furthermore, we determined the main transport mechanisms and the circumstances under which fracturing fluid or methane could migrate through the fault zone into overlying geological layers. The results show that the presence of a conductive fault can reduce contaminant travel time and that, under certain hydraulic conditions, significant contaminant leakage is likely to occur.
Bibliography
Kissinger, A., Helmig, R., Ebigbo, A., Class, H., Lange, T., Sauter, M., Heitfeld, M., Klünker, J., Jahnke, W., 2013. Hydraulic fracturing in unconventional gas reservoirs: risks in the geological system, part 2. Environ Earth Sci 70, 3855-3873.
Myers, T., 2012. Potential contaminant pathways from hydraulically fractured shale to aquifers. Groundwater 50(6), 872-882.
Reagan, M.T., Moridis, G.J., Keen, N.D., Johnson, J.N., 2015. Numerical simulation of the environmental impact of hydraulic fracturing of tight/shale gas reservoirs on near-surface groundwater: Background, base cases, shallow reservoirs, short-term gas, and water transport. Water Resources Research 51, 2543-2573.
Tatomir, A., 2012. From Discrete to Continuum Concepts of Flow in Fractured Porous Media. University of Stuttgart.
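
    The fault-zone leakage scenarios above are governed by Darcy flow. As an illustrative sketch (with hypothetical parameter values, not those used in the study), the advective travel time of fluid driven up a conductive fault by reservoir overpressure can be estimated as:

```python
def fault_travel_time(k, mu, dp, length, porosity):
    """Advective travel time (s) up a fault zone via Darcy's law.

    k: fault permeability (m^2), mu: fluid viscosity (Pa*s),
    dp: driving pressure difference (Pa), length: fault path length (m),
    porosity: fault-zone porosity (-).
    """
    darcy_flux = k * dp / (mu * length)       # q = (k/mu) * dP/L  (m/s)
    seepage_velocity = darcy_flux / porosity  # true pore-water velocity
    return length / seepage_velocity

# Hypothetical example: 1 km aquifer-reservoir separation, 10 MPa overpressure
t = fault_travel_time(k=1e-13, mu=1e-3, dp=1e7, length=1000.0, porosity=0.05)
years = t / (365.25 * 24 * 3600)
```

    Higher fault permeability or overpressure shortens the travel time linearly, which is the qualitative sensitivity the parameter study examines.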

  11. Life prediction and constitutive models for engine hot section anisotropic materials program

    NASA Technical Reports Server (NTRS)

    Nissley, D. M.; Meyer, T. G.; Walker, K. P.

    1992-01-01

    This report presents a summary of results from a 7-year program designed to develop generic constitutive and life prediction approaches and models for nickel-based single crystal gas turbine airfoils. The program was composed of a base program and an optional program. The base program addressed the high temperature coated single crystal regime above the airfoil root platform. The optional program investigated the low temperature uncoated single crystal regime below the airfoil root platform, including the notched conditions of the airfoil attachment. Both programs involved experimental and analytical efforts. Results from uniaxial constitutive and fatigue life experiments on coated and uncoated PWA 1480 single crystal material formed the basis for the analytical modeling effort. Four single crystal primary orientations were used in the experiments: ⟨001⟩, ⟨011⟩, ⟨111⟩, and ⟨213⟩. Specific secondary orientations were also selected for the notched experiments in the optional program. Constitutive models for an overlay coating and PWA 1480 single crystal materials were developed based on isothermal hysteresis loop data and verified using thermomechanical fatigue (TMF) hysteresis loop data. A fatigue life approach and life models were developed for TMF crack initiation of coated PWA 1480. A life model was developed for smooth and notched fatigue in the optional program. Finally, computer software incorporating the overlay coating and PWA 1480 constitutive and life models was developed.

  12. A numerical study of zone-melting process for the thermoelectric material of Bi2Te3

    NASA Astrophysics Data System (ADS)

    Chen, W. C.; Wu, Y. C.; Hwang, W. S.; Hsieh, H. L.; Huang, J. Y.; Huang, T. K.

    2015-06-01

    In this study, a numerical model is established using the commercial software ProCAST to simulate the temperature variation/distribution and the resulting microstructure of Bi2Te3 fabricated by the zone-melting technique. An experiment is then conducted to measure the temperature variation/distribution during the zone-melting process in order to validate the numerical model. The effects of processing parameters, such as heater moving speed and heater temperature, on the crystallization microstructure are also evaluated numerically. In the experiment, Bi2Te3 powder is filled into a quartz cylinder of 30 mm diameter, and the heater is set to 800 °C with a moving speed of 12.5 mm/hr. A thermocouple is inserted into the Bi2Te3 powder to measure the temperature variation/distribution during the zone-melting process. The measured temperature variation/distribution is compared with the results of the numerical simulation, and the two are found to match well. The model is then used to evaluate crystal formation for the 30 mm diameter Bi2Te3 process; it is found that columnar crystals are obtained when the moving speed is slower than 17.5 mm/hr. Finally, the model is used to predict crystal formation in the zone-melting process for Bi2Te3 with a 45 mm diameter. The results show that it is difficult to grow columnar crystals when the diameter reaches 45 mm.
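
    The moving-speed dependence of the columnar-to-equiaxed transition reported above can be illustrated with the classic G/R solidification heuristic (columnar growth when the ratio of thermal gradient G to growth rate R is high). The gradient and critical values below are hypothetical stand-ins, not outputs of the ProCAST model:

```python
def growth_mode(gradient_K_per_m, growth_rate_m_per_s, g_over_r_crit):
    """Classic solidification heuristic: columnar growth when G/R exceeds a
    critical value; equiaxed otherwise. (Illustrative stand-in for the full
    ProCAST microstructure model.)"""
    ratio = gradient_K_per_m / growth_rate_m_per_s
    return "columnar" if ratio >= g_over_r_crit else "equiaxed"

# In steady zone melting, the growth rate tracks the heater moving speed.
v_slow = 12.5e-3 / 3600.0   # 12.5 mm/hr in m/s
v_fast = 20.0e-3 / 3600.0   # 20 mm/hr in m/s
G = 5000.0                  # hypothetical thermal gradient, K/m
crit = 1.2e9                # hypothetical critical G/R, K*s/m^2
mode_slow = growth_mode(G, v_slow, crit)
mode_fast = growth_mode(G, v_fast, crit)
```

    Slower heater travel raises G/R and favors columnar growth, consistent with the trend the simulations report.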

  13. Control of surface thermal scratch of strip in tandem cold rolling

    NASA Astrophysics Data System (ADS)

    Chen, Jinshan; Li, Changsheng

    2014-07-01

    Thermal scratch seriously affects the surface quality of cold-rolled stainless steel strip. Some researchers have carried out qualitative and theoretical studies in this field, but there is currently a lack of research on the effective forecasting and control of thermal scratch defects in practical production, especially in tandem cold rolling. In order to establish a precise mathematical model of oil film thickness in the deformation zone, lubrication in the cold rolling process of SUS410L stainless steel strip is studied, and the major factors affecting oil film thickness are analyzed. Based on statistical principles, a mathematical model of the critical oil film thickness in the deformation zone for thermal scratch is built using fitting and regression analysis, and a criterion for identifying thermal scratch defects, based on the temperature comparison method, is put forward. Storing and retrieving data through SQL Server 2010, software for controlling thermal scratch defects in tandem cold rolling of stainless steel was developed in Microsoft Visual Studio 2008 using the MFC framework, and then put into practical production. Statistics indicate that the hit rate of thermal scratch prediction is as high as 92.38%, and the occurrence rate of thermal scratch is decreased by 89.13%. Owing to the application of the software, the rolling speed is increased by approximately 9.3%. The software provides an effective solution to the problem of thermal scratch defects in tandem cold rolling and helps to improve the surface quality of stainless steel strip in practical production.

  14. Automatic building information model query generation

    DOE PAGES

    Jiang, Yufei; Yu, Nan; Ming, Jiang; ...

    2015-12-01

    Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges in data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges and can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking full advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.

  16. Numerical simulation of air distribution in a room with a sidewall jet under benchmark test conditions

    NASA Astrophysics Data System (ADS)

    Zasimova, Marina; Ivanov, Nikolay

    2018-05-01

    The goal of the study is to validate Large Eddy Simulation (LES) data on mixing ventilation in an isothermal room at the conditions of the benchmark experiments by Hurnik et al. (2015). The focus is on the accuracy of the predicted mean and rms velocity fields in the quasi-free jet zone of the room, with a 3D jet supplied from a sidewall rectangular diffuser. Calculations were carried out using the ANSYS Fluent 16.2 software with an algebraic wall-modeled LES subgrid-scale model. CFD results for the mean velocity vector are compared with Laser Doppler Anemometry data. The difference between the mean velocity vector and the mean air speed in the jet zone, both LES-computed, is presented and discussed.
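
    The distinction the study draws between the mean velocity vector and the mean air speed can be illustrated with synthetic samples (not the benchmark data): the magnitude of the time-averaged velocity vector is never larger than the time-averaged speed, and the gap grows with turbulent fluctuations:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic velocity record: steady 1 m/s x-component plus fluctuations
u = 1.0 + 0.5 * rng.standard_normal(100_000)  # x-component (m/s)
v = 0.5 * rng.standard_normal(100_000)        # y-component (m/s)

mean_vector_mag = np.hypot(u.mean(), v.mean())  # |<u>|
mean_speed = np.hypot(u, v).mean()              # <|u|>
```

    By the triangle inequality `mean_vector_mag <= mean_speed` for any velocity record, which is why the two quantities must be distinguished when comparing simulations against point-measured speeds.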

  17. An Amphibious Magnetotelluric Investigation of the Cascadian Seismogenic and ETS zones.

    NASA Astrophysics Data System (ADS)

    Parris, B. A.; Livelybrooks, D.; Bedrosian, P.; Egbert, G. D.; Key, K.; Schultz, A.; Cook, A.; Kant, M.; Wogan, N.; Zeryck, A.

    2015-12-01

    The amphibious Magnetotelluric Observations of Cascadia using a Huge Array (MOCHA) experiment seeks to address unresolved questions about the seismogenic locked zone and the down-dip transition zone where episodic tremor and slip (ETS) originates. The presence of free fluids is thought to be one of the primary controls on ETS behavior within the Cascadia margin. Since the bulk electrical conductivity of the crust and mantle can be greatly increased by fluids, magnetotelluric (MT) observations offer unique insights into the fluid distribution and its relation to observed ETS behavior. Here we present preliminary results from the 146 MT stations collected for the MOCHA project. MOCHA is unique in that it is the first amphibious array of MT stations deployed to provide a 3-D interpretation of the conductivity structure of a subduction zone. The MOCHA data set comprises 75 onshore stations and 71 offshore stations, accumulated over a two-year period and located on an approximately 25 km grid spanning from the trench to the eastern Willamette Valley, and from central Oregon into middle Washington. We present a series of east-west (cross-strike) two-dimensional inversions, created using the MARE2DEM software, that provide an initial picture of the conductivity structure of the locked and ETS zones and its along-strike variations. Our models can be used to identify correlations between ETS occurrence rates and inferred fluid concentrations. Our modeling explores the impact of various parameterizations on the 2-D inversion results, including a smoothness penalty reduction along the inferred slab interface. This series of 2-D inversions can then be used collectively to construct and guide an a priori 3-D inversion. In addition, we present a preliminary 3-D inversion of the onshore stations created using the ModEM software; we are currently modifying ModEM to support inversion of offshore data. The more computationally intensive 3-D inversion of the full amphibious data set will address questions regarding along-strike heterogeneity in fluid distributions within the locked and ETS-originating zones.

  18. Atmospheric release model for the E-area low-level waste facility: Updates and modifications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The atmospheric release model (ARM) uses the GoldSim® Monte Carlo simulation software (GTG, 2017) to evaluate the flux of gaseous radionuclides as they volatilize from E-Area disposal facility waste zones, diffuse through the air-filled soil pores surrounding the waste, and emanate at the land surface. This report documents the updates and modifications to the ARM for the next planned E-Area performance assessment (PA), considering recommendations from the 2015 PA strategic planning team outlined by Butcher and Phifer.
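
    As a sketch of the diffusion step described above, a steady-state Fickian flux can be computed with the common Millington-Quirk tortuosity model. Note the tortuosity model and all soil and source values here are illustrative assumptions, not the documented ARM parameterization:

```python
def millington_quirk_deff(d_air, theta_a, porosity):
    """Effective gas diffusion coefficient in soil (m^2/s) using the
    Millington-Quirk tortuosity model: D_eff = D_air * theta_a**(10/3) / n**2.
    theta_a: air-filled porosity, porosity (n): total porosity."""
    return d_air * theta_a ** (10.0 / 3.0) / porosity ** 2

def steady_diffusive_flux(d_eff, c_source, depth):
    """Steady-state Fickian flux (Bq/m^2/s) from a waste zone at `depth` (m)
    with gas-phase concentration c_source (Bq/m^3) and zero concentration
    at the land surface."""
    return d_eff * c_source / depth

# Hypothetical numbers: radon-like diffusivity in air, dry sandy soil
d_eff = millington_quirk_deff(d_air=1.1e-5, theta_a=0.30, porosity=0.35)
flux = steady_diffusive_flux(d_eff, c_source=1.0e4, depth=3.0)
```

    The steep `theta_a**(10/3)` dependence is why soil moisture, which displaces air-filled pore space, strongly suppresses gaseous release.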

  19. The thermochemical, two-phase dynamics of subduction zones: results from new, fully coupled models

    NASA Astrophysics Data System (ADS)

    Rees Jones, D. W.; Katz, R. F.; May, D.; Tian, M.; Rudge, J. F.

    2017-12-01

    Subduction zones are responsible for most of Earth's subaerial volcanism. However, previous geodynamic modelling of subduction zones has largely neglected magmatism. We previously showed that magmatism has a significant thermal impact, by advecting sensible heat into the lithosphere beneath arc volcanoes [1]. Inclusion of this effect helps reconcile subduction zone models with petrological and heat flow observations. Many important questions remain, including how the magma-mantle dynamics of subduction zones affects the position of arc volcanoes and the character of their lavas. In this presentation, we employ a fully coupled, thermochemical, two-phase flow theory to investigate the dynamics of subduction zones. We present the first results from our new software (SubFUSc), which solves the coupled equations governing conservation of mass, momentum, energy and chemical species. The presence and migration of partial melts affect permeability and mantle viscosity (both directly and through their thermal impact); these, in turn, feed back on the magma-mantle flow. Our fully coupled modelling thus improves upon previous two-phase models that decoupled the governing equations and fixed the thermal structure [2]. To capture phase change, we use a novel, simplified model of mantle melting in the presence of volatile species. As in the natural system, volatiles are associated with low-degree melting at temperatures beneath the anhydrous solidus; dehydration reactions in the slab supply volatiles into the wedge, triggering silicic melting. We simulate the migration of melts under buoyancy forces and dynamic pressure gradients. We thereby demonstrate the dynamical controls on the pattern of subduction-zone volcanism (particularly its location, magnitude, and chemical composition). We build on our previous study of the thermal consequences of magma genesis and segregation, and address the question of what controls the location of arc volcanoes themselves [3].
[1] Rees Jones, D. W., Katz, R. F., Tian, M. and Rudge, J. F. (2017). Thermal impact of magmatism in subduction zones. arxiv.org/abs/1701.02550
[2] Wilson, C. R., Spiegelman, M., van Keken, P. E. and Hacker, B. R. (2014). EPSL, doi:10.1016/j.epsl.2014.05.052
[3] England, P. C. and Katz, R. F. (2010). Nature, doi:10.1038/nature09417

  20. The increase in the starting torque of PMSM motor by applying of FOC method

    NASA Astrophysics Data System (ADS)

    Plachta, Kamil

    2017-05-01

    The article presents a field-oriented control (FOC) method for a permanent magnet synchronous motor equipped with optical sensors. This method allows wide-range regulation of the torque and rotational speed of the electric motor. The paper presents a mathematical model of the electric motor and the vector control method. Optical sensors have a shorter response time than inductive sensors, which allows the electronic control system to react faster to changes in motor load. The motor driver is based on a digital signal processor that performs advanced mathematical operations in real time. Applying the Clarke and Park transformations in the software determines the rotor position angle. The presented solution provides smooth adjustment of the rotational speed in the first operating zone and reduces the dead zone of the torque in the second and third operating zones.
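
    The Clarke and Park transformations at the heart of FOC are standard. A minimal sketch shows how balanced three-phase currents map to constant d/q quantities once the rotor angle is known (generic textbook forms, not the paper's DSP implementation):

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: three-phase currents -> alpha/beta."""
    alpha = (2.0 * ia - ib - ic) / 3.0
    beta = (ib - ic) / math.sqrt(3.0)
    return alpha, beta

def park(alpha, beta, theta):
    """Park transform: stationary alpha/beta -> rotating d/q frame at rotor angle theta."""
    d = alpha * math.cos(theta) + beta * math.sin(theta)
    q = -alpha * math.sin(theta) + beta * math.cos(theta)
    return d, q

# Balanced three-phase currents at rotor angle theta map to constant d/q values
theta = 0.7
ia = math.cos(theta)
ib = math.cos(theta - 2.0 * math.pi / 3.0)
ic = math.cos(theta + 2.0 * math.pi / 3.0)
d, q = park(*clarke(ia, ib, ic), theta)  # d ≈ 1, q ≈ 0 for this alignment
```

    In the rotating frame the controller can regulate d and q currents (flux and torque components) with simple PI loops, which is what makes FOC attractive for torque control.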

  1. The Pedestrian Evacuation Analyst: geographic information systems software for modeling hazard evacuation potential

    USGS Publications Warehouse

    Jones, Jeanne M.; Ng, Peter; Wood, Nathan J.

    2014-01-01

    Recent disasters such as the 2011 Tohoku, Japan, earthquake and tsunami; the 2013 Colorado floods; and the 2014 Oso, Washington, mudslide have raised awareness of catastrophic, sudden-onset hazards that arrive within minutes of the events that trigger them, such as local earthquakes or landslides. Due to the limited amount of time between generation and arrival of sudden-onset hazards, evacuations are typically self-initiated, on foot, and across the landscape (Wood and Schmidtlein, 2012). Although evacuation to naturally occurring high ground may be feasible in some vulnerable communities, evacuation modeling has demonstrated that other communities may require vertical-evacuation structures within a hazard zone, such as berms or buildings, if at-risk individuals are to survive some types of sudden-onset hazards (Wood and Schmidtlein, 2013). Researchers use both static least-cost-distance (LCD) and dynamic agent-based models to assess the pedestrian evacuation potential of vulnerable communities. Although both types of models help to understand the evacuation landscape, LCD models provide a more general overview that is independent of population distributions, which may be difficult to quantify given the dynamic spatial and temporal nature of populations (Wood and Schmidtlein, 2012). Recent LCD efforts related to local tsunami threats have focused on an anisotropic (directionally dependent) path distance modeling approach that incorporates travel directionality, multiple travel speed assumptions, and cost surfaces that reflect variations in slope and land cover (Wood and Schmidtlein, 2012, 2013). The Pedestrian Evacuation Analyst software implements this anisotropic path-distance approach for pedestrian evacuation from sudden-onset hazards, with a particular focus at this time on local tsunami threats. 
The model estimates evacuation potential based on elevation, direction of movement, land cover, and travel speed and creates a map showing travel times to safety (a time map) throughout a hazard zone. Model results provide a general, static view of the evacuation landscape at different pedestrian travel speeds and can be used to identify areas outside the reach of naturally occurring high ground. In addition, data on the size and location of different populations within the hazard zone can be integrated with travel-time maps to create tables and graphs of at-risk population counts as a function of travel time to safety. As a decision-support tool, the Pedestrian Evacuation Analyst provides the capability to evaluate the effectiveness of various vertical-evacuation structures within a study area, both through time maps of the modeled travel-time landscape with a potential structure in place and through comparisons of population counts within reach of safety. The Pedestrian Evacuation Analyst is designed for use by researchers examining the pedestrian-evacuation potential of an at-risk community. In communities where modeled evacuation times exceed the event (for example, tsunami wave) arrival time, researchers can use the software with emergency managers to assess the area and population served by potential vertical-evacuation options. By automating and managing the modeling process, the software allows researchers to concentrate efforts on providing crucial and timely information on community vulnerability to sudden-onset hazards.
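
    The travel-time mapping idea can be sketched with a tiny grid-based Dijkstra computation (uniform cells, with per-cell walking speeds standing in for the slope and land-cover cost surfaces; the software's anisotropic path-distance formulation is more elaborate):

```python
import heapq

def travel_time_map(speed, safe_cells, cell_size=1.0):
    """Minimum travel time (s) from every cell to the nearest safe cell,
    via Dijkstra over 4-connected neighbours. speed: 2-D list of walking
    speeds (m/s) per cell; safe_cells: iterable of (row, col) targets."""
    rows, cols = len(speed), len(speed[0])
    time = [[float("inf")] * cols for _ in range(rows)]
    heap = [(0.0, r, c) for r, c in safe_cells]
    for _, r, c in heap:
        time[r][c] = 0.0
    heapq.heapify(heap)
    while heap:
        t, r, c = heapq.heappop(heap)
        if t > time[r][c]:
            continue  # stale entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # Edge cost: cell size divided by the mean speed of both cells
                nt = t + cell_size / ((speed[r][c] + speed[nr][nc]) / 2.0)
                if nt < time[nr][nc]:
                    time[nr][nc] = nt
                    heapq.heappush(heap, (nt, nr, nc))
    return time

# 1 m cells: fast pavement (1.5 m/s) vs slow sand (0.5 m/s); safety at (0, 0)
tmap = travel_time_map([[1.5, 1.5], [0.5, 0.5]], safe_cells=[(0, 0)])
```

    Real tools refine this with directional (anisotropic) costs, so that, e.g., walking uphill is slower than walking downhill across the same cell.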

  2. Abrupt Upper-Plate Tilting Upon Slab-Transition-Zone Collision

    NASA Astrophysics Data System (ADS)

    Crameri, F.; Lithgow-Bertelloni, C. R.

    2017-12-01

    During its sinking, the remnant of a surface plate crosses and interacts with multiple boundaries in Earth's interior. The most prominent dynamic interaction arises at the upper-mantle transition zone, where the sinking plate is strongly affected by the higher-viscosity lower mantle. Within our numerical model, we show, for the first time, that this collision of the sinking slab with the transition zone induces a sudden, dramatic downward tilt of the upper plate towards the subduction trench. The slab-transition-zone collision sets parts of the higher-viscosity lower mantle in motion, which induces an overall larger return flow cell that, at its onset, abruptly tilts the upper plate by around 0.05 degrees over around 10 million years. Such a significant and abrupt variation in surface topography should be clearly visible in temporal geologic records of large-scale surface elevation and might explain continent-wide tilting as observed in Australia since the Eocene or in North America during the Phanerozoic. Unravelling this crucial mantle-lithosphere interaction was possible thanks to state-of-the-art numerical modelling (powered by StagYY; Tackley 2008, PEPI) and post-processing (powered by StagLab; www.fabiocrameri.ch/software). The new model introduced here to study the dynamically self-consistent temporal evolution of subduction features accurate subduction-zone topography, robust single-sided plate sinking, stronger plates close to laboratory values, an upper-mantle phase transition and, crucially, simple continents at a free surface. A novel, fully automated post-processing includes physical model diagnostics such as slab geometry, mantle flow pattern, upper-plate tilt angle and trench location.

  3. Computer Software Management and Information Center

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  4. Residence-time framework for modeling multicomponent reactive transport in stream hyporheic zones

    NASA Astrophysics Data System (ADS)

    Painter, S. L.; Coon, E. T.; Brooks, S. C.

    2017-12-01

    Process-based models for transport and transformation of nutrients and contaminants in streams require tractable representations of solute exchange between the stream channel and biogeochemically active hyporheic zones. Residence-time based formulations provide an alternative to detailed three-dimensional simulations and have had good success in representing hyporheic exchange of non-reacting solutes. We extend the residence-time formulation for hyporheic transport to accommodate general multicomponent reactive transport. To that end, the integro-differential form of previous residence time models is replaced by an equivalent formulation based on a one-dimensional advection dispersion equation along the channel coupled at each channel location to a one-dimensional transport model in Lagrangian travel-time form. With the channel discretized for numerical solution, the associated Lagrangian model becomes a subgrid model representing an ensemble of streamlines that are diverted into the hyporheic zone before returning to the channel. In contrast to the previous integro-differential forms of the residence-time based models, the hyporheic flowpaths have semi-explicit spatial representation (parameterized by travel time), thus allowing coupling to general biogeochemical models. The approach has been implemented as a stream-corridor subgrid model in the open-source integrated surface/subsurface modeling software ATS. We use bedform-driven flow coupled to a biogeochemical model with explicit microbial biomass dynamics as an example to show that the subgrid representation is able to represent redox zonation in sediments and resulting effects on metal biogeochemical dynamics in a tractable manner that can be scaled to reach scales.
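
    The flavor of a residence-time closure can be sketched for the non-reactive case: the concentration returning from the hyporheic zone is the channel input history convolved with a residence-time distribution. The exponential distribution and all numbers below are illustrative assumptions, not the paper's general multicomponent formulation:

```python
import math

def hyporheic_return_conc(c_in_history, mean_residence, dt):
    """Concentration returning from the hyporheic zone now: the channel
    input history convolved with an exponential residence-time distribution
    p(tau) = exp(-tau / T) / T (a common non-reactive closure).

    c_in_history: samples of c_in(t), oldest first, spaced dt (s) apart."""
    n = len(c_in_history)
    total = 0.0
    for i, c in enumerate(c_in_history):
        tau = (n - 1 - i) * dt  # age of that parcel of returning water
        total += c * math.exp(-tau / mean_residence) / mean_residence * dt
    return total

# Step input of tracer: the return concentration approaches the plateau value
history = [1.0] * 2000                 # c_in = 1 for the last ~33 hours
c_out = hyporheic_return_conc(history, mean_residence=3600.0, dt=60.0)
```

    The paper's Lagrangian travel-time form generalizes this: each "streamline" through the hyporheic zone carries a full reactive-transport calculation rather than a simple decay-free memory kernel.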

  5. Hot 'nough for ya?: Controls and Constraints on modeling flux melting in subduction zones

    NASA Astrophysics Data System (ADS)

    Spiegelman, M.; Wilson, C. R.; van Keken, P.; Kelemen, P. B.; Hacker, B. R.

    2012-12-01

    The qualitative concept of flux melting in subduction zones is well established. Progressive dehydration reactions in the down-going slab release fluids to the hot overlying mantle wedge, causing flux melting and the migration of melts to the volcanic front. However, the quantitative details of fluid release, migration, melt generation and transport in the wedge remain poorly understood. In particular, there are two fundamental observations that defy quantitative modeling. The first is the location of the volcanic front with respect to intermediate-depth earthquakes (e.g. ~100±40 km; England et al., 2004, Syracuse and Abers, 2006), which is remarkably robust yet insensitive to subduction parameters. This is particularly surprising given new estimates of the variability of fluid release in global subduction zones (e.g. van Keken et al., 2011), which show great sensitivity of fluid release to slab thermal conditions. Reconciling these results implies some robust mechanism for focusing fluids/melts toward the wedge corner. The second observation is the global existence of thermally hot erupted basalts and andesites that, if derived from flux melting of the mantle, require sub-arc mantle temperatures of ~1300 °C at shallow pressures of 1-2 GPa, not that different from mid-ocean ridge conditions. These thermodynamic constraints are also implicit in recent parameterizations of wet melting (e.g. Kelley et al., 2010), which tend to produce significant amounts of melt only near the dry solidus. These observations impose significant challenges for geodynamic models of subduction zones, in particular for those that do not include the explicit transport of fluids and melts.
We present new high-resolution model results that suggest that a more complete description of coupled fluid/solid mechanics (allowing the fluid to interact with solid rheological variations) together with rheologically consistent solutions for temperature and solid flow, may provide the required ingredients that allow for robust focusing of both fluids and hot solids to the sub-arc regions. We demonstrate coupled fluid/solid flow models for simplified geometries to understand the basic processes, as well as for more geologically relevant models from a range of observed arc geometries. We will also evaluate the efficacy of current wet melting parameterizations in these models. All of these models have been built using new modeling software we have been developing that allows unprecedented flexibility in the composition and solution of coupled multi-physics problems. Dubbed TerraFERMA (the transparent Finite Element Rapid Model Assembler...no relation to the convection code TERRA), this new software leverages several advanced computational libraries (FEniCS/PETSc/Spud) to make it significantly easier to construct and explore a wide range of models of varying complexity. Subduction zones provide an ideal application area for understanding the role of different degrees of coupling of fluid and solid dynamics and their relation to observations.

  6. Seismic Evaluation of A Historical Structure In Kastamonu - Turkey

    NASA Astrophysics Data System (ADS)

    Pınar, USTA; Işıl ÇARHOĞLU, Asuman; EVCİ, Ahmet

    2018-01-01

    The Kastamonu province is a seismically active zone, and the city has many historical stone-masonry buildings. In any probable future earthquake, existing buildings may suffer substantial or heavy damage. In the present study, one of the traditional historical houses located in Kastamonu was structurally investigated through a probabilistic seismic risk assessment methodology. The building was modeled using the Finite Element Modeling (FEM) software SAP2000. Time history analyses were carried out on the FEM models using 10 different ground motion records. Displacements were interpreted, and the results were displayed graphically and discussed.

  7. Zones of impact around icebreakers affecting beluga whales in the Beaufort Sea.

    PubMed

    Erbe, C; Farmer, D M

    2000-09-01

    A software model estimating zones of impact on marine mammals around man-made noise [C. Erbe and D. M. Farmer, J. Acoust. Soc. Am. 108, 1327-1331 (2000)] is applied to the case of icebreakers affecting beluga whales in the Beaufort Sea. Two types of noise emitted by the Canadian Coast Guard icebreaker Henry Larsen are analyzed: bubbler system noise and propeller cavitation noise. Effects on beluga whales are modeled both in a deep-water environment and a near-shore environment. The model estimates that the Henry Larsen is audible to beluga whales over ranges of 35-78 km, depending on location. The zone of behavioral disturbance is only slightly smaller. Masking of beluga communication signals is predicted within 14-71-km range. Temporary hearing damage can occur if a beluga stays within 1-4 km of the Henry Larsen for at least 20 min. Bubbler noise impacts over the short ranges quoted; propeller cavitation noise accounts for all the long-range effects. Serious problems can arise in heavily industrialized areas where animals are exposed to ongoing noise and where anthropogenic noise from a variety of sources adds up.
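
    The zone-of-impact idea can be sketched with a simple sonar-equation calculation: solve for the range at which the received level falls to a given threshold. Spherical spreading with optional linear absorption is assumed here, and the source level, threshold, and resulting ranges are hypothetical; this is not the propagation model of Erbe and Farmer:

```python
import math

def zone_radius(source_level, threshold, alpha=0.0):
    """Range (m) at which the received level drops to `threshold` (dB),
    assuming RL = SL - 20*log10(r) - alpha*r. alpha: absorption (dB/m);
    the alpha > 0 case is solved by bisection (RL is monotonic in r)."""
    if alpha == 0.0:
        return 10.0 ** ((source_level - threshold) / 20.0)
    lo, hi = 1.0, 1e7
    for _ in range(200):
        mid = (lo + hi) / 2.0
        rl = source_level - 20.0 * math.log10(mid) - alpha * mid
        if rl > threshold:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical numbers: 197 dB source, 100 dB audibility threshold
r_spreading = zone_radius(197.0, 100.0)              # spreading only
r_absorbed = zone_radius(197.0, 100.0, alpha=0.001)  # with absorption
```

    Adding even a small absorption term shrinks the audibility zone substantially at long range, which is one reason the paper's deep-water and near-shore environments give different impact radii.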

  8. Trophic network model of exposed sandy coast: Linking continental and marine water ecosystems

    NASA Astrophysics Data System (ADS)

    Razinkovas-Baziukas, Artūras; Morkūnė, Rasa; Bacevičius, Egidijus; Gasiūnaitė, Zita Rasuolė

    2017-08-01

    A macroscopic food web network for the exposed sandy coastal zone of the south-eastern Baltic Sea was reconstructed using the ECOPATH software to assess the matter and energy balance in the ecosystem. The model incorporated 40 living functional groups representing the Baltic Sea coastal system of Lithuania during the first decade of the 21st century. The overall pedigree index of the model was relatively high (0.66), as much of the input data originated from the study area. The results indicate net heterotrophy of the coastal zone due to strong influences from the nearby river-lagoon system (Curonian Lagoon). The majority of fish species and waterbirds were present in the coastal system on a seasonal basis, and their migrations contributed to heterotrophic conditions. Among fish, the freshwater stragglers possibly contribute to a reversal of the flow of biomass and energy from the coastal zone to the river-lagoon system. Top predators such as breeding and wintering piscivorous waterbirds and large pike-perch were identified as keystone species. There was a clear negative balance for the biomass of small marine pelagic fishes such as smelt, sprat and Baltic herring, which represent the main prey items in this system.

  9. Vadose Zone Fate and Transport Simulation of Chemicals Associated with Coal Seam Gas Extraction

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Mallants, D.; Jacques, D.; Van Genuchten, M.

    2017-12-01

    The HYDRUS-1D and HYDRUS (2D/3D) computer software packages are widely used finite element models for simulating the one-, and two- or three-dimensional movement of water, heat, and multiple solutes in variably-saturated media, respectively. While the standard HYDRUS models consider only the fate and transport of individual solutes or solutes subject to first-order degradation reactions, several specialized HYDRUS add-on modules can simulate far more complex biogeochemical processes. The objective of this presentation is to provide an overview of the HYDRUS models and their add-on modules, and to demonstrate applications of the software to the subsurface fate and transport of chemicals involved in coal seam gas extraction and water management operations. One application uses the standard HYDRUS model to evaluate the natural soil attenuation potential of hydraulic fracturing chemicals and their transformation products in case of an accidental release. By coupling the processes of retardation, first-order degradation and convective-dispersive transport of the biocide bronopol and its degradation products, we demonstrated how natural attenuation reduces initial concentrations by more than a factor of hundred in the top 5 cm of the vadose zone. A second application uses the UnsatChem module to explore the possible use of coal seam gas produced water for sustainable irrigation. Simulations with different irrigation waters (untreated, amended with surface water, and reverse osmosis treated) provided detailed results regarding chemical indicators of soil and plant health, notably SAR, EC and sodium concentrations. A third application uses the coupled HYDRUS-PHREEQC module to analyze trace metal transport involving cation exchange and surface complexation sorption reactions in the vadose zone leached with coal seam gas produced water following some accidental water release scenario. 
Results show that the main process responsible for trace metal migration is complexation of naturally present trace metals with inorganic ligands such as (bi)carbonate that enter the soil upon infiltration with alkaline produced water.
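The attenuation mechanism described in the first application can be illustrated with a back-of-the-envelope first-order decay calculation: a solute with retardation factor R moving downward at pore velocity v reaches depth z after a travel time t = R·z/v, during which its concentration decays by exp(-λt). This is a generic sketch, not the HYDRUS simulation itself, and all parameter values below are illustrative assumptions rather than the study's calibrated values.

```python
import math

def attenuation_factor(z_m, v_m_per_d, R, half_life_d):
    """C/C0 after transport to depth z (m) at pore velocity v (m/day),
    with retardation factor R and first-order decay (half-life in days)."""
    lam = math.log(2) / half_life_d       # first-order decay rate, 1/day
    travel_time = R * z_m / v_m_per_d     # retarded travel time, days
    return math.exp(-lam * travel_time)

# Illustrative (hypothetical) parameters: 5 cm of travel, 0.01 m/day
# infiltration, R = 5, 2-day half-life.
f = attenuation_factor(0.05, 0.01, 5.0, 2.0)
print(f)  # well below 0.01, i.e. more than a hundredfold reduction
```

Even with these rough numbers, the combination of retardation (which lengthens the travel time) and first-order decay produces a reduction of more than two orders of magnitude over a few centimetres, consistent with the qualitative result reported above.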

  10. Zoning method for environmental engineering geological patterns in underground coal mining areas.

    PubMed

    Liu, Shiliang; Li, Wenping; Wang, Qiqing

    2018-09-01

Environmental engineering geological patterns (EEGPs) are used to express the trend and intensity of eco-geological environment change caused by mining in underground coal mining areas, a complex process controlled by multiple factors. A new zoning method for EEGPs was developed based on variable-weight theory (VWT), in which the weights of factors vary with their values. The method was applied to the Yushenfu mining area, Shaanxi, China. First, the mechanism of the EEGPs caused by mining was elucidated, and four types of EEGPs were proposed. Subsequently, 13 key control factors were selected from mining conditions, lithosphere, hydrosphere, ecosphere, and climatic conditions, and their thematic maps were constructed using ArcGIS software and remote-sensing technologies. Then, a stimulation-punishment variable-weight model was built to calculate the variable weight of each factor, based on the partition of the basic evaluation units of the study area, the construction of a partition state-variable-weight vector, and the determination of variable-weight intervals. On this basis, a zoning mathematical model of EEGPs was established, and the zoning results were analyzed. For comparison, the traditional constant-weight theory (CWT) was also applied to divide the EEGPs. Finally, the zoning results obtained using VWT and CWT were compared. Verification by field investigation indicates that VWT is more accurate and reliable than CWT. The zoning results are consistent with actual conditions and are key to planning the rational development of coal resources and the protection of the eco-geological environment. Copyright © 2018 Elsevier B.V. All rights reserved.
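The core idea of variable-weight theory, as opposed to constant weights, can be sketched very roughly: a factor whose state is unfavourable has its weight "punished" (increased) so that it dominates the evaluation, after which the weights are renormalized. The exact state-variable-weight vector and interval construction are defined in the paper; the threshold, boost factor, and weight values below are purely hypothetical.

```python
def variable_weights(base_weights, states, threshold=0.5, boost=2.0):
    """Illustrative variable-weight sketch (NOT the paper's exact model):
    a factor whose normalized state value exceeds `threshold` (taken here
    as unfavourable) has its constant weight multiplied by `boost`;
    the weights are then renormalized to sum to 1."""
    raw = [w * (boost if s > threshold else 1.0)
           for w, s in zip(base_weights, states)]
    total = sum(raw)
    return [r / total for r in raw]

# Three hypothetical factors; the first is in an unfavourable state (0.9).
w = variable_weights([0.4, 0.3, 0.3], [0.9, 0.2, 0.4])
print(w)  # the first factor's relative weight rises above its constant 0.4
```

Under constant-weight theory the weights would stay at [0.4, 0.3, 0.3] regardless of state, which is exactly the behaviour the authors argue is less accurate.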

  11. Gravity and magnetic anomaly modeling and correlation using the SPHERE program and Magsat data

    NASA Technical Reports Server (NTRS)

    Braile, L. W.; Hinze, W. J. (Principal Investigator); Vonfrese, R. R. B.

    1980-01-01

The spherical Earth inversion, modeling, and contouring software were tested and modified for processing data in the Southern Hemisphere. Preliminary geologic/tectonic maps and selected cross sections for South and Central America and the Caribbean region are being compiled, as well as gravity and magnetic models for the major geological features of the area. A preliminary gravity model of the Andean Benioff zone was constructed so that the density columns east and west of the subducted plate are in approximate isostatic equilibrium. The magnetic anomaly for the corresponding magnetic model of the zone is being computed with the SPHERE program. A test tape containing global magnetic measurements was converted to a tape compatible with Purdue's CDC system. NOO data were screened for periods of high diurnal activity and reduced to anomaly form using the IGS-75 model. Magnetic intensity anomaly profiles were plotted on the conterminous U.S. map using the track lines as the anomaly base level. The transcontinental magnetic high seen in POGO and MAGSAT data is also represented in the NOO data.

  12. Software for the grouped optimal aggregation technique

    NASA Technical Reports Server (NTRS)

    Brown, P. M.; Shaw, G. W. (Principal Investigator)

    1982-01-01

The grouped optimal aggregation technique produces minimum variance, unbiased estimates of acreage and production for countries, zones (states), or any designated collection of acreage strata. It uses yield predictions, historical acreage information, and direct acreage estimates from satellite data. The acreage strata are grouped in such a way that the ratio model over historical acreage provides a smaller variance than if the model were applied to each individual stratum. An optimal weighting matrix based on historical acreages provides the link between incomplete direct acreage estimates and the total, current acreage estimate.
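The "optimal weighting" idea, combining an unbiased model-based estimate with a direct satellite estimate so that the combined variance is minimized, reduces in the scalar case to classical inverse-variance weighting. This is standard estimation theory, not the project's actual weighting matrix; the numbers are illustrative.

```python
def combine(estimates, variances):
    """Minimum-variance unbiased combination of independent, unbiased
    scalar estimates: weights are proportional to inverse variances."""
    inv = [1.0 / v for v in variances]
    w = [i / sum(inv) for i in inv]
    est = sum(wi * e for wi, e in zip(w, estimates))
    var = 1.0 / sum(inv)   # variance of the combined estimate
    return est, var

# e.g. a ratio-model acreage estimate and a direct satellite estimate
# (hypothetical values, in thousands of hectares)
est, var = combine([100.0, 110.0], [4.0, 16.0])
print(est, var)  # 102.0 3.2 -- combined variance below either input
```

Note that the combined variance (3.2) is smaller than the variance of either input estimate, which is the defining property of the minimum-variance aggregation the abstract describes.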

  13. Coastal zone environment measurements at Sakhalin Island using autonomous mobile robotic system

    NASA Astrophysics Data System (ADS)

    Tyugin, Dmitry; Kurkin, Andrey; Zaytsev, Andrey; Zeziulin, Denis; Makarov, Vladimir

    2017-04-01

To perform continuous, complex measurements of environmental characteristics in coastal zones, an autonomous mobile robotic system (AMRS) was built. The main advantage of such a system over manual measurements is the ability to quickly relocate the equipment and begin measuring; the AMRS can transport a set of sensors and an appropriate power source over long distances. The equipment installed on the AMRS includes a modern high-tech ship's radar, «Micran», for sea-wave measurements; a multiparameter platform, WXT 520, for weather monitoring; a high-precision GPS/GLONASS receiver, OS-203, for georeferencing; a laser scanner platform based on two Sick LMS-511 scanners, which provides 3D distance measurements up to 80 meters along the AMRS route; and a rugged quad-core fanless computer, Matrix MXE-5400, for data collection and recording. The equipment is controlled by high-performance modular software developed specifically for the AMRS. During the summer of 2016 an experiment was conducted, with measurements taken in the coastal zone of Sakhalin Island (Russia). The measuring system of the AMRS was run in automatic mode under software control. As a result, a large amount of data was collected and processed into a database, comprising continuous measurements of the coastal zone under different weather conditions. Of particular interest is a period of three-point storm detected on June 2, 2016. Further work will address processing of the measured environmental characteristics and verification of numerical models against the collected data. The presented results were obtained with the support of the Russian president's scholarship for young scientists and graduate students №SP-193.2015.5.

  14. Optimised layout and roadway support planning with integrated intelligent software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouniali, S.; Josien, J.P.; Piguet, J.P.

    1996-12-01

Experience with knowledge-based systems for layout planning and roadway support dimensioning has been accumulating in European coal mining since 1985. The systems SOUT (support choice and dimensioning, 1989), SOUT 2, PLANANK (planning of bolt support), Exos (layout planning diagnosis, 1994), and SOUT 3 (1995) have been developed in close cooperation by CdF{sup 1}, INERIS{sup 2}, EMN{sup 3} (France) and RAG{sup 4}, DMT{sup 5}, TH - Aachen{sup 6} (Germany); ISLSP (Integrated Software for Layout and Support Planning) development is in progress (completion scheduled for July 1996). This new software technology, in combination with conventional programming systems, numerical models, and existing databases, turned out to be suited for setting up an intelligent decision aid for layout and roadway support planning. The system enhances the reliability of planning and optimises the safety-to-cost ratio for (1) deformation forecasts for roadways in seam and surrounding rocks, with consideration of the general position of the roadway in the rock mass (zones of increased pressure, position of operating and mined panels); (2) support dimensioning; (3) yielding arches, rigid arches, porch sets, rigid rings, yielding rings, and bolting/shotcreting for drifts; (4) yielding arches, rigid arches, and porch sets for roadways in seam; and (5) bolt support for gateroads (assessment of exclusion criteria and calculation of the bolting pattern) and bolting of face-end zones (feasibility and safety assessment; stability guarantee).

  15. Aquifer development planning to supply a seaside resort: a case study in Goa, India

    NASA Astrophysics Data System (ADS)

    Lobo Ferreira, J. P. Cárcomo; da Conceição Cunha, Maria; Chachadi, A. G.; Nagel, Kai; Diamantino, Catarina; Oliveira, Manuel Mendes

    2007-09-01

    Using the hydrogeological and socio-economic data derived from a European Commission research project on the measurement, monitoring and sustainability of the coastal environment, two optimization models have been applied to satisfy the future water resources needs of the coastal zone of Bardez in Goa, India. The number of tourists visiting Goa since the 1970s has risen considerably, and roughly a third of them go to Bardez taluka, prompting growth in the tourist-related infrastructure in the region. The optimization models are non-linear mixed integer models that have been solved using GAMS/DICOPT++ commercial software. Optimization models were used, firstly, to indicate the most suitable zones for building seaside resorts and wells to supply the tourist industry with an adequate amount of water, and secondly, to indicate the best location for wells to adequately supply pre-existing hotels. The models presented will help to define the optimal locations for the wells and the hydraulic infrastructures needed to satisfy demand at minimum cost, taking into account environmental constraints such as the risk of saline intrusion.

  16. A Monte Carlo model for 3D grain evolution during welding

    NASA Astrophysics Data System (ADS)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification, and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather uses user-input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes, from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed-power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
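The Bézier specification of pool shapes mentioned above can be illustrated with de Casteljau's algorithm, the standard way to evaluate a Bézier curve from its control points. The control points below are hypothetical and are not taken from the SPPARKS weld application; they merely show how a handful of points can trace a smooth half cross-section of a pool.

```python
def bezier(points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] using
    de Casteljau's algorithm (repeated pairwise interpolation)."""
    pts = [list(p) for p in points]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return tuple(pts[0])

# Hypothetical control points for one half of a wide, shallow pool
# cross-section (x = half-width, y = depth, in mm; illustrative only).
ctrl = [(0.0, 0.0), (3.0, 0.0), (3.0, -1.0), (0.0, -1.5)]
outline = [bezier(ctrl, i / 10) for i in range(11)]
print(outline[0], outline[-1])  # endpoints equal first/last control points
```

Moving the middle control points outward or downward reshapes the pool from narrow-and-deep to wide-and-shallow without changing the algorithm, which is precisely the flexibility the abstract attributes to the Bézier parameterization.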

  17. Conservation Reserve Program effects on floodplain land cover management.

    PubMed

    Jobe, Addison; Kalra, Ajay; Ibendahl, Elise

    2018-05-15

Growing populations and industrialized agriculture practices have eradicated much of the United States wetlands along river floodplains. One program available for the restoration of floodplains is the Conservation Reserve Program (CRP). The current research explores the effects of CRP land change on flooding zones, utilizing Flood Modeller and HEC-RAS. Flood Modeller is shown to be a viable tool for flood modeling within the United States when compared against HEC-RAS. The software is applied to the Nodaway River system, located in the western portions of Iowa and Missouri, to model the effects of introducing new forest areas within the region. Flood stage during the conversion first decreases in the early years before rising to produce greater heights. Flow velocities where CRP land is present are reduced over the long term. Velocity reduction occurs as Manning's roughness increases with tree diameter and brush density. Flood zones become more widespread with the implementation of CRP. Future model implementations are recommended to examine the effects of smaller flood recurrence intervals. Copyright © 2018 Elsevier Ltd. All rights reserved.
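The link between roughness and velocity invoked above is Manning's equation, V = (1/n)·R^(2/3)·S^(1/2) in SI units, where n is Manning's roughness coefficient, R the hydraulic radius, and S the channel slope. The roughness values below are generic textbook-style figures for a natural channel before and after afforestation, not values from the study.

```python
def manning_velocity(n, hydraulic_radius_m, slope):
    """Mean flow velocity (m/s) from Manning's equation, SI units:
    V = (1/n) * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values (not from the study): the same channel reach before
# and after tree/brush growth raises Manning's n from 0.035 to 0.10.
v_before = manning_velocity(0.035, 2.0, 0.001)
v_after = manning_velocity(0.10, 2.0, 0.001)
print(v_before, v_after)  # velocity drops as roughness increases
```

Because n appears in the denominator, any increase in roughness from tree diameter and brush density directly lowers the mean velocity, which in turn raises stage for the same discharge — the mechanism behind the wider flood zones reported in the abstract.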

  18. En-face imaging of the ellipsoid zone in the retina from optical coherence tomography B-scans

    NASA Astrophysics Data System (ADS)

    Holmes, T.; Larkin, S.; Downing, M.; Csaky, K.

    2015-03-01

It is generally believed that photoreceptor integrity is related to the appearance of the ellipsoid zone in optical coherence tomography (OCT) B-scans. Algorithms and software were developed for viewing and analyzing the ellipsoid zone. The software performs the following: (a) automated ellipsoid zone isolation in the B-scans; (b) en-face viewing of the ellipsoid-zone reflectance; (c) alignment and overlay of (b) onto reflectance images of the retina; and (d) alignment and overlay of (c) with microperimetry sensitivity points. Dataset groups from normal and dry age-related macular degeneration (DAMD) subjects were compared. Scalar measurements for correlation against condition included the mean and standard deviation of the ellipsoid zone's reflectance. The image-processing techniques for automatically finding the ellipsoid zone are based upon a calculation of optical flow, which tracks the edges of laminated structures across an image. Statistical significance was shown in t-tests of these measurements with the population pools separated into normal and DAMD subjects. A display of en-face ellipsoid-zone reflectance shows a clear and recognizable difference between the normal and DAMD subjects, in that they show generally uniform and nonuniform reflectance, respectively, over the region near the macula. Regions surrounding points of low microperimetry (μP) sensitivity have irregular and lower levels of ellipsoid-zone reflectance nearby. These findings support the idea that photoreceptor integrity could be affecting both the ellipsoid-zone reflectance and the sensitivity measurements.

  19. Digital modeling of end-mill cutting tools for FEM applications from the active cutting contour

    NASA Astrophysics Data System (ADS)

    Salguero, Jorge; Marcos, M.; Batista, M.; Gómez, A.; Mayuet, P.; Bienvenido, R.

    2012-04-01

A very current technique in the research field of machining by material removal is simulation using the Finite Element Method (FEM). Nevertheless, although it is widely used in processes that allow approximation to orthogonal cutting, such as shaping, it is scarcely used in more complex processes, such as milling. This is due principally to the complex geometry of the cutting tools in these processes and the need to carry out the studies in an oblique cutting configuration. This paper presents a methodology for the geometrical characterization of commercial end-mill cutting tools: the cutting tool contour is extracted using optical metrology, and this geometry is used to model the active cutting zone with 3D CAD software. The model is easily exportable to different CAD formats, such as IGES or STEP, and importable into FEM software, where the in-service behavior of the tools can be studied.

  20. State-and-transition simulation models: a framework for forecasting landscape change

    USGS Publications Warehouse

    Daniel, Colin; Frid, Leonardo; Sleeter, Benjamin M.; Fortin, Marie-Josée

    2016-01-01

Summary: A wide range of spatially explicit simulation models have been developed to forecast landscape dynamics, including models for projecting changes in both vegetation and land use. While these models have generally been developed as separate applications, each with a separate purpose and audience, they share many common features. We present a general framework, called a state-and-transition simulation model (STSM), which captures a number of these common features, accompanied by a software product, called ST-Sim, to build and run such models. The STSM method divides a landscape into a set of discrete spatial units and simulates the discrete state of each cell forward as a discrete-time inhomogeneous stochastic process. The method differs from a spatially interacting Markov chain in several important ways, including the ability to add discrete counters such as age and time-since-transition as state variables, to specify one-step transition rates as either probabilities or target areas, and to represent multiple types of transitions between pairs of states. We demonstrate the STSM method using a model of land-use/land-cover (LULC) change for the state of Hawai'i, USA. Processes represented in this example include expansion/contraction of agricultural lands, urbanization, wildfire, shrub encroachment into grassland, and harvest of tree plantations; the model also projects shifts in moisture zones due to climate change. Key model output includes projections of the future spatial and temporal distribution of LULC classes and moisture zones across the landscape over the next 50 years. State-and-transition simulation models can be applied to a wide range of landscapes, including questions of both land-use change and vegetation dynamics. Because the method is inherently stochastic, it is well suited for characterizing uncertainty in model projections. When combined with the ST-Sim software, STSMs offer a simple yet powerful means for developing a wide range of models of landscape dynamics.
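The core loop of a state-and-transition simulation — per-cell discrete states, annual transition probabilities, and an age counter that resets on transition — can be sketched in a few lines. This is a minimal illustration of the general method, not ST-Sim itself; the state names and probabilities are hypothetical.

```python
import random

# Minimal state-and-transition sketch (NOT ST-Sim): each cell carries a
# discrete state and an age counter; transitions fire with fixed annual
# probabilities, and age resets to zero when a transition occurs.
TRANSITIONS = {                               # illustrative probabilities
    "grassland": [("shrubland", 0.05)],       # shrub encroachment
    "shrubland": [("grassland", 0.02)],       # e.g. wildfire reset
}

def step(cells, rng):
    """Advance every cell one time step (one year)."""
    for cell in cells:
        cell["age"] += 1
        for target, p in TRANSITIONS.get(cell["state"], []):
            if rng.random() < p:
                cell["state"], cell["age"] = target, 0
                break

rng = random.Random(42)  # fixed seed so the run is reproducible
cells = [{"state": "grassland", "age": 0} for _ in range(1000)]
for year in range(50):
    step(cells, rng)
shrub = sum(c["state"] == "shrubland" for c in cells)
print(shrub)  # a substantial fraction of cells transition over 50 years
```

Because each run is a stochastic realization, repeating the simulation with different seeds yields a distribution of outcomes — the property the abstract highlights for characterizing uncertainty in projections.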

  1. Tools for Virtual Collaboration Designed for High Resolution Hydrologic Research with Continental-Scale Data Support

    NASA Astrophysics Data System (ADS)

    Duffy, Christopher; Leonard, Lorne; Shi, Yuning; Bhatt, Gopal; Hanson, Paul; Gil, Yolanda; Yu, Xuan

    2015-04-01

Using a series of recent examples and papers we explore some progress and potential for virtual (cyber-) collaboration inspired by access to high-resolution, harmonized public-sector data at continental scales [1]. The first example describes 7 meso-scale catchments in Pennsylvania, USA, where the watershed is forced by climate reanalysis and IPCC (Intergovernmental Panel on Climate Change) future climate scenarios. We show how existing public-sector data and community models are currently able to resolve fine-scale eco-hydrologic processes regarding wetland response to climate change [2]. The results reveal that regional climate change is only part of the story, with large variations in flood and drought response associated with differences in terrain, physiography, land use and/or hydrogeology. The importance of community-driven virtual testbeds is demonstrated in the context of Critical Zone Observatories, where earth scientists from around the world are organizing hydro-geophysical data and model results to explore new processes that couple hydrologic models with land-atmosphere interaction, biogeochemical weathering, the carbon-nitrogen cycle, landscape evolution and ecosystem services [3][4]. Critical Zone cyber-research demonstrates how data-driven model development requires a flexible computational structure where process modules are relatively easy to incorporate and where new data structures can be implemented [5]. From the perspective of "Big Data" the paper points out that extrapolating results from virtual observatories to catchments at continental scales will require centralized or cloud-based cyberinfrastructure as a necessary condition for effectively sharing petabytes of data and model results [6].
Finally, we outline how innovative cyber-science is supporting earth-science learning, sharing and exploration through on-line tools with which hydrologists and limnologists share data and models for simulating the coupled impacts of catchment hydrology on lake eco-hydrology (NSF-INSPIRE, IIS1344272). The research uses a virtual environment (www.organicdatascience.org) to break down disciplinary barriers and support emergent communities of science. [1] Leonard and Duffy, 2013, Environmental Modelling & Software; [2] Yu et al., 2014, Computers in Geoscience; [3] Duffy et al., 2014, Procedia Earth and Planetary Science; [4] Shi et al., 2014, Journal of Hydrometeorology; [5] Bhatt et al., 2014, Environmental Modelling & Software; [6] Leonard and Duffy, 2014, Environmental Modelling & Software.

  2. Using Remote Sensing Data to Constrain Models of Fault Interactions and Plate Boundary Deformation

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Lyzenga, G. A.; Parker, J. W.; Milliner, C. W. D.

    2016-12-01

Determining the distribution of slip and the behavior of fault interactions at plate boundaries is a complex problem. Field and remotely sensed data often lack the coverage necessary to fully resolve fault behavior. However, realistic physical models constrained by observed data, such as GPS, InSAR, and SfM, may be used to more accurately characterize the complex behavior of faults. These results will improve the utility of combined models and data for estimating earthquake potential and characterizing plate boundary behavior. Plate boundary faults exhibit complex behavior, with partitioned slip and distributed deformation. To investigate what fraction of slip becomes distributed deformation off major faults, we examine a model fault embedded within a damage zone of reduced elastic rigidity that narrows with depth, and forward model the slip and resulting surface deformation. The fault segments and slip distributions are modeled using the JPL GeoFEST software. GeoFEST (Geophysical Finite Element Simulation Tool) is a two- and three-dimensional finite element software package for modeling solid stress and strain in geophysical and other continuum domain applications [Lyzenga, et al., 2000; Glasscoe, et al., 2004; Parker, et al., 2008, 2010]. New methods to advance geohazards research using computer simulations and remotely sensed observations for model validation are required to understand fault slip, the complex nature of fault interaction, and plate boundary deformation. These models help enhance our understanding of the underlying processes, such as transient deformation and fault creep, and can aid in developing observation strategies for sUAV, airborne, and upcoming satellite missions seeking to determine how faults behave and interact and to assess their associated hazard. Models will also help to characterize this behavior, which will enable improvements in hazard estimation.
Validating the model results against remotely sensed observations will allow us to better constrain fault zone rheology and physical properties, having implications for the overall understanding of earthquake physics, fault interactions, plate boundary deformation and earthquake hazard, preparedness and risk reduction.

  3. The control of float zone interfaces by the use of selected boundary conditions

    NASA Technical Reports Server (NTRS)

    Foster, L. M.; Mcintosh, J.

    1983-01-01

    The main goal of the float zone crystal growth project of NASA's Materials Processing in Space Program is to thoroughly understand the molten zone/freezing crystal system and all the mechanisms that govern this system. The surface boundary conditions required to give flat float zone solid melt interfaces were studied and computed. The results provide float zone furnace designers with better methods for controlling solid melt interface shapes and for computing thermal profiles and gradients. Documentation and a user's guide were provided for the computer software.

  4. Forecasting daily meteorological time series using ARIMA and regression models

    NASA Astrophysics Data System (ADS)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average (ARIMA) methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
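The simplest member of the ARIMA family used above is an AR(1) process, x[t] = c + φ·x[t-1] + ε[t], which can be fit by ordinary least squares. The study used R; the standard-library Python sketch below only illustrates the model form on synthetic data with known parameters (c = 2.0, φ = 0.8), not the authors' analysis.

```python
import random

def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1] (an ARIMA(1,0,0) model)."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast(c, phi, last, steps):
    """Iterate the fitted recurrence forward from the last observation."""
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Synthetic series generated by x[t] = 2 + 0.8*x[t-1] + noise (seeded).
rng = random.Random(0)
series = [0.0]
for _ in range(500):
    series.append(2.0 + 0.8 * series[-1] + rng.gauss(0.0, 1.0))

c, phi = fit_ar1(series)
print(round(c, 2), round(phi, 2))  # estimates near the true values 2.0, 0.8
nxt = forecast(c, phi, series[-1], 5)
```

Full ARIMA adds differencing and moving-average terms (and, in the seasonal variants used in the paper, seasonal counterparts of each), but the fit-then-iterate forecasting pattern is the same.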

  5. Numerical simulation of infiltration and groundwater recharge using the Hydrus for Modflow package and the BEST model of soil hydraulic properties

    NASA Astrophysics Data System (ADS)

    Gumuła-Kawęcka, Anna; Szymkiewicz, Adam; Angulo-Jaramillo, Rafael; Šimůnek, Jirka; Jaworska-Szulc, Beata; Pruszkowska-Caceres, Małgorzata; Gorczewska-Langner, Wioletta; Leterme, Bertrand; Jacques, Diederik

    2017-04-01

Groundwater recharge is a complex process that depends on several factors, including the hydraulic properties of soils in the vadose zone. At the same time, the rate of recharge is one of the main inputs to hydrogeological models of saturated groundwater flow. Thus, there is an increasing understanding of the need for a more complete representation of vadose zone processes in groundwater modeling. One possible approach is to use a 1D model of water flow in the unsaturated zone coupled with a 3D groundwater model for the saturated zone. Such an approach was implemented in the Hydrus for Modflow package (Seo et al. 2007), which combines two well-known and thoroughly tested modeling tools: the groundwater flow simulator MODFLOW (Harbaugh 2005) and the one-dimensional vadose zone simulator HYDRUS-1D (Šimůnek et al. 2016), based on the Richards equation. The Hydrus for Modflow package has recently been enhanced by implementing the BEST model of soil hydraulic properties (Lassabatere et al. 2006), which combines a van Genuchten-type retention function with a Brooks-Corey-type hydraulic conductivity function. The parameters of these functions can be divided into texture-related and structure-related, and can be obtained from relatively simple laboratory and field tests. The method appears to be a promising tool for obtaining input data for vadose zone flow models. The main objective of this work is to evaluate the sensitivity of recharge rates to the values of the various parameters of the BEST model. Simulations are performed for a range of soil textural classes and plant covers, using meteorological data typical of northern Poland. ACKNOWLEDGEMENTS This work has been supported by the National Science Centre, Poland, in the framework of the project 2015/17/B/ST10/03233 "Groundwater recharge on outwash plain". REFERENCES [1] Harbaugh, A.W. (2005) MODFLOW-2005, the US Geological Survey modular ground-water model: the ground-water flow process. Reston, VA, USA.
[2] Lassabatere, L. et al. (2006) Beerkan estimation of soil transfer parameters through infiltration experiments—BEST. Soil Science Society of America Journal 70(2): 521-532. [3] Seo, H.S., Šimůnek, J., Poeter, E.P. (2007) Documentation of the Hydrus package for Modflow-2000, the US Geological Survey modular ground-water model. [4] Šimůnek, J., van Genuchten, M.Th., and Šejna, M. (2016) Recent developments and applications of the HYDRUS computer software packages, Vadose Zone Journal, 15(7), pp. 25, doi: 10.2136/vzj2016.04.0033.
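The van Genuchten-type retention function at the heart of the BEST parameterization gives water content as a function of pressure head, θ(h) = θr + (θs − θr)·[1 + (α|h|)^n]^(−m). The sketch below uses the Burdine-type constraint m = 1 − 2/n, which is my understanding of the BEST formulation (check Lassabatere et al. 2006 for the exact constraint); the soil parameters are generic loam-like values, not calibrated values from this study.

```python
def vg_theta(h, theta_r, theta_s, alpha, n):
    """van Genuchten water retention theta(h); h is the pressure head
    (negative in the unsaturated zone, same length unit as 1/alpha).
    Uses the Burdine-type constraint m = 1 - 2/n (assumed here to match
    the BEST formulation; verify against the original paper)."""
    if h >= 0:
        return theta_s                      # at or above saturation
    m = 1.0 - 2.0 / n
    return theta_r + (theta_s - theta_r) * (1.0 + (alpha * abs(h)) ** n) ** (-m)

# Illustrative loam-like parameters: residual/saturated water content,
# alpha in 1/m, shape parameter n (hypothetical values).
theta = vg_theta(-1.0, 0.05, 0.43, 3.6, 2.5)
print(theta)  # between theta_r and theta_s, decreasing as h gets more negative
```

The sensitivity study described above amounts to perturbing parameters such as α and n (and their Brooks-Corey conductivity counterparts) and observing how the simulated recharge flux at the water table responds.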

  6. Porosity Development in a Coastal Setting: A Reactive Transport Model to Assess the Influence of Heterogeneity of Hydrological, Geochemical and Lithological Conditions

    NASA Astrophysics Data System (ADS)

    Maqueda, A.; Renard, P.; Cornaton, F. J.

    2014-12-01

Coastal karst networks are formed by mineral dissolution, mainly of calcite, in the freshwater-saltwater mixing zone. The problem was first approached by studying the kinetics of calcite dissolution and then coupling ion-pairing software with flow and mass transport models. Porosity development models require high computational power. A workaround to reduce computational complexity is to assume that the calcite dissolution reaction is relatively fast, so that equilibrium chemistry can be used to model it (Sanford & Konikow, 1989). Later developments allowed the full coupling of kinetics and transport in a model; however, kinetic effects of calcite dissolution were found negligible under the single set of assumed hydrological and geochemical boundary conditions. A model is implemented coupling the FEFLOW software as the flow and transport module with the PHREEQC4FEFLOW (Wissmeier, 2013) ion-pairing module. The model is used to assess the influence of heterogeneities in hydrological, geochemical and lithological boundary conditions on porosity evolution. The hydrologic conditions present in the karst aquifer of the Quintana Roo coast in Mexico are used as a guide for generating inputs for simulations.

  7. Trajectory Model of Lunar Dust Particles

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The goal of this work was to predict the trajectories of blowing lunar regolith (soil) particles when a spacecraft lands on or launches from the Moon. The blown regolith is known to travel at very high velocity and to damage any hardware located nearby on the Moon. It is important to understand the trajectories so we can develop technologies to mitigate the blast effects for the launch and landing zones at a lunar outpost. A mathematical model was implemented in software to predict the trajectory of a single spherical mass acted on by the gas jet from the nozzle of a lunar lander.

  8. Numerical Simulation of the Thermal Process in a W-Shape Radiant Tube Burner

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Li, Jiyong; Zhang, Lifeng; Ling, Haitao; Li, Yanlong

    2014-07-01

In the current work, three-dimensional mathematical models were developed for the heat transfer and combustion in a W-shape radiant tube burner (RTB) and were solved using Fluent software (ANSYS Inc., Canonsburg, PA). The standard k-ε model, the nonpremixed combustion model, and the discrete ordinates model were used for the modeling of turbulence, combustion, and radiant heat transfer, respectively. In addition, the NOx postprocessor was used for the prediction of NO emission. A corresponding experiment was performed to validate the mathematical models. The details of fluid flow, heat transfer, and combustion in the RTB were investigated. Moreover, the effect of the air/fuel ratio (A/F) and air staging on the performance of the RTB was studied with reference indexes including heat efficiency, maximum temperature difference on the shell wall, and NO emission at the outlet. The results indicated that a low-speed zone formed in the vicinity of the combustion chamber outlet, and that there were two relatively high-temperature zones in the RTB: one in the combustion chamber, which favored flame stability, and the other from the main flame in the RTB. The maximum temperature difference was 95.48 K. As the A/F increased, the temperature first increased and then decreased. As the ratio of primary to secondary air increased, the recirculation zone at the outlet of the combustion chamber gradually shrank until it disappeared, the flame became longer, and the temperature in the flame decreased correspondingly.

  9. 3D thermal model of laser surface glazing for H13 tool steel

    NASA Astrophysics Data System (ADS)

    Kabir, I. R.; Yin, D.; Naher, S.

    2017-10-01

In this work a three-dimensional (3D) finite element model of the laser surface glazing (LSG) process has been developed. The purpose of the 3D thermal model of LSG was to achieve maximum accuracy in the predicted outcome for optimizing the process. A cylindrical geometry of 10 mm diameter and 1 mm length was modeled in ANSYS 15 software. Temperature distribution, depth of the modified zone, and cooling rates were analysed from the thermal model. A parametric study was carried out varying the laser power from 200 W to 300 W with constant beam diameter and residence time of 0.2 mm and 0.15 ms, respectively. The maximum surface temperature was 2554 K for a power of 300 W and the minimum surface temperature was 1668 K for a power of 200 W. Heating and cooling rates increased with increasing laser power. The depth of the laser-modified zone was 37.5 µm for 300 W power and 30 µm for 200 W power. No molten zone was observed at 200 W power. Maximum surface temperatures obtained from the 3D model were 4% higher than those of the 2D model presented in the authors' previous work. In order to verify the simulation results, an analytical solution of the temperature distribution for laser surface modification was used. The surface temperature after heating, calculated for similar laser parameters, is 1689 K, a difference of around 20.7 K between the analytical and numerical analyses of LSG at a power of 200 W.
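The kind of analytical cross-check mentioned at the end of the abstract is often based on the classical 1D result for a semi-infinite solid under a constant surface heat flux, ΔT(0, t) = (2q″/k)·√(αt/π). The sketch below implements that formula with generic, assumed material values (not the paper's parameters) and is not intended to reproduce the reported 1689 K.

```python
import math

def surface_temp_rise(q_flux, k, alpha, t):
    """Surface temperature rise (K) of a semi-infinite solid heated by a
    constant flux q_flux (W/m^2) for time t (s):
    dT = (2*q/k) * sqrt(alpha*t/pi). Classical 1D conduction result."""
    return (2.0 * q_flux / k) * math.sqrt(alpha * t / math.pi)

# Hypothetical tool-steel-like properties (assumed, not from the paper):
k = 25.0        # thermal conductivity, W/(m K)
alpha = 6.9e-6  # thermal diffusivity, m^2/s
dT = surface_temp_rise(1.0e9, k, alpha, 0.15e-3)
print(dT)  # the rise scales linearly with flux and with sqrt(time)
```

The square-root time dependence is why short residence times at high power produce shallow modified zones: doubling the residence time raises the surface temperature by only √2.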

  10. Three-Dimensional Geologic Map of the Hayward Fault Zone, San Francisco Bay Region, California

    USGS Publications Warehouse

    Phelps, G.A.; Graymer, R.W.; Jachens, R.C.; Ponce, D.A.; Simpson, R.W.; Wentworth, C.M.

    2008-01-01

    A three-dimensional (3D) geologic map of the Hayward Fault zone was created by integrating the results from geologic mapping, potential field geophysics, and seismology investigations. The map volume is 100 km long, 20 km wide, and extends to a depth of 12 km below sea level. The map volume is oriented northwest and is approximately bisected by the Hayward Fault. The complex geologic structure of the region makes it difficult to trace many geologic units into the subsurface. Therefore, the map units are generalized from 1:24,000-scale geologic maps. Descriptions of geologic units and structures are offered, along with a discussion of the methods used to map them and incorporate them into the 3D geologic map. The map spatial database and associated viewing software are provided. Elements of the map, such as individual fault surfaces, are also provided in a non-proprietary format so that the user can access the map via open-source software. The sheet accompanying this manuscript shows views taken from the 3D geologic map for the user to access. The 3D geologic map is designed as a multi-purpose resource for further geologic investigations and process modeling.

  11. Newberry Volcano EGS Demonstration Stimulation Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trenton T. Cladouhos; Matthew Clyne; Maisie Nichols; Susan Petty; William L. Osborn; Laura Nofziger

    2011-10-23

    As a part of Phase I of the Newberry Volcano EGS Demonstration project, several data sets were collected to characterize the rock volume around the well. Fracture, fault, stress, and seismicity data have been collected by borehole televiewer, LiDAR elevation maps, and microseismic monitoring. Well logs and cuttings from the target well (NWG 55-29) and core from a nearby core hole (USGS N-2) have been analyzed to develop geothermal, geochemical, mineralogical and strength models of the rock matrix, altered zones, and fracture fillings (see Osborn et al., this volume). These characterization data sets provide inputs to models used to plan and predict EGS reservoir creation and productivity. One model used is AltaStim, a stochastic fracture and flow software model developed by AltaRock. The software's purpose is to model and visualize EGS stimulation scenarios and provide guidance for final planning. The process of creating an AltaStim model requires synthesis of geologic observations at the well, the modeled stress conditions, and the stimulation plan. Any geomechanical model of an EGS stimulation will require many assumptions and unknowns; thus, the model developed here should not be considered a definitive prediction, but a plausible outcome given reasonable assumptions. AltaStim is a tool for understanding the effect of known constraints, assumptions, and conceptual models on plausible outcomes.

  12. Parameter Uncertainty Analysis Using Monte Carlo Simulations for a Regional-Scale Groundwater Model

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Pohlmann, K.

    2016-12-01

    Regional-scale grid-based groundwater models for flow and transport often contain multiple types of parameters that can intensify the challenge of parameter uncertainty analysis. We propose a Monte Carlo approach to systematically quantify the influence of various types of model parameters on groundwater flux and contaminant travel times. The Monte Carlo simulations were conducted based on the steady-state conversion of the original transient model, which was then combined with the PEST sensitivity analysis tool SENSAN and particle tracking software MODPATH. Results identified hydrogeologic units whose hydraulic conductivity can significantly affect groundwater flux, and thirteen out of 173 model parameters that can cause large variation in travel times for contaminant particles originating from given source zones.
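
The record above does not publish its code, but the core idea, sampling uncertain parameters and propagating them to contaminant travel times, can be sketched in a few lines of Python. The `travel_time_days` function, the lognormal conductivity distribution, and the parameter ranges below are illustrative assumptions, not values from the study:

```python
import random
import statistics

def travel_time_days(hydraulic_conductivity, porosity, path_length, gradient):
    """Advective travel time: seepage velocity = K*i/n (Darcy), time = L/v."""
    seepage_velocity = hydraulic_conductivity * gradient / porosity
    return path_length / seepage_velocity

random.seed(42)
times = []
for _ in range(1000):
    k = random.lognormvariate(0.0, 1.0)   # hydraulic conductivity, m/day (assumed)
    n = random.uniform(0.05, 0.30)        # effective porosity (assumed range)
    times.append(travel_time_days(k, n, path_length=5000.0, gradient=0.001))

print(statistics.median(times) / 365.25, "years (median travel time)")
```

Sorting the sampled inputs against the resulting travel times is then a simple way to see which parameters drive the spread, the role SENSAN plays in the study's workflow.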

  13. Antibiogramj: A tool for analysing images from disk diffusion tests.

    PubMed

    Alonso, C A; Domínguez, C; Heras, J; Mata, E; Pascual, V; Torres, C; Zarazaga, M

    2017-05-01

    Disk diffusion testing, known as antibiogram, is widely applied in microbiology to determine the antimicrobial susceptibility of microorganisms. The measurement of the diameter of the zone of growth inhibition of microorganisms around the antimicrobial disks in the antibiogram is frequently performed manually by specialists using a ruler. This is a time-consuming and error-prone task that might be simplified using automated or semi-automated inhibition zone readers. However, most readers are usually expensive instruments with embedded software that require significant changes in laboratory design and workflow. Based on the workflow employed by specialists to determine the antimicrobial susceptibility of microorganisms, we have designed a software tool that semi-automatises the process from images of disk diffusion tests. Standard computer vision techniques are employed to achieve such automatisation. We present AntibiogramJ, a user-friendly and open-source software tool to semi-automatically determine, measure and categorise inhibition zones in images from disk diffusion tests. AntibiogramJ is implemented in Java and deals with images captured with any device that incorporates a camera, including digital cameras and mobile phones. The fully automatic procedure of AntibiogramJ for measuring inhibition zones achieves an overall agreement of 87% with an expert microbiologist; moreover, AntibiogramJ includes features to easily detect when the automatic reading is not correct and fix it manually to obtain the correct result. AntibiogramJ is a user-friendly, platform-independent, open-source, and free tool that, to the best of our knowledge, is the most complete software tool for antibiogram analysis without requiring any investment in new equipment or changes in the laboratory. Copyright © 2017 Elsevier B.V. All rights reserved.
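
AntibiogramJ's own pipeline is not published in this record, but the underlying measurement can be illustrated with a minimal sketch: scan one image row outward from the disk centre until pixel intensity drops below a "clear zone" threshold. The function name, threshold value, and synthetic plate below are invented for the example:

```python
def zone_diameter_mm(image, center, clear_threshold, mm_per_pixel):
    """Estimate the inhibition-zone diameter by walking left and right from
    the disk centre along one image row while pixels stay brighter than the
    threshold (bright = clear agar, dark = bacterial lawn)."""
    row, col = center
    scanline = image[row]
    right = col
    while right + 1 < len(scanline) and scanline[right + 1] >= clear_threshold:
        right += 1
    left = col
    while left - 1 >= 0 and scanline[left - 1] >= clear_threshold:
        left -= 1
    return (right - left + 1) * mm_per_pixel

# synthetic one-row "plate": an 11-pixel clear zone (200s) in a lawn (50s)
plate = [[50] * 5 + [200] * 11 + [50] * 5]
print(zone_diameter_mm(plate, center=(0, 10), clear_threshold=128, mm_per_pixel=1.0))  # → 11.0
```

A real tool would of course threshold the full 2D image, locate disks automatically, and average over many directions; this sketch only shows the geometric idea.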

  14. Optimizing alignment and growth of low-loss YAG single crystal fibers using laser heated pedestal growth technique.

    PubMed

    Bera, Subhabrata; Nie, Craig D; Soskind, Michael G; Harrington, James A

    2017-12-10

    The effects of misalignments of different optical components in the laser heated pedestal growth apparatus have been modeled using Zemax optical design software. By isolating the misalignments causing non-uniformity in the melt zone, the alignment of the components was fine-tuned. Using this optimized alignment, low-loss YAG single crystal fibers of 120 μm diameter were grown, with total attenuation loss as low as 0.5 dB/m at 1064 nm.

  15. Numerical Simulation of the Effect about Groundwater Level Fluctuation on the Concentration of BTEX Dissolved into Source Zone

    NASA Astrophysics Data System (ADS)

    Sun, Liqun; Chen, Yudao; Jiang, Lingzhi; Cheng, Yaping

    2018-01-01

    Fluctuations of the groundwater level affect BTEX dissolution in the fuel-leakage source zone. To study this effect, a gasoline leakage test was performed in a laboratory sand-tank model, and the concentrations of BTEX along with the water level were monitored over a long period. Combined with the VISUAL MODFLOW software, the RT3D module was used to simulate the concentrations of BTEX, and the mass flux method was used to evaluate the effects of water level fluctuation on BTEX dissolution. The results indicate that water level fluctuation can significantly increase the concentration of BTEX dissolved from the leakage source zone: the dissolved amount of BTEX can be up to 2.4 times greater under fluctuating water levels. The method of numerical simulation combined with mass flux calculation can be used to evaluate the effect of water level fluctuation on BTEX dissolution.
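
In its simplest steady-state form, the mass flux method mentioned above reduces to concentration times Darcy flux times cross-sectional area of the control plane. A hedged sketch (the function and the numbers are illustrative, not the study's values; note 1 mg/L equals 1 g/m³, so the units work out directly):

```python
def mass_flux_g_per_day(concentration_mg_L, darcy_flux_m_per_day, area_m2):
    """Contaminant mass flux J = C * q * A through a control plane.
    Since 1 mg/L == 1 g/m^3, the product is already in g/day."""
    return concentration_mg_L * darcy_flux_m_per_day * area_m2

# e.g. benzene at 2.5 mg/L crossing a 10 m^2 plane at q = 0.05 m/day
print(mass_flux_g_per_day(2.5, 0.05, 10.0))  # → 1.25 g/day
```

Comparing fluxes computed this way under static and fluctuating water levels is how a 2.4-fold difference in dissolved mass could be quantified.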

  16. CORMIX2: AN EXPERT SYSTEM FOR HYDRODYNAMIC MIXING ZONE ANALYSIS OF CONVENTIONAL AND TOXIC MULTIPORT DIFFUSER DISCHARGES

    EPA Science Inventory

    CORMIX is a series of software systems for the analysis, prediction, and design of aqueous toxic or conventional pollutant discharges into watercourses, with emphasis on the geometry and dilution characteristics of the initial mixing zone. Subsystem CORMIX1 deals with submerged si...

  17. An inverse approach to constraining strain and vorticity using rigid clast shape preferred orientation data

    NASA Astrophysics Data System (ADS)

    Davis, Joshua R.; Giorgis, Scott

    2014-11-01

    We describe a three-part approach for modeling shape preferred orientation (SPO) data of spheroidal clasts. The first part consists of criteria to determine whether a given SPO and clast shape are compatible. The second part is an algorithm for randomly generating spheroid populations that match a prescribed SPO and clast shape. In the third part, numerical optimization software is used to infer deformation from spheroid populations, by finding the deformation that returns a set of post-deformation spheroids to a minimally anisotropic initial configuration. Two numerical experiments explore the strengths and weaknesses of this approach, while giving information about the sensitivity of the model to noise in the data. In monoclinic transpression of oblate rigid spheroids, the model is found to constrain the shortening component but not the simple shear component. This modeling approach is applied to previously published SPO data from the western Idaho shear zone, a monoclinic transpressional zone that deformed a feldspar megacrystic gneiss. Results suggest at most 5 km of shortening, as well as a pre-deformation SPO fabric. The shortening estimate is corroborated by a second model that assumes no pre-deformation fabric.
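
The rigid-clast dynamics in the record above are considerably more involved, but the idea of quantifying an SPO fabric can be illustrated with a second-moment orientation tensor of unit vectors. The sketch below deforms a random population with a pure-shear gradient and renormalizes, a passive-marker simplification rather than the rigid-spheroid behavior modeled in the paper, and shows the fabric concentrating along the stretching axis:

```python
import random

def orientation_tensor(vectors):
    """3x3 second-moment tensor of unit vectors; the spread of its diagonal
    (eigenvalues, in general) measures fabric anisotropy."""
    n = len(vectors)
    return [[sum(v[i] * v[j] for v in vectors) / n for j in range(3)]
            for i in range(3)]

def deform(v, stretch):
    """Apply F = diag(k, 1, 1/k) (volume-preserving pure shear) and renormalize."""
    w = (v[0] * stretch, v[1], v[2] / stretch)
    norm = sum(c * c for c in w) ** 0.5
    return tuple(c / norm for c in w)

random.seed(0)
pop = []
for _ in range(2000):
    v = (random.gauss(0, 1), random.gauss(0, 1), random.gauss(0, 1))
    norm = sum(c * c for c in v) ** 0.5
    pop.append(tuple(c / norm for c in v))

before = orientation_tensor(pop)[0][0]                        # ≈ 1/3, isotropic
after = orientation_tensor([deform(v, 2.0) for v in pop])[0][0]
print(before, after)  # fabric concentrates toward the stretching (x) axis
```

Inverting for the deformation, as the paper's third step does, amounts to searching for the gradient that undoes this concentration.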

  18. A Monte Carlo model for 3D grain evolution during welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification, and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bezier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.

  19. A Monte Carlo model for 3D grain evolution during welding

    DOE PAGES

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-08-04

    Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification, and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bezier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
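
The abstracts above note that weld pool shapes are specified by Bezier curves. As a minimal sketch of what such a parameterization looks like, the cubic Bezier below traces a half-profile from the pool surface to its bottom; the control points are invented for illustration and are not SPPARKS defaults:

```python
def bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve through 2D control points at t in [0, 1]."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# half-profile of a wide, shallow pool: surface edge to pool bottom (mm)
surface, bottom = (3.0, 0.0), (0.0, -1.0)
profile = [bezier(surface, (2.0, -0.9), (1.0, -1.0), bottom, i / 10)
           for i in range(11)]
```

Moving the two interior control points deeper and closer to the axis would instead produce the narrow, deep profiles the model associates with different fluid-flow conditions.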

  20. Modeling Enclosure Design in Above-Grade Walls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lstiburek, J.; Ueno, K.; Musunuru, S.

    2016-03-01

    This report describes the modeling of typical wall assemblies that have performed well historically in various climate zones. The WUFI (Wärme und Feuchte instationär) software (Version 5.3) model was used. A library of input data and results is provided. The provided information can be generalized for application to a broad population of houses, within the limits of existing experience. The WUFI software model was calibrated or tuned using wall assemblies with historically successful performance. The primary performance (or failure) criterion establishing historic performance was the moisture content of the exterior sheathing. The primary tuning parameters (simulation inputs) were airflow and appropriate material properties. Rational hygric loads were established based on experience, specifically rain wetting and interior moisture (RH levels). The tuning parameters were limited or bounded by published data or experience. The WUFI templates provided with this report supply useful information resources to new or less-experienced users. The files present various custom settings that will help avoid results that would require overly conservative enclosure assemblies. Overall, better material data, consistent initial assumptions, and consistent inputs among practitioners will improve the quality of WUFI modeling and improve the level of sophistication in the field.

  1. Decision Making on Regional Landfill Site Selection in Hormozgan Province Using Smce

    NASA Astrophysics Data System (ADS)

    Majedi, A. S.; Kamali, B. M.; Maghsoudi, R.

    2015-12-01

    Landfill site selection and suitable conditions to bury hazardous wastes are among the most critical issues in modern societies. Taking several factors and limitations into account along with sound decision making requires the application of different decision techniques. To this end, the current paper aims to make decisions about regional landfill site selection in Hormozgan province and utilizes the SMCE technique combined with qualitative and quantitative criteria to select the final alternatives. We first describe the existing environmental situation in our study area, set the goals of our study in the framework of SMCE, and analyze the factors affecting regional landfill site selection. The methodological procedure of the research was conducted using the Delphi approach and questionnaires (to assess reliability, Cronbach's alpha (0.94) was used). A spatial multi-criteria analysis model was designed in the form of a criteria tree in SMCE using the ILWIS software. Prioritization of the respective spatial alternatives was as follows: Bandar Abbas city, with 4 spatial alternatives in total (one zone with 1st priority, one zone with 3rd priority, and two zones with 4th priority), was considered the first priority; Bastak city, with 3 spatial alternatives in total (one zone with 2nd priority, one zone with 3rd priority, and one zone with 4th priority), was the second priority; and Bandar Abbas, Minab, Jask, and Haji Abad cities were considered as the third priority.

  2. Pi-EEWS: a low cost prototype for on-site earthquake early warning system

    NASA Astrophysics Data System (ADS)

    Pazos, Antonio; Vera, Angel; Morgado, Arturo; Rioja, Carlos; Davila, Jose Martin; Cabieces, Roberto

    2017-04-01

    The Royal Spanish Navy Observatory (ROA), with the participation of the Cadiz University (UCA), has developed the ALERTES-SC3 EEWS (regional approach) based on the SeisComP3 software package. This development was done in the framework of the Spanish ALERT-ES (2011-2013) and ALERTES-RIM (2014-2016) projects, and nowadays it is being tested in real time for southern Iberia. Additionally, the ALERTES-SC3 system integrates an on-site EEWS software, developed by ROA-UCA, which is running for testing in real time at some broadband seismic stations of the WM network. Regional EEWSs are not able to provide alerts in the area closest to the epicentre (the blind zone), so a dense on-site EEWS is necessary. As mentioned, ALERTES-SC3 includes the on-site software running at several WM stations, but a denser network of on-site stations is necessary to cover the blind zones. To densify these areas inside the "blind zones", a low-cost on-site prototype, "Pi-EEWS", has been developed, based on a Raspberry Pi board and low-cost accelerometers. In this work the main design ideas, the components, and their capabilities are shown.

  3. DMI's Baltic Sea Coastal operational forecasting system

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Berg, Per; Weismann Poulsen, Jacob

    2017-04-01

    Operational forecasting is challenged with bridging the gap between the large scales of the driving weather systems and the local, human scales of the model applications. The limit of what can be represented by a local model has been continuously shifted to higher and higher spatial resolution, with the aim of better resolving the local dynamics and making it possible to describe processes that could only be parameterised in older versions, with the ultimate goal of improving the quality of the forecast. Current hardware trends demand a stronger focus on the development of efficient, highly parallelised software and require a refactoring of the code with a solid focus on portable performance. The gained performance can be used for running high-resolution models with larger coverage. Together with the development of efficient two-way nesting routines, this has made it possible to approach the near-coastal zone with model applications that can run in a time-effective way. Denmark's Meteorological Institute uses the HBM(1) ocean circulation model for applications that cover the entire Baltic Sea and North Sea with an integrated model set-up that spans horizontal resolutions from 1 nm for the entire Baltic Sea to approximately 200 m in local fjords (Limfjord). For the next model generation, the high-resolution set-ups are going to be extended, and new high-resolution domains in coastal zones are either implemented or tested for operational use. For the first time it will be possible to cover large stretches of the Baltic coastal zone with sufficiently high resolution to model the local hydrodynamics adequately. (1) HBM stands for HIROMB-BOOS-Model, where HIROMB stands for "High Resolution Model for the Baltic Sea" and BOOS stands for "Baltic Operational Oceanography System".

  4. Dynamic fracture network around faults: implications for earthquake ruptures, ground motion and energy budget

    NASA Astrophysics Data System (ADS)

    Okubo, K.; Bhat, H. S.; Rougier, E.; Lei, Z.; Knight, E. E.; Klinger, Y.

    2017-12-01

    Numerous studies have suggested that spontaneous earthquake ruptures can dynamically induce failure in a secondary fracture network, regarded as the damage zone around faults. Feedbacks from such a fracture network play a crucial role in earthquake rupture, its radiated wave field, and the total energy budget. A novel numerical modeling tool based on the combined finite-discrete element method (FDEM), which accounts for the main rupture propagation and the nucleation/propagation of secondary cracks, was used to quantify the evolution of the fracture network and evaluate its effects on the main rupture and its associated radiation. The simulations were performed with the FDEM-based software tool Hybrid Optimization Software Suite (HOSSedu), developed by Los Alamos National Laboratory. We first modeled an earthquake rupture on a planar strike-slip fault surrounded by a brittle medium where secondary cracks can be nucleated/activated by the earthquake rupture. We show that the secondary cracks are dynamically generated predominantly on the extensional side of the fault, mainly behind the rupture front, forming an intricate network of fractures in the damage zone. The rupture velocity thereby decreases significantly, by 10 to 20 percent, while the supershear transition length increases in comparison to the case with a purely elastic medium. It is also observed that the high-frequency component (10 to 100 Hz) of the near-field ground acceleration is enhanced by the dynamically activated fracture network, consistent with field observations. We then conducted an in-depth case study with various sets of initial stress states and friction properties to investigate the evolution of the damage zone. We show that the width of the damage zone decreases with depth, forming a "flower-like" structure, when the characteristic slip distance in the linear slip-weakening law, or the fracture energy on the fault, is kept constant with depth.
Finally, we compared the fracture energy on the fault to the energy absorbed by the secondary fracture network to better understand the earthquake energy budget. We conclude that the secondary fracture network plays an important role on the dynamic earthquake rupture, its radiated wave field and the overall energy budget.

  5. A Mass-balance nitrate model for predicting the effects of land use on ground-water quality in municipal wellhead-protection areas

    USGS Publications Warehouse

    Frimpter, M.H.; Donohue, J.J.; Rapacz, M.V.; Beye, H.G.

    1990-01-01

    A mass-balance accounting model can be used to guide the management of septic systems and fertilizers to control the degradation of groundwater quality in zones of an aquifer that contributes water to public supply wells. The nitrate nitrogen concentration of the mixture in the well can be predicted for steady-state conditions by calculating the concentration that results from the total weight of nitrogen and total volume of water entering the zone of contribution to the well. These calculations will allow water-quality managers to predict the nitrate concentrations that would be produced by different types and levels of development, and to plan development accordingly. Computations for different development schemes provide a technical basis for planners and managers to compare water quality effects and to select alternatives that limit nitrate concentration in wells. Appendix A contains tables of nitrate loads and water volumes from common sources for use with the accounting model. Appendix B describes the preparation of a spreadsheet for the nitrate loading calculations with a software package generally available for desktop computers. (USGS)
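
The accounting model above predicts the steady-state nitrate concentration in a well as the total nitrogen load divided by the total water volume entering the zone of contribution, the same calculation the record's Appendix B implements in a spreadsheet. A minimal sketch of that arithmetic (the source loads below are invented for illustration, not taken from the report's appendix tables):

```python
def steady_state_nitrate(sources):
    """Mixture nitrate-N concentration in the well (mg/L): total N load over
    total recharge volume, for sources given as
    (nitrogen_kg_per_year, water_m3_per_year) pairs."""
    total_n_kg = sum(n for n, _ in sources)
    total_water_m3 = sum(w for _, w in sources)
    # kg/m^3 -> mg/L: x1e6 mg/kg, /1000 L/m^3
    return total_n_kg * 1e6 / (total_water_m3 * 1000.0)

# hypothetical loads: septic systems, lawn fertilizer, background recharge
sources = [(120.0, 20000.0), (40.0, 5000.0), (10.0, 75000.0)]
print(round(steady_state_nitrate(sources), 2))  # mg/L as nitrate-N
```

Re-running the calculation with different development scenarios (more septic systems, less fertilizer, and so on) gives planners exactly the comparison the abstract describes.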

  6. A New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z. K.

    2017-12-01

    We are developing a new seismic hazard model for Mainland China by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data, and derive a strain rate model based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones. For each zone, a tapered Gutenberg-Richter (TGR) magnitude-frequency distribution is used to model the seismic activity rates. The a- and b-values of the TGR distribution are calculated using observed earthquake data, while the corner magnitude is constrained independently using the seismic moment rate inferred from the geodetically-based strain rate model. Small and medium sized earthquakes are distributed within the source zones following the location and magnitude patterns of historical earthquakes. Some of the larger earthquakes are distributed onto active faults, based on their geological characteristics such as slip rate, fault length, down-dip width, and various paleoseismic data. The remaining larger earthquakes are then placed into the background. A new set of magnitude-rupture scaling relationships is developed based on earthquake data from China and vicinity. We evaluate and select appropriate ground motion prediction equations by comparing them with observed ground motion data and performing residual analysis. To implement the modeling workflow, we develop a tool that builds upon the functionalities of GEM's Hazard Modeler's Toolkit. The GEM OpenQuake software is used to calculate seismic hazard at various ground motion periods and various return periods. To account for site amplification, we construct a site condition map based on geology. 
The resulting new seismic hazard maps can be used for seismic risk analysis and management.
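
The tapered Gutenberg-Richter (TGR) distribution used above is a power law in seismic moment with an exponential taper at a corner moment, which suppresses the rate of the largest events. A hedged Python sketch of the cumulative form (all parameter values below are illustrative, not the model's):

```python
import math

def seismic_moment(mw):
    """Seismic moment (N·m) from moment magnitude: Mw = (2/3)(log10 M0 - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

def tgr_cumulative_rate(mw, rate_at_mmin, mw_min, beta, mw_corner):
    """Rate of events >= mw: Gutenberg-Richter power law in moment,
    multiplied by an exponential taper governed by the corner magnitude."""
    m0 = seismic_moment(mw)
    m0_min = seismic_moment(mw_min)
    m0_corner = seismic_moment(mw_corner)
    return rate_at_mmin * (m0 / m0_min) ** (-beta) * math.exp((m0_min - m0) / m0_corner)

# the taper makes M7.5 events rarer than a pure power law would predict
pure = tgr_cumulative_rate(7.5, 1.0, 5.0, 0.65, mw_corner=12.0)  # corner far away ≈ pure GR
tapered = tgr_cumulative_rate(7.5, 1.0, 5.0, 0.65, mw_corner=8.0)
print(tapered < pure)
```

In the study's workflow, a and b come from the observed catalog while the corner magnitude is constrained independently by the geodetic moment rate, which is precisely what the taper parameter controls here.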

  7. Optimizing digital elevation models (DEMs) accuracy for planning and design of mobile communication networks

    NASA Astrophysics Data System (ADS)

    Hassan, Mahmoud A.

    2004-02-01

    Digital elevation models (DEMs) are important tools in the planning, design and maintenance of mobile communication networks. This research paper proposes a method for generating high-accuracy DEMs based on SPOT satellite 1A stereo pair images, ground control points (GCPs) and the Erdas OrthoBASE Pro image processing software. A DEM with 0.2911 m mean error was achieved for the hilly and heavily populated city of Amman. The generated DEM was used to design a mobile communication network, resulting in a minimum number of radio base transceiver stations, a maximum number of covered regions, and less than 2% dead zones.

  8. Approach for delineation of contributing areas and zones of transport to selected public-supply wells using a regional ground-water flow model, Palm Beach County, Florida

    USGS Publications Warehouse

    Renken, R.A.; Patterson, R.D.; Orzol, L.L.; Dixon, Joann

    2001-01-01

    Rapid urban development and population growth in Palm Beach County, Florida, have been accompanied with the need for additional freshwater withdrawals from the surficial aquifer system. To maintain water quality, County officials protect capture areas and determine zones of transport of municipal supply wells. A multistep process was used to help automate the delineation of wellhead protection areas. A modular ground-water flow model (MODFLOW) Telescopic Mesh Refinement program (MODTMR) was used to construct an embedded flow model and combined with particle tracking to delineate zones of transport to supply wells; model output was coupled with a geographic information system. An embedded flow MODFLOW model was constructed using input and output file data from a preexisting three-dimensional, calibrated model of the surficial aquifer system. Three graphical user interfaces for use with the geographic information software, ArcView, were developed to enhance the telescopic mesh refinement process. These interfaces include AvMODTMR for use with MODTMR; AvHDRD to build MODFLOW river and drain input files from dynamically segmented linear (canals) data sets; and AvWELL Refiner, an interface designed to examine and convert well coverage spatial data layers to a MODFLOW Well package input file. MODPATH (the U.S. Geological Survey particle-tracking postprocessing program) and MODTOOLS (the set of U.S. Geological Survey computer programs to translate MODFLOW and MODPATH output to a geographic information system) were used to map zones of transport. A steady-state, five-layer model of the Boca Raton area was created using the telescopic mesh refinement process and calibrated to average conditions during January 1989 to June 1990. A sensitivity analysis of various model parameters indicates that the model is most sensitive to changes in recharge rates, hydraulic conductivity for layer 1, and leakance for layers 3 and 4 (Biscayne aquifer). 
    Recharge (58 percent); river (canal) leakance (29 percent); and inflow through the northern, western, and southern prescribed flux model boundaries (10 percent) represent the major inflow components. Principal outflow components in the Boca Raton well field area include well discharge (56 percent), river (canal) leakance (27 percent), and water that discharges along the coast (10 percent). A particle-tracking analysis using MODPATH was conducted to better understand well-field ground-water flow patterns and time of travel. MODTOOLS was used to construct zones-of-transport spatial data for municipal supply wells. Porosity estimates were uniformly increased to study the effect of porosity on zones of transport. Where porosity was increased, the size of the zones of transport was shown to decrease.
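
The porosity result in the last sentence follows directly from the advection relation particle trackers such as MODPATH are built on: seepage velocity is the Darcy flux divided by effective porosity, so raising porosity slows particles and shrinks any travel-time-based zone of transport. A toy illustration (the flux, porosities, and time horizon are invented values):

```python
def travel_distance_m(darcy_flux_m_per_day, porosity, years):
    """Distance a water particle advects in a given time. Higher effective
    porosity lowers seepage velocity (v = q/n), shortening the distance."""
    return darcy_flux_m_per_day / porosity * years * 365.25

print(travel_distance_m(0.05, 0.20, 10))  # lower porosity, ≈ 913 m
print(travel_distance_m(0.05, 0.30, 10))  # higher porosity, shorter distance
```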

  9. The "neuro-mapping locator" software. A real-time intraoperative objective paraesthesia mapping tool to evaluate paraesthesia coverage of the painful zone in patients undergoing spinal cord stimulation lead implantation.

    PubMed

    Guetarni, F; Rigoard, P

    2015-03-01

    Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS on the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, it is clear that this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals to clinical effects and to guide therapeutic choices. We have developed a software at Poitiers University hospital allowing real-time objective mapping of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself, on a touch screen interface. The purpose of this article is to describe this intraoperative mapping software, in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software is dedicated to patients with failed back surgery syndrome, candidates for SCS lead implantation, to actively participate in the implantation procedure. Real-time geographical localization of the paraesthesia generated by percutaneous or multicolumn surgical SCS lead implanted under awake anaesthesia allows intraoperative lead programming and possibly lead positioning to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of NML software should not be limited to optimize one specific device implantation in a patient but also allow to compare instantaneously various stimulation strategies, by characterizing new technical parameters as "coverage efficacy" and "device specificity" on selected subgroups of patients. 
Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming of tomorrow's neurostimulators. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  10. A GIS-based DRASTIC model for assessing intrinsic groundwater vulnerability in northeastern Missan governorate, southern Iraq

    NASA Astrophysics Data System (ADS)

    Al-Abadi, Alaa M.; Al-Shamma'a, Ayser M.; Aljabbari, Mukdad H.

    2017-03-01

    In this study, intrinsic groundwater vulnerability for the shallow aquifer in the northeastern Missan governorate, southern Iraq, is evaluated using the widely used DRASTIC model within a GIS environment. The DRASTIC parameters were prepared by gathering data from different sources, including field surveys, geological and meteorological data, a digital elevation model (DEM) of the study area, archival databases, and published research. The data used to build the DRASTIC model were arranged in a geospatial database using the Spatial Analyst extension of ArcGIS 10.2. The results for vulnerability to general contaminants show that the study area is characterized by two vulnerability zones, low and moderate: ninety-four percent (94 %) of the study area falls in the low vulnerability class, whereas the remaining 6 % has moderate vulnerability. The pesticide DRASTIC index map likewise shows two vulnerability zones, low and moderate; in this version, a small percentage (13 %) of the study area has low vulnerability to contamination, while most of the area (about 87 %) has moderate vulnerability. The final results indicate that the aquifer system in the area of interest is relatively well protected from contamination originating at the surface. To mitigate contamination risks in the moderate vulnerability zones, protective measures should be put in place before the aquifer is exploited and before comprehensive agricultural activities begin in the area.
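    The vulnerability index behind a record like this is a simple weighted sum of rated parameters. A minimal sketch, assuming the standard weights of the generic DRASTIC model (Aller et al., 1987) and hypothetical ratings, since the abstract does not list the values used:

    ```python
    # Sketch of the DRASTIC index: index = sum(weight_i * rating_i) over the
    # seven parameters. Weights below are the standard generic-model weights;
    # the cell ratings are hypothetical.
    GENERIC_WEIGHTS = {
        "D": 5,  # Depth to water
        "R": 4,  # net Recharge
        "A": 3,  # Aquifer media
        "S": 2,  # Soil media
        "T": 1,  # Topography (slope)
        "I": 5,  # Impact of the vadose zone
        "C": 3,  # hydraulic Conductivity
    }

    def drastic_index(ratings, weights=GENERIC_WEIGHTS):
        """Weighted sum of the seven parameter ratings (each rated 1-10)."""
        return sum(weights[k] * ratings[k] for k in weights)

    # Hypothetical ratings for one raster cell:
    cell = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 10, "I": 8, "C": 2}
    print(drastic_index(cell))  # → 151 (theoretical range is 23-230)
    ```

    The pesticide variant of the model uses the same formula with a different weight set, which is why the same terrain yields two different vulnerability maps.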

  11. Laharz_py: GIS tools for automated mapping of lahar inundation hazard zones

    USGS Publications Warehouse

    Schilling, Steve P.

    2014-01-01

    Laharz_py is written in the Python programming language as a suite of tools for use in the ArcMap Geographic Information System (GIS). Primarily, Laharz_py is a computational model that uses statistical descriptions of areas inundated by past mass-flow events to forecast areas likely to be inundated by hypothetical future events. The forecasts use physically motivated and statistically calibrated power-law equations, each of which has the form A = cV^(2/3), relating mass-flow volume (V) to the planimetric or cross-sectional area (A) inundated by an average flow as it descends a given drainage. Calibration of the equations utilizes logarithmic transformation and linear regression to determine the best-fit values of c. The software uses values of V, an algorithm for identifying mass-flow source locations, and digital elevation models of topography to portray forecast hazard zones for lahars, debris flows, or rock avalanches on maps. Laharz_py offers two methods to construct areas of potential inundation for lahars: (1) selection of a range of plausible V values results in a set of nested hazard zones showing areas likely to be inundated by a range of hypothetical flows; and (2) the user selects a single volume and a confidence interval for the prediction. In either case, Laharz_py calculates the mean expected A and B values from each user-selected value of V. However, for the second case, a single value of V yields two additional results representing the upper and lower values of the confidence interval of prediction. Calculation of these two bounding predictions requires the statistically calibrated prediction equations, a user-specified level of confidence, and t-distribution statistics to calculate the standard error of regression, standard error of the mean, and standard error of prediction. The portrayal of results from these two methods on maps compares the range of inundation areas due to prediction uncertainties with uncertainties in selection of V values. 
The Open-File Report explains how to install and use the software; the second part of the documentation demonstrates all of the Laharz_py tools on an included example dataset for Mount Rainier, Washington.
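    The calibration described above can be sketched compactly. With the exponent fixed at 2/3, fitting log10(A) = log10(c) + (2/3)·log10(V) reduces to averaging the residuals in log space. The (V, A) pairs below are invented for illustration; they are not Laharz_py calibration data:

    ```python
    import math

    # Sketch of a Laharz_py-style power-law calibration, A = c * V**(2/3).
    # With the exponent fixed, the best-fit log10(c) is the mean log-space
    # residual; c follows by exponentiation.
    def fit_c(volumes, areas):
        residuals = [math.log10(a) - (2.0 / 3.0) * math.log10(v)
                     for v, a in zip(volumes, areas)]
        return 10 ** (sum(residuals) / len(residuals))

    def inundated_area(c, volume):
        return c * volume ** (2.0 / 3.0)

    V = [1e5, 1e6, 1e7]        # hypothetical flow volumes (m^3)
    A = [4.3e5, 2.1e6, 9.8e6]  # hypothetical inundated areas (m^2)
    c = fit_c(V, A)
    print(round(c))  # → 207
    ```

    Running the fitted equation over a range of plausible V values is what produces the nested hazard zones of method (1).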

  12. A graphical modeling tool for evaluating nitrogen loading to and nitrate transport in ground water in the mid-Snake region, south-central Idaho

    USGS Publications Warehouse

    Clark, David W.; Skinner, Kenneth D.; Pollock, David W.

    2006-01-01

    A flow and transport model was created with a graphical user interface to simplify the evaluation of nitrogen loading and nitrate transport in the mid-Snake region in south-central Idaho. This model and interface package, the Snake River Nitrate Scenario Simulator, uses the U.S. Geological Survey's MODFLOW 2000 and MOC3D models. The interface, which is enabled for use with geographic information systems (GIS), was created using ESRI's royalty-free MapObjects LT software. The interface lets users view initial nitrogen-loading conditions (representing conditions as of 1998), alter the nitrogen loading within selected zones by specifying a multiplication factor and applying it to the initial condition, run the flow and transport model, and view a graphical representation of the modeling results. The flow and transport model of the Snake River Nitrate Scenario Simulator was created by rediscretizing and recalibrating a clipped portion of an existing regional flow model. The new subregional model was recalibrated with newly available water-level data and spring and ground-water nitrate concentration data for the study area. An updated nitrogen input GIS layer controls the application of nitrogen to the flow and transport model. Users can alter the nitrogen application to the flow and transport model by altering the nitrogen load in predefined spatial zones contained within similar political, hydrologic, and size-constrained boundaries.

  13. Study to Minimize Learning Progress Differences in Software Learning Class Using PLITAZ System

    ERIC Educational Resources Information Center

    Dong, Jian-Jie; Hwang, Wu-Yuin

    2012-01-01

    This study developed a system using two-phased strategies called "Pause Lecture, Instant Tutor-Tutee Match, and Attention Zone" (PLITAZ). This system was used to help solve learning challenges and to minimize learning progress differences in a software learning class. During a teacher's lecture time, students were encouraged to anonymously express…

  14. A combined monitoring and modeling approach to quantify water and nitrate leaching using effective soil column hydraulic properties

    NASA Astrophysics Data System (ADS)

    Couvreur, V.; Kandelous, M. M.; Moradi, A. B.; Baram, S.; Mairesse, H.; Hopmans, J. W.

    2014-12-01

    There is growing worldwide concern about the contribution of agricultural land to groundwater pollution. Nitrate contamination of groundwater across the Central Valley of California has been related to its diverse and intensive agricultural practices. However, no study has compared nitrate leaching across the individual agricultural fields of this complex and diversely managed area. A combined field monitoring and modeling approach was developed to quantify, from simple measurements, the leaching of water and nitrate below the root zone. The monitored state variables are soil water content at several depths within the root zone, soil matric potential at two depths below the root zone, and nitrate concentration in the soil solution. In the modeling part, unsaturated water flow and solute transport are simulated with the HYDRUS software in a soil profile divided into up to two soil hydraulic types, whose effective hydraulic properties are optimized with an inverse modeling method. The applicability of the method will first be demonstrated in silico, with synthetic soil water dynamics data generated with HYDRUS, considering the soil column as a layering of several soil types characterized in situ. The method will then be applied to actual soil water status data from various crops in California, including tomato, citrus, almond, pistachio, and walnut. Eventually, improvements to irrigation and fertilization management practices (mainly questions of the quantity and frequency of application that minimize leaching under constraints of water and nutrient availability) will be investigated using coupled modeling and optimization tools.
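    The inverse-modeling step amounts to choosing the effective soil parameters that minimize the misfit between observed and simulated water content. HYDRUS solves the Richards equation internally; the one-parameter "model" below is a made-up stand-in used only to illustrate the objective function and the in-silico recovery test:

    ```python
    import math

    # Toy sketch of inverse estimation: pick the effective parameter that
    # minimizes the RMSE between observed and simulated water content.
    def rmse(observed, simulated):
        n = len(observed)
        return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

    def toy_model(alpha, times):
        # hypothetical drainage curve, NOT the HYDRUS physics
        return [0.40 * math.exp(-alpha * t) for t in times]

    times = [0.0, 1.0, 2.0, 4.0]
    observed = toy_model(0.30, times)  # synthetic "in silico" data, as in the abstract

    best_misfit, best_alpha = min(
        (rmse(observed, toy_model(a, times)), a) for a in [0.10, 0.20, 0.30, 0.40]
    )
    print(best_alpha)  # → 0.3
    ```

    Because the synthetic data were generated with a known parameter, the search recovering that parameter is exactly the in-silico validation the abstract describes, before the method is turned loose on field data.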

  15. Analysis and calculation of macrosegregation in a casting ingot, exhibits C and E

    NASA Technical Reports Server (NTRS)

    Poirier, D. R.; Maples, A. L.

    1984-01-01

    A computer model which describes the solidification of a binary metal alloy in an insulated rectangular mold with a temperature gradient is presented. A numerical technique, applicable to a broad class of moving boundary problems, was implemented therein. The solidification model described is used to calculate the macrosegregation within the solidified casting by coupling the equations for liquid flow in the solid/liquid or mushy zone with the energy equation for heat flow throughout the ingot and thermal convection in the bulk liquid portion. The rate of development of the solid can be automatically calculated by the model. Numerical analysis of such solidification parameters as enthalpy and boundary layer flow is displayed. On-line user interface and software documentation are presented.

  16. Combined statistical analysis of landslide release and propagation

    NASA Astrophysics Data System (ADS)

    Mergili, Martin; Rohmaneo, Mohammad; Chu, Hone-Jay

    2016-04-01

    Statistical methods - often coupled with stochastic concepts - are commonly employed to relate areas affected by landslides with environmental layers, and to estimate spatial landslide probabilities by applying these relationships. However, such methods only concern the release of landslides, disregarding their motion. Conceptual models for mass flow routing are used for estimating landslide travel distances and possible impact areas. Automated approaches combining release and impact probabilities are rare. The present work attempts to fill this gap by a fully automated procedure combining statistical and stochastic elements, building on the open source GRASS GIS software: (1) The landslide inventory is subset into release and deposition zones. (2) We employ a traditional statistical approach to estimate the spatial release probability of landslides. (3) We back-calculate the probability distribution of the angle of reach of the observed landslides, employing the software tool r.randomwalk. One set of random walks is routed downslope from each pixel defined as release area. Each random walk stops when leaving the observed impact area of the landslide. (4) The cumulative probability function (cdf) derived in (3) is used as input to route a set of random walks downslope from each pixel in the study area through the DEM, assigning the probability gained from the cdf to each pixel along the path (impact probability). The impact probability of a pixel is defined as the average impact probability of all sets of random walks impacting a pixel. Further, the average release probabilities of the release pixels of all sets of random walks impacting a given pixel are stored along with the area of the possible release zone. (5) We compute the zonal release probability by increasing the release probability according to the size of the release zone - the larger the zone, the larger the probability that a landslide will originate from at least one pixel within this zone. 
We quantify this relationship by a set of empirical curves. (6) Finally, we multiply the zonal release probability with the impact probability in order to estimate the combined impact probability for each pixel. We demonstrate the model with a 167 km² study area in Taiwan, using an inventory of landslides triggered by Typhoon Morakot. Analyzing the model results leads us to a set of key conclusions: (i) The average composite impact probability over the entire study area corresponds well to the density of observed landslide pixels. We therefore conclude that the method is valid in general, even though the concept of the zonal release probability carries some conceptual issues that have to be kept in mind. (ii) The parameters used as predictors cannot fully explain the observed distribution of landslides. The size of the release zone influences the composite impact probability to a larger degree than the pixel-based release probability. (iii) The prediction rate increases considerably when the largest, deep-seated landslides are excluded from the analysis. We conclude that such landslides are mainly related to geological features hardly reflected in the predictor layers used.
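    Steps (5) and (6) above can be sketched as follows. The authors quantify the zonal release probability with empirical curves; as a hypothetical stand-in, the sketch assumes independent pixels, so P(zone) = 1 - prod(1 - p_i). All numbers are invented:

    ```python
    # Minimal sketch: aggregate per-pixel release probabilities into a zonal
    # release probability (larger zones -> larger probability, as in the
    # text), then multiply by the impact probability from the random walks.
    def zonal_release_probability(pixel_probs):
        prob_no_release = 1.0
        for p in pixel_probs:
            prob_no_release *= (1.0 - p)
        return 1.0 - prob_no_release

    def combined_impact_probability(pixel_probs, impact_prob):
        return zonal_release_probability(pixel_probs) * impact_prob

    release_zone = [0.02, 0.05, 0.01]  # hypothetical per-pixel release probabilities
    print(round(combined_impact_probability(release_zone, 0.4), 4))  # → 0.0313
    ```

    The independence assumption reproduces the qualitative behaviour the authors describe (a bigger release zone raises the zonal probability) without claiming to match their empirical curves.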

  17. Delta-Ferrite Distribution in a Continuous Casting Slab of Fe-Cr-Mn Austenitic Stainless Steel

    NASA Astrophysics Data System (ADS)

    Chen, Chao; Cheng, Guoguang

    2017-10-01

    The delta-ferrite distribution in a continuous casting slab of Fe-Cr-Mn stainless steel grade (200 series J4) was analyzed. The results showed that the ferrite fraction was less than 3 pct. The "M" type distribution was observed in the thickness direction. For the distribution at the centerline, the maximum ferrite content was found in the triangular zone of the macrostructure. In addition, in this zone, the carbon and sulfur were severely segregated. Furthermore, an equilibrium solidification calculation by Thermo-Calc® software indicates that the solidification mode of the composition in this triangular zone is the same as the solidification mode of the averaged composition, i.e., the FA (ferrite-austenite) mode. None of the nickel-chromium equivalent formulas combined with the Schaeffler-type diagram could predict the ferrite fraction of the Cr-Mn stainless steel grade in a reasonable manner. The authors propose that more attention should be paid to the development of prediction models for the ferrite fraction of stainless steels under continuous casting conditions.

  18. Design and application of the emergency response mobile phone-based information system for infectious disease reporting in the Wenchuan earthquake zone.

    PubMed

    Ma, Jiaqi; Zhou, Maigeng; Li, Yanfei; Guo, Yan; Su, Xuemei; Qi, Xiaopeng; Ge, Hui

    2009-05-01

    To describe the design and application of an emergency response mobile phone-based information system for infectious disease reporting. Software engineering and business modeling were used to design and develop the emergency response mobile phone-based information system for infectious disease reporting. Seven days after the initiation of the reporting system, the reporting rate in the earthquake zone reached the level of the same period in 2007, using the mobile phone-based information system. Surveillance of the weekly report on morbidity in the earthquake zone after the initiation of the mobile phone reporting system showed the same trend as the previous three years. The emergency response mobile phone-based information system for infectious disease reporting was an effective solution to transmit urgently needed reports and manage communicable disease surveillance information. This assured the consistency of disease surveillance and facilitated sensitive, accurate, and timely disease surveillance. It is an important backup for the internet-based direct reporting system for communicable disease. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellors, R J

    The Comprehensive Nuclear Test Ban Treaty (CTBT) includes provisions for an on-site inspection (OSI), which allows the use of specific techniques to detect underground anomalies including cavities and rubble zones. One permitted technique is active seismic surveys such as seismic refraction or reflection. The purpose of this report is to conduct some simple modeling to evaluate the potential use of seismic reflection in detecting cavities and to test the use of open-source software in modeling possible scenarios. It should be noted that OSI inspections are conducted under specific constraints regarding duration and logistics. These constraints are likely to significantly impact active seismic surveying, as a seismic survey typically requires considerable equipment, effort, and expertise. For the purposes of this study, which is a first-order feasibility study, these issues will not be considered. This report provides a brief description of the seismic reflection method along with some commonly used software packages. This is followed by an outline of a simple processing stream based on a synthetic model, along with results from a set of models representing underground cavities. A set of scripts used to generate the models are presented in an appendix. We do not consider detection of underground facilities in this work, and the geologic setting used in these tests is an extremely simple one.

  20. Blueschist preservation in a retrograded, high-pressure, low-temperature metamorphic terrane, Tinos, Greece: Implications for fluid flow paths in subduction zones

    NASA Astrophysics Data System (ADS)

    Breeding, Christopher M.; Ague, Jay J.; Bröcker, Michael; Bolton, Edward W.

    2003-01-01

    The preservation of high-pressure, low-temperature (HP-LT) mineral assemblages adjacent to marble unit contacts on the Cycladic island of Tinos in Greece was investigated using a new type of digital outcrop mapping and numerical modeling of metamorphic fluid infiltration. Mineral assemblage distributions in a large blueschist outcrop, adjacent to the basal contact of a 150-meter thick marble horizon, were mapped at centimeter-scale resolution onto digital photographs using a belt-worn computer and graphics editing software. Digital mapping reveals that while most HP-LT rocks in the outcrop were pervasively retrograded to greenschist facies, the marble-blueschist contact zone underwent an even more intense retrogression. Preservation of HP-LT mineral assemblages was mainly restricted to a 10-15 meter zone (or enclave) adjacent to the intensely retrograded lithologic contact. The degree and distribution of the retrograde overprint suggests that pervasively infiltrating fluids were channelized into the marble-blueschist contact and associated veins and flowed around the preserved HP-LT enclave. Numerical modeling of Darcian flow, based on the field observations, suggests that near the marble horizon, deflections in fluid flow paths caused by flow channelization along the high-permeability marble-blueschist contact zone likely resulted in very large fluid fluxes along the lithologic contact and significantly smaller fluxes (as much as 8 times smaller than the input flux) within the narrow, low-flux regions where HP-LT minerals were preserved adjacent to the contact. Our results indicate that lithologic contacts are important conduits for metamorphic fluid flow in subduction zones. Channelization of retrograde fluids into these discrete flow conduits played a critical role in the preservation of HP-LT assemblages.

  1. Implications of different digital elevation models and preprocessing techniques to delineate debris flow inundation hazard zones in El Salvador

    NASA Astrophysics Data System (ADS)

    Anderson, E. R.; Griffin, R.; Irwin, D.

    2013-12-01

    Heavy rains and steep, volcanic slopes in El Salvador cause numerous landslides every year, posing a persistent threat to the population, economy and environment. Although potential debris inundation hazard zones have been delineated using digital elevation models (DEMs), some disparities exist between the simulated zones and actual affected areas. Moreover, these hazard zones have only been identified for volcanic lahars and not the shallow landslides that occur nearly every year. This is despite the availability of tools to delineate a variety of landslide types (e.g., the USGS-developed LAHARZ software). Limitations in DEM spatial resolution, age of the data, and hydrological preprocessing techniques can contribute to inaccurate hazard zone definitions. This study investigates the impacts of using different elevation models and pit filling techniques on the final debris hazard zone delineations, in an effort to determine which combination of methods most closely agrees with observed landslide events. In particular, a national DEM digitized from topographic sheets from the 1970s and 1980s provides an elevation product at a 10-meter resolution. Both natural and anthropogenic modifications of the terrain limit the accuracy of current landslide hazard assessments derived from this source. Global products from the Shuttle Radar Topography Mission (SRTM) and the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global DEM (ASTER GDEM) offer more recent data, but at the cost of spatial resolution. New data derived from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) in 2013 provide the opportunity to update hazard zones at a higher spatial resolution (approximately 6 meters). Hydrological filling of sinks or pits for current hazard zone simulation has previously been achieved through ArcInfo Spatial Analyst. Such hydrological processing typically only fills pits and can lead to drastic modifications of original elevation values. 
Optimized pit filling techniques use both cut and fill operations to minimize modifications of the original DEM. Satellite image interpretation and field surveying provide the baseline upon which to test the accuracy of each model simulation. By outlining areas that could potentially be inundated by debris flows, these efforts can be used to more accurately identify the places and assets immediately exposed to landslide hazards. We contextualize the results of the previous and ongoing efforts into how they may be incorporated into decision support systems. We also discuss if and how these analyses would have provided additional knowledge in the past, and identify specific recommendations as to how they could contribute to a more robust decision support system in the future.

  2. Delineation of areas contributing groundwater to selected receiving surface water bodies for long-term average hydrologic conditions from 1968 to 1983 for Long Island, New York

    USGS Publications Warehouse

    Misut, Paul E.; Monti, Jack

    2016-10-05

    To assist resource managers and planners in developing informed strategies to address nitrogen loading to coastal water bodies of Long Island, New York, the U.S. Geological Survey and the New York State Department of Environmental Conservation initiated a program to delineate a comprehensive dataset of groundwater recharge areas (or areas contributing groundwater), travel times, and outflows to streams and saline embayments on Long Island. A four-layer regional three-dimensional finite-difference groundwater-flow model of hydrologic conditions from 1968 to 1983 was used to delineate 48 groundwater watersheds on Long Island. Sixteen particle starting points were evenly spaced within each of the 4,000- by 4,000-foot model cells that receive water-table recharge, and the particles were tracked to outflow zones using forward particle-tracking software. For each particle, simulated travel times were grouped by age as follows: less than or equal to 10 years, greater than 10 years and less than or equal to 100 years, greater than 100 years and less than or equal to 1,000 years, and greater than 1,000 years; simulated ending zones were grouped into 48 receiving water bodies, based on the New York State Department of Environmental Conservation Waterbody Inventory/Priority Waterbodies List. Areal delineations of travel-time zones and groundwater contributing areas were generated, and a table was prepared presenting the sum of groundwater outflow for each area.
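    The age grouping described above is a simple binning of simulated travel times; a sketch using the bin edges stated in the abstract, with invented particle times:

    ```python
    # Assign each simulated particle travel time (years) to one of the four
    # age classes used in the report.
    def age_class(travel_time_years):
        if travel_time_years <= 10:
            return "<=10"
        if travel_time_years <= 100:
            return "10-100"
        if travel_time_years <= 1000:
            return "100-1000"
        return ">1000"

    times = [3.2, 47.0, 850.0, 12000.0]  # hypothetical particle travel times (years)
    print([age_class(t) for t in times])  # → ['<=10', '10-100', '100-1000', '>1000']
    ```

    Applying such a classification to all sixteen particles in each recharge cell is what yields the mapped travel-time zones.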

  3. Dynamic rupture simulations on complex fault zone structures with off-fault plasticity using the ADER-DG method

    NASA Astrophysics Data System (ADS)

    Wollherr, Stephanie; Gabriel, Alice-Agnes; Igel, Heiner

    2015-04-01

    In dynamic rupture models, high stress concentrations at rupture fronts have to be accommodated by off-fault inelastic processes such as plastic deformation. As shown by Roten et al. (2014), incorporating plastic yielding can significantly reduce earlier predictions of ground motions in the Los Angeles Basin. Further, an inelastic response of the material surrounding a fault potentially has a strong impact on surface displacement and is therefore a key aspect in understanding the triggering of tsunamis through sea-floor uplift. We present an implementation of off-fault plasticity, and its verification, for the software package SeisSol, an arbitrary high-order derivative discontinuous Galerkin (ADER-DG) method. The software recently reached multi-petaflop/s performance on some of the largest supercomputers worldwide and was a Gordon Bell prize finalist application in 2014 (Heinecke et al., 2014). For the nonelastic calculations we impose a Drucker-Prager yield criterion on the shear stress, with a viscous regularization following Andrews (2005) that permits the smooth relaxation of the high stress concentrations induced by the dynamic rupture process. We verify the implementation by comparison with the SCEC/USGS Spontaneous Rupture Code Verification Benchmarks. The results of test problem TPV13, with a 60-degree dipping normal fault, show that SeisSol is in good agreement with other codes. Additionally, we aim to explore the numerical characteristics of the off-fault plasticity implementation by performing convergence tests with the 2D code. The ADER-DG method is especially suited to complex geometries through its use of unstructured tetrahedral meshes. Local adaptation of the mesh resolution enables fine sampling of the cohesive zone on the fault while simultaneously satisfying the dispersion requirements of wave propagation away from the fault. 
In this context we will investigate the influence of off-fault-plasticity on geometrically complex fault zone structures like subduction zones or branched faults. Studying the interplay of stress conditions and angle dependence of neighbouring branches including inelastic material behaviour and its effects on rupture jumps and seismic activation helps to advance our understanding of earthquake source processes. An application is the simulation of a real large-scale subduction zone scenario including plasticity to validate the coupling of our dynamic rupture calculations to a tsunami model in the framework of the ASCETE project (http://www.ascete.de/). Andrews, D. J. (2005): Rupture dynamics with energy loss outside the slip zone, J. Geophys. Res., 110, B01307. Heinecke, A. (2014), A. Breuer, S. Rettenberger, M. Bader, A.-A. Gabriel, C. Pelties, A. Bode, W. Barth, K. Vaidyanathan, M. Smelyanskiy and P. Dubey: Petascale High Order Dynamic Rupture Earthquake Simulations on Heterogeneous Supercomputers. In Supercomputing 2014, The International Conference for High Performance Computing, Networking, Storage and Analysis. IEEE, New Orleans, LA, USA, November 2014. Roten, D. (2014), K. B. Olsen, S.M. Day, Y. Cui, and D. Fäh: Expected seismic shaking in Los Angeles reduced by San Andreas fault zone plasticity, Geophys. Res. Lett., 41, 2769-2777.

  4. European Regional Climate Zone Modeling of a Commercial Absorption Heat Pump Hot Water Heater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishaldeep; Shen, Bo; Keinath, Chris

    2017-01-01

    High-efficiency gas-burning hot water heating takes advantage of a condensing heat exchanger to deliver improved combustion efficiency over a standard non-condensing configuration; even so, the heat delivered to the water is always lower than the heating value of the gas consumed. In contrast, gas absorption heat pump (GAHP) hot water heating combines the efficiency of gas burning with the performance increase of a heat pump to offer significant gas energy savings. An ammonia-water system also has the advantage of zero ozone depletion potential and low global warming potential. In comparison with air-source electric heat pumps, the absorption system can maintain higher coefficients of performance in colder climates. In this work, a GAHP commercial water heating system was compared to a condensing gas storage system for a range of locations and climate zones across Europe. The thermodynamic performance map of a single-effect ammonia-water absorption system was used in building energy modeling software that could also incorporate the changing ambient air temperature and water mains temperature for a specific location, as well as a full-service restaurant water draw pattern.

  5. Modeling the effects of the variability of temperature-related dynamic viscosity on the thermal-affected zone of groundwater heat-pump systems

    NASA Astrophysics Data System (ADS)

    Lo Russo, Stefano; Taddia, Glenda; Cerino Abdin, Elena

    2018-06-01

    Thermal perturbation in the subsurface produced in an open-loop groundwater heat pump (GWHP) plant is a complex transport phenomenon affected by several factors, including the exploited aquifer's hydrogeological and thermal characteristics, well construction features, and the temporal dynamics of the plant's groundwater abstraction and reinjection system. Hydraulic conductivity has a major influence on heat transport because plume propagation, which occurs primarily through advection, tends to degrade following conductive heat transport and convection within moving water. Hydraulic conductivity is, in turn, influenced by water reinjection because the dynamic viscosity of groundwater varies with temperature. This paper reports on a computational analysis conducted using FEFLOW software to quantify how the thermal-affected zone (TAZ) is influenced by the variation in dynamic viscosity due to reinjected groundwater in a well-doublet scheme. The modeling results demonstrate non-negligible groundwater dynamic-viscosity variation that affects thermal plume propagation in the aquifer. This influence on TAZ calculation was enhanced for aquifers with high intrinsic permeability and/or substantial temperature differences between abstracted and post-heat-pump-reinjected groundwater.
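    The physical effect underlying this study is the strong temperature dependence of water's dynamic viscosity and its feedback into hydraulic conductivity, K = k·ρ·g/μ. A minimal sketch using the published empirical Vogel equation for water, μ(T) = A·10^(B/(T−C)) with A = 2.414e-5 Pa·s, B = 247.8 K, C = 140 K; FEFLOW's internal formulation may differ, and the permeability value below is hypothetical:

    ```python
    # Temperature dependence of water viscosity (Vogel equation, empirical
    # constants for liquid water) and its effect on hydraulic conductivity.
    def water_viscosity(temp_c):
        """Dynamic viscosity of water in Pa*s at temperature temp_c (deg C)."""
        t_kelvin = temp_c + 273.15
        return 2.414e-5 * 10 ** (247.8 / (t_kelvin - 140.0))

    def hydraulic_conductivity(intrinsic_perm, temp_c, rho=998.0, g=9.81):
        """K = k * rho * g / mu (m/s), with k the intrinsic permeability (m^2)."""
        return intrinsic_perm * rho * g / water_viscosity(temp_c)

    # Warmer reinjected water is less viscous, so K rises with temperature:
    k10 = hydraulic_conductivity(1e-11, 10.0)  # hypothetical aquifer permeability
    k30 = hydraulic_conductivity(1e-11, 30.0)
    print(k30 / k10)  # → about 1.63, a non-negligible change
    ```

    A roughly 60 % change in effective conductivity over a 20 °C reinjection range illustrates why the paper finds the viscosity effect non-negligible for TAZ delineation, especially in highly permeable aquifers.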

  7. Bridging scales from satellite to grains: Structural mapping aided by tablet and photogrammetry

    NASA Astrophysics Data System (ADS)

    Hawemann, Friedrich; Mancktelow, Neil; Pennacchioni, Giorgio; Wex, Sebastian; Camacho, Alfredo

    2016-04-01

A fundamental problem in small-scale mapping is linking outcrop observations to the large-scale deformation pattern. The evolution of handheld devices such as tablets with integrated GPS, together with the availability of airborne imagery, allows precise localization of outcrops. Detailed structural geometries can be analyzed through ortho-rectified photo mosaics generated by photogrammetry software. In this study, we use an inexpensive standard Samsung tablet (< 300 Euro) to map individual shear zones up to 60 m long with the tracking option offered by the program Locus Map. Even though GPS accuracy is about 3 m, the relative error from one point to another during tracking is on the order of only about 1 dm. Parts of the shear zone with excellent outcrop are photographed in a mosaic array with a standard camera at a relatively wide angle. An area of about 30 m² requires about 50 photographs with enough overlap to be used for photogrammetry. The software PhotoScan from Agisoft matches the photographs in a fully automated manner, calculates a 3D model of the outcrop, and can project this as an orthophoto onto a flat surface. This allows original orientations of grain-scale structures to be recorded over areas on a scale of tens to hundreds of metres. The photo mosaics can then be georeferenced with the aid of the GPS tracks of the shear zones and included in a GIS. This provides an inexpensive record of the structures in high detail. The great advantages over mapping with UAVs (drones) are the resolution (<1 mm to >1 cm), the independence from weather and power sources, and the low cost.

  8. Dynamic computer simulations of electrophoresis: three decades of active research.

    PubMed

    Thormann, Wolfgang; Caslavska, Jitka; Breadmore, Michael C; Mosher, Richard A

    2009-06-01

Dynamic models for electrophoresis are based upon model equations derived from the transport concepts in solution, together with user-specified conditions. They can predict the movement of ions theoretically and are thus the most versatile tool for exploring the fundamentals of electrokinetic separations. Since its inception three decades ago, the state of dynamic computer simulation software and its use have progressed significantly, and Electrophoresis played a pivotal role in that endeavor, as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations, under almost exactly the same conditions used in the laboratory. This has been employed to reveal the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and in any instrumental format, including free-fluid preparative, gel, capillary and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples and a discussion of the applications and achievements of dynamic simulation.
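
    The transport concepts underlying such simulators can be illustrated with a deliberately minimal sketch: one analyte zone migrating under an electric field with diffusion, solved by explicit finite differences. All parameter values are illustrative and not tied to any of the simulators surveyed.

```python
import numpy as np

# Minimal 1D zone-electrophoresis sketch: electromigration (advection) plus
# diffusion of a single analyte zone, explicit finite differences.
nx, dx = 200, 1e-4               # grid points, spacing [m]
dt, steps = 0.05, 500            # time step [s], number of steps
mu_ep, E, D = 2e-8, 1e4, 1e-9    # mobility [m^2/(V*s)], field [V/m], diffusivity [m^2/s]
v = mu_ep * E                    # electromigration velocity [m/s]

c = np.zeros(nx)
c[20:30] = 1.0                   # initial rectangular sample zone

# explicit-scheme stability: CFL and diffusion numbers
assert v * dt / dx <= 1.0 and D * dt / dx ** 2 <= 0.5

for _ in range(steps):
    adv = -v * dt / dx * (c - np.roll(c, 1))                         # upwind advection
    dif = D * dt / dx ** 2 * (np.roll(c, 1) - 2 * c + np.roll(c, -1))
    c = c + adv + dif

print("zone peak moved to cell", int(np.argmax(c)))
```

    Real simulators solve coupled electroneutrality, ionic strength and pH equilibria on top of this transport core; the sketch only shows the advection-diffusion skeleton.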

  9. FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns

    NASA Astrophysics Data System (ADS)

    Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.

    2016-12-01

The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes, e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open-source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin-section micrographs, geological maps, outcrop or aerial photographs, or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale-independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and will combine fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
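
    The parallel plate permeability estimate mentioned above follows the cubic law. A minimal sketch, with illustrative aperture and spacing values (not FracPaQ's implementation):

```python
def parallel_plate_permeability(aperture_m, spacing_m):
    """Equivalent permeability [m^2] of a set of parallel smooth-walled
    fractures (cubic law): k = b**3 / (12 * s)."""
    return aperture_m ** 3 / (12.0 * spacing_m)

k = parallel_plate_permeability(1e-4, 0.5)   # 0.1 mm apertures spaced 0.5 m (assumed)
print(f"k = {k:.2e} m^2")
```

    The cubic dependence on aperture is why permeability estimates are so sensitive to the measured fracture widths.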

  10. Customer Communication Challenges and Solutions in Globally Distributed Agile Software Development

    NASA Astrophysics Data System (ADS)

    Pikkarainen, Minna; Korkala, Mikko

Working in a globally distributed market is one of the key trends among software organizations all over the world [1-5]. Several factors have contributed to the growth of distributed software development; time-zone-independent "follow the sun" development, access to well-educated labour, maturation of the technical infrastructure and reduced costs are some of the most commonly cited benefits of distributed development [3, 6-8]. Furthermore, customers are often located in different countries because of companies' internationalization strategies or good market opportunities.

  11. Joint Inversion of 3d Mt/gravity/magnetic at Pisagua Fault.

    NASA Astrophysics Data System (ADS)

    Bascur, J.; Saez, P.; Tapia, R.; Humpire, M.

    2017-12-01

This work shows the results of a joint inversion at the Pisagua Fault using 3D magnetotelluric (MT), gravity and regional magnetic data. The MT survey has poor coverage of the study area, with only 21 stations; however, it allows detection of a low-resistivity zone aligned with the Pisagua Fault trace, which is interpreted as a damage zone. The integration of gravity and magnetic data, which have denser sampling and better coverage, adds detail and resolution to the detected low-resistivity structure and helps to improve the structural interpretation using the resulting models (density, magnetic susceptibility and electrical resistivity). The joint inversion process minimizes a multiple-target function that includes the data misfit, model roughness and coupling norms (cross-gradient and direct relations) for all geophysical methods considered (MT, gravity and magnetic). This function is minimized iteratively using the Gauss-Newton method, which updates the model of each geophysical method, improving its individual data misfit, model roughness and coupling with the other geophysical models. Dedicated 3D inversion software codes were developed to solve the model updates of the magnetic and gravity methods, including the coupling norms with the additional geophysical parameters. The model update of the 3D MT is calculated using an iterative method that sequentially filters the prior model and the output model of a single 3D MT inversion process to obtain a resistivity model coupled with the gravity and magnetic solutions.
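
    The structure of such a coupled objective can be illustrated with a toy linear joint inversion. The sketch below couples two linear "methods" through a simple direct-relation norm (a stand-in for the cross-gradient term, which is nonlinear); all operators and dimensions are random and illustrative, not the MT/gravity/magnetic kernels of the study.

```python
import numpy as np

# Toy joint inversion: two linear "methods" d_i = G_i @ m_i, coupled by a
# direct-relation norm ||m1 - m2||. For a linear problem a single
# Gauss-Newton step is the least-squares solve of the stacked system.
rng = np.random.default_rng(0)
n = 20
m_true = np.cumsum(rng.normal(size=n))          # shared subsurface structure
G1 = rng.normal(size=(30, n))                   # forward operator, method 1
G2 = rng.normal(size=(25, n))                   # forward operator, method 2
d1, d2 = G1 @ m_true, G2 @ m_true               # noiseless synthetic data

lam = 1.0                                       # coupling weight
A = np.block([[G1, np.zeros((30, n))],
              [np.zeros((25, n)), G2],
              [lam * np.eye(n), -lam * np.eye(n)]])
b = np.concatenate([d1, d2, np.zeros(n)])
m = np.linalg.lstsq(A, b, rcond=None)[0]        # solve for [m1; m2] jointly
m1, m2 = m[:n], m[n:]
print("max coupling mismatch:", np.abs(m1 - m2).max())
```

    With noiseless, overdetermined data both recovered models coincide with the shared structure; in practice the coupling weight trades data fit against inter-model consistency.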

  12. Evaluation of RC Bridge Piers Retrofitted using Fiber-Reinforced Polymer (FRP)

    NASA Astrophysics Data System (ADS)

    Shayanfar, M. A.; Zarrabian, M. S.

    2008-07-01

For many years, steel reinforcement was considered the only tool for concrete confinement and was studied widely, but nowadays the application of fiber-reinforced polymer (FRP) as an effective alternative is well appreciated. Many existing bridges need to be retrofitted to resist earthquake motions. The objective of this research is to evaluate the nonlinear behavior of RC bridge piers. Eight RC bridge piers were modeled in ABAQUS software using a micromechanical model for homogeneous anisotropic fibers. The bilinear confinement model with a nonlinear transition zone of Mirmiran was also considered. The types and angles of fibers, and their effects on the final responses, were then evaluated [1]. Finally, the effects of retrofitting are evaluated and some suggestions are presented.
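
    The confinement model referenced is bilinear with a nonlinear transition zone. The sketch below shows only the plain bilinear backbone (the transition zone is omitted), with illustrative moduli rather than Mirmiran's calibrated parameters.

```python
# Bilinear confined-concrete stress-strain backbone of the general kind used
# in FRP confinement models. The nonlinear transition zone of Mirmiran's
# model is omitted; the moduli below are illustrative, not calibrated.
def bilinear_stress(strain, E1, E2, eps_t):
    """Two linear branches joined at transition strain eps_t."""
    if strain <= eps_t:
        return E1 * strain
    return E1 * eps_t + E2 * (strain - eps_t)

E1, E2, eps_t = 30e9, 5e9, 0.002       # Pa, Pa, transition strain (assumed)
for eps in (0.001, 0.002, 0.004):
    sigma = bilinear_stress(eps, E1, E2, eps_t)
    print(f"eps = {eps}: sigma = {sigma / 1e6:.1f} MPa")
```

    The second, shallower slope reflects the FRP jacket continuing to confine the concrete after the unconfined strength is reached.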

  13. [Finite element analysis of stress changes in posterior spinal pedicle screw fixation].

    PubMed

    Yan, Jia-Zhi; Wu, Zhi-Hong; Xu, Ri-Xin; Wang, Xue-Song; Xing, Ze-Jun; Zhao, Yu; Zhang, Jian-Guo; Shen, Jian-Xiong; Wang, Yi-Peng; Qiu, Gui-Xing

    2009-01-06

To evaluate the mechanical response of the L3-L4 segment after posterior fixation with a transpedicle screw system, a spiral CT machine was used to conduct continuous parallel scans of the L3-L4 section of a 40-year-old healthy male Chinese subject. The image data thus obtained were imported into MIMICS software to reconstruct the 2-D data into volume data and obtain 3-D models of every element. The Pro/3-D model construction software system was used to simulate the 3-D entity of L3-L4 fixed by screw rods through the spinal pedicle via a posterior approach, which was imported into the finite element software ABAQUS to construct a 3-D finite element model. The stress changes on the vertebrae and screws under an axial pressure of 0.5 MPa were analyzed. Under the evenly distributed pressure, the displacement of the L4 model was 0.00125815 mm, with an error of only 0.8167% from the datum displacement. The convergence of the model was good. The stresses on the fixed vertebral body, intervertebral disc, and internal fixators changed significantly. The stress concentration zone of the intervertebral disc shifted from the posterolateral side to the anterolateral side. The stress borne by the fixed vertebral bodies decreased significantly. Obvious stress concentration existed at the upper and lower sides of the base of the screw, and the fixed screw at the upper vertebral body bore greater stress than that at the lower vertebral body. Integration of computer-aided design and finite element analysis can successfully simulate the internal fixation of L3-L4 via the posterior approach and allow the mechanical changes in the vertebral column to be observed more directly.

  14. Software design of control system of CCD side-scatter lidar

    NASA Astrophysics Data System (ADS)

    Kuang, Zhiqiang; Liu, Dong; Deng, Qian; Zhang, Zhanye; Wang, Zhenzhu; Yu, Siqi; Tao, Zongming; Xie, Chenbo; Wang, Yingjian

    2018-03-01

Because of the existence of a blind zone and a transition zone, the application of backscattering lidar near the ground is limited. A side-scatter lidar equipped with a charge-coupled device (CCD) can separate the transmitting and receiving devices to avoid the impact of the geometric factor that exists in backscattering lidar, and can continuously detect more precise near-ground aerosol signals. The theory of CCD side-scatter lidar and the design of its control system are introduced. Visual control of the laser and CCD, and an automatic data-processing method for the side-scatter lidar, were developed using Visual C#. The results, which are compared with calibrated atmospheric aerosol lidar data, show that signals from the CCD side-scatter lidar are credible.
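
    Separating transmitter and receiver means each CCD pixel views a fixed elevation angle along the beam. Under a simple assumed geometry (vertical beam, horizontal baseline; not necessarily the instrument's actual layout), scattering height follows from the camera-to-beam baseline and that angle:

```python
import math

# Geometric sketch of CCD side-scatter ranging under a simple assumed
# geometry: a pixel viewing elevation angle theta toward a vertical beam a
# horizontal distance D away sees scattering from height z = D * tan(theta).
def scatter_height(baseline_m, elevation_deg):
    return baseline_m * math.tan(math.radians(elevation_deg))

D = 100.0   # camera-to-beam baseline [m] (assumed)
for ang in (5.0, 30.0, 60.0):
    print(f"{ang:4.0f} deg -> z = {scatter_height(D, ang):7.1f} m")
```

    Low elevation angles map to the near-ground heights that backscattering lidar cannot resolve, which is the motivation for the side-scatter arrangement.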

  15. Integrating Climate Change Resilience Features into the Incremental Refinement of an Existing Marine Park

    PubMed Central

    Beckley, Lynnath E.; Kobryn, Halina T.; Lombard, Amanda T.; Radford, Ben; Heyward, Andrew

    2016-01-01

    Marine protected area (MPA) designs are likely to require iterative refinement as new knowledge is gained. In particular, there is an increasing need to consider the effects of climate change, especially the ability of ecosystems to resist and/or recover from climate-related disturbances, within the MPA planning process. However, there has been limited research addressing the incorporation of climate change resilience into MPA design. This study used Marxan conservation planning software with fine-scale shallow water (<20 m) bathymetry and habitat maps, models of major benthic communities for deeper water, and comprehensive human use information from Ningaloo Marine Park in Western Australia to identify climate change resilience features to integrate into the incremental refinement of the marine park. The study assessed the representation of benthic habitats within the current marine park zones, identified priority areas of high resilience for inclusion within no-take zones and examined if any iterative refinements to the current no-take zones are necessary. Of the 65 habitat classes, 16 did not meet representation targets within the current no-take zones, most of which were in deeper offshore waters. These deeper areas also demonstrated the highest resilience values and, as such, Marxan outputs suggested minor increases to the current no-take zones in the deeper offshore areas. This work demonstrates that inclusion of fine-scale climate change resilience features within the design process for MPAs is feasible, and can be applied to future marine spatial planning practices globally. PMID:27529820
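
    The representation-target check described (16 of 65 habitat classes unmet) can be sketched as a simple gap analysis. Habitat names, areas and the target fraction below are hypothetical, not values from the Ningaloo dataset.

```python
# Gap-analysis sketch: which habitat classes fall short of a representation
# target inside no-take zones. All numbers are hypothetical.
habitat_area = {"reef_flat": 120.0, "lagoon_sand": 300.0, "deep_sponge": 80.0}  # km^2
area_in_no_take = {"reef_flat": 40.0, "lagoon_sand": 45.0, "deep_sponge": 4.0}  # km^2
target_fraction = 0.15          # e.g. 15% of each habitat inside no-take zones

unmet = [h for h, total in habitat_area.items()
         if area_in_no_take[h] / total < target_fraction]
print("habitat classes below target:", unmet)
```

    Marxan then searches for the cheapest zoning change that closes such gaps while honouring cost layers such as human use.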

  16. Impact Study of Metal Fasteners in Roofing Assemblies using Three-Dimensional Heat Transfer Analysis

    DOE PAGES

    Singh, Manan; Gulati, Rupesh; Ravi, Srinivasan; ...

    2016-11-29

Heat transfer analysis was performed on typical roofing assemblies using HEAT3, a three-dimensional heat transfer analysis software package. The heat transferred through the roofing assemblies considered is compared between two cases: without any steel fasteners and with steel fasteners. In the latter case, the metal roofing fasteners were arranged per Factory Mutual Global (FM Global) approvals in the field, perimeter, and corner zones of the roof. The temperature conditions used for the analysis represented summer and winter conditions for three separate climate zones (CZ): CZ2, represented by Orlando, FL; CZ3, represented by Atlanta, GA; and CZ6, represented by St. Paul, MN. In all climatic conditions, higher energy transfer was observed with an increasing number of metal fasteners, attributed to the high thermal conductivity of metals compared with the insulation and other materials used in the roofing assembly. This difference in heat loss was also quantified as a percentage change in the overall, or effective, insulation of the roofing assembly for a better understanding of the practical aspects. In addition, a comparison of 2D heat transfer analysis (using THERM software) and 3D analysis using HEAT3 is discussed.
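
    The effect of fastener density on overall heat loss can be approximated with a parallel-path hand calculation, a common simplification of the 3D analysis. The U-values, fastener densities and fastener cross-section below are illustrative assumptions, not values from the study.

```python
# Parallel-path estimate of how steel fasteners change the effective U-value
# of a roof assembly. All values are illustrative assumptions.
def effective_u(u_clear, u_fastener, fasteners_per_m2, fastener_area_m2):
    f = fasteners_per_m2 * fastener_area_m2      # area fraction through metal
    return f * u_fastener + (1.0 - f) * u_clear

u_clear = 0.25     # W/(m^2*K) through the insulated path (assumed)
u_metal = 5.0      # W/(m^2*K) through a fastener path (assumed)
for n in (4, 8, 16):                             # e.g. field / perimeter / corner densities
    u = effective_u(u_clear, u_metal, n, 3e-5)
    print(f"{n:2d} fasteners/m^2 -> U_eff = {u:.4f} W/(m^2*K)")
```

    Parallel-path estimates ignore lateral heat spreading around each fastener, which is precisely what the 3D HEAT3 analysis captures.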

  17. Impact Study of Metal Fasteners in Roofing Assemblies using Three-Dimensional Heat Transfer Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Manan; Gulati, Rupesh; Ravi, Srinivasan

Heat transfer analysis was performed on typical roofing assemblies using HEAT3, a three-dimensional heat transfer analysis software package. The heat transferred through the roofing assemblies considered is compared between two cases: without any steel fasteners and with steel fasteners. In the latter case, the metal roofing fasteners were arranged per Factory Mutual Global (FM Global) approvals in the field, perimeter, and corner zones of the roof. The temperature conditions used for the analysis represented summer and winter conditions for three separate climate zones (CZ): CZ2, represented by Orlando, FL; CZ3, represented by Atlanta, GA; and CZ6, represented by St. Paul, MN. In all climatic conditions, higher energy transfer was observed with an increasing number of metal fasteners, attributed to the high thermal conductivity of metals compared with the insulation and other materials used in the roofing assembly. This difference in heat loss was also quantified as a percentage change in the overall, or effective, insulation of the roofing assembly for a better understanding of the practical aspects. In addition, a comparison of 2D heat transfer analysis (using THERM software) and 3D analysis using HEAT3 is discussed.

  18. Deformation of a geo-medium with considering for internal self-balancing stresses

    NASA Astrophysics Data System (ADS)

    Lavrikov, S. V.; Revuzhenko, A. F.

    2016-11-01

Based on the general concept of rock as a medium with inner sources and sinks of energy, the authors consider an approach to mathematical modeling of a geo-medium that accounts for internal self-balancing stresses. The description of stresses and strains at the level of microstructural elements and the macrovolume of the medium uses methods of non-Archimedean analysis. The model allows the accumulation of elastic energy in the form of internal self-balancing stresses to be described. A finite element algorithm and a software program for solving plane boundary-value problems have been developed. Calculated data on rock specimen compression are given. It is shown that the behavior of plastic deformation zones depends on the pre-assigned initial microstresses.

  19. Batch settling curve registration via image data modeling.

    PubMed

    Derlon, Nicolas; Thürlimann, Christian; Dürrenmatt, David; Villez, Kris

    2017-05-01

To this day, obtaining a reliable characterization of sludge settling properties remains a challenging and time-consuming task. Without such assessments, however, optimal design and operation of secondary settling tanks is difficult and conservative approaches will remain necessary. With this study, we show that automated sludge blanket height registration and zone settling velocity estimation are possible through analysis of images taken during batch settling experiments. The experimental setup is particularly interesting for practical applications as it consists of off-the-shelf components only, requires no moving parts, and the software is released publicly. Furthermore, the proposed multivariate shape-constrained spline model for image analysis appears to be a promising method for reliable sludge blanket height profile registration. Copyright © 2017 Elsevier Ltd. All rights reserved.
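
    The zone settling velocity estimation step can be sketched as a straight-line fit to a blanket-height time series over the constant-rate settling phase (the study itself uses a multivariate shape-constrained spline; the data below are synthetic and illustrative).

```python
import numpy as np

# Zone settling velocity from a sludge-blanket-height time series: a linear
# least-squares fit over the (assumed) constant-rate settling phase.
t = np.array([0.0, 60.0, 120.0, 180.0, 240.0, 300.0])   # time [s]
h = np.array([1.00, 0.88, 0.76, 0.64, 0.52, 0.40])      # blanket height [m]
slope, intercept = np.polyfit(t, h, 1)
v_zs = -slope                                            # settling is downward
print(f"zone settling velocity: {v_zs * 3600:.2f} m/h")
```

    Automating this fit from images removes the manual reading of blanket heights that makes conventional settling tests so time-consuming.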

  20. Geological modeling of a fault zone in clay rocks at the Mont-Terri laboratory (Switzerland)

    NASA Astrophysics Data System (ADS)

    Kakurina, M.; Guglielmi, Y.; Nussbaum, C.; Valley, B.

    2016-12-01

Clay-rich formations are considered a natural barrier to the migration of radionuclides or fluids (water, hydrocarbons, CO2). However, little is known about the architecture of faults affecting clay formations because of their rapid alteration at the Earth's surface. The Mont Terri Underground Research Laboratory provides exceptional conditions to investigate an unweathered, perfectly exposed clay fault zone architecture and to conduct fault activation experiments that allow the conditions for the stability of such clay faults to be explored. Here we show first results from a detailed geological model of the Mont Terri Main Fault architecture, built with GoCad software from a detailed structural analysis of six fully cored and logged boreholes, 30 to 50 m long and 3 to 15 m apart, crossing the fault zone. These high-definition geological data were acquired within the Fault Slip (FS) experiment project, which consisted of fluid injections into different intervals within the fault using the SIMFIP probe to explore the conditions for the fault's mechanical and seismic stability. The Mont Terri Main Fault "core" consists of a thrust zone about 0.8 to 3 m wide that is bounded by two major fault planes. Between these planes, there is an assembly of distinct slickensided surfaces and various facies including scaly clays, fault gouge and fractured zones. Scaly clay, including S-C bands and microfolds, occurs in larger zones at the top and bottom of the Main Fault. A centimetre-thin layer of gouge, which is known to accommodate high-strain zones, runs along the upper fault zone boundary. The non-scaly part mainly consists of undeformed rock blocks bounded by slickensides. Such complexity, as well as the continuity of the two major surfaces, is hard to correlate between the different boreholes even with the high density of geological data within the relatively small volume of the experiment. This may indicate that poor strain localization occurred during faulting, offering some perspective on the potential for reactivation and leakage of faults affecting clay materials.

  1. Connectivity patterns of coastal fishes following different dispersal scenarios across a transboundary marine protected area (Bonifacio strait, NW Mediterranean)

    NASA Astrophysics Data System (ADS)

    Koeck, Barbara; Gérigny, Olivia; Durieux, Eric Dominique Henri; Coudray, Sylvain; Garsi, Laure-Hélène; Bisgambiglia, Paul-Antoine; Galgani, François; Agostini, Sylvia

    2015-03-01

The Strait of Bonifacio constitutes one of the rare transboundary Marine Protected Areas (MPAs) of the Mediterranean Sea (between Sardinia, Italy and Corsica, France). Based on the hypothesis that no-take zones produce more fish larvae than adjacent fished areas, we modeled the outcome of larvae released by coastal fishes inside the no-take zones of the MPA in order to: (1) characterize the dispersal patterns across the Strait of Bonifacio; (2) identify the main potential settlement areas; (3) quantify the connectivity and the larval supply from the MPAs to the surrounding areas. A high-resolution hydrodynamic model (MARS 3D, Corse 400 m) combined with an individual-based model (Ichthyop software) was used to model larval dispersal of fish under various scenarios (pelagic larval duration, PLD, and release depth) over the main spawning period (i.e. between April and September). Dispersal model outputs were then compared with those obtained from an ichthyoplankton sampling cruise performed in August 2012. There was a significant influence of PLD on the connectivity between coastal areas. The synchronization between spawning and hydrodynamic conditions appeared to be a determining factor in larval transport success. Biotic and abiotic parameters affecting the dispersal dynamics of fish larvae within the Strait of Bonifacio were identified, and synthesis maps were established as a tool for conservation planning.
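
    The individual-based dispersal approach can be illustrated by a minimal particle-tracking sketch: passive advection by a mean current plus a random walk standing in for turbulent mixing. The current, diffusivity and PLD values are illustrative assumptions, not MARS 3D/Ichthyop parameters.

```python
import numpy as np

# Minimal individual-based dispersal sketch: passive larvae advected by a
# uniform mean current plus a random walk standing in for turbulent mixing.
rng = np.random.default_rng(1)
n_particles, pld_days, dt = 500, 20.0, 0.1   # particles, pelagic larval duration, step [days]
u, v = 2.0, 0.5                              # mean current [km/day] (assumed)
kh = 1.0                                     # horizontal diffusivity [km^2/day] (assumed)

pos = np.zeros((n_particles, 2))             # all released at the spawning site (origin)
for _ in range(round(pld_days / dt)):
    pos += np.array([u, v]) * dt                                   # advection
    pos += rng.normal(scale=np.sqrt(2 * kh * dt), size=pos.shape)  # random walk
print("mean displacement [km]:", pos.mean(axis=0).round(1))
```

    Longer PLDs let the random-walk spread grow, which is the mechanistic reason PLD so strongly controls the connectivity between coastal areas.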

  2. The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access

    NASA Astrophysics Data System (ADS)

    Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.

    2011-12-01

The Virtual Watershed Observatory (VWO) is a concept whereby scientists, water managers, educators and the general public can create a virtual observatory from integrated hydrologic model results, national databases and historical or real-time observations via web services. In this paper, we propose a prototype for automated and virtualized web-services software using national data products for climate reanalysis, soils, geology, terrain and land cover. The VWO has the broad purpose of making accessible water resource simulations, real-time data assimilation, calibration and archival at the scale of HUC 12 (Hydrologic Unit Code) watersheds anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper describes cyberinfrastructure tools and a workflow that attempt to resolve the problem of model-data accessibility and scalability, such that individuals, research teams, managers and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves toward data- and model-intensive computation for continental-scale water resource predictions.

  3. Dynamic rupture models of subduction zone earthquakes with off-fault plasticity

    NASA Astrophysics Data System (ADS)

    Wollherr, S.; van Zelst, I.; Gabriel, A. A.; van Dinther, Y.; Madden, E. H.; Ulrich, T.

    2017-12-01

Modeling tsunami genesis based on purely elastic seafloor displacement typically underpredicts tsunami sizes. Dynamic rupture simulations make it possible to analyse whether plastic energy dissipation is a missing rheological component by capturing the complex interplay of the rupture front, emitted seismic waves and the free surface in the accretionary prism. Strike-slip models with off-fault plasticity suggest decreasing rupture speed and extensive plastic yielding, mainly at shallow depths. For simplified subduction geometries, inelastic deformation on the verge of Coulomb failure may enhance vertical displacement, which in turn favors the generation of large tsunamis (Ma, 2012). However, constraining appropriate initial conditions in terms of fault geometry and initial fault stress and strength remains challenging. Here, we present dynamic rupture models of subduction zones constrained by long-term seismo-thermo-mechanical (STM) modeling, without any a priori assumption about the regions of failure. The STM model provides self-consistent slab geometries, as well as initial stress and strength conditions that evolve in response to tectonic stresses, temperature, gravity, plasticity and pressure (van Dinther et al. 2013). Coseismic slip and coupled seismic wave propagation are modelled using the software package SeisSol (www.seissol.org), which is suited for complex fault zone structures and topography/bathymetry. SeisSol allows for local time-stepping, which drastically reduces the time-to-solution (Uphoff et al., 2017). This is particularly important in large-scale scenarios resolving small-scale features, such as the shallow angle between the megathrust fault and the free surface. Our dynamic rupture model uses a Drucker-Prager plastic yield criterion and accounts for thermal pressurization around the fault, mimicking the effect of pore-pressure changes due to frictional heating. We first analyze the influence of this rheology on rupture dynamics and tsunamigenic properties, i.e. seafloor displacement, in 2D. Finally, we use the same rheology in a large-scale 3D scenario of the 2004 Sumatra earthquake to shed light on the source process that caused the subsequent devastating tsunami.
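
    The Drucker-Prager yield check at the heart of the off-fault plasticity can be sketched as follows. The particular fit of the yield-surface parameters from cohesion and a friction coefficient is one common choice assumed here for illustration, not necessarily the one used in SeisSol.

```python
import numpy as np

def drucker_prager_yield(stress, cohesion, friction_coeff):
    """Drucker-Prager yield function F = sqrt(J2) + alpha*I1 - k.
    F >= 0 indicates plastic yielding; tension is positive here.
    alpha and k from cohesion/friction use one common fit (an assumption)."""
    I1 = np.trace(stress)
    s = stress - I1 / 3.0 * np.eye(3)      # deviatoric stress
    J2 = 0.5 * np.tensordot(s, s)          # second deviatoric invariant
    alpha = friction_coeff / np.sqrt(3.0)
    k = cohesion * np.sqrt(3.0)
    return np.sqrt(J2) + alpha * I1 - k

sigma = np.diag([-10e6, -30e6, -60e6])     # Pa, a compressive stress state
print("F =", drucker_prager_yield(sigma, 5e6, 0.6))   # negative: still elastic
```

    In a dynamic rupture code this check is evaluated pointwise each time step; where F reaches zero, stresses are returned to the yield surface and the dissipated energy is what elastic-only models miss.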

  4. Gold grade distribution within an epithermal quartz vein system, Kestanelik, NW Turkey: implications for gold exploration

    NASA Astrophysics Data System (ADS)

    Gulyuz, Nilay; Shipton, Zoe; Gulyuz, Erhan; Lord, Richard; Kaymakci, Nuretdin; Kuscu, İlkay

    2017-04-01

Vein-hosted gold deposits contribute a large part of global gold production. Discovery of these deposits typically involves drilling hundreds of holes, collecting thousands of soil and rock samples, and some geophysical surveys, all of which are expensive and time consuming. Understanding the structures hosting the veins and the variations in gold concentrations within the veins is crucial for designing a more economical exploration program. The main aim of this study is to investigate the gold grade distribution in the mineralized quartz veins of a well-exposed epithermal gold deposit hosted by Paleozoic schist and Eocene quartz-feldspar-hornblende porphyry in Lapseki, NW Turkey. We constructed the 3D architecture of the vein surfaces by mapping their outcrop geometries using a highly sensitive Trimble GPS and by collecting detailed field data, well logs and geochemistry data from 396 drill holes (255 diamond-cut and 141 reverse-circulation holes). Modelling was performed in MOVE Structural Modelling and Analysis software, granted through Midland Valley's Academic Software Initiative, and in the GIS applications Global Mapper and Esri ArcGIS. We envisaged that, as fluid entering the conduit ascends, a sudden thickness increase in the conduit would lead to a drop in fluid pressure, causing boiling (the most dominant gold precipitation mechanism) and associated gold precipitation. Regression analysis was performed between the orthogonal thickness values and gold grades of each vein, and statistical analyses were performed to see whether gold is concentrated at specific structural positions along dip. Gold grades in the alteration zones were compared to those in the adjacent veins to understand the degree of mineralization in the alteration zones. A possible correlation between host rock type and gold grades in the veins was also examined. These analyses indicated that gold grades are elevated in the adjacent alteration zones where high gold grades exist in the veins. Schist-hosted veins host the majority of the gold mineralization (94.39%). While there is almost no correlation between true vein thickness and gold grade, 77.65% of high gold grades are located where the veins bend along dip. These results suggest that multiple gold precipitation mechanisms may have been active, and that the boiling mechanism responsible for gold precipitation along the structural pathways was more effective than the fluid-rock interaction or throttling mechanisms that would precipitate gold in the alteration zones adjacent to the pathways at Kestanelik. In addition, specific structural locations such as vein bends are favorable for gold precipitation. This study emphasizes that the structural architecture of the veins is one of the key controls on the location of high gold grades. Adding structural data collection, and the mapping of specific structural locations such as bends, to an exploration program could allow the key locations of high gold grade to be identified faster and further drilling and assays to be focused.
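
    The thickness-versus-grade regression can be sketched as below. The data are synthetic and constructed to be independent, so the fit should mirror the "almost no correlation" outcome reported, not reproduce the Kestanelik assays.

```python
import numpy as np

# Regression sketch for the thickness-vs-grade test on synthetic,
# deliberately independent data.
rng = np.random.default_rng(2)
thickness = rng.uniform(0.1, 2.0, size=100)               # orthogonal thickness [m]
grade = rng.lognormal(mean=0.0, sigma=1.0, size=100)      # Au grade [g/t], independent
slope, intercept = np.polyfit(thickness, grade, 1)
r = np.corrcoef(thickness, grade)[0, 1]
print(f"r = {r:.2f}, R^2 = {r * r:.2f}")
```

    A near-zero r with a strong spatial association of grade with vein bends is exactly the pattern that points toward structurally controlled boiling rather than thickness-controlled precipitation.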

  5. Probabilistic Seismic Hazard Assessment of the Chiapas State (SE Mexico)

    NASA Astrophysics Data System (ADS)

    Rodríguez-Lomelí, Anabel Georgina; García-Mayordomo, Julián

    2015-04-01

The state of Chiapas, in southeastern Mexico, is a very active seismic region due to the interaction of three tectonic plates: North America, Cocos and Caribbean. We present a probabilistic seismic hazard assessment (PSHA) specifically performed to evaluate seismic hazard in the state of Chiapas. The PSHA was based on a composite seismic catalogue homogenized to Mw and used a logic tree procedure to account for different seismogenic source models and ground motion prediction equations (GMPEs). The results were obtained in terms of peak ground acceleration as well as spectral accelerations. The earthquake catalogue was compiled from the International Seismological Centre and the Servicio Sismológico Nacional de México. Two different seismogenic source zone (SSZ) models were devised based on a review of the tectonics of the region and the available geomorphological and geological maps. The SSZ were finally defined by the analysis of geophysical data, resulting in two main SSZ models. The Gutenberg-Richter parameters for each SSZ were calculated from the declustered and homogenized catalogue, while the maximum expected earthquake was assessed from both the catalogue and geological criteria. Several worldwide and regional GMPEs for subduction and crustal zones were reviewed. For each SSZ model we considered four possible combinations of GMPEs. Finally, hazard was calculated in terms of PGA and SA for 500-, 1000-, and 2500-year return periods for each branch of the logic tree using the CRISIS2007 software. The final hazard maps represent the mean values obtained from the two seismogenic and four attenuation models considered in the logic tree. For the three return periods analyzed, the maps locate the most hazardous areas in the Chiapas Central Pacific Zone, the Pacific Coastal Plain and the Motagua and Polochic Fault Zone; intermediate hazard values occur in the Chiapas Batholith Zone and in the Strike-Slip Faults Province.
The hazard decreases towards the northeast across the Reverse Faults Province and up to the Yucatan Platform, where the lowest values are reached. We also produced uniform hazard spectra (UHS) for the three main cities of Chiapas. Tapachula presents the highest spectral accelerations, while Tuxtla Gutierrez and San Cristobal de las Casas show similar values. We conclude that seismic hazard in Chiapas is chiefly controlled by the subduction of the Cocos plate beneath the North America and Caribbean plates, which makes the coastal areas the most hazardous. Additionally, the Motagua and Polochic Fault Zones are also important, increasing the hazard particularly in southeastern Chiapas.
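The catalogue-based step described above, estimating Gutenberg-Richter parameters per source zone, is commonly done with the Aki/Utsu maximum-likelihood formula for the b-value. A minimal sketch; the catalogue values, completeness magnitude and bin width below are hypothetical, and this is not the study's CRISIS2007 workflow:

```python
import math

def gr_b_value(mags, m_c, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value of the Gutenberg-Richter law
    log10 N = a - b*M, for a catalogue complete above magnitude m_c.
    dm is the magnitude binning width (Utsu's dm/2 correction)."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Hypothetical declustered catalogue homogenized to Mw (illustrative only):
catalogue = [4.1, 4.3, 4.2, 4.8, 5.1, 4.5, 4.4, 6.0, 4.6, 4.9, 5.3, 4.2]
b = gr_b_value(catalogue, m_c=4.0)
# The a-value then follows from the count N(M >= m_c) over the catalogue span.
```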

  6. Conditions of Fissuring in a Pumped-Faulted Aquifer System

    NASA Astrophysics Data System (ADS)

    Hernandez-Marin, M.; Burbey, T. J.

    2007-12-01

Earth fissuring associated with subsidence from groundwater pumping is problematic in many heavily pumped arid-zone basins such as Las Vegas Valley. Long-term pumping at rates considerably greater than the natural recharge rate has stressed the heterogeneous aquifer system, resulting in a complex stress-strain regime. A rigorous artificial recharge program coupled with increased surface-water importation has allowed water levels to recover appreciably, which has led to surface rebound in some localities. Nonetheless, new fissures continue to appear, particularly near basin-fill faults that behave as barriers to subsidence bowls. The purpose of this research is to develop a series of computational models to better understand the influence that structure (faults), pumping, and hydrostratigraphy have on the generation and propagation of fissures. The hydrostratigraphy of Las Vegas Valley consists of aquifers, aquitards and a relatively dry vadose zone that may be as thick as 100 m in much of the valley. Quaternary faults are typically depicted as scarps resulting from pre-pumping extensional tectonic events and are probably not responsible for the observed strain. The models developed to simulate the stress-strain and deformation processes in the faulted, pumped aquifer-aquitard system of Las Vegas use the ABAQUS CAE (Complete ABAQUS Environment) software system. ABAQUS is a sophisticated engineering-industry finite-element modeling package capable of simulating the complex fault-fissure system described here. A brittle failure criterion based on the tensile strength of the materials and the acting stresses (from previous models) is being used to understand how and where fissures are likely to form. Hypothetical simulations include the role that faults and the vadose zone may play in fissure formation.

  7. Assimilation of ambient seismic noise in hydrological models allows estimation of hydraulic conductivity in unsaturated media

    NASA Astrophysics Data System (ADS)

    Fores, B.; Champollion, C.; Mainsant, G.; Fort, A.; Albaric, J.

    2016-12-01

Karstic hydrosystems represent one of the main water resources in the Mediterranean area but are challenging for geophysical methods. The GEK (Geodesy in Karstic Environment) observatory was set up in 2011 to study the unsaturated zone of a karstic system in the south of France. The unsaturated zone (the epikarst) is thick, up to 100 m on the site. Since 2011, gravity, rainfall and evapotranspiration have been monitored. Together, they allow precise estimation of the global water storage changes but lack depth resolution. Surface-wave velocity variations, obtained from ambient seismic noise monitoring, are used here to overcome this limitation. Indeed, velocities depend on saturation, and the depths where changes occur can be determined because surface waves are dispersive. From October 2014 to November 2015, two seismometers recorded noise. Velocity changes in a narrow frequency band (6-8 Hz) have shown a clear annual cycle. The velocity minimum lags precipitation by several months, which is consistent with slow infiltration and a maximum sensitivity at -40 m for these frequencies and this site. Models were built with the Hydrus-1D software, which allows modeling of 1D flow in variably saturated media. Using stochastic sampling, we searched for the subsurface parameters that best reproduce the different observations (gravity, evapotranspiration and rainfall, and velocity changes). We show that velocity changes clearly constrain the hydraulic conductivity of the medium. Ambient seismic noise is therefore a promising method for studying unsaturated zones that are too deep or too heterogeneous for classical methods.

  8. Application of FE software Elmer to the modeling of crustal-scale processes

    NASA Astrophysics Data System (ADS)

    Maierová, Petra; Guy, Alexandra; Lexa, Ondrej; Cadek, Ondrej

    2010-05-01

We extended Elmer (the open source finite element software for multiphysical problems, http://www.csc.fi/english/pages/elmer) with user-written procedures for two-dimensional modeling of crustal-scale processes. The standard version of Elmer is an appropriate tool for modeling thermomechanical convection with non-linear viscous rheology. In geophysics, it might be suitable for some types of mantle convection modeling. Unlike the mantle, the crust is very heterogeneous. It consists of materials with distinct rheological properties that are subject to highly varied conditions: low pressure and temperature near the surface of the Earth and relatively high pressure and temperature at depths of several tens of kilometers. Moreover, deformation in the upper crust is mostly brittle and strain is concentrated into narrow shear zones and thrusts. In order to simulate the brittle behavior of the crust, we implemented a pressure-dependent visco-plastic rheology. Material heterogeneity and chemical convection are implemented in terms of active markers. Another special feature of the crust, the moving free surface, is already included in Elmer by means of a moving computational grid, and erosion can easily be added in this scheme. We tested the properties of our formulation of plastic flow in several numerical experiments simulating the deformation of material under compressional and extensional stresses. In the first step, we examined the angles of shear zones that form in a plastically deforming material for different material parameters and grid resolutions. A more complex setting of "sandbox-type" experiments containing heterogeneous material, strain softening and boundary friction was considered as the next test case. To illustrate the abilities of the extended Elmer software in crustal deformation studies, we present two models of geological processes: diapirism of the lower crust and a channel flow forced by indentation.
Both processes are assumed to have taken place during the late stage of the Variscan orogeny in the area of the Bohemian Massif, and they are well documented in the geological record. Extensive geological data are thus available and can be compared with the results of our numerical simulations. Firstly, we model the indentation of a stiff block into a thick and hot crustal root and the consequent flow of the orogenic crust. For the development of the flow, free-surface deformation and erosion are essential, and the importance of plastic deformation varies with the thermal structure of the domain. Secondly, we show the influence of the thermal, density and viscosity structure of the crust on the time evolution and final geometry of diapirs. The importance of the strain-rate dependence of viscosity, which is neglected in some numerical models, is discussed.
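A pressure-dependent visco-plastic rheology of the kind described above is commonly implemented by capping the viscous stress at a Drucker-Prager-style yield stress and reducing the effective viscosity when the cap is hit. A minimal sketch; the cohesion, friction coefficient and test values are illustrative assumptions, not the authors' Elmer procedures:

```python
def effective_viscosity(eta_viscous, pressure, strain_rate_ii,
                        cohesion=2.0e7, friction_coeff=0.6):
    """Cap the viscous stress at a pressure-dependent yield stress.

    Drucker-Prager-style yield: tau_y = C + mu * P.  If the viscous
    stress 2*eta*e_II exceeds tau_y, the viscosity is reduced so the
    stress sits on the yield surface (visco-plastic regularisation)."""
    tau_yield = cohesion + friction_coeff * pressure          # Pa
    tau_viscous = 2.0 * eta_viscous * strain_rate_ii          # Pa
    if tau_viscous <= tau_yield:
        return eta_viscous                       # purely viscous flow
    return tau_yield / (2.0 * strain_rate_ii)    # plastic yielding

# Shallow, cold, stiff crust: low pressure -> yields plastically:
eta_shallow = effective_viscosity(1e25, pressure=1e8, strain_rate_ii=1e-14)
# Deep, hot, weak crust: high pressure raises the yield stress -> viscous:
eta_deep = effective_viscosity(1e21, pressure=1e9, strain_rate_ii=1e-14)
```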

  9. Root zone water quality model (RZWQM2): Model use, calibration and validation

    USGS Publications Warehouse

    Ma, Liwang; Ahuja, Lajpat; Nolan, B.T.; Malone, Robert; Trout, Thomas; Qi, Z.

    2012-01-01

    The Root Zone Water Quality Model (RZWQM2) has been used widely for simulating agricultural management effects on crop production and soil and water quality. Although it is a one-dimensional model, it has many desirable features for the modeling community. This article outlines the principles of calibrating the model component by component with one or more datasets and validating the model with independent datasets. Users should consult the RZWQM2 user manual distributed along with the model and a more detailed protocol on how to calibrate RZWQM2 provided in a book chapter. Two case studies (or examples) are included in this article. One is from an irrigated maize study in Colorado to illustrate the use of field and laboratory measured soil hydraulic properties on simulated soil water and crop production. It also demonstrates the interaction between soil and plant parameters in simulated plant responses to water stresses. The other is from a maize-soybean rotation study in Iowa to show a manual calibration of the model for crop yield, soil water, and N leaching in tile-drained soils. Although the commonly used trial-and-error calibration method works well for experienced users, as shown in the second example, an automated calibration procedure is more objective, as shown in the first example. Furthermore, the incorporation of the Parameter Estimation Software (PEST) into RZWQM2 made the calibration of the model more efficient than a grid (ordered) search of model parameters. In addition, PEST provides sensitivity and uncertainty analyses that should help users in selecting the right parameters to calibrate.

  10. The Site-Scale Saturated Zone Flow Model for Yucca Mountain

    NASA Astrophysics Data System (ADS)

    Al-Aziz, E.; James, S. C.; Arnold, B. W.; Zyvoloski, G. A.

    2006-12-01

This presentation provides a reinterpreted conceptual model of the Yucca Mountain site-scale flow system, subject to all quality assurance procedures. The results are based on a numerical model of the site-scale saturated zone beneath Yucca Mountain, which is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. This effort started from the ground up with a revised and updated hydrogeologic framework model, which incorporates the latest lithology data, and increased grid resolution that better resolves the hydrogeologic framework throughout the model domain. In addition, faults are much better represented using the 250 × 250 m grid spacing (compared to the previous model's 500 × 500 m spacing). Data collected since the previous model calibration effort have been included; they comprise all Nye County water-level data through Phase IV of the Early Warning Drilling Program. Target boundary fluxes are derived from the newest (2004) Death Valley Regional Flow System model from the U.S. Geological Survey. A consistent weighting scheme assigns importance to each measured water-level datum and each boundary flux extracted from the regional model. The numerical model is calibrated by matching these weighted water-level measurements and boundary fluxes using parameter estimation techniques, along with more informal comparisons of the model to hydrologic and geochemical information. The model software (hydrologic simulation code FEHM v2.24 and parameter estimation software PEST v5.5) and model setup facilitate efficient calibration of multiple conceptual models. Analyses evaluate the impact of these updates and additional data on the modeled potentiometric surface and the flowpaths emanating from below the repository.
After examining the heads and permeabilities obtained from the calibrated models, we present particle pathways from the proposed repository and compare them to those from the previous model calibration. Specific discharge at a point 5 km from the repository is also examined and found to be within acceptable uncertainty. The results show that the updated model yields a calibration with smaller residuals than the previous model revision while ensuring that flowpaths follow measured gradients and paths derived from hydrochemical analyses. This work was supported by the Yucca Mountain Site Characterization Office as part of the Civilian Radioactive Waste Management Program, which is managed by the U.S. Department of Energy, Yucca Mountain Site Characterization Project. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.

  11. Neural network to diagnose lining condition

    NASA Astrophysics Data System (ADS)

    Yemelyanov, V. A.; Yemelyanova, N. Y.; Nedelkin, A. A.; Zarudnaya, M. V.

    2018-03-01

The paper presents data on the problem of diagnosing lining condition at iron and steel works. The authors describe the structure of a neural network, and the specialized software designed and developed to implement it, for determining lining burnout zones. Simulation results for the proposed neural networks are presented; the authors note their low learning and classification errors.

  12. Analysis of the influence of the interlayer staggered zone in the basalt of Jinsha River Basin on the main buildings

    NASA Astrophysics Data System (ADS)

    Guo, Qiaona; Huang, Jiangwei

    2018-02-01

In this paper, the finite element software FEFLOW is used to simulate the seepage field of interlayer staggered zone C2 in the basalt of the Jinsha River Basin, and the influence of zone C2 on the main buildings is analyzed. Taking into account the waterproofing performance of the current anti-seepage curtain design, the seepage field in zone C2 is examined under different design schemes, and an optimal anti-seepage curtain design is proposed. The results show that design scheme four effectively reduces the head and hydraulic gradient in the underground powerhouse area and improves the groundwater seepage field in the plant area.

  13. Numerical study of wave effects on groundwater flow and solute transport in a laboratory beach.

    PubMed

    Geng, Xiaolong; Boufadel, Michel C; Xia, Yuqiang; Li, Hailong; Zhao, Lin; Jackson, Nancy L; Miller, Richard S

    2014-09-01

A numerical study was undertaken to investigate the effects of waves on groundwater flow and associated inland-released solute transport, based on tracer experiments in a laboratory beach. The MARUN model was used to simulate density-dependent groundwater flow and subsurface solute transport in the saturated and unsaturated regions of the beach subjected to waves. The Computational Fluid Dynamics (CFD) software Fluent was used to simulate waves, which provided the seaward boundary condition for MARUN. A no-wave case was also simulated for comparison. Simulation results matched the observed water table and concentration at numerous locations. The results revealed that waves generated seawater-groundwater circulations in the swash and surf zones of the beach, which induced a large seawater-groundwater exchange across the beach face. In comparison to the no-wave case, waves significantly increased the residence time and spreading of inland-applied solutes in the beach. Waves also altered solute pathways and shifted the solute discharge zone further seaward. Residence Time Maps (RTM) revealed that the wave-induced residence time of the inland-applied solutes was largest near the solute exit zone to the sea. Sensitivity analyses suggested that changes in the permeability of the beach altered solute transport properties in a nonlinear way. Due to the slow movement of solutes in the unsaturated zone, the solute mass in the unsaturated zone, which reached up to 10% of the total mass in some cases, constituted a continuous slow release of solutes to the saturated zone of the beach. This release mechanism was not addressed in prior studies. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. The effects of climate change on heating energy consumption of office buildings in different climate zones in China

    NASA Astrophysics Data System (ADS)

    Meng, Fanchao; Li, Mingcai; Cao, Jingfu; Li, Ji; Xiong, Mingming; Feng, Xiaomei; Ren, Guoyu

    2017-06-01

Climate plays an important role in heating energy consumption owing to the direct relationship between space heating and changes in meteorological conditions. To quantify the impact, the Transient System Simulation Program software was used to simulate the heating loads of office buildings in Harbin, Tianjin, and Shanghai, representing three major climate zones (i.e., severe cold, cold, and hot summer and cold winter climate zones) in China during 1961-2010. Stepwise multiple linear regression was performed to determine the key climatic parameters influencing heating energy consumption. The results showed that dry bulb temperature (DBT) is the dominant climatic parameter affecting building heating loads in all three climate zones across China during the heating period at daily, monthly, and yearly scales (R2 ≥ 0.86). With the continuous warming climate in winter over the past 50 years, heating loads decreased by 14.2, 7.2, and 7.1 W/m2 in Harbin, Tianjin, and Shanghai, respectively, indicating that the decreasing rate is more apparent in the severe cold climate zone. When the DBT increases by 1 °C, the heating loads decrease by 253.1 W/m2 in Harbin, 177.2 W/m2 in Tianjin, and 126.4 W/m2 in Shanghai. These results suggest that heating energy consumption can be well predicted by the regression models at different temporal scales in different climate conditions, owing to the high determination coefficients. In addition, a greater decrease in heating energy consumption in the northern severe cold and cold climate zones may efficiently promote energy saving in these areas of high heating energy consumption. In particular, the likely future increase in temperatures should be considered in improving building energy efficiency.
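At its core, the regression step described above fits heating load against dry-bulb temperature by least squares. A minimal single-predictor sketch; the daily values below are invented for illustration and are not the study's data:

```python
def ols_fit(x, y):
    """Ordinary least-squares fit of y = intercept + slope * x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# Hypothetical daily means: dry-bulb temperature (degC) vs heating load (W/m2)
dbt = [-20.0, -15.0, -10.0, -5.0, 0.0, 5.0]
load = [5900.0, 4700.0, 3400.0, 2200.0, 1000.0, 0.0]
intercept, slope = ols_fit(dbt, load)
# A negative slope mirrors the paper's finding: warmer DBT, lower load.
```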

  15. Propagation of uncertainties for an evaluation of the Azores-Gibraltar Fracture Zone tsunamigenic potential

    NASA Astrophysics Data System (ADS)

    Antoshchenkova, Ekaterina; Imbert, David; Richet, Yann; Bardet, Lise; Duluc, Claire-Marie; Rebour, Vincent; Gailler, Audrey; Hébert, Hélène

    2016-04-01

The aim of this study is to evaluate the tsunamigenic potential of the Azores-Gibraltar Fracture Zone (AGFZ). This work is part of the French project TANDEM (Tsunamis in the Atlantic and English ChaNnel: Definition of the Effects through numerical Modeling; www-tandem.cea.fr), in which special attention is paid to the French Atlantic coasts. Structurally, the AGFZ region is complex and not well understood. However, many of its faults produce earthquakes with significant vertical slip, of a type that can result in tsunami. We use the major tsunami event of the AGFZ to obtain a regional estimate of the tsunamigenic potential of this zone. The major reported event for this zone is the 1755 Lisbon event, for which there are large uncertainties concerning the source location and focal mechanism. Hence, a simple deterministic approach is not sufficient to cover, on the one hand, the whole AGFZ with its geological complexity and, on the other hand, the lack of information concerning the 1755 Lisbon tsunami. The parametric modeling environment Promethée (promethee.irsn.org/doku.php) was therefore coupled to tsunami simulation software based on the shallow water equations in order to propagate uncertainties. Such a statistical point of view allows us to work with multiple hypotheses simultaneously. We introduce the seismic source parameters in the form of distributions, yielding a database of thousands of tsunami scenarios and tsunami wave height distributions. Exploring this database, we present preliminary results for France: tsunami wave heights (within one standard deviation of the mean) can be about 0.5-1 m for the Atlantic coast and approach 0.3 m for the English Channel.
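The parametric propagation of uncertainties described above amounts to a Monte Carlo loop: draw source parameters from assumed distributions, run each scenario through the forward model, and summarize the resulting wave-height distribution. In the sketch below a toy surrogate stands in for the shallow-water solver, and every distribution and coefficient is invented for illustration; none of it reproduces the Promethée/TANDEM chain:

```python
import math
import random
import statistics

random.seed(42)

def surrogate_wave_height(slip_m, depth_km, dip_deg):
    """Toy surrogate standing in for the shallow-water solver: coastal
    amplitude grows with slip and with shallower, steeper sources.
    Purely illustrative scaling, not a physical model."""
    return (0.15 * slip_m * math.sin(math.radians(dip_deg))
            * math.exp(-depth_km / 30.0))

# Source parameters drawn from assumed (invented) distributions:
heights = []
for _ in range(5000):
    slip = random.lognormvariate(1.6, 0.4)    # slip in metres
    depth = random.uniform(5.0, 40.0)         # source depth in km
    dip = random.uniform(20.0, 60.0)          # fault dip in degrees
    heights.append(surrogate_wave_height(slip, depth, dip))

mean_h = statistics.mean(heights)             # scenario-base mean
sd_h = statistics.pstdev(heights)             # scenario-base spread
```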

  16. Generating a global soil evaporation dataset using SMAP soil moisture data to estimate components of the surface water balance

    NASA Astrophysics Data System (ADS)

    Carbone, E.; Small, E. E.; Badger, A.; Livneh, B.

    2016-12-01

Evapotranspiration (ET) is fundamental to the water, energy and carbon cycles. However, our ability to measure ET and partition the total flux into transpiration and evaporation from soil is limited. This project aims to generate a global, observationally based soil evaporation dataset (E-SMAP), using SMAP surface soil moisture data in conjunction with models and auxiliary observations to observe or estimate each component of the surface water balance. E-SMAP will enable a better understanding of water balance processes and contribute to forecasts of water resource availability. Here we focus on the flux between the soil surface and root zone layers (qbot), which dictates the proportion of water that is available for soil evaporation. Any water that moves from the surface layer to the root zone contributes to transpiration or groundwater recharge. The magnitude and direction of qbot are driven by gravity and the gradient in matric potential. We use a highly discretized Richards-equation-type model (e.g., the Hydrus-1D software) with meteorological forcing from the North American Land Data Assimilation System (NLDAS) to estimate qbot. We verify the simulations using SMAP L4 surface and root zone soil moisture data. These data are well suited for evaluating qbot because they represent the most advanced estimate of the surface-to-root-zone soil moisture gradient at the global scale. Results are compared with similar calculations using NLDAS and in situ soil moisture data. Preliminary calculations show that the greatest variability between qbot determined from NLDAS, in situ and SMAP data occurs directly after precipitation events. At these times, uncertainties in qbot calculations significantly affect E-SMAP estimates.
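The statement that qbot is driven by gravity and the matric-potential gradient is the Darcy-Buckingham law; a minimal sketch with a van Genuchten-Mualem conductivity curve (the retention parameters, saturated conductivity and layer spacing are hypothetical, and this is not the Hydrus-1D implementation):

```python
import math

def van_genuchten_k(psi_cm, k_sat, alpha=0.02, n=1.5):
    """Unsaturated hydraulic conductivity from the van Genuchten-Mualem
    model; psi_cm is the matric potential head in cm (negative when
    unsaturated).  alpha and n are hypothetical retention parameters."""
    if psi_cm >= 0.0:
        return k_sat
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(psi_cm)) ** n) ** (-m)   # effective saturation
    return k_sat * math.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

def qbot(psi_surface, psi_rootzone, dz_cm, k_sat=10.0):
    """Darcy-Buckingham flux between the surface and root-zone layers.
    With z positive downward, q = K(psi) * ((psi_top - psi_bottom)/dz + 1);
    the +1 is the gravity term, and positive q is downward flow."""
    psi_mid = 0.5 * (psi_surface + psi_rootzone)      # midpoint conductivity
    return van_genuchten_k(psi_mid, k_sat) * (
        (psi_surface - psi_rootzone) / dz_cm + 1.0)

# Wet surface layer over a drier root zone, e.g. just after rain:
q_down = qbot(psi_surface=-50.0, psi_rootzone=-300.0, dz_cm=25.0)
```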

  17. Electrical resistivity tomography to quantify in situ liquid content in a full-scale dry anaerobic digestion reactor.

    PubMed

    André, L; Lamy, E; Lutz, P; Pernier, M; Lespinard, O; Pauss, A; Ribeiro, T

    2016-02-01

The electrical resistivity tomography (ERT) method is a non-intrusive method widely used in landfills to detect and locate liquid content. An experimental set-up was deployed on a dry batch anaerobic digestion reactor to investigate liquid distribution during the process and to map the spatial distribution of inoculum. Two electrode arrays were used: pole-dipole and gradient arrays. A technical adaptation of the ERT method was necessary. Measured resistivity data were inverted and modeled with the RES2DINV software to obtain resistivity sections. Continuous calibration along the resistivity sections, involving sampling and physicochemical analysis, was necessary to interpret the data. Samples were analyzed for both biochemical methane potential and fiber content. Correlations were established between the reactor preparation protocol, resistivity values, liquid content, methane potential and fiber content, delineating liquid distribution, high-methane-potential zones and degradation zones. The ERT method proved highly relevant for monitoring and optimizing the dry batch anaerobic digestion process. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitudes M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions: the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of earthquake occurrence for different elapsed times using the three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and to plot the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
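The fit-then-test procedure described above can be sketched without Easyfit: estimate the two-parameter Weibull by maximum likelihood and score the fit with the K-S statistic D = sup|F_empirical - F_model|. The interevent times below are hypothetical, and the damped fixed-point iteration is a standard textbook scheme rather than the authors' exact method:

```python
import math

def weibull_mle(times, iters=500):
    """Two-parameter Weibull MLE: damped fixed-point iteration for the
    shape k, then the closed-form scale lam given k."""
    n = len(times)
    logs = [math.log(t) for t in times]
    mean_log = sum(logs) / n
    k = 1.0
    for _ in range(iters):
        tk = [t ** k for t in times]
        s_tk = sum(tk)
        s_tk_log = sum(w * lt for w, lt in zip(tk, logs))
        k_new = 1.0 / (s_tk_log / s_tk - mean_log)
        k = 0.5 * (k + k_new)          # damping for stable convergence
    lam = (sum(t ** k for t in times) / n) ** (1.0 / k)
    return k, lam

def weibull_cdf(t, k, lam):
    return 1.0 - math.exp(-((t / lam) ** k))

def ks_statistic(times, cdf):
    """Kolmogorov-Smirnov D = sup |F_empirical - F_model|."""
    xs = sorted(times)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = cdf(x)
        d = max(d, abs(i / n - f), abs(f - (i - 1) / n))
    return d

# Hypothetical interevent times (years) between M >= 6.0 earthquakes:
interevent = [3.2, 7.5, 1.1, 12.4, 5.6, 8.3, 2.7, 15.0, 4.4, 6.1]
k_hat, lam_hat = weibull_mle(interevent)
d_stat = ks_statistic(interevent, lambda t: weibull_cdf(t, k_hat, lam_hat))
```

A smaller D (compared against the K-S critical value for the sample size) indicates a better-fitting distribution, which is how the candidate models are ranked.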

  19. Hydrogeochemical processes and geochemical modeling in a coastal aquifer: Case study of the Marathon coastal plain, Greece

    NASA Astrophysics Data System (ADS)

    Papazotos, Panagiotis; Koumantakis, Ioannis; Kallioras, Andreas; Vasileiou, Eleni; Perraki, Maria

    2017-04-01

    Determining the hydrogeochemical processes has always been a challenge for scientists. The aim of this work is the study of the principal hydrogeochemical processes controlling groundwater quality in the Marathon coastal plain, Greece, with emphasis on the origin of the solutes. Various physicochemical parameters and major ions of twenty-five groundwater samples were analyzed. The hydrogeochemical data of the groundwater were studied in order to determine the major factors controlling the chemical composition and hydrogeochemical evolution. In the Marathon coastal plain, three different zones of the alluvial granular aquifer system have been detected, considering the geochemical processes and recharge, which affect its hydrochemical characteristics. The alluvial granular aquifer system is divided eastwards into three zones: a) the natural recharge zone, b) the reverse ion exchange zone and c) the seawater diffusion zone. Cl- is the dominant anion and Na+ and Ca2+ are the dominant cations, as determined by plotting the analyses on the Piper diagram. Near the coastline, high concentrations of Na+ and Cl- were observed, indicating a zone of seawater intrusion. Westward, the increasing concentration of HCO3- with a simultaneous decrease in Na+ indicates a zone of recharge from the karstic aquifers of the study area. Between the aforementioned zones there is an intermediate one, where reverse ion exchange takes place due to high concentrations of dissolved Na+ and Ca2+ adsorption. The saturation indices (SI) were calculated using the geochemical modeling software PHREEQC. The mineral phases halite, sylvite, gypsum and anhydrite were estimated to be undersaturated in the water samples, suggesting that these phases are minor or absent in the host rock.
On the other hand, calcite, aragonite and dolomite are close to equilibrium; these minerals are present in the host rocks or in the unsaturated zone, possibly increasing the Ca2+, Mg2+ and HCO3- concentrations when carbonates are dissolved. The analyses of the bivariate scatter plots, the ionic ratios, the Indices of Base Exchange (IBE), the Gibbs diagram and the dissolution/precipitation reactions show that evaporation and water-rock interaction mechanisms such as dissolution of carbonates, followed by reverse ion exchange, have affected the groundwater chemistry in the study area. The results revealed that groundwater chemistry and therefore the origin of the solutes in the coastal alluvial granular aquifer system of the Marathon coastal plain is primarily affected by a number of factors such as groundwater and mineral equilibrium, seawater intrusion, reverse ion exchange and nitrate concentration. A possible future research could focus on the interaction among hydrogeochemistry, mineral phases and chemical thermodynamic modeling.
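The PHREEQC saturation indices discussed above reduce to SI = log10(IAP/Ksp), where SI < 0 indicates undersaturation (the mineral can dissolve) and SI > 0 supersaturation (it can precipitate). A minimal sketch; the ion activities are hypothetical, and the log Ksp values are commonly tabulated figures rather than outputs of this study:

```python
import math

def saturation_index(ion_activity_product, k_sp):
    """SI = log10(IAP / Ksp): negative = undersaturated, ~0 = equilibrium,
    positive = supersaturated with respect to the mineral."""
    return math.log10(ion_activity_product / k_sp)

# Calcite: CaCO3 <-> Ca2+ + CO3^2-, log Ksp ~ -8.48 at 25 degC.
K_SP_CALCITE = 10.0 ** -8.48
# Hypothetical activities for a sample near equilibrium with calcite:
a_ca, a_co3 = 10.0 ** -3.2, 10.0 ** -5.3
si_calcite = saturation_index(a_ca * a_co3, K_SP_CALCITE)

# Halite: NaCl <-> Na+ + Cl-, log Ksp ~ +1.57; typical groundwater sits
# far below saturation, matching the undersaturated halite reported here.
si_halite = saturation_index(10.0 ** -1.5 * 10.0 ** -1.2, 10.0 ** 1.57)
```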

  20. Researches on the behaviour of cellular antiballistic composites based on AlMg-SiC alloys

    NASA Astrophysics Data System (ADS)

    Bălţătescu, O.; Florea, R. M.; Rusu, I.; Carcea, I.

    2015-11-01

    The research presented in this paper basically concerns the impact of a small/medium-caliber bullet on light armor built from an AlMg-SiC cellular (foam) metallic composite. We study the antiballistic behavior and protective properties of the armor based on the effects that occur at the impact zone between the bullet and the composite surface. We modeled the antiballistic behavior by means of a finite element analysis based on a "multi grid" Fast Finite Element (FFE) system, using the DYNA 2D software package for this purpose. After impact, the samples show pore concentration/deformation effects and the development of intercellular cracks toward the interior of the composite. These effects, which depend on the speed, mass and trajectory length of the projectile, reduce zonal tensions due to the deformation of the cell walls. A good correlation was obtained between the modeling results and the electron microscope analysis of the impact area. It is worth mentioning that almost all values of the impact energy absorbed by the composite armor lie within the active protection zone it provides.

  1. Ramp - Metering Algorithms Evaluated within Simplified Conditions

    NASA Astrophysics Data System (ADS)

    Janota, Aleš; Holečko, Peter; Gregor, Michal; Hruboš, Marián

    2017-12-01

    Freeway networks are reaching their limits, since it is usually impossible to increase traffic volumes by indefinitely extending the transport infrastructure with new traffic lanes. One possible solution is to use advanced intelligent transport systems, particularly ramp metering. The paper shows how two particular algorithms of local and traffic-responsive control (Zone, ALINEA) can be adapted to simplified conditions corresponding to Slovak freeways. Both control strategies are modelled and simulated using the PTV Vissim software, including the VisVAP module. The presented results demonstrate the properties of both control strategies, which are compared with each other as well as with the initial situation in which no control strategy is applied.
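ALINEA, one of the two strategies compared above, is a simple local feedback law on downstream occupancy: r(k) = r(k-1) + K_R * (o_target - o_out(k)). A minimal sketch of one control interval; the regulator gain, target occupancy and rate bounds are illustrative choices, not the paper's calibrated values:

```python
def alinea_step(rate_prev, occ_meas, occ_target=20.0,
                k_r=70.0, r_min=200.0, r_max=1800.0):
    """One control interval of the ALINEA feedback law,
        r(k) = r(k-1) + K_R * (o_target - o_out(k)),
    with occupancies in percent and the metering rate (veh/h) clamped
    to the ramp's physical limits."""
    rate = rate_prev + k_r * (occ_target - occ_meas)
    return max(r_min, min(r_max, rate))

# Downstream occupancy above target -> restrict the ramp flow:
r1 = alinea_step(rate_prev=900.0, occ_meas=30.0)
# Occupancy below target -> release more vehicles onto the freeway:
r2 = alinea_step(rate_prev=900.0, occ_meas=10.0)
```

Because the law is purely local and feedback-based, it needs only one downstream detector, which is what makes it attractive for the simplified Slovak-freeway conditions studied here.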

  2. Artificial neural network in breast lesions from fine-needle aspiration cytology smear.

    PubMed

    Subbaiah, R M; Dey, Pranab; Nijhawan, Raje

    2014-03-01

    Artificial neural networks (ANNs) are applied in engineering and certain medical fields. ANNs have immense potential but have rarely been used for breast lesions. In the present study, we attempted to build a robust back-propagation ANN model based on cytomorphological data, morphometric data, nuclear densitometric data, and gray-level co-occurrence matrix (GLCM) features of ductal carcinoma and fibroadenoma of the breast diagnosed on fine-needle aspiration cytology (FNAC). We selected 52 cases of fibroadenoma and 60 cases of infiltrating ductal carcinoma of the breast diagnosed on FNAC by two cytologists. Essential cytological data were quantitated by two independent cytologists (SRM, PD). With the help of the ImageJ software, nuclear morphometric, densitometric, and GLCM features were measured in all cases on hematoxylin and eosin-stained smears. With the available data, an ANN model was built with the help of the Neurointelligence software. The network was designed as 41-20-1 (41 input nodes, 20 hidden nodes, 1 output node). It was trained by the online back-propagation algorithm for 500 iterations, with the learning rate adjusted after every iteration. The ANN model correctly identified all cases of fibroadenoma and infiltrating carcinoma in the test set. This is one of the first successful composite ANN models for breast carcinoma. This basic model can be used to diagnose the gray-zone area of breast lesions on FNAC. We assume that this model may have far-reaching implications in the future. Copyright © 2013 Wiley Periodicals, Inc.
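A 41-20-1 network trained by online back-propagation, as described above, can be sketched in pure Python. This is a toy re-implementation: it trains on synthetic two-class feature vectors, since the cytology data and the Neurointelligence settings are not available, and it uses far fewer iterations:

```python
import math
import random

random.seed(7)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

class MLP:
    """Minimal 41-20-1 feed-forward network, online back-propagation."""

    def __init__(self, n_in=41, n_hidden=20):
        w = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[w() for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [w() for _ in range(n_hidden)]
        self.w2 = [w() for _ in range(n_hidden)]
        self.b2 = w()

    def forward(self, x):
        self.h = [sigmoid(sum(wi * xi for wi, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        return sigmoid(sum(wi * hi for wi, hi in zip(self.w2, self.h)) + self.b2)

    def train_one(self, x, target, lr=0.5):
        out = self.forward(x)
        d_out = (out - target) * out * (1.0 - out)      # sigmoid derivative
        for j, hj in enumerate(self.h):
            d_h = d_out * self.w2[j] * hj * (1.0 - hj)  # use w2 before update
            self.w2[j] -= lr * d_out * hj
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * d_h * xi
            self.b1[j] -= lr * d_h
        self.b2 -= lr * d_out

# Synthetic stand-ins for the two cytology classes, 41 features each:
def sample(label):
    centre = 0.5 if label else -0.5
    return [random.gauss(centre, 0.2) for _ in range(41)], float(label)

data = [sample(k % 2) for k in range(60)]
net = MLP()
for _ in range(30):                      # online epochs over the toy set
    for x, t in data:
        net.train_one(x, t)

accuracy = sum((net.forward(x) > 0.5) == (t > 0.5) for x, t in data) / len(data)
```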

  3. Digital techniques for processing Landsat imagery

    NASA Technical Reports Server (NTRS)

    Green, W. B.

    1978-01-01

    An overview is presented of the basic techniques used to process Landsat images with a digital computer, and of the VICAR image processing software developed at JPL and available to users through the NASA-sponsored COSMIC computer program distribution center. Examples are given of subjective processing performed to improve the information display for the human observer, such as contrast enhancement, pseudocolor display, and band ratioing, and of quantitative processing using mathematical models, such as classification based on multispectral signatures of different areas within a given scene and geometric transformation of imagery into standard mapping projections. The examples are illustrated by Landsat scenes of the Andes mountains and the Altyn-Tagh fault zone in China before and after contrast enhancement, and by classification of land use in Portland, Oregon. The VICAR image processing software system is described; it consists of a language translator that simplifies execution of image processing programs and a general-purpose format that allows imagery from a variety of sources to be processed by the same basic set of general applications programs.

  4. Conceptual design of the CZMIL data processing system (DPS): algorithms and software for fusing lidar, hyperspectral data, and digital images

    NASA Astrophysics Data System (ADS)

    Park, Joong Yong; Tuell, Grady

    2010-04-01

    The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to automatically produce a number of novel environmental products through the fusion of lidar, spectrometer, and camera data in a single software package. These new products significantly transcend use of the system as a bathymeter and support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning-globe capability for accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully integrated manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of chlorophyll and CDOM concentrations; and a fusion-based capability to produce images and classifications of the shallow-water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs, and reflectance images at a 1:1 processing-to-acquisition ratio.

  5. An Updated Comprehensive Risk Analysis for Radioisotopes Identified of High Risk to National Security in the Event of a Radiological Dispersion Device Scenario

    NASA Astrophysics Data System (ADS)

    Robinson, Alexandra R.

    An updated global survey of radioisotope production and distribution was completed and subjected to a revised "down-selection methodology" to determine those radioisotopes that should be classified as potential national security risks, based on availability and key physical characteristics that could be exploited in a hypothetical radiological dispersion device. The potentially at-risk radioisotopes were then used in a modeling software suite known as Turbo FRMAC, developed by Sandia National Laboratories, to characterize plausible contamination maps known as Protective Action Guideline Zone Maps. This software was also used to calculate the whole-body dose equivalent for exposed individuals based on various dispersion parameters and scenarios. Derived Response Levels were then determined for each radioisotope using: 1) target doses to members of the public provided by the U.S. EPA, and 2) occupational dose limits provided by the U.S. Nuclear Regulatory Commission. The limiting Derived Response Level for each radioisotope was also determined.

  6. Three-Dimensional Geologic Model of Complex Fault Structures in the Upper Seco Creek Area, Medina and Uvalde Counties, South-Central Texas

    USGS Publications Warehouse

    Pantea, Michael P.; Cole, James C.; Smith, Bruce D.; Faith, Jason R.; Blome, Charles D.; Smith, David V.

    2008-01-01

    This multimedia report shows and describes digital three-dimensional faulted geologic surfaces and volumes of the lithologic units of the Edwards aquifer in the upper Seco Creek area of Medina and Uvalde Counties in south-central Texas. This geologic framework model was produced using (1) geologic maps and interpretations of depositional environments and paleogeography; (2) lithologic descriptions, interpretations, and geophysical logs from 31 drill holes; (3) rock core and detailed lithologic descriptions from one drill hole; (4) helicopter electromagnetic geophysical data; and (5) known major and minor faults in the study area. These faults were used because of their individual and collective effects on the continuity of the aquifer-forming units in the Edwards Group. Data and information were compared and validated with each other and reflect the complex relationships of structures in the Seco Creek area of the Balcones fault zone. This geologic framework model can be used as a tool to visually explore and study geologic structures within the Seco Creek area of the Balcones fault zone and to show the connectivity of hydrologic units of high and low permeability between and across faults. The software can be used to display other data and information, such as drill-hole data, on this geologic framework model in three-dimensional space.

  7. GIS prospectivity mapping and 3D modeling validation for potential uranium deposit targets in Shangnan district, China

    NASA Astrophysics Data System (ADS)

    Xie, Jiayu; Wang, Gongwen; Sha, Yazhou; Liu, Jiajun; Wen, Botao; Nie, Ming; Zhang, Shuai

    2017-04-01

    Integrating multi-source geoscience information (such as geology, geophysics, geochemistry, and remote sensing) using GIS mapping is one of the key topics and frontiers in quantitative geosciences for mineral exploration. GIS prospectivity mapping and three-dimensional (3D) modeling can be used not only to extract exploration criteria and delineate metallogenetic targets but also to provide important information for the quantitative assessment of mineral resources. This paper uses the Shangnan district of Shaanxi province (China) as a case study area. GIS prospectivity mapping and potential granite-hydrothermal uranium targeting were conducted in the study area by combining weights-of-evidence (WofE) and concentration-area (C-A) fractal methods with multi-source geoscience information. 3D deposit-scale modeling using GOCAD software was performed to validate the shapes and features of the potential targets in the subsurface. The results show that: (1) the known deposits have potential zones at depth, and the 3D geological models can delineate surface and subsurface ore-forming features, which can be used to analyze the uncertainty of the shape and features of prospectivity mapping in the subsurface; (2) single geochemical or remote sensing anomalies at the surface must be combined with the depth exploration criteria of geophysics to identify potential targets; and (3) a single or sparse exploration-criteria zone with few mineralization spots at the surface has high uncertainty as an exploration target.
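    For a binary evidence layer, the weights-of-evidence method mentioned above reduces to two log-ratios computed from cell counts. A minimal sketch, with synthetic counts standing in for the paper's actual evidence layers:

```python
import math

def weights_of_evidence(n_bd, n_b, n_d, n_total):
    """Weights of evidence for a binary evidence layer B and deposits D.

    n_bd    -- unit cells with both evidence B present and a known deposit
    n_b     -- unit cells with evidence B present
    n_d     -- unit cells with a known deposit
    n_total -- total unit cells in the study area
    """
    p_b_d  = n_bd / n_d                       # P(B | D)
    p_b_nd = (n_b - n_bd) / (n_total - n_d)   # P(B | ~D)
    w_plus  = math.log(p_b_d / p_b_nd)        # weight where evidence present
    w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))  # weight where absent
    return w_plus, w_minus, w_plus - w_minus  # contrast C = W+ - W-

# Synthetic example: evidence covers 1% of the area but 80% of deposits.
wp, wm, c = weights_of_evidence(n_bd=8, n_b=100, n_d=10, n_total=10000)
```

    A positive contrast C indicates the layer is a useful predictor; in a full WofE study the per-layer weights are summed (under a conditional-independence assumption) to map posterior prospectivity.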

  8. Natural attenuation software (NAS): Assessing remedial strategies and estimating timeframes

    USGS Publications Warehouse

    Mendez, E.; Widdowson, M.; Chapelle, F.; Casey, C.

    2005-01-01

    Natural Attenuation Software (NAS) is a screening tool to estimate remediation timeframes for monitored natural attenuation (MNA) and to assist in decision-making on the level of source zone treatment in conjunction with MNA using site-specific remediation objectives. The natural attenuation processes that NAS models include advection, dispersion, sorption, non-aqueous phase liquid (NAPL) dissolution, and biodegradation of either petroleum hydrocarbons or chlorinated ethylenes. Newly implemented enhancements to NAS, designed to maximize its utility for site managers, are described. NAS has expanded its source contaminant specification options to include chlorinated ethanes and chlorinated methanes, and to allow for the analysis of any other user-defined contaminants that may be subject to microbially mediated transformations (heavy metals, radioisotopes, etc.). Also included is the capability to model co-mingled plumes, with constituents from multiple contaminant categories. To enable comparison of remediation timeframe estimates between MNA and specific engineered remedial actions, NAS was modified to incorporate an estimation technique for timeframes associated with pump-and-treat remediation technology. This is an abstract of a paper presented at the 8th International In Situ and On-Site Bioremediation Symposium (Baltimore, MD, 6/6-9/2005).

  9. Specific CT 3D rendering of the treatment zone after Irreversible Electroporation (IRE) in a pig liver model: the “Chebyshev Center Concept” to define the maximum treatable tumor size

    PubMed Central

    2014-01-01

    Background Size and shape of the treatment zone after irreversible electroporation (IRE) can be difficult to depict due to the use of multiple applicators in complex spatial configurations. An exact geometrical definition of the treatment zone, however, is mandatory for acute treatment control, since incomplete tumor coverage results in limited oncological outcome. In this study, the “Chebyshev Center Concept” was introduced for CT 3D rendering to assess the size and position of the maximum treatable tumor at a specific safety margin. Methods In seven pig livers, three different IRE protocols were applied to create treatment zones of different size and shape: Protocol 1 (n = 5 IREs), Protocol 2 (n = 5 IREs), and Protocol 3 (n = 5 IREs). Contrast-enhanced CT was used to assess the treatment zones. Technique A consisted of a semi-automated software prototype for CT 3D rendering with the “Chebyshev Center Concept” implemented (the “Chebyshev Center” is the center of the largest inscribed sphere within the treatment zone), with automated definition of parameters for size, shape, and position. Technique B consisted of standard CT 3D analysis with manual definition of the same parameters except position. Results For Protocols 1 and 2, the short diameter of the treatment zone and the diameter of the largest inscribed sphere within the treatment zone were not significantly different between Techniques A and B. For Protocol 3, the short diameter of the treatment zone and the diameter of the largest inscribed sphere within the treatment zone were significantly smaller for Technique A compared with Technique B (41.1 ± 13.1 mm versus 53.8 ± 1.1 mm and 39.0 ± 8.4 mm versus 53.8 ± 1.1 mm; p < 0.05 and p < 0.01). For Protocols 1, 2, and 3, the sphericity of the treatment zone was significantly larger for Technique A compared with Technique B.
Conclusions Regarding the size and shape of the treatment zone after IRE, CT 3D rendering with the “Chebyshev Center Concept” implemented provides significantly different results compared with standard CT 3D analysis. Since the latter overestimates the size of the treatment zone, the “Chebyshev Center Concept” could be used for more objective acute treatment control. PMID:24410997
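    The “Chebyshev Center” of a segmented treatment zone (the center of its largest inscribed sphere) can be located with a Euclidean distance transform, since the inside voxel farthest from the background is exactly that center. A sketch using SciPy on a synthetic ellipsoidal mask; the study used a semi-automated CT prototype, not this code:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def chebyshev_center(mask, voxel_mm=1.0):
    """Center and radius of the largest inscribed sphere in a binary mask.

    mask -- 3-D boolean array segmenting the treatment zone (illustrative
            input; isotropic voxels of size voxel_mm are assumed).
    """
    # EDT: for each inside voxel, distance to the nearest outside voxel.
    edt = distance_transform_edt(mask) * voxel_mm
    center = np.unravel_index(np.argmax(edt), mask.shape)
    return center, float(edt.max())  # radius of the largest inscribed sphere

# Synthetic ellipsoidal "treatment zone" with semi-axes 10, 15, 15 voxels.
z, y, x = np.ogrid[:40, :40, :40]
mask = (((z - 20) / 10.0) ** 2 + ((y - 20) / 15.0) ** 2
        + ((x - 20) / 15.0) ** 2) <= 1
center, radius = chebyshev_center(mask)
```

    For this ellipsoid the center lands near voxel (20, 20, 20) and the radius near the 10-voxel short semi-axis, mirroring how the paper's short-diameter and inscribed-sphere measures relate.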

  10. Development of a Distributed Hydrologic Model Using Triangulated Irregular Networks for Continuous, Real-Time Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Ivanov, V. Y.; Vivoni, E. R.; Bras, R. L.; Entekhabi, D.

    2001-05-01

    Triangulated Irregular Networks (TINs) are widespread in many finite-element modeling applications that stress high spatial non-uniformity, describing the domain of interest in an optimized fashion that results in superior computational efficiency. TINs, being adaptive to the complexity of any terrain, are capable of maintaining topological relations between critical surface features and therefore afford higher flexibility in data manipulation. The TIN-based Real-time Integrated Basin Simulator (tRIBS) is a distributed hydrologic model that utilizes the mesh architecture and software environment developed for the CHILD landscape evolution model and employs the hydrologic routines of its raster-oriented predecessor, RIBS. As a fully independent software unit, tRIBS consolidates the strengths of the distributed approach with an efficient computational data platform. The current version couples the unsaturated and saturated zones and accounts for the interaction of moving infiltration fronts with a variable groundwater surface, allowing the model to handle both storm and interstorm periods in a continuous fashion. Recent model enhancements have included the development of interstorm hydrologic fluxes through an evapotranspiration scheme as well as the incorporation of a rainfall interception module. Overall, the tRIBS model has been shown to properly mimic successive phases of the distributed catchment response by reproducing various runoff production mechanisms and handling their meteorological constraints. Important improvements in modeling options, robustness to data availability, and overall design flexibility have also been accomplished. Current efforts are focused on further model development as well as the application of tRIBS to various watersheds.

  11. The first clinical application of planning software for laparoscopic microwave thermosphere ablation of malignant liver tumours.

    PubMed

    Berber, Eren

    2015-07-01

    Liver tumour ablation is an operator-dependent procedure. The determination of the optimum needle trajectory and correct ablation parameters could be challenging. The aim of this study was to report the utility of a new, procedure planning software for microwave ablation (MWA) of liver tumours. This was a feasibility study in a pilot group of five patients with nine metastatic liver tumours who underwent laparoscopic MWA. Pre-operatively, parameters predicting the desired ablation zones were calculated for each tumour. Intra-operatively, this planning strategy was followed for both antenna placement and energy application. Post-operative 2-week computed tomography (CT) scans were performed to evaluate complete tumour destruction. The patients had an average of two tumours (range 1-4), measuring 1.9 ± 0.4 cm (range 0.9-4.4 cm). The ablation time was 7.1 ± 1.3 min (range 2.5-10 min) at 100W. There were no complications or mortality. The patients were discharged home on post-operative day (POD) 1. At 2-week CT scans, there were no residual tumours, with a complete ablation demonstrated in all lesions. This study describes and validates pre-treatment planning software for MWA of liver tumours. This software was found useful to determine precisely the ablation parameters and needle placement to create a predicted zone of ablation. © 2015 International Hepato-Pancreato-Biliary Association.

  12. Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements

    NASA Astrophysics Data System (ADS)

    Bakker, M.

    2017-12-01

    Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping is needed. Analytic element formulations exist for steady-state and transient flow in layered aquifers and for unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines, and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage; horizontal wells of poly-line shape, including skin effect; sharp changes in subsurface properties; and surface water features with leaky beds. Input files for analytic element models are simple, short, and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic elements in parts of grid-based models where additional detail is needed. This presentation will give an overview of the advanced flow features that can be modeled, many of which are implemented in free and open-source software.
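    The gridless character of analytic element models can be illustrated with the simplest element, a steady well in a confined aquifer: heads are obtained by superposing logarithmic solutions and can be evaluated at arbitrary points. The transmissivity, reference distance, and background head below are illustrative assumptions, not values from the abstract.

```python
import numpy as np

def head(x, y, wells, T=100.0, R=1000.0, h0=20.0):
    """Head from superposed steady well elements in a confined aquifer.

    wells -- list of (xw, yw, Q) with Q the discharge [m^3/d], positive for
             pumping; T [m^2/d], R [m], and h0 [m] are illustrative aquifer
             parameters (T = transmissivity, R = reference distance where
             the head equals h0).
    """
    h = np.full(np.broadcast(x, y).shape, float(h0))
    for xw, yw, Q in wells:
        r = np.hypot(x - xw, y - yw)
        # Thiem solution: drawdown grows logarithmically toward each well.
        h -= Q / (2 * np.pi * T) * np.log(R / r)
    return h

# Evaluate at arbitrary points -- no mesh is ever built.
h_near = head(1.0, 0.0, [(0.0, 0.0, 500.0)])
h_far = head(900.0, 0.0, [(0.0, 0.0, 500.0)])
```

    Adding another tuple to `wells` superposes another element; the more elaborate elements in the abstract (line sinks, inhomogeneities, leaky beds) follow the same superposition principle with richer analytic functions.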

  13. A prospective development study of software-guided radio-frequency ablation of primary and secondary liver tumors: Clinical intervention modelling, planning and proof for ablation cancer treatment (ClinicIMPPACT).

    PubMed

    Reinhardt, Martin; Brandmaier, Philipp; Seider, Daniel; Kolesnik, Marina; Jenniskens, Sjoerd; Sequeiros, Roberto Blanco; Eibisberger, Martin; Voglreiter, Philip; Flanagan, Ronan; Mariappan, Panchatcharam; Busse, Harald; Moche, Michael

    2017-12-01

    Radio-frequency ablation (RFA) is a promising minimally invasive treatment option for early liver cancer; however, monitoring or predicting the size of the resulting tissue necrosis during the RFA procedure is a challenging task, potentially resulting in a significant rate of under- or over-treatment. Currently there is no reliable lesion-size prediction method commercially available. ClinicIMPPACT is designed as a multicenter, prospective, non-randomized clinical trial to evaluate the accuracy and efficiency of innovative planning and simulation software. Sixty patients with early liver cancer will be included at four European clinical institutions and treated with the same RFA system. The preinterventional imaging datasets will be used for computational planning of the RFA treatment. All ablations will be simulated in parallel with the actual RFA procedure, using the software environment developed in this project. The primary outcome measure is the comparison of the simulated ablation zones with the true lesions shown in follow-up imaging after one month, to assess the accuracy of the lesion prediction. This unique multicenter clinical trial aims at the clinical integration of a dedicated software solution to accurately predict lesion size and shape after radiofrequency ablation of liver tumors. Accelerated and optimized workflow integration, real-time intraoperative image processing, and the inclusion of patient-specific information, e.g., organ perfusion and registration of the real RFA needle position, might make the introduced software a powerful tool for interventional radiologists to optimize patient outcomes.

  14. Modelling and Evaluation of Non-Linear Rootwater Uptake for Winter Cropping of Wheat and Berseem

    NASA Astrophysics Data System (ADS)

    GS, K.; Prasad, K. S. H.

    2017-12-01

    Plant water uptake is important to study in order to monitor the irrigation supplied to the crop. The Richards equation is the key governing equation for quantifying root water uptake in the vadose zone, taking all source and sink terms into consideration. The β (non-linearity) parameter is used in this modeling to introduce non-linearity into the plant root water uptake. The soil parameters were obtained by experimentation and employed in the van Genuchten equation for the soil moisture study. Field experiments were carried out at the Civil Engineering Department, IIT Roorkee, Uttarakhand, India, during the winter seasons of 2013 and 2014 for berseem and 2016 for wheat, as per local cropping practices. Drainage-type lysimeters were installed to study the soil water balance. Soil moisture was monitored using a profile probe. Precipitation and all meteorological data were obtained from the nearby gauges located at the National Institute of Hydrology, Roorkee. The moisture and deep-percolation data were collected on a daily basis, and the irrigation supply was controlled and monitored to satisfy the moisture requirements of the crops. In order to study the effect of water scarcity on the crops, the plot was divided and deficit irrigation was applied during the second cropping season for berseem. The yields for both seasons were also measured. The solution of the Richards equation as applied to moisture movement in the root zone was modeled. For estimation of root water uptake, the one-dimensional mixed form of Richards' equation is employed as the governing equation (Ji et al., 2007; Shankar et al., 2012). The sink term in the model accounts for the root water uptake, which is utilized by the plant for transpiration.
Smax, the maximum root water uptake for the root zone on a given day, must equal the maximum transpiration on the corresponding day. The model-computed moisture content and pressure head were calibrated against the measured soil water content in the crop root zone. The model output is compared with the output of the HYDRUS-1D software package. The fully calibrated model is now employed to determine the irrigation requirement of crops for a known initial moisture content and available precipitation, and can be useful for economical agriculture in the semi-arid regions of India.
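    The van Genuchten retention function used for the soil-moisture study maps pressure head to volumetric water content; a minimal sketch, using a textbook sandy-loam parameter set rather than the values fitted at the Roorkee plots:

```python
def van_genuchten_theta(h, theta_r=0.065, theta_s=0.41, alpha=0.075, n=1.89):
    """van Genuchten soil-water retention: volumetric water content theta as
    a function of pressure head h [cm, negative in the unsaturated zone].

    theta_r, theta_s -- residual and saturated water contents
    alpha [1/cm], n  -- shape parameters (sandy-loam defaults, illustrative)
    """
    if h >= 0:
        return theta_s                       # saturated: theta = theta_s
    m = 1.0 - 1.0 / n
    Se = (1.0 + abs(alpha * h) ** n) ** -m   # effective saturation in (0, 1]
    return theta_r + (theta_s - theta_r) * Se
```

    In a Richards-equation solver this curve (and the companion Mualem conductivity function) supplies the non-linear closure between pressure head and moisture content that makes the mixed form of the equation necessary.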

  15. Georadar and geoelectricity methods to identify the sliding zone of a landslide

    NASA Astrophysics Data System (ADS)

    Dalimunthe, Y. K.; Hamid, A.

    2018-01-01

    The aim of this research is to determine the contrast across the sliding plane by observing the parameters of rock types, fractures, and faults that could potentially cause landslides in Bandar Baru, Lampung Barat, Indonesia, using both the georadar and geoelectricity methods. This research uses a radar reflection profiling configuration for georadar and a dipole-dipole configuration for geoelectricity. Georadar data processing was done with Reflexwave software, and geoelectricity data processing was done with Earthimager 2DINV software, to interpret the subsurface section. The results of both methods show an area of contact between sandstone, with resistivity values of 200-1449 Ωm, and claystone, with resistivity values of 32-100 Ωm, at a depth limit of 9 m as a potential sliding zone, since the physical properties of the claystone allow the overlying massive material to slide.

  16. Geodynamics of the East African Rift System ∼30 Ma ago: A stress field model

    NASA Astrophysics Data System (ADS)

    Min, Ge; Hou, Guiting

    2018-06-01

    The East African Rift System (EARS) is thought to be an intra-continental ridge that meets the Red Sea and the Gulf of Aden at the Ethiopian Afar as the failed arm of the Afar triple junction. The geodynamics of the EARS is still unclear, even though several models have been proposed. One model proposes that the EARS developed in a local tensile stress field derived from far-field loads due to the pushing of oceanic ridges. Alternatively, some scientists suggest that the formation of the EARS can be explained by upwelling mantle plumes beneath lithospheric weak zones (e.g., the Pan-African suture zone). In our study, a shell model is established that considers the Earth's spherical curvature, the lithospheric heterogeneity of the African continent, and the coupling between the mantle plumes and the mid-ocean ridges. The results are calculated via the finite element method using ANSYS software and fit the geological evidence well. To discuss the effects of different rock mechanical parameters and boundary conditions, four comparative models are established: Model I ignores the heterogeneity of the African continent, Model II ignores mid-ocean spreading, Model III ignores the upwelling mantle plumes, and Model IV ignores both the heterogeneity of the African continent and the upwelling mantle plumes. Compared with these models, the original model shows the best-fit results; it indicates that the coupling of the upwelling mantle plumes and mid-ocean ridge spreading caused the initial lithospheric breakup in Afar and East Africa. The extension direction and the separation of the EARS around the Tanzanian craton are attributed to the heterogeneity of the East African basement.

  17. Preliminary global paleogeographic maps through the Greenhouse-Icehouse transition: forcing of the Drake Passage and Asian Monsoons.

    NASA Astrophysics Data System (ADS)

    Poblete, Fernando; Dupont-Nivet, Guillaume; Licht, Alexis; van Hinsbergen, Douwe; Roperch, Pierrick; Guillocheau, Francois; Baby, Guillaume; Baatsen, Michiel

    2017-04-01

    Paleogeographic maps are essential for understanding Earth dynamics. They provide the necessary boundary conditions for climate and geodynamic modeling, surface processes, and biotic interactions. In particular, the opening and closing of ocean gateways and the growth of major mountain belts are major drivers of climate change and biotic interchange. However, the timing and spatial extent of such events are highly controversial and regularly questioned by new data. As part of the ERC "MAGIC" project, focusing on Asian Monsoons during the Greenhouse to Icehouse transition, we have produced a set of worldwide Cenozoic paleogeographic maps for the period between 60 and 20 Ma, with boundary conditions specific to the India-Asia collision zone and the Drake Passage. The creation of each paleogeographic map followed a rigorous and reproducible methodology that integrates paleobathymetric, paleoshoreline, and paleotopographic data into a coherent plate tectonic model using the open-source software GPlates. (1) We use the model provided by Seton et al. (2012) as a first-order tectonic model, modified to integrate the full restoration of five regions: the Andes, the Scotia Arc, Africa, the Mediterranean Sea, and the Tibet-Himalayan collision zone. (2) The paleobathymetry was provided by Müller et al. (2008) using age-depth relationships and assuming symmetric ridge spreading. (3) Paleoshoreline maps were modified according to the fossil database from fossilworks.org and the geological record, and were used to represent the boundary between terrestrial and marine paleo-environments. (4) To reconstruct paleoelevations, the most controversial task, we compiled a wide range of data, including stable isotopes, leaf physiognomy, and thermochronology, combined with regional fossil and geological records (tectonic setting) and geomorphological data.
Finally, we use the open-source GMT software and a set of masks to modify the current Earth relief model (ETOPO) according to the estimated paleoelevations for specific regions at each period of time. Our approach specifically takes into account the evolution of continental margins, and the paleotopographic evolution is coupled with the evolving shape of the continents. Considering the constant addition of new data and models, the value of this method is to generate a progressive paleorelief model of the Earth that can easily be compared with and updated from new data.

  18. Mathematical Modeling of Multiphase Filtration in Porous Media with a Chemically Active Skeleton

    NASA Astrophysics Data System (ADS)

    Khramchenkov, M. G.; Khramchenkov, É. M.

    2018-01-01

    The authors propose a mathematical model of two-phase filtration that occurs under conditions of dissolution of the porous medium. The model can be used for the joint description of complex chemical-hydrogeomechanical processes that frequently occur in oil-and-gas production and nature-conservation practice. As an example, consideration is given to the acidizing of the bottom zone of the injection well of an oil reservoir, with the enclosing rocks represented by carbonates. The phases of the process are an aqueous solution of hydrochloric acid and oil. A software product for computational experiments was developed, and the numerical experiments used data on the wells of an actual oil field. Good agreement is obtained between the field data and the calculated data. Numerical experiments with different configurations of the permeability of the oil stratum are conducted.

  19. 2D modeling of direct laser metal deposition process using a finite particle method

    NASA Astrophysics Data System (ADS)

    Anedaf, T.; Abbès, B.; Abbès, F.; Li, Y. M.

    2018-05-01

    Direct laser metal deposition is one of the additive manufacturing processes used to produce complex metallic parts. A thorough understanding of the underlying physical phenomena is required to obtain high-quality parts. In this work, a mathematical model is presented to simulate the coaxial laser direct deposition process, taking into account mass addition, heat transfer, and fluid flow with a free surface and melting. The fluid flow in the melt pool, together with the mass and energy balances, is solved using the Computational Fluid Dynamics (CFD) software NOGRID-points, based on the meshless Finite Pointset Method (FPM). The basis of the computation is a point cloud, which represents the continuum fluid domain. Each finite point carries all fluid information (density, velocity, pressure, and temperature). The dynamic shape of the molten zone is explicitly described by the point cloud. The proposed model is used to simulate a single-layer cladding.

  20. Microcomputer-controlled world time display for public area viewing

    NASA Astrophysics Data System (ADS)

    Yep, S.; Rashidian, M.

    1982-05-01

    The design, development, and implementation of a microcomputer-controlled world clock are discussed. The system, designated the International Time Display System (ITDS), integrates a Geochron Calendar Map and a microcomputer-based digital display to automatically compensate for daylight saving time, leap years, and time zone differences. An in-depth technical description of the design and development of the electronic hardware, firmware, and software systems is provided. Reference material on the time zones, fabrication techniques, and electronic subsystems is also provided.
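    The time-zone and daylight-saving bookkeeping that the 1982 ITDS firmware had to implement by hand is available off the shelf today; for comparison, a short Python sketch using the standard zoneinfo database (the zone list is an arbitrary example):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, backed by the IANA tz database

def world_times(utc_now, zones):
    """Wall-clock time in each display zone; DST transitions, offsets, and
    historical rule changes are handled by the tz database rules."""
    return {z: utc_now.astimezone(ZoneInfo(z)).strftime("%H:%M")
            for z in zones}

now = datetime(2024, 7, 1, 12, 0, tzinfo=ZoneInfo("UTC"))
times = world_times(now, ["America/New_York", "Europe/Paris", "Asia/Tokyo"])
```

    On this July date New York is on EDT and Paris on CEST, so the returned times already include the daylight-saving offsets the ITDS had to compute explicitly.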

  1. Hydro-geophysical observations integration in numerical model: case study in Mediterranean karstic unsaturated zone (Larzac, france)

    NASA Astrophysics Data System (ADS)

    Champollion, Cédric; Fores, Benjamin; Le Moigne, Nicolas; Chéry, Jean

    2016-04-01

    Karstic hydro-systems are highly non-linear and heterogeneous, yet they are one of the main water resources in the Mediterranean area. Neither local measurements in boreholes nor analysis at the spring can take into account the variability of the water storage. For some years, ground-based geophysical measurements (such as gravity, electrical resistivity, or seismological data) have allowed water storage in heterogeneous hydrosystems to be followed at an intermediate scale between boreholes and basin. Beyond classical rigorous monitoring, the integration of geophysical data into hydrological numerical models is needed for both process interpretation and quantification. A karstic geophysical observatory (GEK: Géodésie de l'Environnement Karstique, OSU OREME, SNO H+) has been set up in the Mediterranean area in the south of France. The observatory sits on more than 250 m of karstified dolomite, with an unsaturated zone of ~150 m thickness. At the observatory, water level in boreholes, evapotranspiration, and rainfall are the classical hydro-meteorological observations, complemented by continuous gravity, resistivity, and seismological measurements. The main objective of the study is the modelling of the whole observation dataset with an explicit one-dimensional unsaturated numerical model. The Hydrus software is used for the explicit modelling of water storage and transfer and links the different observations (geophysics, water level, evapotranspiration) with the water saturation. Unknown hydrological parameters (permeability, porosity) are retrieved from stochastic inversions. The scales of investigation of the different observations are discussed in light of the modelling results. A sensitivity study of the measurements against the model is carried out, and the key hydro-geological processes of the site are presented.

  2. Safe Zone of Posterior Screw Insertion for Talar Neck Fractures on 3-Dimensional Reconstruction Model.

    PubMed

    Wu, Jian-Qun; Ma, Sheng-Hui; Liu, Song; Qin, Cheng-He; Jin, Dan; Yu, Bin

    2017-02-01

    To investigate the optimal posterior screw placement and the geometry of safe zones for screw insertion in the talar neck. Computed tomography data for 15 normal feet were imported into Mimics 10.01 software for 3-dimensional reconstruction; 4.0-mm-diameter screws were simulated from the lateral tubercle of the posterior process of the talus to the talar head. The range of screw trajectories and screw lengths at nine locations that did not breach the cortex of the talus was evaluated. In addition, the farthest point (point a) and nearest point (point b) of the safe zone to the subtalar joint at each location, the anteversion angle (angle A), measured parallel to the sagittal plane, and the horizontal angle (angle B), measured perpendicular to the sagittal plane, were recorded. The safe zone lay mainly between the 30% location and the 60% location; the width of each safe zone was 13.6° ± 1.4° and the maximum height was 7.8° ± 1.2°. The height of the safe zone was lowest at the 30% location (4.5°) and highest at the 50% location (7.3°). The mixed safe zone of all tali was between the 50% location and the 60% location. When a screw was inserted at point a, the safe entry distance (screw length) ranged from 48.8 to 49.5 mm, and when inserted at point b, it ranged from 48.2 to 48.9 mm. Inserting a 48.7 mm screw, angled 5.6° laterally and 7.4° superiorly, from the lateral tubercle of the posterior process of the talus towards the talar head is safest. The safe zone for posterior screw fixation defined here applies to most tali, assuming the fractures are well reduced; it may strengthen stability, shorten operation time and reduce the incidence of surgical complications. © 2017 Chinese Orthopaedic Association and John Wiley & Sons Australia, Ltd.
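The recommended trajectory above (48.7 mm screw, 5.6° lateral, 7.4° superior) can be turned into coordinates with simple trigonometry. The axis convention below (x = anterior, y = lateral, z = superior) and the entry point at the origin are assumptions for illustration, not the coordinate frame of the study.

```python
import numpy as np

def screw_tip(entry, length_mm, lateral_deg, superior_deg):
    """Return the screw tip position for a screw of given length inserted
    at `entry`, angled laterally and superiorly from the anterior axis."""
    a = np.radians(lateral_deg)
    b = np.radians(superior_deg)
    direction = np.array([np.cos(a) * np.cos(b),   # anterior component
                          np.sin(a) * np.cos(b),   # lateral component
                          np.sin(b)])              # superior component (unit vector)
    return np.asarray(entry, dtype=float) + length_mm * direction

# Tip position for the "safest" trajectory reported above (entry at origin).
tip = screw_tip(entry=[0.0, 0.0, 0.0], length_mm=48.7,
                lateral_deg=5.6, superior_deg=7.4)
```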

  3. FracPaQ: a MATLAB™ toolbox for the quantification of fracture patterns

    NASA Astrophysics Data System (ADS)

    Healy, David; Rizzo, Roberto; Farrell, Natalie; Watkins, Hannah; Cornwell, David; Gomez-Rivas, Enrique; Timms, Nick

    2017-04-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying crack and fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The methods presented are inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. 
New features in this release include multi-scale analyses based on a wavelet method to look for scale transitions, support for multi-colour traces in the input file processed as separate fracture sets, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern expressed as a 2nd rank crack tensor.
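The parallel plate model mentioned above relates fracture aperture and spacing to an equivalent 2-D permeability through the standard cubic law; FracPaQ's exact formulation may differ, and the aperture and spacing values below are illustrative only.

```python
def parallel_plate_permeability(aperture_m, spacing_m):
    """Equivalent bulk permeability (m^2) of a set of parallel, smooth
    fractures with hydraulic aperture b and mean spacing s: k = b^3 / (12 s)."""
    return aperture_m ** 3 / (12.0 * spacing_m)

# Example: 100 micron apertures spaced every 10 cm.
k = parallel_plate_permeability(aperture_m=1e-4, spacing_m=0.1)
```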

  4. Calculating bathymetric and spatial distributions of estuarine eelgrass

    EPA Science Inventory

    Distributions of native eelgrass Zostera marina L. within the intertidal and shallow subtidal zones of three Oregon estuaries (Tillamook, Yaquina, and Alsea) were classified from color infrared aerial orthophotography acquired at extreme low tide. Image processing software, Spati...

  5. GIS applications for military operations in coastal zones

    USGS Publications Warehouse

    Fleming, S.; Jordan, T.; Madden, M.; Usery, E.L.; Welch, R.

    2009-01-01

    In order to successfully support current and future US military operations in coastal zones, geospatial information must be rapidly integrated and analyzed to meet ongoing force structure evolution and new mission directives. Coastal zones in a military-operational environment are complex regions that include sea, land and air features that demand high-volume databases of extreme detail within relatively narrow geographic corridors. Static products in the form of analog maps at varying scales traditionally have been used by military commanders and their operational planners. The rapidly changing battlefield of 21st Century warfare, however, demands dynamic mapping solutions. Commercial geographic information system (GIS) software for military-specific applications is now being developed and employed with digital databases to provide customized digital maps of variable scale, content and symbolization tailored to unique demands of military units. Research conducted by the Center for Remote Sensing and Mapping Science at the University of Georgia demonstrated the utility of GIS-based analysis and digital map creation when developing large-scale (1:10,000) products from littoral warfare databases. The methodology employed is discussed: selection of data sources (including high-resolution commercial images and Lidar), establishment of analysis/modeling parameters, conduct of vehicle mobility analysis, development of models and generation of products (such as a continuous sea-land DEM and geo-visualization of changing shorelines with tidal levels). Based on observations and identified needs from the National Geospatial-Intelligence Agency, formerly the National Imagery and Mapping Agency, and the Department of Defense, prototype GIS models for military operations in sea, land and air environments were created from multiple data sets of a study area at US Marine Corps Base Camp Lejeune, North Carolina. 
Results of these models, along with methodologies for developing large-scale littoral warfare databases, aid the National Geospatial-Intelligence Agency in meeting littoral warfare analysis, modeling and map generation requirements for US military organizations. © 2008 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS).

  6. Stress concentration on Intraplate Seismicity: Numerical Modeling of Slab-released Fluids in the New Madrid Seismic Zone

    NASA Astrophysics Data System (ADS)

    Saxena, A.; Choi, E.; Powell, C. A.

    2017-12-01

    The mechanism behind the seismicity of the New Madrid Seismic Zone (NMSZ), the major intraplate earthquake source in the Central and Eastern US (CEUS), is still debated, but new insights are being provided by recent tomographic studies involving USArray. A high-resolution tomography study by Nyamwandha et al. (2016) in the NMSZ indicates the presence of low (3%-5%) upper mantle Vp and Vs anomalies in the depth range 100 to 250 km. The elevated anomaly magnitudes are difficult to explain by temperature alone. Just as the low-velocity anomalies beneath northeast China are attributed to fluids released from the stagnant Pacific slab, water released from the stagnant Laramide slab, presently located at transition zone depths beneath the CEUS, might be contributing to the low-velocity features in this region's upper mantle. Here, we investigate the potential impact of the slab-released fluids on the stresses at seismogenic depths using numerical modeling. We convert the tomographic results into a temperature field under various assumed values of spatially uniform water content. In more realistic cases, water content is added only where the converted temperature exceeds the melting temperature of olivine. Viscosities are then computed from the temperature and water content and supplied to our geodynamic models, created with PyLith, an open-source software package for crustal dynamics. The model results show that increasing water content weakens the upper mantle more than temperature alone and thus elevates the differential stress in the upper crust. These results better explain the tomography results and seismicity without invoking melting. We also invert the tomography results for the volume fraction of orthopyroxene and temperature, and compare the resultant stresses with those for pure olivine. 
To enhance the reproducibility, selected models in this study will be made available in the form of sharable and reproducible packages enabled by EarthCube Building block project, GeoTrust.
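The viscosity step described above (temperature plus water content mapped to effective viscosity) can be sketched with a simplified Arrhenius-type olivine flow law in which dissolved water lowers the effective viscosity. The prefactor, activation energy, and water exponent below are illustrative assumptions, not the values used in the study.

```python
import numpy as np

R = 8.314     # gas constant, J/(mol K)
E = 300e3     # activation energy, J/mol (assumed)
A = 1e9       # dry reference prefactor, Pa s (assumed)
r = 1.0       # water-content exponent (assumed)

def viscosity(T_kelvin, c_oh=1.0):
    """Effective Newtonian viscosity: eta = A * C_OH^(-r) * exp(E / (R T)).
    Higher water content (C_OH) or temperature -> weaker mantle."""
    return A * c_oh ** (-r) * np.exp(E / (R * T_kelvin))

dry = viscosity(1600.0, c_oh=1.0)
wet = viscosity(1600.0, c_oh=1000.0)   # water-weakened by a factor C_OH^r
```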

  7. The effect of fractal contact lenses on peripheral refraction in myopic model eyes.

    PubMed

    Rodriguez-Vallejo, Manuel; Benlloch, Josefa; Pons, Amparo; Monsoriu, Juan A; Furlan, Walter D

    2014-12-01

    To test multizone contact lenses in model eyes: Fractal Contact Lenses (FCLs), designed to induce myopic peripheral refractive error (PRE). Zemax ray-tracing software was employed to simulate myopic and accommodation-dependent model eyes fitted with FCLs. PRE, defined in terms of mean sphere M and 90°-180° astigmatism J180, was computed at different peripheral positions, ranging from 0 to 35° in steps of 5°, and for different pupil diameters (PDs). Simulated visual performance and changes in the PRE were also analyzed for contact lens decentration and model eye accommodation. For comparison purposes, the same simulations were performed with another commercially available contact lens designed for the same intended use: the Dual Focus (DF). PRE was greater with FCL than with DF when both designs were tested for a 3.5 mm PD, and with and without decentration of the lenses. However, PRE depended on PD with both multizone lenses, with a remarkable reduction of the myopic relative effect for a PD of 5.5 mm. The myopic PRE with contact lenses decreased as the myopic refractive error increased, but this could be compensated by increasing the power of treatment zones. A peripheral myopic shift was also induced by the FCLs in the accommodated model eye. In regard to visual performance, a myopia under-correction with reference to the circle of least confusion was obtained in all cases for a 5.5 mm PD. The ghost images, generated by treatment zones of FCL, were dimmer than the ones produced with DF lens of the same power. FCLs produce a peripheral myopic defocus without compromising central vision in photopic conditions. FCLs have several design parameters that can be varied to obtain optimum results: lens diameter, number of zones, addition and asphericity; resulting in a very promising customized lens for the treatment of myopia progression.
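The peripheral refractive error above is reported as power vectors (mean sphere M and the 90°-180° astigmatism component J180). These follow from the conventional Thibos power-vector decomposition of a sphere/cylinder/axis prescription; this is the standard conversion, not code from the study.

```python
import numpy as np

def power_vectors(sphere, cylinder, axis_deg):
    """Convert S/C/axis notation (diopters, degrees) to power vectors:
    M = S + C/2, J180 = -(C/2) cos(2a), J45 = -(C/2) sin(2a)."""
    ax = np.radians(axis_deg)
    M = sphere + cylinder / 2.0
    J180 = -(cylinder / 2.0) * np.cos(2.0 * ax)
    J45 = -(cylinder / 2.0) * np.sin(2.0 * ax)
    return M, J180, J45

# Example: -3.00 / -1.00 x 180 prescription.
M, J180, J45 = power_vectors(-3.0, -1.0, 180.0)
```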

  9. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    NASA Astrophysics Data System (ADS)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time, which is understandable given the size and complexity of such systems. Three important aspects of earthquake-soil-structure interaction (ESSI) modeling are the consistent tracking of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate the dynamics of ESSI, and the influence of uncertainty on ESSI simulations. This dissertation contributes to the development of one such tool, the ESSI Simulator, including work on an extensive verification and validation suite for it. Verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the size of the computational domain. The damping of waves at the boundary of the finite element models is studied using different damping patterns, applied in the layer of elements outside the Domain Reduction Method zone to absorb the residual waves leaving the boundary layer due to structural excitation. An extensive parametric study is carried out on the dynamic soil-structure interaction of a complex system, and results for different soil strengths and foundation embedments are compared. A set of computationally efficient constitutive models is developed and implemented in the ESSI Simulator; the efficiency is achieved by simplifying the elastic-plastic stiffness tensor of the constitutive models. In almost all soil-structure systems there are interface zones in contact with each other; these zones can detach during loading or slip against each other. In this dissertation a frictional contact element is implemented in the ESSI Simulator, and extensive verification is performed on the implemented element. The interest here is the effect of slipping and gap opening at the interface between soil and concrete foundation on the behavior of the soil-structure system: the transfer of loads to the structure depends on the contact areas, which affects the response of the system. The effects of gap opening and sliding at the interfaces are shown through application examples. In addition, the dissipation of seismic energy due to frictional sliding of the interface zones is studied. An Application Programming Interface (API) and a Domain Specific Language (DSL) are being developed to increase developers' and users' modeling and simulation capabilities. The API describes software services, written by developers, that are used by users. A DSL is a small language focused on a particular problem domain; DSL programs are translated to common functions or libraries, hiding the details of the programming and making it easier for the user to work with the commands.

  10. Determination of Particular Endogenous Fires Hazard Zones in Goaf with Caving of Longwall

    NASA Astrophysics Data System (ADS)

    Tutak, Magdalena; Brodny, Jaroslaw

    2017-12-01

    The hazard of endogenous fires is one of the basic and most common occupational safety hazards in coal mines in Poland and worldwide. It refers to the possibility of coal self-ignition as the result of a self-heating process in a mine heading or its surroundings. In underground coal mining, part of the airflow ventilating an operating longwall migrates into the goaf with caving. Where the goaf contains coal susceptible to self-ignition, this airflow can create favourable conditions for coal oxidation and, subsequently, for its self-heating and self-ignition. An endogenous fire formed under such conditions can pose a serious hazard to the crew and to the continuity of operation of the mining plant. From a practical point of view, it is very important to determine the zone in the goaf with caving in which the necessary conditions for an endogenous fire are fulfilled. Under real conditions, determining such a zone is practically impossible, so the authors developed a methodology for determining it from the results of model-based tests. The methodology includes developing a model of the area under study, determining boundary conditions and carrying out simulation calculations; based on the results, the zone particularly endangered by endogenous fire is determined. The model of the investigated region and the selected boundary conditions are based on measurements under real conditions. The paper discusses the fundamental assumptions of the methodology, particularly the adopted hazard criterion and the sealing coefficient of the goaf with caving, and characterizes the mathematical model of gas flow through porous media. An example is presented of determining the zone particularly endangered by endogenous fire for a real system of mine headings in a hard coal mine. 
The tests concerned a longwall ventilated in the "Y" system. For the given mining and geological conditions, the critical values of airflow velocity and oxygen concentration in the goaf that condition the initiation of the coal oxidation process were determined. The calculations used the ANSYS Fluent software, based on the finite volume method, which makes it possible to determine the physical and chemical parameters of the air very precisely at any point of the mine heading and goaf with caving under study; determining these parameters so precisely from tests under real conditions is practically impossible. The results obtained allow proper actions to be taken early in order to limit the occurrence of endogenous fires. One can conclude that the presented methodology creates great possibilities for the practical application of model-based tests to improve occupational safety in mines.
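The hazard criterion described above can be illustrated as a simple mask over the simulated flow field: a goaf cell is flagged when the airflow velocity lies inside a critical band (fast enough to supply oxygen, slow enough not to carry the heat away) and the oxygen concentration exceeds a threshold. The numerical limits below are placeholders, not the critical values determined in the study.

```python
import numpy as np

V_MIN, V_MAX = 1e-4, 0.02   # critical airflow velocity band, m/s (assumed)
O2_MIN = 0.08               # minimum oxygen mole fraction (assumed)

def hazard_mask(velocity, o2):
    """Boolean mask of goaf cells meeting the assumed self-heating criterion."""
    velocity = np.asarray(velocity, dtype=float)
    o2 = np.asarray(o2, dtype=float)
    return (velocity >= V_MIN) & (velocity <= V_MAX) & (o2 >= O2_MIN)

# Three example cells: in-band with enough oxygen, too fast, oxygen-starved.
mask = hazard_mask([0.01, 0.5, 0.001], [0.20, 0.20, 0.05])
```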

  11. WannierTools: An open-source software package for novel topological materials

    NASA Astrophysics Data System (ADS)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present WannierTools, an open-source software package for the investigation of novel topological materials. The code works in the tight-binding framework, with tight-binding models that can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface state spectrum, which is detected in angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
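The Berry phase around a closed momentum loop mentioned above can be sketched in its simplest setting: the discretized Berry (Zak) phase of the occupied band of the two-band SSH tight-binding model. This shows the standard discretized formula (accumulated phase of overlaps between neighbouring eigenvectors), not WannierTools' actual implementation.

```python
import numpy as np

def zak_phase(v, w, n_k=400):
    """Discretized Berry (Zak) phase of the lower band of the SSH model
    H(k) = [[0, v + w e^{-ik}], [v + w e^{ik}, 0]] around the BZ loop."""
    k = np.linspace(0.0, 2.0 * np.pi, n_k, endpoint=False)
    h = v + w * np.exp(-1j * k)                                  # off-diagonal element
    u = np.stack([np.ones_like(h), -np.abs(h) / h]) / np.sqrt(2)  # lower-band states
    # Overlaps <u_k | u_{k+1}> around the closed loop (np.roll wraps the end).
    overlaps = np.sum(np.conj(u) * np.roll(u, -1, axis=1), axis=0)
    return -np.angle(np.prod(overlaps))                          # phase mod 2*pi

trivial = zak_phase(v=1.0, w=0.5)       # intra-cell hopping dominates
topological = zak_phase(v=0.5, w=1.0)   # inter-cell hopping dominates
```

The trivial phase gives a Zak phase near 0, the topological phase near ±π, which is the quantized invariant such codes use to classify 1-D cuts of the BZ.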

  12. Modeling of sedimentation and resuspension processes induced by intensive internal gravity waves in the coastal water systems with the use of the advection-diffusion equation for sediment concentration

    NASA Astrophysics Data System (ADS)

    Rouvinskaya, Ekaterina; Kurkin, Andrey; Kurkina, Oxana

    2017-04-01

    Intensive internal gravity waves influence bottom topography in the coastal zone. They induce substantial flows in the bottom layer that are essential for the formation of suspension and for sediment transport. A mathematical model capable of predicting the state of the seabed near the coastline is needed to assess and ensure safety during the construction and operation of hydraulic engineering structures. Many models are used to predict the impact of storm waves on sediment transport processes, and models of the impact of tsunami waves are also actively developing. In recent years, the influence of intense internal waves on sedimentation processes has also attracted special interest. In this study we adapt one such model, based on the advection-diffusion equation, which allows resuspension processes under the influence of internal gravity waves in the coastal zone to be studied, to the solution of specific practical problems. During the numerical simulation, precomputed velocity values are substituted into the advection-diffusion equation for sediment concentration at each time step and each node of the computational grid. The velocity values are obtained by simulating the internal wave dynamics with the IGW Research software package, which numerically integrates the fully nonlinear two-dimensional (vertical plane) system of hydrodynamic equations for an inviscid incompressible stratified fluid in the Boussinesq approximation, taking into account the impact of the barotropic tide. To carry out numerical calculations in the IGW Research package it is necessary to set the initial velocity and density distributions in the computational domain, the bottom topography, the value of the Coriolis parameter and, if necessary, the parameters of the tidal wave. 
To initialize the background conditions of the numerical model we used data recorded in summer in the southern part of the shelf zone of Sakhalin Island from 1999 to 2003, provided by SakhNIRO, Russia. The process of assimilating field data into the numerical model is described in detail in our previous studies. It has been shown that the process of suspension formation is quite intense for the investigated conditions. The concentration of suspended particles increases significantly during the tide, especially over naturally uneven bottom relief and near the right boundary of the computational domain (near the shoreline). A pronounced nepheloid layer is produced, with a thickness of about 5.6 m. At the low-tide phase, sediment resuspension stops and suspended particles begin to settle because of the small vertical velocities; the thickness of the nepheloid layer is actively reduced. Obviously, this should lead to a change in the bottom relief. The presented results were obtained with the support of the Russian President's scholarship for young scientists and graduate students SP-2311.2016.5.
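The update described above (precomputed velocities substituted into the advection-diffusion equation at each step and grid node) can be sketched generically in one dimension with an explicit upwind-advection plus central-diffusion step. All parameters, the constant velocity, and the periodic boundaries are illustrative simplifications, not the IGW Research setup.

```python
import numpy as np

def step(c, u, D, dx, dt):
    """One explicit update of dc/dt + u dc/dx = D d2c/dx2 (u > 0, periodic)."""
    adv = -u * (c - np.roll(c, 1)) / dx                          # first-order upwind
    dif = D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx**2  # central diffusion
    return c + dt * (adv + dif)

nx, dx, dt = 200, 1.0, 0.2
u, D = 1.0, 0.5
assert u * dt / dx <= 1.0 and 2.0 * D * dt / dx**2 <= 1.0        # explicit stability
c = np.exp(-0.5 * ((np.arange(nx) - 50.0) * dx / 5.0) ** 2)      # initial plume at cell 50
mass0 = c.sum()
for _ in range(100):                                             # advect ~20 cells downstream
    c = step(c, u, D, dx, dt)
```

With periodic boundaries the scheme conserves total suspended mass while the plume spreads and drifts downstream, which is the qualitative behaviour described for the nepheloid layer.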

  13. Evaluation of the ADAPTIR System for Work Zone Traffic Control

    DOT National Transportation Integrated Search

    1999-11-01

    The ADAPTIR system (Automated Data Acquisition and Processing of Traffic Information in Real Time) uses variable message signs (VMS) equipped with radar units, along with a software program to interpret the data, to display appropriate warning and ad...

  14. On the classification of normalized natural frequencies for damage detection in cantilever beam

    NASA Astrophysics Data System (ADS)

    Dahak, Mustapha; Touat, Noureddine; Benseddiq, Noureddine

    2017-08-01

    The presence of damage in a beam causes changes in its physical properties, introducing flexibility and reducing the natural frequencies of the beam. Based on this, a new method is proposed to locate the damage zone in a cantilever beam. In this paper, the cantilever beam is discretized into a number of zones, where each zone has a specific classification of the first four normalized natural frequencies. The damaged zone is identified solely from the classification of the normalized frequencies of the structure. In the case where the damage is symmetric about a vibration node, we use the unchanged natural frequency as additional information to obtain a more accurate location. The effectiveness of the proposed method is demonstrated by numerical simulation with the ANSYS software and by experimental investigation of a cantilever beam with different damage cases.

  15. Controls on continental strain partitioning above an oblique subduction zone, Northern Andes

    NASA Astrophysics Data System (ADS)

    Schütt, Jorina M.; Whipp, David M., Jr.

    2016-04-01

    Strain partitioning is a common process at obliquely convergent plate margins dividing oblique convergence into margin-normal slip on the plate-bounding fault and horizontal shearing on a strike-slip system parallel to the subduction margin. In subduction zones, strain partitioning in the upper continental plate is mainly controlled by the shear forces acting on the plate interface and the strength of the continental crust. The plate interface forces are influenced by the subducting plate dip angle and the obliquity angle between the normal to the plate margin and the convergence velocity vector, and the crustal strength of the continent is strongly affected by the presence or absence of a volcanic arc, with the presence of the volcanic arcs being common at steep subduction zones. Along the ˜7000 km western margin of South America the convergence obliquity, subduction dip angles and presence of a volcanic arc all vary, but strain partitioning is only observed along parts of it. This raises the questions, to what extent do subduction zone characteristics control strain partitioning in the overriding continental plate, and which factors have the largest influence? We address these questions using lithospheric-scale 3D numerical geodynamic experiments to investigate the influence of subduction dip angle, convergence obliquity, and weaknesses in the crust owing to the volcanic arc on strain partitioning behavior. We base the model design on the Northern Volcanic Zone of the Andes (5° N - 2° S), characterized by steep subduction (˜ 35°), a convergence obliquity between 31° -45° and extensive arc volcanism, and where strain partitioning is observed. The numerical modelling software (DOUAR) solves the Stokes flow and heat transfer equations for a viscous-plastic creeping flow to calculate velocity fields, thermal evolution, rock uplift and strain rates in a 1600 km x 1600 km box with depth 160 km. 
Subduction geometry and material properties are based on a simplified, generic subduction zone similar to the northern Andes. The upper surface is initially defined to resemble the Andes, but is free to deform during the experiments. We consider two main model designs, one with and one without a volcanic arc (weak continental zone). A relatively high angle of convergence obliquity is predicted to favor strain partitioning, but preliminary model results show no strain partitioning for a uniform continental crustal strength with a friction angle of Φ = 15° . However, strain partitioning does occur when including a weak zone in the continental crust resulting from arc volcanic activity with Φ = 5° . This results in margin-parallel northeastward translation of a continental sliver at 3.2 cm/year. The presence of the sliver agrees well with observations of a continental sliver identified by GPS measurements in the Northern Volcanic Zone with a translation velocity of about 1 cm/year, though the GPS-derived velocity may not be representative of the long-term rate of translation depending on whether the observation period includes one or more seismic cycles. Regardless, the observed behavior is consistent with the observed earthquake focal mechanisms and GPS measurements, suggesting significant northeastward transport of Andean crust along the margin of the northern Andes.
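The kinematics of the partitioning described above reduce to a simple decomposition: convergence at velocity v with obliquity θ (measured from the margin normal) splits into a margin-normal component v·cos θ accommodated on the subduction thrust and a margin-parallel component v·sin θ available to drive the trench-parallel strike-slip system and translate the sliver. The velocity value below is illustrative.

```python
import numpy as np

def partition(v_cm_yr, obliquity_deg):
    """Split oblique convergence into margin-normal and margin-parallel parts."""
    th = np.radians(obliquity_deg)
    return v_cm_yr * np.cos(th), v_cm_yr * np.sin(th)

# Mid-range obliquity for the 31-45 degree interval quoted above (v assumed).
normal, parallel = partition(5.0, 38.0)
```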

  16. A root zone modelling approach to estimating groundwater recharge from irrigated areas

    NASA Astrophysics Data System (ADS)

    Jiménez-Martínez, J.; Skaggs, T. H.; van Genuchten, M. Th.; Candela, L.

    2009-03-01

    In irrigated semi-arid and arid regions, accurate knowledge of groundwater recharge is important for the sustainable management of scarce water resources. The Campo de Cartagena area of southeast Spain is a semi-arid region where irrigation return flow accounts for a substantial portion of recharge. In this study we estimated irrigation return flow using a root zone modelling approach in which irrigation, evapotranspiration, and soil moisture dynamics for specific crops and irrigation regimes were simulated with the HYDRUS-1D software package. The model was calibrated using field data collected in an experimental plot. Good agreement was achieved between the HYDRUS-1D simulations and field measurements made under melon and lettuce crops. The simulations indicated that water use by the crops was below potential levels despite regular irrigation. The fraction of applied water (irrigation plus precipitation) going to recharge ranged from 22% for a summer melon crop to 68% for a fall lettuce crop. In total, we estimate that irrigation of annual fruits and vegetables produces 26 hm³ y⁻¹ of groundwater recharge to the top unconfined aquifer. This estimate does not include important irrigated perennial crops in the region, such as artichoke and citrus. Overall, the results suggest a greater amount of irrigation return flow in the Campo de Cartagena region than was previously estimated.
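The recharge fractions quoted above follow from simple root-zone bookkeeping: the recharge fraction is deep drainage divided by total applied water (irrigation plus precipitation), with drainage taken as the residual of a seasonal water balance. The numbers below are illustrative, not the calibrated HYDRUS-1D results.

```python
def recharge_fraction(irrigation_mm, precipitation_mm, et_mm, storage_change_mm):
    """Seasonal root-zone water balance: the residual after evapotranspiration
    and storage change is deep drainage; return drainage / applied water."""
    applied = irrigation_mm + precipitation_mm
    drainage = applied - et_mm - storage_change_mm
    return drainage / applied

# Illustrative seasonal totals for a summer crop (values assumed).
f_melon = recharge_fraction(irrigation_mm=450.0, precipitation_mm=50.0,
                            et_mm=380.0, storage_change_mm=10.0)
```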

  17. Three Dimensional Modeling of Agricultural Contamination of Groundwater: a Case Study in the Nebraska Management Systems Evaluation Area (MSEA) Site

    NASA Astrophysics Data System (ADS)

    Akbariyeh, S.; Snow, D. D.; Bartelt-Hunt, S.; Li, X.; Li, Y.

    2015-12-01

    Contamination of groundwater from nitrogen fertilizers and pesticides in agricultural lands is an important environmental and water quality management issue. It is well recognized that in agriculturally intensive areas, fertilizers and pesticides may leach through the vadose zone and eventually reach groundwater, impacting future uses of this limited resource. While numerical models are commonly used to simulate fate and transport of agricultural contaminants, few models have been validated based on realistic three-dimensional soil lithology, hydrological conditions, and historical changes in groundwater quality. In this work, contamination of groundwater in the Nebraska Management Systems Evaluation Area (MSEA) site was simulated based on extensive field data including (1) lithology from 69 wells and 11 test holes; (2) surface soil type, land use, and surface elevations; (3) 5-year groundwater level and flow velocity; (4) daily meteorological monitoring; (5) 5-year seasonal irrigation records; (6) 5 years of spatially intensive contaminant concentration measurements in 40 multilevel monitoring wells; and (7) detailed cultivation records. Using these data, a three-dimensional vadose zone lithological framework was developed using a commercial software tool (Rockworks™). Based on the interpolated lithology, a hydrological model was developed using HYDRUS-3D to simulate water flow and contaminant transport. The model was validated through comparison of simulated atrazine and nitrate concentrations with historical data from 40 wells and multilevel samplers. The validated model will be used to predict potential changes in groundwater quality due to agricultural contamination under future climate scenarios in the High Plains Aquifer system.

  18. Development of the GPM Observatory Thermal Vacuum Test Model

    NASA Technical Reports Server (NTRS)

    Yang, Kan; Peabody, Hume

    2012-01-01

    A software-based thermal modeling process was documented for generating the thermal panel settings necessary to simulate worst-case on-orbit flight environments in an observatory-level thermal vacuum test setup. The method for creating such a thermal model involved four major steps: (1) determining the major thermal zones for test as indicated by the major dissipating components on the spacecraft, then mapping the major heat flows between these components; (2) finding the flight-equivalent sink temperatures for these test thermal zones; (3) determining the thermal test ground support equipment (GSE) design and initial thermal panel settings based on the equivalent sink temperatures; and (4) adjusting the panel settings in the test model to match heat flows and temperatures with the flight model. The observatory test thermal model developed from this process allows quick predictions of the performance of the thermal vacuum test design. In this work, the method described above was applied to the Global Precipitation Measurement (GPM) core observatory spacecraft, a joint project between NASA and the Japanese Aerospace Exploration Agency (JAXA) which is currently being integrated at NASA Goddard Space Flight Center for launch in early 2014. From preliminary results, the thermal test model generated from this process shows that the heat flows and temperatures match fairly well with the flight thermal model, indicating that the test model can fairly accurately simulate on-orbit conditions. However, further analysis is needed to determine the best test configuration possible to validate the GPM thermal design before the start of environmental testing later this year. Also, while this analysis method has been applied solely to GPM, it should be emphasized that the same process can be applied to any mission to develop an effective test setup and panel settings which accurately simulate on-orbit thermal environments.
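    Step (2), finding flight-equivalent sink temperatures, can be sketched with a simple radiative balance; the function and the numbers in the example are illustrative assumptions (a single gray surface with a view factor of 1), not GPM flight values.

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equivalent_sink_temperature(q_env, area, emissivity):
    """Flight-equivalent sink temperature for a radiating surface: the
    temperature of a blackbody enclosure that would deliver the same net
    absorbed environmental heat load q_env (W) to a surface of the given
    area (m^2) and IR emissivity, from eps * sigma * A * T^4 = q_env."""
    return (q_env / (SIGMA * emissivity * area)) ** 0.25
```

    For example, a radiator absorbing 200 W of environmental load over 1 m² with ε = 0.85 sees an equivalent sink near 254 K; matching the test panel's radiative exchange to that sink (step 3) gives the initial panel setting, which step (4) then adjusts against the flight model.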

  19. Parallax barrier engineering for image quality improvement in an autostereoscopic 3D display.

    PubMed

    Kim, Sung-Kyu; Yoon, Ki-Hyuk; Yoon, Seon Kyu; Ju, Heongkyu

    2015-05-18

    We present an image quality improvement in a parallax barrier (PB)-based multiview autostereoscopic 3D display system under real-time tracking of the positions of a viewer's eyes. The system presented exploits a parallax barrier engineered to offer significantly improved quality of three-dimensional images for a moving viewer without eyewear under dynamic eye tracking. The improved image quality includes enhanced uniformity of image brightness, reduced point crosstalk, and no pseudoscopic effects. We control the relative ratio between two parameters, i.e., the pixel size and the aperture of the parallax barrier slit, to improve uniformity of image brightness at a viewing zone. The eye tracking that monitors the positions of a viewer's eyes enables the pixel data control software to turn on only the pixels for view images near the viewer's eyes (the other pixels turned off), thus reducing point crosstalk. The eye-tracking-coupled software provides the correct images for the respective eyes, therefore producing no pseudoscopic effects at zone boundaries. The viewing zone can be spanned over an area larger than the central viewing zone offered by a conventional PB-based multiview autostereoscopic 3D display (no eye tracking). Our 3D display system also provides multiviews for motion parallax under eye tracking. More importantly, we demonstrate substantial reduction of point crosstalk of images at the viewing zone, its level being comparable to that of a commercialized eyewear-assisted 3D display system. The multiview autostereoscopic 3D display presented can greatly resolve the point crosstalk problem, one of the critical factors that have made it difficult for previous multiview autostereoscopic 3D display technologies to replace their eyewear-assisted counterparts.
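    The geometry behind a PB design can be sketched with the classic pitch and viewing-distance relations; the function and values are illustrative textbook relations, not the engineered barrier parameters of the paper.

```python
def barrier_design(subpixel_pitch, n_views, gap, eye_separation=65.0e-3):
    """Classic parallax-barrier design relations (all lengths in metres).
    Returns (optimum viewing distance, barrier pitch).
    d = e * g / p places adjacent subpixel images one eye separation
    apart; the barrier pitch is slightly smaller than n_views * p so
    that all pixel columns converge at the viewing distance."""
    d = eye_separation * gap / subpixel_pitch
    pitch = n_views * subpixel_pitch * d / (d + gap)
    return d, pitch
```

    For a hypothetical 0.1 mm subpixel pitch, four views, and a 2 mm barrier gap, this gives a design viewing distance of 1.3 m and a barrier pitch just under four subpixel widths, the slight reduction being what keeps all view columns converging on the viewer.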

  20. The first clinical application of planning software for laparoscopic microwave thermosphere ablation of malignant liver tumours

    PubMed Central

    Berber, Eren

    2015-01-01

    Background Liver tumour ablation is an operator-dependent procedure. The determination of the optimum needle trajectory and correct ablation parameters could be challenging. The aim of this study was to report the utility of a new, procedure planning software for microwave ablation (MWA) of liver tumours. Methods This was a feasibility study in a pilot group of five patients with nine metastatic liver tumours who underwent laparoscopic MWA. Pre-operatively, parameters predicting the desired ablation zones were calculated for each tumour. Intra-operatively, this planning strategy was followed for both antenna placement and energy application. Post-operative 2-week computed tomography (CT) scans were performed to evaluate complete tumour destruction. Results The patients had an average of two tumours (range 1–4), measuring 1.9 ± 0.4 cm (range 0.9–4.4 cm). The ablation time was 7.1 ± 1.3 min (range 2.5–10 min) at 100W. There were no complications or mortality. The patients were discharged home on post-operative day (POD) 1. At 2-week CT scans, there were no residual tumours, with a complete ablation demonstrated in all lesions. Conclusions This study describes and validates pre-treatment planning software for MWA of liver tumours. This software was found useful to determine precisely the ablation parameters and needle placement to create a predicted zone of ablation. PMID:25980481

  1. Characterization of the guinea pig animal model and subsequent comparison of the behavioral effects of selective dopaminergic drugs and methamphetamine

    PubMed Central

    Lee, Kiera-Nicole; Pellom, Samuel T.; Oliver, Ericka; Chirwa, Sanika

    2014-01-01

    Though not commonly used in behavior tests, guinea pigs may offer subtle behavior repertoires that better mimic human activity and warrant study. To test this, 31 Hartley guinea pigs (male, 200–250 g) were evaluated in PhenoTyper cages using the video-tracking EthoVision XT 7.0 software. Results showed that guinea pigs spent more time in the hidden zone (small box in corner of cage) than the food/water zone or arena zone. Guinea pigs exhibited thigmotaxis (a wall-following strategy) and were active throughout the light and dark phases. Eating and drinking occurred throughout the light and dark phases. An injection of 0.25 mg/kg SCH23390, a dopamine D1 receptor (D1R) antagonist, produced significant decreases in time spent in the hidden zone. There were no significant changes in time spent in the hidden zone for guinea pigs treated with 7.5 mg SKF38393 (a D1R agonist), 1.0 mg/kg sulpiride (a D2R antagonist), or 1.0 or 10.0 mg/kg methamphetamine. Locomotor activity profiles were unchanged after injections of saline, SKF38393, SCH23390 and sulpiride. By contrast, a single injection or repeated administration for 7 days of low-dose methamphetamine induced transient hyperactivity, but this declined to baseline levels over the 22-hour observation period. Guinea pigs treated with high-dose methamphetamine displayed sustained hyperactivity and travelled significantly greater distances over the circadian cycle. Subsequent 7-day treatment with high-dose methamphetamine induced motor sensitization and significant increases in total distances moved relative to single drug injections or saline controls. These results highlight the versatility and unique features of the guinea pig for studying brain-behavior interactions. PMID:24436154
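    The zone-occupancy measure reported (time spent in the hidden, food/water, and arena zones) can be sketched from tracked coordinates as follows; the zone boxes and sampling interval are hypothetical, not EthoVision's actual data format.

```python
def time_in_zones(track, zones, dt):
    """Zone occupancy from a video-tracked position series: track is a
    list of (x, y) centre points sampled every dt seconds; zones maps a
    zone name to its axis-aligned bounding box (xmin, ymin, xmax, ymax).
    Points falling inside no box are counted as 'arena'."""
    totals = {name: 0.0 for name in zones}
    totals["arena"] = 0.0
    for x, y in track:
        for name, (x0, y0, x1, y1) in zones.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dt
                break
        else:
            totals["arena"] += dt
    return totals
```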

  2. A Model for Assessing the Liability of Seemingly Correct Software

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.; Voas, Larry K.; Miller, Keith W.

    1991-01-01

    Current research on software reliability does not lend itself to quantitatively assessing the risk posed by a piece of life-critical software. Black-box software reliability models are too general and make too many assumptions to be applied confidently to assessing the risk of life-critical software. We present a model for assessing the risk caused by a piece of software; this model combines software testing results and Hamlet's probable correctness model. We show how this model can assess software risk for those who insure against a loss that can occur if life-critical software fails.

  3. FRACOR-software toolbox for deterministic mapping of fracture corridors in oil fields on AutoCAD platform

    NASA Astrophysics Data System (ADS)

    Ozkaya, Sait I.

    2018-03-01

    Fracture corridors are interconnected large fractures in a narrow, subvertical, tabular array, which usually traverse the entire reservoir vertically and extend for several hundreds of meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages, with layering and editing capabilities, can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, from a combination of static and dynamic data, and exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual porosity simulation, or used for field development and well planning.

  4. Visibility of bony structures around hip prostheses in dual-energy CT: With or without metal artefact reduction software.

    PubMed

    Jeong, Jewon; Kim, Hyun-Joo; Oh, Eunsun; Cha, Jang Gyu; Hwang, Jiyoung; Hong, Seong Sook; Chang, Yun Woo

    2018-05-23

    The development of dual-energy CT (DECT) and metal artefact reduction software (MARS) provides a further chance of reducing metal-related artefacts. However, there have been only a few studies regarding whether MARS practically affects the visibility of structures around a metallic hip prosthesis on post-operative CT evaluation. Twenty-seven patients with 42 metallic hip prostheses underwent DECT. The datasets were reconstructed at 70, 90 and 110 keV with and without MARS. The areas were classified into 10 zones according to the reference zone. All the images were reviewed in terms of the severity of the beam-hardening artefacts, differentiation of the bony cortex and trabeculae, and visualization of trabecular patterns with a three-point scale. The metallic screw diameter was measured in the acetabulum on 110 keV images. The scores were the worst on 70 keV images without MARS [mean scores: 1.84-4.22 (p < 0.001-1.000)]. The structures in zone II were best visualized at 110 keV (p < 0.001-0.011, mean scores: 2.86-5.22). In the other zones, mean scores were generally similar with or without MARS (p < 0.001-0.920). The mean diameter of the screw was 5.85 mm without MARS and 3.44 mm with MARS (mean reference diameter: 6.48 mm). The 110 keV images without MARS are best for evaluating acetabular zone II. The visibility of the bony structures around the hip prosthesis is similar in the other zones with or without MARS, regardless of keV. MARS may not be needed for the evaluation of the metallic hip prosthesis itself at sufficiently high energy levels; however, MARS still has a role in the evaluation of other soft tissues around the prosthesis. © 2018 The Royal Australian and New Zealand College of Radiologists.

  5. Static behavior of the weld in the joint of the steel support element using experiment and numerical modeling

    NASA Astrophysics Data System (ADS)

    Krejsa, M.; Brozovsky, J.; Mikolasek, D.; Parenica, P.; Koubova, L.

    2018-04-01

    The paper focuses on the numerical modeling of welded steel bearing elements using the commercial software system ANSYS, which is based on the finite element method (FEM). It is important to check and compare the results of the FEM analysis with the results of a physical verification test, in which the real behavior of the bearing element can be observed. The results of the comparison can be used for calibration of the computational model. The article deals with the physical test of steel supporting elements, whose main purpose is to obtain the material, geometry and strength characteristics of the fillet and butt welds, including the heat-affected zone in the base material of the welded steel bearing element. A pressure test was performed during the experiment, wherein the total load value and the corresponding deformation of the specimens under load were monitored. The obtained data were used for the calibration of numerical models of the test samples and are necessary for further stress and strain analysis of steel supporting elements.

  6. Infill Walls Contribution on the Progressive Collapse Resistance of a Typical Mid-rise RC Framed Building

    NASA Astrophysics Data System (ADS)

    Besoiu, Teodora; Popa, Anca

    2017-10-01

    This study investigates the effect of autoclaved aerated concrete infill walls on the progressive collapse resistance of a typical RC framed structure. The 13-storey building located in Brăila (a zone with high seismic risk in Romania) was designed according to the former Romanian seismic code P13-70 (1970). Two models of the structure are generated in the Extreme Loading® for Structures computer software: a model with infill walls and a model without infill walls. Following the GSA (2003) Guidelines, a nonlinear dynamic procedure is used to determine the progressive collapse risk of the building when a first-storey corner column is suddenly removed. It was found that the structure is not expected to fail under the standard GSA loading: DL + 0.25LL. Moreover, if the infill walls are introduced in the model, the maximum vertical displacement of the node above the removed column is reduced by about 48%.

  7. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    NASA Astrophysics Data System (ADS)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high-resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved mega-thrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.

  8. Hindcasting of Storm Surges, Currents, and Waves at Lower Delaware Bay during Hurricane Isabel

    NASA Astrophysics Data System (ADS)

    Salehi, M.

    2017-12-01

    Hurricanes are a major threat to coastal communities and infrastructures, including nuclear power plants located in low-lying coastal zones. In response, their sensitive elements should be protected by smart design to withstand the drastic impact of such natural phenomena. Accurate and reliable estimates of hurricane attributes are the first step in that effort. Numerical models have grown extensively over the past few years and are effective tools for modeling large-scale natural events such as hurricanes. The impact of low-probability hurricanes on the lower Delaware Bay is investigated using dynamically coupled meteorological, hydrodynamic, and wave components of the Delft3D software. Efforts are made to significantly reduce the computational burden of performing such analysis for the industry, while keeping the same level of accuracy at the area of study (AOS). The model is comprised of overall and nested domains. The overall model domain includes portions of the Atlantic Ocean and the Delaware and Chesapeake bays. The nested model domain includes Delaware Bay, its floodplain, and a portion of the continental shelf. This study is part of a larger modeling effort to study the impact of low-probability hurricanes on sensitive infrastructures located in coastal zones prone to hurricane activity. The AOS is located on the east bank of Delaware Bay, almost 16 miles upstream of its mouth. Model-generated wind speed, significant wave height, water surface elevation, and current are calibrated for hurricane Isabel (2003). The model calibration results agreed reasonably well with field observations. Furthermore, the sensitivity of surge and wave responses to various hurricane parameters was tested. In line with findings from other researchers, the accuracy of the wind field played a major role in hindcasting the hurricane attributes.

  9. 78 FR 23690 - Airworthiness Directives; The Boeing Company

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... management system (CMS) configuration database; and installing new operational program software (OPS) for the CSCP, zone management unit (ZMU), passenger address controller, cabin interphone controller, cabin area... on the Internet at http://www.regulations.gov ; or in person at the Docket Management Facility...

  10. Numerical study of wall shear stress-based descriptors in the human left coronary artery.

    PubMed

    Pinto, S I S; Campos, J B L M

    2016-10-01

    The present work concerns the application of wall shear stress descriptors - time-averaged wall shear stress (TAWSS), oscillatory shear index (OSI) and relative residence time (RRT) - to the study of blood flow in the left coronary artery (LCA). These descriptors aid the prediction of disturbed flow conditions in the vessels and play a significant role in the detection of potential zones of atherosclerosis development. Hemodynamic descriptor data were obtained numerically, through ANSYS® software, for the LCA of a patient-specific geometry and for a 3D idealized model. Comparing both cases, the results are coherent in terms of location and magnitude. Low TAWSS, high OSI and high RRT values are observed in the bifurcation - a potential zone of atherosclerosis appearance. The dissimilarities observed in the TAWSS values when considering blood as a Newtonian or non-Newtonian fluid reveal the importance of correct blood rheological characterization. Moreover, for a higher Reynolds number, the TAWSS values decrease in the bifurcation and along the LAD branch, increasing the probability of plaque deposition. Furthermore, for a stenotic LCA model, very low TAWSS and high RRT values are observed in front of and behind the stenosis, indicating the probable extension of the lesion in the flow direction.
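    The three descriptors have standard definitions that can be computed directly from a wall shear stress time series over one cardiac cycle; a minimal sketch with an illustrative sampling, not the paper's ANSYS® output:

```python
import numpy as np

def wss_descriptors(tau, dt):
    """Wall shear stress descriptors from a time series of the WSS
    vector tau (shape [T, 3]) sampled at interval dt over one cycle:
      TAWSS = (1/T) * integral |tau| dt
      OSI   = 0.5 * (1 - |integral tau dt| / integral |tau| dt)
      RRT   = 1 / ((1 - 2*OSI) * TAWSS)"""
    period = dt * tau.shape[0]
    mag_int = np.sum(np.linalg.norm(tau, axis=1)) * dt
    vec_int = np.linalg.norm(np.sum(tau, axis=0) * dt)
    tawss = mag_int / period
    osi = 0.5 * (1.0 - vec_int / mag_int)
    rrt = 1.0 / ((1.0 - 2.0 * osi) * tawss)
    return tawss, osi, rrt
```

    A purely unidirectional flow gives OSI = 0; as the flow direction oscillates, OSI approaches 0.5 and RRT grows, which is why high OSI and RRT flag likely atherosclerosis-prone zones.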

  11. The role of fault surface geometry in the evolution of the fault deformation zone: comparing modeling with field example from the Vignanotica normal fault (Gargano, Southern Italy).

    NASA Astrophysics Data System (ADS)

    Maggi, Matteo; Cianfarra, Paola; Salvini, Francesco

    2013-04-01

    Faults have a (brittle) deformation zone that can be described as comprising two distinctive zones: an internal fault core (FC) and an external fault damage zone (FDZ). The FC is characterized by grinding processes that comminute the rock grains to a final grain-size distribution characterized by the prevalence of smaller grains over larger ones, represented by high fractal dimensions (up to 3.4). On the other hand, the FDZ is characterized by a network of fracture sets with characteristic attitudes (i.e. Riedel cleavages). This deformation pattern has important consequences for rock permeability. The FC often represents a hydraulic barrier, while the FDZ, with its fracture connectivity, represents a zone of higher permeability. The observation of faults reveals that the dimensions and characteristics of the FC and FDZ vary both in intensity and extent along them. One of the controlling factors in FC and FDZ development is the fault plane geometry. By changing its attitude, the fault plane geometry locally alters the stress components produced by the fault kinematics, and their combination with the bulk boundary conditions (regional stress field, fluid pressure, rock rheology) is responsible for the development of zones of higher and lower fracture intensity with variable extension along the fault planes. Furthermore, the displacement along faults produces a cumulative deformation pattern that varies through time. Modeling of the fault evolution through time (4D modeling) is therefore required to fully describe the fracturing and hence the permeability. In this presentation we show a methodology developed to predict the distribution of fracture intensity by integrating seismic data and numerical modeling. Fault geometry is carefully reconstructed by interpolating stick lines from interpreted seismic sections converted to depth. The modeling is based on a mixed numerical/analytical method. The fault surface is discretized into cells with their geometric and rheological characteristics.
    For each cell, the acting stress and strength are computed by analytical laws (Coulomb failure). The total brittle deformation for each cell is then computed by cumulating the brittle failure values along the path of each cell belonging to one side onto the facing one. The brittle failure value is provided by the DF function, that is, the difference between the computed shear stress and the strength of the cell at each step along its path, using the in-house-developed Frap software. The widths of the FC and the FDZ are computed as a function of the DF distribution and the displacement around the fault. This methodology has been successfully applied to model the brittle deformation pattern of the Vignanotica normal fault (Gargano, Southern Italy), where fracture intensity is expressed by the dimensionless H/S ratio, representing the ratio between the dimension and the spacing of homologous fracture sets (i.e., groups of parallel fractures that can be ascribed to the same event/stage/stress field).
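    The per-cell computation described (a DF value equal to the computed shear stress minus the Coulomb strength, cumulated along each cell's path) can be sketched as follows; parameter values are illustrative, not those used in Frap:

```python
def coulomb_df(shear_stress, normal_stress, cohesion, friction_coeff,
               pore_pressure=0.0):
    """DF value for one fault cell at one step: computed shear stress
    minus Coulomb strength, tau_s = c + mu * (sigma_n - p).
    DF > 0 means the cell fails at this step."""
    strength = cohesion + friction_coeff * (normal_stress - pore_pressure)
    return shear_stress - strength

def cumulative_brittle_deformation(path, cohesion, friction_coeff):
    """Cumulate positive DF contributions along a cell's displacement
    path; path is a list of (shear, normal) stress pairs experienced
    against the facing fault wall (stresses in consistent units)."""
    return sum(max(0.0, coulomb_df(s, n, cohesion, friction_coeff))
               for s, n in path)
```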

  12. Integrated Flight Path Planning System and Flight Control System for Unmanned Helicopters

    PubMed Central

    Jan, Shau Shiun; Lin, Yu Hsiang

    2011-01-01

    This paper focuses on the design of an integrated navigation and guidance system for unmanned helicopters. The integrated navigation system comprises two systems: the Flight Path Planning System (FPPS) and the Flight Control System (FCS). The FPPS finds the shortest flight path by the A-Star (A*) algorithm in an adaptive manner for different flight conditions, and the FPPS can add a forbidden zone to stop the unmanned helicopter from crossing over into dangerous areas. In this paper, the FPPS computation time is reduced by the multi-resolution scheme, and the flight path quality is improved by the path smoothing methods. Meanwhile, the FCS includes the fuzzy inference systems (FISs) based on the fuzzy logic. By using expert knowledge and experience to train the FIS, the controller can operate the unmanned helicopter without dynamic models. The integrated system of the FPPS and the FCS is aimed at providing navigation and guidance to the mission destination and it is implemented by coupling the flight simulation software, X-Plane, and the computing software, MATLAB. Simulations are performed and shown in real time three-dimensional animations. Finally, the integrated system is demonstrated to work successfully in controlling the unmanned helicopter to operate in various terrains of a digital elevation model (DEM). PMID:22164029
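    The FPPS's shortest-path search with forbidden zones can be sketched with a standard grid-based A* (Manhattan heuristic); this is a generic illustration, not the paper's adaptive multi-resolution implementation:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid; cells marked 1 are forbidden zones the
    planner must route around. Returns the path as a list of (row, col)
    tuples, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible
    open_set = [(h(start), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, node, parent = heapq.heappop(open_set)
        if node in came_from:          # already expanded with a better g
            continue
        came_from[node] = parent
        if node == goal:               # reconstruct path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nxt = (nr, nc)
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_best.get(nxt, float("inf")):
                    g_best[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, node))
    return None
```

    Marking a forbidden zone simply means setting its cells to 1; the search then detours around it, mirroring how the FPPS keeps the helicopter out of dangerous areas.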

  14. JaMBES: A "New" Way of Calculating Plate Tectonic Reconstruction

    NASA Astrophysics Data System (ADS)

    Chambord, A. I.; Smith, E. G. C.; Sutherland, R.

    2014-12-01

    Calculating the paleoposition of tectonic plates using marine geophysical data has usually been done using the Hellinger criterion [Hellinger, 1981]. However, for the Hellinger software [Kirkwood et al., 1999] to produce stable results, we find that the input data must be abundant and spatially well distributed. Although magnetic anomaly and fracture zone data have become increasingly abundant since the 1960s, some parts of the globe remain too sparsely explored to provide enough data for the Hellinger code to produce satisfactory rotations. In this poster, we present new software to calculate the paleopositions of tectonic plates using magnetic anomalies and fracture zone data. Our method is based on the theory of plate tectonics as introduced by [Bullard et al., 1965] and [Morgan, 1968], which states that ridge segments (i.e. magnetic lineations) and fracture zones are at right angles to each other. In order to test our software, we apply it to a region of the world where climatic conditions hinder the acquisition of magnetic data: the Southwest Pacific, between New Zealand and Antarctica, from breakup time to chron 20 (c. 43 Ma). Bullard, E., J. E. Everett, and A. G. Smith (1965), The fit of continents around the Atlantic, Philosophical Transactions of the Royal Society of London, Series A: Mathematical and Physical Sciences, 258(1088), 41-51. Hellinger, S. J. (1981), The uncertainties of finite rotations in plate tectonics, Journal of Geophysical Research, 86(B10), 9312-9318. Kirkwood, B. H., J. Y. Royer, T. C. Chang, and R. G. Gordon (1999), Statistical tools for estimating and combining finite rotations and their uncertainties, Geophysical Journal International, 137(2), 408-428. Morgan, W. J. (1968), Rises, trenches, great faults, and crustal blocks, Journal of Geophysical Research, 73(6), 1959-1982.
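    The core operation in any plate reconstruction code, whichever fitting criterion is used, is applying a finite rotation about an Euler pole; a minimal sketch using Rodrigues' rotation formula (function names are illustrative):

```python
import math

def to_xyz(lat, lon):
    """Unit vector on the sphere from latitude/longitude in degrees."""
    la, lo = math.radians(lat), math.radians(lon)
    return (math.cos(la) * math.cos(lo),
            math.cos(la) * math.sin(lo),
            math.sin(la))

def to_latlon(v):
    x, y, z = v
    return math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))

def rotate(point_latlon, pole_latlon, angle_deg):
    """Finite rotation of a point about an Euler pole by angle_deg
    (Rodrigues' formula): v = p cos a + (k x p) sin a + k (k.p)(1-cos a)."""
    p = to_xyz(*point_latlon)
    k = to_xyz(*pole_latlon)
    a = math.radians(angle_deg)
    dot = sum(pi * ki for pi, ki in zip(p, k))
    cross = (k[1] * p[2] - k[2] * p[1],
             k[2] * p[0] - k[0] * p[2],
             k[0] * p[1] - k[1] * p[0])
    v = tuple(p[i] * math.cos(a) + cross[i] * math.sin(a)
              + k[i] * dot * (1 - math.cos(a)) for i in range(3))
    return to_latlon(v)
```

    Fitting a reconstruction then amounts to searching for the pole and angle that best restore conjugate magnetic lineations and fracture zones to a common position.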

  15. Modeling SMAP Spacecraft Attitude Control Estimation Error Using Signal Generation Model

    NASA Technical Reports Server (NTRS)

    Rizvi, Farheen

    2016-01-01

    Two ground simulation software packages are used to model the SMAP spacecraft dynamics. The CAST software uses a higher fidelity model than the ADAMS software. The ADAMS software models the spacecraft plant, controller and actuator models, and assumes a perfect sensor and estimator model. In this simulation study, the spacecraft dynamics results from the ADAMS software are used, as the CAST software is unavailable. The main source of spacecraft dynamics error in the higher fidelity CAST software is the estimation error. A signal generation model is developed to capture the effect of this estimation error on the overall spacecraft dynamics. This signal generation model is then included in the ADAMS software spacecraft dynamics estimate such that the results are similar to CAST. The signal generation model has characteristics (mean, variance and power spectral density) similar to those of the true CAST estimation error. In this way, the ADAMS software can still be used while capturing the higher fidelity spacecraft dynamics modeling from the CAST software.
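    A signal generation model that matches a target mean, variance, and low-frequency PSD shape can be sketched as filtered white noise; the AR(1) form below is an illustrative assumption, not the actual SMAP estimation-error model:

```python
import numpy as np

def estimation_error_signal(n, mean, std, alpha, seed=0):
    """Synthetic estimation-error signal: AR(1)-filtered white noise,
    rescaled to the target mean and standard deviation. alpha in [0, 1)
    sets the low-frequency content of the PSD (alpha = 0 gives white
    noise; larger alpha concentrates power at low frequencies)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = w[0]
    for i in range(1, n):
        x[i] = alpha * x[i - 1] + w[i]   # first-order shaping filter
    x = (x - x.mean()) / x.std()          # normalize, then impose stats
    return mean + std * x
```

    In practice alpha (or a higher-order filter) would be tuned so the synthetic signal's PSD matches the measured estimation-error PSD before injecting it into the lower-fidelity simulation.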

  16. Influence of operating conditions on the air gasification of dry refinery sludge in updraft gasifier

    NASA Astrophysics Data System (ADS)

    Ahmed, R.; Sinnathambi, C. M.

    2013-06-01

    In the present work, details of the equilibrium modeling of dry refinery sludge (DRS) gasification in an updraft gasifier are presented using the ASPEN PLUS simulator. Given the lack of available information in the open literature on refinery sludge gasification in updraft gasifiers, an evaluation of its optimum gasification conditions is presented in this paper. For this purpose, a Taguchi orthogonal array design (statistical software) was applied to find the optimum conditions for DRS gasification. The goal is to identify the most significant process variables in DRS gasification. The process variables examined include oxidation zone temperature, equivalence ratio and operating pressure. Attention was focused on the effect of the operating conditions on the gas composition of H2 and CO (desirable) and CO2 (undesirable) in terms of mass fraction. From our results and findings it can be concluded that the syngas (H2 and CO) yield, in terms of mass fraction, is favored by a high oxidation zone temperature and atmospheric pressure, while CO2 acid gas is favored by a high equivalence ratio and air flow rate, which drive the process towards complete combustion.
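    The Taguchi-style screening described, ranking process variables by their main effects, can be sketched as follows; the L4 array and responses in the example are illustrative, not the paper's data:

```python
import numpy as np

def main_effects(levels, response):
    """Main-effect analysis for an orthogonal-array experiment:
    levels is a (runs x factors) array of level codes and response the
    measured output per run. Returns, for each factor, a dict mapping
    level -> mean response at that level; the factor with the largest
    range between level means is the most significant."""
    levels = np.asarray(levels)
    effects = []
    for j in range(levels.shape[1]):
        effects.append({lv: float(np.mean([response[i]
                                           for i in range(len(response))
                                           if levels[i, j] == lv]))
                        for lv in np.unique(levels[:, j])})
    return effects
```

    Because the array is orthogonal, each level of each factor is averaged over a balanced set of runs, so the level means isolate that factor's contribution.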

  17. Human Cases of Tularemia in Armenia, 1996-2012.

    PubMed

    Melikjanyan, Syuzanna; Palayan, Karo; Vanyan, Artavazd; Avetisyan, Lilit; Bakunts, Nune; Kotanyan, Marine; Guerra, Marta

    2017-09-01

    A retrospective analysis was conducted of human cases and outbreaks of tularemia in the Republic of Armenia from 1996 to 2012 utilizing geographic information system software. A total of 266 human cases of tularemia were recorded in Armenia from 1996 to 2012, with yearly incidence ranging from 0 to 5.5 cases per 100,000 people. Cases predominantly affected the male population (62.8%), the 11-20 year age group (37.2%), agricultural workers (49.6%), and persons residing in rural areas (93.6%). In 2003, a waterborne outbreak involving 158 cases occurred in Kotayk Marz, and in 2007, a foodborne outbreak with 17 cases occurred in Gegharkunik Marz, attributed to exposure of food products to contaminated hay. Geospatial analysis of all cases showed that the majority were associated with the steppe vegetation zone, elevations between 1,400 and 2,300 m, and the climate zone associated with dry, warm summers and cold winters. Characterization of these environmental factors was used to develop a predictive risk model to improve surveillance and outbreak response for tularemia in Armenia.
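    The incidence figures above follow from the standard rate computation, sketched below; the population denominator in the test is hypothetical and is not Armenia's actual census figure.

```python
def incidence_per_100k(cases, population):
    """Annual incidence rate per 100,000 population, the measure used
    to report the 0-5.5 range in the abstract."""
    return 100000.0 * cases / population
```

For example, 165 cases in a hypothetical population of 3,000,000 would give 5.5 per 100,000, the upper end of the reported range.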

  18. Probabilistic Modeling and Evaluation of Surf Zone Injury Occurrence along the Delaware Coast

    NASA Astrophysics Data System (ADS)

    Doelp, M.; Puleo, J. A.

    2017-12-01

    Beebe Healthcare in Lewes, DE collected surf zone injury (SZI) data along the Delaware coast for seven summer seasons from 2010 through 2016. Data include, but are not limited to, time of injury, gender, age, and activity. Over 2,000 injuries were recorded over the seven-year period, including 116 spinal injuries and three fatalities. These injuries are predominantly wave-related incidents involving wading (41%), bodysurfing (26%), and body-boarding (20%). Despite the large number of injuries, beach-associated hazards do not receive the same level of awareness as rip currents. Injury population statistics revealed that those between the ages of 11 and 15 suffered the greatest proportion of injuries (18.8%). Male water users were twice as likely to sustain injury as their female counterparts, and non-locals were roughly six times more likely to sustain injury than locals. In 2016, five or more injuries occurred on 18.5% of the days sampled, and no injuries occurred on 31.4% of the sampled days. The episodic nature of injury occurrence and the population statistics indicate the importance of environmental conditions and human behavior in surf zone injuries. Higher-order statistics are necessary to effectively assess SZI causes and the likelihood of occurrence on a particular day. A Bayesian network using Netica software (Norsys) was constructed to model SZI and predict changes in injury likelihood on an hourly basis. The network incorporates environmental data collected by weather stations, NDBC buoy #44009, the USACE buoy at Bethany Beach, and research personnel on the beach. The Bayesian model includes prior (e.g., historic) information to infer relationships between the provided parameters. Sensitivity analysis determined that the variables most influential to injury likelihood are population, water temperature, nearshore wave height, beach slope, and day of the week. Forecasting during the 2017 summer season will test the model's ability to predict injury likelihood.
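    The core inference behind such a model can be illustrated with Bayes' rule in odds form. This naive sketch assumes the evidence factors are independent, whereas a full Bayesian network like the Netica model above also encodes dependencies between variables; the likelihood-ratio values used in the test are invented for illustration.

```python
def posterior_probability(prior, likelihood_ratios):
    """Update a prior injury probability with a set of independent
    evidence factors, each given as a likelihood ratio
    P(evidence | injury) / P(evidence | no injury).
    Naive-Bayes sketch: a real Bayesian network also models
    dependencies between the evidence variables."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr  # each factor scales the odds of injury
    return odds / (1.0 + odds)
```

A factor with likelihood ratio above 1 (say, a warm weekend day with high beach population) raises the predicted injury probability; a ratio below 1 lowers it.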

  19. The Collaborative Information Portal and NASA's Mars Exploration Rover Mission

    NASA Technical Reports Server (NTRS)

    Mak, Ronald; Walton, Joan

    2005-01-01

    The Collaborative Information Portal was enterprise software developed jointly by the NASA Ames Research Center and the Jet Propulsion Laboratory for NASA's Mars Exploration Rover mission. Mission managers, engineers, scientists, and researchers used this Internet application to view current staffing and event schedules, download data and image files generated by the rovers, receive broadcast messages, and get accurate times in various Mars and Earth time zones. This article describes the features, architecture, and implementation of this software, and concludes with lessons we learned from its deployment and a look towards future missions.
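    The Mars/Earth timekeeping mentioned above can be sketched with the standard Mars Sol Date conversion. This is a hedged approximation, not CIP's actual implementation: the formulation follows Allison & McEwen (2000) as popularized by NASA's Mars24 tool, and the TT-UTC offset is an assumption that changes with leap seconds.

```python
def mars_time(unix_seconds, tt_minus_utc=69.184):
    """Approximate Mars Sol Date (MSD) and Coordinated Mars Time
    (MTC, in hours) from a Unix timestamp, per the Allison & McEwen
    (2000) formulation. tt_minus_utc is the TT-UTC offset in seconds
    (assumed constant here; in reality it changes with leap seconds)."""
    jd_utc = unix_seconds / 86400.0 + 2440587.5
    jd_tt = jd_utc + tt_minus_utc / 86400.0
    msd = (jd_tt - 2451549.5) / 1.0274912517 + 44796.0 - 0.00096
    mtc_hours = (msd % 1.0) * 24.0
    # local mean solar time at west longitude lon (degrees):
    #   (mtc_hours - lon / 15.0) % 24.0
    return msd, mtc_hours
```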

  20. DESPOTIC - a new software library to Derive the Energetics and SPectra of Optically Thick Interstellar Clouds

    NASA Astrophysics Data System (ADS)

    Krumholz, Mark R.

    2014-01-01

    I describe DESPOTIC, a code to Derive the Energetics and SPectra of Optically Thick Interstellar Clouds. DESPOTIC represents such clouds using a one-zone model, and can calculate line luminosities, line cooling rates, and in restricted cases line profiles using an escape probability formalism. It also includes approximate treatments of the dominant heating, cooling and chemical processes for the cold interstellar medium, including cosmic ray and X-ray heating, grain photoelectric heating, heating of the dust by infrared and ultraviolet radiation, thermal cooling of the dust, collisional energy exchange between dust and gas, and a simple network for carbon chemistry. Based on these heating, cooling and chemical rates, DESPOTIC can calculate clouds' equilibrium gas and dust temperatures, equilibrium carbon chemical state and time-dependent thermal and chemical evolution. The software is intended to allow rapid and interactive calculation of clouds' characteristic temperatures, identification of their dominant heating and cooling mechanisms and prediction of their observable spectra across a wide range of interstellar environments. DESPOTIC is implemented as a PYTHON package, and is released under the GNU General Public License.
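    The escape probability formalism mentioned above can be illustrated with one standard form of beta(tau). The exact expression is geometry-dependent, so the function below is a generic sketch rather than DESPOTIC's actual implementation.

```python
import math

def escape_probability(tau):
    """Photon escape probability beta(tau) = (1 - exp(-tau)) / tau,
    a standard escape-probability form for line transfer. At low
    optical depth nearly all photons escape (beta -> 1); at high
    optical depth beta falls off as 1/tau."""
    if abs(tau) < 1e-8:
        return 1.0 - tau / 2.0  # series expansion avoids 0/0 at tau = 0
    return (1.0 - math.exp(-tau)) / tau
```

In an escape-probability treatment, line cooling rates scale with beta(tau), which is what lets a one-zone model handle optically thick clouds without solving full radiative transfer.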

  1. Astroinformatics as a New Research Field. UkrVO Astroinformation Resources: Tasks and Prospective

    NASA Astrophysics Data System (ADS)

    Vavilova, I. B.

    Data-oriented astronomy has allowed astroinformatics to be classified as a new academic research field, which covers various multi-disciplinary applications of e-Astronomy. Among them are data modeling, data mining, metadata standards development, data access, digital astronomical databases, image archives and visualization, machine learning, statistics, and other computational methods and software for working with astronomical surveys and catalogues and their tera- to peta-scale astroinformation resources. In this review we briefly describe the astroinformatics applications and software/services developed for different astronomical tasks in the framework of the Virtual Roentgen and Gamma Observatory (VIRGO) and the Ukrainian Virtual Observatory (UkrVO). Among them are projects based on the archival space-borne data of X-ray and gamma space observatories and on the Joint Digitized Archive (JDA) database of astroplate network collections. The UkrVO JDA DR1 deals with star catalogues (FON, polar zone, open clusters, GRB star fields), while the UkrVO JDA DR2 deals with Solar System bodies (giant and small planets, satellites, astronomical heritage images).

  2. Incidence and Outcomes of Optical Zone Enlargement and Recentration After Previous Myopic LASIK by Topography-Guided Custom Ablation.

    PubMed

    Reinstein, Dan Z; Archer, Timothy J; Carp, Glenn I; Stuart, Alastair J; Rowe, Elizabeth L; Nesbit, Andrew; Moore, Tara

    2018-02-01

    To report the incidence, visual and refractive outcomes, optical zone enlargement, and recentration using topography-guided CRS-Master TOSCA II software with the MEL 80 excimer laser (Carl Zeiss Meditec AG, Jena, Germany) after primary myopic laser refractive surgery. Retrospective analysis of 73 eyes (40 patients) with complaints of night vision disturbances due to either a decentration or small optical zone following a primary myopic laser refractive surgery procedure using the MEL 80 laser. Multiple ATLAS topography scans were imported into the CRS-Master software for topography-guided ablation planning. The topography-guided re-treatment procedure was performed as either a LASIK flap lift, a new LASIK flap, a side cut only, or photorefractive keratectomy. Axial curvature maps were analyzed using a fixed grid and set of concentric circles superimposed to measure the topographic optical zone diameter and centration. Follow-up was 12 months. The incidence of use in the population of myopic treatments during the study period was 0.79% (73 of 9,249). The optical zone diameter was increased by 11% from a mean of 5.65 to 6.32 mm, with a maximum change of 2 mm in one case. Topographic decentration was reduced by 64% from a mean of 0.58 to 0.21 mm. There was a 44% reduction in spherical aberration, 53% reduction in coma, and 39% reduction in total higher order aberrations. A subjective improvement in night vision symptoms was reported by 93%. Regarding efficacy, 82% of eyes reached 20/20 and 100% reached 20/32 (preoperative CDVA was 20/20 or better in 90%). Regarding safety, no eyes lost two lines of CDVA and 27% gained one line. Regarding predictability, 71% of re-treatments were within ±0.50 diopters. Topography-guided ablation was effective in enlarging the optical zone, recentering the optical zone, and reducing higher order aberrations. 
Topography-guided custom ablation appears to be an effective method for re-treatment procedures of symptomatic patients after myopic LASIK. [J Refract Surg. 2018;34(2):121-130.]. Copyright 2018, SLACK Incorporated.

  3. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  4. What controls deformation in a bent three-dimensional orogen? An example from the Bolivian Andes

    NASA Astrophysics Data System (ADS)

    Kaislaniemi, L.; Whipp, D. M., Jr.

    2017-12-01

    The width of orogens is thought to be affected by both erosional intensity and the strength of the rocks. Along-strike variation of the orogen width can be expected to reflect shifts in these factors. An example of such variation can be found around the Bolivian orocline, a change in the orientation of the central Andes, in central Bolivia, from N-S south of 18°S to roughly NW-SE in the north. This bend coincides with a 50% reduction in the width of the orogen east of the Altiplano, an approximately eight-fold increase in the annual precipitation, and the presence of a basement arch that reduces the thickness of the relatively weak Paleozoic sediments upon which the orogen detaches. This has led to uncertainty about whether the growth of the orogen is controlled primarily by climate (erosion) or tectonics (strength of the basal detachment). We study deformation in a segmented orogen using 3D geodynamic models to understand how along-strike variations in rainfall and basal detachment strength affect orogen deformation and growth of the frontal part of the Andean fold-and-thrust belt (FTB). We calculate the visco-plastic deformation in the retro-wedge of an Andean-style orogen using the finite element software DOUAR (Braun et al. 2008) coupled to the surface process model FastScape (Braun & Willett 2013). The model design includes the basement, the Altiplano, and the FTB east of the plateau. A weak basal detachment zone is prescribed. Strain softening allows development of new faults and free evolution of the detachment zone. The effects of varying rock strength and varying precipitation are considered to determine the primary control(s) on the geometry and evolution of curved orogens. Results show that both increased precipitation and a stronger detachment zone can explain differences in the width of the FTB, as reflected in the topography. These factors, however, lead to different structural evolution of the orogen: a weak basal detachment zone promotes growth of the FTB towards the foreland, whereas a strong basal detachment keeps the deformation nearer to the plateau. Increased precipitation causes strong localization of the frontal thrust and no internal deformation in the foreland or near the plateau. Strike-slip faults are produced by variation in detachment zone strength, but not by shifts in precipitation rates.

  5. Development practices and lessons learned in developing SimPEG

    NASA Astrophysics Data System (ADS)

    Cockett, R.; Heagy, L. J.; Kang, S.; Rosenkjaer, G. K.

    2015-12-01

    Inverse modelling provides a mathematical framework for constructing a model of physical property distributions in the subsurface that is consistent with the data collected in geophysical surveys. The geosciences are increasingly moving towards the integration of geological, geophysical, and hydrological information to better characterize the subsurface. This integration must span disciplines and is not only challenging scientifically; the inconsistencies between conventions also often make implementations complicated, non-reproducible, or inefficient. SimPEG is an open-source, multi-university effort aimed at providing a generalized framework for solving forward and inverse problems. SimPEG includes finite volume discretizations on structured and unstructured meshes, interfaces to standard numerical solver packages, convex optimization algorithms, model parameterizations, and visualization routines. The SimPEG package (http://simpeg.xyz) supports an ecosystem of forward and inverse modelling applications, including electromagnetics, vadose zone flow, seismic, and potential fields, all written with a common interface and toolbox. The goal of SimPEG is to support a community of researchers with well-tested, extensible tools, and to encourage transparency and reproducibility both of the SimPEG software and of the geoscientific research it is applied to. In this presentation, we will share some of the lessons we have learned in designing the modular infrastructure, testing, and development practices of SimPEG. We will discuss our use of version control, extensive unit testing, continuous integration, documentation, issue tracking, and resources that facilitate communication between existing team members and allow new researchers to get involved. These practices have enabled the use of SimPEG in research, industry, and education, as well as the ability to support a growing number of dependent repositories and applications. We hope that sharing our practices and experiences will help other researchers who are creating communities around their own scientific software. As this session suggests, "software is critical to the success of science," but it is the *communities* of researchers that must be supported as we strive to create top-quality research tools.
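    One common unit-testing practice for inversion codes of this kind is checking hand-coded derivatives by their Taylor convergence order. The plain-Python sketch below is a scalar toy version of such a test; it is not taken from SimPEG's test suite, whose order tests operate on vector models.

```python
def taylor_order_test(f, grad, x, h0=0.1, steps=4):
    """Check a hand-coded gradient by verifying that the Taylor
    remainder |f(x+h) - f(x) - grad(x)*h| shrinks like O(h^2) as h is
    halved. Returns the successive remainder ratios, which should
    approach 4 (= 2^2) for a correct gradient."""
    remainders = []
    h = h0
    for _ in range(steps):
        remainders.append(abs(f(x + h) - f(x) - grad(x) * h))
        h /= 2.0
    # each halving of h should cut the remainder by ~4x
    return [remainders[i] / remainders[i + 1] for i in range(steps - 1)]
```

A buggy gradient makes the remainder shrink only like O(h) (ratios near 2), so the test catches derivative errors without needing any reference solution.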

  6. TerraFERMA: Harnessing Advanced Computational Libraries in Earth Science

    NASA Astrophysics Data System (ADS)

    Wilson, C. R.; Spiegelman, M.; van Keken, P.

    2012-12-01

    Many important problems in Earth sciences can be described by non-linear coupled systems of partial differential equations. These "multi-physics" problems include thermo-chemical convection in Earth and planetary interiors, interactions of fluids and magmas with the Earth's mantle and crust and coupled flow of water and ice. These problems are of interest to a large community of researchers but are complicated to model and understand. Much of this complexity stems from the nature of multi-physics where small changes in the coupling between variables or constitutive relations can lead to radical changes in behavior, which in turn affect critical computational choices such as discretizations, solvers and preconditioners. To make progress in understanding such coupled systems requires a computational framework where multi-physics problems can be described at a high-level while maintaining the flexibility to easily modify the solution algorithm. Fortunately, recent advances in computational science provide a basis for implementing such a framework. Here we present the Transparent Finite Element Rapid Model Assembler (TerraFERMA), which leverages several advanced open-source libraries for core functionality. FEniCS (fenicsproject.org) provides a high level language for describing the weak forms of coupled systems of equations, and an automatic code generator that produces finite element assembly code. PETSc (www.mcs.anl.gov/petsc) provides a wide range of scalable linear and non-linear solvers that can be composed into effective multi-physics preconditioners. SPuD (amcg.ese.ic.ac.uk/Spud) is an application neutral options system that provides both human and machine-readable interfaces based on a single xml schema. Our software integrates these libraries and provides the user with a framework for exploring multi-physics problems. A single options file fully describes the problem, including all equations, coefficients and solver options. 
Custom compiled applications are generated from this file but share an infrastructure for services common to all models, e.g. diagnostics, checkpointing and global non-linear convergence monitoring. This maximizes code reusability, reliability and longevity ensuring that scientific results and the methods used to acquire them are transparent and reproducible. TerraFERMA has been tested against many published geodynamic benchmarks including 2D/3D thermal convection problems, the subduction zone benchmarks and benchmarks for magmatic solitary waves. It is currently being used in the investigation of reactive cracking phenomena with applications to carbon sequestration, but we will principally discuss its use in modeling the migration of fluids in subduction zones. Subduction zones require an understanding of the highly nonlinear interactions of fluids with solids and thus provide an excellent scientific driver for the development of multi-physics software.

  7. Using analytic element models to delineate drinking water source protection areas.

    PubMed

    Raymond, Heather A; Bondoc, Michael; McGinnis, John; Metropulos, Kathy; Heider, Pat; Reed, Allison; Saines, Steve

    2006-01-01

    Since 1999, Ohio EPA hydrogeologists have used two analytic element models (AEMs), the proprietary software GFLOW and U.S. EPA's WhAEM, to delineate protection areas for 535 public water systems. Both models now use the GFLOW2001 solution engine, integrate well with Geographic Information System (GIS) technology, have a user-friendly graphical interface, are capable of simulating a variety of complex hydrogeologic settings, and do not rely upon a model grid. These features simplify the modeling process and enable AEMs to bridge the gap between existing simplistic delineation methods and more complex numerical models. Ohio EPA hydrogeologists demonstrated that WhAEM2000 and GFLOW2000 were capable of producing capture zones similar to more widely accepted models by applying the AEMs to eight sites that had been previously delineated using other methods. After the Ohio EPA delineated protection areas using AEMs, more simplistic delineation methods used by other states (volumetric equation and arbitrary fixed radii) were applied to the same water systems to compare the differences between various methods. GIS software and two-tailed paired t-tests were used to quantify the differences in protection areas and analyze the data. The results of this analysis demonstrate that AEMs typically produce significantly different protection areas than the most simplistic delineation methods, in terms of total area and shape. If the volumetric equation had been used instead of AEMs, Ohio would not have protected 265 km2 of critical upgradient area and would have overprotected 269 km2 of primarily downgradient land. Since an increasing number of land-use restrictions are being tied to drinking water protection areas, this analysis has broad policy implications.
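    The two-tailed paired t-tests used above to compare delineation methods reduce to the following statistic; the function and variable names are illustrative, and the resulting |t| would be compared against the t distribution with n - 1 degrees of freedom for a p-value.

```python
import math

def paired_t_statistic(a, b):
    """Paired t-test statistic for matched samples (e.g. the protection
    area delineated by an AEM vs. by the volumetric equation for the
    same water system). Returns (t, degrees of freedom)."""
    n = len(a)
    d = [x - y for x, y in zip(a, b)]            # per-system differences
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n), n - 1
```

Pairing by water system removes between-site variability, which is why the paired test is the right choice when two methods are applied to the same systems.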

  8. Alteration of fault rocks by CO2-bearing fluids with implications for sequestration

    NASA Astrophysics Data System (ADS)

    Luetkemeyer, P. B.; Kirschner, D. L.; Solum, J. G.; Naruk, S.

    2011-12-01

    Carbonates and sulfates commonly occur as primary (diagenetic) pore cements and secondary fluid-mobilized veins within fault zones. Stable isotope analyses of calcite, formation fluid, and fault zone fluids can help elucidate the carbon sources and the extent of fluid-rock interaction within a particular reservoir. Introduction of CO2-bearing fluids into a reservoir/fault system can profoundly affect the overall fluid chemistry of the system and may lead to the enhancement or degradation of porosity within the fault zone. The extent of precipitation and/or dissolution of minerals within a fault zone can ultimately influence the sealing properties of a fault. The Colorado Plateau contains a number of large carbon dioxide reservoirs, some of which leak and some of which do not. Several normal faults within the Paradox Basin (SE Utah) dissect the Green River anticline, giving rise to a series of footwall reservoirs with fault-dependent columns. Numerous CO2-charged springs and geysers are associated with these faults. This study seeks to identify regional sources and subsurface migration of CO2 to these reservoirs and the effect(s) faults have on trap performance. Data provided in this study include mineralogical, elemental, and stable isotope data for fault rocks, host rocks, and carbonate veins from two localities along one fault that locally sealed CO2. This fault is just tens of meters away from another normal fault that has leaked CO2-charged waters to the land surface for thousands of years. These analyses have been used to distinguish sedimentary-derived carbon from deeply sourced CO2. XRF and XRD data taken from several transects across the normal faults are consistent with mechanical mixing and fluid-assisted mass transfer processes within the fault zone. δ13C values range from -6‰ to +10‰ (PDB); δ18O values range from +15‰ to +24‰ (VSMOW).
Geochemical modeling software is used to model the alteration products of fault rocks by fluids of various chemistries coming from several different reservoirs within an active CO2-charged fault system. These results are compared to data obtained in the field.
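    The source-apportionment reasoning behind such isotope data can be illustrated with simple two-end-member mixing. This sketch is not the study's geochemical modeling software; it assumes the end members have similar carbon concentrations, and the end-member values in the test are placeholders.

```python
def mixed_delta13c(delta_a, frac_a, delta_b):
    """Two-end-member carbon isotope mixing:
    delta_mix = f * delta_a + (1 - f) * delta_b,
    valid when the end members carry similar carbon concentrations.
    Illustrates how sedimentary-derived carbon and deeply sourced CO2
    contributions could be apportioned from measured delta-13C values."""
    return frac_a * delta_a + (1.0 - frac_a) * delta_b
```

Inverting this relation for the fraction f, given measured delta-13C and assumed end-member values, is the usual first-order source-apportionment step.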

  9. GIS for Predicting the Avalanche Zones in the Mountain Regions of Kazakhstan

    NASA Astrophysics Data System (ADS)

    Omirzhanova, Zh. T.; Urazaliev, A. S.; Aimenov, A. T.

    2015-10-01

    The foothills of the Trans-Ili Alatau are a recreational area with buildings, sports facilities, resorts, sanatoriums, etc. In summer and winter they host very large numbers of skiers, climbers, tourists, and workers of organizations located in the mountains. In this regard, forecasting destructive natural phenomena using GIS software is an important task for many scientific fields. The formation of avalanches is affected not only by meteorological conditions, such as temperature, wind speed, and snow thickness, but especially by the mountainous terrain. Slope steepness and aspect play a great role in avalanche formation: while steep slopes contribute to the accumulation of snow in some places and increase the risk of slope failure, various surface irregularities can delay an avalanche. According to statistics, the bulk of avalanches form on slopes steeper than 30°. In the course of this research, a 3D model of the terrain was created with the help of the ArcGIS and Surfer programs. Areas with steep slopes were identified, and aspect was determined with respect to the cardinal directions. The terrain was divided into three groups: a favorable zone, a danger zone, and a zone of increased risk. The range of 30-45° is dangerous, since at inclination angles above 30° the sliding mass of snow, water, and the upper surface layer reaches its maximum thickness and its rate of movement increases, while mountain slopes at angles above 45° form the zone of increased risk. Weather service data for the winter of the current year were also plotted on the created DTM. The resulting model makes it possible to retrieve information on request and display it on a base map, to assess the avalanche condition of the terrain, and to address life-safety problems in mountainous areas, developing measures to prevent emergency situations and human losses.
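    The three-way slope classification described above can be sketched directly; the thresholds follow the abstract, while the full GIS model would additionally weigh aspect and snow/weather data.

```python
def avalanche_zone(slope_deg):
    """Classify terrain by slope angle into the three hazard groups
    used in the study: favorable (< 30 deg), danger (30-45 deg), and
    increased risk (> 45 deg)."""
    if slope_deg < 30.0:
        return "favorable"
    if slope_deg <= 45.0:
        return "danger"
    return "increased risk"
```

Applied cell-by-cell to a slope raster derived from the DTM, this yields the hazard zoning map.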

  10. Correlation of contrast-enhanced MR images with the histopathology of minimally invasive thermal and cryoablation cancer treatments in normal dog prostates

    NASA Astrophysics Data System (ADS)

    Bouley, D. M.; Daniel, B.; Butts Pauly, K.; Liu, E.; Kinsey, A.; Nau, W.; Diederich, C. J.; Sommer, G.

    2007-02-01

    Magnetic Resonance Imaging (MRI) is a promising tool for visualizing the delivery of minimally invasive cancer treatments such as high intensity ultrasound (HUS) and cryoablation. We use an acute dog prostate model to correlate lesion histopathology with contrast-enhanced (CE) T1 weighted MR images, to aid the radiologists in real time interpretation of in vivo lesion boundaries and pre-existing lesions. Following thermal or cryo treatments, prostate glands are removed, sliced, stained with the vital dye triphenyl tetrazolium chloride, photographed, fixed and processed in oversized blocks for routine microscopy. Slides are scanned by Trestle Corporation at .32 microns/pixel resolution, the various lesions traced using annotation software, and digital images compared to CE MR images. Histologically, HUS results in discrete lesions characterized by a "heat-fixed" zone, in which glands subjected to the highest temperatures are minimally altered, surrounded by a rim or "transition zone" composed of severely fragmented, necrotic glands, interstitial edema and vascular congestion. The "heat-fixed" zone is non-enhancing on CE MRI while the "transition zone" appears as a bright, enhancing rim. Likewise, the CE MR images for cryo lesions appear similar to thermally induced lesions, yet the histopathology is significantly different. Glands subjected to prolonged freezing appear totally disrupted, coagulated and hemorrhagic, while less intensely frozen glands along the lesion edge are partially fragmented and contain apoptotic cells. In conclusion, thermal and cryo-induced lesions, as well as certain pre-existing lesions (cystic hyperplasia - non-enhancing, chronic prostatitis - enhancing) have particular MRI profiles, useful for treatment and diagnostic purposes.

  11. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. 
While CONTAM has been used to address the design and performance of buildings implementing energy-conserving ventilation systems (e.g., natural and hybrid), this new coupled simulation capability will enable users to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool: evaluation of the IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness, and the design and analysis of a natural ventilation system.
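    The relation evaluated in a CO2-based demand-controlled ventilation case study can be illustrated with a single-zone, steady-state mass balance. This is a deliberate simplification with assumed names and units; CONTAM itself solves the transient, multi-zone version of this balance.

```python
def steady_state_co2(outdoor_ppm, gen_Lps, airflow_Lps):
    """Steady-state CO2 concentration in one well-mixed zone:
    C = C_out + 1e6 * G / Q, with G the occupants' CO2 generation rate
    and Q the outdoor airflow, both in L/s. This is the textbook
    relation behind CO2-based demand-controlled ventilation setpoints."""
    return outdoor_ppm + 1.0e6 * gen_Lps / airflow_Lps
```

For example, a generation rate of about 0.005 L/s per occupant with 10 L/s per person of outdoor air yields roughly 900 ppm over a 400 ppm outdoor background, which is why values near that level are common DCV setpoints.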

  12. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application: CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events.
While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405

  13. Pre-stack depth Migration imaging of the Hellenic Subduction Zone

    NASA Astrophysics Data System (ADS)

    Hussni, S. G.; Becel, A.; Schenini, L.; Laigle, M.; Dessa, J. X.; Galve, A.; Vitard, C.

    2017-12-01

    In 365 AD, a major M>8 tsunamigenic earthquake occurred along the southwestern segment of the Hellenic subduction zone. Although this is the largest seismic event ever reported in Europe, fundamental questions remain regarding the deep geometry of the interplate megathrust, as well as other faults within the overriding plate potentially connected to it. The main objective here is to image those deep structures, whose depths range between 15 and 45 km, using leading-edge seismic reflection equipment. To this end, a 210-km-long multichannel seismic profile was acquired with the 8-km-long streamer and the 6600 cu. in. source of R/V Marcus Langseth. The acquisition took place at the end of 2015, during the SISMED cruise, which was made possible through a collective effort gathering several labs (Géoazur, LDEO, ISTEP, ENS-Paris, EOST, LDO, Dpt. of Geosciences of Pau Univ.). A preliminary processing sequence was first applied using the Geovation software of CGG, which yielded a post-stack time migration of the collected data, as well as a pre-stack time migration obtained with a model derived from velocity analyses. Using Paradigm software, a pre-stack depth migration was subsequently carried out. This step required some tuning of the pre-processing sequence to improve multiple removal and noise suppression and to better reveal the true geometry of reflectors in depth; the revised pre-processing included a parabolic Radon transform, FK filtering and time-variant band-pass filtering. An initial velocity model was built using depth-converted RMS velocities obtained from the SISMED data for the sedimentary layer, complemented at depth with a smoothed version of the tomographic velocities derived from coincident wide-angle data acquired during the 2012 ULYSSE survey. We then performed a Kirchhoff pre-stack depth migration with traveltimes calculated using the Eikonal equation. 
The velocity model was then tuned through residual velocity analyses to flatten reflections in common-reflection-point gathers. These new results improve the imaging of deep reflectors and even reveal some new features. We will present this work, a comparison with our previously obtained post-stack time migration, as well as some insights into the new geological structures revealed by the depth imaging.
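The depth conversion of RMS velocities mentioned above is conventionally done with the Dix equation. The following is a minimal sketch under our own assumptions; function names and example values are illustrative, not taken from the SISMED workflow.

```python
import math

def dix_interval_velocities(twt_s, v_rms):
    """Interval velocities from RMS velocities picked at two-way times
    (Dix equation: v_int^2 = (t2*v2^2 - t1*v1^2) / (t2 - t1))."""
    v_int = [v_rms[0]]
    for i in range(1, len(twt_s)):
        num = twt_s[i] * v_rms[i] ** 2 - twt_s[i - 1] * v_rms[i - 1] ** 2
        v_int.append(math.sqrt(num / (twt_s[i] - twt_s[i - 1])))
    return v_int

def depths(twt_s, v_int):
    """Layer-bottom depths; each thickness is v_int * one-way traveltime."""
    z, out, t_prev = 0.0, [], 0.0
    for t, v in zip(twt_s, v_int):
        z += v * (t - t_prev) / 2.0   # divide by 2: two-way to one-way time
        out.append(z)
        t_prev = t
    return out

# Hypothetical picks: two-way times (s) and RMS velocities (m/s)
twt = [1.0, 2.0, 3.0]
vrms = [1800.0, 2100.0, 2400.0]
vi = dix_interval_velocities(twt, vrms)
```

Such depth-converted interval velocities are a common starting point for the initial depth-migration model, later refined by residual velocity analysis as described above.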

  14. Model Driven Engineering with Ontology Technologies

    NASA Astrophysics Data System (ADS)

    Staab, Steffen; Walter, Tobias; Gröner, Gerd; Parreiras, Fernando Silva

    Ontologies constitute formal models of some aspect of the world that may be used for drawing interesting logical conclusions even for large models. Software models capture relevant characteristics of a software artifact to be developed, yet, most often these software models have limited formal semantics, or the underlying (often graphical) software language varies from case to case in a way that makes it hard if not impossible to fix its semantics. In this contribution, we survey the use of ontology technologies for software modeling in order to carry over advantages from ontology technologies to the software modeling domain. It will turn out that ontology-based metamodels constitute a core means for exploiting expressive ontology reasoning in the software modeling domain while remaining flexible enough to accommodate varying needs of software modelers.

  15. Metamorphic density controls on early-stage subduction dynamics

    NASA Astrophysics Data System (ADS)

    Duesterhoeft, Erik; Oberhänsli, Roland; Bousquet, Romain

    2013-04-01

    Subduction is primarily driven by the densification of the downgoing oceanic slab due to the dynamic P-T fields in subduction zones. It is crucial to unravel slab densification induced by metamorphic reactions in order to understand its influence on plate dynamics. By analyzing the density and metamorphic structure of subduction zones, we may gain knowledge about the driving metamorphic processes in a subduction zone, such as eclogitization (i.e., the transformation of a MORB to an eclogite), the breakdown of hydrous minerals with the release of fluid, or the generation of partial melts. We have therefore developed a 2D subduction zone model, extending down to 250 km, that is based on thermodynamic equilibrium assemblage computations. Our model computes the "metamorphic density" of rocks as a function of pressure, temperature and chemical composition using the Theriak-Domino software package at different time stages. We have used this model to investigate how hydration, dehydration, partial melting and fractionation of rocks influence the metamorphic density and depend strongly on the temperature field within subduction systems. These processes are commonly neglected by other approaches (e.g., gravitational or thermomechanical in nature) that reproduce the density distribution within this tectonic setting. Eclogitization is assumed to be important to subduction dynamics, based on the very high density (3.6 g/cm3) of eclogitic rocks. Eclogitization in a MORB-type crust is possible only if the rock reaches the garnet stability field; this process is primarily temperature driven. Our model demonstrates that the initiation of eclogitization of the slab is not the only significant process that makes the descending slab denser and is responsible for the slab pull force. 
Indeed, our results show that densification of the downgoing lithospheric mantle (due to the increase of pressure) starts in the early subduction stage, where eclogitization does not yet occur, and makes a significant contribution to slab pull. Thus, the lithospheric mantle acts as additional ballast below the sinking slab shortly after the initiation of subduction. Our calculations show that the role of eclogitized basaltic oceanic crust as the driving force of slab pull is overestimated during the early stage of subduction. These results improve our understanding of the force budget for slab pull during the initial and early stages of subduction. The complex metamorphic structure of a slab and mantle wedge therefore has an important impact on the development and dynamics of subduction zones. Further Reading: Duesterhoeft, Oberhänsli & Bousquet (2013), submitted to Earth and Planetary Science Letters.

  16. Foreshocks and aftershocks of Pisagua 2014 earthquake: time and space evolution of megathrust event.

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Wollam, Jack; Thomas, Reece; de Lima Neto, Oscar; Tavera, Hernando; Garth, Thomas; Ruiz, Sergio

    2016-04-01

    The 2014 Pisagua earthquake of magnitude 8.2 is the first case in Chile where a foreshock sequence was clearly recorded by a local network, along with the complete sequence including the mainshock and its aftershocks. The seismicity of the year before the mainshock included numerous clusters close to the epicentral zone (Ruiz et al., 2014), but it was on 16 March that this activity intensified, with the Mw 6.7 precursory event taking place off the coast of Iquique at 12 km depth. The Pisagua earthquake occurred on 1 April 2014, breaking almost 120 km N-S, and two days later a magnitude 7.6 aftershock occurred south of the rupture, enlarging the zone affected by this sequence. In this work, we analyse the spatial and temporal evolution of the foreshock and aftershock sequence of the Pisagua earthquake, for a total of 15,764 events recorded from 1 March to 31 May 2014. This event catalogue was obtained from automatic analysis of raw seismic data from more than 50 stations installed in northern Chile and southern Peru. We used the STA/LTA algorithm to detect P and S arrival times on the vertical components, and then a back-propagation method in a 1D velocity model for event association and preliminary location of hypocenters, following the algorithm outlined by Rietbrock et al. (2012). These locations were then improved with the NonLinLoc software using a regional velocity model. We selected the larger events and determined their moment tensor solutions by full waveform inversion using the ISOLA software. In order to understand the nucleation and propagation of the Pisagua earthquake, we also analysed the temporal evolution of the seismicity over the three months of data. 
The zone where the precursory events took place was strongly activated two weeks before the mainshock and remained very active until the end of the analysed period, with a substantial fraction of the seismicity located in the upper plate and showing varied focal mechanisms. The evolution of the Pisagua sequence points to a stepwise rupture, which we suggest is related to the properties of the upper plate as well as of the subduction interface. The spatial distribution of seismicity was compared with the interseismic coupling from previous studies, the regional bathymetry, and the slip distributions of both the mainshock and the magnitude 7.6 aftershock. The results show a clear relation between the low-coupling zones and the areas lacking large-magnitude events.
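The STA/LTA detection step can be sketched as follows. This is a generic moving-average implementation with parameters of our own choosing, not the authors' code (which follows Rietbrock et al., 2012).

```python
import numpy as np

def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
    """STA/LTA ratio of a 1-D trace sampled at fs Hz, computed as moving
    averages of the squared trace (a simple energy characteristic function)."""
    cf = trace.astype(float) ** 2
    n_sta = max(1, int(sta_win * fs))
    n_lta = max(1, int(lta_win * fs))
    sta = np.convolve(cf, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(cf, np.ones(n_lta) / n_lta, mode="same")
    lta[lta < 1e-12] = 1e-12          # guard against division by zero
    return sta / lta

# Synthetic check: Gaussian noise with a high-amplitude burst; the
# ratio should peak inside the burst, where a trigger threshold fires.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 4000)
x[2000:2200] += rng.normal(0.0, 10.0, 200)
ratio = sta_lta(x, fs=100.0)
```

A picker then declares an arrival wherever the ratio crosses a tuned threshold; production codes (e.g. ObsPy's trigger module) add recursive averaging and de-triggering logic on top of this idea.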

  17. Numerical simulation of multiple-physical fields coupling for thermal anomalies before earthquakes: A case study of the 2008 Wenchuan Ms8.0 earthquake in southwest China

    NASA Astrophysics Data System (ADS)

    Deng, Z.

    2017-12-01

    Thermal anomalies appearing before major earthquakes have attracted considerable attention. There are various hypotheses about their mechanism, but for lack of sufficient evidence the question requires further research. The gestation and occurrence of a major earthquake involve the interaction of multiple physical fields, and underground fluid surging to the surface is a likely cause of the thermal anomaly. This study addresses how geothermal energy is transferred to the surface and how the multiple physical fields interact. The 2008 Wenchuan Ms8.0 earthquake is one of the largest events of the last decade in mainland China. Remote sensing studies indicate that distinguishable thermal anomalies occurred several days before the earthquake; the anomaly values exceeded 3 times the normal average and were distributed along the Longmen Shan fault zone. Based on geological and geophysical data, a 2D dynamic model of coupled stress, seepage and thermal fields (HTM model) was constructed. Using the COMSOL Multiphysics software, this work attempts to reveal the generation process and distribution patterns of thermal anomalies prior to thrust-type major earthquakes. The simulation yields the following results: (1) Before micro-rupture, as compression increases, heat flows toward the fault in the footwall as a whole, while in the hanging wall, particularly near the ground surface, heat flows upward. In the fault zone, heat flows upward along the fracture surface, and the heat flux in the fracture zone is slightly larger than in the wall rock, but the values are all very small. (2) After the occurrence of micro-fracture, heat flow rapidly concentrates at the faults. In the fault zones, heat flow accelerates upward along the fracture surfaces, the heat flux increases suddenly, and the vertical heat flux reaches its maximum. 
The heat flux in the three fracture zones is markedly larger than that in the non-fracture zone. The high heat-flux anomaly can persist from several days to one month. The simulation results are consistent with observed earthquake cases.

  18. Geophysical delineation of the freshwater/saline-water transition zone in the Barton Springs segment of the Edwards Aquifer, Travis and Hays Counties, Texas, September 2006

    USGS Publications Warehouse

    Payne, J.D.; Kress, W.H.; Shah, S.D.; Stefanov, J.E.; Smith, B.A.; Hunt, B.B.

    2007-01-01

    During September 2006, the U.S. Geological Survey, in cooperation with the Barton Springs/Edwards Aquifer Conservation District, conducted a geophysical pilot study to determine whether time-domain electromagnetic (TDEM) sounding could be used to delineate the freshwater/saline-water transition zone in the Barton Springs segment of the Edwards aquifer in Travis and Hays Counties, Texas. There was uncertainty regarding the application of TDEM sounding for this purpose because of the depth of the aquifer (200-500 feet to the top of the aquifer) and the relatively low-resistivity clayey units in the upper confining unit. Twenty-five TDEM soundings were made along four 2-3-mile-long profiles in a study area overlying the transition zone near the Travis-Hays County boundary. The soundings yield measurements of subsurface electrical resistivity, the variations in which were correlated with hydrogeologic and stratigraphic units, and then with dissolved solids concentrations in the aquifer. Geonics Protem 47 and 57 systems with 492-foot and 328-foot transmitter-loop sizes were used to collect the TDEM soundings. A smooth model (vertical delineation of calculated apparent resistivity that represents an estimate [non-unique] of the true resistivity) for each sounding site was created using an iterative software program for inverse modeling. The effectiveness of using TDEM soundings to delineate the transition zone was indicated by comparing the distribution of resistivity in the aquifer with the distribution of dissolved solids concentrations in the aquifer along the profiles. TDEM sounding data show that, in general, the Edwards aquifer in the study area is characterized by a sharp change in resistivity from west to east. The western part of the Edwards aquifer in the study area shows higher resistivity than the eastern part. 
The higher resistivity regions correspond to lower dissolved solids concentrations (freshwater), and the lower resistivity regions correspond to higher dissolved solids concentrations (saline water). On the basis of reasonably close matches between the inferred locations of the freshwater/saline-water transition zone in the Edwards aquifer in the study area from resistivities and from dissolved solids concentrations in three of the four profiles, TDEM sounding appears to be a suitable tool for delineating the transition zone.

  19. Characterizing multiple timescales of stream and storage zone interaction that affect solute fate and transport in streams

    USGS Publications Warehouse

    Choi, Jungyill; Harvey, Judson W.; Conklin, Martha H.

    2000-01-01

    The fate of contaminants in streams and rivers is affected by exchange and biogeochemical transformation in slowly moving or stagnant flow zones that interact with rapid flow in the main channel. In a typical stream, there are multiple types of slowly moving flow zones in which exchange and transformation occur, such as stagnant or recirculating surface water as well as subsurface hyporheic zones. However, most investigators use transport models with just a single storage zone in their modeling studies, which assumes that the effects of multiple storage zones can be lumped together. Our study addressed the following question: Can a single-storage-zone model reliably characterize the effects of physical retention and biogeochemical reactions in multiple storage zones? We extended an existing stream transport model with a single storage zone to include a second storage zone. With the extended model we generated 500 data sets representing transport of nonreactive and reactive solutes in stream systems that have two different types of storage zones with variable hydrologic conditions. The one-storage-zone model was tested by optimizing the lumped storage parameters to achieve a best fit for each of the generated data sets. Multiple storage processes were categorized as possessing I, additive; II, competitive; or III, dominant storage zone characteristics. The classification was based on the goodness of fit of generated data sets, the degree of similarity in mean retention time of the two storage zones, and the relative distributions of exchange flux and storage capacity between the two storage zones. For most cases (>90%) the one-storage-zone model described either the effect of the sum of multiple storage processes (category I) or the dominant storage process (category III). 
Failure of the one storage zone model occurred mainly for category II, that is, when one of the storage zones had a much longer mean retention time (ts ratio > 5.0) and when the dominance of storage capacity and exchange flux occurred in different storage zones. We also used the one storage zone model to estimate a “single” lumped rate constant representing the net removal of a solute by biogeochemical reactions in multiple storage zones. For most cases the lumped rate constant that was optimized by one storage zone modeling estimated the flux‐weighted rate constant for multiple storage zones. Our results explain how the relative hydrologic properties of multiple storage zones (retention time, storage capacity, exchange flux, and biogeochemical reaction rate constant) affect the reliability of lumped parameters determined by a one storage zone transport model. We conclude that stream transport models with a single storage compartment will in most cases reliably characterize the dominant physical processes of solute retention and biogeochemical reactions in streams with multiple storage zones.
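The flux-weighted lumped rate constant discussed above can be written compactly. The sketch below uses our own notation (exchange fluxes q_i and rate constants lam_i) and purely illustrative values, not parameters from the study.

```python
def flux_weighted_rate(q, lam):
    """Flux-weighted lumped rate constant for multiple storage zones:
    the exchange-flux-weighted average of the individual rate constants."""
    return sum(qi * li for qi, li in zip(q, lam)) / sum(q)

def retention_time(storage_area, alpha, channel_area):
    """Mean storage-zone retention time t_s = A_s / (alpha * A), in the
    usual transient-storage (OTIS-style) notation."""
    return storage_area / (alpha * channel_area)

# Hypothetical two-zone example: a surface dead zone and a hyporheic zone.
q = [2.0e-4, 0.5e-4]       # exchange fluxes, 1/s
lam = [1.0e-4, 8.0e-4]     # first-order reaction rate constants, 1/s
lam_eff = flux_weighted_rate(q, lam)   # 2.4e-4 1/s
```

Comparing the two zones' `retention_time` values is also how the t_s ratio criterion above (failure when the ratio exceeds ~5) would be evaluated in practice.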

  20. Ultrahigh resolution topographic mapping of Mars with MRO HiRISE stereo images: Meter-scale slopes of candidate Phoenix landing sites

    USGS Publications Warehouse

    Kirk, R.L.; Howington-Kraus, E.; Rosiek, M.R.; Anderson, J.A.; Archinal, B.A.; Becker, K.J.; Cook, D.A.; Galuszka, D.M.; Geissler, P.E.; Hare, T.M.; Holmberg, I.M.; Keszthelyi, L.P.; Redding, B.L.; Delamere, W.A.; Gallagher, D.; Chapel, J.D.; Eliason, E.M.; King, R.; McEwen, A.S.

    2009-01-01

    The objectives of this paper are twofold: first, to report our estimates of the meter-to-decameter-scale topography and slopes of candidate landing sites for the Phoenix mission, based on analysis of Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) images with a typical pixel scale of 3 m and Mars Reconnaissance Orbiter (MRO) High Resolution Imaging Science Experiment (HiRISE) images at 0.3 m pixel⁻¹ and, second, to document in detail the geometric calibration, software, and procedures on which the photogrammetric analysis of HiRISE data is based. A combination of optical design modeling, laboratory observations, star images, and Mars images form the basis for software in the U.S. Geological Survey Integrated Software for Imagers and Spectrometers (ISIS) 3 system that corrects the images for a variety of distortions with single-pixel or subpixel accuracy. Corrected images are analyzed in the commercial photogrammetric software SOCET SET (BAE Systems), yielding digital topographic models (DTMs) with a grid spacing of 1 m (3-4 pixels) that require minimal interactive editing. Photoclinometry yields DTMs with single-pixel grid spacing. Slopes from MOC and HiRISE are comparable throughout the latitude zone of interest and compare favorably with those where past missions have landed successfully; only the Mars Exploration Rover (MER) B site in Meridiani Planum is smoother. MOC results at multiple locations have root-mean-square (RMS) bidirectional slopes of 0.8-4.5° at baselines of 3-10 m. HiRISE stereopairs (one per final candidate site and one in the former site) yield 1.8-2.8° slopes at 1-m baseline. Slopes at 1 m from photoclinometry are also in the range 2-3° after correction for image blur. Slopes exceeding the 16° Phoenix safety limit are extremely rare. Copyright 2008 by the American Geophysical Union.

  1. Analyzing the Implications of Climate Data on Plant Hardiness Zones for Green Infrastructure Planning: Case Study of Knoxville, Tennessee and Surrounding Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sylvester, Linda M.; Omitaomu, Olufemi A.; Parish, Esther S.

    Downscaled climate data for Knoxville, Tennessee and the surrounding region were used to investigate changes in Plant Hardiness Zones due to climate change. The methodology follows that of the US Department of Agriculture (USDA), well known for producing the standard Plant Hardiness Zone map used by gardeners and planners. USDA zones were calculated from observed daily data for 1976–2005; the modeled climate data cover 1980–2005 for the past, and the future data are projected for 2025–2050. For each time period of interest, the average of the modeled annual extreme minimum temperatures was calculated, and each 1 km raster cell was placed into a zone category based on temperature, using the same criteria and categories as the USDA. The individual models range from suggesting little change in the Plant Hardiness Zones to moving Knoxville two zones warmer, but overall the models suggest a shift into the next warmer zone. The USDA currently categorizes the Knoxville area as Zone 7a, yet none of the zones calculated from the modeled data placed Knoxville in Zone 7a for the comparable historical period: the models placed Knoxville in a cooler zone and projected the area to warm into Zone 7. The modeled temperature data thus appear to be slightly cooler than the observed data, which may explain the zone discrepancy. Since the modeled data have Knoxville moving, overall, from Zone 6 to Zone 7, it can be inferred that Knoxville, Tennessee may shift from its current Zone 7 to Zone 8.
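The zone-assignment step described above amounts to binning the average annual extreme minimum temperature into the USDA's 10 °F zones and 5 °F half zones. A minimal sketch (the function name and example values are ours):

```python
def hardiness_zone(avg_extreme_min_f):
    """Map an average annual extreme minimum (degF) to a label like '7a'.
    Zone 1 starts at -60 degF; each zone spans 10 degF, each half zone 5 degF."""
    t = avg_extreme_min_f
    zone = int((t + 60.0) // 10.0) + 1
    half = "a" if (t + 60.0) % 10.0 < 5.0 else "b"
    return f"{zone}{half}"

# Zone 7 spans 0..10 degF; its half zones split at 5 degF.
labels = [hardiness_zone(t) for t in (2.5, 7.0, 12.0)]   # ['7a', '7b', '8a']
```

Applying this mapping cell by cell to the averaged raster of annual extreme minimums reproduces the workflow the study describes.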

  2. Stope Stability Assessment and Effect of Horizontal to Vertical Stress Ratio on the Yielding and Relaxation Zones Around Underground Open Stopes Using Empirical and Finite Element Methods

    NASA Astrophysics Data System (ADS)

    Sepehri, Mohammadali; Apel, Derek; Liu, Wei

    2017-09-01

    Predicting the stability of open stopes can be a challenging task for underground mine engineers. For decades, the stability graph method has been used as the first step of open stope design around the world. However, this method has some shortcomings. For instance, the stability graph method does not account for the relaxation zones around the stopes. Another limitation is that it cannot be used to evaluate the stability of stopes with high walls made of backfill materials. There are, however, several analytical and numerical methods that can be used to overcome these limitations. In this study, both empirical and numerical methods were used to assess the stability of an open stope located between mine levels N9225 and N9250 at the Diavik diamond underground mine. It was shown that numerical methods can be used as complementary methods, along with other analytical and empirical methods, to assess the stability of open stopes. A three-dimensional elastoplastic finite element model was constructed using the Abaqus software. In this paper, a sensitivity analysis was performed to investigate the impact of the stress ratio "k" on the extent of the yielding and relaxation zones around the hangingwall and footwall of the stope under study.

  3. Application of the graphics processor unit to simulate a near field diffraction

    NASA Astrophysics Data System (ADS)

    Zinchik, Alexander A.; Topalov, Oleg K.; Muzychenko, Yana B.

    2017-06-01

    For many years, computer modeling programs have been used for lecture demonstrations. Most existing commercial software, such as VirtualLab from LightTrans GmbH, is quite expensive and offers surplus capabilities for educational tasks. Demonstrating diffraction in the near zone is complicated by the large amount of computation required to obtain the two-dimensional distribution of amplitude and phase. To date, there are no demonstrations that can show the resulting amplitude and phase distributions without a substantial time delay: even with Fast Fourier Transform (FFT) algorithms, calculating near-zone diffraction for input complex amplitude distributions larger than 2000 × 2000 pixels takes tens of seconds. Our program selects the appropriate propagation operator from a prescribed set that includes plane-wave-spectrum propagation and Rayleigh-Sommerfeld propagation (via convolution). After implementation, we compared calculation times for near-field diffraction on the GPU and the CPU, showing that using the GPU to calculate diffraction patterns in the near zone increases the overall speed of the algorithm for images of 2048 × 2048 sampling points and more. The modules are implemented as separate dynamic-link libraries and can be used for lecture demonstrations, workshops, and self-study, and by students solving various problems such as phase retrieval.
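The plane-wave-spectrum (angular spectrum) propagator mentioned above can be sketched with FFTs alone; a GPU version would simply replace the FFT calls with their GPU counterparts. This CPU sketch uses NumPy and our own parameter choices, not the authors' implementation.

```python
import numpy as np

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a complex field a distance z by filtering its plane-wave
    spectrum with H = exp(i*z*sqrt(k^2 - kx^2 - ky^2)); evanescent
    components (kz^2 < 0) are discarded."""
    n = field.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies, 1/m
    kx, ky = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fx)
    kz_sq = k**2 - kx**2 - ky**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    H = np.exp(1j * z * kz) * (kz_sq > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Square aperture illuminated by a unit plane wave (illustrative values).
n, dx, wl = 512, 5e-6, 633e-9
aperture = np.zeros((n, n), dtype=complex)
aperture[n//2-40:n//2+40, n//2-40:n//2+40] = 1.0
out = angular_spectrum(aperture, wl, dx, z=5e-3)
```

Because the transfer function is all-pass over the propagating components, total energy is conserved, which makes a convenient sanity check for any GPU port.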

  4. Austenite grain growth simulation considering the solute-drag effect and pinning effect.

    PubMed

    Fujiyama, Naoto; Nishibata, Toshinobu; Seki, Akira; Hirata, Hiroyuki; Kojima, Kazuhiro; Ogawa, Kazuhiro

    2017-01-01

    The pinning effect is useful for restraining austenite grain growth in low-alloy steel and improving heat-affected-zone toughness in welded joints. We propose a new calculation model for predicting austenite grain growth behavior. The model mainly comprises two theories: the solute-drag effect and the pinning effect of TiN precipitates. The calculation of the solute-drag effect is based on the hypothesis that the width of each austenite grain boundary is constant and that the element content maintains equilibrium segregation at the austenite grain boundaries. We used Hillert's law under the assumption that the austenite grain boundary phase is a liquid, so that we could estimate the equilibrium solute concentration at the austenite grain boundaries; the equilibrium solute concentration was calculated using the Thermo-Calc software. The pinning effect was estimated using Nishizawa's equation. The calculated austenite grain growth at 1473-1673 K showed excellent correspondence with the experimental results.
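As a rough illustration of how a pinning term caps grain growth, the sketch below integrates a generic Burke-Turnbull-type growth law with a Zener-type limiting radius (R_lim = 4r/3f). This is our simplification for illustration only; the paper itself uses Nishizawa's equation together with Thermo-Calc equilibria, and all parameter values here are hypothetical.

```python
def grow(r0, mobility, gamma, r_ppt, f_ppt, dt, steps):
    """Euler integration of dR/dt = M*gamma*(1/R - 1/R_lim), where the
    Zener-type pinned limit is R_lim = 4*r_ppt/(3*f_ppt)."""
    r_lim = 4.0 * r_ppt / (3.0 * f_ppt)
    r = r0
    for _ in range(steps):
        drdt = mobility * gamma * (1.0 / r - 1.0 / r_lim)
        r = min(r + drdt * dt, r_lim)   # never overshoot the pinned limit
    return r, r_lim

# Hypothetical TiN-like particles: 20 nm radius, volume fraction 1e-3,
# giving a limiting grain radius of ~27 um; growth stalls as R -> R_lim.
r_final, r_lim = grow(r0=1e-6, mobility=1e-13, gamma=0.5,
                      r_ppt=20e-9, f_ppt=1e-3, dt=1.0, steps=50000)
```

The qualitative behavior matches the abstract's point: without the pinning term the radius grows without bound, while with it the grain size saturates at a limit set by the precipitate size and volume fraction.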

  5. Global tectonic reconstructions with continuously deforming and evolving rigid plates

    NASA Astrophysics Data System (ADS)

    Gurnis, Michael; Yang, Ting; Cannon, John; Turner, Mark; Williams, Simon; Flament, Nicolas; Müller, R. Dietmar

    2018-07-01

    Traditional plate reconstruction methodologies do not allow for plate deformation to be considered. Here we present software to construct and visualize global tectonic reconstructions with deforming plates within the context of rigid plates. Both deforming and rigid plates are defined by continuously evolving polygons. The deforming regions are tessellated with triangular meshes such that either strain rate or cumulative strain can be followed. The finite strain history, crustal thickness and stretching factor of points within the deformation zones are tracked as Lagrangian points. Integrating these tools within the interactive platform GPlates enables specialized users to build and refine deforming plate models and integrate them with other models in time and space. We demonstrate the integrated platform with regional reconstructions of Cenozoic western North America, the Mesozoic South American Atlantic margin, and Cenozoic southeast Asia, embedded within global reconstructions, using different data and reconstruction strategies.
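At its core, a rigid plate reconstruction applies finite rotations about Euler poles; the deforming regions described above are tracked on top of this rigid framework. A minimal sketch using Rodrigues' rotation formula (our own code, not GPlates internals):

```python
import math

def to_xyz(lat_deg, lon_deg):
    """Unit vector on the sphere from geographic coordinates."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def to_latlon(v):
    x, y, z = v
    return math.degrees(math.asin(z)), math.degrees(math.atan2(y, x))

def rotate(point, pole, angle_deg):
    """Rodrigues rotation of unit vector `point` about unit vector `pole`:
    v' = v*cos(a) + (k x v)*sin(a) + k*(k.v)*(1 - cos(a))."""
    a = math.radians(angle_deg)
    px, py, pz = pole
    vx, vy, vz = point
    cx, cy, cz = (py * vz - pz * vy, pz * vx - px * vz, px * vy - py * vx)
    dot = px * vx + py * vy + pz * vz
    c, s = math.cos(a), math.sin(a)
    return tuple(v * c + cr * s + p * dot * (1 - c)
                 for v, cr, p in zip((vx, vy, vz), (cx, cy, cz), (px, py, pz)))

# Rotating (0N, 0E) by 90 degrees about the north pole moves it to (0N, 90E).
lat, lon = to_latlon(rotate(to_xyz(0, 0), to_xyz(90, 0), 90))
```

Composing such finite rotations through time, per plate, is the rigid backbone onto which strain-rate and crustal-thickness tracking of the deforming meshes is added.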

  6. An assessment of gas emanation hazard using a geographic information system and geostatistics.

    PubMed

    Astorri, F; Beaubien, S E; Ciotoli, G; Lombardi, S

    2002-03-01

    This paper describes the use of geostatistical analysis and GIS techniques to assess gas emanation hazards. The Mt. Vulsini volcanic district was selected for this study because of the wide range of natural phenomena locally present that affect gas migration in the near surface. In addition, soil gas samples collected in this area allow for a calibration between the generated risk/hazard models and the measured distribution of toxic gas species at the surface. The approach used during this study consisted of three general stages. First, data were digitally organized into thematic layers; then, software functions in the GIS program ArcView were used to compare and correlate these layers; finally, the resulting "potential-risk" map was compared with radon soil gas data in order to validate the model and/or to select zones for further, more detailed soil gas investigations.

  7. Spatial Variation of Slip Behavior Beneath the Alaska Peninsula Along Alaska-Aleutian Subduction Zone

    NASA Astrophysics Data System (ADS)

    Li, Shanshan; Freymueller, Jeffrey T.

    2018-04-01

    We resurveyed preexisting campaign Global Positioning System (GPS) sites and estimated a highly precise GPS velocity field for the Alaska Peninsula. We use the TDEFNODE software to model the slip deficit distribution using the new GPS velocities. We find systematic misfits to the vertical velocities from the optimal model that fits the horizontal velocities well, which cannot be explained by altering the slip distribution, so we use only the horizontal velocities in the study. Locations of three boundaries that mark significant along-strike change in the locking distribution are identified. The Kodiak segment is strongly locked, the Semidi segment is intermediate, the Shumagin segment is weakly locked, and the Sanak segment is dominantly creeping. We suggest that a change in preexisting plate fabric orientation on the downgoing plate has an important control on the along-strike variation in the megathrust locking distribution and subduction seismicity.

  8. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  9. Achieving sustainable ground-water management by using GIS-integrated simulation tools: the EU H2020 FREEWAT platform

    NASA Astrophysics Data System (ADS)

    Rossetto, Rudy; De Filippis, Giovanna; Borsi, Iacopo; Foglia, Laura; Toegl, Anja; Cannata, Massimiliano; Neumann, Jakob; Vazquez-Sune, Enric; Criollo, Rotman

    2017-04-01

    In order to achieve sustainable and participatory groundwater management, innovative software built on the integration of numerical models within GIS is a natural candidate to provide a full characterization of the quantitative and qualitative aspects of ground- and surface-water resources while preserving the time and spatial dimensions. The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management; Rossetto et al., 2015) aims at simplifying the application of EU water-related Directives through an open-source and public-domain, GIS-integrated simulation platform for planning and management of ground- and surface-water resources. The FREEWAT platform makes it possible to simulate the whole hydrological cycle, coupling the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models. This results in a modeling environment where large spatial datasets can be stored, managed and visualized and where several simulation codes (mainly belonging to the USGS MODFLOW family) are integrated to simulate multiple hydrological, hydrochemical or economic processes. 
So far, the FREEWAT platform is a large plugin for the QGIS desktop GIS and integrates the following capabilities:
• the AkvaGIS module produces plots and statistics for the analysis and interpretation of hydrochemical and hydrogeological data;
• the Observation Analysis Tool facilitates the import, analysis and visualization of time-series data and the use of these data to support model construction and calibration;
• groundwater flow in the saturated and unsaturated zones may be simulated using MODFLOW-2005 (Harbaugh, 2005);
• multi-species advective-dispersive transport in the saturated zone can be simulated using MT3DMS (Zheng & Wang, 1999); viscosity- and density-dependent flows can further be simulated through SEAWAT (Langevin et al., 2007);
• sustainable management of the combined use of ground- and surface-water resources in rural environments is supported by the Farm Process module embedded in MODFLOW-OWHM (Hanson et al., 2014), which dynamically integrates crop water demand and supply from ground- and surface-water;
• UCODE_2014 (Poeter et al., 2014) performs sensitivity analysis and parameter estimation to improve the model fit through an inverse regression method based on the evaluation of an objective function.
By creating a common environment among water researchers and professionals, policy makers and implementers, FREEWAT aims at enhancing science-based, participatory and evidence-based decision making in water resource management, hence producing relevant outcomes for policy implementation.
Acknowledgements: This paper is presented within the framework of the FREEWAT project, which has received funding from the European Union's HORIZON 2020 research and innovation programme under Grant Agreement n. 642224.
References:
Hanson, R.T., Boyce, S.E., Schmid, W., Hughes, J.D., Mehl, S.M., Leake, S.A., Maddock, T. & Niswonger, R.G. (2014) - One-Water Hydrologic Flow Model (MODFLOW-OWHM). U.S. Geological Survey, Techniques and Methods 6-A51, 134 p.
Harbaugh, A.W. (2005) - MODFLOW-2005, The U.S. Geological Survey Modular Ground-Water Model - the Ground-Water Flow Process. U.S. Geological Survey, Techniques and Methods 6-A16, 253 p.
Langevin, C.D., Thorne, D.T. Jr., Dausman, A.M., Sukop, M.C. & Guo, Weixing (2007) - SEAWAT Version 4: A Computer Program for Simulation of Multi-Species Solute and Heat Transport. U.S. Geological Survey, Techniques and Methods 6-A22, 39 p.
Poeter, E.P., Hill, M.C., Lu, D., Tiedeman, C.R. & Mehl, S. (2014) - UCODE_2014, with new capabilities to define parameters unique to predictions, calculate weights using simulated values, estimate parameters with SVD, evaluate uncertainty with MCMC, and more. Integrated Groundwater Modeling Center Report Number GWMI 2014-02.
Rossetto, R., Borsi, I. & Foglia, L. (2015) - FREEWAT: FREE and open source software tools for WATer resource management. Rendiconti Online Società Geologica Italiana, 35, 252-255.
Zheng, C. & Wang, P.P. (1999) - MT3DMS, A modular three-dimensional multi-species transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. U.S. Army Engineer Research and Development Center Contract Report SERDP-99-1, Vicksburg, MS, 202 p.
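The inverse, regression-based calibration mentioned above minimizes a weighted least-squares objective function over the model residuals. A minimal sketch of that objective (function and variable names are illustrative, not the actual UCODE_2014 interface):

```python
# Hedged sketch: the weighted least-squares objective function minimized by
# UCODE-style inverse modeling. Names and values are illustrative only.
def objective(observed, simulated, weights):
    """Sum of weighted, squared residuals between observed and simulated heads."""
    return sum(w * (obs - sim) ** 2
               for obs, sim, w in zip(observed, simulated, weights))

# Example: three head observations with equal weights
print(objective([10.0, 12.5, 9.0], [10.2, 12.0, 9.1], [1.0, 1.0, 1.0]))
```

The parameter-estimation loop then adjusts model parameters (e.g. hydraulic conductivities) to drive this sum down.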

  10. Ion association in water solution of soil and vadose zone of chestnut saline solonetz as a driver of terrestrial carbon sink

    NASA Astrophysics Data System (ADS)

    Batukaev, Abdul-Malik A.; Endovitsky, Anatoly P.; Andreev, Andrey G.; Kalinichenko, Valery P.; Minkina, Tatiana M.; Dikaev, Zaurbek S.; Mandzhieva, Saglara S.; Sushkova, Svetlana N.

    2016-03-01

    The assessment of soil and the vadose zone as drains for the carbon sink, and proper modeling of the effects and extremes of biogeochemical cycles in the terrestrial biosphere, are key to understanding the carbon cycle, the global climate system, and aquatic and terrestrial system uncertainties. Calcium carbonate equilibrium causes saturation of the solution with CaCO3, and it determines the solution's material composition and the migration and accumulation of salts. In solution, electrically neutral ion pairs are formed: CaCO30, CaSO40, MgCO30 and MgSO40, as well as the charged ion pairs CaHCO3+, MgHCO3+, NaCO3-, NaSO4-, CaOH+ and MgOH+. A calcium carbonate equilibrium algorithm, a mathematical model and original software were developed to calculate the real equilibrium forms of ions and to determine the nature of the calcium carbonate balance in solution. This approach provides a quantitative assessment of the real ion forms in the solution of solonetz soil and the vadose zone of the dry steppe, taking into account ion association at the high ionic strength of saline soil solutions. The concentrations of the free and associated ion forms were calculated from the analytical ion concentrations in the real solution. The iteration procedure used equations for the ion material balance, linear interpolation of equilibrium constants, the method of ionic pairs, the laws of conservation of initial concentration and of the operating masses of the equilibrium system, and the concentration constants of ion-pair dissociation. The coefficient of ion association γe was determined as the ratio of the associated form to the analytical content of the ion, γe = Cass/Can. Depending on the soil or vadose-zone layer, the share of the solution held in ionic pairs is 11-52% for Ca2+, 22.2-54.6% for Mg2+, 1.1-10.5% for Na+, 3.7-23.8% for HCO3-, 23.3-61.6% for SO42-, and up to 85.7% for CO32-.
The carbonate system of soil and vadose-zone water solutions helps to explain the evolution of salted soils, the vadose and saturation zones, and the landscape. It also helps to improve soil maintenance, plant nutrition and irrigation. The association of ions in soil solutions is one of the drivers promoting transformation of the solution, excessive fluxes of carbon into the soil, and loss of carbon from soil through the vadose zone.
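The iteration procedure described above can be illustrated for a single ion pair. This is a hedged sketch reduced to CaSO40 only: the dissociation constant and concentrations are illustrative values, and the activity corrections needed at high ionic strength are omitted.

```python
# Hedged sketch of the ion-pairing iteration for one pair (CaSO4^0).
# K_diss and concentrations are illustrative, not values from the paper.
def ion_pair_fraction(ca_total, so4_total, k_diss, n_iter=200):
    """Return the association coefficient: associated Ca over analytical Ca."""
    pair = 0.0
    for _ in range(n_iter):
        ca_free = ca_total - pair    # Ca mass balance
        so4_free = so4_total - pair  # SO4 mass balance
        # mass-action law for CaSO4^0, with damping for a stable fixed point
        pair = 0.5 * pair + 0.5 * (ca_free * so4_free / k_diss)
    return pair / ca_total

# 0.01 M analytical Ca and SO4 with an illustrative dissociation constant
print(ion_pair_fraction(0.01, 0.01, 0.005))
```

The full method iterates the same mass-balance and mass-action relations simultaneously over all ten ion pairs listed above.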

  11. SU-F-R-36: Validating Quantitative Radiomic Texture Features for Oncologic PET: A Digital Phantom Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, F; Yang, Y; Young, L

    Purpose: Radiomic texture features derived from oncologic PET have recently come under intense investigation in the context of patient stratification and treatment-outcome prediction in a variety of cancer types; however, their validity has not yet been examined. This work aims to validate radiomic PET texture metrics through realistic simulations in a ground-truth setting. Methods: Simulation of FDG-PET was conducted by applying the Zubal phantom as an attenuation map to the SimSET software package, which employs Monte Carlo techniques to model the physical process of emission imaging. A total of 15 irregularly shaped lesions featuring heterogeneous activity distributions were simulated. For each simulated lesion, 28 texture features relating to intensity histograms (GLIH), grey-level co-occurrence matrices (GLCOM), neighborhood difference matrices (GLNDM), and zone size matrices (GLZSM) were evaluated and compared with their respective values extracted from the ground-truth activity map. Results: Relative to the values from the ground-truth images, texture parameters in the simulated data varied within ranges of 0.73-3026.2% for GLIH-based, 0.02-100.1% for GLCOM-based, 1.11-173.8% for GLNDM-based, and 0.35-66.3% for GLZSM-based features. For the majority of the examined texture metrics (16/28), values on the simulated data differed significantly from those on the ground-truth images (P-values from <0.0001 to 0.04). Features not exhibiting a significant difference comprised GLIH-based standard deviation; GLCOM-based energy and entropy; GLNDM-based coarseness and contrast; and GLZSM-based low gray-level zone emphasis, high gray-level zone emphasis, short zone low gray-level emphasis, long zone low gray-level emphasis, long zone high gray-level emphasis, and zone size nonuniformity. Conclusion: The extent to which PET imaging disturbs texture appearance is feature-dependent and can be substantial.
It is thus advised that the use of PET texture parameters for predictive and prognostic measurements in the oncologic setting await further systematic and critical evaluation.
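As a concrete illustration of the co-occurrence features named above, here is a minimal sketch of GLCOM energy and entropy for a single offset (the right-hand neighbour). The tiny test image is illustrative; real radiomics pipelines aggregate several offsets and grey-level bins.

```python
import math
from collections import Counter

# Hedged sketch: GLCM energy and entropy for one offset on a small 2-D image.
def glcm_features(image):
    pairs = Counter()
    for row in image:
        for a, b in zip(row, row[1:]):   # horizontal co-occurrences
            pairs[(a, b)] += 1
    total = sum(pairs.values())
    probs = [c / total for c in pairs.values()]
    energy = sum(p * p for p in probs)                  # angular second moment
    entropy = -sum(p * math.log2(p) for p in probs)     # co-occurrence entropy
    return energy, entropy

img = [[0, 0, 1],
       [0, 1, 1],
       [1, 1, 0]]
print(glcm_features(img))
```

Each of the 28 features studied is a similar scalar summary of one of the four matrix families.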

  12. Simulation analysis of capacity and performance improvement in wastewater treatment plants: Case study of Alexandria eastern plant

    NASA Astrophysics Data System (ADS)

    Moursy, Aly; Sorour, Mohamed T.; Moustafa, Medhat; Elbarqi, Walid; Fayd, Mai; Elreedy, Ahmed

    2018-05-01

    This study concerns the simulation-supported upgrading of a real domestic wastewater treatment plant (WWTP). The main aims of this work are to: (1) decide between two technologies, membrane bioreactor (MBR) and integrated fixed-film activated sludge (IFAS), to improve the WWTP's capacity and its nitrogen-removal efficiency; and (2) perform a cost-estimation analysis for the two proposed solutions. The model used was calibrated on data from the existing WWTP, namely the Eastern plant, located in Alexandria, Egypt. The activated sludge model No. 1 (ASM1) was used for the model analysis in the GPS-X 7 software. Steady-state analysis revealed that both techniques achieved high performance, in compliance with Egyptian standards, with MBR performing better. Nonetheless, both systems showed poor nitrogen-removal efficiency in the current configuration, which indicates that the plant needs a modification to add an anaerobic treatment unit before the aerobic zone.

  13. Description and quantitative modeling of oolitic reservoir analogs within the lower Kansas City Group (Pennsylvanian), southeastern Kansas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    French, J.A.; Watney, W.L.

    A significant number of petroleum reservoirs within the Kansas City Group in central and western Kansas are dominantly oolitic grainstones that cap 10- to 30-m-thick, shallowing-upward, carbonate-rich depositional sequences. Coeval units that occur at and near the surface in southeastern Kansas contain similar porous lithofacies that have been examined in detail via cores, outcrops, and an extensive log database to better understand the equivalent reservoirs. These studies suggest that individual oolitic, reservoir-quality units in the Bethany Falls Limestone (equivalent to the K zone in the subsurface) developed at several relative sea-level stands that occurred during development of a highstand systems tract within this depositional sequence. As many as three grain-rich parasequences may occur at a given location. The occurrence of multiple parasequences indicates a relatively complex history of K-zone deposition, which likely had significant effects on reservoir architecture. Two-dimensional forward modeling of this sequence with our interactive, PC-based software has revealed that only limited combinations of parameters such as shelf configuration, eustasy, sedimentation rates, and subsidence rates generate strata successions similar to those observed. Sensitivity analysis coupled with regional characterization of processes suggests ranges of values that these parameters could have had during deposition of these units. The ultimate goal of this modeling is to improve our ability to predict facies development in areas of potential and known hydrocarbon accumulations.
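The core bookkeeping of such forward stratigraphic modeling compares accommodation (eustatic sea-level change plus subsidence) with sediment supply at each time step. A hedged sketch follows; all rates and the sinusoidal eustatic curve are illustrative, not parameters from the study.

```python
import math

# Hedged sketch of 1-D forward stratigraphic bookkeeping: deposit sediment
# up to the available accommodation at each step. Units are arbitrary.
def forward_model(n_steps=100, dt=1.0, subsidence_rate=0.1,
                  sed_rate=0.3, eustasy_amp=5.0, eustasy_period=25.0):
    depth = 10.0          # initial water depth (accommodation)
    thickness = []        # preserved thickness per step
    for step in range(n_steps):
        # incremental eustatic sea-level change over this step
        sea_level_change = eustasy_amp * (
            math.sin(2 * math.pi * (step + 1) * dt / eustasy_period)
            - math.sin(2 * math.pi * step * dt / eustasy_period))
        depth += subsidence_rate * dt + sea_level_change  # new accommodation
        deposited = min(sed_rate * dt, max(depth, 0.0))   # cannot fill above sea level
        depth -= deposited
        thickness.append(deposited)
    return thickness

column = forward_model()
print(sum(column))  # total preserved thickness
```

Shallowing-upward parasequences emerge in such models when sediment supply repeatedly outpaces the accommodation created by each relative sea-level rise.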

  14. Library Databases as Unexamined Classroom Technologies

    ERIC Educational Resources Information Center

    Faix, Allison

    2014-01-01

    In their 1994 article, "The Politics of the Interface: Power and its Exercise in Electronic Contact Zones," compositionists Cynthia Selfe and Richard Selfe give examples of how certain features of word processing software and other programs used in writing classrooms (including their icons, clip art, interfaces, and file structures) can…

  15. Application of Neural Network Technologies for Price Forecasting in the Liberalized Electricity Market

    NASA Astrophysics Data System (ADS)

    Gerikh, Valentin; Kolosok, Irina; Kurbatsky, Victor; Tomin, Nikita

    2009-01-01

    The paper presents the results of experimental studies concerning calculation of electricity prices in different price zones in Russia and Europe. The calculations are based on the intelligent software "ANAPRO" that implements the approaches based on the modern methods of data analysis and artificial intelligence technologies.

  16. Systems Librarian and Automation Review.

    ERIC Educational Resources Information Center

    Schuyler, Michael

    1992-01-01

    Discusses software sharing on computer networks and the need for proper bandwidth; and describes the technology behind FidoNet, a computer network made up of electronic bulletin boards. Network features highlighted include front-end mailers, Zone Mail Hour, Nodelist, NetMail, EchoMail, computer conferences, tosser and scanner programs, and host…

  17. Designing Asynchronous, Text-Based Computer Conferences: Ten Research-Based Suggestions

    ERIC Educational Resources Information Center

    Choitz, Paul; Lee, Doris

    2006-01-01

    Asynchronous computer conferencing refers to the use of computer software and a network enabling participants to post messages that allow discourse to continue even though interactions may be extended over days and weeks. Asynchronous conferences are time-independent, adapting to multiple time zones and learner schedules. Such activities as…

  18. [New methodological approaches to establishing the sizes of the sanitary protection zone and roadside clear zones of civil airports].

    PubMed

    Kartyshev, O A

    2013-01-01

    This circumstance leads to considerable mistakes in the creation of SPZ borders of airports; in some cases it impedes development of the latter and causes objective difficulties for the hygienic assessment of projects. In this article, the results of studies on the creation and validation of two new domestic methods for constructing zones of aircraft-noise impact and of dispersion of pollutant concentrations, used in assessing the negative impact of airports, are considered. Both branch methods, agreed upon with the Ministry of Transport, have been harmonized with ICAO (International Civil Aviation Organization) requirements. The results of full-scale measurements have confirmed the suitability of the developed software for implementing these methods in the formation of a common SPZ border of an airport.

  19. Heart Rate Reduction With Ivabradine Protects Against Left Ventricular Remodeling by Attenuating Infarct Expansion and Preserving Remote-Zone Contractile Function and Synchrony in a Mouse Model of Reperfused Myocardial Infarction.

    PubMed

    O'Connor, Daniel M; Smith, Robert S; Piras, Bryan A; Beyers, Ronald J; Lin, Dan; Hossack, John A; French, Brent A

    2016-04-22

    Ivabradine selectively inhibits the pacemaker current of the sinoatrial node, slowing heart rate. Few studies have examined the effects of ivabradine on the mechanical properties of the heart after reperfused myocardial infarction (MI). Advances in ultrasound speckle-tracking allow strain analyses to be performed in small-animal models, enabling the assessment of regional mechanical function. After 1 hour of coronary occlusion followed by reperfusion, mice received 10 mg/kg per day of ivabradine dissolved in drinking water (n=10), or were treated as infarcted controls (n=9). Three-dimensional high-frequency echocardiography was performed at baseline and at days 2, 7, 14, and 28 post-MI. Speckle-tracking software was used to calculate intramural longitudinal myocardial strain (Ell) and strain rate. Standard deviation time to peak radial strain (SD Tpeak Err) and temporal uniformity of strain were calculated from short-axis cines acquired in the left ventricular remote zone. Ivabradine reduced heart rate by 8% to 16% over the course of 28 days compared to controls (P<0.001). On day 28 post-MI, the ivabradine group was found to have significantly smaller end-systolic volumes, greater ejection fraction, reduced wall thinning, and greater peak Ell and Ell rate in the remote zone, as well as globally. Temporal uniformity of strain and SD Tpeak Err were significantly smaller in the ivabradine-treated group by day 28 (P<0.05). High-frequency ultrasound speckle-tracking demonstrated decreased left ventricular remodeling and dyssynchrony, as well as improved mechanical performance in remote myocardium after heart rate reduction with ivabradine. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
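The dyssynchrony index SD Tpeak reported above is the standard deviation of time-to-peak strain across myocardial segments: a synchronously contracting ventricle yields a small value, a dyssynchronous one a large value. A minimal sketch with illustrative times (not data from the study):

```python
import statistics

# Hedged sketch: SD Tpeak as the standard deviation of per-segment
# time-to-peak strain. Times below are illustrative values in milliseconds.
def sd_tpeak(times_to_peak_ms):
    return statistics.stdev(times_to_peak_ms)

synchronous = [100, 102, 98, 101, 99, 100]
dyssynchronous = [80, 140, 95, 160, 110, 70]
print(sd_tpeak(synchronous), sd_tpeak(dyssynchronous))
```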

  20. Development and Application of Nonlinear Land-Use Regression Models

    NASA Astrophysics Data System (ADS)

    Champendal, Alexandre; Kanevski, Mikhail; Huguenot, Pierre-Emmanuel

    2014-05-01

    The problem of air-pollution modelling in urban zones is of great importance from both scientific and applied points of view. At present there are several fundamental approaches, based either on science-based modelling (air-pollution dispersion) or on the application of space-time geostatistical methods (e.g. the family of kriging models or conditional stochastic simulations). Recently, there have been important developments in so-called Land Use Regression (LUR) models. These models take into account geospatial information (e.g. traffic network, sources of pollution, average traffic, population census, land use, etc.) at different scales, for example using buffering operations. Usually the dimension of the input space (number of independent variables) is in the range of 10-100. It has been shown that LUR models have some potential to model complex and highly variable patterns of air pollution in urban zones. Most LUR models currently used are linear. In the present research, nonlinear LUR models are developed and applied to the city of Geneva. Two nonlinear data-driven models were elaborated: a multilayer perceptron and a random forest. An important part of the research also deals with a comprehensive exploratory data analysis using statistical, geostatistical and time-series tools. Unsupervised self-organizing maps were applied to better understand space-time patterns of the pollution. The real-data case study deals with spatio-temporal air-pollution data for Geneva (2002-2011). Nitrogen dioxide (NO2) was selected as the target pollutant: it affects human health and plants, and it contributes to the phenomenon of acid rain. Its negative effects on plants include reduced growth, yield and pesticide resistance; on materials, nitrogen dioxide increases corrosion. The data used for this study consist of a set of 106 NO2 passive sensors.
80 sensors were used to build the models and the remaining 36 constituted the testing set. Missing data were completed using multiple linear regression, and annual average values of pollutant concentrations were computed. The sensors are dispersed homogeneously over the central urban area of Geneva. The main result of the study is that the nonlinear LUR models developed demonstrated their efficiency in modelling the complex phenomena of air pollution in urban zones and significantly reduced the testing error in comparison with linear models. Further research deals with the development and application of other nonlinear data-driven models (Kanevski et al. 2009). References: Kanevski M., Pozdnoukhov A. and Timonin V. (2009). Machine Learning for Spatial Environmental Data: Theory, Applications and Software. EPFL Press, Lausanne.
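The "buffering operations" mentioned above turn geospatial layers into LUR predictors, for example by counting pollution sources within a given radius of each sensor. A minimal sketch with illustrative coordinates and radii (a real LUR model builds many such buffered variables per sensor):

```python
import math

# Hedged sketch: one buffered land-use-regression predictor, counting
# point sources within a radius of a sensor. Coordinates are illustrative.
def buffer_count(sensor, sources, radius):
    sx, sy = sensor
    return sum(1 for (x, y) in sources
               if math.hypot(x - sx, y - sy) <= radius)

sources = [(0, 0), (50, 0), (200, 200), (30, 40)]
sensor = (0, 0)
print([buffer_count(sensor, sources, r) for r in (60, 300)])
```

Repeating this over several radii and several layers (traffic, population, land use) yields the 10-100 input variables mentioned above, which the multilayer perceptron or random forest then maps to NO2 concentration.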

  1. DEVELOPMENT OF A PORTABLE SOFTWARE LANGUAGE FOR PHYSIOLOGICALLY-BASED PHARMACOKINETIC (PBPK) MODELS

    EPA Science Inventory

    The PBPK modeling community has had a long-standing problem with modeling software compatibility. The numerous software packages used for PBPK models are, at best, minimally compatible. This creates problems ranging from model obsolescence due to software support discontinuation...

  2. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  3. Cost Estimation of Software Development and the Implications for the Program Manager

    DTIC Science & Technology

    1992-06-01

    Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome

  4. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.

  5. Kinetic synergistic transitions in the Ostwald ripening processes

    NASA Astrophysics Data System (ADS)

    Sachkov, I. N.; Turygina, V. F.; Dolganov, A. N.

    2018-01-01

    An approach is proposed to the mathematical description of kinetic transitions in the Ostwald ripening of a volatile substance in nonuniformly heated porous materials. It is based on the finite element method, and computer software implementing it has been developed. The main feature of the software is the calculation of evaporation and condensation fluxes on the walls of a nonuniformly heated cylindrical capillary. Kinetic transitions are detected for three modes of volatile-substance migration, which differ in the location of their condensation zones. The controlling dimensionless parameters of the kinetic transition were revealed during the research, and a phase diagram of the realized Ostwald ripening modes is presented.

  6. Brain metabolic pattern analysis using a magnetic resonance spectra classification software in experimental stroke.

    PubMed

    Jiménez-Xarrié, Elena; Davila, Myriam; Candiota, Ana Paula; Delgado-Mederos, Raquel; Ortega-Martorell, Sandra; Julià-Sapé, Margarida; Arús, Carles; Martí-Fàbregas, Joan

    2017-01-13

    Magnetic resonance spectroscopy (MRS) provides non-invasive information about the metabolic pattern of the brain parenchyma in vivo. The SpectraClassifier software performs MRS pattern recognition by determining the spectral features (metabolites) that can be used to classify spectra objectively. Our aim was to develop an Infarct Evolution Classifier and a Brain Regions Classifier in a rat model of focal ischemic stroke using SpectraClassifier. A total of 164 single-voxel proton spectra obtained with a 7 Tesla magnet at an echo time of 12 ms from non-infarcted parenchyma, subventricular zones and infarcted parenchyma were analyzed with SpectraClassifier (http://gabrmn.uab.es/?q=sc). The spectra corresponded to Sprague-Dawley rats (healthy rats, n = 7) and stroke rats at day 1 post-stroke (acute phase, n = 6) and at days 7 ± 1 post-stroke (subacute phase, n = 14). In the Infarct Evolution Classifier, spectral features contributed by lactate + mobile lipids (1.33 ppm), total creatine (3.05 ppm) and mobile lipids (0.85 ppm) distinguished among non-infarcted parenchyma (100% sensitivity and 100% specificity), the acute phase of infarct (100% sensitivity and 95% specificity) and the subacute phase of infarct (78% sensitivity and 100% specificity). In the Brain Regions Classifier, spectral features contributed by myoinositol (3.62 ppm) and total creatine (3.04/3.05 ppm) distinguished among infarcted parenchyma (100% sensitivity and 98% specificity), non-infarcted parenchyma (84% sensitivity and 84% specificity) and subventricular zones (76% sensitivity and 93% specificity). SpectraClassifier identified candidate biomarkers for infarct evolution (mobile lipid accumulation) and for different brain regions (myoinositol content).
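The per-class sensitivity and specificity figures quoted above are computed from the classifier's confusion counts. A minimal sketch with illustrative labels (not data from the study):

```python
# Hedged sketch: per-class sensitivity and specificity from predicted labels.
# The label values below are illustrative only.
def sensitivity_specificity(y_true, y_pred, positive_class):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive_class and p == positive_class)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive_class and p != positive_class)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive_class and p != positive_class)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive_class and p == positive_class)
    return tp / (tp + fn), tn / (tn + fp)

y_true = ["infarct", "infarct", "normal", "normal", "svz"]
y_pred = ["infarct", "infarct", "normal", "svz", "svz"]
print(sensitivity_specificity(y_true, y_pred, "infarct"))
```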

  7. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  8. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  9. The instrument control software package for the Habitable-Zone Planet Finder spectrometer

    NASA Astrophysics Data System (ADS)

    Bender, Chad F.; Robertson, Paul; Stefansson, Gudmundur Kari; Monson, Andrew; Anderson, Tyler; Halverson, Samuel; Hearty, Frederick; Levi, Eric; Mahadevan, Suvrath; Nelson, Matthew; Ramsey, Larry; Roy, Arpita; Schwab, Christian; Shetrone, Matthew; Terrien, Ryan

    2016-08-01

    We describe the Instrument Control Software (ICS) package that we have built for The Habitable-Zone Planet Finder (HPF) spectrometer. The ICS controls and monitors instrument subsystems, facilitates communication with the Hobby-Eberly Telescope facility, and provides user interfaces for observers and telescope operators. The backend is built around the asynchronous network software stack provided by the Python Twisted engine, and is linked to a suite of custom hardware communication protocols. This backend is accessed through Python-based command-line and PyQt graphical frontends. In this paper we describe several of the customized subsystem communication protocols that provide access to and help maintain the hardware systems that comprise HPF, and show how asynchronous communication benefits the numerous hardware components. We also discuss our Detector Control Subsystem, built as a set of custom Python wrappers around a C-library that provides native Linux access to the SIDECAR ASIC and Hawaii-2RG detector system used by HPF. HPF will be one of the first astronomical instruments on sky to utilize this native Linux capability through the SIDECAR Acquisition Module (SAM) electronics. The ICS we have created is very flexible, and we are adapting it for NEID, NASA's Extreme Precision Doppler Spectrometer for the WIYN telescope; we will describe this adaptation, and describe the potential for use in other astronomical instruments.
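The asynchronous command-handling pattern described above can be sketched with the stdlib asyncio module (HPF's actual backend uses Twisted; the subsystem names and commands below are illustrative, not the ICS API):

```python
import asyncio

# Hedged sketch: concurrent subsystem commands dispatched through an
# asynchronous backend, as in the ICS design. Names are illustrative.
async def handle_command(subsystem, command):
    await asyncio.sleep(0)  # yield to the event loop (placeholder for real I/O)
    return f"{subsystem}: {command} ok"

async def main():
    # several hardware subsystems serviced concurrently, none blocking the others
    return await asyncio.gather(
        handle_command("detector", "expose"),
        handle_command("calibration", "lamp_on"),
        handle_command("environment", "read_temps"),
    )

print(asyncio.run(main()))
```

The benefit for an instrument with many slow hardware links is exactly this: a stalled subsystem response does not block monitoring or commands to the others.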

  10. Influence of physical factors and geochemical conditions on groundwater acidification during enhanced reductive dechlorination

    NASA Astrophysics Data System (ADS)

    Brovelli, A.; Barry, D. A.; Robinson, C.; Gerhard, J.

    2010-12-01

    Enhanced reductive dehalogenation is an attractive in situ treatment technology for chlorinated contaminants. The process includes two acid-forming microbial reactions: fermentation of an organic substrate, yielding short-chain fatty acids, and dehalogenation, yielding hydrochloric acid. The accumulation of acids and the resulting drop in groundwater pH are controlled by the mass and distribution of chlorinated solvents in the source zone, the type of electron donor, the availability of alternative terminal electron acceptors, and the presence of soil mineral phases able to buffer the pH (such as carbonates). Groundwater acidification may reduce or halt microbial activity, and thus dehalogenation, significantly increasing the time and cost required to remediate the aquifer. For this reason, research in this area is gaining increasing attention. In previous work (Robinson et al., 2009, Sci. Total Environ. 407:4560; Robinson and Barry, 2009, Environ. Model. Softw. 24:1332; Brovelli et al., 2010, submitted), a detailed geochemical and groundwater-flow model able to predict the pH change occurring during reductive dehalogenation was developed. The model accounts for the main processes influencing groundwater pH, including the groundwater composition, the electron donor used and interactions with soil mineral phases. In this study, the model was applied to investigate how spatial variability at the field scale affects groundwater pH and dechlorination rates. Numerical simulations were conducted to examine the influence of heterogeneous hydraulic conductivity on the distribution of the injected, fermentable substrate and on the accumulation/dilution of the acidic products of reductive dehalogenation. The influence of the geometry of the DNAPL source zone was studied, as well as the spatial distribution of soil minerals.
The results of this study showed that the heterogeneous distribution of soil properties has a potentially large effect on remediation efficiency. For example, zones of high hydraulic conductivity can prevent the accumulation of acids and alleviate the problem of groundwater acidification. The conclusions drawn and insights gained from this modeling study will be useful for designing improved in situ enhanced-dehalogenation remediation schemes.
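The buffering effect described above can be illustrated with a deliberately crude mass balance: mineral-derived alkalinity neutralizes the hydrochloric acid before the pH drops. This is a hedged, strong-acid sketch with illustrative numbers, not the detailed geochemical model of the cited work.

```python
import math

# Hedged sketch: pH after adding strong acid (HCl from dechlorination),
# with alkalinity (e.g. from carbonates) consuming acid first.
# Strong-acid approximation; all concentrations are illustrative.
def ph_after_acid(acid_mol_per_l, alkalinity_mol_per_l):
    residual = max(acid_mol_per_l - alkalinity_mol_per_l, 0.0)
    h_plus = max(residual, 1e-7)  # floor at neutral water
    return -math.log10(h_plus)

print(ph_after_acid(1e-4, 0.0))   # unbuffered: pH drops sharply
print(ph_after_acid(1e-4, 2e-4))  # carbonate-buffered: pH stays near neutral
```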

  11. Mapping for the masses: using free remote sensing data for disaster management

    NASA Astrophysics Data System (ADS)

    Teeuw, R.; McWilliam, N.; Morris, N.; Saunders, C.

    2009-04-01

    We examine the uses of free satellite imagery and Digital Elevation Models (DEMs) for disaster management, targeting three data sources: the United Nations Charter on Space and Disasters, Google Earth, and internet-based satellite data archives such as the Global Land Cover Facility (GLCF). The research has assessed SRTM and ASTER DEM data and Landsat TM/ETM+ and ASTER imagery, as well as utilising datasets and basic GIS operations available via Google Earth. As an aid to Disaster Risk Reduction, four sets of maps can be produced from satellite data: (i) Multiple Geohazards: areas prone to slope instability, coastal inundation and fluvial flooding; (ii) Vulnerability: population density, habitation types, land cover types and infrastructure; (iii) Disaster Risk: produced by combining severity scores from (i) and (ii); (iv) Reconstruction: zones of rock/sediment with construction uses; areas of woodland (for fuel/construction); water sources; transport routes; zones suitable for re-settlement. This set of Disaster Risk Reduction maps is ideal for regional (1:50,000 to 1:250,000 scale) planning in low-income countries: more detailed assessments require relatively expensive high-resolution satellite imagery or aerial photography, although Google Earth has a good track record of posting high-resolution imagery of disaster zones (e.g. the 2008 Burma storm surge). The Disaster Risk maps highlight areas of maximum risk for a region's emergency planners and decision makers, enabling various types of public education and other disaster mitigation measures. The Reconstruction map also helps to save lives by facilitating disaster recovery. Several problems have been identified. Access to UN Charter imagery is straightforward after a disaster, but very difficult when assessing pre-disaster indicators; the data supplied also tend to be pre-processed, whereas some relief agencies would prefer raw data.
Limited and expensive internet connectivity in many developing countries restricts access to archives of free satellite data, such as the GLCF. Finally, data integration, spatial/temporal analysis and map production are all hindered by the high price of most GIS software, making the development of suitable open-source software a priority.
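The Disaster Risk step above (combining severity scores from the geohazard and vulnerability maps) can be sketched as a simple per-cell combination. The multiplicative rule and the 0-5 score range below are illustrative assumptions, not taken from the paper:

```python
def risk_score(hazard, vulnerability):
    """Combine per-cell severity scores (assumed 0-5 each) into a disaster-risk
    score; a multiplicative rule is one common convention, not the paper's own."""
    return hazard * vulnerability

# Three raster cells: (geohazard severity, vulnerability severity)
cells = [(4, 3), (1, 5), (0, 2)]
risks = [risk_score(h, v) for h, v in cells]  # risk peaks where both scores are high
```

A multiplicative rule has the convenient property that risk vanishes wherever either hazard or exposed population is absent, which additive scoring does not guarantee.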

  12. Assessment of eco-environmental quality of Western Taiwan Straits Economic Zone.

    PubMed

    Ma, He; Shi, Longyu

    2016-05-01

    Regional eco-environmental quality is the key and foundation of the sustainable socio-economic development of a region. Eco-environmental quality assessment can reveal the capacity for sustainable socio-economic development in a region and the degree of coordination between social production and the living environment. As part of a new development strategy for Fujian Province, the Western Taiwan Straits Economic Zone (hereafter referred to as the Economic Zone) provides an important guarantee for the development of China's southeastern coastal area. Based on ecological and remote sensing data on the Economic Zone obtained in 2000, 2005, and 2010, this study investigated county-level administrative regions with a comprehensive index of eco-environmental indicators. An objective weighting method was used to determine the importance of each indicator. This led to the development of an indicator system to assess the eco-environmental quality of the Economic Zone. ArcGIS software was used to assess the eco-environmental quality of the Economic Zone based on each indicator. The eco-environmental quality index (EQI) of the county-level administrative regions was calculated. The overall eco-environmental quality of the Economic Zone during the period studied is described and analyzed. The results show that the overall eco-environmental quality of the Economic Zone is satisfactory, but significant intraregional differences still exist. The key to improving the overall eco-environmental quality of this area is to restore vegetation and preserve biodiversity.

  13. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software. Such evaluation enables improvements in the software process. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. It is also very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, bringing together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system, named the fuzzy hierarchical usability model, can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also provides a detailed comparison of the proposed model with existing usability models.

  14. Correlation analysis of air pollutant index levels and dengue cases across five different zones in Selangor, Malaysia.

    PubMed

    Thiruchelvam, Loshini; Dass, Sarat C; Zaki, Rafdzah; Yahya, Abqariyah; Asirvadam, Vijanth S

    2018-05-07

    This study investigated the potential relationship between dengue cases and air quality, as measured by the Air Pollution Index (API), for five zones in the state of Selangor, Malaysia. Dengue case patterns can be learned using prediction models based on feedback (lagged terms). However, the question of whether air quality affects dengue cases has not been thoroughly investigated using such feedback models. This work developed dengue prediction models using the autoregressive integrated moving average (ARIMA) and ARIMA with an exogenous variable (ARIMAX) time series methodologies, with API as the exogenous variable. The Box-Jenkins approach based on maximum likelihood was used for analysis, as it gives effective model estimates and predictions. Three stages of model comparison were carried out for each zone: first with ARIMA models without API, then ARIMAX models with API data from the API station for that zone, and finally ARIMAX models with API data from the zone and spatially neighbouring zones. The Bayesian Information Criterion (BIC) provides goodness-of-fit versus parsimony comparisons between all elicited models. Our study found that ARIMA models, with the lowest BIC value, outperformed the rest in all five zones. The BIC values for the zone of Kuala Selangor were -800.66, -796.22, and -790.5229, respectively, for ARIMA only, ARIMAX with a single API component, and ARIMAX with API components from its zone and spatially neighbouring zones. Therefore, we concluded that API levels, either temporally for each zone or spatio-temporally based on neighbouring zones, do not have a significant effect on dengue cases.
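The BIC-based comparison of a model with and without an exogenous variable can be illustrated with a stdlib-only sketch. The AR(1)/ARX(1) forms, the synthetic series, and the Gaussian BIC formula below are simplifying assumptions standing in for the study's full Box-Jenkins ARIMA fits:

```python
import math
import random

random.seed(7)

# Synthetic weekly case counts driven only by their own past (no true API effect),
# mirroring the ARIMA-vs-ARIMAX comparison on a toy scale.
n = 200
y = [10.0]
for _ in range(n - 1):
    y.append(5.0 + 0.6 * y[-1] + random.gauss(0.0, 1.0))
api = [random.gauss(50.0, 10.0) for _ in range(n)]  # irrelevant exogenous series

def ols_rss(X, t):
    """Residual sum of squares from a least-squares fit via the normal equations."""
    k = len(X[0])
    # Augmented matrix [X'X | X't], solved by Gaussian elimination with pivoting
    a = [[sum(r[i] * r[j] for r in X) for j in range(k)]
         + [sum(r[i] * ti for r, ti in zip(X, t))] for i in range(k)]
    for c in range(k):
        p = max(range(c, k), key=lambda r: abs(a[r][c]))
        a[c], a[p] = a[p], a[c]
        for r in range(c + 1, k):
            f = a[r][c] / a[c][c]
            for j in range(c, k + 1):
                a[r][j] -= f * a[c][j]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (a[r][k] - sum(a[r][j] * beta[j] for j in range(r + 1, k))) / a[r][r]
    return sum((ti - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, ti in zip(X, t))

def bic(rss, n_obs, k_params):
    """Gaussian BIC up to an additive constant: lower is better."""
    return n_obs * math.log(rss / n_obs) + k_params * math.log(n_obs)

target = y[1:]
rss_ar = ols_rss([[1.0, y[i - 1]] for i in range(1, n)], target)           # AR(1)
rss_arx = ols_rss([[1.0, y[i - 1], api[i]] for i in range(1, n)], target)  # ARX(1) + API
bic_ar, bic_arx = bic(rss_ar, n - 1, 2), bic(rss_arx, n - 1, 3)
# The extra regressor always reduces RSS slightly, but when it is irrelevant
# the ln(n) penalty usually makes BIC prefer the plain AR model, matching the
# study's conclusion for API.
```

The key point is that RSS alone would always favour the larger model; BIC's parameter penalty is what lets the comparison conclude that API adds nothing.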

  15. Long-term predictions of minewater geothermal systems heat resources

    NASA Astrophysics Data System (ADS)

    Harcout-Menou, Virginie; De Ridder, Fjo; Laenen, Ben; Ferket, Helga

    2014-05-01

    Abandoned underground mines usually flood due to the natural rise of the water table. In most cases the process is relatively slow, giving the mine water time to equilibrate thermally with the surrounding rock massif. Typical mine water temperatures are too low for direct heating, but are well suited for combination with heat pumps. For example, heat extracted from the mine can be used during winter for space heating, while the process can be reversed during summer to provide space cooling. Although not yet widespread, the use of low-temperature geothermal energy from abandoned mines has already been implemented in the Netherlands, Spain, the USA, Germany and the UK. Reliable reservoir modelling is crucial to predict how geothermal mine water systems will react to predefined exploitation schemes and to define the energy potential and development strategy of large-scale geothermal cold/heat storage mine water systems. However, most numerical reservoir modelling software is developed for typical environments, such as porous media (including many codes developed for petroleum reservoirs or groundwater formations), and cannot be applied to mine systems. Indeed, mines are atypical environments that encompass different types of flow, namely porous media flow, fracture flow and open pipe flow, usually described with different modelling codes. Ideally, 3D models accounting for the subsurface geometry, geology, hydrogeology, thermal aspects and flooding history of the mine, as well as the long-term effects of heat extraction, should be used. A new modelling approach is proposed here to predict the long-term behaviour of mine water geothermal systems in a fast and reliable manner. The simulation method integrates concepts for heat and mass transport through various media (e.g., back-filled areas, fractured rock, fault zones). As a base, the standard software EPANET2 (Rossman 1999; 2000) was used.
Additional equations describing heat flow through the mine (both through open pipes and from the rock massif) have been implemented. Among other techniques, parametric methods are used to bypass some shortcomings in the physical models used for the subsurface. The advantage is that the complete geometry of the mine workings can be integrated and that computation is fast enough to allow several scenarios (e.g. contributions from fault zones, different assumptions about the actual status of shafts, drifts and mined-out areas) to be implemented and tested efficiently (Ferket et al., 2011). EPANET makes it possible to incorporate the full complexity of the subsurface mine structure: the flooded mine is treated as a network of pipes, each with a custom-defined diameter, length and roughness.
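Treating each mine working as a pipe with its own length, diameter and roughness means each pipe contributes a head loss; a minimal sketch using the SI Hazen-Williams relation (one of the head-loss options EPANET supports) follows. All numeric values are illustrative assumptions, not data from the study:

```python
def hazen_williams_headloss(flow, length, diameter, c_factor):
    """Head loss (m) along one pipe, SI Hazen-Williams form:
    h = 10.67 * L * Q^1.852 / (C^1.852 * D^4.87)

    flow in m^3/s, length and diameter in m, c_factor a dimensionless
    roughness coefficient (lower C = rougher pipe).
    """
    return 10.67 * length * flow ** 1.852 / (c_factor ** 1.852 * diameter ** 4.87)

# A flooded drift modelled as one large, rough "pipe" (assumed parameters)
h_drift = hazen_williams_headloss(flow=0.05, length=800.0, diameter=2.5, c_factor=60.0)
```

In a full network model, the same relation is written for every pipe and solved together with mass balance at the junctions, which is exactly the computation EPANET performs.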

  16. Bathymetric surveys of the Kootenai River near Bonners Ferry, Idaho, water year 2011

    USGS Publications Warehouse

    Fosness, Ryan L.

    2013-01-01

    In 2009, the Kootenai Tribe of Idaho released and implemented the Kootenai River Habitat Restoration Master Plan. This plan aimed to restore, enhance, and maintain the Kootenai River habitat and landscape to support and sustain habitat conditions for aquatic species and animal populations. In support of these restoration efforts, the U.S. Geological Survey, in cooperation with the Kootenai Tribe of Idaho, conducted high-resolution multibeam echosounder bathymetric surveys in May, June, and July 2011, as a baseline bathymetric monitoring survey on the Kootenai River near Bonners Ferry, Idaho. Three channel patterns or reaches exist in the study area—braided, meander, and a transitional zone connecting the braided and meander reaches. Bathymetric data were collected at three study areas in 2011 to provide: (1) surveys in unmapped portions of the meander reach; (2) monitoring of the presence and extent of sand along planned lines within a section of the meander reach; and (3) monitoring aggradation and degradation of the channel bed at specific cross sections within the braided reach and transitional zone. The bathymetric data will be used to update and verify flow models, calibrate and verify sediment transport modeling efforts, and aid in the biological assessment in support of the Kootenai River Habitat Restoration Master Plan. The data and planned lines for each study reach were produced in ASCII XYZ format supported by most geospatial software.

  17. State Enterprise Zone Programs: Have They Worked?

    ERIC Educational Resources Information Center

    Peters, Alan H.; Fisher, Peter S.

    The effectiveness of state enterprise zone programs was examined by using a hypothetical-firm model called the Tax and Incentives Model-Enterprise Zones (TAIM-ez) model to analyze the value of enterprise zone incentives to businesses across the United States and especially in the 13 states that had substantial enterprise zone programs by 1990. The…

  18. An open source software for fast grid-based data-mining in spatial epidemiology (FGBASE).

    PubMed

    Baker, David M; Valleron, Alain-Jacques

    2014-10-30

    Examining whether disease cases are clustered in space is an important part of epidemiological research. Another important part of spatial epidemiology is testing whether patients suffering from a disease are more, or less, exposed to environmental factors of interest than adequately defined controls. Both approaches involve determining the number of cases and controls (or population at risk) in specific zones. For cluster searches, this often must be done for millions of different zones. Doing this by calculating distances can lead to very lengthy computations. In this work we discuss the computational advantages of geographical grid-based methods, and introduce an open-source software program (FGBASE) that we have created for this purpose. Geographical grids based on the Lambert Azimuthal Equal Area projection are well suited for spatial epidemiology because they preserve area: each cell of the grid has the same area. We describe how data are projected onto such a grid, as well as grid-based algorithms for spatial epidemiological data-mining. The software program (FGBASE) that we have developed implements these grid-based methods. The grid-based algorithms perform extremely fast. This is particularly the case for cluster searches. When applied to a cohort of French Type 1 Diabetes (T1D) patients, as an example, the grid-based algorithms detected potential clusters in a few seconds on a modern laptop. This compares very favorably to an equivalent cluster search using distance calculations instead of a grid, which took over 4 hours on the same computer. In the case study we discovered 4 potential clusters of T1D cases near the cities of Le Havre, Dunkerque, Toulouse and Nantes. One example of environmental analysis with our software was to test whether a significant association could be found between T1D cases and distance to vineyards with heavy pesticide use. None was found. In both examples, the software facilitates the rapid testing of hypotheses.
Grid-based algorithms for mining spatial epidemiological data provide advantages in terms of computational complexity thus improving the speed of computations. We believe that these methods and this software tool (FGBASE) will lower the computational barriers to entry for those performing epidemiological research.
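The speed advantage described above comes from bucketing projected points into equal-area cells in a single pass, rather than computing all pairwise distances. A minimal sketch, where the 10 km cell size and the coordinates are illustrative assumptions (FGBASE's actual grid parameters may differ):

```python
from collections import Counter

CELL = 10_000.0  # assumed 10 km cells on an equal-area (e.g. Lambert azimuthal) projection

def cell_of(x, y, size=CELL):
    """Integer grid index of a projected point; every cell covers the same area."""
    return (int(x // size), int(y // size))

def counts_per_cell(points):
    """Bucket all points in one O(n) pass, instead of O(n^2) pairwise distances."""
    return Counter(cell_of(x, y) for x, y in points)

# Projected case coordinates in metres (illustrative, not FGBASE data)
cases = [(1200.0, 800.0), (9500.0, 200.0), (15000.0, 500.0), (15500.0, 900.0)]
by_cell = counts_per_cell(cases)
# A cluster scan now compares each cell's observed count against its expected
# count, touching only occupied cells rather than every pair of points.
```

Because the projection preserves area, every cell represents the same amount of ground, so raw cell counts are directly comparable without per-zone area corrections.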

  19. TerraFERMA: The Transparent Finite Element Rapid Model Assembler for multi-physics problems in the solid Earth sciences

    NASA Astrophysics Data System (ADS)

    Spiegelman, M. W.; Wilson, C. R.; Van Keken, P. E.

    2013-12-01

    We announce the release of a new software infrastructure, TerraFERMA, the Transparent Finite Element Rapid Model Assembler for the exploration and solution of coupled multi-physics problems. The design of TerraFERMA is driven by two overarching computational needs in Earth sciences. The first is the need for increased flexibility in both problem description and solution strategies for coupled problems where small changes in model assumptions can often lead to dramatic changes in physical behavior. The second is the need for software and models that are more transparent so that results can be verified, reproduced and modified in a manner such that the best ideas in computation and earth science can be more easily shared and reused. TerraFERMA leverages three advanced open-source libraries for scientific computation that provide high level problem description (FEniCS), composable solvers for coupled multi-physics problems (PETSc) and a science neutral options handling system (SPuD) that allows the hierarchical management of all model options. TerraFERMA integrates these libraries into an easier to use interface that organizes the scientific and computational choices required in a model into a single options file, from which a custom compiled application is generated and run. Because all models share the same infrastructure, models become more reusable and reproducible. TerraFERMA inherits much of its functionality from the underlying libraries. It currently solves partial differential equations (PDE) using finite element methods on simplicial meshes of triangles (2D) and tetrahedra (3D). The software is particularly well suited for non-linear problems with complex coupling between components. We demonstrate the design and utility of TerraFERMA through examples of thermal convection and magma dynamics. 
TerraFERMA has been tested successfully against over 45 benchmark problems from 7 publications in incompressible and compressible convection, magmatic solitary waves and Stokes flow with free surfaces. We have been using it extensively for research in basic magma dynamics, fluid flow in subduction zones and reactive cracking in poro-elastic materials. TerraFERMA is open-source and available as a git repository at bitbucket.org/tferma/tferma and through CIG. (Figure: instability of a 1-D magmatic solitary wave into spherical 3D waves, calculated using TerraFERMA.)

  20. Coastal Zone Mapping and Imaging Lidar (CZMIL): first flights and system validation

    NASA Astrophysics Data System (ADS)

    Feygels, Viktor I.; Park, Joong Yong; Aitken, Jennifer; Kim, Minsu; Payment, Andy; Ramnath, Vinod

    2012-09-01

    CZMIL is an integrated lidar-imagery sensor system and software suite designed for the highly automated generation of physical and environmental information products for mapping the coastal zone. This paper presents the results of CZMIL system validation in turbid water conditions on the Gulf Coast of Mississippi and in relatively clear water conditions in Florida in late spring 2012. The system performance tests show that CZMIL successfully achieved 7-8 m depth at Kd = 0.46 m⁻¹ (Kd is the diffuse attenuation coefficient) in Mississippi and up to 41 m at Kd = 0.11 m⁻¹ in Florida. With a seven-segment array for topographic mode and the shallow water zone, CZMIL generated high-resolution products at a maximum pulse rate of 70 kHz, and at 10 kHz in the deep water zone. The diffuse attenuation coefficient, bottom reflectance and other environmental parameters for the entire multi-km² area were estimated based on fusion of lidar and CASI-1500 hyperspectral camera data.

  1. Study on Frequency content in seismic hazard analysis in West Azarbayjan and East Azarbayjan provinces (Iran)

    NASA Astrophysics Data System (ADS)

    Behzadafshar, K.; Abbaszadeh Shahri, A.; Isfandiari, K.

    2012-12-01

    The Iran plate is prone to earthquakes, as the occurrence of destructive earthquakes approximately every five years attests. Given the record of past great earthquakes and the large number of potential seismic sources (active faults), some of which are responsible for great earthquakes, the northwest of Iran, located at the junction of the Alborz and Zagros seismotectonic provinces (Mirzaii et al., 1998), is an interesting area for seismologists. Considering the population and the presence of large cities such as Tabriz, Ardabil and Orumiyeh, which play a crucial role in the industry and economy of Iran, the authors focused on seismic hazard assessment in these two provinces to obtain ground acceleration at different frequency contents and to identify critical frequencies in the studied area. Although many studies have been carried out in northwest Iran, building code modifications also require frequency-content analysis to assess seismic hazard more precisely, which has been done in the present study. Furthermore, previous studies applied freely downloadable software developed before 2000, whereas the most important advantage of this study is the application of professional software written in 2009 and provided by the authors. This software addresses the weak points of earlier tools, such as gridding potential sources, accounting for the seismogenic zone, and applying attenuation relationships directly. The hazard maps obtained illustrate that maximum accelerations will be experienced in a northwest-to-southeast direction; acceleration increases as frequency decreases from 100 Hz to 10 Hz and then decreases with further frequency reduction (to 0.25 Hz). Maximum acceleration occurs at the basement at a frequency content of 10 Hz. Keywords: hazard map, frequency content, seismogenic zone, Iran

  2. Auditory feedback improves heart rate moderation during moderate-intensity exercise.

    PubMed

    Shaykevich, Alex; Grove, J Robert; Jackson, Ben; Landers, Grant J; Dimmock, James

    2015-05-01

    The objective of this study is to determine whether exposure to automated HR feedback can produce improvements in the ability to regulate HR during moderate-intensity exercise and to evaluate the persistence of these improvements after feedback is removed. Twenty healthy adults performed 10 indoor exercise sessions on cycle ergometers over 5 wk on a twice-weekly schedule. During these sessions (FB), participants received auditory feedback designed to maintain HR within a personalized, moderate-intensity training zone between 70% and 80% of estimated maximum HR. All feedback was delivered via a custom mobile software application. Participants underwent an initial assessment (PREFB) to measure their ability to maintain exercise intensity within the training zone without the use of feedback. After completing the feedback training, participants performed three additional assessments identical to PREFB at 1 wk (POST1), 2 wk (POST2), and 4 wk (POST3) after their last feedback session. Time in zone (TIZ), defined as the ratio of time spent within the training zone to overall exercise time, rate of perceived exertion, instrumental attitudes, and affective attitudes were then evaluated using two-way, mixed-model ANOVA with sessions and gender as factors. Training with feedback significantly improved TIZ (P < 0.01) compared with PREFB. An absence of significant differences in TIZ between FB, POST1, POST2, and POST3 (P ≥ 0.35) indicated that these improvements were maintained after feedback was removed. No significant differences in rate of perceived exertion (P ≥ 0.40) or attitude measures (P ≥ 0.30) were observed. Auditory biofeedback is an effective mechanism for entraining HR regulation during moderate-intensity exercise in healthy adults.
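The time-in-zone (TIZ) measure is a simple ratio. The sketch below assumes evenly spaced HR samples and the common 220 − age estimate of maximum HR; the study's own estimation method and sampling rate are not specified here:

```python
def time_in_zone(hr_samples, hr_max, lo=0.70, hi=0.80):
    """Fraction of evenly spaced heart-rate samples inside the training zone.

    The 70-80% bounds match the study's moderate-intensity zone; hr_max is an
    estimated maximum heart rate.
    """
    low, high = lo * hr_max, hi * hr_max
    in_zone = sum(1 for hr in hr_samples if low <= hr <= high)
    return in_zone / len(hr_samples)

# Illustrative 30-year-old using the common 220 - age estimate: hr_max = 190,
# so the zone is 133-152 bpm; 5 of these 8 samples fall inside it.
samples = [120, 135, 140, 150, 155, 148, 138, 160]
tiz = time_in_zone(samples, hr_max=190)  # 5/8 = 0.625
```

With samples taken at a fixed interval, the sample fraction equals the time fraction, which is why a per-sample count suffices.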

  3. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational science models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  4. Kinematic Reconstruction of the Costa Rican Margin: Evidence for Discontinuities in Deformation Across the Margin

    NASA Astrophysics Data System (ADS)

    Gose, B.; Bangs, N. L.; McIntosh, K. D.

    2016-12-01

    Recently acquired 3D seismic reflection data show that both in-sequence and out-of-sequence faults make up the interior structure of the Costa Rican convergent margin. Recent studies have found evidence for a phase of accretion that led to the formation of a series of thrust-bounded folds that are easily observable within the margin-wedge fabric. Along a primary 2D transect through the center of the 3D survey, faults partition the outer 23 km of the margin into 8 fault-bounded segments that can be divided into two characteristic zones, those closest to the trench (S1-S3) and those furthest (S4-S8), separated by a slope break 10 km from the trench (Fig 1). To better understand the observed structure, each segment was characterized as a fault-propagation fold and geometrically modeled using Paradigm's Geosec 2D software. Kinematic flexural slip modules were applied in order to perform bed-length balancing and generate a geologic reconstruction of the margin. Results show that the section of the margin spanning 3-23 km from the deformation front has experienced 27.5% shortening, assuming the interpreted horizons were initially flat and continuous. The individual values of percent shortening are not consistent across the margin but are distributed into two zones, each with progressively increasing strain in the landward direction. Zone 2 (landward) begins with 22% shortening at S8 and decreases linearly to 2% shortening moving seaward to S4. The Zone 1-2 boundary is marked by a slope break coinciding with an increase in percent shortening (S3, 15%) followed by less shortening seaward (9%, 8%). Shortening and the associated strain are focused at the landward side of the two zones, within S3 and S8. We conclude that the Costa Rican margin has some degree of mechanical partitioning, with a notable discontinuity in strain patterns occurring 10 km from the trench.
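The percent-shortening figures come from bed-length balancing: the restored (unfolded) bed length is compared with the present-day span. A minimal sketch of that arithmetic, with illustrative lengths that are not taken from the survey:

```python
def percent_shortening(restored_length, deformed_length):
    """Shortening relative to the restored bed length, as a percentage.

    restored_length: bed length after unfolding (flexural-slip balancing
    preserves it); deformed_length: present-day horizontal span.
    """
    return 100.0 * (restored_length - deformed_length) / restored_length

# A segment whose beds restore to 10.0 km but now span 7.8 km shows 22%
# shortening, comparable to the landward S8 value reported for the margin.
s8_like = percent_shortening(10.0, 7.8)
```

Flexural-slip restoration preserves bed length, which is what makes the restored length a valid pre-deformation baseline for this ratio.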

  5. High-resolution seismic-reflection data offshore of Dana Point, southern California borderland

    USGS Publications Warehouse

    Sliter, Ray W.; Ryan, Holly F.; Triezenberg, Peter J.

    2010-01-01

    The U.S. Geological Survey collected high-resolution shallow seismic-reflection profiles in September 2006 in the offshore area between Dana Point and San Mateo Point in southern Orange and northern San Diego Counties, California. Reflection profiles were located to image folds and reverse faults associated with the San Mateo fault zone and high-angle strike-slip faults near the shelf break (the Newport-Inglewood fault zone) and at the base of the slope. Interpretations of these data were used to update the USGS Quaternary fault database and in shaking hazard models for the State of California developed by the Working Group for California Earthquake Probabilities. This cruise was funded by the U.S. Geological Survey Coastal and Marine Catastrophic Hazards project. Seismic-reflection data were acquired aboard the R/V Sea Explorer, which is operated by the Ocean Institute at Dana Point. A SIG ELC820 minisparker seismic source and a SIG single-channel streamer were used. More than 420 km of seismic-reflection data were collected. This report includes maps of the seismic-survey sections, linked to Google Earth software, and digital data files showing images of each transect in SEG-Y, JPEG, and TIFF formats.

  6. Improvement of Steam Turbine Operational Performance and Reliability with using Modern Information Technologies

    NASA Astrophysics Data System (ADS)

    Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu

    2017-11-01

    The report reviews methods for improving steam turbine unit design and operation through the application of modern information technologies. In accordance with the life-cycle (LC) support methodology, a conceptual model of an information support system covering the main LC stages of a steam turbine unit is suggested. A classification system that ensures sustainable information links between the engineering team (at the manufacturer's plant) and customer organizations (power plants) is proposed. Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied and justified. The report presents a steam turbine unit equipment design methodology based on a brand-new oil-cooler design system developed and implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, with a computation subsystem that includes a methodology for zone-by-zone thermal-hydraulic design calculations of oil coolers. The report also presents data on the software developed for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.

  7. Design and implementation of a GPS guidance system for agricultural tractors using augmented reality technology.

    PubMed

    Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura

    2010-01-01

    Current commercial tractor guidance systems present information to the driver to help perform agricultural tasks in the best way. This information generally includes a map of treated zones referenced to the tractor's position. Unlike existing guidance systems, in which the tractor driver must mentally associate treated-zone maps with the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye-monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures of what the tractor driver sees, and a discussion predicting that the historical evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems.

  8. Design and Implementation of a GPS Guidance System for Agricultural Tractors Using Augmented Reality Technology

    PubMed Central

    Santana-Fernández, Javier; Gómez-Gil, Jaime; del-Pozo-San-Cirilo, Laura

    2010-01-01

    Current commercial tractor guidance systems present information to the driver to help perform agricultural tasks in the best way. This information generally includes a map of treated zones referenced to the tractor's position. Unlike existing guidance systems, in which the tractor driver must mentally associate treated-zone maps with the plot layout, this paper presents a guidance system that, using Augmented Reality (AR) technology, allows the tractor driver to see the real plot through eye-monitor glasses with the treated zones shown in a different color. The paper includes a description of the system hardware and software, a real test with image captures of what the tractor driver sees, and a discussion predicting that the historical evolution of guidance systems could involve the use of AR technology in agricultural guidance and monitoring systems. PMID:22163479

  9. Work zone safety analysis and modeling: a state-of-the-art review.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Xie, Kun

    2015-01-01

    Work zone safety is one of the top priorities for transportation agencies. In recent years, a considerable volume of research has sought to determine work zone crash characteristics and causal factors. Unlike other non-work zone-related safety studies (on both crash frequency and severity), there has not yet been a comprehensive review and assessment of methodological approaches for work zone safety. To address this deficit, this article aims to provide a comprehensive review of the existing extensive research efforts focused on work zone crash-related analysis and modeling, in the hopes of providing researchers and practitioners with a complete overview. Relevant literature published in the last 5 decades was retrieved from the National Work Zone Crash Information Clearinghouse and the Transport Research International Documentation database and other public digital libraries and search engines. Both peer-reviewed publications and research reports were obtained. Each study was carefully reviewed, and those that focused on either work zone crash data analysis or work zone safety modeling were identified. The most relevant studies are specifically examined and discussed in the article. The identified studies were carefully synthesized to understand the state of knowledge on work zone safety. Agreement and inconsistency regarding the characteristics of the work zone crashes discussed in the descriptive studies were summarized. Progress and issues about the current practices on work zone crash frequency and severity modeling are also explored and discussed. The challenges facing work zone safety research are then presented. The synthesis of the literature suggests that the presence of a work zone is likely to increase the crash rate. Crashes are not uniformly distributed within work zones and rear-end crashes are the most prevalent type of crashes in work zones. 
There was no across-the-board agreement among numerous papers reviewed on the relationship between work zone crashes and other factors such as time, weather, victim severity, traffic control devices, and facility types. Moreover, both work zone crash frequency and severity models still rely on relatively simple modeling techniques and approaches. In addition, work zone data limitations have caused a number of challenges in analyzing and modeling work zone safety. Additional efforts on data collection, developing a systematic data analysis framework, and using more advanced modeling approaches are suggested as future research tasks.

  10. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates, based on 2004-2008 data from the Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates major in has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  11. Frame analysis of UNNES electric bus chassis construction using finite element method

    NASA Astrophysics Data System (ADS)

    Nugroho, Untoro; Anis, Samsudin; Kusumawardani, Rini; Khoiron, Ahmad Mustamil; Maulana, Syahdan Sigit; Irvandi, Muhammad; Mashdiq, Zia Putra

    2018-03-01

    Designing a chassis requires finite element simulation analysis to evaluate the chassis strength of an electric bus. The purpose of this research is to obtain chassis simulation results for an electric bus under load using FEM (the finite element method). This research was conducted in several stages, such as modeling the chassis in Autodesk Inventor and then running it through finite element simulation software. The frame was simulated under static loading by defining fixed supports and then applying a vertical force. The frame is clamped at both the front and rear suspension mounts. The FEM-based simulation shows that the frame remains within the elastic zone, so the frame design is safe to use.
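    The pass/fail criterion implied by the abstract (the frame "remains within the elastic zone") can be sketched as a comparison of the peak simulated von Mises stress against the material's yield strength. The stress and yield values below are hypothetical, not taken from the study:

```python
# Safety check sketch: a frame design stays in the elastic zone when the
# peak von Mises stress from the FEM run is below the material yield strength.
def elastic_zone_check(max_von_mises_mpa, yield_strength_mpa):
    """Return (is_elastic, safety_factor) for a simulated load case."""
    safety_factor = yield_strength_mpa / max_von_mises_mpa
    return safety_factor > 1.0, safety_factor

# Hypothetical values: structural steel yield ~250 MPa, simulated peak ~95 MPa.
ok, sf = elastic_zone_check(95.0, 250.0)
print(ok, round(sf, 2))  # True 2.63
```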

  12. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    NASA Astrophysics Data System (ADS)

    Vho, Alice; Bistacchi, Andrea

    2015-04-01

    A quantitative analysis of fault-rock distribution is of paramount importance for studies of fault zone architecture, fault and earthquake mechanics, and fluid circulation along faults at depth. Here we present a semi-automatic workflow for fault-rock mapping on a Digital Outcrop Model (DOM). This workflow has been developed on a real case study: the strike-slip Gole Larghe Fault Zone (GLFZ), a fault zone exhumed from ca. 10 km depth, hosted in granitoid rocks of the Adamello batholith (Italian Southern Alps). Individual seismogenic slip surfaces generally show green cataclasites (cemented by the precipitation of epidote and K-feldspar from hydrothermal fluids) and more or less well preserved pseudotachylytes (black when well preserved, greenish to white when altered). First, a digital model of the outcrop is reconstructed with photogrammetric techniques, using a large number of high resolution digital photographs processed with the VisualSFM software. By using high resolution photographs the DOM can have a much higher resolution than with LIDAR surveys, up to 0.2 mm/pixel. Then, image processing is performed to map the fault-rock distribution with the ImageJ-Fiji package. Green cataclasites and epidote/K-feldspar veins can be quite easily separated from the host rock (tonalite) using spectral analysis. In particular, band ratio and principal component analysis have been tested successfully. The mapping of black pseudotachylyte veins is trickier because the differences between the pseudotachylyte and biotite spectral signatures are not appreciable. For this reason we have tested different morphological processing tools aimed at identifying (and subtracting) the tiny biotite grains. We propose a solution based on binary images involving a combination of size and circularity thresholds. Comparing the results with manually segmented images, we noticed that major problems occur only when pseudotachylyte veins are very thin and discontinuous.
After having tested and refined the image analysis processing on some typical images, we recorded a macro with ImageJ-Fiji that processes all the images for a given DOM. As a result, the three different types of rocks can be semi-automatically mapped on large DOMs using a simple and efficient procedure. This enables quantitative analyses of fault rock distribution and thickness, fault trace roughness/curvature and length, fault zone architecture, and alteration halos due to hydrothermal fluid-rock interaction. To improve our workflow, additional or different morphological operators could be integrated into our procedure to yield a better resolution on small and thin pseudotachylyte veins (e.g. perimeter/area ratio).
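    The size-and-circularity screen described above can be sketched as a filter over segmented particles, using the usual circularity definition 4πA/P². The thresholds and particle measurements below are hypothetical, not values from the study:

```python
import math

# Sketch of a size-plus-circularity filter: keep only segmented particles
# whose area is small and whose circularity (4*pi*A / P^2) is high, the
# signature of small rounded biotite grains, so they can be subtracted
# from the pseudotachylyte mask. Thresholds are hypothetical.
def is_biotite_like(area_px, perimeter_px, max_area=200.0, min_circ=0.6):
    circularity = 4.0 * math.pi * area_px / (perimeter_px ** 2)
    return area_px <= max_area and circularity >= min_circ

particles = [
    {"area": 50.0, "perimeter": 28.0},    # small, round -> biotite-like
    {"area": 900.0, "perimeter": 400.0},  # large, elongate -> keep as vein
]
print([is_biotite_like(p["area"], p["perimeter"]) for p in particles])  # [True, False]
```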

  13. Modelling of groundwater quality using bicarbonate chemical parameter in Netravathi and Gurpur river confluence, India

    NASA Astrophysics Data System (ADS)

    Sylus, K. J.; H., Ramesh

    2018-04-01

    In coastal aquifers, seawater intrusion is considered the major problem contaminating freshwater and reducing its quality for domestic use. Groundwater quality analysis of different chemical parameters is the basic method for detecting such contamination, and was carried out here per Bureau of Indian Standards (2012) and World Health Organization (1996) guidelines. In this study, the bicarbonate parameter, whose permissible limit ranges between 200 and 600 mg/l, was considered for groundwater quality analysis. The groundwater system was modelled using the Groundwater Modeling System (GMS) software, in which the FEMWATER package, which works on the principle of the finite element method, was used for flow and transport. The base input data of the model include elevation, groundwater head, and the first and second bottoms of the study area. The modelling results show the spatial occurrence of contamination in the study area of the Netravathi and Gurpur river confluence at various time periods. Further, the results of the modelling also show that the contamination occurs up to a distance of 519 m towards the freshwater zone of the study area.
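    As a minimal illustration of the screening step, a sample's bicarbonate concentration can be checked against the 200-600 mg/l permissible range cited above; the sample values below are made up:

```python
# Classify bicarbonate concentrations (mg/l) against the 200-600 mg/l
# permissible range used in the study (per BIS 2012 / WHO guidelines).
def classify_bicarbonate(conc_mg_l, lo=200.0, hi=600.0):
    if conc_mg_l < lo:
        return "below permissible range"
    if conc_mg_l <= hi:
        return "within permissible range"
    return "above permissible range (possible seawater influence)"

for c in (150.0, 420.0, 780.0):  # hypothetical sample concentrations
    print(c, classify_bicarbonate(c))
```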

  14. A web-enabled system for integrated assessment of watershed development

    USGS Publications Warehouse

    Dymond, R.; Lohani, V.; Regmi, B.; Dietz, R.

    2004-01-01

    Researchers at Virginia Tech have put together the primary structure of a web-enabled integrated modeling system that has potential to be a planning tool to help decision makers and stakeholders in making appropriate watershed management decisions. This paper describes the integrated system, including data sources, collection, analysis methods, system software and design, and issues of integrating the various component models. The integrated system has three modeling components, namely hydrology, economics, and fish health, and is accompanied by descriptive 'help files.' Since all three components have a related spatial aspect, GIS technology provides the integration platform. When completed, a user will access the integrated system over the web to choose pre-selected land development patterns to create a 'what if' scenario using an easy-to-follow interface. The hydrologic model simulates effects of the scenario on annual runoff volume, flood peaks of various return periods, and ground water recharge. The economics model evaluates tax revenue and fiscal costs as a result of a new land development scenario. The fish health model evaluates effects of new land uses in zones of influence to the health of fish populations in those areas. Copyright ASCE 2004.

  15. Hydrochemical tracers in the middle Rio Grande Basin, USA: 2. Calibration of a groundwater-flow model

    USGS Publications Warehouse

    Sanford, W.E.; Plummer, Niel; McAda, D.P.; Bexfield, L.M.; Anderholm, S.K.

    2004-01-01

    The calibration of a groundwater model with the aid of hydrochemical data has demonstrated that low recharge rates in the Middle Rio Grande Basin may be responsible for a groundwater trough in the center of the basin and for a substantial amount of Rio Grande water in the regional flow system. Earlier models of the basin had difficulty reproducing these features without any hydrochemical data to constrain the rates and distribution of recharge. The objective of this study was to use the large quantity of available hydrochemical data to help calibrate the model parameters, including the recharge rates. The model was constructed using the US Geological Survey's software MODFLOW, MODPATH, and UCODE, and calibrated using 14C activities and the positions of certain flow zones defined by the hydrochemical data. Parameter estimation was performed using a combination of nonlinear regression techniques and a manual search for the minimum difference between field and simulated observations. The calibrated recharge values were substantially smaller than those used in previous models. Results from a 30,000-year transient simulation suggest that recharge was at a maximum about 20,000 years ago and at a minimum about 10,000 years ago. © Springer-Verlag 2004.

  16. Mechano-logical model of C. elegans germ line suggests feedback on the cell cycle

    PubMed Central

    Atwell, Kathryn; Qin, Zhao; Gavaghan, David; Kugler, Hillel; Hubbard, E. Jane Albert; Osborne, James M.

    2015-01-01

    The Caenorhabditis elegans germ line is an outstanding model system in which to study the control of cell division and differentiation. Although many of the molecules that regulate germ cell proliferation and fate decisions have been identified, how these signals interact with cellular dynamics and physical forces within the gonad remains poorly understood. We therefore developed a dynamic, 3D in silico model of the C. elegans germ line, incorporating both the mechanical interactions between cells and the decision-making processes within cells. Our model successfully reproduces key features of the germ line during development and adulthood, including a reasonable ovulation rate, correct sperm count, and appropriate organization of the germ line into stably maintained zones. The model highlights a previously overlooked way in which germ cell pressure may influence gonadogenesis, and also predicts that adult germ cells might be subject to mechanical feedback on the cell cycle akin to contact inhibition. We provide experimental data consistent with the latter hypothesis. Finally, we present cell trajectories and ancestry recorded over the course of a simulation. The novel approaches and software described here link mechanics and cellular decision-making, and are applicable to modeling other developmental and stem cell systems. PMID:26428008

  17. State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

    DTIC Science & Technology

    2014-07-01

    preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems... provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the... tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data

  18. A UML-based metamodel for software evolution process

    NASA Astrophysics Data System (ADS)

    Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing

    2014-04-01

    A software evolution process is a set of interrelated software processes under which the corresponding software evolves. An object-oriented software evolution process meta-model (OO-EPMM), its abstract syntax, and formal OCL constraints on the meta-model are presented in this paper. OO-EPMM can represent not only the software development process but also software evolution.

  19. Description and application of capture zone delineation for a wellfield at Hilton Head Island, South Carolina

    USGS Publications Warehouse

    Landmeyer, J.E.

    1994-01-01

    Ground-water capture zone boundaries for individual pumped wells in a confined aquifer were delineated by using groundwater models. Both analytical and numerical (semi-analytical) models that more accurately represent the ground-water-flow system were used. All models delineated 2-dimensional boundaries (capture zones) that represent the areal extent of groundwater contribution to a pumped well. The resultant capture zones were evaluated on the basis of the ability of each model to realistically represent the part of the ground-water-flow system that contributed water to the pumped wells. Analytical models used were based on a fixed radius approach, and included: an arbitrary radius model, a calculated fixed radius model based on the volumetric-flow equation with a time-of-travel criterion, and a calculated fixed radius model derived from modification of the Theis model with a drawdown criterion. Numerical models used included the 2-dimensional, finite-difference models RESSQC and MWCAP. The arbitrary radius and Theis analytical models delineated capture zone boundaries that compared least favorably with capture zones delineated using the volumetric-flow analytical model and both numerical models. The numerical models produced more hydrologically reasonable capture zones (that were oriented parallel to the regional flow direction) than the volumetric-flow equation. The RESSQC numerical model computed more hydrologically realistic capture zones than the MWCAP numerical model by accounting for changes in the shape of capture zones caused by multiple-well interference. The capture zone boundaries generated by using both analytical and numerical models indicated that the currently used 100-foot radius of protection around a wellhead in South Carolina is an underestimate of the extent of ground-water capture for pumped wells in this particular wellfield in the Upper Floridan aquifer. The arbitrary fixed radius of 100 feet was shown to underestimate the upgradient contribution of ground-water flow to a pumped well.
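    The calculated fixed radius from the volumetric-flow equation with a time-of-travel criterion can be sketched as r = sqrt(Q·t / (π·n·b)), equating the volume pumped over the travel time with the pore volume of a cylinder of aquifer. The well parameters below are hypothetical, not values from the Hilton Head wellfield:

```python
import math

# Calculated fixed radius (CFR) from the volumetric-flow equation with a
# time-of-travel criterion: the cylinder of aquifer pore volume drained in
# time t equals the pumped volume, giving r = sqrt(Q * t / (pi * n * b)).
def cfr_radius(Q_ft3_per_day, t_days, porosity, b_ft):
    return math.sqrt(Q_ft3_per_day * t_days / (math.pi * porosity * b_ft))

# Hypothetical inputs: 50 gal/min well, 5-year time of travel,
# porosity 0.3, 100-ft aquifer thickness.
Q = 50 * 192.5           # gal/min -> ft^3/day (1 gal ~ 0.1337 ft^3; 1440 min/day)
r = cfr_radius(Q, 5 * 365, 0.3, 100.0)
print(round(r))          # ~432 ft, far beyond a 100-ft arbitrary radius
```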

  20. A coupled hydrodynamic-hydrochemical modeling for predicting mineral transport in a natural acid drainage system.

    NASA Astrophysics Data System (ADS)

    Zegers Risopatron, G., Sr.; Navarro, L.; Montserrat, S., Sr.; McPhee, J. P.; Niño, Y.

    2017-12-01

    The geochemistry of water and sediments, coupled with hydrodynamic transport in mountainous channels, is of particular interest in the central Chilean Andes due to the natural occurrence of acid waters. In this paper, we present a coupled transport and geochemical model to estimate and understand transport processes and the fate of minerals at the Yerba Loca Basin, located near Santiago, Chile. In the upper zone, water presents low pH (~3) and high concentrations of iron, aluminum, copper, manganese and zinc. Acidity and minerals are the consequence of water-rock interactions in hydrothermal alteration zones, rich in sulphides and sulphates, covered by seasonal snow and glaciers. Downstream, as a consequence of neutral to alkaline lateral water contributions (pH >7) along the river, pH increases and the concentration of solutes decreases. The mineral transport model has three components: (i) a hydrodynamic model, where we use HEC-RAS to solve the 1D Saint-Venant equations, (ii) a sediment transport model to estimate erosion and sedimentation rates, which quantify mineral transfer between water and riverbed, and (iii) a solute transport model, based on the 1D OTIS model, which takes into account the temporal delay in solute transport that is typically observed in natural channels (transient storage). Hydrochemistry is solved using PHREEQC, a software package for speciation and batch reaction. Our results show that the balance between mineral precipitation and dissolution along the river changes according to pH. Based on pH measurements (and according to the literature) we inferred that the main minerals in the water system are brochantite, ferrihydrite, hydrobasaluminite and schwertmannite. Results show that our model can predict the transport and fate of minerals and metals in the Yerba Loca Basin. Mineral dissolution and precipitation processes occur over limited ranges of pH values. When pH values are increased, iron minerals (schwertmannite) are the first to precipitate ( 2.5

  1. Design and fabrication of the progressive addition lenses

    NASA Astrophysics Data System (ADS)

    Qin, Linling; Qian, Lin; Yu, Jingchi

    2011-11-01

    The use of progressive addition lenses (PALs) for the correction of presbyopia has increased dramatically in recent years. These lenses are now being used as the preferred alternative to bifocal and trifocal lenses in many parts of the world. Progressive addition lenses are a kind of ophthalmic lens with a freeform surface. The surface curvature of a PAL varies gradually from a minimum value in the upper area to a maximum value in the lower area. Thus a PAL has a surface with three zones of very small astigmatism: a far-view zone, a near-view zone, and an intermediate zone. The far-view and near-view zones have relatively constant powers and are connected by the intermediate zone, whose power varies progressively. The design and fabrication technologies of progressive addition lenses have progressed rapidly because of the massive development of optical simulation software, multi-axis ultraprecision machining technologies, and CNC machining technologies. The design principles of progressive addition lenses are discussed in a historical review. Several kinds of design methods are illustrated, and their advantages and disadvantages are also presented. In the current study, it is shown that the optical characteristics of the different progressive addition lens designs are significantly different from one another. The different fabrication technologies of progressive addition lenses are also discussed in the paper. Plastic injection molding and precision machine turning are the common fabrication technologies for exterior and interior PALs, respectively.
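    As a rough illustration of how power varies progressively along the corridor between the far-view and near-view zones, here is a hedged sketch using a cosine blend; real PAL designs are produced by far more elaborate freeform optimization, and all values below are illustrative:

```python
import math

# Sketch of a progressive power profile along the lens corridor: surface
# power ramps smoothly from the far-vision value to far + addition, here
# with a cosine blend. Zone heights (mm) and powers (diopters) are made up.
def corridor_power(y_mm, far_power=0.0, addition=2.0, y_far=8.0, y_near=-8.0):
    if y_mm >= y_far:
        return far_power                      # constant-power far-view zone
    if y_mm <= y_near:
        return far_power + addition           # constant-power near-view zone
    s = (y_far - y_mm) / (y_far - y_near)     # 0 at far zone, 1 at near zone
    return far_power + addition * 0.5 * (1 - math.cos(math.pi * s))

for y in (10, 8, 0, -8, -10):
    print(y, round(corridor_power(y), 2))
```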

  2. Software Development for a Three-Dimensional Gravity Inversion and Application to Study of the Border Ranges Fault System, South-Central Alaska

    NASA Astrophysics Data System (ADS)

    Cardenas, R.; Doser, D. I.; Baker, M. R.

    2011-12-01

    Summary The Border Ranges Fault System (BRFS) bounds the Cook Inlet and Susitna Basins, an important petroleum province within south-central Alaska. An initial research goal is to test several plausible models of structure along the Border Ranges Fault System by developing a novel, 3D inversion software package. The inversion utilizes gravity data constrained with geophysical, borehole, and surface geological information. The novel inversion approach involves directly modeling known geology, initially free-air corrected data, and revising a priori uncertainties on the geologic model to allow comparisons to alternative interpretations. This technique to evaluate 3D structure in regions of highly complex geology can be applied in other studies of energy resources. The software reads an ASCII text file containing the latitude, longitude, elevation, and Free Air anomalies of each gravity station as well as gridded surface files of known topology. The contributions of each node in the grid are computed in order to compare the theoretical gravity calculations from a forward model to the gravity observations. The computation of solutions to the "linearized" inversion yields a range of plausible densities. The user will have the option of varying body proportions and dimensions to compare variations in density for changing depths of the gridded surface. Introduction Previous modeling of the BRFS using geophysical data has been limited due to the complexity of local geology and structure, both of shallow crustal features and the deeper subduction zone. Since the inversion is based on a sequence of gridded surfaces, it is feasible to develop software to help build these gridded geologic models. Without a way to modify grid surface elevations, density, and magnetic susceptibility in real time, the inversion process for the geologist would be highly nonlinear and poorly constrained, especially in structural geology this complex.
Without a basic understanding of the geometry of the BRFS, its role in the formation and petroleum generation processes of the upper Cook Inlet and Susitna Basins is poorly understood. Model Generation The gravitational contributions are computed using a geophysics formulation, namely the vertical line element: g = πR²Gρ(x² + y² + z²)^(-1/2). Each line element is semi-infinite and extends from the top to the bottom of each structural layer. The user may define a three-dimensional body at a location on the surface. Each vertex of the body will be represented as a separate node in the grid. The contribution of the body to the gravity value will be computed as a volume integral and added to the overall gravity contributions of other nodes on the surface. The user will also be able to modify the elevation and density of the defined body in real time. The most notable strengths of the software are its user-defined a priori information, which facilitates real-time interpretation, and the computational efficiency of the model solution, achieved by using vertical line elements to represent structural bodies with complex geometry.
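    A minimal sketch of the vertical-line-element forward calculation, summing g = πR²Gρ/√(x²+y²+z²) over grid nodes, might look as follows; the node positions, radii, and densities are hypothetical:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

# Vertical-line-element forward model sketch: each node contributes
# g = pi * R^2 * G * rho / sqrt(x^2 + y^2 + z^2), where (x, y, z) is the
# offset from the station to the top of the semi-infinite line element.
def line_element_gravity(station, nodes):
    total = 0.0
    for (nx, ny, nz, radius, density) in nodes:
        dx, dy, dz = nx - station[0], ny - station[1], nz - station[2]
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        total += math.pi * radius**2 * G * density / r
    return total

# Hypothetical two-node model (positions in m, density in kg/m^3).
nodes = [(100.0, 0.0, -500.0, 50.0, 2670.0),
         (200.0, 0.0, -500.0, 50.0, 2400.0)]
g = line_element_gravity((0.0, 0.0, 0.0), nodes)
print(f"{g * 1e5:.4f} mGal")  # 1 mGal = 1e-5 m/s^2
```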

  3. Zone calculation as a tool for assessing performance outcome in laparoscopic suturing.

    PubMed

    Buckley, Christina E; Kavanagh, Dara O; Nugent, Emmeline; Ryan, Donncha; Traynor, Oscar J; Neary, Paul C

    2015-06-01

    Simulator performance is measured by metrics, which are valued as an objective way of assessing trainees. Certain procedures such as laparoscopic suturing, however, may not be suitable for assessment under traditionally formulated metrics. Our aim was to assess whether our new metric is a valid method of assessing laparoscopic suturing. A software program was developed in order to create a new metric, which would calculate the percentage of time spent operating within pre-defined areas called "zones." Twenty-five candidates (medical students N = 10, surgical residents N = 10, and laparoscopic experts N = 5) performed the laparoscopic suturing task on the ProMIS III® simulator. New metrics of "in-zone" and "out-zone" scores as well as traditional metrics of time, path length, and smoothness were generated. Performance was also assessed by two blinded observers using the OSATS and FLS rating scales. This novel metric was evaluated by comparing it to both traditional metrics and subjective scores. There was a significant difference in the average in-zone and out-zone scores between all three experience groups (p < 0.05). The new zone metric scores correlated significantly with the subjective blinded-observer scores of OSATS and FLS (p = 0.0001). The new zone metric scores also correlated significantly with the traditional metrics of path length, time, and smoothness (p < 0.05). The new metric is a valid tool for assessing laparoscopic suturing objectively. It could be incorporated into a competency-based curriculum to monitor resident progression in the simulated setting.
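    The in-zone score described above can be sketched as the percentage of elapsed task time during which the tracked instrument position lies inside a pre-defined zone. The rectangular zone and sampled trajectory below are hypothetical:

```python
# Sketch of the "zone" metric: the fraction of task time the instrument tip
# spends inside a pre-defined rectangular zone. Bounds and data are made up.
def in_zone_score(samples, zone):
    """samples: list of (t, x, y); zone: (xmin, xmax, ymin, ymax).
    Returns percent of elapsed time whose interval starts inside the zone."""
    (xmin, xmax, ymin, ymax) = zone
    in_time = total = 0.0
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        dt = t1 - t0
        total += dt
        if xmin <= x <= xmax and ymin <= y <= ymax:
            in_time += dt
    return 100.0 * in_time / total

samples = [(0.0, 1.0, 1.0), (1.0, 1.5, 1.2), (2.0, 4.0, 4.0), (3.0, 1.2, 1.1)]
print(in_zone_score(samples, (0.0, 2.0, 0.0, 2.0)))  # 66.66...%
```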

  4. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
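    For contrast with the max-margin Bayesian model proposed in the paper, a minimal classical Poisson regression for defect counts, fitted by Newton's method, looks as follows; the metric values and counts are made up for illustration:

```python
import math

# Classical Poisson regression sketch: fit log E[y] = b0 + b1 * x by
# Newton's method on the Poisson log-likelihood (not the paper's model).
def fit_poisson(xs, ys, iters=50):
    b0, b1 = math.log(sum(ys) / len(ys)), 0.0   # start at intercept-only fit
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            mu = math.exp(b0 + b1 * x)
            g0 += y - mu                 # gradient of the log-likelihood
            g1 += (y - mu) * x
            h00 += mu                    # (negated) Hessian entries
            h01 += mu * x
            h11 += mu * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det   # Newton update
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

xs = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical module complexity metric
ys = [1, 2, 4, 8, 16]            # hypothetical observed defect counts
b0, b1 = fit_poisson(xs, ys)
print(round(b1, 3))  # slope ~= ln 2: counts roughly double per unit of x
```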

  5. Stochastic Ground Water Flow Simulation with a Fracture Zone Continuum Model

    USGS Publications Warehouse

    Langevin, C.D.

    2003-01-01

    A method is presented for incorporating the hydraulic effects of vertical fracture zones into two-dimensional cell-based continuum models of ground water flow and particle tracking. High hydraulic conductivity features are used in the model to represent fracture zones. For fracture zones that are not coincident with model rows or columns, an adjustment is required for the hydraulic conductivity value entered into the model cells to compensate for the longer flowpath through the model grid. A similar adjustment is also required for simulated travel times through model cells. A travel time error of less than 8% can occur for particles moving through fractures with certain orientations. The fracture zone continuum model uses stochastically generated fracture zone networks and Monte Carlo analysis to quantify uncertainties with simulated advective travel times. An approach is also presented for converting an equivalent continuum model into a fracture zone continuum model by establishing the contribution of matrix block transmissivity to the bulk transmissivity of the aquifer. The methods are used for a case study in west-central Florida to quantify advective travel times from a potential wetland rehydration site to a municipal supply wellfield. Uncertainties in advective travel times are assumed to result from the presence of vertical fracture zones, commonly observed on aerial photographs as photolineaments.
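    One plausible form of the flowpath-length adjustment described above treats a fracture at angle θ to the grid axes as a stair-step chain of cells whose total path length exceeds the true fracture length by a factor |cos θ| + |sin θ|, and scales cell conductivity accordingly; the exact correction used in the cited model may differ:

```python
import math

# Hedged sketch of a flowpath-length correction for fractures not aligned
# with model rows or columns: the stair-step cell path is longer than the
# true fracture by (|cos theta| + |sin theta|), so the conductivity entered
# into the cells is scaled up by the same factor.
def path_length_factor(theta_deg):
    t = math.radians(theta_deg)
    return abs(math.cos(t)) + abs(math.sin(t))

def adjusted_conductivity(k_fracture, theta_deg):
    return k_fracture * path_length_factor(theta_deg)

for theta in (0, 22.5, 45):
    print(theta, round(path_length_factor(theta), 3))
# factor is 1.0 for grid-aligned fractures and peaks at sqrt(2) ~ 1.414 at 45 deg
```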

  6. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  7. Simulation of climate-change effects on streamflow, lake water budgets, and stream temperature using GSFLOW and SNTEMP, Trout Lake Watershed, Wisconsin

    USGS Publications Warehouse

    Hunt, Randall J.; Walker, John F.; Selbig, William R.; Westenbroek, Stephen M.; Regan, R. Steve

    2013-01-01

    Although groundwater and surface water are considered a single resource, historically hydrologic simulations have not accounted for feedback loops between the groundwater system and other hydrologic processes. These feedbacks include timing and rates of evapotranspiration, surface runoff, soil-zone flow, and interactions with the groundwater system. Simulations that iteratively couple the surface-water and groundwater systems, however, are characterized by long run times and calibration challenges. In this study, calibrated, uncoupled transient surface-water and steady-state groundwater models were used to construct one coupled transient groundwater/surface-water model for the Trout Lake Watershed in north-central Wisconsin, USA. The computer code GSFLOW (Ground-water/Surface-water FLOW) was used to simulate the coupled hydrologic system; a surface-water model represented hydrologic processes in the atmosphere, at land surface, and within the soil-zone, and a groundwater-flow model represented the unsaturated zone, saturated zone, stream, and lake budgets. The coupled GSFLOW model was calibrated by using heads, streamflows, lake levels, actual evapotranspiration rates, solar radiation, and snowpack measurements collected during water years 1998–2007; calibration was performed by using advanced features present in the PEST parameter estimation software suite. Simulated streamflows from the calibrated GSFLOW model and other basin characteristics were used as input to the one-dimensional SNTEMP (Stream-Network TEMPerature) model to simulate daily stream temperature in selected tributaries in the watershed. The temperature model was calibrated to high-resolution stream temperature time-series data measured in 2002. The calibrated GSFLOW and SNTEMP models were then used to simulate effects of potential climate change for the period extending to the year 2100. An ensemble of climate models and emission scenarios was evaluated. 
Downscaled climate drivers for the period 2010–2100 showed increases in maximum and minimum temperature over the scenario period. Scenarios of future precipitation did not show a monotonic trend like temperature. Uncertainty in the climate drivers increased over time for both temperature and precipitation. Separate calibration of the uncoupled groundwater and surface-water models did not provide a representative initial parameter set for coupled model calibration. A sequentially linked calibration, in which the uncoupled models were linked by means of utility software, provided a starting parameter set suitable for coupled model calibration. Even with sequentially linked calibration, however, transmissivity of the lower part of the aquifer required further adjustment during coupled model calibration to attain reasonable parameter values for evaporation rates off a small seepage lake (a lake with no appreciable surface-water outlets) with a long history of study. The resulting coupled model was well calibrated to most types of observed time-series data used for calibration. Daily stream temperatures measured during 2002 were successfully simulated with SNTEMP; the model fit was acceptable for a range of groundwater inflow rates into the streams. Forecasts of potential climate change scenarios showed growing season length increasing by weeks, and both potential and actual evapotranspiration rates increasing appreciably, in response to increasing air temperature. Simulated actual evapotranspiration rates increased less than simulated potential evapotranspiration rates as a result of water limitation in the root zone during the summer high-evapotranspiration period. The hydrologic-system response to climate change was characterized by a reduction in the importance of the snow-melt pulse and an increase in the importance of fall and winter groundwater recharge. 
The less dynamic hydrologic regime is likely to result in drier soil conditions in rainfed wetlands and uplands, in contrast to less drying in groundwater-fed systems. Seepage lakes showed larger forecast stage declines related to climate change than did drainage lakes (lakes with outlet streams). Seepage lakes higher in the watershed (nearer to groundwater divides) had less groundwater inflow and thus had larger forecast declines in lake stage; however, groundwater inflow to seepage lakes in general tended to increase as a fraction of the lake budgets with lake-stage decline because inward hydraulic gradients increased. Drainage lakes were characterized by less simulated stage decline as reductions in outlet streamflow offset losses to other water flows. Net groundwater inflow tended to decrease in drainage lakes over the scenario period. Simulated stream temperatures increased appreciably with climate change. The estimated increase in annual average temperature ranged from approximately 1 to 2 degrees Celsius by 2100 in the stream characterized by a high groundwater inflow rate and 2 to 3 degrees Celsius in the stream with a lower rate. The climate drivers used for the climate-change scenarios varied appreciably depending on the General Circulation Model and emission scenario selected; this uncertainty was reflected in hydrologic flow and temperature model results. Thus, as with all forecasts of this type, the results are best considered to approximate potential outcomes of climate change.
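    The growing-season lengthening reported above can be illustrated with a minimal degree-threshold calculation; the 5 degree C base temperature and the synthetic sinusoidal climate below are illustrative assumptions, not values from the study:

```python
import numpy as np

def growing_season_length(tmean, base=5.0):
    """Longest consecutive run of days with mean air temperature above a
    base threshold (deg C) -- one simple proxy for growing season length."""
    best = run = 0
    for warm in tmean > base:
        run = run + 1 if warm else 0
        best = max(best, run)
    return best

# Synthetic daily series: a sinusoidal annual cycle, then the same cycle +2 C
days = np.arange(365)
baseline = 10.0 + 15.0 * np.sin(2 * np.pi * (days - 80) / 365)
print(growing_season_length(baseline))        # current climate
print(growing_season_length(baseline + 2.0))  # warmer scenario: longer season
```

A uniform warming shifts more days above the base threshold, so the contiguous above-threshold run grows by weeks, mirroring the forecast response.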

  8. Definition of zones with different levels of productivity within an agricultural field using fuzzy modeling

    USDA-ARS?s Scientific Manuscript database

    Zoning of agricultural fields is an important task for the utilization of precision farming technology. One method for the definition of zones with different levels of productivity is based on a fuzzy indicator model. A fuzzy indicator model for identification of zones with different levels of productivit...
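    A fuzzy indicator classification of this kind can be sketched as follows; the triangular membership functions and their breakpoints are hypothetical illustrations, not parameters from the manuscript:

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def productivity_zone(yield_norm):
    """Assign a field cell (normalized yield in [0, 1]) to the zone with
    the highest fuzzy membership."""
    memberships = {
        "low":    tri(yield_norm, -0.001, 0.0, 0.5),
        "medium": tri(yield_norm, 0.0, 0.5, 1.0),
        "high":   tri(yield_norm, 0.5, 1.0, 1.001),
    }
    return max(memberships, key=memberships.get)

print(productivity_zone(0.1))  # low
print(productivity_zone(0.9))  # high
```

Unlike crisp thresholds, the overlapping memberships give gradual transitions between zones, which is the usual motivation for fuzzy zoning of yield maps.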

  9. Loading Analysis of Composite Wind Turbine Blade for Fatigue Life Prediction of Adhesively Bonded Root Joint

    NASA Astrophysics Data System (ADS)

    Salimi-Majd, Davood; Azimzadeh, Vahid; Mohammadi, Bijan

    2015-06-01

    Nowadays wind energy is widely used as a non-polluting, cost-effective renewable energy resource. During the lifetime of a composite wind turbine, which is about 20 years, the rotor blades are subjected to different cyclic loads such as aerodynamic, centrifugal and gravitational forces. These loading conditions cause fatigue failure of the blade at the adhesively bonded root joint, where the highest bending moments occur and which is consequently the most critical zone of the blade. It is therefore important to estimate the fatigue life of the root joint. The cohesive zone model is one of the best methods for predicting the initiation and propagation of debonding at the root joint. The advantage of this method is that debonding can be modeled without any need for remeshing. In order to use this approach, however, it is necessary to analyze the cyclic loading condition at the root joint. For this purpose, after implementing a cohesive interface element in the Ansys finite element software, one blade of a horizontal-axis wind turbine with a 46 m rotor diameter was modelled at full scale. After applying loads to the blade at different positions in a full rotation, the critical condition of the blade was obtained based on the delamination index, and the load ratio on the root joint in fatigue cycles was calculated. These data are the inputs for fatigue damage growth analysis of the root joint using the CZM approach, which will be investigated in future work.
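    Cohesive zone models of this kind are commonly built on a bilinear traction-separation law; a minimal sketch, with illustrative stiffness and opening parameters rather than the paper's values:

```python
def cohesive_traction(delta, delta0=0.01, deltaf=0.1, tmax=30.0):
    """Bilinear cohesive law: traction rises linearly to tmax at the
    damage-initiation opening delta0, then softens linearly to zero at
    the failure opening deltaf (units illustrative: mm openings, MPa)."""
    if delta <= 0:
        return 0.0
    if delta < delta0:
        return tmax * delta / delta0                          # elastic branch
    if delta < deltaf:
        return tmax * (deltaf - delta) / (deltaf - delta0)    # softening
    return 0.0                                                # fully debonded

print(cohesive_traction(0.005))  # midpoint of the elastic branch (15.0)
print(cohesive_traction(0.055))  # softening branch
print(cohesive_traction(0.2))    # 0.0, complete debonding
```

The area under this curve is the fracture energy of the adhesive; in a fatigue setting the softening branch is degraded cycle by cycle, which is why no remeshing is required as the debond front advances.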

  10. Incompletely Mixed Surface Transient Storage Zones at River Restoration Structures: Modeling Implications

    NASA Astrophysics Data System (ADS)

    Endreny, T. A.; Robinson, J.

    2012-12-01

    River restoration structures, also known as river steering deflectors, are designed to reduce bank shear stress by generating wake zones between the bank and the constricted conveyance region. There is interest in characterizing the surface transient storage (STS) and associated biogeochemical processing in the STS zones around these structures to quantify the ecosystem benefits of river restoration. This research explored how the hydraulics around river restoration structures prohibit application of transient storage models designed for homogeneous, completely mixed STS zones. We used slug and constant-rate injections of a conservative tracer in a 3rd-order river in Onondaga County, NY over the course of five experiments at varying flow regimes. Recovered breakthrough curves spanned a transect including the main channel and wake zone at a j-hook restoration structure. We noted divergent patterns of peak solute concentrations and peak times within the wake zone regardless of transect location within the structure. Analysis reveals an inhomogeneous STS zone which is frequently still loading tracer after the main channel has peaked. The breakthrough curve loading patterns at the restoration structure violated the assumptions of simplified "random walk" 2-zone transient storage models, which seek to identify representative STS zones and zone locations. Use of structure-scale Wiener-filter-based multi-rate mass transfer models to characterize STS zone residence times is similarly dependent on a representative zone location. Each 2-zone model assumes one zone is a completely mixed STS zone and the other a completely mixed main channel. Our research reveals limits to simple application of the recently developed 2-zone models, and raises important questions about the measurement scale necessary to identify critical STS properties at restoration sites. 
An explanation for the incompletely mixed STS zone may be the distinct hydraulics at restoration sites, including a constrained high-velocity conveyance region closely abutting a wake zone that receives periodic disruption from vortices shed by the upstream structure. Figure 1. River restoration j-hook with blue dye revealing the main channel and the edge of the wake zone with multiple surface transient storage zones.
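    The completely mixed 2-zone idealization that the field data violate can be written as a pair of first-order exchange equations; this sketch omits advection and uses illustrative parameter values only:

```python
def two_zone_exchange(c0=1.0, alpha=1e-3, A=1.0, As=0.2, dt=1.0, nsteps=5000):
    """Idealized 2-zone exchange: a completely mixed channel of area A
    trades solute with a completely mixed storage zone of area As at
    first-order rate alpha (1/s) -- the homogeneity assumption that the
    breakthrough curves above violate. Explicit Euler integration."""
    c, cs = c0, 0.0
    for _ in range(nsteps):
        dc = alpha * (cs - c)                # channel loses to storage
        dcs = alpha * (A / As) * (c - cs)    # storage gains, area-weighted
        c, cs = c + dt * dc, cs + dt * dcs
    return c, cs

c, cs = two_zone_exchange()
print(c, cs)  # both concentrations approach the mixed equilibrium A*c0/(A+As)
```

Mass (A*c + As*cs) is conserved exactly at each step, and both zones relax to a single equilibrium concentration; an incompletely mixed wake zone, by contrast, has no single cs that describes it.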

  11. Generalized mathematical model of red muds’ thickener of alumina production

    NASA Astrophysics Data System (ADS)

    Fedorova, E. R.; Vinogradova, A. A.

    2018-03-01

    The article describes the principle of constructing a generalized mathematical model of a red mud thickener in alumina production. The model consists of sub-models for the flocculation zone containing the solid fraction of the feed slurry, the free-fall and hindered (cramped) sedimentation zones (the effective sedimentation zones), and the clarification (bleaching) zone. The generalized mathematical model of the thickener allows predicting the solid-fraction content in the condensed (underflow) product and in the overflow (upper discharge). The solid-phase aggregation sub-model allows calculation of the average size of the floccules created during flocculation in the feedwell. The sub-model of the free-fall and hindered sedimentation zone allows calculation of the concentration profile, taking into account the variable cross-sectional area of the thickener. The sub-model of the clarification zone is constructed on the basis of Kynch sedimentation theory, supplemented by correction factors.
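    The batch settling flux at the heart of Kynch-type sedimentation theory can be sketched as follows; the hindered-settling (Richardson-Zaki-style) closure and all parameter values are illustrative assumptions, not the authors' correlations:

```python
def settling_flux(phi, v0=1.0e-4, n=4.65, phi_max=0.6):
    """Kynch-style batch flux (m/s times solids fraction): hindered
    settling velocity multiplied by solids volume fraction phi. v0 is an
    assumed single-floccule settling velocity; phi_max an assumed maximum
    packing fraction."""
    if phi <= 0 or phi >= phi_max:
        return 0.0
    return v0 * phi * (1.0 - phi / phi_max) ** n

# Flux vanishes at zero solids and at maximum packing, peaking in between;
# the location of this peak controls the concentration profile in the
# sedimentation zone
fluxes = [settling_flux(p / 100) for p in range(0, 61)]
print(max(fluxes))
```

In a full thickener model this flux, divided by the local cross-sectional area, feeds the conservation law that yields the concentration profile mentioned above.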

  12. Dependability modeling and assessment in UML-based software development.

    PubMed

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.
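    The kind of dependability measure such an analysis produces can be illustrated with the simplest formal model a DSPN can encode: a two-state up/down continuous-time Markov chain solved for its steady state (rates below are illustrative):

```python
import numpy as np

def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of a 2-state up/down CTMC: solve
    pi Q = 0 with pi summing to 1, and return the probability of the
    'up' state. Equivalent closed form: mu / (lambda + mu)."""
    Q = np.array([[-failure_rate, failure_rate],
                  [repair_rate, -repair_rate]])
    # Replace one balance equation with the normalization condition
    A = np.vstack([Q.T[:-1], np.ones(2)])
    b = np.array([0.0, 1.0])
    pi = np.linalg.solve(A, b)
    return pi[0]

print(steady_state_availability(1e-4, 1e-1))  # ~0.999 availability
```

Real DSPN solvers handle far larger state spaces (and deterministic transitions), but the assessment step, solving the formal model and reading off an NFP such as availability, has exactly this shape.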

  13. Dependability Modeling and Assessment in UML-Based Software Development

    PubMed Central

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C.

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results. PMID:22988428

  14. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.

  15. An assessment of AVIRIS data for hydrothermal alteration mapping in the Goldfield Mining District, Nevada

    NASA Technical Reports Server (NTRS)

    Carrere, Veronique; Abrams, Michael J.

    1988-01-01

    Airborne Visible and Infrared Imaging Spectrometer (AVIRIS) data were acquired over the Goldfield Mining District, Nevada, in September 1987. Goldfield is one of the group of large epithermal precious metal deposits in Tertiary volcanic rocks, associated with silicic volcanism and caldera formation. Hydrothermal alteration consists of silicification along fractures, advanced argillic and argillic zones further away from veins, and more widespread propylitic zones. An evaluation of AVIRIS data quality was performed. Faults in the data, related to engineering problems and different behavior of the instrument while on board the U-2, were encountered. Consequently, a decision was made to use raw data and correct them only for dark-current variations and detector read-out delays. New software was written to that effect. Atmospheric correction was performed using the flat-field correction technique. Analysis of the data was then performed to extract spectral information, concentrating mainly on the 2 to 2.45 micron window, as the alteration minerals of interest have their distinctive spectral reflectance features in this region. Kaolinite and alunite spectra, principally, were clearly obtained. Mapping of the different minerals and alteration zones was attempted using ratios and clustering techniques. Poor signal-to-noise performance of the instrument and the lack of appropriate software prevented the production of an alteration map of the area. Spectra extracted locally from the AVIRIS data were checked in the field by collecting representative samples of the outcrops.
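    The band-ratio technique mentioned above can be sketched as follows; the band indices and reflectance values are synthetic placeholders, not actual AVIRIS channels:

```python
import numpy as np

def band_ratio(cube, num_band, den_band, eps=1e-6):
    """Ratio image for mineral mapping: an absorption feature (e.g. the
    ~2.2 micron Al-OH feature of kaolinite and alunite) depresses one band
    relative to a shoulder band, so the shoulder/feature ratio highlights
    altered pixels. cube is (rows, cols, bands)."""
    return cube[..., num_band] / (cube[..., den_band] + eps)

# Tiny synthetic scene: 2x2 pixels, 3 bands; pixel (0, 0) has an absorption
# feature in band 2, so its shoulder/feature ratio stands out
cube = np.array([[[0.8, 0.8, 0.4], [0.8, 0.8, 0.8]],
                 [[0.7, 0.7, 0.7], [0.6, 0.6, 0.6]]])
ratio = band_ratio(cube, 1, 2)
print(ratio)  # pixel (0, 0) near 2.0, unaltered pixels near 1.0
```

Ratioing also suppresses overall brightness differences between pixels, which is why it remains usable even on data with calibration flaws like those described here.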

  16. Thermal dynamic simulation of wall for building energy efficiency under varied climate environment

    NASA Astrophysics Data System (ADS)

    Wang, Xuejin; Zhang, Yujin; Hong, Jing

    2017-08-01

    For different kinds of walls in five cities in different thermal-design zones, the authors developed software based on the thermal instantaneous response factor method to calculate air-conditioning cooling load temperatures, thermal response factors, and periodic response factors. On the basis of these data, the authors analyze how the dynamic thermal behavior of walls influences the air-conditioning load and the indoor thermal environment in buildings in the different thermal-design zones, and propose strategies for designing thermal insulation and heat-preservation walls based on the dynamic thermal characteristics of walls in each zone. Year-round dynamic thermal load simulation and energy-consumption analysis for new energy-saving buildings is important in building-environment engineering. The software provides a scientific basis for year-round dynamic thermal load simulation, energy-consumption analysis, and building environmental system control, and supports further research on the thermal behavior and overall performance evaluation of new energy-saving wall constructions. On this basis, building energy systems can be designed more conveniently, building energy consumption can be analyzed, and energy can be managed scientifically.
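    The attenuation of a periodic outdoor temperature wave by a wall, which response-factor calculations capture in detail, can be approximated with the classical semi-infinite-medium formulas; the wall thickness and thermal diffusivity below are illustrative values, not data from the study:

```python
import math

def wall_periodic_response(L, alpha, period=24 * 3600.0):
    """Decrement factor and time lag of a slab of thickness L (m) and
    thermal diffusivity alpha (m^2/s) under a sinusoidal outdoor
    temperature wave, using the semi-infinite-medium approximation:
    the amplitude decays as exp(-L/d) and lags by L/d radians, where
    d = sqrt(2*alpha/omega) is the thermal penetration depth."""
    omega = 2.0 * math.pi / period
    d = math.sqrt(2.0 * alpha / omega)
    decrement = math.exp(-L / d)
    time_lag_hours = (L / d) / omega / 3600.0
    return decrement, time_lag_hours

# 0.2 m heavy masonry wall, alpha ~ 5e-7 m^2/s (illustrative)
f, lag = wall_periodic_response(0.2, 5e-7)
print(f, lag)  # strong attenuation and a lag of several hours
```

A massive wall thus both damps and delays the daily temperature swing, which is exactly the dynamic characteristic the zoning-specific design strategies above exploit.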

  17. 77 FR 5389 - Fisheries of the Exclusive Economic Zone Off Alaska; Chinook Salmon Bycatch Management in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... for audit information on a Crab EDR. Based on experience in these EDR programs, in the final rule... hardware, software, or Internet is restored, the User must enter this same information into the electronic... Fisheries Act catcher vessels, catcher/processor, and mothership sectors as well as representatives for the...

  18. Highly-optimized TWSM software package for seismic diffraction modeling adapted for GPU-cluster

    NASA Astrophysics Data System (ADS)

    Zyatkov, Nikolay; Ayzenberg, Alena; Aizenberg, Arkady

    2015-04-01

    Oil-producing companies seek to increase the resolution of seismic data for complex oil-and-gas-bearing deposits associated with salt domes, basalt traps, reefs, lenses, etc. Known methods of seismic wave theory define the shape of hydrocarbon accumulations with insufficient resolution, since they do not account for multiple diffractions explicitly. We elaborate an alternative seismic wave theory in terms of operators of propagation in layers and reflection-transmission at curved interfaces. An approximation of this theory is realized in the seismic frequency range as the Tip-Wave Superposition Method (TWSM). TWSM, based on the operator theory, allows evaluation of wavefields in bounded domains/layers with geometrical shadow zones (in nature: salt domes, basalt traps, reefs, lenses, etc.), accounting for so-called cascade diffraction. Cascade diffraction includes edge waves from sharp edges, creeping waves near concave parts of interfaces, whispering-gallery waves near convex parts of interfaces, etc. The basic algorithm of the TWSM package is based on multiplication of large matrices (up to hundreds of terabytes in size). We use advanced information technologies for effective realization of the numerical procedures of TWSM. In particular, we actively use NVIDIA CUDA technology and GPU accelerators, which significantly improve the performance of the TWSM software package; this is important for its use in direct and inverse problems. The accuracy, stability and efficiency of the algorithm are justified by numerical examples with curved interfaces. The TWSM package and its separate components can be used in modeling tasks such as planning of acquisition systems, physical interpretation of laboratory modeling, and modeling of individual waves of different types, and in inverse tasks such as imaging in the case of a laterally inhomogeneous overburden and AVO inversion.
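    The tile-by-tile matrix multiplication at the heart of such a package can be sketched on the CPU; this NumPy version shows only the blocking idea used to stream matrices too large for device memory, not the CUDA implementation:

```python
import numpy as np

def blocked_matmul(A, B, block=64):
    """Tiled matrix product: accumulate C tile by tile so that each tile
    triple (A-tile, B-tile, C-tile) fits in fast memory. The same
    decomposition lets very large operator matrices be streamed through a
    GPU accelerator one tile at a time."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2
    C = np.zeros((n, m))
    for i in range(0, n, block):
        for j in range(0, m, block):
            for p in range(0, k, block):
                C[i:i + block, j:j + block] += (
                    A[i:i + block, p:p + block] @ B[p:p + block, j:j + block])
    return C

A = np.random.rand(130, 70)
B = np.random.rand(70, 90)
print(np.allclose(blocked_matmul(A, B), A @ B))  # True
```

On a GPU, each innermost tile product maps onto a batched GEMM call, and overlapping tile transfers with computation hides the PCIe traffic that would otherwise dominate for terabyte-scale operands.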

  19. Software For Computing Reliability Of Other Software

    NASA Technical Reports Server (NTRS)

    Nikora, Allen; Antczak, Thomas M.; Lyu, Michael

    1995-01-01

    Computer Aided Software Reliability Estimation (CASRE) computer program developed for use in measuring reliability of other software. Easier for non-specialists in reliability to use than many other currently available programs developed for same purpose. CASRE incorporates mathematical modeling capabilities of public-domain Statistical Modeling and Estimation of Reliability Functions for Software (SMERFS) computer program and runs in Windows software environment. Provides menu-driven command interface; enabling and disabling of menu options guides user through (1) selection of set of failure data, (2) execution of mathematical model, and (3) analysis of results from model. Written in C language.

  20. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
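    The feedback character of such a system-dynamics model can be illustrated with a toy loop in which communication overhead grows with team size; every coefficient here is invented for illustration and is not a SEPS calibration value:

```python
def simulate_project(scope=1000.0, staff=5.0, months=60,
                     productivity=10.0, overhead=0.02):
    """Toy system-dynamics loop: work done per month is staff times
    productivity, degraded by a communication overhead that grows with
    the square of team size. Returns the month of completion, or None if
    the project does not finish within the horizon."""
    done = 0.0
    for month in range(1, months + 1):
        effective = staff * productivity - overhead * staff ** 2 * productivity
        done += max(effective, 0.0)
        if done >= scope:
            return month
    return None

print(simulate_project(staff=5))   # finishes in 23 months
print(simulate_project(staff=40))  # 13 months: 8x the staff is not 8x faster
```

Even this crude loop reproduces the qualitative managerial tradeoff SEPS is built to explore: past a point, adding staff buys little schedule, and can even push completion out of reach.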

  1. Advances in Land Data Assimilation at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Reichle, Rolf

    2009-01-01

    Research in land surface data assimilation has grown rapidly over the last decade. In this presentation we provide a brief overview of key research contributions by the NASA Goddard Space Flight Center (GSFC). The GSFC contributions to land assimilation primarily include the continued development and application of the Land Information System (LIS) and the ensemble Kalman filter (EnKF). In particular, we have developed a method to generate perturbation fields that are correlated in space, time, and across variables and that permit the flexible modeling of errors in land surface models and observations, along with an adaptive filtering approach that estimates observation and model error input parameters. A percentile-based scaling method that addresses soil moisture biases in model and observational estimates opened the path to the successful application of land data assimilation to satellite retrievals of surface soil moisture. Assimilation of AMSR-E surface soil moisture retrievals into the NASA Catchment model provided superior surface and root zone assimilation products (when validated against in situ measurements and compared to the model estimates or satellite observations alone). The multi-model capabilities of LIS were used to investigate the role of subsurface physics in the assimilation of surface soil moisture observations. Results indicate that the potential of surface soil moisture assimilation to improve root zone information is higher when the surface to root zone coupling is stronger. Building on this experience, GSFC leads the development of the Level 4 Surface and Root-Zone Soil Moisture (L4_SM) product for the planned NASA Soil-Moisture-Active-Passive (SMAP) mission. 
A key milestone was the design and execution of an Observing System Simulation Experiment that quantified the contribution of soil moisture retrievals to land data assimilation products as a function of retrieval and land model skill and yielded an estimate of the error budget for the SMAP L4_SM product. Terrestrial water storage observations from the GRACE satellite system were also successfully assimilated into the NASA Catchment model and provided improved estimates of groundwater variability when compared to the model estimates alone. Moreover, satellite-based land surface temperature (LST) observations from the ISCCP archive were assimilated using a bias estimation module that was specifically designed for LST assimilation. As with soil moisture, LST assimilation provides modest yet statistically significant improvements when compared to the model or satellite observations alone. To achieve the improvement, however, the LST assimilation algorithm must be adapted to the specific formulation of LST in the land model. An improved method for the assimilation of snow cover observations was also developed. Finally, the coupling of LIS to the mesoscale Weather Research and Forecasting (WRF) model enabled investigations into how the sensitivity of land-atmosphere interactions to the specific choice of planetary boundary layer scheme and land surface model varies across surface moisture regimes, and how it can be quantified and evaluated against observations. The on-going development and integration of land assimilation modules into the Land Information System will enable the use of GSFC software with a variety of land models and make it accessible to the research community.
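    The EnKF analysis step described above can be sketched in textbook (stochastic, perturbed-observation) form; the two-state surface/root-zone soil moisture toy problem and all numbers are illustrative:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_var, H):
    """Stochastic EnKF analysis: ensemble is (n_ens, n_state), H maps
    state to observation space. The gain is built from the ensemble
    sample covariance, and each member assimilates a perturbed copy of
    the observation."""
    rng = np.random.default_rng(0)
    n_ens = ensemble.shape[0]
    P = np.cov(ensemble.T)  # sample covariance across members
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_var * np.eye(len(obs)))
    perturbed = obs + rng.normal(0, np.sqrt(obs_var), (n_ens, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T

# Two-layer toy: observe surface soil moisture (state 0) only; the update
# also corrects the unobserved root zone (state 1) via their correlation
rng = np.random.default_rng(1)
truth = np.array([0.30, 0.25])
ens = truth + rng.normal(0, 0.05, (200, 2)) @ np.array([[1.0, 0.8],
                                                        [0.0, 0.6]])
H = np.array([[1.0, 0.0]])
analysis = enkf_update(ens, np.array([0.30]), 1e-4, H)
print(analysis[:, 0].mean())  # surface pulled close to the observation
```

The root-zone spread shrinks even though only the surface is observed, which is the mechanism behind the result above that stronger surface-to-root-zone coupling yields larger assimilation benefit at depth.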

  2. Spatial variation of slip behavior beneath the Alaska Peninsula along Alaska-Aleutian Subduction Zone

    NASA Astrophysics Data System (ADS)

    Li, S.; Freymueller, J. T.

    2017-12-01

    The Alaska Peninsula, including the Shumagin and Semidi segments in the Alaska-Aleutian subduction zone, is one of the best places in the world to study along-strike variations in the seismogenic zone. Understanding the cause of along-strike variations on the plate interface and seismic potential is significant for better understanding of the dynamic mechanical properties of faults and the rheology of the lower crust and lithospheric mantle in subduction zones. GPS measurements can be used to study these properties and estimate the slip deficit distribution on the plate interface. We re-surveyed pre-existing (1992-2001) campaign GPS sites in 2016 and estimated a new dense and highly precise GPS velocity field for the Alaska Peninsula. We find evidence for only minimal time variations in the slip distribution in the region. We used the TDEFNODE software package to invert for the slip deficit distribution from the new velocities. There are long-wavelength systematic misfits to the vertical velocities from the optimal model that fits the horizontal velocities well, which cannot be explained by altering the slip distribution on the subduction plate interface. Possible explanations for the systematic misfit are still under investigation since the plate geometry, GIA effect and reference frame errors do not explain the misfits. In this study, we use only the horizontal velocities. We divided the overall Alaska Peninsula area into three sub-areas, which have strong differences in the pattern of the observed deformation, and explored optimal models for each sub-area. The width of the locked region decreases step-wise from NE to SW along strike. Then we compared each of these models to all of the data to identify the locations of the along-strike boundaries that mark the transition from strongly to weakly coupled segments of the margin. We identified three sharp boundaries separating segments with different fault slip deficit rate distributions. 
Significant changes in fault coupling from strong to weak are spatially correlated with changes in pre-existing plate fabric caused by the cessation of Kula-Pacific spreading and the reorientation of the northern section of Farallon-Pacific spreading, which also correlate with changes in the degree of outer-rise normal faulting and hydration of the downgoing plate.

  3. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  4. Comparison of the Structurally Controlled Landslides Numerical Model Results to the M 7.2 2013 Bohol Earthquake Co-seismic Landslides

    NASA Astrophysics Data System (ADS)

    Macario Galang, Jan Albert; Narod Eco, Rodrigo; Mahar Francisco Lagmay, Alfredo

    2015-04-01

    The M 7.2 October 15, 2013 Bohol earthquake is the most destructive earthquake to hit the Philippines since 2012. The epicenter was located in Sagbayan municipality, central Bohol, and was generated by a previously unmapped reverse fault called the "Inabanga Fault", named after the barangay (village) where the fault is best exposed and was first observed. The earthquake resulted in 209 fatalities and over 57 billion USD worth of damage. It also generated co-seismic landslides, most of which were related to fault structures. Unlike rainfall-induced landslides, co-seismic landslides are triggered without warning. Preparedness against this type of landslide therefore relies heavily on the identification of fracture-related unstable slopes. To mitigate the impacts of co-seismic landslide hazards, morpho-structural orientations or discontinuity sets were mapped in the field with the aid of a 2012 IFSAR Digital Terrain Model (DTM) with 5-meter pixel resolution and <0.5-meter vertical accuracy. Coltop 3D software was then used to identify similar structures, including measurement of their dips and dip directions. The chosen discontinuity sets were then keyed into the Matterocking software to identify potential rock-slide zones due to planar or wedge discontinuities. After identifying the structurally controlled unstable slopes, the rock-mass propagation extent of the possible rock slides was simulated using Conefall. The results were compared to a post-earthquake landslide inventory of 456 landslides. Of the total number of landslides identified from post-earthquake high-resolution imagery, 366, or 80%, intersect the structurally controlled hazard areas of Bohol. The results show the potential of this method to identify co-seismic landslide hazard areas for disaster mitigation. Along with computer methods that simulate shallow landslides and debris-flow paths, the structurally controlled unstable zones located in this way can be used to mark unsafe areas for settlement. 
The method can be further improved with the use of Lidar DTMs, which have better accuracy than the IFSAR DTM. A nationwide effort under DOST-Project NOAH (DREAM-LIDAR) is underway to map the Philippine archipelago using Lidar.
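    The discontinuity screening performed with Coltop 3D and Matterocking rests on kinematic tests such as the classical Markland planar-failure check, sketched here with conventional default thresholds rather than values from the Bohol study:

```python
def planar_slide_possible(slope_dip, slope_dip_dir,
                          joint_dip, joint_dip_dir,
                          friction_angle=30.0, tol=20.0):
    """Markland-style kinematic test for planar rock slides: a joint can
    daylight and slide if it dips less steeply than the slope face, more
    steeply than the friction angle, and in roughly the same direction as
    the slope (within tol degrees). All angles in degrees; the friction
    angle and tolerance are conventional defaults."""
    diff = abs((joint_dip_dir - slope_dip_dir + 180) % 360 - 180)
    return diff <= tol and friction_angle < joint_dip < slope_dip

print(planar_slide_possible(60, 120, 45, 125))  # True: unstable geometry
print(planar_slide_possible(60, 120, 45, 300))  # False: joint dips into slope
```

Applied cell by cell over a DTM with the mapped discontinuity sets, a test of this kind yields the structurally controlled hazard zones that the Conefall runout simulation then propagates downslope.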

  5. Effect of late HIV diagnosis on HIV-related mortality among adults in general hospitals of Central Zone Tigray, northern Ethiopia: a retrospective cohort study.

    PubMed

    Belay, Hadera; Alemseged, Fessahaye; Angesom, Teklit; Hintsa, Solomon; Abay, Mebrahtu

    2017-01-01

    The global incidence of HIV infection is not significantly decreasing, especially in sub-Saharan African countries, including Ethiopia. Though free HIV services are available and accessible, people are not being diagnosed early for HIV, and hence patients are still dying of HIV-related causes. This research is aimed at verifying the effect of late diagnosis of HIV on HIV-related mortality in Central Zone Tigray, Ethiopia. A retrospective cohort study among adult (≥15 years old) HIV patients in three general hospitals of Tigray was conducted. Record reviews were carried out retrospectively from 2010 to 2015. Sample size was determined using stpower Cox in Stata software. Data were entered into EpiData version 3.1 software and transferred to Stata version 12 for analysis. Both bivariable and multivariable analyses were performed using the Cox regression model to compare the HIV-related mortality of exposed (cluster of differentiation 4 [CD4] cell count <350 cells/mm³) and nonexposed (≥350 cells/mm³) patients using the adjusted hazard ratio (AHR) at 95% confidence interval (CI). In all, 638 HIV patients were analyzed, contributing 2,105.6 person-years. Forty-eight (7.5%) patients died of HIV-related causes, a mortality rate of 2.28 per 100 person-years. In the multivariable Cox regression model, patients with late diagnosis of HIV had a higher risk of mortality (AHR =3.22, 95% CI: 1.17-8.82) than patients with early diagnosis of HIV. Rural residence (AHR =1.96, 95% CI: 1.05-3.68), unemployment (AHR =2.70, 95% CI: 1.03-7.08), bedridden status (AHR =2.98, 95% CI: 1.45-6.13), ambulatory status (AHR =2.54, 95% CI: 1.05-6.15), and baseline hemoglobin level of <11 mg/dL (AHR =3.06, 95% CI: 1.51-6.23) were other independent predictors of mortality. Late diagnosis of HIV increased HIV-related mortality. 
Rural residence, unemployment, bedridden and ambulatory patients, and baseline hemoglobin level <11 mg/dL were also independent predictors of HIV-related mortality.
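    As a quick arithmetic check, the crude mortality rate reported in the abstract (48 deaths over 2,105.6 person-years) can be reproduced in a few lines; the figures come directly from the record, and the function name is ours:

```python
def mortality_rate_per_100py(deaths, person_years):
    """Crude event rate expressed per 100 person-years of follow-up."""
    return deaths / person_years * 100

rate = mortality_rate_per_100py(48, 2105.6)
print(round(rate, 2))  # 2.28 per 100 person-years, matching the abstract
```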

  6. The effect of earthquake on architecture geometry with non-parallel system irregularity configuration

    NASA Astrophysics Data System (ADS)

    Teddy, Livian; Hardiman, Gagoek; Nuroji; Tudjono, Sri

    2017-12-01

    Indonesia is an area prone to earthquakes that may cause casualties and damage to buildings. Fatalities and injuries are largely caused not by the earthquake itself, but by building collapse. The collapse of a building results from the building's behaviour under the earthquake, and it depends on many factors, such as architectural design, the geometric configuration of structural elements in horizontal and vertical planes, earthquake zone, geographical location (distance to the earthquake center), soil type, material quality, and construction quality. One of the geometric configurations that may lead to the collapse of a building is the irregular configuration of a non-parallel system. In accordance with FEMA-451B, an irregular configuration of a non-parallel system exists if the vertical lateral force-resisting elements are neither parallel nor symmetric with respect to the main orthogonal axes of the lateral force-resisting system. Such a configuration may lead to torsion, diagonal translation and local damage to buildings. This does not mean that a non-parallel irregular configuration must be avoided in architectural design; however, the designer must know the consequences of earthquake behaviour for buildings with an irregular configuration of a non-parallel system. The present research aims to identify earthquake behaviour in architectural geometry with an irregular configuration of a non-parallel system. The research was quantitative, using an experimental simulation method. It comprised 5 models, for which architectural data and structural model data were input and analyzed using the software SAP2000 to determine performance, and ETABS 2015 to determine the eccentricity that occurred. The output of the software analysis was tabulated, graphed, compared and analyzed against relevant theories. For strong earthquake zones, avoid designing buildings that wholly form an irregular configuration of a non-parallel system. 
If it is inevitable to design a building with parts containing an irregular configuration of a non-parallel system, make it more rigid by forming a triangle module, and use the formula. Good collaboration is needed between architects and structural experts in creating earthquake architecture.

  7. Natural constraints on the rheology of the lower continental crust (Musgrave Ranges, Central Australia)

    NASA Astrophysics Data System (ADS)

    Hawemann, Friedrich; Mancktelow, Neil; Wex, Sebastian; Camacho, Alfredo; Pennacchioni, Giorgio

    2015-04-01

    Current models and extrapolated laboratory data generally predict viscous flow in the lower continental crust, and any localized brittle deformation at these depths has been proposed to reflect downward propagation of the frictional-viscous transition zone during short-term seismic events and related high strain rates. Better natural constraints on this proposed rheological behaviour can be obtained directly from currently exposed lower crust that has not been strongly overprinted during its exhumation. One of the largest and best preserved lower crustal sections is located in the Musgrave Ranges, Central Australia. The Petermann Orogeny (550 Ma) in this area is characterized by the development of localized shear zones on a wide range of scales, overprinting water-deficient granulites of Musgravian age (1.2 Ga) as well as younger granites and gabbros. Shearing is rarely localized on lithological inhomogeneities, but rather on precursor fractures and on commonly associated pseudotachylytes. The only exception is that older dolerite dykes are often exploited, possibly because they are planar layers of markedly smaller grain size. Sheared pseudotachylyte often appears caramel-coloured in the field and has a fine-grained assemblage of Grt+Cpx+Fsp. Multiple generations of pseudotachylyte, formed broadly coeval with shearing, are indicated by clasts of sheared pseudotachylyte within pseudotachylyte veins that were themselves subsequently sheared. The ductile shear zones formed under sub-eclogitic conditions of ca. 650°C and 1.2 GPa, generally typical of the lower continental crust. However, the P-T conditions during pseudotachylyte formation cannot be readily determined using classical geothermobarometry, because of the fine grain sizes and possible disequilibrium. The software "Xmaptools" (by Pierre Lanari) allows the quantification of X-ray maps produced by EDS or WDS. 
It provides both very precise definition of local mineral compositions for exchange geothermobarometry on a statistical basis, and an estimate of the bulk pseudotachylyte composition for small areas, avoiding clasts and heterogeneous composition of the former melt. The combination with thermodynamic modelling using PerpleX is used to test the results from geothermobarometry. The estimated conditions are similar to the ductile shear zones and support evidence for synchronous action of brittle faulting and viscous shearing in the lower crust.

  8. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  9. Toward Building a New Seismic Hazard Model for Mainland China

    NASA Astrophysics Data System (ADS)

    Rong, Y.; Xu, X.; Chen, G.; Cheng, J.; Magistrale, H.; Shen, Z.

    2015-12-01

    At present, the only publicly available seismic hazard model for mainland China was generated by the Global Seismic Hazard Assessment Program in 1999. We are building a new seismic hazard model by integrating historical earthquake catalogs, geological faults, geodetic GPS data, and geology maps. To build the model, we construct an Mw-based homogeneous historical earthquake catalog spanning from 780 B.C. to present, create fault models from active fault data using the methodology recommended by the Global Earthquake Model (GEM), and derive a strain rate map based on the most complete GPS measurements and a new strain derivation algorithm. We divide China and the surrounding regions into about 20 large seismic source zones based on seismotectonics. For each zone, we use the tapered Gutenberg-Richter (TGR) relationship to model the seismicity rates. We estimate the TGR a- and b-values from the historical earthquake data, and constrain the corner magnitude using the seismic moment rate derived from the strain rate. From the TGR distributions, 10,000 to 100,000 years of synthetic earthquakes are simulated. Then, we distribute small and medium earthquakes according to locations and magnitudes of historical earthquakes. Some large earthquakes are distributed on active faults based on characteristics of the faults, including slip rate, fault length and width, and paleoseismic data, and the rest to the background based on the distributions of historical earthquakes and strain rate. We evaluate available ground motion prediction equations (GMPE) by comparison to observed ground motions. To apply appropriate GMPEs, we divide the region into active and stable tectonic domains. The seismic hazard will be calculated using the OpenQuake software developed by GEM. To account for site amplifications, we construct a site condition map based on geology maps. The resulting new seismic hazard map can be used for seismic risk analysis and management, and business and land-use planning.
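    The synthetic-catalog step described above can be illustrated with a minimal inverse-transform sampler for the (untapered) Gutenberg-Richter magnitude distribution; the corner-magnitude taper of the TGR model is omitted here, and the b-value and minimum magnitude are placeholder choices, not values from the study:

```python
import math
import random

def sample_gr_magnitudes(n, m_min=4.0, b=1.0, seed=42):
    """Inverse-transform sampling of the Gutenberg-Richter law, whose
    complementary CDF is P(M > m) = 10**(-b * (m - m_min))."""
    rng = random.Random(seed)
    # 1 - u lies in (0, 1], so log10 is finite and magnitudes are >= m_min
    return [m_min - math.log10(1.0 - rng.random()) / b for _ in range(n)]

mags = sample_gr_magnitudes(50_000)
# Recover the b-value with Aki's maximum-likelihood estimator
b_hat = math.log10(math.e) / (sum(mags) / len(mags) - 4.0)
print(round(b_hat, 2))  # close to the input b = 1.0
```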

  10. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
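    The calibrated-questionnaire approach can be sketched as a COCOMO-style parametric formula: nominal effort scaled by an effort-adjustment factor built from questionnaire-derived multipliers. The coefficients and multiplier names below are illustrative assumptions, not the actual JPL model:

```python
import math

def parametric_effort(kloc, multipliers, a=2.8, b=1.05):
    """Toy parametric cost model: effort = a * KLOC**b * product(multipliers).
    `multipliers` maps questionnaire items to cost drivers (>1 inflates cost)."""
    eaf = math.prod(multipliers.values())  # effort adjustment factor
    return a * kloc ** b * eaf

# Hypothetical answers to a few of the ~50 prompted questions
answers = {"team_experience": 0.9, "requirements_volatility": 1.2, "tool_support": 0.95}
effort_pm = parametric_effort(32, answers)  # person-months for a 32 KLOC task
```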

  11. Air Flow Modeling in the Wind Tunnel of the FHWA Aerodynamics Laboratory at Turner-Fairbank Highway Research Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sitek, M. A.; Lottes, S. A.; Bojanowski, C.

    Computational fluid dynamics (CFD) modeling is widely used in industry for design and in the research community to support, compliment, and extend the scope of experimental studies. Analysis of transportation infrastructure using high performance cluster computing with CFD and structural mechanics software is done at the Transportation Research and Analysis Computing Center (TRACC) at Argonne National Laboratory. These resources, available at TRACC, were used to perform advanced three-dimensional computational simulations of the wind tunnel laboratory at the Turner-Fairbank Highway Research Center (TFHRC). The goals were to verify the CFD model of the laboratory wind tunnel and then to use versionsmore » of the model to provide the capability to (1) perform larger parametric series of tests than can be easily done in the laboratory with available budget and time, (2) to extend testing to wind speeds that cannot be achieved in the laboratory, and (3) to run types of tests that are very difficult or impossible to run in the laboratory. Modern CFD software has many physics models and domain meshing options. Models, including the choice of turbulence and other physics models and settings, the computational mesh, and the solver settings, need to be validated against measurements to verify that the results are sufficiently accurate for use in engineering applications. The wind tunnel model was built and tested, by comparing to experimental measurements, to provide a valuable tool to perform these types of studies in the future as a complement and extension to TFHRC’s experimental capabilities. Wind tunnel testing at TFHRC is conducted in a subsonic open-jet wind tunnel with a 1.83 m (6 foot) by 1.83 m (6 foot) cross section. A three component dual force-balance system is used to measure forces acting on tested models, and a three degree of freedom suspension system is used for dynamic response tests. Pictures of the room are shown in Figure 1-1 to Figure 1-4. 
A detailed CAD geometry and CFD model of the wind tunnel laboratory at TFHRC was built and tested. Results were compared against experimental wind velocity measurements at a large number of locations around the room. This testing included an assessment of the air flow uniformity provided by the tunnel to the test zone and assessment of room geometry effects, such as influence of the proximity the room walls, the non-symmetrical position of the tunnel in the room, and the influence of the room setup on the air flow in the room. This information is useful both for simplifying the computational model and in deciding whether or not moving, or removing, some of the furniture or other movable objects in the room will change the flow in the test zone.« less

  12. Modeling the Effects of Coolant Application in Friction Stir Processing on Material Microstructure Using 3D CFD Analysis

    NASA Astrophysics Data System (ADS)

    Aljoaba, Sharif; Dillon, Oscar; Khraisheh, Marwan; Jawahir, I. S.

    2012-07-01

    The ability to generate nano-sized grains is one of the advantages of friction stir processing (FSP). However, the high temperatures generated during the stirring process within the processing zone stimulate the grains to grow after recrystallization. Therefore, maintaining the small grains becomes a critical issue when using FSP. In reported studies, coolants are applied to the fixture and/or processed material in order to reduce the temperature and hence grain growth. Most of the reported data in the literature concerning cooling techniques are experimental. We have seen no reports that attempt to predict these quantities when using coolants while the material is undergoing FSP. Therefore, there is a need to develop a model that predicts the resulting grain size when using coolants, which is an important step toward designing the material microstructure. In this study, two three-dimensional computational fluid dynamics (CFD) models are reported which simulate FSP with and without coolant application using the STAR-CCM+ commercial CFD software. In the model with the coolant application, the fixture (backing plate) is modeled, while it is not in the other model. User-defined subroutines were incorporated in the software and implemented to investigate the effects of changing process parameters on the temperature, strain rate and material velocity fields in, and around, the processed nugget. In addition, a correlation between these parameters and the Zener-Hollomon parameter used in materials science was developed to predict the grain size distribution. Different stirring conditions were incorporated in this study to investigate their effects on material flow and microstructural modification. A comparison of the results obtained by using each of the models on the processed microstructure is also presented for the case of Mg AZ31B-O alloy. The predicted results are also compared with the available experimental data and generally show good agreement.
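    The Zener-Hollomon parameter the authors correlate with grain size is Z = ε̇·exp(Q/RT). A minimal sketch follows; the activation energy and the power-law fit constants are illustrative placeholders, not values from the paper:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def zener_hollomon(strain_rate, temp_K, Q=135e3):
    """Z = strain_rate * exp(Q / (R*T)); Q is an assumed activation energy."""
    return strain_rate * math.exp(Q / (R * temp_K))

def predicted_grain_size_um(Z, a=1.0e4, m=0.27):
    """Hypothetical power-law fit d = a * Z**(-m): higher Z (cooler, faster
    deformation) gives finer recrystallized grains."""
    return a * Z ** (-m)

Z_cooled = zener_hollomon(strain_rate=10.0, temp_K=450.0)    # with coolant
Z_uncooled = zener_hollomon(strain_rate=10.0, temp_K=650.0)  # without coolant
```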

  13. a Matlab Geodetic Software for Processing Airborne LIDAR Bathymetry Data

    NASA Astrophysics Data System (ADS)

    Pepe, M.; Prezioso, G.

    2015-04-01

    The ability to build three-dimensional models through technologies based on GNSS satellite navigation systems and the continuous development of new sensors, such as Airborne Laser Scanning Hydrography (ALH), data acquisition methods and 3D multi-resolution representations, have contributed significantly to the digital 3D documentation, mapping, preservation and representation of landscapes and heritage, as well as to the growth of research in these fields. However, GNSS systems led to the use of the ellipsoidal height; to transform this height into an orthometric height it is necessary to know a geoid undulation model. The latest and most accurate global geoid undulation model, available worldwide, is EGM2008, which has been publicly released by the U.S. National Geospatial-Intelligence Agency (NGA) EGM Development Team. Therefore, given the availability and accuracy of this geoid model, we can use it in geomatics applications that require the conversion of heights. Using this model, to correct the elevation of a point that does not coincide with any grid node, one must interpolate the elevation information of the adjacent nodes. The purpose of this paper is to produce a Matlab® geodetic software package for processing airborne LIDAR bathymetry data. In particular, we focus on point clouds in the ASPRS LAS format and convert the ellipsoidal height into an orthometric height. The algorithm, valid over the whole globe and operative for all UTM zones, allows the conversion of ellipsoidal heights using the EGM2008 model. For this model we analyse the slopes which occur, in some critical areas, between the nodes of the undulation grid; we focus our attention on marine areas, verifying the impact that the slopes have on the calculation of the orthometric height and, consequently, on the accuracy of the 3-D point clouds. This experiment is carried out by analysing an ASPRS LAS file containing topographic and bathymetric data collected with LIDAR systems along the coasts of Oregon and Washington (USA).
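    The interpolation step described above (a point rarely falls exactly on an EGM2008 grid node) and the height conversion H = h − N can be sketched as follows; the node coordinates and undulation values are hypothetical:

```python
def bilinear(x, y, x0, x1, y0, y1, n00, n10, n01, n11):
    """Bilinear interpolation of the geoid undulation N between the four
    surrounding grid nodes (n00 at (x0, y0), n11 at (x1, y1))."""
    tx = (x - x0) / (x1 - x0)
    ty = (y - y0) / (y1 - y0)
    return (n00 * (1 - tx) * (1 - ty) + n10 * tx * (1 - ty)
            + n01 * (1 - tx) * ty + n11 * tx * ty)

def orthometric_height(h_ellipsoidal, undulation_n):
    """H = h - N: remove the interpolated geoid undulation from the
    GNSS-derived ellipsoidal height."""
    return h_ellipsoidal - undulation_n

# Hypothetical undulation values (metres) at four neighbouring grid nodes
n = bilinear(10.3, 45.6, 10.0, 10.5, 45.5, 46.0, 42.1, 42.4, 41.9, 42.2)
h_ortho = orthometric_height(120.0, n)
```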

  14. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  15. NASA Software Cost Estimation Model: An Analogy Based Estimation Model

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James

    2015-01-01

    The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to the performance of COCOMO II, linear regression, and K-nearest neighbor prediction models on the same data set.
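    The analogy/clustering idea can be sketched with a tiny k-nearest-neighbour estimator; the feature choices and project data below are hypothetical, not drawn from the NASA dataset:

```python
import math

def knn_effort_estimate(target, history, k=3):
    """Estimate effort for `target` (a feature vector) as the mean effort of
    its k nearest historical analogues under Euclidean distance."""
    ranked = sorted(history, key=lambda rec: math.dist(target, rec["features"]))
    nearest = ranked[:k]
    return sum(rec["effort"] for rec in nearest) / len(nearest)

# Hypothetical completed projects: features = (size in KSLOC, complexity score)
projects = [
    {"features": (10, 2), "effort": 40},
    {"features": (12, 2), "effort": 48},
    {"features": (50, 4), "effort": 300},
    {"features": (55, 5), "effort": 340},
]
print(knn_effort_estimate((11, 2), projects, k=2))  # 44.0, mean of the two small analogues
```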

  16. Laboratory and numerical experiments on water and energy fluxes during freezing and thawing in the unsaturated zone

    NASA Astrophysics Data System (ADS)

    Holländer, Hartmut; Montasir Islam, Md.; Šimunek, Jirka

    2017-04-01

    Frozen soil has a major effect on many hydrologic processes, and its effects are difficult to predict. A prime example is flood forecasting during spring snowmelt within the Canadian Prairies. One key driver of the extent of flooding is the antecedent soil moisture and the possibility for water to infiltrate into frozen soils. Therefore, these conditions are crucial for accurate flood prediction every spring. The main objective of this study was to evaluate water flow and heat transport within HYDRUS-1D version 4.16 and with Hansson's model, a detailed freezing/thawing module (Hansson et al., 2004), to predict the impact of frozen and partly frozen soil on infiltration. We developed a standardized data set of water flow and heat transport into (partially) frozen soil through laboratory experiments using fine sand. Temperature, soil moisture, and percolated water were observed under different freezing conditions as well as under thawing conditions. Significant variation in soil moisture was found between the top and the bottom of the soil column at the start of the thawing period. However, with increasing temperature, the lower depth of the soil column showed higher moisture, as the soil became enriched with moisture due to the release of heat by soil particles during the thawing cycle. We applied vadose zone modeling using the results from the laboratory experiments. The water content simulated by HYDRUS-1D 4.16 showed large errors compared to the observed data, as shown by a negative Nash-Sutcliffe efficiency. Hansson's model was not able to predict soil water fluxes due to its unstable behavior (Šimunek et al., 2016). The soil temperature profile simulated using HYDRUS-1D 4.16 did not capture the release of latent heat during the phase change of water that was visible in Hansson's model. Hansson's model includes the energy gain/loss due to the phase change in the amount of latent energy stored in the modified heat transport equation. 
However, in situations where the thermal gradient was large, latent heat was not the key process, and HYDRUS-1D 4.16 predicted soil temperatures better than Hansson's model. The newly developed data set proved useful for the evaluation and validation of the numerical models. We claim that these laboratory results will be useful for the validation of numerical models and for developing the scientific knowledge needed to suggest potential code variations or new code development in numerical models. References: Hansson, K., J. Šimunek, M. Mizoguchi, L.-C. Lundin, and M. T. van Genuchten (2004), Water Flow and Heat Transport in Frozen Soil, Vadose Zone J, 3(2), 693-704. Šimunek, J., M. T. van Genuchten, and M. Sejna (2016), Recent developments and applications of the HYDRUS computer software packages, Vadose Zone J, 15(7).
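    The Nash-Sutcliffe efficiency used above to judge the simulations is simple to compute; the observation and simulation values below are made-up illustrations:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - SSE/SS_mean: 1 is a perfect fit, 0 is no better than
    predicting the mean observation, and negative values are worse than
    the mean predictor (as reported for HYDRUS-1D here)."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_mean = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_mean

obs = [0.30, 0.32, 0.35, 0.40, 0.38]  # hypothetical observed water contents
sim = [0.10, 0.15, 0.60, 0.65, 0.20]  # hypothetical poor simulation
print(nash_sutcliffe(obs, obs))       # 1.0 for a perfect match
print(nash_sutcliffe(obs, sim) < 0)   # True: worse than the mean predictor
```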

  17. Presenting an evaluation model of the trauma registry software.

    PubMed

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma accounts for about 10% of deaths worldwide and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. A trauma registry has an important and basic role in decreasing mortality and the disabilities due to injuries resulting from trauma. Today, different software packages are designed for trauma registries. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing literature including books, articles, scientific documents, valid websites and related software in this domain. According to the general and specific criteria and related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented, using the Delphi technique, to 12 experts and specialists. To analyze the results, an agreement coefficient of 75% was set in order to apply changes. Finally, when the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. For evaluating the criteria of trauma registry software, two groups were presented: 1- general criteria, 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, and 4- decision and research support. 
The model presented in this research introduces important general and specific criteria of trauma registry software and the subcriteria related to each main criterion separately. This model was validated by experts in this field. Therefore, it can be used as a comprehensive model and a standard evaluation tool for measuring the efficiency and effectiveness, and improving the performance, of trauma registry software. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Austenite grain growth simulation considering the solute-drag effect and pinning effect

    PubMed Central

    Fujiyama, Naoto; Nishibata, Toshinobu; Seki, Akira; Hirata, Hiroyuki; Kojima, Kazuhiro; Ogawa, Kazuhiro

    2017-01-01

    The pinning effect is useful for restraining austenite grain growth in low alloy steel and improving heat-affected zone toughness in welded joints. We propose a new calculation model for predicting austenite grain growth behavior. The model mainly comprises two theories: the solute-drag effect and the pinning effect of TiN precipitates. The calculation of the solute-drag effect is based on the hypothesis that the width of each austenite grain boundary is constant and that the element content maintains equilibrium segregation at the austenite grain boundaries. We used Hillert's law under the assumption that the austenite grain boundary phase is a liquid so that we could estimate the equilibrium solute concentration at the austenite grain boundaries. The equilibrium solute concentration was calculated using the Thermo-Calc software. The pinning effect was estimated by Nishizawa's equation. The calculated austenite grain growth at 1473-1673 K showed excellent correspondence with the experimental results. PMID:28179962
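    For orientation, the classical Zener form of the pinning limit bounds the pinned grain radius by the particle radius r and volume fraction f; the paper itself uses Nishizawa's refinement, which is not reproduced here, and the particle values below are hypothetical:

```python
def zener_limit_radius(particle_radius_um, volume_fraction):
    """Classical Zener estimate of the limiting pinned grain radius,
    R_lim = 4*r / (3*f). Textbook form for illustration only; the study
    uses Nishizawa's equation instead."""
    return 4.0 * particle_radius_um / (3.0 * volume_fraction)

# e.g. hypothetical 20 nm TiN particles at 0.1 vol% pin grains near ~27 um
print(round(zener_limit_radius(0.020, 0.001), 1))  # 26.7
```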

  19. Numerical analysis of the wake of a 10kW HAWT

    NASA Astrophysics Data System (ADS)

    Gong, S. G.; Deng, Y. B.; Xie, G. L.; Zhang, J. P.

    2017-01-01

    With the rise of the wind power industry and the ever-growing scale of wind farms, research on the wake performance of wind turbines has important guiding significance for the overall arrangement of wind turbines in large wind farms. A wake simulation model of a 10 kW horizontal-axis wind turbine is presented on the basis of the Reynolds-Averaged Navier-Stokes (RANS) equations and the RNG k-ε turbulence model, as appropriate for rotational fluid flow. The sliding mesh technique in the ANSYS CFX software is used to solve the coupled velocity and pressure equations. The characteristics of the average velocity in the wake zone under the rated inlet wind speed and different rotor rotational speeds have been investigated. Based on the analysis results, it is proposed that the horizontal spacing between wind turbines is less than two rotor radii, and the longitudinal spacing less than five rotor radii. Other results have also been obtained, which are of great importance for large wind farms.

  20. Using modeling tools for implementing feasible land use and nature conservation governance systems in small islands - The Pico Island (Azores) case-study.

    PubMed

    Fernandes, J P; Freire, M; Guiomar, N; Gil, A

    2017-03-15

    The present study deals with the development of systematic conservation planning as a management instrument in small oceanic islands, ensuring open systems of governance able to integrate the informed and involved participation of stakeholders. The Marxan software was used to define management areas according to a set of alternative land use scenarios considering different conservation and management paradigms. Modeled conservation zones were interpreted and compared with the existing protected areas, providing better-integrated information for future trade-offs and stakeholder involvement. The results, allowing the identification of Target Management Units (TMU) based on the consideration of different development scenarios, proved to be consistent with the feasible development of evaluation approaches able to support sound governance systems. Moreover, the detailed geographic identification of TMU seems to be able to support participatory policies towards a more sustainable management of the entire island. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Impacts of building geometry modeling methods on the simulation results of urban building energy models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yixing; Hong, Tianzhen

    Urban-scale building energy modeling (UBEM)—using building modeling to understand how a group of buildings will perform together—is attracting increasing attention in the energy modeling field. Unlike modeling a single building, which uses detailed information, UBEM generally uses existing building stock data consisting of high-level building information. This study evaluated the impacts of three zoning methods and the use of floor multipliers on the simulated energy use of 940 office and retail buildings in three climate zones using City Building Energy Saver. The first zoning method, OneZone, creates one thermal zone per floor using the target building's footprint. The second zoning method, AutoZone, splits the building's footprint into perimeter and core zones. A novel, pixel-based automatic zoning algorithm is developed for the AutoZone method. The third zoning method, Prototype, uses the U.S. Department of Energy's reference building prototype shapes. Results show that the simulated source energy use of buildings with the floor multiplier is marginally higher, by up to 2.6%, than that of models representing each floor explicitly, which take two to three times longer to run. Compared with the AutoZone method, the OneZone method results in decreased thermal loads and smaller equipment capacities: 15.2% smaller fan capacity, 11.1% smaller cooling capacity, 11.0% smaller heating capacity, 16.9% less heating load, and 7.5% less cooling load. Source energy use differences range from -7.6% to 5.1%. When comparing the Prototype method with the AutoZone method, source energy use differences range from -12.1% to 19.0%, and larger ranges of differences are found for the thermal loads and equipment capacities. This study demonstrated that zoning methods have a significant impact on the simulated energy use of UBEM. 
Finally, one recommendation resulting from this study is to use the AutoZone method with floor multiplier to obtain accurate results while balancing the simulation run time for UBEM.

  2. Impacts of building geometry modeling methods on the simulation results of urban building energy models

    DOE PAGES

    Chen, Yixing; Hong, Tianzhen

    2018-02-20

    Urban-scale building energy modeling (UBEM)—using building modeling to understand how a group of buildings will perform together—is attracting increasing attention in the energy modeling field. Unlike modeling a single building, which uses detailed information, UBEM generally uses existing building stock data consisting of high-level building information. This study evaluated the impacts of three zoning methods and the use of floor multipliers on the simulated energy use of 940 office and retail buildings in three climate zones using City Building Energy Saver. The first zoning method, OneZone, creates one thermal zone per floor using the target building's footprint. The second zoning method, AutoZone, splits the building's footprint into perimeter and core zones. A novel, pixel-based automatic zoning algorithm is developed for the AutoZone method. The third zoning method, Prototype, uses the U.S. Department of Energy's reference building prototype shapes. Results show that the simulated source energy use of buildings with the floor multiplier is marginally higher, by up to 2.6%, than that of models representing each floor explicitly, which take two to three times longer to run. Compared with the AutoZone method, the OneZone method results in decreased thermal loads and smaller equipment capacities: 15.2% smaller fan capacity, 11.1% smaller cooling capacity, 11.0% smaller heating capacity, 16.9% less heating load, and 7.5% less cooling load. Source energy use differences range from -7.6% to 5.1%. When comparing the Prototype method with the AutoZone method, source energy use differences range from -12.1% to 19.0%, and larger ranges of differences are found for the thermal loads and equipment capacities. This study demonstrated that zoning methods have a significant impact on the simulated energy use of UBEM. 
Finally, one recommendation resulting from this study is to use the AutoZone method with floor multiplier to obtain accurate results while balancing the simulation run time for UBEM.« less
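The abstract does not detail the pixel-based AutoZone algorithm, so the following is only a minimal sketch, assuming a rectangular footprint rasterized onto a grid: cells within a fixed depth of the boundary are labeled perimeter and the rest core. The grid, depth, and function name are illustrative assumptions, not the actual City Building Energy Saver implementation.

```python
# Minimal perimeter/core split on a pixelated footprint (illustrative only;
# the real AutoZone algorithm handles arbitrary footprint shapes).

def split_zones(footprint, depth=1):
    """Label each pixel of a rectangular footprint grid as 'perimeter' if it
    lies within `depth` cells of the boundary, else 'core'."""
    rows, cols = len(footprint), len(footprint[0])
    labels = []
    for r in range(rows):
        row = []
        for c in range(cols):
            near_edge = (r < depth or c < depth or
                         r >= rows - depth or c >= cols - depth)
            row.append('perimeter' if near_edge else 'core')
        labels.append(row)
    return labels

zones = split_zones([[1] * 5 for _ in range(4)])  # 4 x 5 footprint
core_cells = sum(row.count('core') for row in zones)
print(core_cells)  # → 6 (the 2 x 3 interior)
```

A real implementation would also merge perimeter pixels into one zone per facade orientation; this sketch only performs the perimeter/core classification.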

  3. Modeling approaches for the simulation of ultrasonic inspections of anisotropic composite structures in the CIVA software platform

    NASA Astrophysics Data System (ADS)

    Jezzine, Karim; Imperiale, Alexandre; Demaldent, Edouard; Le Bourdais, Florian; Calmon, Pierre; Dominguez, Nicolas

    2018-04-01

Models for the simulation of ultrasonic inspections of flat and curved plate-like composite structures, as well as stiffeners, are available in the CIVA-COMPOSITE module released in 2016. A first modelling approach, using a ray-based model, predicts ultrasonic propagation in an anisotropic effective medium obtained by homogenizing the composite laminate. Fast 3D computations can be performed on configurations featuring delaminations, flat-bottom holes or inclusions, for example. In addition, computations on ply waviness using this model will be available in CIVA 2017. Another approach proposed in the CIVA-COMPOSITE module is based on coupling the CIVA ray-based model with a finite difference scheme in the time domain (FDTD) developed by AIRBUS. The ray model handles the ultrasonic propagation between the transducer and the FDTD computation zone that surrounds the composite part. In this way, computational efficiency is preserved and the ultrasound scattering by the composite structure can be predicted. Alternatively, a high-order finite element approach is currently being developed at CEA but is not yet integrated in CIVA. The advantages of this approach will be discussed and first simulation results on Carbon Fiber Reinforced Polymers (CFRP) will be shown. Finally, the application of these modelling tools to the construction of metamodels is discussed.

  4. Numerical Modeling to Assess DNAPL Movement and Removal at the Scenic Site Operable Unit Near Baton Rouge, Louisiana: A Case Study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oostrom, Mart; Thorne, Paul D.; White, Mark D.

    2003-12-01

Detailed three-dimensional multifluid flow modeling was conducted to assess the movement and removal of dense nonaqueous phase liquid (DNAPL) at a waste site in Louisiana. The site's subsurface consists of several permeable zones separated by (semi)confining clays. In the upper subsurface, the two major permeable zones are, starting with the uppermost, the +40- and +20-MSL (mean sea level) zones. At the site, a total of 23,000 m3 of DNAPL was emplaced in an open waste pit between 1962 and 1974. In this period, considerable amounts of DNAPL moved into the subsurface. By 1974 a portion of the DNAPL was removed and the waste site was filled with low-permeability materials and closed. During this process, some of the DNAPL was mixed with the fill material and remained at the site. Between 1974 and 2000, no additional DNAPL recovery activities were implemented. In an effort to reduce the DNAPL source, organic liquid has been pumped on a timed-pumping scheme from a total of 7 wells starting in calendar year 2000. The recovery wells are screened in the lower part of the waste fill material. In site investigations, DNAPL has been encountered in the +40-MSL zone but not in the +20-MSL zone. The following questions are addressed: (1) Where has the DNAPL migrated vertically and laterally? (2) How much further is DNAPL expected to move in the next century? (3) How effective is the current DNAPL pumping in reducing the DNAPL source? The computational domains for the simulations were derived from 3-D interpolations of borehole logs using geologic interpretation software (Earthvision™). The simulation results show that DNAPL primarily entered the subsurface in the period 1962-1974, when the waste site was operational. After 1974, the infiltration rates dropped dramatically as a result of the infilling of the waste pit. The simulation results indicate that DNAPL moved from the pit into the underlying +40-MSL zone through two contact zones on the west side of the pit. Lateral movement of the DNAPL body has been relatively slow as a result of the high viscosity and the rapidly decreasing driving force after the waste pit was filled in. For all simulations, lateral movement of DNAPL in the period 1962-2001 is predicted to be less than 60 m from the two contact areas, while additional movement in the next century is expected to be less than 30 m. No DNAPL is predicted to enter the +20-MSL zone, which agrees with site information. The simulations also clearly demonstrate the minimal effect of the current pumping scheme on source reduction and DNAPL movement.

  5. Alternative Zoning Scenarios for Regional Sustainable Land Use Controls in China: A Knowledge-Based Multiobjective Optimisation Model

    PubMed Central

    Xia, Yin; Liu, Dianfeng; Liu, Yaolin; He, Jianhua; Hong, Xiaofeng

    2014-01-01

    Alternative land use zoning scenarios provide guidance for sustainable land use controls. This study focused on an ecologically vulnerable catchment on the Loess Plateau in China, proposed a novel land use zoning model, and generated alternative zoning solutions to satisfy the various requirements of land use stakeholders and managers. This model combined multiple zoning objectives, i.e., maximum zoning suitability, maximum planning compatibility and maximum spatial compactness, with land use constraints by using goal programming technique, and employed a modified simulated annealing algorithm to search for the optimal zoning solutions. The land use zoning knowledge was incorporated into the initialisation operator and neighbourhood selection strategy of the simulated annealing algorithm to improve its efficiency. The case study indicates that the model is both effective and robust. Five optimal zoning scenarios of the study area were helpful for satisfying the requirements of land use controls in loess hilly regions, e.g., land use intensification, agricultural protection and environmental conservation. PMID:25170679
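The paper's actual objectives, constraints, and knowledge-based operators are not reproduced in the abstract; the following is a toy simulated-annealing zoning sketch in the same spirit. Two land-use classes are assigned to cells of a small grid to maximize a cell-suitability score plus a compactness bonus for like neighbours. The suitability map, weights, and cooling schedule are all invented for illustration.

```python
import math, random

# Toy simulated annealing for land-use zoning: pick class 0 or 1 per cell
# of a 3x3 grid, maximizing suitability + compactness (invented numbers).
random.seed(0)
SUIT = {0: [[0.9, 0.8, 0.2], [0.7, 0.5, 0.1], [0.3, 0.2, 0.1]],  # class 0
        1: [[0.1, 0.2, 0.8], [0.3, 0.5, 0.9], [0.7, 0.8, 0.9]]}  # class 1

def score(plan):
    s = sum(SUIT[plan[r][c]][r][c] for r in range(3) for c in range(3))
    for r in range(3):           # +0.1 per matching neighbour pair
        for c in range(3):
            if c + 1 < 3 and plan[r][c] == plan[r][c + 1]:
                s += 0.1
            if r + 1 < 3 and plan[r][c] == plan[r + 1][c]:
                s += 0.1
    return s

plan = [[random.randint(0, 1) for _ in range(3)] for _ in range(3)]
temp = 1.0
for step in range(2000):
    r, c = random.randrange(3), random.randrange(3)
    old = plan[r][c]
    before = score(plan)
    plan[r][c] = 1 - old                      # propose a single-cell flip
    delta = score(plan) - before
    if delta < 0 and random.random() >= math.exp(delta / temp):
        plan[r][c] = old                      # reject the worsening move
    temp *= 0.995                             # geometric cooling

print(score(plan))
```

The paper additionally seeds the initial solution and neighbourhood moves with zoning knowledge; here both are purely random.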

  6. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
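The key idea above is applying the equilibrium distribution, Fe(t) = (1/μ) ∫₀ᵗ (1 − F(u)) du, to the fault-detection time distribution F with mean μ. A numeric sketch follows; it uses the well-known fact that the equilibrium distribution of an exponential equals the exponential itself (memorylessness) as a correctness check. The rate parameter is illustrative.

```python
import math

# Numeric equilibrium distribution: Fe(t) = (1/mu) * integral_0^t (1-F(u)) du.
def equilibrium_cdf(survival, mean, t, n=10_000):
    """Trapezoidal approximation of Fe(t) given the survival function 1-F."""
    h = t / n
    total = 0.5 * (survival(0) + survival(t))
    total += sum(survival(i * h) for i in range(1, n))
    return (h * total) / mean

lam = 0.5
F = lambda t: 1 - math.exp(-lam * t)     # exponential CDF
S = lambda t: math.exp(-lam * t)         # survival function 1 - F
mu = 1 / lam                             # mean of the exponential

# For exponential F, the equilibrium distribution coincides with F:
for t in (0.5, 1.0, 3.0):
    assert abs(equilibrium_cdf(S, mu, t) - F(t)) < 1e-6
print("equilibrium of exponential matches exponential")
```

In an NHPP-based SRM one would substitute a non-exponential fault-detection time distribution here; the resulting Fe feeds the mean value function of the process.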

  7. An information model for use in software management estimation and prediction

    NASA Technical Reports Server (NTRS)

    Li, Ningda R.; Zelkowitz, Marvin V.

    1993-01-01

    This paper describes the use of cluster analysis for determining the information model within collected software engineering development data at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool that allows managers to predict development attributes during early phases of a software project and the modifications we propose to allow it to develop dynamic models for better predictions of these attributes.

  8. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
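The abstract names team skill, process maturity, and problem complexity as drivers of suitability. A tiny discrete Bayesian-network sketch of that cause-effect structure follows, with two binary parents for brevity and inference by enumeration. All probabilities are invented; the paper's actual network and conditional probability tables are not given here.

```python
# Two binary parents ("skill", "complexity") drive one binary child
# ("suitable"). CPTs are invented for illustration.
P_skill = {True: 0.7, False: 0.3}        # P(high team skill)
P_complex = {True: 0.4, False: 0.6}      # P(high problem complexity)
P_suit = {(True, True): 0.6, (True, False): 0.9,   # P(suitable | skill, cx)
          (False, True): 0.2, (False, False): 0.5}

def p_suitable():
    """Marginal P(suitable) by enumerating the parents."""
    return sum(P_skill[s] * P_complex[c] * P_suit[(s, c)]
               for s in (True, False) for c in (True, False))

def p_skill_given_suitable():
    """Posterior P(high skill | suitable) via Bayes' rule."""
    joint = sum(P_skill[True] * P_complex[c] * P_suit[(True, c)]
                for c in (True, False))
    return joint / p_suitable()

print(round(p_suitable(), 4))             # → 0.66
print(round(p_skill_given_suitable(), 4)) # → 0.8273
```

Observing a suitable product raises the belief in high team skill from 0.7 to about 0.83, which is the kind of evidence propagation the model above exploits.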

  9. Thermal Modeling of Bridgman Crystal Growth

    NASA Technical Reports Server (NTRS)

    Cothran, E.

    1983-01-01

Heat flow is modeled for a moving or stationary rod-shaped sample inside a directional-solidification furnace. The program models one-dimensional heat flow in a translating or motionless rod-shaped sample inside a directional-solidification furnace in which an adiabatic zone separates the hot and cold zones. It is applicable to systems for which the Biot numbers in the hot and cold zones are less than unity.
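As a hedged illustration of the setup described above (not the NASA program itself), here is a minimal explicit finite-difference sketch for a stationary rod spanning a hot zone, an adiabatic zone, and a cold zone, with lateral heat exchange to the furnace everywhere except the adiabatic zone. All parameters are invented.

```python
# 1-D rod through hot / adiabatic / cold furnace zones, explicit scheme.
# Illustrative parameters only; not the original program's values.
N = 60                       # grid points along the rod
dx = 1.0 / N
alpha = 1e-3                 # thermal diffusivity (arbitrary units)
h = 0.5                      # lateral exchange coefficient (Biot-like)
dt = 0.4 * dx * dx / alpha   # stable explicit step (0.4 < 0.5 limit)

def ambient(x):
    if x < 1/3:  return 1000.0   # hot zone furnace temperature
    if x < 2/3:  return None     # adiabatic zone: no lateral exchange
    return 300.0                 # cold zone furnace temperature

T = [650.0] * (N + 1)
for _ in range(10000):
    Tn = T[:]
    for i in range(1, N):
        cond = alpha * (T[i-1] - 2*T[i] + T[i+1]) / (dx*dx)
        amb = ambient(i * dx)
        lat = 0.0 if amb is None else h * (amb - T[i])
        Tn[i] = T[i] + dt * (cond + lat)
    Tn[0], Tn[N] = Tn[1], Tn[N-1]    # insulated rod ends
    T = Tn

assert T[5] > T[N-5]                 # hot end hotter than cold end
assert 300.0 - 1e-6 <= min(T) and max(T) <= 1000.0 + 1e-6
```

The steady profile plateaus near each furnace temperature and varies almost linearly across the adiabatic zone, which is the gradient-control feature Bridgman furnaces rely on.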

  10. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that products meet NASA requirements for reliability measurement. For the new models of the software component developed over the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability models to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability models, which could then be incorporated in a tool such as SMERFS'3. This tool, with better models, would greatly add value in assessing GSFC projects.

  11. Inhalation exposure to cleaning products: application of a two-zone model.

    PubMed

    Earnest, C Matt; Corsi, Richard L

    2013-01-01

In this study, modifications were made to previously applied two-zone models to address important factors that can affect exposures during cleaning tasks. Specifically, we expand on previous applications of the two-zone model by (1) introducing the source in discrete elements (source-cells) as opposed to a complete instantaneous release, (2) placing source cells in both the inner (near person) and outer zones concurrently, (3) treating each source cell as an independent mixture of multiple constituents, and (4) tracking the time-varying liquid concentration and emission rate of each constituent in each source cell. Three experiments were performed in an environmentally controlled chamber with a thermal mannequin and a simplified pure chemical source to simulate emissions from a cleaning product. Gas phase concentration measurements were taken in the bulk air and in the breathing zone of the mannequin to evaluate the model. The mean ratio of the integrated concentration in the mannequin's breathing zone to the concentration in the outer zone was 4.3 (standard deviation, σ = 1.6). The mean ratio of measured concentration in the breathing zone to predicted concentrations in the inner zone was 0.81 (σ = 0.16). Intake fractions ranged from 1.9 × 10⁻³ to 2.7 × 10⁻³. Model results reasonably predict those of previous exposure monitoring studies and indicate the inadequacy of well-mixed single-zone model applications for some but not all cleaning events.
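The study's source-cell extensions build on the classic two-zone (near-field/far-field) mass balance. A minimal sketch of that baseline model follows, with a constant emission in the near field; all numbers are illustrative, and the source-cell treatment, time-varying emissions, and multi-constituent mixtures of the study are omitted.

```python
# Baseline two-zone (near-field / far-field) mass balance, Euler-integrated.
# Illustrative parameters; not the study's chamber values.
G = 10.0      # emission rate in the near field, mg/min
beta = 0.36   # interzonal airflow between zones, m3/min
Q = 1.0       # room ventilation flow, m3/min
V_N, V_F = 0.5, 29.5   # near- and far-field volumes, m3

C_N = C_F = 0.0        # zone concentrations, mg/m3
dt = 0.01              # Euler time step, min
for _ in range(int(120 / dt)):          # simulate 2 h
    dC_N = (G + beta * (C_F - C_N)) / V_N
    dC_F = (beta * (C_N - C_F) - Q * C_F) / V_F
    C_N += dt * dC_N
    C_F += dt * dC_F

# At steady state C_F -> G/Q and C_N -> G/Q + G/beta.
print(round(C_N, 2), round(C_F, 2))
```

With these invented parameters the near-field concentration ends up roughly four times the far-field one, the same order as the breathing-zone/outer-zone ratios reported above, which is why single-zone well-mixed models can underpredict near-source exposure.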

  12. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects: self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and on what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  13. Students' Different Understandings of Class Diagrams

    ERIC Educational Resources Information Center

    Boustedt, Jonas

    2012-01-01

The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a…

  14. Model-based software process improvement

    NASA Technical Reports Server (NTRS)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  15. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  16. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  17. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed. The goals and principles of software engineering are applied. An introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crisis; the mandate of the Space Station Program; the software life cycle model; software engineering; and Ada under the software engineering umbrella.

  18. Extruded upper first molar intrusion: Comparison between unilateral and bilateral miniscrew anchorage.

    PubMed

    Sugii, Mari Miura; Barreto, Bruno de Castro Ferreira; Francisco Vieira-Júnior, Waldemir; Simone, Katia Regina Izola; Bacchi, Ataís; Caldas, Ricardo Armini

    2018-01-01

The aim of this study was to evaluate the stress on tooth and alveolar bone caused by orthodontic intrusion forces on a supraerupted upper molar, using the three-dimensional finite element method (FEM). A superior maxillary segment was modeled in the software SolidWorks 2010 (SolidWorks Corporation, Waltham, MA, USA) containing cortical and cancellous bone, the supraerupted first molar, periodontal tissue and orthodontic components. A finite element model simulated intrusion forces of 4 N on the tooth, directed toward different mini-screw locations. Three intrusion mechanics vectors were simulated: anchorage on a buccal mini-implant; anchorage on a palatal mini-implant; and the association of both anchorage systems. All analyses considered the minimum principal stress and total deformation. Qualitative analyses exhibited stress distribution by color maps. Quantitative analysis was performed with specific software for reading and solving numerical equations (ANSYS Workbench 14, Ansys, Canonsburg, Pennsylvania, USA). Intrusion forces applied from both sides (buccal and palatal) resulted in a more homogeneous stress distribution; no high stress peak was detected, and a vertical resultant movement was obtained. Buccal or palatal single-sided forces resulted in concentrated stress zones with higher values and tooth tipping toward the respective force side. Unilateral forces promoted higher stress in the root apex and greater dental tipping, whereas bilateral forces promoted better stress distribution without evidence of dental tipping. The bilateral intrusion technique thus suggests a lower probability of root apex resorption.

  19. The Regulation of Growth in the Distal Elongation Zone of Maize Roots

    NASA Technical Reports Server (NTRS)

    Evans, Michael L.

    1998-01-01

    The major goals of the proposed research were 1. To develop specialized software for automated whole surface root expansion analysis and to develop technology for controlled placement of surface electrodes for analysis of relationships between root growth and root pH and electrophysiological properties. 2. To measure surface pH patterns and determine the possible role of proton flux in gravitropic sensing or response, and 3. To determine the role of auxin transport in establishment of patterns of proton flux and electrical gradients during the gravitropic response of roots with special emphasis on the role of the distal elongation zone in the early phases of the gravitropic response.

  20. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

As cancer incidence is growing, cancer registries are of great importance as the main core of cancer control programs, and many different software products have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the relevant documents and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method was chosen to be a criteria-based evaluation, based on the findings. The model encompasses various dimensions of cancer registry software and a proper method for evaluating them. The strong point of this evaluation model is the separation between general and specific criteria while striving for comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  1. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

The speed at which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but a group of scientists holds that it can be solved using well-known mathematical methods (e.g., multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. The first part of the paper summarizes the major achievements in the search for a model of overall project cost estimation, together with a description of existing software development process models. The last part proposes a basic mathematical model for genetic programming, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked to the current reality of software development, taking the software product life cycle as a basis, along with current challenges and innovations in the software development area. Based on the authors' experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
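The abstract does not give the paper's chromosome design or fitness function, so the following is only a toy genetic-algorithm sketch in the same spirit: evolving the (a, b) coefficients of a COCOMO-style power law, effort = a · size^b, against a few invented (KLOC, person-month) points. The PROMISE/COCOMO 81 data are not reproduced.

```python
import random

# Toy GA fitting effort = a * size**b; data points are invented and chosen
# to be consistent with roughly a ≈ 1.76, b ≈ 1.13.
random.seed(1)
DATA = [(10, 24), (50, 150), (100, 330), (200, 720)]  # (KLOC, person-months)

def fitness(ind):
    a, b = ind
    return -sum((a * size ** b - effort) ** 2 for size, effort in DATA)

pop = [(random.uniform(0.5, 5.0), random.uniform(0.8, 1.3)) for _ in range(40)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                     # truncation selection + elitism
    children = []
    while len(children) < 30:
        (a1, b1), (a2, b2) = random.sample(parents, 2)
        a, b = (a1 + a2) / 2, (b1 + b2) / 2          # arithmetic crossover
        if random.random() < 0.3:                    # Gaussian mutation
            a += random.gauss(0, 0.1)
            b += random.gauss(0, 0.02)
        children.append((a, b))
    pop = parents + children

best = max(pop, key=fitness)
print(best, -fitness(best))
```

The paper uses genetic programming (evolving expression trees) rather than this fixed-form coefficient search; the sketch only illustrates the evolutionary fitting loop.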

  2. Validating the Performance of the FHWA Work Zone Model Version 1.0: A Case Study Along I-91 in Springfield, Massachusetts

    DOT National Transportation Integrated Search

    2017-08-01

    Central to the effective design of work zones is being able to understand how drivers behave as they approach and enter a work zone area. States use simulation tools in modeling freeway work zones to predict work zone impacts and to select optimal de...

  3. Hybrid ray-FDTD model for the simulation of the ultrasonic inspection of CFRP parts

    NASA Astrophysics Data System (ADS)

    Jezzine, Karim; Ségur, Damien; Ecault, Romain; Dominguez, Nicolas; Calmon, Pierre

    2017-02-01

    Carbon Fiber Reinforced Polymers (CFRP) are commonly used in structural parts in the aeronautic industry, to reduce the weight of aircraft while maintaining high mechanical performances. Simulation of the ultrasonic inspections of these parts has to face the highly heterogeneous and anisotropic characteristics of these materials. To model the propagation of ultrasound in these composite structures, we propose two complementary approaches. The first one is based on a ray model predicting the propagation of the ultrasound in an anisotropic effective medium obtained from a homogenization of the material. The ray model is designed to deal with possibly curved parts and subsequent continuously varying anisotropic orientations. The second approach is based on the coupling of the ray model, and a finite difference scheme in time domain (FDTD). The ray model handles the ultrasonic propagation between the transducer and the FDTD computation zone that surrounds the composite part. In this way, the computational efficiency is preserved and the ultrasound scattering by the composite structure can be predicted. Inspections of flat or curved composite panels, as well as stiffeners can be performed. The models have been implemented in the CIVA software platform and compared to experiments. We also present an application of the simulation to the performance demonstration of the adaptive inspection technique SAUL (Surface Adaptive Ultrasound).

  4. 26 CFR 1.1400L(b)-1 - Additional first year depreciation deduction for qualified New York Liberty Zone property.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... manner (for example, the election cannot be made through a request under section 446(e) to change the...) and depreciated under section 168; (iii) Computer software as defined in, and depreciated under... consent, the taxpayer must submit a request for a letter ruling. (ii) Automatic 6-month extension. If a...

  5. [Analysis and spatial description to correlative factors on food hygiene appeal and food poison in restaurants of city zone in Qingdao].

    PubMed

    Liu, Ying; Guo, Xin-biao; Li, Hai-rong; Yang, Lin-sheng

    2006-07-01

To study factors affecting food poisoning and food hygiene complaints in restaurants, which constitute hidden risks. Data on food hygiene events from 2002 to 2004 in restaurants of 14 blocks located in the main city zone of Qingdao were collected and studied. The spatial distribution was analysed by means of a Geographic Information System (GIS). Possible factors related to food hygiene events were investigated and analysed with the NCSS statistics software. Information on every block was marked on a digitalized map using the ARCVIEW 3.2a software in order to show clearly the spatial distribution of food hygiene events in different areas over the three years. The results showed that air temperature, humidity and hours of sunlight were important factors in food poisoning. The average number of guests and the floating population were related to the level of sanitation administration, and the level of sanitation administration, geographic location and business status of restaurants were related to their food sanitation status. This study showed that analysing the status of food sanitation across different areas by GIS is effective, simple and clear.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.

This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  7. Agriculture and groundwater nitrate contamination in the Seine basin. The STICS-MODCOU modelling chain.

    PubMed

    Ledoux, E; Gomez, E; Monget, J M; Viavattene, C; Viennot, P; Ducharne, A; Benoit, M; Mignolet, C; Schott, C; Mary, B

    2007-04-01

    A software package is presented here to predict the fate of nitrogen fertilizers and the transport of nitrate from the rooting zone of agricultural areas to surface water and groundwater in the Seine basin, taking into account the long residence times of water and nitrate in the unsaturated and aquifer systems. Information on pedological characteristics, land use and farming practices is used to determine the spatial units to be considered. These data are converted into input data for the crop model STICS which simulates the water and nitrogen balances in the soil-plant system with a daily time-step. A spatial application of STICS has been derived at the catchment scale which computes the water and nitrate fluxes at the bottom of the rooting zone. These fluxes are integrated into a surface and groundwater coupled model MODCOU which calculates the daily water balance in the hydrological system, the flow in the rivers and the piezometric variations in the aquifers, using standard climatic data (rainfall, PET). The transport of nitrate and the evolution of nitrate contamination in groundwater and to rivers is computed by the model NEWSAM. This modelling chain is a valuable tool to predict the evolution of crop productivity and nitrate contamination according to various scenarios modifying farming practices and/or climatic changes. Data for the period 1970-2000 are used to simulate the past evolution of nitrogen contamination. The method has been validated using available data bases of nitrate concentrations in the three main aquifers of the Paris basin (Oligocene, Eocene and chalk). The approach has then been used to predict the future evolution of nitrogen contamination up to 2015. A statistical approach allowed estimating the probability of transgression of different concentration thresholds in various areas in the basin. 
The model is also used to evaluate the cost of the damage, in terms of drinking water treatment, at the scale of a groundwater management unit in the Seine river basin.

  8. Adapting Better Interpolation Methods to Model Amphibious MT Data Along the Cascadian Subduction Zone.

    NASA Astrophysics Data System (ADS)

    Parris, B. A.; Egbert, G. D.; Key, K.; Livelybrooks, D.

    2016-12-01

Magnetotellurics (MT) is an electromagnetic technique used to model the electrical conductivity structure of the Earth's interior. MT data can be analyzed using iterative, linearized inversion techniques to generate models imaging, in particular, the conductive partial melts and aqueous fluids that play critical roles in subduction zone processes and volcanism. For example, the Magnetotelluric Observations of Cascadia using a Huge Array (MOCHA) experiment provides amphibious data useful for imaging subducted fluids from trench to mantle wedge corner. When using MOD3DEM (Egbert et al. 2012), a finite difference inversion package, we have encountered problems inverting seafloor stations in particular, due to the strong nearby conductivity gradients. As a work-around, we have found that denser, finer model grids near the land-sea interface produce better inversions, as characterized by reduced data residuals. This is thought to be partly due to the ability to more accurately capture topography and bathymetry. We are experimenting with improved interpolation schemes that more accurately track EM fields across cell boundaries, with an eye to enhancing the accuracy of the simulated responses and, thus, inversion results. We are adapting how MOD3DEM interpolates EM fields in two ways. The first seeks to improve the weighting functions for interpolants to better enforce current continuity across grid boundaries. Electric fields are interpolated using a tri-linear spline technique, in which the eight nearest electric field estimates are each given weights determined by the technique, a kind of weighted average. We are modifying these weights to include cross-boundary conductivity ratios to better model current continuity.
We are also adapting some of the techniques discussed in Shantsev et al (2014) to enhance the accuracy of the interpolated fields calculated by our forward solver, as well as to better approximate the sensitivities passed to the software's Jacobian that are used to generate a new forward model during each iteration of the inversion.
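The conductivity-ratio weighting described above can be sketched in a few lines. This is an illustrative toy, not the MOD3DEM implementation: the corner ordering, the scaling rule, and the function name are all assumptions made for the example.

```python
import numpy as np

def conductivity_weighted_trilinear(xyz, corner_vals, corner_sigma, sigma_cell):
    """Trilinear interpolation with conductivity-ratio-scaled weights.

    Corners are ordered (0,0,0), (1,0,0), (0,1,0), (1,1,0),
                        (0,0,1), (1,0,1), (0,1,1), (1,1,1)
    in the unit cell.  Each standard trilinear weight is multiplied by the
    ratio of that corner cell's conductivity to the target cell's, then the
    weights are renormalised, biasing the weighted average toward
    neighbours that carry comparable current.  (Hypothetical scheme.)
    """
    x, y, z = xyz
    w = np.array([(1 - x) * (1 - y) * (1 - z), x * (1 - y) * (1 - z),
                  (1 - x) * y * (1 - z),       x * y * (1 - z),
                  (1 - x) * (1 - y) * z,       x * (1 - y) * z,
                  (1 - x) * y * z,             x * y * z])
    w = w * (np.asarray(corner_sigma, dtype=float) / sigma_cell)
    w = w / w.sum()
    return float(np.dot(w, corner_vals))
```

With uniform conductivity the scaling cancels and the function reduces to plain trilinear interpolation; with contrasting conductivities the estimate shifts toward the more conductive neighbours.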

  9. One-dimensional flow model of the river-hyporheic zone system

    NASA Astrophysics Data System (ADS)

    Pokrajac, D.

    2016-12-01

The hyporheic zone is a shallow layer beneath natural streams that is characterized by intense exchange of water, nutrients, pollutants and thermal energy. Understanding these exchange processes is crucial for successful modelling of river hydrodynamics and morphodynamics at various scales, from the river corridor up to the river network scale (Cardenas, 2015). Existing simulation models of hyporheic exchange processes are either idealized models of tracer movement through the river-hyporheic zone system (e.g. TSM, Bencala and Walters, 1983) or detailed models of turbulent flow in a stream coupled with a conventional 2D Darcian groundwater model (e.g. Cardenas and Wilson, 2007). This paper presents an alternative approach: a simple 1-D simulation model of the river-hyporheic zone system based on the classical shallow water equations (SWE) coupled with a newly developed porous-media analogue. This allows the effects of flow unsteadiness and a non-Darcian parameterization of the drag term to be incorporated in the hyporheic zone model. The conceptual model of the stream-hyporheic zone system consists of a 1D model of the open channel flow in the river, coupled with a 1D model of the flow in the hyporheic zone via a volume flux driven by the difference between the water levels in the river and the hyporheic zone. The interaction with the underlying groundwater aquifer is neglected, but coupling the present model with any conventional groundwater model is straightforward. The paper presents the derivation of the 1D equations for flow in the hyporheic zone, the details of the numerical scheme used for solving them, and the model validation by comparison with published experimental data. References: Bencala, K. E., and R. A. Walters (1983) "Simulation of solute transport in a mountain pool-and-riffle stream: a transient storage model", Water Resources Research 19(3): 718-724. Cardenas, M. B. (2015) "Hyporheic zone hydrologic science: A historical account of its emergence and a prospectus", Water Resources Research 51: 3601-3616. Cardenas, M. B., and J. L. Wilson (2007) "Dunes, turbulent eddies, and interfacial exchange with permeable sediments", Water Resources Research 43: W08412.
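The head-difference coupling term at the heart of such a model can be caricatured as follows. This is a zero-dimensional sketch of the exchange flux only, not the paper's full 1-D shallow-water solver; the exchange coefficient `k_ex` and `area_ratio` are invented parameters.

```python
def step_coupled_levels(h_riv, h_hz, dt, k_ex=0.05, area_ratio=1.0):
    """One explicit-Euler step of a river/hyporheic-zone level coupling.

    The exchange flux is driven by the head difference between the two
    layers, q = k_ex * (h_riv - h_hz); positive q moves water from the
    river into the hyporheic zone.  Illustrative toy: the actual model
    solves 1-D SWE in both layers with a non-Darcian drag term.
    """
    q = k_ex * (h_riv - h_hz)
    return h_riv - q * dt, h_hz + q * dt * area_ratio
```

Stepping this repeatedly relaxes the two water levels toward each other, which is the qualitative behaviour the coupling term supplies to the full model.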

  10. Modeling work zone crash frequency by quantifying measurement errors in work zone length.

    PubMed

    Yang, Hong; Ozbay, Kaan; Ozturk, Ozgur; Yildirimoglu, Mehmet

    2013-06-01

Work zones are temporary traffic control zones that can potentially cause safety problems. Maintaining safety while implementing necessary changes on roadways is an important challenge traffic engineers and researchers have to confront. In this study, the risk factors in work zone safety evaluation were identified through the estimation of a crash frequency (CF) model. Measurement errors in the explanatory variables of a CF model can lead to unreliable estimates of certain parameters. Among these, work zone length raises a major concern in this analysis because it may change as the construction schedule progresses, generally without being properly documented. This paper proposes an improved modeling and estimation approach that involves the use of a measurement error (ME) model integrated with the traditional negative binomial (NB) model. The proposed approach was compared with the traditional NB approach. Both models were estimated using a large dataset that consists of 60 work zones in New Jersey. Results showed that the proposed approach outperformed the traditional approach in terms of goodness-of-fit statistics. Moreover, it is shown that use of the traditional NB approach in this context can lead to overestimation of the effect of work zone length on crash occurrence. Copyright © 2013 Elsevier Ltd. All rights reserved.
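The core problem, error in a covariate biasing the estimated coefficient, can be demonstrated with a toy simulation. Everything below is invented for illustration: a Poisson count and least-squares slopes stand in for the paper's negative binomial and measurement-error likelihoods, and classical additive error attenuates the slope, whereas the paper reports overestimation for work zone length (the direction of bias depends on the error structure); either way the naive estimate is unreliable.

```python
import numpy as np

# Toy crash-frequency setting with a mismeasured exposure covariate.
rng = np.random.default_rng(0)
n = 5000
log_len_true = rng.normal(1.0, 0.5, n)                # true log work-zone length
log_len_obs = log_len_true + rng.normal(0.0, 0.5, n)  # recorded with error
y = rng.poisson(np.exp(0.2 + 0.8 * log_len_true))     # crash counts, true slope 0.8

def slope(x, z):
    """Least-squares slope of z on x."""
    return np.cov(x, z)[0, 1] / np.var(x, ddof=1)

b_true = slope(log_len_true, np.log1p(y))  # slope using the true covariate
b_obs = slope(log_len_obs, np.log1p(y))    # biased slope using the noisy covariate
```

Comparing `b_obs` with `b_true` shows the coefficient on the mismeasured length differing systematically from the one estimated on the true length, which is the motivation for the integrated ME/NB model.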

  11. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  12. Knowledge-based approach for generating target system specifications from a domain model

    NASA Technical Reports Server (NTRS)

    Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan

    1992-01-01

    Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.

  13. Software cost/resource modeling: Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. J.

    1980-01-01

    A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.

  14. A fingerprint of the epileptogenic zone in human epilepsies.

    PubMed

    Grinenko, Olesya; Li, Jian; Mosher, John C; Wang, Irene Z; Bulacio, Juan C; Gonzalez-Martinez, Jorge; Nair, Dileep; Najm, Imad; Leahy, Richard M; Chauvel, Patrick

    2018-01-01

Defining a bio-electrical marker for the brain area responsible for initiating a seizure remains an unsolved problem. Fast gamma activity has been identified as the most specific marker for seizure onset, but conflicting results have been reported. In this study, we describe an alternative marker, based on an objective description of the interictal to ictal transition, with the aim of identifying a time-frequency pattern or 'fingerprint' that can differentiate the epileptogenic zone from areas of propagation. Seventeen patients who underwent stereoelectroencephalography were included in the study. Each had seizure onset characterized by sustained gamma activity and was seizure-free after tailored resection or laser ablation. We postulated that the epileptogenic zone was always located inside the resection region, based on seizure freedom following surgery. To characterize the ictal frequency pattern, we applied the Morlet wavelet transform to data from each pair of adjacent intracerebral electrode contacts. Based on a visual assessment of the time-frequency plots, we hypothesized that a specific time-frequency pattern in the epileptogenic zone should include a combination of (i) sharp transients or spikes, preceding (ii) multiband fast activity, concurrent with (iii) suppression of lower frequencies. To test this hypothesis, we developed software that automatically extracted each of these features from the time-frequency data. We then used a support vector machine to classify each contact-pair as being within the epileptogenic zone or not, based on these features. Our machine learning system identified this pattern in 15 of 17 patients. The total number of identified contacts across all patients was 64, with 58 localized inside the resected area. Subsequent quantitative analysis showed strong correlation between the maximum frequency of fast activity and suppression inside the resection but not outside.
We did not observe significant discrimination power using only the maximum frequency or the timing of fast activity to differentiate contacts, either between resected and non-resected regions or between contacts identified as epileptogenic versus non-epileptogenic. Instead of identifying a single frequency or a single timing trait, we observed the more complex pattern described above that distinguishes the epileptogenic zone. This pattern encompasses the interictal to ictal transition and may extend until seizure end. Its time-frequency characteristics can be explained in light of recent models emphasizing the role of fast inhibitory interneurons acting on pyramidal cells as a prominent mechanism in seizure triggering. The pattern clearly differentiates the epileptogenic zone from areas of propagation and, as such, represents an epileptogenic zone 'fingerprint'. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain.
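Two of the three fingerprint features (fast-activity increase and low-frequency suppression at onset) can be sketched with simple band-power ratios. This is a stand-in only: plain FFT band power replaces the paper's Morlet transform, and the band edges, threshold, and rule are invented here in place of the trained support vector machine.

```python
import numpy as np

def band_power(sig, fs, lo, hi):
    """Mean FFT power of sig in the [lo, hi) Hz band."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    f = np.fft.rfftfreq(len(sig), 1.0 / fs)
    return spec[(f >= lo) & (f < hi)].mean()

def is_epileptogenic(pre, ictal, fs=500.0, thresh=3.0):
    """Hypothetical threshold rule on two fingerprint-like features:
    multiband fast-activity gain and low-frequency suppression between a
    pre-ictal and an ictal segment from one contact pair."""
    fast_gain = band_power(ictal, fs, 60, 150) / band_power(pre, fs, 60, 150)
    low_supp = band_power(pre, fs, 1, 12) / band_power(ictal, fs, 1, 12)
    return fast_gain > thresh and low_supp > thresh
```

A contact showing both a large fast-activity gain and strong low-frequency suppression would be flagged; a propagation contact whose spectrum barely changes would not.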

  15. A fingerprint of the epileptogenic zone in human epilepsies

    PubMed Central

    Grinenko, Olesya; Li, Jian; Mosher, John C; Wang, Irene Z; Bulacio, Juan C; Gonzalez-Martinez, Jorge; Nair, Dileep; Najm, Imad; Leahy, Richard M; Chauvel, Patrick

    2018-01-01

Abstract Defining a bio-electrical marker for the brain area responsible for initiating a seizure remains an unsolved problem. Fast gamma activity has been identified as the most specific marker for seizure onset, but conflicting results have been reported. In this study, we describe an alternative marker, based on an objective description of the interictal to ictal transition, with the aim of identifying a time-frequency pattern or ‘fingerprint’ that can differentiate the epileptogenic zone from areas of propagation. Seventeen patients who underwent stereoelectroencephalography were included in the study. Each had seizure onset characterized by sustained gamma activity and was seizure-free after tailored resection or laser ablation. We postulated that the epileptogenic zone was always located inside the resection region, based on seizure freedom following surgery. To characterize the ictal frequency pattern, we applied the Morlet wavelet transform to data from each pair of adjacent intracerebral electrode contacts. Based on a visual assessment of the time-frequency plots, we hypothesized that a specific time-frequency pattern in the epileptogenic zone should include a combination of (i) sharp transients or spikes, preceding (ii) multiband fast activity, concurrent with (iii) suppression of lower frequencies. To test this hypothesis, we developed software that automatically extracted each of these features from the time-frequency data. We then used a support vector machine to classify each contact-pair as being within the epileptogenic zone or not, based on these features. Our machine learning system identified this pattern in 15 of 17 patients. The total number of identified contacts across all patients was 64, with 58 localized inside the resected area. Subsequent quantitative analysis showed strong correlation between the maximum frequency of fast activity and suppression inside the resection but not outside.
We did not observe significant discrimination power using only the maximum frequency or the timing of fast activity to differentiate contacts either between resected and non-resected regions or between contacts identified as epileptogenic versus non-epileptogenic. Instead of identifying a single frequency or a single timing trait, we observed the more complex pattern described above that distinguishes the epileptogenic zone. This pattern encompasses interictal to ictal transition and may extend until seizure end. Its time-frequency characteristics can be explained in light of recent models emphasizing the role of fast inhibitory interneurons acting on pyramidal cells as a prominent mechanism in seizure triggering. The pattern clearly differentiates the epileptogenic zone from areas of propagation and, as such, represents an epileptogenic zone ‘fingerprint’. PMID:29253102

  16. Using a coupled groundwater/surfacewater model to predict climate-change impacts to lakes in the Trout Lake watershed, Northern Wisconsin

    USGS Publications Warehouse

    Walker, John F.; Hunt, Randall J.; Markstrom, Steven L.; Hay, Lauren E.; Doherty, John

    2009-01-01

A major focus of the U.S. Geological Survey's Trout Lake Water, Energy, and Biogeochemical Budgets (WEBB) project is the development of a watershed model to allow predictions of hydrologic response to future conditions, including land-use and climate change. The coupled groundwater/surface-water model GSFLOW was chosen for this purpose because it could easily incorporate an existing groundwater flow model and it provides for simulation of surface-water processes. The Trout Lake watershed in northern Wisconsin is underlain by a highly conductive outwash sand aquifer. In this area, streamflow is dominated by groundwater contributions; however, surface runoff occurs during intense rainfall periods and spring snowmelt. Surface runoff also occurs locally near stream/lake areas where the unsaturated zone is thin. A diverse data set, collected from 1992 to 2007 for the Trout Lake WEBB project and the co-located, NSF-funded North Temperate Lakes LTER project, includes snowpack, solar radiation, potential evapotranspiration, lake levels, groundwater levels, and streamflow. The time-series processing software TSPROC (Doherty 2003) was used to distill the large time-series data set into a smaller set of observations and summary statistics that captured the salient hydrologic information, reducing hundreds of thousands of observations to fewer than 5,000. Model calibration included specific predictions for several lakes in the study area using the PEST parameter estimation suite of software (Doherty 2007). The calibrated model was used to simulate the hydrologic response of the study lakes to a variety of climate change scenarios culled from the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (Solomon et al. 2007). Results from the simulations indicate climate change could result in substantial changes to the lake levels and components of the hydrologic budget of a seepage lake in the flow system.
For a drainage lake lower in the flow system, the impacts of climate change are diminished. 
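The time-series reduction step can be illustrated with a minimal sketch: collapsing a long observation record into a handful of calibration targets. This is a greatly simplified stand-in for what TSPROC produces for PEST; the grouping key, statistics, and function name are arbitrary choices for the example.

```python
import numpy as np

def reduce_series(months, values, quantiles=(0.1, 0.5, 0.9)):
    """Distill a long observation series into a few calibration targets:
    per-month means plus duration-curve quantiles.  Hypothetical
    reduction; real TSPROC configurations define many other statistics."""
    months = np.asarray(months)
    vals = np.asarray(values, dtype=float)
    monthly_means = {int(m): vals[months == m].mean() for m in np.unique(months)}
    return monthly_means, np.quantile(vals, quantiles)
```

Feeding PEST a few thousand such summary targets instead of hundreds of thousands of raw observations keeps the calibration problem tractable while preserving the salient hydrologic signal.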

  17. Slope failures evaluation and landslides investigation using 2-D resistivity method

    NASA Astrophysics Data System (ADS)

    Nordiana, M. M.; Azwin, I. N.; Nawawi, M. N. M.; Khalil, A. E.

    2018-06-01

Slope failure is a complex phenomenon that may lead to landslides. Buildings and infrastructure such as transportation facilities and pipelines located within the boundaries of a landslide can be damaged or destroyed. Slope failure classification and the various factors contributing to instability, investigated using a 2-D resistivity survey conducted in Selangor, Malaysia, are described. Six 2-D resistivity survey lines with 5 m minimum electrode spacing, using a pole-dipole array, were measured. The data were processed using the Res2Dinv and Surfer 10 software to evaluate the subsurface characteristics. The 2-D resistivity results show that the subsurface consists of two main zones. The first zone is alluvium or highly weathered material with resistivity values of 100-1000 Ω m and depths of >30 m; it contains saturated areas with resistivity values of 1-100 Ω m and boulders with resistivity values of 1200-7000 Ω m. The second zone, with resistivity values of >7000 Ω m, was interpreted as granitic bedrock. The study area is thus characterized by saturated zones, a highly weathered zone, and a high content of sand and boulders, all of which can trigger slope failure in the survey area through low soil strength, debris flow and earth movement. On the basis of the case examples described, the 2-D resistivity method proves a desirable and useful method for evaluating slope failure and for future assessments.
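The interpretation step above amounts to mapping resistivity values onto subsurface classes. The sketch below encodes the value ranges reported for this survey; note these thresholds are site-specific results, not universal constants, and readings in the 1000-1200 Ω m gap between the reported ranges are left unclassified.

```python
def interpret_resistivity(rho_ohm_m):
    """Classify a resistivity reading (ohm-m) using the ranges reported
    in this survey's results (site-specific, illustrative only)."""
    if rho_ohm_m > 7000:
        return "granitic bedrock"
    if 1200 <= rho_ohm_m <= 7000:
        return "boulders"
    if 100 <= rho_ohm_m <= 1000:
        return "alluvium / highly weathered"
    if 1 <= rho_ohm_m < 100:
        return "saturated zone"
    return "unclassified"
```

Applying such a lookup cell-by-cell to an inverted Res2Dinv section yields the zoned interpretation described in the abstract.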

  18. Climate change impact on groundwater levels in the Guarani Aquifer outcrop zone

    NASA Astrophysics Data System (ADS)

    Melo, D. D.; Wendland, E.

    2013-12-01

The unsustainable use of groundwater in many countries may cause water availability restrictions in the future. This issue is likely to worsen due to the climate changes predicted for the coming decades: as numerous studies suggest, aquifer recharge rates will be affected by climate change. The Guarani Aquifer System (GAS) is one of the most important transboundary aquifers in the world, providing drinking water for millions of people in four South American countries (Brazil, Argentina, Uruguay and Paraguay). Considering the relevance of the GAS and how its recharge rates might be altered by climatic anomalies, the objective of this work is to assess possible climate change impacts on groundwater levels in this aquifer's outcrop zone. Global Climate Model (GCM) outputs were used as inputs to a transient groundwater flow model created using the software SPA (Simulation of Processes in Aquifers), enabling groundwater table fluctuations to be evaluated under distinct climatic scenarios. Six monitoring wells, located in a representative basin (Ribeirão da Onça basin) inside a GAS outcrop zone (ROB), provided water table measurements between 2004 and 2011 to calibrate the groundwater model. Using observed climatic data, a water budget method was applied to estimate recharge under different types of land use. Statistically downscaled future climate scenarios were used as inputs to that same recharge model, which provided data for running SPA under those scenarios. The results show that most of the GCMs used here predict temperature rises of over 2 °C (275.15 K) and that the major changes in monthly rainfall means take place in the dry season. During wet seasons, those means might decrease by around 50%. The transient model results indicate that water table levels derived from around 70% of the climate scenarios would fall below those measured between 2004 and 2011.
Among the thirteen GCMs considered in this work, only four predicted more extreme climate scenarios. In some regions of the study area, under these extreme conditions, the groundwater surface would decline by more than 10 m. Although the more optimistic scenarios resulted in an increase of groundwater levels in more than half of the ROB, even these would cause up to 5 m of water table decline. The results reinforce the need for permanent hydrogeological monitoring, mainly in the GAS recharge areas, along with the development of further climate change impact assessments using different downscaling and recharge estimation methods.
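The water-budget recharge estimate mentioned above can be written in its generic form. The abstract only names "a water budget method", so the exact terms, units, and land-use adjustments below are assumptions for illustration.

```python
def monthly_recharge(precip, evapotrans, runoff, storage_change):
    """Generic water-budget recharge estimate (all terms in mm/month):
    R = P - ET - Q - dS, clipped at zero.  A common textbook form,
    assumed here; the study's actual budget terms are not given in the
    abstract."""
    return max(precip - evapotrans - runoff - storage_change, 0.0)
```

Driving such a budget with downscaled GCM precipitation and evapotranspiration series yields the recharge inputs that a transient flow model like SPA would then propagate into water-table fluctuations.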

  19. The degrees to which transtrochanteric rotational osteotomy moves the region of osteonecrotic femoral head out of the weight-bearing area as evaluated by computer simulation.

    PubMed

    Chen, Weng-Pin; Tai, Ching-Lung; Tan, Chih-Feng; Shih, Chun-Hsiung; Hou, Shun-Hsin; Lee, Mel S

    2005-01-01

Transtrochanteric rotational osteotomy is a technically demanding procedure. Currently, pre-operative planning of transtrochanteric rotational osteotomy is mostly based on X-ray images, so surgeons need to reconstruct the three-dimensional structure of the femoral head and the necrosis in their minds. This study develops a simulation platform using computer models based on computed tomography images of the femoral head to evaluate the degree to which transtrochanteric rotational osteotomy moves the necrotic region of the femoral head out of the weight-bearing area under stance and gait-cycle conditions. With this simulation procedure, surgeons would be better informed before surgery and the indication can be carefully assessed. A case with osteonecrosis involving 15% of the femoral head was recruited. Virtual models with the same size of lesion but at different locations were devised. Computer models were created using SolidWorks 2000 CAD software. The area ratio of the weight-bearing zone occupied by the necrotic lesion under two conditions, stance and gait cycle, was measured after surgery simulations. For the specific case and virtual models devised in this study, computer simulation showed two findings: (1) fewer degrees of rotation were needed to move the necrosis out of the weight-bearing zone in stance by anterior rotational osteotomy than by posterior rotational osteotomy; however, the necrotic region would still overlap with the weight-bearing area during the gait cycle; (2) because the degrees allowed for posterior rotation are less restricted than for anterior rotation, posterior rotational osteotomies were often more effective at moving the necrotic region out of the weight-bearing area during the gait cycle. The computer simulation platform, by registering actual CT images, is a useful tool to assess the direction and degrees of rotation needed for transtrochanteric rotational osteotomy.
Although the results indicated that anterior rotational osteotomy was more effective at moving the necrosis out of the weight-bearing zone in stance for the models devised in this study, in circumstances where the necrotic region is located elsewhere, and considering that the limitation of anterior rotation carries an inherent risk of vascular compromise, it may be more beneficial to perform posterior rotational osteotomy when taking the gait cycle into account.
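The quantity being measured, how much of the weight-bearing zone the lesion still occupies after a given rotation, can be sketched in one dimension. This angular toy (no wraparound handled) stands in for the paper's 3-D CT-based simulation; the arc representation and function name are assumptions for the example.

```python
def overlap_after_rotation(lesion_arc, wb_arc, rot_deg):
    """Fraction of the weight-bearing arc occupied by the necrotic arc
    after rotating the femoral head by rot_deg.  Arcs are
    (start_deg, end_deg) intervals; a 1-D caricature of the 3-D
    area-ratio measurement used in the study."""
    ls, le = lesion_arc[0] + rot_deg, lesion_arc[1] + rot_deg
    ws, we = wb_arc
    overlap = max(0.0, min(le, we) - max(ls, ws))
    return overlap / (we - ws)
```

Sweeping `rot_deg` over the anatomically permissible range for anterior versus posterior rotation and comparing the residual overlap is the essence of the pre-operative comparison the platform performs.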

  20. Proposed method for hazard mapping of landslide propagation zone

    NASA Astrophysics Data System (ADS)

    Serbulea, Manole-Stelian; Gogu, Radu; Manoli, Daniel-Marcel; Gaitanaru, Dragos Stefan; Priceputu, Adrian; Andronic, Adrian; Anghel, Alexandra; Liviu Bugea, Adrian; Ungureanu, Constantin; Niculescu, Alexandru

    2013-04-01

Sustainable development of communities situated in areas with landslide potential requires a full understanding of the mechanisms that govern both the triggering of the phenomenon and the propagation of the sliding mass, which can have catastrophic consequences for nearby inhabitants and the environment. Modern analysis methods for areas affected by the movement of soil bodies are presented in this work, as well as a new procedure to assess landslide hazard. Classical soil mechanics offers sufficient numerical models to assess the landslide triggering zone, such as limit equilibrium methods (Fellenius, Janbu, Morgenstern-Price, Bishop, Spencer etc.), block models or progressive mobilization models, and the Lagrange-based finite element method. The computational methods for assessing the propagation zones are quite recent and have high computational requirements, and thus have not been used widely enough in practice to confirm their feasibility. The proposed procedure aims to assess not only the landslide hazard factor but also the affected areas, by means of simple mathematical operations. The method can easily be employed in GIS software without requiring engineering training. The result is obtained by computing the first and second derivatives of the digital terrain model (slope and curvature maps). Using the curvature maps, it is shown that one can assess the areas most likely to be affected by the propagation of the sliding masses. The procedure is first applied to a simple theoretical model and then used on a representative section of a high-exposure area in Romania. The method is described by comparison with the Romanian legislation for risk and vulnerability assessment, which specifies that landslide hazard is to be assessed using an average hazard factor Km obtained from various other factors.
In the presented example, it is observed that using the Km factor leads to an inconsistent distribution of the polygonal surfaces corresponding to different landslide potentials. For small values of Km (0.00 to 0.10), the polygonal surfaces have reduced dimensions along the slopes belonging to the main rivers. This can be corrected by including in the analysis the potential areas to be affected by soil instability. Finally, it is shown that the proposed procedure can be used to better assess these areas and to produce more reliable landslide hazard maps. This work was supported by a grant of the Romanian National Authority for Scientific Research, Program for research - Space Technology and Advanced Research - STAR, project number 30/2012.
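The first- and second-derivative maps at the core of the procedure are straightforward to compute on a gridded terrain model. The finite differences below are a generic choice (the abstract does not specify the authors' exact stencils), with a simple Laplacian standing in for the curvature map.

```python
import numpy as np

def slope_and_curvature(dem, cell=1.0):
    """Slope magnitude (first derivative) and Laplacian curvature
    (second derivative) of a gridded digital terrain model.  Generic
    central differences via numpy.gradient; illustrative only."""
    dzdy, dzdx = np.gradient(dem, cell)
    slope = np.hypot(dzdx, dzdy)
    curvature = np.gradient(dzdx, cell, axis=1) + np.gradient(dzdy, cell, axis=0)
    return slope, curvature
```

Concave (negative-curvature) cells downslope of steep terrain are then the candidates for propagation zones, which is the map-algebra step the procedure adds on top of conventional hazard factors.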

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, D. I.; Han, S. H.

A PSA analyst has traditionally determined fire-induced component failure modes manually and modeled them into the PSA logic. These can be difficult and time-consuming tasks, as they require much information and many events to be modeled. KAERI has been developing the IPRO-ZONE (interface program for constructing the zone effect table) to facilitate the fire PSA work of identifying and modeling fire-induced component failure modes, and to construct a one-top fire event PSA model. With the output of the IPRO-ZONE, the AIMS-PSA, and an internal event one-top PSA model, a one-top fire events PSA model is automatically constructed. The outputs of the IPRO-ZONE include information on fire zones/fire scenarios, fire propagation areas, equipment failure modes affected by a fire, internal PSA basic events corresponding to fire-induced equipment failure modes, and fire events to be modeled. This paper introduces the IPRO-ZONE and its application to the fire PSA of Ulchin Unit 3 and SMART (System-integrated Modular Advanced Reactor). (authors)

  2. Software Cost-Estimation Model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1985-01-01

    Software Cost Estimation Model SOFTCOST provides automated resource and schedule model for software development. Combines several cost models found in open literature into one comprehensive set of algorithms. Compensates for nearly fifty implementation factors relative to size of task, inherited baseline, organizational and system environment and difficulty of task.
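Parametric cost models of this family share a common shape: a size-driven power law scaled by multiplicative adjustment factors. The sketch below is a COCOMO-style illustration of that shape only; the coefficients and the driver set are placeholders, not SOFTCOST's actual calibrated values or its roughly fifty implementation factors.

```python
def parametric_effort(ksloc, multipliers, a=2.8, b=1.1):
    """Generic parametric software cost model:
    effort (person-months) = a * KSLOC**b * product of cost-driver
    multipliers.  a, b and the multipliers here are illustrative
    placeholders for a model's calibrated values."""
    m = 1.0
    for f in multipliers:
        m *= f
    return a * ksloc ** b * m
```

Each prompted question in a model like SOFTCOST effectively selects one multiplier; answers indicating a harder task or weaker environment push the product, and hence the estimate, upward.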

  3. Software Program: Software Management Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this NASA Software Management Guidebook is twofold. First, this document defines the core products and activities required of NASA software projects. It defines life-cycle models and activity-related methods but acknowledges that no single life-cycle model is appropriate for all NASA software projects. It also acknowledges that the appropriate method for accomplishing a required activity depends on characteristics of the software project. Second, this guidebook provides specific guidance to software project managers and team leaders in selecting appropriate life cycles and methods to develop a tailored plan for a software engineering project.

  4. Enhanced Visualization of Subtle Outer Retinal Pathology by En Face Optical Coherence Tomography and Correlation with Multi-Modal Imaging

    PubMed Central

    Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John

    2016-01-01

Purpose To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine their correlation with other imaging modalities. Methods En face OCT images derived from high-density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer; Heidelberg Engineering) were compared and correlated with near-infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO) and microperimetry. Results Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch’s membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using the custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between the custom software and Heidelberg Eye Explorer in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions The graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968

  5. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  6. Software Metrics

    DTIC Science & Technology

    1988-12-01

The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate. Among the estimation models discussed are the SPQR model (Jones) and COPMO (Thebaut). Time T (in seconds) is simply derived from effort E by dividing by the Stroud number S: T = E/S. T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model; the basic approach is similar to that of Boehm's.

  7. Studying the Accuracy of Software Process Elicitation: The User Articulated Model

    ERIC Educational Resources Information Center

    Crabtree, Carlton A.

    2010-01-01

    Process models are often the basis for demonstrating improvement and compliance in software engineering organizations. A descriptive model is a type of process model describing the human activities in software development that actually occur. The purpose of a descriptive model is to provide a documented baseline for further process improvement…

  8. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  9. A conceptual model for megaprogramming

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

Megaprogramming is component-based software engineering and life-cycle management. Megaprogramming and its relationship to other research initiatives (common prototyping system/common prototyping language, domain specific software architectures, and software understanding) are analyzed. The desirable attributes of megaprogramming software components are identified, and a software development model and resulting prototype megaprogramming system (library interconnection language extended by annotated Ada) are described.

  10. Deep space network software cost estimation model

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1981-01-01

    A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
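A parametric model of this general shape (a nominal effort estimate scaled by questionnaire-derived cost drivers) can be sketched as follows; the coefficients, driver names and values are hypothetical placeholders, not the calibrated DSN parameters.

```python
# Illustrative sketch of a parametric cost model of the kind described
# above: task magnitude plus environment/technology factors (elicited by
# prompted questionnaire responses) scale a nominal effort estimate.
# The coefficients a, b and the driver names are hypothetical.

def parametric_effort(ksloc, multipliers, a=2.8, b=1.1):
    """Nominal effort a * size^b (person-months), scaled by the product
    of cost-driver multipliers derived from questionnaire answers."""
    effort = a * ksloc ** b
    for m in multipliers.values():
        effort *= m
    return effort

drivers = {"difficulty": 1.15, "environment": 0.95, "technology": 1.05}
print(round(parametric_effort(32.0, drivers), 1))
```

The resulting effort figure is what would then be spread over a Work Breakdown Structure skeleton for scheduling.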

  11. Consistent Evolution of Software Artifacts and Non-Functional Models

    DTIC Science & Technology

    2014-11-14

Can the evolution of software artifacts induce bad software performance? Subject terms: EOARD, Nano particles, Photo-Acoustic Sensors, Model-Driven Engineering (MDE), Software Performance Engineering (SPE), Change Propagation, Performance Antipatterns. Performed at the Università degli Studi dell'Aquila, Via Vetoio, 67100 L'Aquila, Italy (contact: vittorio.cortellessa@univaq.it, http://www.di.univaq.it/cortelle/).

  12. An analytical model for non-conservative pollutants mixing in the surf zone.

    PubMed

    Ki, Seo Jin; Hwang, Jin Hwan; Kang, Joo-Hyon; Kim, Joon Ha

    2009-01-01

Accurate simulation of the surf zone is a prerequisite to improving beach management as well as to understanding the fundamentals of the fate and transport of contaminants. In the present study, a diagnostic model modified from a classic solute model is provided to illuminate the behavior of non-conservative pollutants in the surf zone. To readily understand the controlling processes in the surf zone, a new dimensionless quantity, the kappa number (K, the ratio of the inactivation rate to the transport rate of a microbial pollutant in the surf zone), is employed and evaluated under different environmental frames during a week-long simulation period. The sensitivity analysis showed that hydrodynamics and concentration gradients in the surf zone mostly depend on n (the number of rip currents), indicating that n should be carefully adjusted in the model. The simulation results further reveal that large deviation typically occurs in the daytime, signifying that inactivation of fecal indicator bacteria is the main process controlling surf zone water quality during the day. Overall, the analytical model shows good agreement between predicted and synthetic data (R(2) = 0.51 and 0.67 for FC and ENT, respectively) for the simulated period, supporting its potential use in surf zone modelling. It is recommended that when the dimensionless index is much larger than 0.5, the present modified model predicts better than the conventional model, but when the index is smaller than 0.5, the conventional model is more efficient with respect to time and cost.
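The kappa number and the 0.5 decision threshold described in the abstract can be sketched as a small helper; the variable names are ours and the rate values are illustrative.

```python
# Sketch of the dimensionless kappa number described above: the ratio of
# a pollutant's inactivation rate to its transport rate in the surf zone.
# The 0.5 threshold for preferring the modified model over the
# conventional one follows the abstract; names and values are ours.

def kappa_number(inactivation_rate, transport_rate):
    """K = inactivation rate / transport rate (both in 1/time units)."""
    return inactivation_rate / transport_rate

def preferred_model(K, threshold=0.5):
    """Per the abstract: modified model when K exceeds the threshold."""
    return "modified" if K > threshold else "conventional"

K = kappa_number(inactivation_rate=1.2e-4, transport_rate=1.0e-4)
print(K, preferred_model(K))
```

When inactivation dominates transport (large K), daytime die-off drives water quality, which is where the modified model earns its keep.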

  13. A finite element analysis of the vibrational behaviour of the intra-operatively manufactured prosthesis-femur system.

    PubMed

    Pastrav, L C; Devos, J; Van der Perre, G; Jaecques, S V N

    2009-05-01

In total hip replacement (THR), a good initial stability of the prosthetic stem in the femur, which corresponds to good overall initial contact, helps assure a good long-term result. During insertion the implant stability increases and, as a consequence, the resonance frequencies increase, allowing assessment of the implant fixation by vibration analysis. The influence of changing contact conditions on the resonance frequencies was, however, not yet quantitatively understood, and therefore a finite element analysis (FEA) was set up. Modal analyses of the hip stem-femur system were performed for various contact situations. By modelling the contact changes by means of the contact tolerance options in the finite element software, contact could be varied over the entire hip stem surface or only in specific zones (proximal, central, distal) while keeping other system parameters constant. The results are in agreement with previous observations: contact increase causes positive resonance frequency shifts, and the dynamic behaviour is most influenced by contact changes in the proximal zone. Although the finite element analysis did not establish a monotonic relationship between the vibrational mode number and the magnitude of the resonance frequency shift, in general the higher modes are more sensitive to contact change.

  14. Coupling the Multizone Airflow and Contaminant Transport Software CONTAM with EnergyPlus Using Co-Simulation.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-08-01

    Building modelers need simulation tools capable of simultaneously considering building energy use, airflow and indoor air quality (IAQ) to design and evaluate the ability of buildings and their systems to meet today's demanding energy efficiency and IAQ performance requirements. CONTAM is a widely-used multizone building airflow and contaminant transport simulation tool that requires indoor temperatures as input values. EnergyPlus is a prominent whole-building energy simulation program capable of performing heat transfer calculations that require interzone and infiltration airflows as input values. On their own, each tool is limited in its ability to account for thermal processes upon which building airflow may be significantly dependent and vice versa. This paper describes the initial phase of coupling of CONTAM with EnergyPlus to capture the interdependencies between airflow and heat transfer using co-simulation that allows for sharing of data between independently executing simulation tools. The coupling is accomplished based on the Functional Mock-up Interface (FMI) for Co-simulation specification that provides for integration between independently developed tools. A three-zone combined heat transfer/airflow analytical BESTEST case was simulated to verify the co-simulation is functioning as expected, and an investigation of a two-zone, natural ventilation case designed to challenge the coupled thermal/airflow solution methods was performed.
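The co-simulation coupling described above amounts to a loop that alternates data exchange between the two solvers at each synchronization step. A minimal sketch with toy stand-ins for both tools (the real coupling exchanges these quantities through the FMI for Co-simulation API, not these hypothetical functions):

```python
# Minimal sketch of a co-simulation exchange loop of the kind described
# above: at each synchronization step the thermal solver consumes the
# latest interzone airflow and produces a zone temperature, which the
# airflow solver then consumes. Both "solvers" here are toy stand-ins
# for the EnergyPlus and CONTAM roles; their formulas are invented.

def thermal_step(airflow_m3s):
    # Toy model: more infiltration pulls the zone toward 10 C outdoors.
    return 20.0 - 5.0 * airflow_m3s

def airflow_step(zone_temp_c):
    # Toy model: a larger indoor-outdoor temperature difference drives
    # more stack-effect airflow.
    return 0.01 * (zone_temp_c - 10.0)

temps, flows = 20.0, 0.0
for step in range(10):              # fixed synchronization steps
    temps = thermal_step(flows)     # EnergyPlus role: needs airflows as input
    flows = airflow_step(temps)     # CONTAM role: needs temperatures as input
print(round(temps, 3), round(flows, 4))
```

The loop converges to a mutually consistent temperature/airflow pair, which is exactly the interdependency neither tool can capture on its own.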

  15. Lithology and characteristic of landslide in Gombel Hill by 2D geoelectric resistivity method using dipole-dipole configuration

    NASA Astrophysics Data System (ADS)

    Setyawan, Agus; Satria Fikri, Muhammad; Endro Suseno, Jatmiko; Fuad, Muhamad

    2018-05-01

Gombel hill is located in Semarang, Central Java, Indonesia. Based on Semarang's susceptibility zone map, Gombel hill belongs to a zone of high susceptibility and instability. This instability may cause faulting in the Gombel hill area; unfortunately, geoscience research on Gombel is still lacking. A geophysical survey was conducted using the 2D geoelectric resistivity method with a dipole-dipole configuration to identify the lithology of the landslide at Gombel hill. Data were collected along three lines: the first and third lines are 100 m long and the second line is 80 m long, with 5 m spacing on each line. The data were processed and modelled using the Res2Dinv software. The first line suggests two layers forming the subsurface structure, the second line suggests three layers, and the last line suggests two layers. Overall, the landslide zone of the Gombel hill area is found at a depth of 5 m - 6 m, at the contact between the clay and claystone layers. We expect the results can be used for hazard mitigation and for planning infrastructure development in the Gombel area.

  16. GPS deformation rates in the Bajo Segura Basin (NE of the Eastern Betic Shear Zone, SE Spain)

    NASA Astrophysics Data System (ADS)

    Jesús Borque, María; Sánchez-Alzola, Alberto; Estévez, Antonio; García-Tortosa, Francisco J.; Martín-Rojas, Iván; Molina, Sergio; Alfaro, Pedro; Rodríguez-Caderot, Gracia; de Lacy, Clara; García-Armenteros, Juan Antonio; Avilés, Manuel; Herrera, Antonio; Rosa-Cintas, Sergio; Gil, Antonio J.

    2014-05-01

The Bajo Segura Basin, located at the NE end of the Eastern Betic Shear Zone, is one of the areas with the highest seismic activity in the Iberian Peninsula. It is bounded by the Crevillente Fault to the north and the Bajo Segura Fault to the south, and it is characterized by a Late Miocene to Quaternary folded cover. We estimate the present-day deformation of the study area from a GPS network with 11 sites. Observation campaigns were carried out four times (June 1999, September 2001, September 2002 and September 2013). We used version 6.2 of the GIPSY-OASIS software to process the GPS data in Precise Point Positioning (PPP) mode. In order to obtain position time series over the whole period of these episodic campaigns, all GPS observations from the 1999 to 2013 campaigns were processed with an identical standard procedure. We compared our velocity field estimate with the GEODVEL tectonic model to obtain the residual velocity field of the Bajo Segura Basin, and estimated ~N-S shortening with deformation rates varying between 0.2 and 0.6 mm/yr. These results are consistent with, although slightly higher than, local geological deformation rates, and they also fit well with regional geodetic estimates for the Western Mediterranean.
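The velocity estimation step described above reduces, per site and component, to a linear fit over the campaign position time series, with the residual velocity obtained by subtracting the plate-model prediction. A sketch with synthetic positions (not the Bajo Segura data) and a hypothetical model value:

```python
# Sketch of the velocity estimation step described above: a site velocity
# is the slope of a linear fit to the campaign position time series, and
# the residual velocity is that estimate minus the plate-model
# prediction. The offsets and the model value below are synthetic.
import numpy as np

epochs = np.array([1999.5, 2001.7, 2002.7, 2013.7])    # campaign epochs (yr)
north_mm = np.array([0.0, -0.9, -1.3, -5.8])           # synthetic N offsets (mm)

velocity, intercept = np.polyfit(epochs, north_mm, 1)  # slope in mm/yr
model_velocity = -0.2                                  # hypothetical GEODVEL value
residual = velocity - model_velocity
print(round(velocity, 2), round(residual, 2))
```

With only four episodic campaigns the long 2002-2013 gap dominates the fit, which is why an identical processing procedure across campaigns matters.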

  17. Novel gold nanoparticle trimer reporter probe combined with dry-reagent cotton thread immunoassay device for rapid human ferritin test.

    PubMed

    Mao, Xun; Du, Ting-E; Meng, Lili; Song, Tingting

    2015-08-19

We report here, for the first time, the use of cotton thread combined with a novel gold nanoparticle trimer reporter probe for low-cost, sensitive and rapid detection of a lung cancer related biomarker, human ferritin. A model system comprising ferritin as the analyte and a pair of monoclonal antibodies was used to demonstrate the proof of concept on the dry-reagent natural cotton thread immunoassay device. Results indicated that the novel gold nanoparticle trimer reporter probe greatly improved sensitivity compared with the traditional gold nanoparticle reporter probe on the cotton thread immunoassay device. The assay avoids the multiple incubation and washing steps performed in most conventional protein analyses. Although qualitative tests are realized by observing the color change of the test zone, quantitative data are obtained by recording the optical responses of the test zone with a commercial scanner and corresponding analysis software. Under optimal conditions, the cotton thread immunoassay device was capable of measuring 10 ng/mL human ferritin at room temperature, which is sensitive enough for clinical diagnosis. Moreover, the sample volume employed in the assays is just 8 μL, much less than in traditional lateral flow strip based biosensors. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Variations in population exposure and evacuation potential to multiple tsunami evacuation phases on Alameda and Bay Farm Islands, California

    NASA Astrophysics Data System (ADS)

    Peters, J.

    2015-12-01

Planning for a tsunami evacuation is challenging for California communities due to the variety of earthquake sources that could generate a tsunami. A maximum tsunami inundation zone is currently the basis for all tsunami evacuations in California, although an Evacuation Playbook consisting of specific event-based evacuation phases relating to flooding severity is in development. We chose to investigate the Evacuation Playbook approach for the island community of Alameda, CA, since past reports estimated a significant difference in the number of residents in the maximum inundation zone when compared to an event-based inundation zone. In order to recognize variations in the types of residents and businesses within each phase, a population exposure analysis was conducted for each of the four Alameda evacuation phases. A pedestrian evacuation analysis using an anisotropic, path distance model was also conducted to understand the time it would take for populations to reach high ground by foot. Initial results suggest that the two islands of the City of Alameda face different situations across the four tsunami evacuation phases. Pedestrian evacuation results suggest that Bay Farm Island would have more success evacuating by vehicle due to limited nearby high ground for pedestrians to reach safety; therefore, agent-based traffic simulation software was used to model vehicle evacuation off Bay Farm Island. Initial results show that Alameda Island could face challenges evacuating numerous boat docks and a large beach for phases 1 and 2, whereas Bay Farm Island is unaffected at these phases but might be challenged with evacuating by vehicle for phase 3 and the maximum phase due to congestion on limited egress routes. A better understanding of the population exposure within each tsunami Evacuation Playbook phase, and of the time it would take to evacuate out of each phase by foot or vehicle, will help emergency managers implement the evacuation phases during an actual tsunami event.

  19. Modernized Approach for Generating Reproducible Heterogeneity Using Transmitted-Light for Flow Visualization Experiments

    NASA Astrophysics Data System (ADS)

    Jones, A. A.; Holt, R. M.

    2017-12-01

Image capturing in flow experiments has been used for fluid mechanics research since the early 1970s. Interactions of fluid flow between the vadose zone and the permanent water table are of great interest because this zone is responsible for all recharge waters, pollutant transport, and irrigation efficiency in agriculture. Griffith et al. (2011) developed an approach in which reproducible, "geologically realistic" sand configurations are deposited in sand-filled experimental chambers for light-transmitted flow visualization experiments. This method creates reproducible, reverse-graded, layered (stratified) thin-slab sand chambers for point-source experiments visualizing multiphase flow through porous media. Reverse-graded stratification of sand chambers mimics many naturally occurring sedimentary deposits. Sand-filled chambers use light as a nonintrusive tool for measuring water saturation in two dimensions (2-D). Homogeneous and heterogeneous sand configurations can be produced to visualize the complex physics of the unsaturated zone. The experimental procedure developed by Griffith et al. (2011) was designed using now outdated and obsolete equipment. We have modernized this approach with a new Parker Deadel linear actuator and programmed projects/code for multiple configurations. We have also updated the Roper CCD software and the image processing software to the latest industry standards. Modernization of the transmitted-light source, robotic equipment, redesigned experimental chambers, and newly developed analytical procedures has greatly reduced the time and cost per experiment. We have verified the ability of the new equipment to generate reproducible heterogeneous sand-filled chambers and demonstrated the functionality of the new equipment and procedures by reproducing several gravity-driven fingering experiments conducted by Griffith (2008).

  20. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    EPA Science Inventory

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  1. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  2. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  3. New Ecuadorian VLF and ELF receiver for studying the ionosphere

    NASA Astrophysics Data System (ADS)

    Lopez, Ericson; Montenegro, Jefferson; Vasconez, Michael; Vicente, Klever

Crucial physical phenomena occur in the equatorial atmosphere and ionosphere, which are currently understudied and poorly understood. Thus, scientific campaigns monitoring the equatorial region are required in order to provide the necessary data for physical models. Ecuador is located in a strategic geographical position where these studies can be performed, providing quality data for the scientific community working to understand the nature of these physical systems. The Quito Astronomical Observatory (QAO) of the National Polytechnic School is moving in this direction by promoting research in space sciences for the study of the equatorial zone. With the participation and valuable collaboration of international initiatives such as AWESOME, MAGDAS, SAVNET and CALLISTO, the Quito Observatory is establishing a new space physics division on the basis of the International Space Weather Initiative. As part of this project, a new system for acquiring and processing VLF and ELF signals propagating in the ionosphere has been designed at the QAO. LabVIEW software is used to filter, process and condition the received signals, in this way avoiding 60 percent of the analog components present in a common receiver. The same software has been programmed to create the spectrograms and the amplitude and phase diagrams of the radio signals. The data are stored neatly in files that can be processed with other applications as well.
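The spectrogram computation mentioned above can be sketched as a short-time FFT over the digitized signal; the sample rate, window length and 18 kHz test tone below are illustrative assumptions, not the observatory's actual LabVIEW processing chain.

```python
# Sketch of a spectrogram of the kind described above: a short-time FFT
# over a digitized VLF-band signal. All parameters and the synthetic
# 18 kHz tone are illustrative assumptions.
import numpy as np

fs = 100_000                                  # sample rate (Hz), assumed
t = np.arange(fs) / fs                        # 1 s of samples
signal = np.sin(2 * np.pi * 18_000 * t)       # synthetic VLF-band test tone

win = 1024                                    # window length (samples)
frames = signal[: len(signal) // win * win].reshape(-1, win)
spectrogram = np.abs(np.fft.rfft(frames * np.hanning(win), axis=1)) ** 2

peak_bin = spectrogram.mean(axis=0).argmax()  # strongest frequency bin
peak_hz = peak_bin * fs / win
print(peak_hz)
```

Each row of `spectrogram` is one time slice; plotting rows against time gives the familiar time-frequency image, and the peak bin should land near the 18 kHz tone.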

  4. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation.

    PubMed

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications.

  5. An Intracranial Electroencephalography (iEEG) Brain Function Mapping Tool with an Application to Epilepsy Surgery Evaluation

    PubMed Central

    Wang, Yinghua; Yan, Jiaqing; Wen, Jianbin; Yu, Tao; Li, Xiaoli

    2016-01-01

    Objects: Before epilepsy surgeries, intracranial electroencephalography (iEEG) is often employed in function mapping and epileptogenic foci localization. Although the implanted electrodes provide crucial information for epileptogenic zone resection, a convenient clinical tool for electrode position registration and Brain Function Mapping (BFM) visualization is still lacking. In this study, we developed a BFM Tool, which facilitates electrode position registration and BFM visualization, with an application to epilepsy surgeries. Methods: The BFM Tool mainly utilizes electrode location registration and function mapping based on pre-defined brain models from other software. In addition, the electrode node and mapping properties, such as the node size/color, edge color/thickness, mapping method, can be adjusted easily using the setting panel. Moreover, users may manually import/export location and connectivity data to generate figures for further application. The role of this software is demonstrated by a clinical study of language area localization. Results: The BFM Tool helps clinical doctors and researchers visualize implanted electrodes and brain functions in an easy, quick and flexible manner. Conclusions: Our tool provides convenient electrode registration, easy brain function visualization, and has good performance. It is clinical-oriented and is easy to deploy and use. The BFM tool is suitable for epilepsy and other clinical iEEG applications. PMID:27199729

  6. Visualization Skills: A Prerequisite to Advanced Solid Modeling

    ERIC Educational Resources Information Center

    Gow, George

    2007-01-01

    Many educators believe that solid modeling software has made teaching two- and three-dimensional visualization skills obsolete. They claim that the visual tools built into the solid modeling software serve as a replacement for the CAD operator's personal visualization skills. They also claim that because solid modeling software can produce…

  7. Software engineering the mixed model for genome-wide association studies on large samples.

    PubMed

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
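The core association test in the mixed-model framework reviewed above can be sketched as generalized least squares under a kinship-structured covariance; the toy data, kinship matrix and variance-component values below are illustrative, not taken from any of the evaluated packages.

```python
# Sketch of the single-marker mixed-model test described above: with
# variance components already estimated, each SNP effect is obtained by
# generalized least squares under V = sg*K + se*I, where K is the
# kinship matrix. The toy data and component values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 50
K = np.eye(n)                                  # toy kinship (unrelated samples)
snp = rng.integers(0, 3, n).astype(float)      # genotypes coded 0/1/2
y = 0.5 * snp + rng.normal(size=n)             # phenotype with true effect 0.5

sg, se = 0.3, 1.0                              # assumed variance components
V = sg * K + se * np.eye(n)                    # phenotypic covariance
Vinv = np.linalg.inv(V)

X = np.column_stack([np.ones(n), snp])         # intercept + marker
beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)   # GLS estimate
print(round(beta[1], 3))                       # estimated SNP effect
```

For realistic sample sizes, inverting V once per genome scan (or avoiding the inverse entirely via an eigendecomposition of K) is exactly the kind of computational shortcut the review discusses.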

  8. Potential of pressure solution for strain localization in the Baccu Locci Shear Zone (Sardinia, Italy)

    NASA Astrophysics Data System (ADS)

    Casini, Leonardo; Funedda, Antonio

    2014-09-01

The mylonites of the Baccu Locci Shear Zone (BLSZ), Sardinia (Italy), were deformed during thrusting along a bottom-to-top strain gradient in lower greenschist facies. The microstructure of metavolcanic protoliths shows evidence for composite deformation accommodated by dislocation creep within strong quartz porphyroclasts, and pressure solution in the finer grained matrix. The evolution of mylonite is simulated in two sets of numerical experiments, assuming either a constant width of the deforming zone (model 1) or a narrowing shear zone (model 2). A 2-5 mm yr⁻¹ constant-external-velocity boundary condition is applied on the basis of geologic constraints. Inputs to the models are provided by inverting paleostress values obtained from quartz recrystallized grain-size paleopiezometry. Both models predict a significant stress drop across the shear zone. However, model 1 involves a dramatic decrease in strain rate towards the zone of apparent strain localization. In contrast, model 2 predicts an increase in strain rate with time (from 10⁻¹⁴ to 10⁻¹² s⁻¹), which is consistent with stabilization of the shear zone profile and localization of deformation near the hanging wall. Extrapolating these results to the general context of crust strength suggests that pressure-solution creep may be a critical process for strain softening and for the stabilization of deformation within shear zones.
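The kinematic contrast between the two models can be illustrated with the first-order relation that, at constant external velocity v, the simple-shear strain rate across a zone of width w scales as v/w, so a narrowing zone must accelerate. The widths below are illustrative; only the velocity range comes from the abstract.

```python
# Sketch of the kinematic difference between the two models above: with
# a constant external velocity v across a shear zone of width w, the
# simple-shear strain rate is roughly v / w, so a narrowing zone
# (model 2) must speed up with time. The widths are illustrative.

MM_PER_YR_TO_M_PER_S = 1e-3 / (365.25 * 24 * 3600)

def shear_strain_rate(velocity_mm_yr, width_m):
    """Approximate simple-shear strain rate (1/s) for a zone of width w."""
    return velocity_mm_yr * MM_PER_YR_TO_M_PER_S / width_m

for width in (1000.0, 100.0, 10.0):     # model 2: zone narrows with time
    print(f"{width:7.1f} m -> {shear_strain_rate(3.0, width):.2e} 1/s")
```

At 3 mm/yr, narrowing from ~1 km to ~10 m spans roughly the 10⁻¹⁴ to 10⁻¹² s⁻¹ range quoted in the abstract.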

  9. Development of a calibrated software reliability model for flight and supporting ground software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1991-01-01

    The object of this project was to develop and calibrate quantitative models for predicting the quality of software. Reliable flight and supporting ground software is a highly important factor in the successful operation of the space shuttle program. The models used in the present study consisted of SMERFS (Statistical Modeling and Estimation of Reliability Functions for Software). There are ten models in SMERFS. For a first run, the results obtained in modeling the cumulative number of failures versus execution time showed fairly good results for our data. Plots of cumulative software failures versus calendar weeks were made and the model results were compared with the historical data on the same graph. If the model agrees with actual historical behavior for a set of data then there is confidence in future predictions for this data. Considering the quality of the data, the models have given some significant results, even at this early stage. With better care in data collection, data analysis, recording of the fixing of failures and CPU execution times, the models should prove extremely helpful in making predictions regarding the future pattern of failures, including an estimate of the number of errors remaining in the software and the additional testing time required for the software quality to reach acceptable levels. It appears that there is no one 'best' model for all cases. It is for this reason that the aim of this project was to test several models. One of the recommendations resulting from this study is that great care must be taken in the collection of data. When using a model, the data should satisfy the model assumptions.

  10. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.

  11. Microbial respiration and dissolution precipitation reactions of minerals: thermo-kinetics and reactive transport modelling

    NASA Astrophysics Data System (ADS)

    Azaroual, M. M.; Parmentier, M.; Andre, L.; Croiset, N.; Pettenati, M.; Kremer, S.

    2010-12-01

    Microbial processes interact closely with abiotic geochemical reactions and mineralogical transformations in several hydrogeochemical systems. Reactive transport models are aimed to analyze these complex mechanisms integrating as well as the degradation of organic matter as the redox reactions involving successive terminal electron acceptors (TEAPs) mediated by microbes through the continuum of unsaturated zone (soil) - saturated zone (aquifer). The involvement of microbial processes in reactive transport in soil and subsurface geologic greatly complicates the mastery of the major mechanisms and the numerical modelling of these systems. The introduction of kinetic constraints of redox reactions in aqueous phase requires the decoupling of equilibrium reactions and the redefinition of mass balance of chemical elements including the concept of basis species and secondary species of thermodynamic databases used in geochemical modelling tools. An integrated methodology for modelling the reactive transport has been developed and implemented to simulate the transfer of arsenic, denitrification processes and the role of metastable aqueous sulfur species with pyrite and organic matter as electron donors entities. A mechanistic rate law of microbial respiration in various geochemical environments was used to simulate reactive transport of arsenic, nitrate and organic matter combined to the generalized rate law of mineral dissolution - precipitation reactions derived from the transition state theory was used for dissolution - precipitation of silica, aluminosilicate, carbonate, oxyhydroxide, and sulphide minerals. The kinetic parameters are compiled from the literature measurements based on laboratory constrained experiments and field observations. 
Numerical simulations using the geochemical software PHREEQC were performed to identify the key reactions mediated by microbes, on the one hand in an unsaturated-saturated zone concept for artificial recharge of a deep aquifer system, and on the other hand in an acid mine drainage system. A large amount of data is available for the old mine site of Cheni (France). These field data on acid mine drainage are compared to a thermokinetic model including biological kinetics, precipitation-dissolution kinetics and surface complexation on ferrihydrite. The kinetic parameters come from the literature and from fitting batch biological experiments. The integrated approach combining reaction kinetics and biogeochemical thermodynamic constraints is successfully applied to denitrification experiments in the presence of acetate and pyrite conducted in the laboratory in batch and column systems. The power of this coupled approach is that it allows a fine description of the different transition species from nitrate to nitrogen. The fitted kinetic parameters established for modelling these laboratory results are then extended to simulate denitrification in a field case where organic matter and pyrite (FeS2) are the electron donors and O2, NO3, Fe(OH)3 and SO4 are the electron acceptors, within a UZ-SZ continuum, aiming to identify the stabilized redox zones of acid mine drainage. Detailed results from the two actual case studies will be presented.
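As a sketch of the kind of kinetic formulation the abstract describes, a dual-Monod respiration rate law can be written as follows; the rate law form, species, and all numerical values here are illustrative assumptions, not the authors' parameters:

```python
# Illustrative dual-Monod rate law for microbial respiration, of the kind
# compiled from the literature in such studies. All constants below are
# invented placeholders, not values from this work.
def dual_monod_rate(k_max, donor, K_donor, acceptor, K_acceptor, biomass=1.0):
    """Respiration rate (mol/L/s) limited by electron donor and acceptor."""
    return (k_max * biomass
            * donor / (K_donor + donor)
            * acceptor / (K_acceptor + acceptor))

# Example: acetate oxidation coupled to nitrate reduction (invented numbers)
rate = dual_monod_rate(k_max=1e-9, donor=1e-3, K_donor=1e-4,
                       acceptor=5e-4, K_acceptor=1e-4)
```

The half-saturation terms make the rate fall off smoothly as either the electron donor or the terminal electron acceptor is depleted, which is what produces the successive TEAP redox zones mentioned above.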

  12. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  13. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  14. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

    The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties of the Secure Socket Layer (SSL) communication protocol as a demonstration.

  15. Vegetation root zone storage and rooting depth, derived from local calibration of a global hydrological model

    NASA Astrophysics Data System (ADS)

    van der Ent, R.; Van Beek, R.; Sutanudjaja, E.; Wang-Erlandsson, L.; Hessels, T.; Bastiaanssen, W.; Bierkens, M. F.

    2017-12-01

    The storage and dynamics of water in the root zone control many important hydrological processes such as saturation excess overland flow, interflow, recharge, capillary rise, soil evaporation and transpiration. These processes are parameterized in hydrological models or land-surface schemes and the effect on runoff prediction can be large. Root zone parameters in global hydrological models are very uncertain as they cannot be measured directly at the scale on which these models operate. In this paper we calibrate the global hydrological model PCR-GLOBWB using a state-of-the-art ensemble of evaporation fields derived by solving the energy balance for satellite observations. We focus our calibration on the root zone parameters of PCR-GLOBWB and derive spatial patterns of maximum root zone storage. We find these patterns to correspond well with previous research. The parameterization of our model allows for the conversion of maximum root zone storage to root zone depth and we find that these correspond quite well to the point observations where available. We conclude that climate and soil type should be taken into account when regionalizing measured root depth for a certain vegetation type. We also find that using evaporation rather than discharge better allows for local adjustment of root zone parameters within a basin and thus provides orthogonal data to diagnose and optimize hydrological models and land surface schemes.
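The storage-to-depth conversion mentioned in the abstract can be sketched as below, assuming maximum root zone storage equals rooting depth times plant-available water capacity (field capacity minus wilting point); the actual PCR-GLOBWB parameterization may differ in detail:

```python
# Sketch of converting maximum root zone storage (mm of water) to rooting
# depth (m). Assumes storage = depth * (theta_fc - theta_wp); the soil
# moisture values used here are illustrative, not the model's.
def root_zone_depth(s_max_mm, theta_fc, theta_wp):
    available = theta_fc - theta_wp          # m3 of water per m3 of soil
    return (s_max_mm / 1000.0) / available   # mm -> m, then divide

# 150 mm storage in a loam with 0.20 available water capacity -> 0.75 m
depth_m = root_zone_depth(s_max_mm=150.0, theta_fc=0.35, theta_wp=0.15)
```

This makes explicit why climate and soil type matter when regionalizing root depth: the same calibrated storage implies different depths in soils with different available water capacity.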

  16. Vegetation root zone storage and rooting depth, derived from local calibration of a global hydrological model

    NASA Astrophysics Data System (ADS)

    van der Ent, Ruud; van Beek, Rens; Sutanudjaja, Edwin; Wang-Erlandsson, Lan; Hessels, Tim; Bastiaanssen, Wim; Bierkens, Marc

    2017-04-01

    The storage and dynamics of water in the root zone control many important hydrological processes such as saturation excess overland flow, interflow, recharge, capillary rise, soil evaporation and transpiration. These processes are parameterized in hydrological models or land-surface schemes and the effect on runoff prediction can be large. Root zone parameters in global hydrological models are very uncertain as they cannot be measured directly at the scale on which these models operate. In this paper we calibrate the global hydrological model PCR-GLOBWB using a state-of-the-art ensemble of evaporation fields derived by solving the energy balance for satellite observations. We focus our calibration on the root zone parameters of PCR-GLOBWB and derive spatial patterns of maximum root zone storage. We find these patterns to correspond well with previous research. The parameterization of our model allows for the conversion of maximum root zone storage to root zone depth and we find that these correspond quite well to the point observations where available. We conclude that climate and soil type should be taken into account when regionalizing measured root depth for a certain vegetation type. We also find that using evaporation rather than discharge better allows for local adjustment of root zone parameters within a basin and thus provides orthogonal data to diagnose and optimize hydrological models and land surface schemes.

  17. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
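A minimal discrete-event loop of the kind the abstract describes might look like the following; the two-node network, the link latencies, and the function names are invented for illustration and are not the package's actual interfaces:

```python
import heapq

# Hypothetical two-node network: node A sends paired quantum + classical
# signals to node B over links with different latencies. A discrete-event
# loop computes each signal's arrival time so the synchronization offset
# between the two channels can be inspected.
LATENCY = {"quantum": 0.005, "classical": 0.002}  # seconds, assumed

def simulate(send_times):
    queue = []
    for t in send_times:
        for kind, lat in LATENCY.items():
            heapq.heappush(queue, (t + lat, kind))  # schedule arrival at B
    arrivals = {"quantum": [], "classical": []}
    while queue:
        t_arr, kind = heapq.heappop(queue)          # process in time order
        arrivals[kind].append(t_arr)
    return arrivals

arr = simulate([0.0, 0.1])
offset = arr["quantum"][0] - arr["classical"][0]    # per-pair channel skew
```

Statistics such as `offset` are the sort of synchronization diagnostics the built-in tools would collect across many transmissions.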

  18. Assessment of the ecological impacts of macroroughness elements in stream flows

    NASA Astrophysics Data System (ADS)

    Niayifar, Amin; Oldroyd, Holly J.; Perona, Paolo

    2017-04-01

    The environmental suitability of flow release rules is often assessed for different fish species by modeling Weighted Usable Area (WUA) curves (e.g., with CASiMir and PHABSIM). However, these models cannot resolve the hydrodynamics at small scales, e.g. those induced by the presence of macroroughness elements (e.g., single stones), which nevertheless generate relatively large wakes that may contribute significantly to habitat suitability. The presence of stones generates sheltered zones (i.e., wakes), which are typically temporary stationary points for many fish species. By resting in these low-velocity regions, fish minimize energy expenditure and can quickly move to nearby fast water to feed (Hayes and Jowett, 1994). Following the analytical model proposed by Negretti et al. (2006), we developed an analytical solution for the wake area behind macroroughness elements. The total wake area in the monitored river reach is a function of the streamflow, Q, and represents an actual usable area for fish that can be used to correct the area computed by classic software such as PHABSIM or CASiMir at each flow rate. By quantifying these wake areas we can therefore assess how the physical properties and number of such zones change in response to a changing hydrologic regime. To validate the concept, we selected a 400 m reach of the Aare river in central Switzerland. The statistical distribution of macroroughness elements was obtained from orthorectified aerial photographs taken by drone surveys during low-flow conditions. The distribution of the wakes is then obtained analytically as a derived distribution. This methodology saves computational costs and the time needed for detailed field surveys.
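The derived-distribution step can be sketched numerically as below; the lognormal stone-size parameters and the quadratic wake-area relation are placeholders, since the study uses the analytical wake solution of Negretti et al. (2006) rather than anything shown here:

```python
import random

# Derived-distribution sketch: propagate a measured distribution of stone
# diameters through a wake-area relation to get the total sheltered area
# in a reach. The relation A = c * d**2 is a placeholder standing in for
# the analytical solution of Negretti et al. (2006).
random.seed(42)
diameters = [random.lognormvariate(mu=-0.5, sigma=0.4) for _ in range(1000)]  # m

def wake_area(d, c=1.5):
    return c * d * d            # placeholder wake-area relation (m^2)

total_wake = sum(wake_area(d) for d in diameters)   # usable-area correction
mean_wake = total_wake / len(diameters)
```

In the actual method the coefficient relating stone size to wake area would itself depend on the streamflow Q, which is what lets the total wake area track the hydrologic regime.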

  19. Intraoperative fluorescence-based enhanced reality laparoscopic real-time imaging to assess bowel perfusion at the anastomotic site in an experimental model.

    PubMed

    Diana, M; Agnus, V; Halvax, P; Liu, Y-Y; Dallemagne, B; Schlagowski, A-I; Geny, B; Diemunsch, P; Lindner, V; Marescaux, J

    2015-01-01

    Fluorescence videography is a promising technique for assessing bowel perfusion. Fluorescence-based enhanced reality (FLER) is a novel concept, in which a dynamic perfusion cartogram, generated by computer analysis, is superimposed on to real-time laparoscopic images. The aim of this experimental study was to assess the accuracy of FLER in detecting differences in perfusion in a small bowel resection-anastomosis model. A small bowel ischaemic segment was created laparoscopically in 13 pigs. Animals were allocated to having anastomoses performed at either low perfusion (25 per cent; n = 7) or high perfusion (75 per cent; n = 6), as determined by FLER analysis. Capillary lactate levels were measured in blood samples obtained by serosal puncturing in the ischaemic area, resection lines and vascularized areas. Pathological inflammation scoring of the anastomosis was carried out. Lactate levels in the ischaemic area (mean(s.d.) 5·6(2·8) mmol/l) were higher than those in resection lines at 25 per cent perfusion (3·7(1·7) mmol/l; P = 0·010) and 75 per cent perfusion (2·9(1·3) mmol/l; P < 0·001), and higher than levels in vascular zones (2·5(1·0) mmol/l; P < 0·001). Lactate levels in resection lines with 75 per cent perfusion were lower than those in lines with 25 per cent perfusion (P < 0·001), and similar to those in vascular zones (P = 0·188). Levels at resection lines with 25 per cent perfusion were higher than those in vascular zones (P = 0·001). Mean(s.d.) global inflammation scores were higher in the 25 per cent perfusion group compared with the 75 per cent perfusion group for mucosa/submucosa (2·1(0·4) versus 1·2(0·4); P = 0·003) and serosa (1·8(0·4) versus 0·8(0·8); P = 0·014). A ratio of preanastomotic lactate levels in the ischaemic area relative to the resection lines of 2 or less was predictive of a more severe inflammation score. In an experimental model, FLER appeared accurate in discriminating bowel perfusion levels. 
Surgical relevance Clinical assessment has limited accuracy in evaluating bowel perfusion before anastomosis. Fluorescence videography estimates intestinal perfusion based on the fluorescence intensity of injected fluorophores, which is proportional to bowel vascularization. However, evaluation of fluorescence intensity remains a static and subjective measure. Fluorescence-based enhanced reality (FLER) is a dynamic fluorescence videography technique integrating near-infrared endoscopy and specific software. The software generates a virtual perfusion cartogram based on time to peak fluorescence, which can be superimposed on to real-time laparoscopic images. This experimental study demonstrates the accuracy of FLER in detecting differences in bowel perfusion in a survival model of laparoscopic small bowel resection-anastomosis, based on biochemical and histopathological data. It is concluded that real-time imaging of bowel perfusion is easy to use and accurate, and should be translated into clinical use. © 2015 BJS Society Ltd. Published by John Wiley & Sons Ltd.
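The cartogram step described above, time to peak fluorescence per pixel, can be sketched as follows; the FLER software's actual processing is not detailed in the abstract, so the frame layout and interval here are assumptions:

```python
import numpy as np

# Sketch of a perfusion cartogram: for each pixel of a fluorescence video
# stack, find the frame index of peak intensity and convert it to time to
# peak. Well-perfused tissue peaks early; poorly perfused tissue peaks late.
def time_to_peak(stack, frame_interval_s):
    """stack: array of shape (n_frames, h, w) of fluorescence intensity."""
    peak_frame = np.argmax(stack, axis=0)        # first frame of the maximum
    return peak_frame * frame_interval_s         # seconds, per pixel

frames = np.zeros((5, 2, 2))
frames[1, 0, 0] = 1.0   # well-perfused pixel peaks early
frames[4, 1, 1] = 1.0   # poorly perfused pixel peaks late
ttp = time_to_peak(frames, frame_interval_s=0.5)
```

The resulting per-pixel map is what gets colour-coded and superimposed on the live laparoscopic image.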

  20. A Prototype for the Support of Integrated Software Process Development and Improvement

    NASA Astrophysics Data System (ADS)

    Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian

    An efficient software development process is one of the key success factors for quality software. Not only the appropriate establishment but also the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper therefore proposes a software process maintenance framework consisting of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.

  1. Stress drop inferred from dynamic rupture simulations consistent with Moment-Rupture area empirical scaling models: Effects of weak shallow zone

    NASA Astrophysics Data System (ADS)

    Dalguer, L. A.; Miyake, H.; Irikura, K.; Wu, H., Sr.

    2016-12-01

    Empirical scaling models of seismic moment and rupture area provide constraints for parameterizing source parameters, such as stress drop, in numerical simulations of ground motion. Several scaling models have been published in the literature. The finite width of the seismogenic zone and the free surface have been proposed as the causes of the breakdown of the well-known self-similar scaling (e.g. Dalguer et al., 2008), giving rise to the so-called L and W models for large faults. These models imply a three-stage scaling relationship between seismic moment and rupture area (e.g. Irikura and Miyake, 2011). In this paper we extend the work of Dalguer et al. (2008), who calibrated fault models matching the observations and showed that the average stress drop is independent of earthquake size for buried earthquakes, but scale dependent for surface-rupturing earthquakes. Here we developed additional sets of dynamic rupture models for vertical strike-slip faults to evaluate the effect of the weak shallow layer (WSL) zone on the calibration of stress drop. Rupture in the WSL zone is expected to operate with an enhanced energy absorption mechanism. The set of dynamic models consists of faults with width 20 km, fault lengths L = 20, 40, 60, 80, 100, 120, 200, 300 and 400 km, and average stress drop values of 2.0, 2.5, 3.0, 3.5, 5.0 and 7.5 MPa. For models that break the free surface, the WSL zone is modeled as a 2 km wide layer with a stress drop of 0.0 MPa or -2.0 MPa. Our results show that the average stress drop at the seismogenic zone that fits the empirical models changes depending on the characterization of the WSL zone. If no WSL zone is considered, that is, if the stress drop in the WSL zone is the same as in the seismogenic zone, the average stress drop is about 20% smaller than in models with a WSL zone. 
Introducing more energy absorption in the WSL zone, which could be the case for large mature faults, increases the average stress drop in the seismogenic zone. This suggests that large earthquakes need a higher stress drop to break the fault than buried and moderate earthquakes, and therefore that the value of the average stress drop for large events that break the free surface depends on the definition of the WSL. The WSL thus plays an important role in the prediction of final slip and fault displacement.
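For orientation, the kind of moment-area-stress-drop bookkeeping these scaling models involve can be illustrated with the standard circular-crack (Eshelby) relation; this is a textbook estimate, not the paper's dynamic-rupture result:

```python
import math

# Average stress drop from seismic moment and rupture area using the
# circular-crack relation delta_sigma = (7/16) * M0 / a^3, with the
# equivalent radius a taken from the rupture area. Values are illustrative.
def stress_drop(m0_nm, area_km2):
    radius_m = math.sqrt(area_km2 * 1e6 / math.pi)   # equivalent radius (m)
    return (7.0 / 16.0) * m0_nm / radius_m**3        # Pa

# M7-class example: M0 = 4e19 N*m on a 20 km x 50 km rupture -> ~3 MPa
dsigma_mpa = stress_drop(4e19, 1000.0) / 1e6
```

Fixing either the stress drop or the moment-area scaling constrains the other, which is why calibrating dynamic models against the empirical scaling relations pins down the seismogenic-zone stress drop.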

  2. Continuous Fine-Fault Estimation with Real-Time GNSS

    NASA Astrophysics Data System (ADS)

    Norford, B. B.; Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.; Senko, J.; Larsen, D.

    2017-12-01

    Thousands of real-time telemetered GNSS stations operate throughout the circum-Pacific that may be used for rapid earthquake characterization and estimation of local tsunami excitation. We report on the development of a GNSS-based finite-fault inversion system that continuously estimates slip using real-time GNSS position streams from the Cascadia subduction zone and which is being expanded throughout the circum-Pacific. The system uses 1 Hz precise point position streams computed in the ITRF14 reference frame using clock and satellite orbit corrections from the IGS. The software is implemented as seven independent modules that filter time series using Kalman filters, trigger and estimate coseismic offsets, invert for slip using a non-negative least squares method developed by Lawson and Hanson (1974) and elastic half-space Green's Functions developed by Okada (1985), smooth the results temporally and spatially, and write the resulting streams of time-dependent slip to a RabbitMQ messaging server for use by downstream modules such as tsunami excitation modules. Additional fault models can be easily added to the system for other circum-Pacific subduction zones as additional real-time GNSS data become available. The system is currently being tested using data from well-recorded earthquakes including the 2011 Tohoku earthquake, the 2010 Maule earthquake, the 2015 Illapel earthquake, the 2003 Tokachi-oki earthquake, the 2014 Iquique earthquake, the 2010 Mentawai earthquake, the 2016 Kaikoura earthquake, the 2016 Ecuador earthquake, the 2015 Gorkha earthquake, and others. Test data will be fed to the system and the resultant earthquake characterizations will be compared with published earthquake parameters. Seismic events will be assumed to occur on major faults, so, for example, only the San Andreas fault will be considered in Southern California, while the hundreds of other faults in the region will be ignored. 
Rake will be constrained along each subfault to be consistent with NUVEL-1 plate convergence directions. This software provides a basis for a GNSS-based rapid earthquake finite fault estimation system with global scope.
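The inversion step described above solves d = G s for non-negative slip s. The system uses the Lawson-Hanson active-set method; as an illustration, a minimal projected-gradient substitute is shown here on an invented two-station, two-subfault geometry:

```python
import numpy as np

# Minimal non-negative least squares by projected gradient descent, as a
# stand-in for the Lawson-Hanson (1974) active-set method the system uses.
# G would be built from Okada (1985) half-space Green's functions; here it
# is a toy 2x2 matrix.
def nnls_pg(G, d, iters=5000):
    s = np.zeros(G.shape[1])
    step = 1.0 / np.linalg.norm(G.T @ G, 2)      # safe step from spectral norm
    for _ in range(iters):
        s = s - step * (G.T @ (G @ s - d))       # gradient of 0.5*||Gs - d||^2
        s = np.maximum(s, 0.0)                   # project onto s >= 0
    return s

# Toy Green's-function matrix (2 stations x 2 subfaults) and GNSS offsets
G = np.array([[1.0, 0.2], [0.3, 1.0]])
d = G @ np.array([0.5, 0.0])                     # true slip: 0.5 m and 0 m
slip = nnls_pg(G, d)
```

The non-negativity constraint is what keeps the recovered slip physically one-sided along the constrained rake direction.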

  3. Idea Paper: The Lifecycle of Software for Scientific Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; McInnes, Lois C.

    The software lifecycle is a well researched topic that has produced many models to meet the needs of different types of software projects. However, one class of projects, software development for scientific computing, has received relatively little attention from lifecycle researchers. In particular, software for end-to-end computations for obtaining scientific results has received few lifecycle proposals and no formalization of a development model. An examination of the development approaches employed by teams implementing large multicomponent codes reveals a great deal of similarity in their strategies. This idea paper formalizes these related approaches into a lifecycle model for end-to-end scientific application software, featuring loose coupling between submodels for development of infrastructure and scientific capability. We also invite input from stakeholders to converge on a model that captures the complexity of this development process and provides needed lifecycle guidance to the scientific software community.

  4. Software development predictors, error analysis, reliability models and software metric analysis

    NASA Technical Reports Server (NTRS)

    Basili, Victor

    1983-01-01

    The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.

  5. Modeling the Effects of Hydrogeomorphology and Climatic Factors on Nitrogen, Phosphorus, and Greenhouse Gas Dynamics in Riparian Zones.

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Y.; Vidon, P.; Gold, A.; Pradhanang, S. M.; Addy, K.

    2017-12-01

    Vegetated riparian zones are often considered for use as best management practices to mitigate the impacts of agriculture on water quality. However, riparian zones can also be a source of greenhouse gases, and their influence on water quality varies depending on landscape hydrogeomorphic characteristics and climate. Methods used to evaluate riparian zone functions include conceptual models and spatially explicit, process-based models (REMM), but very few attempts have been made to connect riparian zone characteristics with function using easily accessible landscape-scale data. Here, we present comprehensive statistical models that can be used to assess riparian zone functions with easily obtainable landscape-scale hydrogeomorphic attributes and climate data. Models were developed from a database spanning 88 years and 36 sites. Statistical methods including principal component analysis and stepwise regression were used to reduce data dimensionality and identify significant predictors. Models were validated using additional data collected from the scientific literature. The 8 models developed connect landscape characteristics to nitrogen and phosphorus concentration and removal (1-4), greenhouse gas emissions (5-7), and water table depth (8). Results show the range of influence that various climate and landscape characteristics have on riparian zone functions, and the tradeoffs that exist with regard to nitrogen, phosphorus, and greenhouse gases. These models will help reduce the need for extensive field measurements and help scientists and land managers make more informed decisions regarding the use of riparian zones for water quality management.
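The two statistical steps named in the abstract, principal component analysis for dimensionality reduction and stepwise regression for predictor selection, can be sketched on synthetic data as follows; the variables and data are placeholders, not the study's database:

```python
import numpy as np

# Sketch of the model-building pipeline: PCA via SVD to reduce predictor
# dimensionality, then forward stepwise selection by residual sum of
# squares. Data are synthetic: 36 "sites" x 6 "landscape attributes".
rng = np.random.default_rng(0)
X = rng.normal(size=(36, 6))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.1, size=36)

Xc = X - X.mean(axis=0)                          # center before PCA
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :3]                        # first 3 principal components

def forward_stepwise(X, y, n_keep=2):
    chosen = []
    for _ in range(n_keep):
        # add the column that most reduces the residual sum of squares
        best = min((np.linalg.lstsq(X[:, chosen + [j]], y, rcond=None)[1][0], j)
                   for j in range(X.shape[1]) if j not in chosen)
        chosen.append(best[1])
    return chosen

selected = forward_stepwise(X, y)                # recovers the true predictors
```

In the study the retained components and predictors would be the hydrogeomorphic and climate attributes; here the selection simply recovers the two columns that generated `y`.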

  6. Radiofrequency ablation of liver metastases-software-assisted evaluation of the ablation zone in MDCT: tumor-free follow-up versus local recurrent disease.

    PubMed

    Keil, Sebastian; Bruners, Philipp; Schiffl, Katharina; Sedlmair, Martin; Mühlenbruch, Georg; Günther, Rolf W; Das, Marco; Mahnken, Andreas H

    2010-04-01

    The purpose of this study was to investigate differences in change of size and CT value between local recurrences and tumor-free areas after CT-guided radiofrequency ablation (RFA) of hepatic metastases during follow-up, by means of dedicated software for automatic evaluation of hepatic lesions. Thirty-two patients with 54 liver metastases from breast or colorectal cancer underwent triphasic contrast-enhanced multidetector-row computed tomography (MDCT) to evaluate hepatic metastatic spread and localization before CT-guided RFA and for follow-up after intervention. Sixteen of these patients (65.1 ± 10.3 years) with 30 metastases stayed tumor-free (group 1), while the other group (n = 16 with 24 metastases; 62.0 ± 13.8 years) suffered from local recurrent disease (group 2). Applying an automated software tool (SyngoCT Oncology; Siemens Healthcare, Forchheim, Germany), size parameters (volume, RECIST, WHO) and attenuation were measured within the lesions before, 1 day after, and 28 days after RFA treatment. The natural logarithms (ln) of the quotients of the volume 1 day and 28 days after RFA treatment versus before treatment were computed: lnQ1/0(volume) and lnQ28/0(volume). Analogously, ln ratios of RECIST, WHO, and attenuation were computed and statistically evaluated by repeated-measures ANOVA. One lesion in group 2 was excluded from further evaluation due to automated missegmentation. Statistically significant differences between the two groups were observed with respect to initial volume, RECIST, and WHO (p < 0.05). Furthermore, ln ratios corresponding to volume, RECIST, and WHO differed significantly between the two groups. Attenuation evaluations showed no significant differences, but there was a trend for the parameter lnQ28/0(attenuation) (p = 0.0527), showing higher values for group 1 (-0.4 ± 0.3) compared to group 2 (-0.2 ± 0.2). 
In conclusion, hepatic metastases and their zones of coagulation necrosis after RFA differed significantly between tumor-free and local-recurrent ablation zones with respect to the corresponding size parameters. A new set of parameters (lnQ1/0 and lnQ28/0 for volume, RECIST, WHO and attenuation) was introduced, which appears to be of prognostic value at early follow-up CT.
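The ln-ratio parameters can be illustrated as below; the volumes are invented, not patient data, and the notation is reconstructed from the abstract:

```python
import math

# Follow-up parameter sketch: natural log of the ratio of a measurement
# (volume, RECIST, WHO, or attenuation) at a follow-up time point versus
# its pre-ablation value. Numbers below are illustrative only.
def ln_q(value_followup, value_baseline):
    return math.log(value_followup / value_baseline)

# Ablation zone larger than the original metastasis at day 1, then involuting
q1_0 = ln_q(30.0, 12.0)    # day 1 vs pre-RFA volume (ml)
q28_0 = ln_q(24.0, 12.0)   # day 28 vs pre-RFA volume (ml)
```

A ln ratio near zero means no change from baseline; the sign and magnitude of the day-1 versus day-28 values capture how the ablation zone evolves, which is the behaviour the study found to differ between tumor-free and recurrent zones.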

  7. Resource utilization during software development

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1988-01-01

    This paper discusses resource utilization over the life cycle of software development and discusses the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. The data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.

  8. Choosing appropriate subpopulations for modeling tree canopy cover nationwide

    Treesearch

    Gretchen G. Moisen; John W. Coulston; Barry T. Wilson; Warren B. Cohen; Mark V. Finco

    2012-01-01

    In prior national mapping efforts, the country has been divided into numerous ecologically similar mapping zones, and individual models have been constructed for each zone. Additionally, a hierarchical approach has been taken within zones to first mask out areas of nonforest, then target models of tree attributes within forested areas only. This results in many models...

  9. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Use of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and 3D models of the mandible were reconstructed using both the commercial Materialise Mimics and the open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using the Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that from the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational costs.
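The Hausdorff distance used to compare the two surface models can be computed, for small vertex clouds, with a brute-force sketch like the following; mesh-comparison tools such as MeshLab use a sampled variant of this metric:

```python
import numpy as np

# Symmetric Hausdorff distance between two vertex clouds: the largest
# distance from any point of one set to its nearest point in the other.
# Brute-force O(n*m) version; vertices below are illustrative.
def hausdorff(A, B):
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise dists
    return max(d.min(axis=1).max(), d.min(axis=0).max())

A = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])   # vertices of model 1
B = np.array([[0.0, 0.1, 0.0], [1.0, 0.0, 0.0]])   # vertices of model 2
dist = hausdorff(A, B)                              # worst-case deviation
```

A small Hausdorff distance between the Mimics and MITK meshes is what supports the conclusion that the two reconstructions are comparable.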

  10. 3D modeling based on CityEngine

    NASA Astrophysics Data System (ADS)

    Jia, Guangyin; Liao, Kaiju

    2017-03-01

    Currently, there are many 3D modeling software packages, such as 3DMAX and AUTOCAD, and the more popular BIM packages represented by REVIT. The CityEngine modeling software introduced in this paper can fully utilize existing GIS data and combine them with other built models to model the interior and exterior of buildings rapidly and in batches, thereby improving 3D modeling efficiency.

  11. Global review of open access risk assessment software packages valid for global or continental scale analysis

    NASA Astrophysics Data System (ADS)

    Daniell, James; Simpson, Alanna; Gunasekara, Rashmin; Baca, Abigail; Schaefer, Andreas; Ishizawa, Oscar; Murnane, Rick; Tijssen, Annegien; Deparday, Vivien; Forni, Marc; Himmelfarb, Anne; Leder, Jan

    2015-04-01

    Over the past few decades, a plethora of open access software packages for the calculation of earthquake, volcanic, tsunami, storm surge, wind and flood risk have been produced globally. As part of the World Bank GFDRR Review released at the Understanding Risk 2014 Conference, over 80 such open access risk assessment software packages were examined. Commercial software was not considered in the evaluation. A preliminary analysis was used to determine whether the 80 models were currently supported and if they were open access. This process was used to select a subset of 31 models that include 8 earthquake models, 4 cyclone models, 11 flood models, and 8 storm surge/tsunami models for more detailed analysis. By using multi-criteria decision analysis (MCDA) and simple descriptions of the software uses, the review allows users to select a few relevant software packages for their own testing and development. The detailed analysis evaluated the models on the basis of over 100 criteria and provides a synopsis of available open access natural hazard risk modelling tools. In addition, volcano software packages have since been added, making the compendium of risk software tools in excess of 100. There has been a huge increase in the quality and availability of open access/source software over the past few years. For example, private entities such as Deltares now have an open source policy regarding some flood models (NGHS). In addition, leaders in developing risk models in the public sector, such as Geoscience Australia (EQRM, TCRM, TsuDAT, AnuGA) or CAPRA (ERN-Flood, Hurricane, CRISIS2007 etc.), are launching and/or helping many other initiatives. As we achieve greater interoperability between modelling tools, we will also achieve a future wherein different open source and open access modelling tools will be increasingly connected and adapted towards unified multi-risk model platforms and highly customised solutions. 
It was seen that many software tools could be improved by enabling user-defined exposure and vulnerability. Without this function, many tools can only be used regionally and not at global or continental scale. It is becoming increasingly easy to use multiple packages for a single region and/or hazard to characterize the uncertainty in the risk, or to use them as checks on the sensitivities in the analysis. There is potential for valuable synergy between existing software: a number of open source packages could be combined to generate a multi-risk model with multiple views of a hazard. This extensive review has simply attempted to provide a platform for dialogue between all open source and open access software packages and, hopefully, to inspire collaboration between developers, given the great work done by all open access and open source developers.
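A weighted-sum scoring of software packages, the simplest MCDA variant, can be sketched as below. The packages, criteria and weights are hypothetical, standing in for the review's 100+ criteria:

```python
def mcda_rank(packages, weights):
    """Rank software packages by a weighted sum of normalized criterion
    scores. `packages` maps name -> {criterion: raw score}; higher raw
    scores are better."""
    criteria = list(weights)
    # Normalize each criterion to [0, 1] across packages (max normalization).
    maxima = {c: max(p[c] for p in packages.values()) or 1 for c in criteria}
    scores = {
        name: sum(weights[c] * p[c] / maxima[c] for c in criteria)
        for name, p in packages.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical criterion scores for three illustrative packages.
pkgs = {
    "pkg_a": {"documentation": 8, "support": 9, "coverage": 6},
    "pkg_b": {"documentation": 6, "support": 4, "coverage": 9},
    "pkg_c": {"documentation": 9, "support": 7, "coverage": 7},
}
w = {"documentation": 0.3, "support": 0.3, "coverage": 0.4}
print([name for name, _ in mcda_rank(pkgs, w)])
```

The weights encode the reviewer's priorities; changing them re-ranks the candidates, which is why MCDA results are best read as a shortlist rather than a verdict.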

  12. Models and metrics for software management and engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state-of-the-art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.
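    One classic family of resource-estimation models in this literature is Boehm's COCOMO. A minimal sketch of the basic model follows, using the published "organic mode" coefficients; the abstract does not name a specific model, so this is purely illustrative:

```python
def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    """Basic COCOMO (organic-mode defaults): estimated effort in
    person-months and schedule in calendar months from size in KLOC."""
    effort = a * kloc ** b      # person-months
    schedule = c * effort ** d  # months
    return effort, schedule

effort, months = cocomo_basic(32)  # a hypothetical 32 KLOC project
print(f"{effort:.1f} person-months over {months:.1f} months")
```

    The exponent b > 1 captures the diseconomy of scale: doubling the code size more than doubles the estimated effort.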

  13. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  14. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  15. Integrated geological-geophysical models of unstable slopes in seismogenic areas in NW and SE Europe

    NASA Astrophysics Data System (ADS)

    Mreyen, Anne-Sophie; Micu, Mihai; Onaca, Alexandru; Demoulin, Alain; Havenith, Hans-Balder

    2017-04-01

    We will present a series of new integrated 3D models of landslide sites that were investigated in distinct seismotectonic and climatic contexts: (1) along the Hockai Fault Zone in Belgium, with the 1692 Verviers earthquake (M 6-6.5) as the most prominent earthquake that occurred on that fault zone, and (2) in the seismic region of Vrancea, Romania, where four earthquakes with Mw > 7.4 have been recorded during the last two centuries. Both sites present deep-seated failures located in more or less seismically active areas. In such areas, slope stability analyses have to take into account possible seismic contributions to ground failure. Our investigation methods had to be adapted to capture the deep structure as well as the physico-mechanical characteristics that influence the dynamic behaviour of the landslide body. Field surveys included electrical resistivity tomography profiles, seismic refraction profiles (analysed in terms of both seismic P-wave tomography and surface waves), and ambient noise measurements to determine the soil resonance frequencies through H/V analysis, complemented by geological and geomorphic mapping. The H/V method, in particular, is increasingly used for landslide investigations and for sites marked by topographic relief (in addition to the more classical applications on flat sites). Results of data interpretation were compiled in 3D geological-geophysical models supported by high-resolution remote sensing data of the ground surface. Data and results were not only analysed in parallel or successively; to ensure full integration of all inputs and outputs, data fusion and geostatistical techniques were applied to establish closer links between them. Inside the 3D models, material boundaries were defined in terms of surfaces and volumes. These models were used as inputs for 2D dynamic numerical simulations performed with the UDEC (Itasca) software. 
For some sites, a full back-analysis was carried out to assess the possibility of a seismic triggering of the landslides.
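    The H/V step can be illustrated with a minimal sketch: given smoothed horizontal and vertical amplitude spectra, the resonance frequency is picked at the peak of the H/V ratio. All values below are synthetic; this is not the authors' processing chain:

```python
def hv_resonance(freqs, north, east, vertical):
    """H/V spectral ratio: merge the two horizontal amplitude spectra
    (quadratic mean), divide by the vertical spectrum, and return the
    frequency and amplitude of the peak ratio, the estimated soil
    resonance frequency."""
    ratios = [((n * n + e * e) / 2) ** 0.5 / v
              for n, e, v in zip(north, east, vertical)]
    peak = max(range(len(freqs)), key=lambda i: ratios[i])
    return freqs[peak], ratios[peak]

# Toy smoothed amplitude spectra with a horizontal amplification near 2 Hz.
freqs    = [0.5, 1.0, 2.0, 4.0, 8.0]
north    = [1.0, 1.2, 3.1, 1.1, 0.9]
east     = [1.0, 1.3, 2.9, 1.0, 0.8]
vertical = [1.0, 1.0, 1.0, 1.0, 1.0]
f0, amp = hv_resonance(freqs, north, east, vertical)
print(f"f0 = {f0} Hz, H/V = {amp:.2f}")
```

    On a real record the three component spectra would be computed from ambient noise windows and smoothed before taking the ratio; the peak frequency relates to the thickness and stiffness of the soft layer over bedrock.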

  16. SU-E-T-754: Three-Dimensional Patient Modeling Using Photogrammetry for Collision Avoidance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Popple, R; Cardan, R

    2015-06-15

    Purpose: To evaluate photogrammetry for creating a three-dimensional patient model. Methods: A mannequin was configured on the couch of a CT scanner to simulate a patient setup using an indexed positioning device. A CT fiducial was placed on the indexed CT table overlay at the reference index position. Two-dimensional photogrammetry targets were placed on the table in known positions. A digital SLR camera was used to obtain 27 images from different positions around the CT table. The images were imported into a commercial photogrammetry package and a 3D model constructed. Each photogrammetry target was identified on 2 to 5 images. The CT DICOM metadata and the position of the CT fiducial were used to calculate the coordinates of the photogrammetry targets in the CT image frame of reference. The coordinates were transferred to the photogrammetry software to orient the 3D model. The mannequin setup was transferred to the treatment couch of a linear accelerator and positioned at isocenter using in-room lasers. The treatment couch coordinates were noted and compared with prediction. The collision-free regions were measured over the full range of gantry and table motion and were compared with predictions obtained using a general-purpose polygon interference algorithm. Results: The reconstructed 3D model consisted of 180000 triangles. The difference between the predicted and measured couch positions was 5 mm, 1 mm, and 1 mm in the longitudinal, lateral, and vertical directions, respectively. The collision prediction tested 64620 gantry-table combinations in 11.1 seconds. The accuracy was 96.5%, with false positive and negative results occurring at the boundaries of the collision space. Conclusion: Photogrammetry can be used as a tool for collision avoidance during treatment planning. The results indicate that a buffer zone is necessary to avoid false negatives at the boundary of the collision-free zone. Testing with human patients is underway. 
Research partially supported by a grant from Varian Medical Systems.
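    The collision sweep can be sketched with a deliberately simplified geometry: iterate over gantry and couch angles and test a clearance condition between the gantry head and a patient model. The spheres and dimensions below are hypothetical stand-ins for the study's polygon interference test:

```python
import math

# Toy geometry (metres): gantry head on a 1.0 m circle around isocenter;
# patient modeled as a sphere centred 0.2 m from isocenter along the couch
# axis, which couch rotation swings in the horizontal plane.
GANTRY_ARM, HEAD_R, PATIENT_R, OFFSET = 1.0, 0.5, 0.35, 0.2

def is_clear(gantry_deg, couch_deg):
    """True when the sphere surfaces do not touch for this combination."""
    g, c = math.radians(gantry_deg), math.radians(couch_deg)
    head = (GANTRY_ARM * math.sin(g), 0.0, GANTRY_ARM * math.cos(g))
    body = (OFFSET * math.sin(c), OFFSET * math.cos(c), 0.0)
    return math.dist(head, body) > HEAD_R + PATIENT_R

# Sweep the full range of gantry and couch angles, as in the abstract.
combos = [(gd, cd) for gd in range(0, 360, 5) for cd in range(-90, 95, 5)]
clear = sum(is_clear(gd, cd) for gd, cd in combos)
print(f"{clear}/{len(combos)} combinations collision-free")
```

    A production implementation would test triangle meshes rather than spheres, and shrink the clearance threshold by a buffer margin to absorb the boundary errors the abstract reports.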

  17. Model Package Report: Central Plateau Vadose Zone Geoframework Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springer, Sarah D.

    The purpose of the Central Plateau Vadose Zone (CPVZ) Geoframework model (GFM) is to provide a reasonable, consistent, and defensible three-dimensional (3D) representation of the vadose zone beneath the Central Plateau at the Hanford Site to support the Composite Analysis (CA) vadose zone contaminant fate and transport models. The GFM is a 3D representation of the subsurface geologic structure. From this 3D geologic model, exported results in the form of point, surface, and/or volumes are used as inputs to populate and assemble the various numerical model architectures, providing a 3D-layered grid that is consistent with the GFM. The objective of this report is to define the process used to produce a hydrostratigraphic model for the vadose zone beneath the Hanford Site Central Plateau and the corresponding CA domain.

  18. Identifying Developmental Zones in Maize Lateral Root Cell Length Profiles using Multiple Change-Point Models

    PubMed Central

    Moreno-Ortega, Beatriz; Fort, Guillaume; Muller, Bertrand; Guédon, Yann

    2017-01-01

    The identification of the limits between the cell division, elongation, and mature zones in the root apex is still a matter of controversy when methods based on cellular features, molecular markers, or kinematics are compared, while methods based on cell length profiles have remained comparatively underexplored. Segmentation models were developed to identify developmental zones within a root apex on the basis of epidermal cell length profiles. Heteroscedastic piecewise linear models were estimated for maize lateral roots of various lengths of both wild type and two mutants affected in auxin signaling (rtcs and rum-1). The outputs of these individual root analyses, combined with morphological features (first root hair position and root diameter), were then globally analyzed using principal component analysis. Three zones, corresponding to the division zone, the elongation zone and the mature zone, were identified in most lateral roots, while the division zone, and sometimes the elongation zone, was missing in arrested roots. Our results are consistent with an auxin-dependent coordination between cell flux, cell elongation and cell differentiation. The proposed segmentation models could extend our knowledge of developmental regulations in longitudinally organized plant organs such as roots, monocot leaves or internodes. PMID:29123533
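    The segmentation idea can be illustrated with a brute-force change-point fit. The sketch below uses a piecewise-constant least-squares model, a simplification of the paper's heteroscedastic piecewise linear models, on a synthetic profile:

```python
def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def two_changepoints(profile):
    """Brute-force two-change-point segmentation of a cell length profile,
    minimising total within-segment squared error. Returns the start
    indices of the second and third zones."""
    n = len(profile)
    return min(((i, j) for i in range(1, n - 1) for j in range(i + 1, n)),
               key=lambda ij: sse(profile[:ij[0]]) +
                              sse(profile[ij[0]:ij[1]]) +
                              sse(profile[ij[1]:]))

# Toy cell length profile: short cells (division zone), intermediate
# (elongation zone), long cells (mature zone).
profile = [10, 10, 11, 10, 50, 52, 49, 51, 150, 149, 151, 150]
print(two_changepoints(profile))
```

    A dynamic-programming formulation would make this scale to long profiles and more change points; the heteroscedastic extension additionally lets each zone have its own residual variance.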

  19. Software Technology for Adaptable, Reliable Systems (STARS)

    DTIC Science & Technology

    1994-03-25

    Cost models cited include Timeline (3), SECOMO (3), SEER (3), GSFC Software Engineering Lab Model (1), SLIM (4), SEER-SEM (1), SPQR (2), PRICE-S (2), internally-developed models (3), APMSS (1), SASET (Software Architecture Sizing Estimating Tool) (2), MicroMan II (2), and LCM (Logistics Cost Model) (2).

  20. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    ERIC Educational Resources Information Center

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software-intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with ongoing academic research, MDSD is increasingly applied in industrial practice. After being accepted both by a broad community of…

  1. Variable-intercept panel model for deformation zoning of a super-high arch dam.

    PubMed

    Shi, Zhongwen; Gu, Chongshi; Qin, Dong

    2016-01-01

    This study determines dam deformation similarity indexes based on an analysis of deformation zoning features and panel data clustering theory, with comprehensive consideration of the actual deformation law of super-high arch dams and the spatial-temporal features of dam deformation. Measurement methods for these indexes are studied. Based on the established deformation similarity criteria, the principle used to determine the number of dam deformation zones is constructed through the entropy weight method. This study proposes a deformation zoning method for super-high arch dams and its implementation steps, analyzes the effect of special influencing factors of different dam zones on the deformation, introduces dummy variables that represent the special effect on dam deformation, and establishes a variable-intercept panel model for deformation zoning of super-high arch dams. Based on different patterns of the special effect in the variable-intercept panel model, two panel analysis models were established to monitor fixed and random effects of dam deformation. The Hausman test for model selection and a model effectiveness assessment method are discussed. Finally, the effectiveness of the established models is verified through a case study.
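    The entropy weight method mentioned above can be sketched as follows: indicators whose values vary more across observations receive larger weights. The indicator matrix below is hypothetical, not the paper's monitoring data:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows are observations, columns are
    indicators. An indicator whose values are nearly uniform across
    observations carries little information and gets a small weight."""
    n = len(matrix)
    m = len(matrix[0])
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Normalized Shannon entropy of the column, in [0, 1].
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergence.append(1 - e)
    s = sum(divergence)
    return [d / s for d in divergence]

# Hypothetical deformation similarity indexes at four measurement points:
# the first indicator varies strongly, the second hardly at all.
data = [[0.90, 120], [0.80, 118], [0.85, 122], [0.20, 121]]
w = entropy_weights(data)
print([round(x, 3) for x in w])
```

    The first indicator receives nearly all the weight because it discriminates between the measurement points, which is exactly the behaviour the zoning criterion exploits.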

  2. Modeling and Simulation for a Surf Zone Robot

    DTIC Science & Technology

    2012-12-14

    of-freedom surf zone robot is developed and tested with a physical test platform and with a simulated robot in Robot Operating System. Derived from...terrain. The application of the model to future platforms is analyzed and a broad examination of the current state of surf zone robotic systems is...public release; distribution is unlimited MODELING AND SIMULATION FOR A SURF ZONE ROBOT Eric Shuey Lieutenant, United States Navy B.S., Systems

  3. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the problem of industry software trustworthiness, an approach that constructs an industry software trustworthiness criterion around business trustworthiness is proposed. Based on the triangle model of "trustworthy grade definition-trustworthy evidence model-trustworthy evaluating", business trustworthiness is embodied in each aspect of the trustworthy triangle model for a specific industry software system, a power producing management system (PPMS). Business trustworthiness is the centre of the constructed industry trustworthy software criterion. By fusing international standards and industry rules, the constructed criterion strengthens operability and reliability, and its quantitative evaluating method makes the evaluation results intuitive and comparable.

  4. Automated support for experience-based software management

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed by all software development organizations.

  5. Fold-Thrust mapping using photogrammetry in Western Champsaur basin, SE France

    NASA Astrophysics Data System (ADS)

    Totake, Y.; Butler, R.; Bond, C. E.

    2016-12-01

    There is an increasing demand for high-resolution geometric data for outcropping geological structures, not only to test models of their formation and evolution but also to create synthetic seismic visualisations for comparison with subsurface data. High-resolution 3D scenes reconstructed by modern photogrammetry offer an efficient toolbox for such work. When integrated with direct field measurements and observations, these products can be used to build geological interpretations and models. Photogrammetric techniques using standard equipment are ideally suited to working in the high mountain terrain that commonly offers the best outcrops: all equipment is readily portable and, cloud cover aside, the method is not subject to the meteorological and legal restrictions that can affect some airborne approaches. The workflows and approaches for generating geological models utilising such photogrammetry techniques are the focus of our contribution. Our case study comes from SE France, where early Alpine foredeep sediments have been deformed into arrays of fold-thrust complexes. Over 1500 m of vertical relief provides excellent outcrop control, with surrounding hillsides providing vantage points for ground-based photogrammetry. We collected over 9,400 photographs across the fold-thrust array using a handheld digital camera from 133 ground locations that were individually georeferenced. We processed the photographic images within the software PhotoScan-Pro to build 3D landscape scenes. The built photogrammetric models were then imported into the software Move, along with field measurements, to map faults and sedimentary layers and to produce geological cross sections and 3D geological surfaces. 
Polylines of sediment beds and faults traced on our photogrammetry models allow interpretation of the pseudo-3D geometry of the deformation structures, and enable prediction of dips and strikes in inaccessible field areas, allowing the complex geometries of the thrust faults and deformed strata to be mapped in detail. The resulting structural geometry of the thrust zones provides an exceptional analogue for inaccessible subsurface fold-thrust structures, for which a clear seismic image is often difficult to obtain.
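Predicting dips and strikes from digitized polylines reduces, in the simplest case, to fitting a plane through points on a bedding trace. A minimal sketch with synthetic points in an east-north-up frame (not the authors' Move workflow):

```python
import math

def strike_dip(p1, p2, p3):
    """Strike, dip, and dip direction (degrees) of the plane through three
    (east, north, up) points, e.g. digitised on a bedding trace."""
    u = [b - a for a, b in zip(p1, p2)]
    v = [b - a for a, b in zip(p1, p3)]
    # Plane normal = u x v, forced to point upward.
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    if n[2] < 0:
        n = [-c for c in n]
    dip = math.degrees(math.atan2(math.hypot(n[0], n[1]), n[2]))
    dip_dir = math.degrees(math.atan2(n[0], n[1])) % 360
    strike = (dip_dir - 90) % 360  # right-hand-rule strike
    return strike, dip, dip_dir

# Synthetic bed dipping 30 degrees due east (drop of 10*tan(30) over 10 m).
s, d, dd = strike_dip((0, 0, 0), (0, 10, 0), (10, 0, -5.7735))
print(f"strike {s:.0f}, dip {d:.0f} toward {dd:.0f}")
```

In practice one would least-squares-fit a plane through many digitized vertices rather than just three, which also yields a misfit measure for how planar the traced bed really is.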

  6. Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models

    NASA Astrophysics Data System (ADS)

    Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto

    In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, combine visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we evaluate the usefulness of dimods. The results show that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over traditional description methods for these types of models.

  7. Process-based modeling of temperature and water profiles in the seedling recruitment zone: Part I. Model validation

    USDA-ARS?s Scientific Manuscript database

    Process-based modeling provides detailed spatial and temporal information of the soil environment in the shallow seedling recruitment zone across field topography where measurements of soil temperature and water may not sufficiently describe the zone. Hourly temperature and water profiles within the...

  8. New approach to effective diffusion coefficient evaluation in the nanostructured two-phase media

    NASA Astrophysics Data System (ADS)

    Lyashenko, Yu. O.; Liashenko, O. Y.; Morozovich, V. V.

    2018-03-01

    The most widely used basic and combined models for evaluating the effective diffusion parameters of an inhomogeneous two-phase zone are reviewed. A new combined effective-medium model is analyzed for describing diffusion processes in two-phase zones. In this model, the effective diffusivity depends on the growth kinetic coefficients of each phase, the volume fractions of the phases, and an additional parameter that characterizes the structure type of the two-phase zone. Our combined model describes two-phase zone evolution in binary systems based on consideration of the diffusion fluxes through both phases. The lattice Monte Carlo method was used to test the validity of different phenomenological models for evaluating the effective diffusivity in nanostructured two-phase zones with different structural morphologies.
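    As an illustration of how a single structure parameter can interpolate between limiting microstructures, a power-law (Lichtenecker-type) mixing rule is sketched below. This is a generic effective-medium form, not the specific combined model of the paper:

```python
def effective_diffusivity(d1, d2, f1, n):
    """Power-law mixing rule D_eff**n = f1*D1**n + f2*D2**n.
    n = 1 gives the parallel (upper Wiener) bound, n = -1 the series
    (lower) bound; intermediate n plays the role of a structure
    parameter, with n -> 0 reducing to the geometric mean."""
    f2 = 1.0 - f1
    if n == 0:  # limiting case: log-mixing (geometric mean)
        return d1 ** f1 * d2 ** f2
    return (f1 * d1 ** n + f2 * d2 ** n) ** (1.0 / n)

# Two phases with diffusivities differing by a factor of 100 (m^2/s).
d1, d2, f1 = 1e-14, 1e-16, 0.5
for n in (1, 0.5, 0, -0.5, -1):
    print(n, effective_diffusivity(d1, d2, f1, n))
```

    Any physically admissible effective diffusivity must fall between the series and parallel bounds, so the structure parameter effectively selects a point inside that envelope.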

  9. Bacterial communities in soil samples from the Mingyong Glacier of southwestern China.

    PubMed

    Li, Haoyu; Taj, Muhammad Kamran; Ji, Xiuling; Zhang, Qi; Lin, Liangbing; Zhou, Zhimei; Wei, Yunlin

    2017-05-01

    The present study was an effort to determine the bacterial diversity of soils of the Mingyong Glacier, located in the Meili Snow Mountains of southwestern China. Mingyong Glacier has different climatic zones within a very narrow area, and bacterial community diversity in this low-temperature area remains largely unknown. In this study, soil samples were collected from four different climatic zones: M11A (dry warm valley), M14 (forest), M15 (grassland), and M16 (glacier). Phylogenetic analysis based on the 16S rRNA gene V6 hypervariable region showed high bacterial abundance in the glacier. The number of operational taxonomic units ranged from 2.24×10³ to 5.56×10³ in the soil samples. Statistical analysis of the 16S rRNA gene clone libraries showed that bacterial diversity in zones M11A, M14 and M16 is higher than in zone M15. The bacterial community structures are clearly distinguishable, and phylogenetic analysis showed that the predominant phyla in the Mingyong Glacier were Proteobacteria, Deinococcus-Thermus, Firmicutes, Actinobacteria, and Nitrospirae. Seventy-nine different orders were isolated from the four zones. Diversity index analysis confirmed the reduced bacterial diversity and community distribution related to anthropogenic perturbations in zone M15, while the diversity indices of the other three zones were satisfactory. The results suggest that bacterial diversity and distribution analyses using the bacterial 16S rRNA gene V6 hypervariable region were successful, and that bacterial communities in this area not only share bacterial phyla with other glaciers but also harbour their own rare species.
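    A diversity index analysis of OTU counts typically uses the Shannon-Wiener index; a minimal sketch with hypothetical abundance counts (the study's actual counts are not given in the abstract):

```python
import math

def shannon_index(counts):
    """Shannon-Wiener diversity index H' = -sum(p_i * ln p_i) over the
    relative abundances p_i of each OTU."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical OTU abundance counts for two soil zones.
grassland = [80, 10, 5, 3, 2]    # one dominant OTU -> lower diversity
glacier = [25, 20, 20, 18, 17]   # even community -> higher diversity
print(round(shannon_index(grassland), 2), round(shannon_index(glacier), 2))
```

    The index rises both with the number of OTUs and with the evenness of their abundances, which is why a perturbed, dominance-skewed community like the toy grassland scores lower.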

  10. Capability Maturity Model (CMM) for Software Process Improvements

    NASA Technical Reports Server (NTRS)

    Ling, Robert Y.

    2000-01-01

    This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.

  11. The numerical simulation of heat transfer during a hybrid laser-MIG welding using equivalent heat source approach

    NASA Astrophysics Data System (ADS)

    Bendaoud, Issam; Matteï, Simone; Cicala, Eugen; Tomashchuk, Iryna; Andrzejewski, Henri; Sallamand, Pierre; Mathieu, Alexandre; Bouchaud, Fréderic

    2014-03-01

    The present study is dedicated to the numerical simulation of an industrial case of hybrid laser-MIG welding of high-thickness duplex steel UR2507Cu with a Y-shaped chamfer geometry. It consists of simulating the heat transfer phenomena using an equivalent heat source approach implemented in the finite element software COMSOL Multiphysics. A numerical exploratory design method is used to identify the heat source parameters that minimize the difference between the numerical results and the experiment, namely the shape of the welded zone and the temperature evolution at different locations. The obtained results were found to be in good agreement with experiment, both for the melted zone shape and the thermal history.
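    A widely used equivalent heat source in weld simulation is the Goldak double-ellipsoid; the sketch below evaluates its power density for one quadrant with illustrative, uncalibrated parameters (the paper's calibrated source is not given in the abstract):

```python
import math

def goldak_q(x, y, z, Q, a, b, c, f=1.0):
    """Goldak ellipsoidal power density (W/m^3) for one quadrant of the
    double-ellipsoid equivalent heat source: Gaussian decay along all
    three semi-axes a, b, c, with fraction f of absorbed power Q."""
    coeff = 6.0 * math.sqrt(3.0) * f * Q / (a * b * c * math.pi ** 1.5)
    return coeff * math.exp(-3.0 * (x * x / (a * a)
                                    + y * y / (b * b)
                                    + z * z / (c * c)))

# Illustrative parameters: 4 kW absorbed power, 4 x 3 x 6 mm ellipsoid.
Q, a, b, c = 4000.0, 4e-3, 3e-3, 6e-3
print(f"peak density {goldak_q(0, 0, 0, Q, a, b, c):.3e} W/m^3")
```

    In a calibration loop such as the one the abstract describes, the semi-axes (and the front/rear power fractions of the full double ellipsoid) are the free parameters adjusted until the simulated melted zone matches the macrograph.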

  12. Development of a zoning-based environmental-ecological-coupled model for lakes to assess lake restoration effect

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Zou, Changxin; Zhao, Yanwei

    2017-04-01

    Environmental/ecological models are widely used for lake management as they provide a means to understand physical, chemical and biological processes in highly complex ecosystems. Most research has focused on developing environmental (water quality) and ecological models separately. Few studies have coupled the two, and in these limited coupled models a lake was treated as a whole (i.e., one well-mixed box), which is appropriate for small lakes but not sufficient to capture spatial variations within middle-scale or large-scale lakes. This paper seeks to establish a zoning-based environmental-ecological coupled model for a lake. Baiyangdian Lake, the largest freshwater lake in northern China, was adopted as the study case. Coupled lake models, comprising a hydrodynamics and water quality model established in MIKE21 and a compartmental ecological model built with the STELLA software, were established for the middle-sized Baiyangdian Lake to simulate spatial variations in ecological conditions. On the basis of the flow field distribution generated by the MIKE21 hydrodynamic model, four water area zones were used for compartmental ecological model calibration and validation. The results revealed that the coupled models reasonably reflect the changes in the key state variables, although some state variables are not well represented due to the low quality of the field monitoring data. Monitoring sites in a compartment may not be representative of the water quality and ecological conditions of the entire compartment, even though that is the intention of compartment-based model design. For some periods there was only one ecological observation from a single monitoring site; this single-measurement issue may cause large discrepancies, particularly when the sampled site is not representative of the whole compartment. 
The coupled models have been applied to simulate the spatial variation trends of ecological conditions under ecological water supplementation, as an example of their use in lake restoration and management. The simulation results indicate that the models can provide a useful tool for lake restoration and management. The simulated spatial variation trends can provide a foundation for establishing permissible ranges for a selected set of water quality indices under a series of management measures, such as watershed pollution load control and ecological water transfer. Meanwhile, the coupled models can help us understand the processes taking place in the lake ecosystem and the interactions between its components and external conditions. Taken together, the proposed models show promise as middle-scale or large-scale lake management tools for pollution load control and ecological water transfer, quantifying the implications of proposed future water management decisions.
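The skeleton of a compartmental ecological model can be sketched as zone state variables coupled by exchange flows and integrated with forward Euler. All rates below are hypothetical, not calibrated to Baiyangdian Lake:

```python
def simulate(conc, exchange, uptake, inflow, dt=0.1, steps=1000):
    """Forward-Euler integration of a linear compartment model: conc[i]
    evolves under first-order uptake, external inflow, and pairwise
    exchange flows between zones (exchange[i][j] = rate from i to j)."""
    n = len(conc)
    c = list(conc)
    for _ in range(steps):
        dc = [inflow[i] - uptake[i] * c[i] for i in range(n)]
        for i in range(n):
            for j in range(n):
                if i != j:
                    dc[j] += exchange[i][j] * c[i]
                    dc[i] -= exchange[i][j] * c[i]
        c = [ci + dt * dci for ci, dci in zip(c, dc)]
    return c

# Two zones: an inflow-loaded open-water zone exchanging with a reed-bed
# zone that takes nutrients up faster (rates per day, hypothetical).
final = simulate(conc=[1.0, 1.0],
                 exchange=[[0.0, 0.05], [0.05, 0.0]],
                 uptake=[0.1, 0.4],
                 inflow=[0.2, 0.0])
print([round(ci, 3) for ci in final])
```

In a real zoning-based model the exchange rates would be derived from the fluxes computed by the hydrodynamic model rather than chosen by hand, which is exactly the coupling the abstract describes.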

  13. Improved High Resolution Models of Subduction Dynamics: Use of transversely isotropic viscosity with a free-surface

    NASA Astrophysics Data System (ADS)

    Liu, X.; Gurnis, M.; Stadler, G.; Rudi, J.; Ratnaswamy, V.; Ghattas, O.

    2017-12-01

    Dynamic topography, or uncompensated topography, is controlled by internal dynamics and provides constraints on the buoyancy structure and rheological parameters of the mantle. Compared with other surface manifestations such as the geoid, dynamic topography is very sensitive to shallower and more regional mantle structure. For example, the significant dynamic topography above subduction zones potentially provides a rich mine for inferring rheological and mechanical properties such as plate coupling, flow, and lateral viscosity variations, all critical in plate tectonics. However, employing subduction zone topography in inversion studies requires a better understanding of the topography from forward models, especially the influence of the viscosity formulation, numerical resolution, and other factors. One common approach to formulating a fault between the subducted slab and the overriding plate in viscous flow models assumes a thin weak zone. However, due to the large lateral variation in viscosity, topography from free-slip numerical models typically has an artificially large magnitude as well as high-frequency undulations over the subduction zone, which adds to the difficulty of comparing model results with observations. In this study, we formulate a weak zone with a transversely isotropic (TI) viscosity, where the tangential viscosity is much smaller than the viscosity in the normal direction. As with isotropic weak zone models, TI models effectively decouple subducted slabs from the overriding plates. However, we find that the topography in TI models is greatly reduced compared with that in weak zone models assuming an isotropic viscosity. Moreover, the artificial 'toothpaste squeezing' effect observed in isotropic weak zone models vanishes in TI models, although the difference becomes less significant when the dip angle is small. 
We also implement a free-surface condition in our numerical models, which has a smoothing effect on the topography. With the improved model configuration, we can use the adjoint inversion method in a high-resolution model and employ topography in addition to other observables such as the plate motion to infer critical mechanical and rheological parameters in the subduction zone.

  14. Constraints of subducted slab geometries on trench migration and subduction velocities: flat slabs and slab curtains in the mantle under Asia

    NASA Astrophysics Data System (ADS)

    Wu, J. E.; Suppe, J.; Renqi, L.; Lin, C.; Kanda, R. V.

    2013-12-01

    The past locations, shapes and polarity of subduction trenches provide first-order constraints for plate tectonic reconstructions. Analogue and numerical models of subduction zones suggest that relative subducting (Vs) and overriding (Vor) plate velocities may strongly influence final subducted slab geometries. Here we have mapped the 3D geometries of subducted slabs in the upper and lower mantle of Asia from global seismic tomography. We have incorporated these slabs into plate tectonic models, which allows us to infer the subducting and overriding plate velocities. We describe two distinct slab geometry styles, 'flat slabs' and 'slab curtains', and show their implications for paleo-trench positions and subduction geometries in plate tectonic reconstructions. When compared to analogue and numerical models, the mapped slab styles show similarities to modeled slabs that occupy very different locations within Vs:Vor parameter space. 'Flat slabs' include large swaths of sub-horizontal slabs in the lower mantle that underlie the well-known northward paths of India and Australia from Eastern Gondwana, viewed in a moving hotspot reference frame. At India the flat slabs account for a significant proportion of the predicted lost Ceno-Tethys Ocean since ~100 Ma, whereas at Australia they record the existence of a major 8000 km by 2500-3000 km ocean that existed at ~43 Ma between East Asia, the Pacific and Australia. Plate reconstructions incorporating the slab constraints imply these flat slab geometries were generated when a continent overran oceanic lithosphere to produce rapid trench retreat, or in other words, when subducting and overriding velocities were approximately equal (i.e. Vs ~ Vor). 'Slab curtains' include subvertical Pacific slabs near the Izu-Bonin and Marianas trenches that extend from the surface down to 1500 km in the lower mantle and are 400 to 500 km thick. Reconstructed slab lengths were assessed from tomographic volumes calculated at serial cross-sections.
The 'slab curtain' geometry and restored slab lengths indicate a nearly stationary Pacific trench since ~43 Ma. In contrast to the flat slabs, here the reconstructed subduction zone had large subducting plate velocities relative to very small overriding plate velocities (i.e. Vs >> Vor). In addition to flat slabs and slab curtains, we also find other, less widespread local subduction settings that lie at other locations in Vs:Vor parameter space or involved other processes. Slabs were mapped using Gocad software. Mapped slabs were restored to a spherical model Earth surface by two approaches: unfolding (i.e. piecewise flattening) to minimize shape and area distortions, and evaluating mapped slab volumes. GPlates software was used to integrate the mapped slabs with plate tectonic reconstructions.

  15. Dynamic Model of Basic Oxygen Steelmaking Process Based on Multi-zone Reaction Kinetics: Model Derivation and Validation

    NASA Astrophysics Data System (ADS)

    Rout, Bapin Kumar; Brooks, Geoff; Rhamdhani, M. Akbar; Li, Zushu; Schrama, Frank N. H.; Sun, Jianjun

    2018-04-01

    A multi-zone kinetic model coupled with a dynamic slag generation model was developed for the simulation of hot metal and slag composition during basic oxygen furnace (BOF) operation. Three reaction zones, (i) the jet impact zone, (ii) the slag-bulk metal zone, and (iii) the slag-metal-gas emulsion zone, were considered in the calculation of overall refining kinetics. In the rate equations, the transient rate parameters were mathematically described as functions of process variables. A micro- and macroscopic rate calculation methodology (micro-kinetics and macro-kinetics) was developed to estimate the total refining contributed by the recirculating metal droplets passing through the slag-metal emulsion zone. The micro-kinetics involves developing the rate equation for individual droplets in the emulsion. Mathematical models for the size distribution of initial droplets, the kinetics of simultaneous refining of elements, the residence time in the emulsion, and dynamic interfacial area change were established in the micro-kinetic model. In the macro-kinetics calculation, a droplet generation model was employed and the total amount of refining by the emulsion was calculated by summing the refining from the entire population of returning droplets. A dynamic FetO generation model based on an oxygen mass balance was developed and coupled with the multi-zone kinetic model. The effect of post-combustion on the evolution of slag and metal composition was investigated. The model was applied to a 200-ton top-blowing converter, and the simulated metal and slag compositions were found to be in good agreement with the measured data. The post-combustion ratio was found to be an important factor in controlling the FetO content of the slag and the kinetics of Mn and P in the BOF process.
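
The macro-kinetic summation described above, in which total refining is obtained by summing contributions from the entire population of returning droplets, can be sketched numerically. All parameters below (droplet size distribution, residence times, rate constant, impurity level) are invented for illustration and are not the paper's values or rate equations:

```python
import numpy as np

# Illustrative macro-kinetics sketch: each droplet ejected into the
# emulsion refines an impurity by first-order kinetics over its
# residence time; total refining is the sum over the droplet population.
rng = np.random.default_rng(0)

n_droplets = 10_000
d = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n_droplets)  # diameter, m (assumed)
mass = 7000.0 * (np.pi / 6.0) * d**3          # droplet mass, kg (rho ~ 7000 kg/m^3)
tau = rng.uniform(1.0, 10.0, size=n_droplets)  # residence time in emulsion, s (assumed)
k = 0.5                                        # first-order rate constant, 1/s (assumed)

c0 = 0.04                          # initial impurity mass fraction (e.g., ~4 wt pct C)
c_return = c0 * np.exp(-k * tau)   # impurity left when the droplet returns to the bath

# Total impurity mass removed by the droplet population
removed = float(np.sum(mass * (c0 - c_return)))
print(f"impurity removed by emulsion: {removed:.3f} kg")
```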

  16. Effort Drivers Estimation for Brazilian Geographically Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio

    To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with minimum use of resources. A good estimate is key to achieving this goal. Several companies have started to work with geographically distributed teams to reduce costs and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and may differ in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 distributed software development (DSD) projects from 10 distinct companies were collected. The analysis identifies drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.

  17. The integrated analyses of digital field mapping techniques and traditional field methods: implications from the Burdur-Fethiye Shear Zone, SW Turkey as a case-study

    NASA Astrophysics Data System (ADS)

    Elitez, İrem; Yaltırak, Cenk; Zabcı, Cengiz; Şahin, Murat

    2015-04-01

    Precise geological mapping is one of the most important issues in geological studies. Documenting the spatial distribution of geological bodies and their contacts plays a crucial role in interpreting the tectonic evolution of any region. Although traditional field techniques are still accepted as the most fundamental tools in the construction of geological maps, we suggest that integrating digital technologies with the classical methods significantly increases the resolution and quality of such products. We follow these steps in integrating digital data with traditional field observations. First, we create a digital elevation model (DEM) of the region of interest by interpolating the digital contours of 1:25000-scale topographic maps to a ground pixel resolution of 10 m. The non-commercial Google Earth satellite imagery and geological maps from previous studies are draped over the interpolated DEMs in the second stage. The integration of all spatial data is done using the market-leading GIS software, ESRI ArcGIS. We make a preliminary interpretation of major structures such as tectonic lineaments and stratigraphic contacts. These preliminary maps are checked and precisely coordinated during field studies using mobile tablets and/or phablets with GPS receivers. The same devices are also used to measure and record the geologic structures of the study region. Finally, all digitally collected measurements and observations are added to the GIS database and we finalise our geological map with all available information. We applied this integrated method to map the Burdur-Fethiye Shear Zone (BFSZ) in southwest Turkey. The BFSZ is an active sinistral shear zone, 60 to 90 km wide, that extends about 300 km on land between Suhut-Cay in the northeast and Köyceğiz Lake-Kalkan in the southwest.
Numerous studies suggest contradictory models not only for the evolution but also for the fault geometry of this wide deformation zone. In our study, we have mapped this complicated region since 2008 using the data and steps described briefly above. Our joint analyses show that there is no continuous single and narrow fault, the Burdur-Fethiye Fault, as previously suggested by many researchers. Instead, the whole region is deformed under oblique-sinistral shearing with a considerable amount of extension, which causes a counterclockwise rotation within the zone.

  18. Analyse et design aerodynamique haute-fidelite de l'integration moteur sur un avion BWB [High-fidelity aerodynamic analysis and design of engine integration on a BWB aircraft]

    NASA Astrophysics Data System (ADS)

    Mirzaei Amirabad, Mojtaba

    The BWB (Blended Wing Body) is an innovative type of aircraft based on the flying wing concept. In this configuration, the wing and the fuselage are blended together smoothly. The BWB offers economic and environmental advantages by reducing fuel consumption through improved aerodynamic performance. In this project, the goal is to improve aerodynamic performance by optimizing the main body of the BWB as it comes from conceptual design. The high-fidelity methods applied in this project have been addressed less frequently in the literature. This research develops an automatic optimization procedure to reduce the drag force on the main body. The optimization is carried out in two main stages: before and after engine installation. Our objective is to minimize drag while taking several constraints into account in high-fidelity optimization. The commercial software Isight is chosen as the optimizer, within which MATLAB is called to start the optimization process. Geometry is generated using ANSYS DesignModeler, an unstructured mesh is created with ANSYS Meshing, and CFD calculations are performed with ANSYS Fluent. All of these tools are coupled in the ANSYS Workbench environment, which is called by MATLAB. High-fidelity methods are used during the optimization by solving the Navier-Stokes equations. To verify the results, a finer structured mesh is created with ICEM software and used at each stage of the optimization. The first stage is a 3D optimization of the surface of the main body, before adding the engine. The optimized case is then used as input for the second stage, in which the nacelle is added. The study achieves an appreciable reduction in the drag coefficient for the BWB without the nacelle. In the second stage (adding the nacelle), drag is also reduced by performing a local optimization. Furthermore, the flow separation created in the main body-nacelle zone is reduced.

  19. Atomistic Cohesive Zone Models for Interface Decohesion in Metals

    NASA Technical Reports Server (NTRS)

    Yamakov, Vesselin I.; Saether, Erik; Glaessgen, Edward H.

    2009-01-01

    Using a statistical mechanics approach, a cohesive-zone law in the form of a traction-displacement constitutive relationship characterizing the load transfer across the plane of a growing edge crack is extracted from atomistic simulations for use within a continuum finite element model. The methodology for the atomistic derivation of a cohesive-zone law is presented. This procedure can be implemented to build cohesive-zone finite element models for simulating fracture in nanocrystalline or ultrafine grained materials.
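
A traction-displacement (cohesive-zone) law of the kind discussed above can be illustrated with a common analytic stand-in. The exponential form T(d) = s_max * (d/d0) * exp(1 - d/d0) and the parameter values below are assumptions for illustration, not the atomistically derived law of the paper:

```python
import numpy as np

# Illustrative exponential cohesive-zone law: traction rises to a peak
# strength s_max at the characteristic opening d0, then decays as the
# crack faces separate further.
def traction(d, s_max=2.5e9, d0=5e-10):
    """Normal traction (Pa) across the crack plane at opening d (m)."""
    x = d / d0
    return s_max * x * np.exp(1.0 - x)

d = np.linspace(0.0, 10 * 5e-10, 1001)   # openings from 0 to 10*d0
T = traction(d)
i_peak = int(np.argmax(T))
print(f"peak traction {T[i_peak]:.3e} Pa at opening {d[i_peak]:.2e} m")

# Work of separation = area under the traction-displacement curve
work = float(np.sum(0.5 * (T[1:] + T[:-1]) * np.diff(d)))
print(f"work of separation ~ {work:.3e} J/m^2")
```

For this form, the peak traction equals s_max exactly at d = d0, which is a quick consistency check on the implementation.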

  20. Spatial pattern recognition of seismic events in South West Colombia

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber

    2013-09-01

    Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative and subjective analysis of data. Spatial pattern recognition provides a well-founded means of obtaining relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data from the South West Colombian seismic database. In this research, a clustering tendency analysis validates whether the seismic database possesses a clustering structure. A non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of centroids, we propose a centroid initialization methodology that generates partitions stable with respect to initialization. As a result of this work, a public software tool provides the user with the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South-West Colombia. The clustering analysis provides a quantitative location and dispersion of seismogenic zones that facilitates seismological interpretation of seismic activity in South West Colombia.
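
The clustering step can be sketched with a minimal fuzzy c-means implementation. The epicentre coordinates below are synthetic, and the deterministic quantile-based initialization is only a stand-in for the paper's initialization methodology; it illustrates the idea of partitions that do not depend on random starting centroids:

```python
import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, n_iter=100):
    """Minimal fuzzy c-means with deterministic quantile initialization."""
    # Deterministic init: centroids at evenly spaced per-dimension quantiles
    V = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)
    for _ in range(n_iter):
        # Distances of every sample to every centroid, shape (N, c)
        dist = np.linalg.norm(X[:, None, :] - V[None, :, :], axis=2) + 1e-12
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        U = 1.0 / np.sum((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        # Centroid update: weighted mean with weights u^m
        W = U.T ** m
        V = (W @ X) / np.sum(W, axis=1, keepdims=True)
    return U, V

# Synthetic epicentres (lon, lat) around three made-up cluster centres
rng = np.random.default_rng(1)
centers = np.array([[-77.0, 1.5], [-76.5, 3.0], [-75.8, 4.5]])
X = np.vstack([ctr + 0.1 * rng.standard_normal((200, 2)) for ctr in centers])

U, V = fuzzy_c_means(X, c=3)
labels = np.argmax(U, axis=1)
print("cluster sizes:", np.bincount(labels))
```

Because the initialization is a deterministic function of the data, repeated runs on the same catalogue return the same partition, which is the stability property the abstract emphasizes.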

  1. Numerical modelling for quantitative environmental risk assessment for the disposal of drill cuttings and mud

    NASA Astrophysics Data System (ADS)

    Wahab, Mohd Amirul Faiz Abdul; Shaufi Sokiman, Mohamad; Parsberg Jakobsen, Kim

    2017-10-01

    To investigate the fate of drilling waste and its impact on the surrounding environment, numerical models were generated using environmental modeling software, MIKE by DHI. These numerical models were used to study the transport of suspended drill waste plumes in the water column and their deposition on the seabed in the South China Sea (SCS). A randomly selected disposal site with a model area of 50 km × 25 km was chosen near the Madalene Shoal in the SCS, and the ambient currents as well as other meteorological conditions were simulated in detail at the proposed location. This paper focuses on a sensitivity study of how different drill waste particle characteristics affect the marine receiving environment. The drilling scenarios were obtained and adapted from an oil producer well offshore Sabah (Case 1) and from an actual exploration drilling case at the Pumbaa location (PL 469) in the Norwegian Sea (Case 2). The two cases were compared to study the effect of different drilling particle characteristics and their behavior in the marine receiving environment after discharge. Using the Hydrodynamic and Sediment Transport models simulated in MIKE by DHI, the variation of currents and the behavior of the drill waste particles can be analyzed and evaluated in terms of multiple zones of impact.

  2. Characterization of Flame Cut Heavy Steel: Modeling of Temperature History and Residual Stress Formation

    NASA Astrophysics Data System (ADS)

    Jokiaho, T.; Laitinen, A.; Santa-aho, S.; Isakov, M.; Peura, P.; Saarinen, T.; Lehtovaara, A.; Vippola, M.

    2017-12-01

    Heavy steel plates are used in demanding applications that require both high strength and hardness. An important step in the production of such components is cutting the plates with a cost-effective thermal cutting method such as flame cutting. Flame cutting is performed with a controlled flame and oxygen jet, which burns the steel and forms a cutting edge. However, the thermal cutting of heavy steel plates causes several problems. A heat-affected zone (HAZ) is generated at the cut edge due to the steep temperature gradient. Consequently, volume changes, hardness variations, and microstructural changes occur in the HAZ. In addition, residual stresses are formed at the cut edge during the process. In the worst case, unsuitable flame cutting practices generate cracks at the cut edge. The flame cutting of thick steel plate was modeled using the commercial finite element software ABAQUS. The results of modeling were verified by X-ray diffraction-based residual stress measurements and microstructural analysis. The model provides several outcomes, such as obtaining more information related to the formation of residual stresses and the temperature history during the flame cutting process. In addition, an extensive series of flame cut samples was designed with the assistance of the model.

  3. A bridge role metric model for nodes in software networks.

    PubMed

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the Bre metric results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices.
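
The power-law relationship between a node metric and node degree can be checked with an ordinary least-squares fit in log-log space. The data below are synthetic (the actual Bre values come from the software network analyzed in the paper); if B(k) = a * k**b, then log B is linear in log k and the exponent b is the fitted slope:

```python
import numpy as np

# Synthetic node metric following a noisy power law of degree
rng = np.random.default_rng(7)
k = np.arange(1, 201)                  # node degrees 1..200
b_true, a_true = -1.3, 5.0             # assumed exponent and prefactor
B = a_true * k**b_true * np.exp(0.05 * rng.standard_normal(k.size))

# Linear fit in log-log space recovers the exponent and prefactor
slope, intercept = np.polyfit(np.log(k), np.log(B), 1)
print(f"fitted exponent b ~ {slope:.2f}, prefactor a ~ {np.exp(intercept):.2f}")
```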

  4. A Bridge Role Metric Model for Nodes in Software Networks

    PubMed Central

    Li, Bo; Feng, Yanli; Ge, Shiyu; Li, Dashe

    2014-01-01

    A bridge role metric model is put forward in this paper. Compared with previous metric models, our treatment of a large-scale object-oriented software system as a complex network is inherently more realistic. To acquire nodes and links in an undirected network, a new model is presented that captures the crucial connectivity of a module or hub, rather than only centrality as in previous metric models. Two previous metric models are described for comparison. In addition, the relationship between the metric results and node degrees is well fitted by a power law. The model represents many realistic characteristics of actual software structures, and a hydropower simulation system is taken as an example. This paper contributes to an accurate understanding of the module design of software systems and is expected to be beneficial to software engineering practices. PMID:25364938

  5. Development of an Environment for Software Reliability Model Selection

    DTIC Science & Technology

    1992-09-01

    now is directed to other related problems such as tools for model selection, multiversion programming, and software fault tolerance modeling... multiversion programming... Hardware can be repaired by spare modules, which is not the case for software... Preventive maintenance is very important

  6. GeoTess: A generalized Earth model software utility

    DOE PAGES

    Ballard, Sanford; Hipp, James; Kraus, Brian; ...

    2016-03-23

    GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. Here, the software is available in Java and C++, with a C interface to the C++ library.

  7. Mental Models of Software Forecasting

    NASA Technical Reports Server (NTRS)

    Hihn, J.; Griesel, A.; Bruno, K.; Fouser, T.; Tausworthe, R.

    1993-01-01

    The majority of software engineers resist using the currently available cost models. One problem is that the mathematical and statistical models currently available do not correspond to the mental models of software engineers. An earlier JPL-funded study (Hihn and Habib-agahi, 1991) found that software engineers prefer to use analogical or analogy-like techniques to derive size and cost estimates, whereas current CERs hide any analogy in the regression equations. In addition, the currently available models depend on information that is not available during early planning, when the most important forecasts must be made.
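
The analogy-based estimation that engineers are said to prefer can be sketched as a nearest-neighbour lookup over historical projects, in contrast to a regression CER. The project features and effort values below are invented for illustration:

```python
import numpy as np

# Hypothetical historical projects: (size in KSLOC, team size, effort in person-months)
history = np.array([
    [10.0, 3, 24.0],
    [25.0, 5, 70.0],
    [40.0, 8, 130.0],
    [12.0, 4, 30.0],
    [60.0, 10, 210.0],
])

def analogy_estimate(features, k=2):
    """Estimate effort as the mean of the k most similar past projects."""
    X, y = history[:, :2], history[:, 2]
    # Normalize features so size and team count weigh comparably
    mu, sd = X.mean(axis=0), X.std(axis=0)
    dist = np.linalg.norm((X - mu) / sd - (features - mu) / sd, axis=1)
    nearest = np.argsort(dist)[:k]
    return float(y[nearest].mean())

# A new project of 30 KSLOC with a team of 6
print(f"estimated effort: {analogy_estimate(np.array([30.0, 6])):.1f} person-months")
```

The estimate is transparent: the two analogues it averaged can be shown to the engineer, which is exactly what a regression equation hides.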

  8. Model Driven Engineering

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.

  9. Deformation and stress change associated with plate interaction at subduction zones: a kinematic modelling

    NASA Astrophysics Data System (ADS)

    Zhao, Shaorong; Takemoto, Shuzo

    2000-08-01

    The interseismic deformation associated with plate coupling at a subduction zone is commonly simulated by the steady-slip model in which a reverse dip-slip is imposed on the down-dip extension of the locked plate interface, or by the backslip model in which a normal slip is imposed on the locked plate interface. It is found that these two models, although totally different in principle, produce similar patterns for the vertical deformation at a subduction zone. This suggests that it is almost impossible to distinguish between these two models by analysing only the interseismic vertical deformation observed at a subduction zone. The steady-slip model cannot correctly predict the horizontal deformation associated with plate coupling at a subduction zone, a fact that is proved by both the numerical modelling in this study and the GPS (Global Positioning System) observations near the Nankai trough, southwest Japan. It is therefore inadequate to simulate the effect of the plate coupling at a subduction zone by the steady-slip model. It is also revealed that the unphysical assumption inherent in the backslip model of imposing a normal slip on the locked plate interface makes it impossible to predict correctly the horizontal motion of the subducted plate and the stress change within the overthrust zone associated with the plate coupling during interseismic stages. If the analysis made in this work is proved to be correct, some of the previous studies on interpreting the interseismic deformation observed at several subduction zones based on these two models might need substantial revision. On the basis of the investigations on plate interaction at subduction zones made using the finite element method and the kinematic/mechanical conditions of the plate coupling implied by the present plate tectonics, a synthesized model is proposed to simulate the kinematic effect of the plate interaction during interseismic stages. 
A numerical analysis shows that the proposed model, designed to simulate the motion of a subducted slab, can correctly produce the deformation and the main pattern of stress concentration associated with plate coupling at a subduction zone. The validity of the synthesized model is examined and partially verified by analysing the horizontal deformation observed by GPS near the Nankai trough, southwest Japan.

  10. Using the Vertical Component of the Surface Velocity Field to Map the Locked Zone at Cascadia Subduction Zone

    NASA Astrophysics Data System (ADS)

    Moulas, E.; Brandon, M. T.; Podladchikov, Y.; Bennett, R. A.

    2014-12-01

    At present, our understanding of the locked zone at Cascadia subduction zone is based on thermal modeling and elastic modeling of horizontal GPS velocities. The thermal model by Hyndman and Wang (1995) provided a first-order assessment of where the subduction thrust might be cold enough for stick-slip behavior. The alternative approach by McCaffrey et al. (2007) is to use a Green's function that relates horizontal surface velocities, as recorded by GPS, to interseismic elastic deformation. The thermal modeling approach is limited by a lack of information about the amount of frictional heating occurring on the thrust (Molnar and England, 1990). The GPS approach is limited in that the horizontal velocity component is fairly insensitive to the structure of the locked zone. The vertical velocity component is much more useful for this purpose. We are fortunate in that vertical velocities can now be measured by GPS to a precision of about 0.2 mm/a. The dislocation model predicts that vertical velocities should range up to about 20 percent of the subduction velocity, which means maximum values of ~7 mm/a. The locked zone is generally entirely offshore at Cascadia, except for the Olympic Peninsula region, where the underlying Juan De Fuca plate has an anomalously low dip. Previous thermal and GPS modeling, as well as tide gauge data and episodic tremors indicate the locked zone there extends about 50 to 75 km onland. This situation provides an opportunity to directly study the locked zone. With that objective in mind, we have constructed a full 3D geodynamic model of the Cascadia subduction zone. At present, the model provides a full representation of the interseismic elastic deformation due to variations of slip on the subduction thrust. The model has been benchmarked against the Savage (2D) and Okada (3D) analytical solutions. 
This model has an important advantage over traditional dislocation modeling in that we include temperature-sensitive viscosity for the upper and lower plates, and also use realistic constitutive models to represent the locked zone. Another important advantage is that the 3D model provides a full representation of the interseismic deformation, which is important for interpreting GPS data.

  11. Comparisons between thermodynamic and one-dimensional combustion models of spark-ignition engines

    NASA Technical Reports Server (NTRS)

    Ramos, J. I.

    1986-01-01

    Results from a one-dimensional combustion model employing a constant eddy diffusivity and a one-step chemical reaction are compared with those of one-zone and two-zone thermodynamic models to study the flame propagation in a spark-ignition engine. One-dimensional model predictions are found to be very sensitive to the eddy diffusivity and reaction rate data. The average mixing temperature found using the one-zone thermodynamic model is higher than those of the two-zone and one-dimensional models during the compression stroke, and that of the one-dimensional model is higher than those predicted by both thermodynamic models during the expansion stroke. The one-dimensional model is shown to predict an accelerating flame even when the front approaches the cold cylinder wall.

  12. Modeling Latent Growth Curves With Incomplete Data Using Different Types of Structural Equation Modeling and Multilevel Software

    ERIC Educational Resources Information Center

    Ferrer, Emilio; Hamagami, Fumiaki; McArdle, John J.

    2004-01-01

    This article offers different examples of how to fit latent growth curve (LGC) models to longitudinal data using a variety of different software programs (i.e., LISREL, Mx, Mplus, AMOS, SAS). The article shows how the same model can be fitted using both structural equation modeling and multilevel software, with nearly identical results, even in…
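
The latent growth curve idea can be sketched without any SEM or multilevel package via a two-stage approximation: fit each subject's intercept and slope by least squares, then summarize their distribution. The means of the per-subject coefficients approximate the growth-factor means that LISREL, Mx, Mplus, AMOS, or SAS would estimate in one step; the simulated data and parameters below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 200, 5
t = np.arange(T, dtype=float)            # measurement occasions 0..4

# Simulate linear growth: y_it = (b0 + u0_i) + (b1 + u1_i) * t + noise
b0, b1 = 10.0, 1.5                       # population intercept and slope (assumed)
u0 = 1.0 * rng.standard_normal(n)        # individual intercept deviations
u1 = 0.3 * rng.standard_normal(n)        # individual slope deviations
Y = (b0 + u0)[:, None] + (b1 + u1)[:, None] * t + 0.5 * rng.standard_normal((n, T))

# Stage 1: per-subject OLS against the growth design [1, t]
Xd = np.column_stack([np.ones(T), t])
coef = np.linalg.lstsq(Xd, Y.T, rcond=None)[0].T   # (n, 2): intercept, slope per subject

# Stage 2: growth-factor means and variances
print(f"mean intercept ~ {coef[:, 0].mean():.2f}, mean slope ~ {coef[:, 1].mean():.2f}")
print(f"intercept var ~ {coef[:, 0].var():.2f}, slope var ~ {coef[:, 1].var():.2f}")
```

With complete balanced data this two-stage route tracks the one-step estimates closely; the SEM and multilevel machinery earns its keep with the incomplete data the article discusses.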

  13. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    Capability Maturity Model Integration℠ (CMMI®) [Davis 2009]... Team Software Process (TSP) and Capability Maturity Model Integration are service... STP Software Test Plan; TEP Test and Evaluation Plan; TSP Team Software Process; V&V verification and validation. CMU/SEI-2012-TN-016... Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions. Tim Morrow (Software Engineering Institute), Robert Seacord (Software

  14. Illuminating Northern California’s Active Faults

    USGS Publications Warehouse

    Prentice, Carol S.; Crosby, Christopher J.; Whitehill, Caroline S.; Arrowsmith, J. Ramon; Furlong, Kevin P.; Philips, David A.

    2009-01-01

    Newly acquired light detection and ranging (lidar) topographic data provide a powerful community resource for the study of landforms associated with the plate boundary faults of northern California (Figure 1). In the spring of 2007, GeoEarthScope, a component of the EarthScope Facility construction project funded by the U.S. National Science Foundation, acquired approximately 2000 square kilometers of airborne lidar topographic data along major active fault zones of northern California. These data are now freely available in point cloud (x, y, z coordinate data for every laser return), digital elevation model (DEM), and KMZ (zipped Keyhole Markup Language, for use in Google Earth™ and other similar software) formats through the GEON OpenTopography Portal (http://www.OpenTopography.org/data). Importantly, vegetation can be digitally removed from lidar data, producing high-resolution images (0.5- or 1.0-meter DEMs) of the ground surface beneath forested regions that reveal landforms typically obscured by vegetation canopy (Figure 2).

  15. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule due to the relatively immovable launch dates of the spacecraft such systems will control. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. The predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) that explain atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. These CDs will thus assist in planning for the degree to which a project's software quality activities are likely to deviate from the norm for JPL ground systems, based on past experience across the lab.

  16. Co Modeling and Co Synthesis of Safety Critical Multi threaded Embedded Software for Multi Core Embedded Platforms

    DTIC Science & Technology

    2017-03-20

    computation, Prime Implicates, Boolean Abstraction, real-time embedded software, software synthesis, correct-by-construction software design, model... types for time-dependent data-flow networks". J.-P. Talpin, P. Jouvelot, S. Shukla. ACM-IEEE Conference on Methods and Models for System Design...

  17. Adaptive Long-Term Monitoring at Environmental Restoration Sites (ER-0629)

    DTIC Science & Technology

    2009-05-01

    Figures: Figure 2-1 General Flowchart of Software Application; Figure 2-2 Overview of the Genetic Algorithm Approach; Figure 2-3 Example of a... and Model Builder) are highlighted on Figure 2-1, which is a general flowchart illustrating the application of the software. The software is applied... monitoring event (e.g., contaminant mass based on interpolation); that modeling is provided by Model Builder.

  18. Analysis of the frequency and severity of rear-end crashes in work zones.

    PubMed

    Qi, Yi; Srinivasan, Raghavan; Teng, Hualiang; Baker, Robert

    2013-01-01

    The objective of this study was to identify the factors that influence the frequency and severity of rear-end crashes in work zones, because rear-end crashes represent a significant proportion of crashes that occur in work zones. Truncated count data models were developed to identify factors influencing the frequency of rear-end crashes in work zones, and ordered probit models were developed to evaluate factors influencing the severity of rear-end crashes in work zones. Most of the variables identified in this study for these two models were significant at the 95 percent confidence level. Model statistics indicate that the two developed models are appropriate compared to alternative models. Major findings related to the frequency of rear-end crashes include the following: (1) work zones for capacity and pavement improvements have the highest frequency compared to other types of work zones; (2) work zones controlled by flaggers are associated with more rear-end crashes than those controlled by arrow boards; and (3) work zones with alternating one-way traffic tended to have more rear-end crashes than those with lane shifts. Major findings related to the severity of rear-end crashes include the following: (1) rear-end crashes associated with alcohol, night, pedestrians, and roadway defects are more severe, while those associated with careless backing, stalled vehicles, slippery roadways, and misunderstood flagging signals are less severe; (2) truck involvement and a large number of vehicles in a crash are both associated with increased severity; and (3) rear-end crashes that happened in work zones for bridge, capacity, and pavement improvements are likely to be more severe than others.
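
Why a truncated count model is needed can be illustrated with a zero-truncated Poisson: a work zone enters the crash database only if at least one crash occurred there, so counts start at 1 and the naive sample mean overstates the crash rate. The counts below are simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(5)
lam_true = 1.8                          # assumed true crash rate per work zone
y = rng.poisson(lam_true, size=5000)
y = y[y > 0]                            # a zone is observed only if it had >= 1 crash

ybar = float(y.mean())

# Zero-truncated Poisson MLE: solve  ybar = lam / (1 - exp(-lam))  for lam.
# The left side is increasing in lam, so simple bisection suffices.
lo, hi = 1e-6, 20.0
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if mid / (1.0 - np.exp(-mid)) < ybar:
        lo = mid
    else:
        hi = mid
lam_hat = 0.5 * (lo + hi)
print(f"naive mean {ybar:.3f} vs truncation-corrected rate {lam_hat:.3f}")
```

The corrected estimate recovers the underlying rate, whereas the naive mean is biased upward by the missing zero-crash zones; the study's models extend this logic with covariates.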

  19. Prostate specific antigen and acinar density: a new dimension, the "Prostatocrit".

    PubMed

    Robinson, Simon; Laniado, Marc; Montgomery, Bruce

    2017-01-01

    Prostate-specific antigen densities have had limited success in diagnosing prostate cancer. We emphasise the importance of the peripheral zone considered together with its cellular constituents, the "prostatocrit". Using zonal volumes and asymmetry of glandular acini, we generate a peripheral zone acinar volume and density. With the ratio to the whole gland, we can better predict high grade and all grade cancer. We can model the gland into its acinar and stromal elements. This new "prostatocrit" model could offer more accurate nomograms for biopsy. 674 patients underwent TRUS and biopsy. Whole gland and zonal volumes were recorded. We compared ratio and acinar volumes when added to a "clinic" model using traditional PSA density. Univariate logistic regression was used to find significant predictors of all grade and high grade cancer. Backwards multiple logistic regression was used to generate ROC curves comparing the new model to conventional density and PSA alone. Prediction of all grades of prostate cancer: univariate analysis revealed four significant "prostatocrit" parameters: log peripheral zone acinar density; peripheral zone acinar volume/whole gland acinar volume; peripheral zone acinar density/whole gland volume; and peripheral zone acinar density. Acinar model (AUC 0.774) versus clinic model (AUC 0.745), P=0.0105. Prediction of high grade prostate cancer: peripheral zone acinar density (the "prostatocrit") was the only significant density predictor. Acinar model (AUC 0.811) versus clinic model (AUC 0.769), P=0.0005. There is renewed use for ratio and "prostatocrit" density of the peripheral zone in predicting cancer, outperforming all traditional density measurements. Copyright © by the International Brazilian Journal of Urology.

  20. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and the mental models were found to cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is strongly suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or making an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  1. Aquifer Characterization and Groundwater Potential Evaluation in Sedimentary Rock Formation

    NASA Astrophysics Data System (ADS)

    Ashraf, M. A. M.; Yusoh, R.; Sazalil, M. A.; Abidin, M. H. Z.

    2018-04-01

    This study was conducted to characterize the aquifer and evaluate the groundwater potential in a sedimentary rock formation. Electrical resistivity and drilling methods were used to develop a subsurface soil profile for determining suitable locations for tube well construction. The electrical resistivity method was used to infer the subsurface soil layers using three types of arrays, namely, the pole-dipole, Wenner, and Schlumberger arrays. The surveys were conducted using the ABEM Terrameter LS System, and the results were analyzed using the 2D resistivity inversion program RES2DINV. The survey alignments were performed with maximum electrode spreads of 400 and 800 m by employing two different resistivity survey lines at the targeted zone. The images were presented in the form of 2D resistivity profiles to provide a clear view of the distribution of interbedded sandstone, siltstone, and shale as well as the potential groundwater zones. The potential groundwater zones identified from the resistivity results were confirmed using pumping, step drawdown, and recovery tests. The combination of the three arrays and the correlation between the well logs and pumping tests proved reliable and successful in identifying favorable zones for obtaining groundwater in the study area.
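
    The pumping tests mentioned above are typically interpreted with standard drawdown analyses. As one common illustration (not necessarily the authors' exact procedure), the Cooper-Jacob straight-line method estimates transmissivity from the pumping rate and the drawdown change per log cycle of time; the numbers below are hypothetical:

```python
import math

def transmissivity_cooper_jacob(q, delta_s):
    """Cooper-Jacob straight-line estimate of transmissivity.

    q       : pumping rate (m^3/day)
    delta_s : drawdown increase per log10 cycle of time (m)
    Returns T in m^2/day.
    """
    return 2.3 * q / (4.0 * math.pi * delta_s)

# Hypothetical test: 500 m^3/day with 0.8 m of drawdown per log cycle
t_est = transmissivity_cooper_jacob(500.0, 0.8)  # about 114 m^2/day
```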

  2. A comparative approach to computer aided design model of a dog femur.

    PubMed

    Turamanlar, O; Verim, O; Karabulut, A

    2016-01-01

    Computer-assisted technologies offer new opportunities in medical imaging and rapid prototyping in biomechanical engineering. Three-dimensional (3D) modelling of soft tissues and bones is becoming more important. The accuracy of the analysis in modelling processes depends on the outlines of the tissues derived from medical images. The aim of this study is to evaluate the accuracy of 3D models of a dog femur derived from computed tomography data by using the point cloud method and the boundary line method in several modelling software packages. Solidworks, Rapidform and 3DSMax software were used to create 3D models, and the outcomes were evaluated statistically. The most accurate 3D prototype of the dog femur was created with the stereolithography method using a rapid prototyping device. Furthermore, the linearity of model volumes was compared across the software packages and the constructed prototypes. The differences between the software-derived and physical models reflect the sensitivity of the software and the devices used.

  3. Integrated Functional and Executional Modelling of Software Using Web-Based Databases

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Marietta, Roberta

    1998-01-01

    NASA's software subsystems undergo extensive modification and updates over their operational lifetimes. It is imperative that modified software satisfy safety goals. This report discusses the difficulties encountered in doing so and presents a solution based on integrated modelling of software, the use of automatic information extraction tools, web technology, and databases.

  4. Experimental Evaluation of a Serious Game for Teaching Software Process Modeling

    ERIC Educational Resources Information Center

    Chaves, Rafael Oliveira; von Wangenheim, Christiane Gresse; Furtado, Julio Cezar Costa; Oliveira, Sandro Ronaldo Bezerra; Santos, Alex; Favero, Eloi Luiz

    2015-01-01

    Software process modeling (SPM) is an important area of software engineering because it provides a basis for managing, automating, and supporting software process improvement (SPI). Teaching SPM is a challenging task, mainly because it lays great emphasis on theory and offers few practical exercises. Furthermore, as yet few teaching approaches…

  5. Estimating Software-Development Costs With Greater Accuracy

    NASA Technical Reports Server (NTRS)

    Baker, Dan; Hihn, Jairus; Lum, Karen

    2008-01-01

    COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
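
    The details of the '2cee' methodology are not given in the abstract; for orientation, the COCOMO family that COCOMOST's name alludes to estimates effort as a power law of code size. A minimal sketch using the published basic-COCOMO organic-mode constants (an illustration only, not COCOMOST's actual calibrated model):

```python
def cocomo_effort(kloc, a=2.4, b=1.05):
    """Basic COCOMO effort estimate in person-months.

    kloc : estimated size in thousands of lines of code
    a, b : published organic-mode constants (illustrative defaults)
    """
    return a * kloc ** b

# Hypothetical project of 10 KLOC
effort = cocomo_effort(10.0)  # roughly 27 person-months
```

    Tools such as COCOMOST tune constants like a and b against historical project data, which is exactly where the data-mining search over model parameters comes in.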

  6. SWIFT MODELLER: a Java based GUI for molecular modeling.

    PubMed

    Mathur, Abhinav; Shankaracharya; Vidyarthi, Ambarish S

    2011-10-01

    MODELLER is command-line software that requires tedious formatting of inputs and the writing of Python scripts, which most people are not comfortable with. The visualization of output also becomes cumbersome due to verbose files. This makes the whole software protocol complex and requires extensive study of the MODELLER manuals and tutorials. Here we describe SWIFT MODELLER, a GUI that automates the formatting, scripting and data extraction processes and presents them in an interactive way, making MODELLER much easier to use than before. The screens in SWIFT MODELLER are designed with homology modeling in mind, and their flow depicts its steps. It eliminates the formatting of inputs, the scripting process and the analysis of verbose output files through automation, making pasting of the target sequence the only prerequisite. Jmol (a 3D structure visualization tool) has been integrated into the GUI, which opens and displays the Protein Data Bank files created by MODELLER. All files required and created by the software are saved in a folder named after the work instance's date and time of execution. SWIFT MODELLER lowers the skill level required by automating many of the steps in the original software protocol, thus saving an enormous amount of time per instance and making MODELLER very easy to work with.

  7. Gravity modelling of the Hellenic subduction zone — a regional study

    NASA Astrophysics Data System (ADS)

    Casten, U.; Snopek, K.

    2006-05-01

    The Hellenic subduction zone is clearly expressed in the arc-shaped distribution of earthquake epicenters and gravity anomalies, which connect the Peloponnesos with Crete and Anatolia. In this region, oceanic crust of the African plate collides northward with continental crust of the Aegean microplate, which itself is pushed apart to the south-west by the Anatolian plate and, at the same time, is characterised by crustal extension. The result is an overall collision rate of up to 4 cm/year and a retreating subduction process. Recent passive and active seismic studies on and around Crete gave first structural results, though not consistent in all details, useful for supporting gravity modelling. This was undertaken with the aim of presenting the first 3D density structure of the entire subduction zone. Gravity interpretation was based on a Bouguer map, newly compiled using data from land, marine and satellite sources. The anomalies range from +170 mGal (Cretan Sea) to −10 mGal (Mediterranean Ridge). 3D gravity modelling was done using the modelling software IGMAS. The computed Bouguer map fits the low-frequency part of the observed one, which is controlled by variations in Moho depth (less than 20 km below the Cretan Sea and extending to 30 km below Crete) and the extremely thick sedimentary cover (partly up to 18 km) of the Mediterranean Ridge. The southernmost edge of the Eurasian plate, with its more triangular-shaped backstop area, was traced south of Crete. Only 50 to 100 km further to the south, the edge of the African continent was traced as well. In between these boundaries lies African oceanic crust, which has a clear arc-shaped detachment line situated at the Eurasian continental edge. The subduction arc is open towards the north; its slab separates hotter mantle material (lower density) below the updoming Moho of the Cretan Sea from colder material (higher density) in the south. Subjacent to the upper continental crust of Crete is a thickened layer of lower crust, followed by the subducted oceanic crust with some mantle material as an intermediate layer. The depth of the oceanic Moho below Crete is 50 km. The presence and structure of subducted or underplated sediments remains uncertain.

  8. Investigating Uranium Mobility Using Stable Isotope Partitioning of 238U/235U and a Reactive Transport Model

    NASA Astrophysics Data System (ADS)

    Bizjack, M.; Johnson, T. M.; Druhan, J. L.; Shiel, A. E.

    2015-12-01

    We report a numerical reactive transport model which explicitly incorporates the effectively stable isotopes of uranium (U) and the factors that influence their partitioning in bioactive systems. The model reproduces trends observed in U isotope ratios and concentration measurements from a field experiment, thereby improving interpretations of U isotope ratios as a tracer for U reactive transport. A major factor contributing to U storage and transport is its redox state, which is commonly influenced by the availability of organic carbon to support metal-reducing microbial communities. Both laboratory and field experiments have demonstrated that biogenic reduction of U(VI) fractionates the stable isotope ratio 238U/235U, producing an isotopically heavy solid U(IV) product. It has also been shown that other common reactive transport processes involving U do not fractionate isotopes to a consistently measurable level, which suggests the capacity to quantify the extent of bioreduction occurring in U-bearing groundwater using 238U/235U ratios. A recent study of a U bioremediation experiment at the Rifle IFRC site (Colorado, USA) applied Rayleigh distillation models to quantify U stable isotope fractionation observed during acetate amendment. These simplified models fit the observations only by invoking a "memory effect," that is, a constant source of low-concentration, unfractionated U(VI). In order to more accurately interpret the measured U isotope ratios, we present a multi-component reactive transport model using the CrunchTope software. This approach is capable of quantifying the cycling and partitioning of individual U isotopes through a realistic network of transport and reaction pathways including reduction, oxidation, and microbial growth. The model incorporates physical heterogeneity of the aquifer sediments through zones of decreased permeability, allowing it to reproduce the observed bromide tracer, major ion chemistry, U concentrations, and U isotope ratios. These results suggest that the rate-limited transport properties of U in the Rifle aquifer are governed by the presence of low-permeability regions in the modeling domain and that these zones are responsible for the "memory" effect suggested in previous U isotope studies at this site.
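
    The Rayleigh distillation description referenced above relates the isotope ratio of the remaining dissolved U(VI) pool to the fraction of U remaining in solution, in closed form. A minimal sketch of that model (the fractionation factor below is a hypothetical placeholder, not the study's fitted value):

```python
def rayleigh_delta(delta0, f, eps_permil):
    """delta-238U (permil) of the remaining dissolved U(VI) pool under
    Rayleigh distillation, after a fraction f of the pool remains.

    delta0     : initial delta value of the pool (permil)
    f          : fraction remaining in solution, 0 < f <= 1
    eps_permil : fractionation 1000*(alpha - 1); positive when the
                 reduced solid product is isotopically heavy
    """
    alpha = 1.0 + eps_permil / 1000.0
    return (delta0 + 1000.0) * f ** (alpha - 1.0) - 1000.0

# Hypothetical fractionation of 0.85 permil: as bioreduction removes U,
# the residual dissolved pool becomes isotopically lighter
d_half = rayleigh_delta(0.0, 0.5, 0.85)
```

    The reactive transport approach in the abstract generalizes this closed form: each isotope is carried separately through the reaction network, so departures from a single Rayleigh curve (the "memory" effect) emerge from the low-permeability zones rather than from an ad hoc source term.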

  9. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. 
The design of SuperMUSE, a 125-GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described, along with the conceptual layout of an accompanying Java-based parallelization software toolset. Preliminary work is also reported for a scenario involving benzene disposal that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.

  10. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  11. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  12. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  13. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    PubMed

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study proposes a method, developed from our prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system included a basic molecular modeling environment, collision detection in the molecular models, and physical simulation of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.

  14. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  15. A General Water Resources Regulation Software System in China

    NASA Astrophysics Data System (ADS)

    LEI, X.

    2017-12-01

    To avoid the iterative development of core modules in normal and emergency water resources regulation and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common requirements for water resources regulation and emergency management. It provides a customizable, extensible software framework suitable for secondary development across the three-level platform "MWR-Basin-Province". Meanwhile, this general software system can realize business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve national water resources regulation decision-making. The general software system comprises four main modules: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete model base and model computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, together with a model management system to calibrate and configure model parameters; and 4) a database satisfying the business functions and functional requirements of the general software, providing technical support for building basin or regional water resources regulation models.

  16. The discounting model selector: Statistical software for delay discounting applications.

    PubMed

    Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A

    2017-05-01

    Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods on user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. The software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-source software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
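
    For the simplest candidate model, the ED50 mentioned above has a closed form: under Mazur's hyperbolic discounting, subjective value halves at a delay of 1/k. A minimal sketch (illustrative only; the software itself selects among several candidate discounting models before computing ED50):

```python
def hyperbolic_value(delay, k):
    """Mazur's hyperbolic discounting: subjective value of a unit reward."""
    return 1.0 / (1.0 + k * delay)

def ed50_hyperbolic(k):
    """Effective delay 50: the hyperbolic model reaches value 0.5 at 1/k."""
    return 1.0 / k

# Hypothetical discount rate of k = 0.1 per day gives an ED50 of 10 days
ed50 = ed50_hyperbolic(0.1)
```

    Expressing every candidate model's fit as an ED50 is what lets the selected models be compared against traditional discounting indices on a common scale.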

  17. Computational fluid dynamics-habitat suitability index (CFD-HSI) modelling as an exploratory tool for assessing passability of riverine migratory challenge zones for fish

    USGS Publications Warehouse

    Haro, Alexander J.; Chelminski, Michael; Dudley, Robert W.

    2015-01-01

    We developed two-dimensional computational fluid hydraulics-habitat suitability index (CFD-HSI) models to identify and qualitatively assess potential zones of shallow water depth and high water velocity that may present passage challenges for five major anadromous fish species in a 2.63-km reach of the main stem Penobscot River, Maine, as a result of a dam removal downstream of the reach. Suitability parameters were based on distribution of fish lengths and body depths and transformed to cruising, maximum sustained and sprint swimming speeds. Zones of potential depth and velocity challenges were calculated based on the hydraulic models; ability of fish to pass a challenge zone was based on the percent of river channel that the contiguous zone spanned and its maximum along-current length. Three river flows (low: 99.1 m³/s; normal: 344.9 m³/s; and high: 792.9 m³/s) were modelled to simulate existing hydraulic conditions and hydraulic conditions simulating removal of a dam at the downstream boundary of the reach. Potential depth challenge zones were nonexistent for all low-flow simulations of existing conditions for deeper-bodied fishes. Increasing flows for existing conditions and removal of the dam under all flow conditions increased the number and size of potential velocity challenge zones, with the effects of zones being more pronounced for smaller species. The two-dimensional CFD-HSI model has utility in demonstrating gross effects of flow and hydraulic alteration, but may not be as precise a predictive tool as a three-dimensional model. Passability of the potential challenge zones cannot be precisely quantified for two-dimensional or three-dimensional models due to untested assumptions and incomplete data on fish swimming performance and behaviours.
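
    The core of the zone identification step can be pictured as thresholding the modelled depth and velocity fields against species-specific limits. A minimal sketch with hypothetical thresholds and a toy 2x2 grid (the actual models operate on CFD meshes and then measure contiguous zone extent):

```python
def challenge_zone_mask(depth, velocity, min_depth, max_speed):
    """Flag grid cells that are too shallow or too fast for a given fish.

    depth, velocity : 2-D lists of water depth (m) and speed (m/s)
    min_depth       : minimum passable depth for the species (m)
    max_speed       : maximum sustainable swimming speed (m/s)
    """
    return [[(d < min_depth) or (v > max_speed)
             for d, v in zip(d_row, v_row)]
            for d_row, v_row in zip(depth, velocity)]

# Hypothetical 2x2 reach: one shallow cell and one fast cell get flagged
mask = challenge_zone_mask(depth=[[0.2, 1.5], [2.0, 0.9]],
                           velocity=[[0.5, 2.5], [1.0, 0.8]],
                           min_depth=0.3, max_speed=2.0)
```

    Passability then depends not on individual flagged cells but on whether the contiguous flagged region spans the channel and how long it runs along the current.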

  18. Investigating Some Technical Issues on Cohesive Zone Modeling of Fracture

    NASA Technical Reports Server (NTRS)

    Wang, John T.

    2011-01-01

    This study investigates some technical issues related to the use of cohesive zone models (CZMs) in modeling fracture processes. These issues include: why cohesive laws of different shapes can produce similar fracture predictions; under what conditions CZM predictions have a high degree of agreement with linear elastic fracture mechanics (LEFM) analysis results; when the shape of cohesive laws becomes important in the fracture predictions; and why the opening profile along the cohesive zone length needs to be accurately predicted. Two cohesive models were used in this study to address these technical issues. They are the linear softening cohesive model and the Dugdale perfectly plastic cohesive model. Each cohesive model constitutes five cohesive laws of different maximum tractions. All cohesive laws have the same cohesive work rate (CWR) which is defined by the area under the traction-separation curve. The effects of the maximum traction on the cohesive zone length and the critical remote applied stress are investigated for both models. For a CZM to predict a fracture load similar to that obtained by an LEFM analysis, the cohesive zone length needs to be much smaller than the crack length, which reflects the small scale yielding condition requirement for LEFM analysis to be valid. For large-scale cohesive zone cases, the predicted critical remote applied stresses depend on the shape of cohesive models used and can significantly deviate from LEFM results. Furthermore, this study also reveals the importance of accurately predicting the cohesive zone profile in determining the critical remote applied load.
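
    The linear softening law discussed above and its cohesive work rate (the area under the traction-separation curve) can be written down directly. A minimal sketch showing how two laws with different maximum tractions are given the same CWR, as in the study's setup (the units and numerical values are hypothetical):

```python
def linear_softening_traction(sep, sigma_max, delta_f):
    """Traction for a linear softening cohesive law: sigma_max at zero
    separation, decaying linearly to zero at the critical separation."""
    if sep >= delta_f:
        return 0.0
    return sigma_max * (1.0 - sep / delta_f)

def cohesive_work_rate(sigma_max, delta_f):
    """Area under the linear traction-separation curve."""
    return 0.5 * sigma_max * delta_f

def delta_f_for_cwr(sigma_max, cwr):
    """Critical separation making a linear law match a prescribed CWR."""
    return 2.0 * cwr / sigma_max

# Two linear laws with different maximum tractions but identical CWR
cwr = 0.2                       # hypothetical fracture energy, N/mm
d1 = delta_f_for_cwr(10.0, cwr)
d2 = delta_f_for_cwr(20.0, cwr)
```

    Holding the CWR fixed while varying the maximum traction is exactly how the study probes when the shape of the cohesive law starts to matter.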

  19. [Will the climate change affect the mortality from prostate cancer?].

    PubMed

    Santos Arrontes, Daniel; García González, Jesús Isidro; Martín Muñoz, Manuel Pablo; Castro Pita, Miguel; Mañas Pelillo, Antonio; Paniagua Andrés, Pedro

    2007-03-01

    The global heating of the atmosphere, as well as the increase of the exposition to sunlight, will be associated with a decrease of the mortality from prostate cancer, due to an increase of the plasmatic levels of vitamin D. To evaluate if climatological factors (temperature, rainfall, and number of sunlight hours per year) may influence the mortality associated with prostate cancer over a five-year period. In this ecology type study we will evaluate the trends of prostate tumors associated mortality in the period between January 1st 1998 and December 31st 2002, in the geographic area of Spain (17 Autonomic communities-CA-and 2 Autonomic cities- Ceuta and Melilla-, 43 million inhabitants). Demographic and mortality data were obtained from the National Institute of Statistics (INE) and climatological data about temperature and rainfall were obtained from the National Institute of Meteorology (INM). The provinces were classified using the climatic index of Martonne (defined as the quotient between annual rainfall and mean annual temperature plus 10). Areas with a quotient below 5 ml/m2/o C are considered extremely arid zones; between 5 and 15 ml/m2/o C are considered arid zones, between 15 and 20 ml/m2/o C semiarid zones; between 20 and 30 ml/m2/o C subhumid zones; between 30 and 60 ml/m2/o C humid zones; and over 60 ml/m2/o C superhumid zones. We compared mortality rates between different climatic areas using the Jonckheere-Terpstra test for six independent samples following the index of Martonne. All calculations were performed using the SPSS v 13.0 for Windows software. A logistic regression model was performed to identify climate factors associated with prostate cancer mortality. A likeliness of the null hypotheses inferior to 0.05 was considered significant. 
Prostate cancer mortality showed statistically significant differences, being higher in provinces with a higher de Martonne index (p < 0.001) and lower in areas with a greater number of sunlight hours per year (p = 0.041). The adjusted mortality rate was 21.51 cases/100,000 males per year in extremely arid regions, versus 35.87 cases/100,000 males per year in humid zones. Prostate cancer mortality is thus significantly higher in regions with less sunlight exposure. Climate change may alter major epidemiologic patterns and could be associated with changes in cancer mortality rates. Nevertheless, these results should be interpreted with caution and confirmed by prospective studies.
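The de Martonne classification used in the study (annual rainfall divided by mean annual temperature plus 10, bucketed into six zones) can be sketched in a few lines of Python. This is an illustrative reimplementation of the published formula and thresholds, not the study's actual SPSS workflow:

```python
def martonne_index(annual_rainfall_mm: float, mean_annual_temp_c: float) -> float:
    """de Martonne aridity index: annual rainfall / (mean annual temperature + 10)."""
    return annual_rainfall_mm / (mean_annual_temp_c + 10.0)

def classify_zone(index: float) -> str:
    """Map a de Martonne index value (mm/°C) to the climatic zones of the study."""
    if index < 5:
        return "extremely arid"
    elif index < 15:
        return "arid"
    elif index < 20:
        return "semiarid"
    elif index < 30:
        return "subhumid"
    elif index < 60:
        return "humid"
    else:
        return "superhumid"

# Hypothetical province: 450 mm annual rainfall, 14 °C mean annual temperature
i = martonne_index(450, 14)   # 450 / (14 + 10) = 18.75
print(classify_zone(i))       # semiarid (index between 15 and 20)
```

Ordering provinces by this index is what allows the Jonckheere-Terpstra test to look for a monotone trend in mortality across the six zones.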

  20. Classifying zones of suitability for manual drilling using textural and hydraulic parameters of shallow aquifers: a case study in northwestern Senegal

    NASA Astrophysics Data System (ADS)

    Fussi, F. Fabio; Fumagalli, Letizia; Fava, Francesco; Di Mauro, Biagio; Kane, Cheik Hamidou; Niang, Magatte; Wade, Souleye; Hamidou, Barry; Colombo, Roberto; Bonomi, Tullia

    2017-12-01

A method is proposed that uses analysis of borehole stratigraphic logs for the characterization of shallow aquifers and for the assessment of areas suitable for manual drilling. The model is based on available borehole-log parameters: depth to hard rock, depth to water, thickness of laterite and hydraulic transmissivity of the shallow aquifer. The model is applied to a study area in northwestern Senegal. A dataset of borehole logs has been processed using a software package (TANGAFRIC) developed during the research. After a manual procedure to assign a standard category describing the lithological characteristics, the next step is the automated extraction of different textural parameters and the estimation of hydraulic conductivity using reference values available in the literature. The hydraulic conductivity values estimated from stratigraphic data have been partially validated by comparing them with measured values from a series of pumping tests carried out in large-diameter wells. The results show that this method is able to produce a reliable interpretation of the shallow hydrogeological context using information generally available in the region. The research contributes to improving the identification of areas where conditions are suitable for manual drilling. This is achieved by applying the described method, based on a structured and semi-quantitative approach, to classify the zones of suitability for given manual drilling techniques using data available in most African countries. Ultimately, this work will support proposed international programs aimed at promoting low-cost water supply in Africa and enhancing access to safe drinking water for the population.
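The core step described above — assigning each log interval a standard lithological category, then estimating hydraulic conductivity from literature reference values and summing thickness-weighted conductivities into a transmissivity — can be sketched as follows. The category names, reference values, and function names here are illustrative assumptions, not the actual TANGAFRIC implementation or the study's calibrated table:

```python
# Placeholder reference hydraulic conductivities (m/day) per lithological
# category; real values would come from the literature table used in the study.
K_REFERENCE = {
    "clay": 1e-4,
    "silt": 1e-2,
    "fine sand": 1.0,
    "medium sand": 10.0,
    "coarse sand": 50.0,
}

def transmissivity(layers):
    """Estimate aquifer transmissivity (m^2/day) as the sum of
    thickness * hydraulic conductivity over the saturated layers
    recorded in a borehole log.

    layers: list of (lithology_category, thickness_m) tuples.
    """
    return sum(K_REFERENCE[lith] * thickness for lith, thickness in layers)

# Hypothetical borehole log: 3 m clay over 5 m fine sand over 4 m medium sand
log = [("clay", 3.0), ("fine sand", 5.0), ("medium sand", 4.0)]
print(transmissivity(log))  # 3*1e-4 + 5*1.0 + 4*10.0 = 45.0003 m^2/day
```

Comparing transmissivities derived this way against pumping-test results in large-diameter wells is what the authors describe as partial validation of the log-based estimates.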

Top