Sample records for hydrostratigraphic framework model

  1. Three-dimensional hydrogeologic framework model of the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico

    USGS Publications Warehouse

    Sweetkind, Donald S.

    2017-09-08

    As part of a U.S. Geological Survey study in cooperation with the Bureau of Reclamation, a digital three-dimensional hydrogeologic framework model was constructed for the Rio Grande transboundary region of New Mexico and Texas, USA, and northern Chihuahua, Mexico. This model was constructed to define the aquifer system geometry and subsurface lithologic characteristics and distribution for use in a regional numerical hydrologic model. The model includes five hydrostratigraphic units: river channel alluvium, three informal subdivisions of Santa Fe Group basin fill, and an undivided pre-Santa Fe Group bedrock unit. Model input data were compiled from published cross sections, well data, structure contour maps, selected geophysical data, and contiguous compilations of surficial geology and structural features in the study area. These data were used to construct faulted surfaces that represent the upper and lower subsurface hydrostratigraphic unit boundaries. The digital three-dimensional hydrogeologic framework model is constructed by combining faults, the elevations of the tops of each hydrostratigraphic unit, and boundary lines depicting the subsurface extent of each hydrostratigraphic unit. The framework also compiles a digital representation of the distribution of sedimentary facies within each hydrostratigraphic unit. The digital three-dimensional hydrogeologic model reproduces with reasonable accuracy the previously published subsurface hydrogeologic conceptualization of the aquifer system and represents the large-scale geometry of the subsurface aquifers. The model is at a scale and resolution appropriate for use as the foundation for a numerical hydrologic model of the study area.
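    The stacked-surface construction described above (faulted unit tops plus extent boundaries) implies a simple point query: the hydrostratigraphic unit at a location is found by comparing an elevation against the ordered unit tops. A minimal sketch of that lookup, with hypothetical unit names and top elevations loosely patterned on the five units listed in the abstract (not values from the actual model):

```python
# Unit tops ordered from shallowest to deepest; names and elevations
# are hypothetical, for illustration only.
UNIT_TOPS = [
    ("river channel alluvium",  1150.0),  # top elevation, meters
    ("upper Santa Fe Group",    1120.0),
    ("middle Santa Fe Group",    950.0),
    ("lower Santa Fe Group",     600.0),
    ("pre-Santa Fe bedrock",     200.0),
]

def unit_at_elevation(z):
    """Return the unit whose interval contains elevation z (None if above all units)."""
    current = None
    for name, top in UNIT_TOPS:
        if z <= top:
            current = name   # z lies at or below this unit's top
        else:
            break            # tops are ordered, so deeper units start lower
    return current
```

    In a faulted framework the tops would first be offset block by block; this sketch assumes a single unfaulted stack.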

  2. A Hydrostratigraphic Model and Alternatives for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 97: Yucca Flat-Climax Mine, Lincoln and Nye Counties, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geotechnical Sciences Group Bechtel Nevada

    2006-01-01

    A new three-dimensional hydrostratigraphic framework model for the Yucca Flat-Climax Mine Corrective Action Unit was completed in 2005. The model area includes Yucca Flat and Climax Mine, former nuclear testing areas at the Nevada Test Site, and proximal areas. The model area is approximately 1,250 square kilometers in size and is geologically complex. Yucca Flat is a topographically closed basin typical of many valleys in the Basin and Range province. Faulted and tilted blocks of Tertiary-age volcanic rocks and underlying Proterozoic and Paleozoic sedimentary rocks form low ranges around the structural basin. During the Cretaceous Period a granitic intrusive was emplaced at the north end of Yucca Flat. A diverse set of geological and geophysical data collected over the past 50 years was used to develop a structural model and hydrostratigraphic system for the basin. These were integrated using EarthVision software to develop the three-dimensional hydrostratigraphic framework model. Fifty-six stratigraphic units in the model area were grouped into 25 hydrostratigraphic units based on each unit's propensity toward aquifer or aquitard characteristics. The authors organized the alluvial section into 3 hydrostratigraphic units including 2 aquifers and 1 confining unit. The volcanic units in the model area are organized into 13 hydrostratigraphic units that include 8 aquifers and 5 confining units. The underlying pre-Tertiary rocks are divided into 7 hydrostratigraphic units, including 3 aquifers and 4 confining units. Other units include 1 Tertiary-age sedimentary confining unit and 1 Mesozoic-age granitic confining unit. The model depicts the thickness, extent, and geometric relationships of these hydrostratigraphic units ("layers" in the model) along with the major structural features (i.e., faults). The model incorporates 178 high-angle normal faults of Tertiary age and 2 low-angle thrust faults of Mesozoic age. The complexity of the model area and the non-uniqueness of some of the interpretations incorporated into the base model made it necessary to formulate alternative interpretations for some of the major features in the model. Five of these alternatives were developed so they could be modeled in the same fashion as the base model. This work was done for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office in support of the Underground Test Area subproject of the Environmental Restoration Project.

  3. A Hydrostratigraphic Framework Model and Alternatives for the Groundwater Flow and Contaminant Transport Model of Corrective Action Unit 98: Frenchman Flat, Clark, Lincoln and Nye Counties, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bechtel Nevada

    2005-09-01

    A new, revised three-dimensional (3-D) hydrostratigraphic framework model for Frenchman Flat was completed in 2004. The area of interest includes Frenchman Flat, a former nuclear testing area at the Nevada Test Site, and proximal areas. Internal and external reviews of an earlier (Phase I) Frenchman Flat model recommended additional data collection to address uncertainties. Subsequently, additional data were collected for this Phase II initiative, including five new drill holes and a 3-D seismic survey.

  4. A Hydrostratigraphic System for Modeling Groundwater Flow and Radionuclide Migration at the Corrective Action Unit Scale, Nevada Test Site and Surrounding Areas, Clark, Lincoln, and Nye Counties, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prothro, Lance; Drellack Jr., Sigmund; Mercadante, Jennifer

    2009-01-31

    Underground Test Area (UGTA) corrective action unit (CAU) groundwater flow and contaminant transport models of the Nevada Test Site (NTS) and vicinity are built upon hydrostratigraphic framework models (HFMs) that utilize the hydrostratigraphic unit (HSU) as the fundamental modeling component. The delineation and three-dimensional (3-D) modeling of HSUs within the highly complex geologic terrain that is the NTS requires a hydrostratigraphic system that is internally consistent, yet flexible enough to account for overlapping model areas, varied geologic terrain, and the development of multiple alternative HFMs. The UGTA CAU-scale hydrostratigraphic system builds on more than 50 years of geologic and hydrologic work in the NTS region. It includes 76 HSUs developed from nearly 300 stratigraphic units that span more than 570 million years of geologic time, and includes rock units as diverse as marine carbonate and siliciclastic rocks, granitic intrusives, rhyolitic lavas and ash-flow tuffs, and alluvial valley-fill deposits. The UGTA CAU-scale hydrostratigraphic system uses a geology-based approach and a two-level classification scheme. The first, or lowest, level of the hydrostratigraphic system is the hydrogeologic unit (HGU). Rocks in a model area are first classified as one of ten HGUs based on the rock’s ability to transmit groundwater (i.e., the nature of its porosity and permeability), which at the NTS is mainly a function of the rock’s primary lithology, type and degree of postdepositional alteration, and propensity to fracture. The second, or highest, level within the UGTA CAU-scale hydrostratigraphic system is the HSU, which is the fundamental mapping/modeling unit within UGTA CAU-scale HFMs. HSUs are 3-D bodies that are represented in the finite element mesh for the UGTA groundwater modeling process. HSUs are defined systematically by stratigraphically organizing HGUs of similar character into larger HSU designations. The careful integration of stratigraphic information in the development of HSUs is important to ensure that individual HSUs are internally consistent, correlatable, and mappable throughout all the model areas.
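    The two-level scheme described above can be sketched as a small grouping step: classify each stratigraphic unit as an HGU, then merge stratigraphically adjacent units of the same HGU class into an HSU. The unit names and HGU assignments below are illustrative inventions, not the UGTA tables:

```python
from itertools import groupby

# (stratigraphic unit, HGU class), listed in stratigraphic order.
# Names and assignments are illustrative only.
STRAT_TO_HGU = [
    ("alluvium",        "alluvial aquifer"),
    ("ash-flow tuff A", "welded-tuff aquifer"),
    ("ash-flow tuff B", "welded-tuff aquifer"),
    ("zeolitized tuff", "tuff confining unit"),
    ("carbonate rock",  "carbonate aquifer"),
]

def group_hsus(strat_to_hgu):
    """Merge stratigraphically adjacent units sharing an HGU class into HSUs."""
    hsus = []
    for hgu, members in groupby(strat_to_hgu, key=lambda pair: pair[1]):
        hsus.append((hgu, [name for name, _ in members]))
    return hsus
```

    Grouping only adjacent units preserves stratigraphic order, which is the point of building HSUs rather than pooling all rocks of one HGU class basin-wide.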

  5. Key subsurface data help to refine Trinity aquifer hydrostratigraphic units, south-central Texas

    USGS Publications Warehouse

    Blome, Charles D.; Clark, Allan K.

    2014-01-01

    The geologic framework and hydrologic characteristics of aquifers are important components for studying the nation’s subsurface heterogeneity and predicting its hydraulic budgets. Detailed study of an aquifer’s subsurface hydrostratigraphy is needed to understand both its geologic and hydrologic frameworks. Surface hydrostratigraphic mapping can also help characterize the spatial distribution and hydraulic connectivity of an aquifer’s permeable zones. Advances in three-dimensional (3-D) mapping and modeling have also enabled geoscientists to visualize the spatial relations between the saturated and unsaturated lithologies. This detailed study of two borehole cores, collected in 2001 at the Camp Stanley Storage Activity (CSSA) area, provided the foundation for revising a number of hydrostratigraphic units representing the middle zone of the Trinity aquifer. The CSSA area is a restricted military facility that encompasses approximately 4,000 acres and is located in Boerne, Texas, northwest of the city of San Antonio. Studies of both the surface and subsurface geology of the CSSA area are integral parts of a U.S. Geological Survey project funded through the National Cooperative Geologic Mapping Program. This modification of hydrostratigraphic units is being applied to all subsurface data used to construct a proposed 3-D EarthVision model of the CSSA area and areas to the south and west.

  6. Three-dimensional geologic framework modeling of faulted hydrostratigraphic units within the Edwards Aquifer, Northern Bexar County, Texas

    USGS Publications Warehouse

    Pantea, Michael P.; Cole, James C.

    2004-01-01

    This report describes a digital, three-dimensional faulted hydrostratigraphic model constructed to represent the geologic framework of the Edwards aquifer system in the area of San Antonio, northern Bexar County, Texas. The model is based on mapped geologic relationships that reflect the complex structures of the Balcones fault zone, detailed lithologic descriptions and interpretations of about 40 principal wells (and qualified data from numerous other wells), and a conceptual model of the gross geometry of the Edwards Group units derived from prior interpretations of depositional environments and paleogeography. The digital model depicts the complicated intersections of numerous major and minor faults in the subsurface, as well as their individual and collective impacts on the continuity of the aquifer-forming units of the Edwards Group and the Georgetown Formation. The model allows for detailed examination of the extent of fault dislocation from place to place, and thus the extent to which the effective cross-sectional area of the aquifer is reduced by faulting. The model also depicts the internal hydrostratigraphic subdivisions of the Edwards aquifer, consisting of three major and eight subsidiary hydrogeologic units. This geologic framework model is useful for visualizing the geologic structures within the Balcones fault zone and the interactions of en-echelon fault strands and flexed connecting fault-relay ramps. The model also aids in visualizing the lateral connections between hydrostratigraphic units of relatively high and low permeability across the fault strands. The Edwards aquifer is the principal source of water for municipal, agricultural, industrial, and military uses by nearly 1.5 million inhabitants of the greater San Antonio, Texas, region (Hovorka and others, 1996; Sharp and Banner, 1997).
Discharges from the Edwards aquifer also support local recreation and tourism industries at Barton, Comal, and San Marcos Springs located northeast of San Antonio (Barker and others, 1994), as well as base flow for agricultural applications farther downstream. Average annual discharge from large springs (Comal, San Marcos, Hueco, and others) from the Edwards aquifer was about 365,000 acre-ft from 1934 to 1998, with sizeable fluctuations related to annual variations in rainfall. Withdrawals through pumping have increased steadily from about 250,000 acre-ft during the 1960s to over 400,000 acre-ft in the 1990s in response to population growth, especially in the San Antonio metropolitan area (Slattery and Brown, 1999). Average annual recharge to the system (determined through stream gaging) has also varied considerably with annual rainfall fluctuations, but has been about 635,000 acre-ft over the last several decades.
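    The discharge, pumping, and recharge figures quoted above imply a simple long-term balance, sketched here with the abstract's rounded values (acre-feet per year):

```python
# Back-of-envelope water balance using the abstract's figures (acre-ft/yr).
recharge    = 635_000  # average annual recharge, last several decades
springs     = 365_000  # average annual spring discharge, 1934-1998
pumping_90s = 400_000  # approximate annual withdrawals in the 1990s

# Residual = inflow minus outflows; a negative value suggests 1990s outflows
# slightly exceeded average recharge, consistent with concern over pumping growth.
residual = recharge - (springs + pumping_90s)
```

    This ignores storage change and the large year-to-year variability the abstract emphasizes; it only shows why the 1990s pumping levels drew attention.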

  7. Hydraulic characterization of volcanic rocks in Pahute Mesa using an integrated analysis of 16 multiple-well aquifer tests, Nevada National Security Site, 2009–14

    USGS Publications Warehouse

    Garcia, C. Amanda; Jackson, Tracie R.; Halford, Keith J.; Sweetkind, Donald S.; Damar, Nancy A.; Fenelon, Joseph M.; Reiner, Steven R.

    2017-01-20

    An improved understanding of groundwater flow and radionuclide migration downgradient from underground nuclear-testing areas at Pahute Mesa, Nevada National Security Site, requires accurate subsurface hydraulic characterization. To improve conceptual models of flow and transport in the complex hydrogeologic system beneath Pahute Mesa, the U.S. Geological Survey characterized bulk hydraulic properties of volcanic rocks using an integrated analysis of 16 multiple-well aquifer tests. Single-well aquifer-test analyses provided transmissivity estimates at pumped wells. Transmissivity estimates ranged from less than 1 to about 100,000 square feet per day in Pahute Mesa and the vicinity. Drawdown from multiple-well aquifer testing was estimated and distinguished from natural fluctuations in more than 200 pumping and observation wells using analytical water-level models. Drawdown was detected at distances greater than 3 miles from pumping wells and propagated across hydrostratigraphic units and major structures, indicating that neither faults nor structural blocks noticeably impede or divert groundwater flow in the study area. Consistent hydraulic properties were estimated by simultaneously interpreting drawdown from the 16 multiple-well aquifer tests with an integrated groundwater-flow model composed of 11 well-site models (1 for each aquifer test site). Hydraulic properties were distributed across volcanic rocks with the Phase II Pahute Mesa-Oasis Valley Hydrostratigraphic Framework Model. Estimated hydraulic-conductivity distributions spanned more than two orders of magnitude in hydrostratigraphic units. Overlapping hydraulic conductivity ranges among units indicated that most Phase II Hydrostratigraphic Framework Model units were not hydraulically distinct. Simulated total transmissivity ranged from 1,600 to 68,000 square feet per day for all pumping wells analyzed.
High-transmissivity zones exceeding 10,000 square feet per day exist near caldera margins and extend along the northern and eastern Pahute Mesa study area and near the southwestern edge of the study area. The estimated hydraulic-property distributions and observed hydraulic connections among geologic structures improved the characterization and representation of groundwater flow at Pahute Mesa.
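    The abstract does not state which analytical solutions underlie the single-well transmissivity estimates and water-level models; a classical building block for such analyses is the Theis solution, sketched here in consistent units (function names and example values are illustrative, not from the study):

```python
import math

def theis_W(u, terms=30):
    """Theis well function W(u) via its convergent series (accurate for small u):
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) * u^n / (n * n!)."""
    total = -0.5772156649015329 - math.log(u)  # Euler-Mascheroni constant
    sign, fact = 1.0, 1.0
    for n in range(1, terms + 1):
        fact *= n
        total += sign * u**n / (n * fact)
        sign = -sign
    return total

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t).
    Any consistent unit system (e.g., feet and days)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * theis_W(u)
```

    Inverting this relation against observed drawdown (e.g., by curve matching or regression) yields the transmissivity T; the study's analytical water-level models additionally separate pumping signals from natural fluctuations, which this sketch does not attempt.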

  8. Hydrostratigraphic Framework of the Raton, Vermejo, and Trinidad Aquifers in the Raton Basin, Las Animas County, Colorado

    USGS Publications Warehouse

    Watts, Kenneth R.

    2006-01-01

    Exploration for and production of coalbed methane have increased substantially in the Rocky Mountain region of the United States since the 1990s. During 1999-2004, annual production of natural gas (coalbed methane) from the Raton Basin in Las Animas County, Colorado, increased from 28,129,515 to 80,224,130 thousand cubic feet, and the annual volume of ground water coproduced by coalbed methane wells increased from about 949 million gallons to about 2,879 million gallons. Better definition of the hydrostratigraphic framework of the Raton, Vermejo, and Trinidad aquifers in the Raton Basin of southern Colorado is needed to evaluate the long-term effects of coalbed methane development on the availability and sustainability of ground-water resources. In 2001, the U.S. Geological Survey, in cooperation with the Colorado Water Conservation Board, began a study to evaluate the hydrogeology of the Raton Basin in Huerfano and Las Animas Counties, Colorado. Geostatistical methods were used to map the altitude of and depths to the bottoms and tops (structure) and the apparent thicknesses of the Trinidad Sandstone, the Vermejo Formation, and the Raton Formation in Las Animas County, based on completion reports and drillers' logs from about 1,400 coalbed methane wells in the Raton Basin. There was not enough subsurface control to map the structural surfaces and apparent thicknesses of the aquifers in Huerfano County. Geostatistical methods also were used to map the regional water table in the northern part of Las Animas County, based on reported depth to water from completion reports of water-supply wells. Although these maps were developed to better define the hydrostratigraphic framework, they also can be used to determine the contributing aquifer(s) of existing water wells and to estimate drilling depths of proposed water wells.
These maps of the hydrostratigraphic framework could be improved with the addition of measured sections and mapping of geologic contacts at outcrops along the eastern and western margins of the Raton Basin.
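    The report's geostatistical structure mapping is not reproduced here; as a much simpler stand-in, an inverse-distance-weighted estimate of a formation-top altitude from scattered well picks illustrates the basic interpolation step (function name and values are hypothetical):

```python
def idw_elevation(picks, x, y, power=2.0):
    """Inverse-distance-weighted estimate of a surface altitude (e.g., top of
    the Trinidad Sandstone) at (x, y) from well picks [(xi, yi, zi), ...].
    A simple stand-in for the report's geostatistical (kriging-style) mapping."""
    num = den = 0.0
    for xi, yi, zi in picks:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return zi                      # exactly at a well: honor the pick
        w = 1.0 / d2 ** (power / 2.0)      # weight falls off as 1/distance^power
        num += w * zi
        den += w
    return num / den
```

    Kriging additionally models spatial correlation and yields uncertainty estimates, which is why it is preferred for structure mapping from ~1,400 wells; IDW only conveys the idea of distance-weighted averaging.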

  9. Three-dimensional model of the hydrostratigraphy and structure of the area in and around the U.S. Army-Camp Stanley Storage Activity Area, northern Bexar County, Texas

    USGS Publications Warehouse

    Pantea, Michael P.; Blome, Charles D.; Clark, Allan K.

    2014-01-01

    A three-dimensional model of the Camp Stanley Storage Activity area defines and illustrates the surface and subsurface hydrostratigraphic architecture of the military base and adjacent areas to the south and west using EarthVision software. The Camp Stanley model contains 11 hydrostratigraphic units, in descending order: 1 model layer representing the Edwards aquifer; 1 model layer representing the upper Trinity aquifer; 6 model layers representing the informal hydrostratigraphic units that make up the upper part of the middle Trinity aquifer; and 3 model layers representing, respectively, the Bexar, the Cow Creek, and the top of the Hammett units of the lower part of the middle Trinity aquifer. The Camp Stanley three-dimensional model includes 14 fault structures that generally trend northeast/southwest. The top of the Hammett hydrostratigraphic unit was used to propagate and validate all fault structures and to confirm most of the drill-hole data. Differences between modeled and previously mapped surface geology reflect interpretation of fault relations at depth, fault relations to hydrostratigraphic contacts, and surface digital elevation model simplification to fit the scale of the model. In addition, changes were made based on recently obtained drill-hole data and field reconnaissance done during construction of the model. The three-dimensional modeling process revealed previously undetected horst and graben structures in the northeastern and southern parts of the study area. This is atypical, as most faults in the area are en echelon faults that step down southeastward toward the Gulf Coast. The graben structures may increase the potential for controlling or altering local groundwater flow.

  10. General Separations Area (GSA) Groundwater Flow Model Update: Hydrostratigraphic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagwell, L.; Bennett, P.; Flach, G.

    2017-02-21

    This document describes the assembly, selection, and interpretation of hydrostratigraphic data for input to an updated groundwater flow model for the General Separations Area (GSA; Figure 1) at the Department of Energy’s (DOE) Savannah River Site (SRS). This report is one of several discrete but interrelated tasks that support development of an updated groundwater model (Bagwell and Flach, 2016).

  11. Hydrostratigraphic Framework and Selection and Correlation of Geophysical Log Markers in the Surficial Aquifer System, Palm Beach County, Florida

    USGS Publications Warehouse

    Reese, Ronald S.; Wacker, Michael A.

    2007-01-01

    The surficial aquifer system is the major source of freshwater for public water supply in Palm Beach County, Florida, yet many previous studies of the hydrogeology of this aquifer system have focused only on the eastern one-half to one-third of the county in the more densely populated coastal area (Land and others, 1973; Swayze and others, 1980; Swayze and Miller, 1984; Shine and others, 1989). Population growth in the county has resulted in the westward expansion of urbanized areas into agricultural areas and has created new demands on the water resources of the county. Additionally, interest in surface-water resources of central and western areas of the county has increased. In these areas, plans for additional surface-water storage reservoirs are being made under the Comprehensive Everglades Restoration Plan originally proposed by the U.S. Army Corps of Engineers and the South Florida Water Management District (1999), and stormwater treatment areas have been constructed by the South Florida Water Management District. Surface-water and ground-water interactions in the Everglades are thought to be important to water budgets, water quality, and ecology (Harvey and others, 2002). Most of the previous hydrogeologic and ground-water flow simulation studies of the surficial aquifer system have not utilized a hydrostratigraphic framework, in which stratigraphic or sequence stratigraphic units, such as those proposed in Cunningham and others (2001), are delineated in this stratigraphically complex aquifer system. A thick zone of secondary permeability mapped by Swayze and Miller (1984) was not subdivided and was identified as only being within the Anastasia Formation of Pleistocene age. Miller (1987) published 11 geologic sections of the surficial aquifer system, but did not delineate any named stratigraphic units in these sections. 
This limited interpretation has resulted, in part, from the complex facies changes within rocks and sediments of the surficial aquifer system and the seemingly indistinct and repetitious nature of the most common lithologies, which include sand, shell, sandstone, and limestone. Model construction and layer definition in a simulation of ground-water flow within the surficial aquifer system of Palm Beach County utilized only the boundaries of one or two major hydrogeologic zones, such as the Biscayne aquifer and surficial aquifer system; otherwise layers were defined by average elevations rather than geologic structure or stratigraphy (Shine and others, 1989). Additionally, each major permeable zone layer in the model was assumed to have constant hydraulic conductivity with no allowance for the possibility of discrete (thin) flow zones within the zone. The key to understanding the spatial distribution and hydraulic connectivity of permeable zones in the surficial aquifer system beneath Palm Beach County is the development of a stratigraphic framework based on a consistent method of county-wide correlation. Variability in hydraulic properties in the system needs to be linked to the stratigraphic units delineated in this framework, and proper delineation of the hydrostratigraphic framework should provide a better understanding and simulation of the ground-water flow system. In 2004, the U.S. Geological Survey, in cooperation with the South Florida Water Management District, initiated an investigation to develop a hydrostratigraphic framework for the surficial aquifer system in Palm Beach County.

  12. A Triangulation Method for Identifying Hydrostratigraphic Locations of Well Screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, T. S.

    2015-01-31

    A method to identify the hydrostratigraphic location of well screens was developed using triangulation with known locations. This method was applied to all of the monitor wells being used to develop the new GSA groundwater model. Results from this method are closely aligned with those from an alternate method which uses a mesh surface.
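    The abstract gives no details of the triangulation, but the core step is presumably planar (barycentric) interpolation of a unit-surface elevation at a well from three surrounding control points, after which the screen elevation can be compared against the interpolated surface. A minimal sketch under that assumption, with hypothetical values:

```python
def barycentric_interp(p, a, b, c):
    """Interpolate a value (e.g., a unit-top elevation) at point p = (x, y)
    from three control points a, b, c = (x, y, value) forming a triangle
    that encloses p. This is exact for a planar surface."""
    x, y = p
    (x1, y1, v1), (x2, y2, v2), (x3, y3, v3) = a, b, c
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    w3 = 1.0 - w1 - w2            # barycentric weights sum to 1
    return w1 * v1 + w2 * v2 + w3 * v3
```

    In practice the screen's midpoint elevation would be compared against unit tops interpolated this way for each candidate unit, assigning the screen to the unit whose interval contains it.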

  13. Characterizing the subsurface geology in and around the U.S. Army Camp Stanley Storage Activity, south-central Texas

    USGS Publications Warehouse

    Blome, Charles D.; Clark, Allan K.

    2018-02-15

    Several U.S. Geological Survey projects, supported by the National Cooperative Geologic Mapping Program, have used multi-disciplinary approaches over a 14-year period to reveal the surface and subsurface geologic frameworks of the Edwards and Trinity aquifers of central Texas and the Arbuckle-Simpson aquifer of south-central Oklahoma. Some of the project achievements include advancements in hydrostratigraphic mapping, three-dimensional subsurface framework modeling, and airborne geophysical surveys as well as new methodologies that link geologic and groundwater flow models. One area where some of these milestones were achieved was in and around the U.S. Army Camp Stanley Storage Activity, located in northwestern Bexar County, Texas, about 19 miles northwest of downtown San Antonio.

  14. Big data integration for regional hydrostratigraphic mapping

    NASA Astrophysics Data System (ADS)

    Friedel, M. J.

    2013-12-01

    Numerical models provide a way to evaluate groundwater systems, but determining the hydrostratigraphic units (HSUs) used in devising these models remains subjective, nonunique, and uncertain. A novel geophysical-hydrogeologic data integration scheme is proposed to constrain the estimation of continuous HSUs. First, machine-learning and multivariate statistical techniques are used to simultaneously integrate borehole hydrogeologic (lithology, hydraulic conductivity, aqueous field parameters, dissolved constituents) and geophysical (gamma, spontaneous potential, and resistivity) measurements. Second, airborne electromagnetic measurements are numerically inverted to obtain subsurface resistivity structure at randomly selected locations. Third, the machine-learning algorithm is trained using the borehole hydrostratigraphic units and inverted airborne resistivity profiles. The trained machine-learning algorithm is then used to estimate HSUs at independent resistivity profile locations. We demonstrate the efficacy of the proposed approach to map the hydrostratigraphy of a heterogeneous surficial aquifer in northwestern Nebraska.
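    The abstract does not name the machine-learning algorithm, so as a minimal stand-in for the trained learner, a nearest-neighbor rule that labels an inverted resistivity profile with the HSU of the most similar borehole profile can illustrate the final estimation step (feature values and labels below are invented):

```python
import math

# Training data: per-interval resistivity features (ohm-m) from inverted
# profiles at boreholes, paired with the HSU logged there. Values invented.
TRAIN = [
    ([12.0, 15.0],  "clay confining unit"),
    ([14.0, 11.0],  "clay confining unit"),
    ([85.0, 90.0],  "sand aquifer"),
    ([110.0, 95.0], "sand aquifer"),
]

def predict_hsu(features, train=TRAIN):
    """1-nearest-neighbor stand-in for the trained learner: label an
    airborne-EM resistivity profile with the closest borehole's HSU."""
    return min(train, key=lambda pair: math.dist(features, pair[0]))[1]
```

    The actual scheme also folds in hydrogeologic and other geophysical logs and multivariate statistics; this sketch only shows how labeled borehole profiles let unlabeled airborne profiles be classified.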

  15. Geologic and hydrostratigraphic map of the Anhalt, Fischer, and Spring Branch 7.5-minute quadrangles, Blanco, Comal, and Kendall Counties, Texas

    USGS Publications Warehouse

    Clark, Allan K.; Morris, Robert R.

    2015-01-01

    The hydrostratigraphic units of the Edwards and Trinity aquifers have been mapped and described herein using a classification system developed by Choquette and Pray (1970), which is based on whether porosity types are fabric selective or not fabric selective. The naming of hydrostratigraphic units is also based on preexisting names and topographic or historical features that occur in outcrop. The only hydrostratigraphic unit of the Edwards aquifer present in the study area is hydrostratigraphic unit VIII. The mapped hydrostratigraphic units of the upper Trinity aquifer are, from top to bottom: the cavernous, Camp Bullis, upper evaporite, fossiliferous, and lower evaporite units, which are interval equivalent to the upper member of the Glen Rose Limestone. The middle Trinity aquifer (interval equivalent to the lower member of the Glen Rose Limestone) contains, from top to bottom: the Bulverde, Little Blanco, Twin Sisters, Doeppenschmidt, Rust, and Honey Creek hydrostratigraphic units. The lower part of the middle Trinity aquifer is formed by the Hensell, Cow Creek, and Hammett hydrostratigraphic units, which are interval equivalent to the Hensell Sand Member, the Cow Creek Limestone, and the Hammett Shale Member, respectively, of the Pearsall Formation.

  16. HYDROGEOLOGIC FRAMEWORK, GROUND-WATER GEOCHEMISTRY, AND ASSESSMENT OF NITROGEN YIELD FROM BASE FLOW IN TWO AGRICULTURAL WATERSHEDS, KENT COUNTY, MARYLAND

    EPA Science Inventory

    Hydrostratigraphic and geochemical data collected in two adjacent watersheds on the Delmarva Peninsula, in Kent County, Maryland, indicate that shallow subsurface stratigraphy is an important factor that affects the concentrations of nitrogen in ground water discharging as stream...

  17. Application of Surface Geophysical Methods, With Emphasis on Magnetic Resonance Soundings, to Characterize the Hydrostratigraphy of the Brazos River Alluvium Aquifer, College Station, Texas, July 2006 - A Pilot Study

    USGS Publications Warehouse

    Shah, Sachin D.; Kress, Wade H.; Legchenko, Anatoly

    2007-01-01

    The U.S. Geological Survey, in cooperation with the Texas Water Development Board, used surface geophysical methods at the Texas A&M University Brazos River Hydrologic Field Research Site near College Station, Texas, in a pilot study, to characterize the hydrostratigraphic properties of the Brazos River alluvium aquifer and determine the effectiveness of the methods to aid in generating an improved ground-water availability model. Three non-invasive surface geophysical methods were used to characterize the electrical stratigraphy and hydraulic properties and to interpret the hydrostratigraphy of the Brazos River alluvium aquifer. Two methods, time-domain electromagnetic (TDEM) soundings and two-dimensional direct-current (2D-DC) resistivity imaging, were used to define the lateral and vertical extent of the Ships clay, the alluvium of the Brazos River alluvium aquifer, and the underlying Yegua Formation. Magnetic resonance sounding (MRS), a recently developed geophysical method, was used to derive estimates of the hydrologic properties including percentage water content and hydraulic conductivity. Results from the geophysics study demonstrated the usefulness of combined TDEM, 2D-DC resistivity, and MRS methods to reduce the need for additional boreholes in areas with data gaps and to provide more accurate information for ground-water availability models. Stratigraphically, the principal finding of this study is the relation between electrical resistivity and the depth and thickness of the subsurface hydrostratigraphic units at the site. TDEM data defined a three-layer electrical stratigraphy corresponding to a conductor-resistor-conductor that represents the hydrostratigraphic units - the Ships clay, the alluvium of the Brazos River alluvium aquifer, and the Yegua Formation. Sharp electrical boundaries occur at about 4 to 6 and 20 to 22 meters below land surface based on the TDEM data and define the geometry of the more resistive Brazos River alluvium aquifer. 
Variations in resistivity in the alluvium aquifer, which range from 10 to more than 175 ohm-meters, possibly are caused by lateral changes in grain size. Resistivity increases from east to west along a profile away from the Brazos River, which signifies an increase in grain size within the alluvium aquifer and therefore a more productive zone with more abundant water in the aquifer. MRS data can help delineate the subsurface hydrostratigraphy and identify the geometric boundaries of the hydrostratigraphic units by identifying changes in the free water content, transmissivity, and hydraulic conductivity. MRS data indicate that the most productive zones of the alluvium aquifer occur between 12 and 25 meters below land surface in the western part of the study area, where the hydraulic conductivity can be as high as 250 meters per day. Hydrostratigraphically, individual hydraulic conductivity values derived from MRS were consistent with those from aquifer tests conducted in 1996 in the study area. Average hydraulic conductivity values from the aquifer tests range from about 61 to 80 meters per day, whereas the MRS-derived hydraulic conductivity values range from about 27 to 97 meters per day. An interpolated profile of the MRS-derived hydraulic conductivity values, together with the individual values, can help describe the hydrostratigraphic framework of an area and constrain ground-water models for better accuracy.
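    The conductor-resistor-conductor TDEM interpretation described above amounts to picking the depths at which resistivity crosses between conductive clay/shale and resistive alluvium. A sketch of that picking step, with an illustrative threshold and log (not the study's actual values):

```python
def pick_boundaries(log, threshold=30.0):
    """Pick depths where a resistivity-depth log [(depth_m, ohm_m), ...]
    crosses `threshold`, separating conductive clay/shale from the more
    resistive alluvium. Threshold and log values are illustrative only."""
    boundaries = []
    for (d0, r0), (d1, r1) in zip(log, log[1:]):
        if (r0 < threshold) != (r1 < threshold):        # sign of contrast flips
            frac = (threshold - r0) / (r1 - r0)          # linear interpolation
            boundaries.append(d0 + frac * (d1 - d0))
    return boundaries
```

    Run on a synthetic log with contrasts near 4-6 m and 20-22 m, the picks fall in those intervals, mimicking the sharp electrical boundaries the TDEM data defined at the site.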

  18. Hydrostratigraphic characterization of intergranular and secondary porosity in part of the Cambrian sandstone aquifer system of the cratonic interior of North America: Improving predictability of hydrogeologic properties

    USGS Publications Warehouse

    Runkel, Anthony C.; Tipping, R.G.; Alexander, E.C.; Alexander, S.C.

    2006-01-01

    The Upper Cambrian interval of strata in the cratonic interior of North America has a long history of inconsistent hydrogeologic classification and a reputation for marked and unpredictable variability in hydraulic properties. We employed a hydrostratigraphic approach that requires hydraulic data to be interpreted within the context of a detailed characterization of the distribution of porosity and permeability to arrive at a better understanding of these rocks. As a first step, we constructed a framework of hydrostratigraphic attributes that is a depiction of the spatial distribution of both rock matrix and secondary porosity, independent of hydraulic data such as pumping-test results. The locations of hundreds of borehole geophysical logs and laboratory measurements of rock sample matrix porosity and permeability were mapped on detailed (mostly 1:100,000 or greater), conventional, lithostratigraphic maps. Stratigraphic cross-sections, based on hundreds of natural gamma logs and thousands of water-well records, have provided a markedly improved depiction of the regional distribution of rock matrix hydrostratigraphic components. Borehole, core and outcrop observations of secondary porosity were also tied to detailed stratigraphic sections and interpolated regionally. As a second step, we compiled and conducted a large number of hydraulic tests (e.g., packer tests and borehole flowmeter logs) and analyzed thousands of specific capacity tests (converted to hydraulic conductivity). Interpretation of these data within the context of the hydrostratigraphic attributes allowed us to produce a new hydrogeologic characterization for this stratigraphic interval and gain important insights into geologic controls on hydraulic variability. 
There are a number of assumptions inherent in most previous hydrogeologic investigations of these strata, such as equivalency of lithostratigraphic and hydrogeologic units and the dominance of intergranular flow in sandstone, that are not consistent with our results. A particularly important outcome of our study is recognition of regionally extensive bedding-plane fracture clusters. Such exceptionally high hydraulic conductivity features dominate the hydraulics of aquifers and confining units in these siliciclastic-dominated strata, including within intervals consisting largely of friable sandstone with high intergranular conductivity. Furthermore, our results provide some measure of fracture predictability by correlating their abundance and hydraulic importance to specific stratigraphic positions and particular depths of burial beneath younger bedrock. A discrete, consistent stratigraphic interval of fine-grained siliciclastic beds also is apparently resistant to the development of vertically interconnected fractures, making the location of this regionally extensive confining unit predictable. Our more rigorous approach of interpreting typical hydraulic tests as well as relatively new techniques of borehole flowmeter logging, within the context of a hydrostratigraphic framework, results in improved definition of individual aquifers and confining units. It also enables quantification of their hydraulic properties, which leads to improved prediction of groundwater flow paths and time-of-travel. © 2005 Elsevier B.V. All rights reserved.
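The conversion of specific-capacity tests to hydraulic conductivity mentioned above is commonly done by iterating on the Cooper-Jacob approximation. The sketch below shows that general idea with wholly assumed parameter values (well radius, storativity, test duration, aquifer thickness); it is not the authors' actual procedure or data.

```python
import math

# Hedged sketch: estimate transmissivity T from a specific-capacity test
# (Q/s) by fixed-point iteration on the Cooper-Jacob approximation,
# T = (Q/s) * ln(2.25*T*t / (r^2*S)) / (4*pi), then K = T / b.
# All parameter values are illustrative assumptions, not study data.

def transmissivity_from_specific_capacity(q_over_s, t_days, r_m, S, n_iter=50):
    T = q_over_s  # initial guess, m^2/day
    for _ in range(n_iter):
        T = q_over_s * math.log(2.25 * T * t_days / (r_m ** 2 * S)) / (4 * math.pi)
    return T

q_over_s = 500.0  # specific capacity Q/s, m^2/day (assumed)
T = transmissivity_from_specific_capacity(q_over_s, t_days=1.0, r_m=0.1, S=1e-4)
K = T / 20.0      # hydraulic conductivity for an assumed 20-m-thick aquifer
```

The iteration converges quickly because T appears only inside the logarithm; corrections for well loss and partial penetration would be applied in a real analysis.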

  19. Geologic framework, hydrostratigraphy, and ichnology of the Blanco, Payton, and Rough Hollow 7.5-minute quadrangles, Blanco, Comal, Hays, and Kendall Counties, Texas

    USGS Publications Warehouse

    Clark, Allan K.; Golab, James A.; Morris, Robert E.

    2016-09-13

    This report presents the geologic framework, hydrostratigraphy, and ichnology of the Trinity and Edwards Groups in the Blanco, Payton, and Rough Hollow 7.5-minute quadrangles in Blanco, Comal, Hays, and Kendall Counties, Texas. Rocks exposed in the study area are of the Lower Cretaceous Trinity Group and the lower part of the Fort Terrett Formation of the Lower Cretaceous Edwards Group. The mapped units in the study area are the Hammett Shale, Cow Creek Limestone, Hensell Sand, and Glen Rose Limestone of the Trinity Group and the lower portion of the Fort Terrett Formation of the Edwards Group. The Glen Rose Limestone is composed of the Lower and Upper Members. These Trinity Group rocks contain the upper and middle Trinity aquifers. The only remaining outcrops of the Edwards Group are the basal nodular member of the Fort Terrett Formation, which caps several hills in the northern portion of the study area. These rocks were deposited in an open marine to supratidal flats environment. The faulting and fracturing in the study area are part of the Balcones fault zone, an extensional system of faults that generally trends southwest to northeast in south-central Texas. The hydrostratigraphic units of the Edwards and Trinity aquifers were mapped and described using a classification system based on fabric-selective or not-fabric-selective porosity types. The only hydrostratigraphic unit of the Edwards aquifer present in the study area is hydrostratigraphic unit VIII. The mapped hydrostratigraphic units of the upper Trinity aquifer are (from top to bottom) the Camp Bullis, upper evaporite, fossiliferous, and lower evaporite units, which are equivalent to the interval of the Upper Member of the Glen Rose Limestone. The middle Trinity aquifer encompasses (from top to bottom) the Lower Member of the Glen Rose Limestone, the Hensell Sand Member, and the Cow Creek Limestone Member of the Pearsall Formation.
The Lower Member of the Glen Rose Limestone is subdivided into six informal hydrostratigraphic units (from top to bottom): the Bulverde, Little Blanco, Twin Sisters, Doeppenschmidt, Rust, and Honey Creek hydrostratigraphic units. This study used the ichnofabric index scale to interpret the amount of bioturbation in the field. Most of the geologic units in the study area are assigned to the Cruziana and Thalassinoides ichnofacies, consistent with interpretations of a tide-dominated open marine environment (sublittoral zone). Ichnofossil assemblages are dominated by Thalassinoides networks but also contain Cruziana, Ophiomorpha, Paleophycus, Planolites, and serpulid traces.

  20. An Integrated Hydrogeologic and Geophysical Investigation to Characterize the Hydrostratigraphy of the Edwards Aquifer in an Area of Northeastern Bexar County, Texas

    USGS Publications Warehouse

    Shah, Sachin D.; Smith, Bruce D.; Clark, Allan K.; Payne, Jason

    2008-01-01

    In August 2007, the U.S. Geological Survey, in cooperation with the San Antonio Water System, conducted a hydrogeologic and geophysical investigation to characterize the hydrostratigraphy (hydrostratigraphic zones) and the hydrogeologic features (karst features such as sinkholes and caves) of the Edwards aquifer in a 16-square-kilometer area of northeastern Bexar County, Texas, undergoing urban development. Existing hydrostratigraphic information, enhanced by local-scale geologic mapping in the area, and surface geophysics were used to associate ranges of electrical resistivities obtained from capacitively coupled (CC) resistivity surveys, frequency-domain electromagnetic (FDEM) surveys, time-domain electromagnetic (TDEM) soundings, and two-dimensional direct-current (2D-DC) resistivity surveys with each of seven hydrostratigraphic zones (equivalent to members of the Kainer and Person Formations) of the Edwards aquifer. The principal finding of this investigation is the relation between electrical resistivity and the contacts between the hydrostratigraphic zones of the Edwards aquifer and the underlying Trinity aquifer in the area. In general, the TDEM data indicate a two-layer model in which an electrical conductor underlies an electrical resistor, which is consistent with the Trinity aquifer (conductor) underlying the Edwards aquifer (resistor). TDEM data also show the plane of Bat Cave fault, a well-known fault in the area, to be associated with a local, nearly vertical zone of low resistivity that provides evidence, although not definitive, for Bat Cave fault functioning as a flow barrier, at least locally. In general, the CC resistivity, FDEM survey, and 2D-DC resistivity survey data show a sharp electrical contrast from north to south, changing from high resistivity to low resistivity across Bat Cave fault as well as possible karst features in the study area.
Interpreted karst features that show relatively low resistivity within a relatively high-resistivity area likely are attributable to clay or soil filling a sinkhole. In general, faults are inferred where lithologic incongruity indicates possible displacement. Along most inferred faults, displacement was not sufficient to place different members of the Kainer or Person Formations (hydrostratigraphic zones) adjacent across the inferred fault plane. In general, the Kainer Formation (hydrostratigraphic zones V through VIII) has a higher resistivity than the Person Formation (hydrostratigraphic zones II through IV). Although resistivity variations from the CC resistivity, FDEM, and 2D-DC resistivity surveys, with mapping information, were sufficient to allow surface mapping of the lateral extent of hydrostratigraphic zones in places, resistivity variations from TDEM data were not sufficient to allow vertical delineation of hydrostratigraphic zones; however, the Edwards aquifer-Trinity aquifer contact could be identified from the TDEM data.

  1. Development of a Unified Hydrostratigraphic Framework for the Floridan Aquifer System in Central and Southern Florida

    NASA Astrophysics Data System (ADS)

    Reese, R. S.

    2008-05-01

    The mostly carbonate Floridan aquifer system (FAS) of central and southern Florida is a widely used resource with a complex hydrostratigraphic framework that is managed primarily in a subregional context according to water management jurisdictional boundaries. As use of the FAS increases, a consistent regional hydrostratigraphic framework is needed for effective management across these boundaries. Stratigraphic marker horizons within and near the top of the FAS were delineated and mapped to develop a preliminary, correlative stratigraphic framework. This framework was used to identify and delineate aquifers, subaquifers, and confining units and map their spatial distribution. These horizons are based on lithologic changes and geophysical log signatures identified in previous studies, and they were extended throughout the study area primarily by correlation of natural gamma-ray logs. The FAS consists of the Upper Floridan aquifer, middle confining unit, and Lower Floridan aquifer. A regional, productive zone is delineated and informally referred to as the Avon Park permeable zone. This zone is present over most of the study area and is characterized by thick units of dolostone with interbedded limestone and high fracture permeability. The zone has been identified in different regions in previous studies, either as the upper part of the Lower Floridan aquifer or as the lower part of the Upper Floridan aquifer. In this study it is generally considered to be within the middle confining unit. Transmissivity of the Avon Park permeable zone, a major source of water supply, generally ranges from less than 1×10⁴ to 1.6×10⁶ ft²/day and is greatest in central Florida, where dolomite is developed as a major component of the zone. A large area of low transmissivity (less than 10⁵ ft²/day) in southern Florida coincides with an area where limestone is the predominant lithology within the zone.
Major uses of the FAS now include withdrawal for public and agricultural supply, including treatment with reverse osmosis, aquifer storage and recovery, and disposal of treated wastewater. Water-level and water-quality conflicts could arise between these competing uses, and delineating the extent and hydraulic connectivity of the Avon Park permeable zone within the FAS may help managers and others predict and minimize such conflicts.

  2. Groundwater Flow Model of Corrective Action Units 101 and 102: Central and Western Pahute Mesa, Nevada Test Site, Nye County, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greg Ruskauff

    2006-06-01

    The Pahute Mesa groundwater flow model supports the FFACO UGTA corrective action strategy objective of providing an estimate of the vertical and horizontal extent of contaminant migration for each CAU in order to predict contaminant boundaries. A contaminant boundary is the model-predicted perimeter that defines the extent of radionuclide-contaminated groundwater from underground nuclear testing above background conditions exceeding Safe Drinking Water Act (SDWA) standards. The contaminant boundary will be composed of both a perimeter boundary and a lower hydrostratigraphic unit (HSU) boundary. Additional results showing contaminant concentrations and the location of the contaminant boundary at selected times will also be presented. These times may include the verification period, the end of the five-year proof-of-concept period, as well as other times that are of specific interest. The FFACO (1996) requires that the contaminant transport model predict the contaminant boundary at 1,000 years and “at a 95% level of confidence.” The Pahute Mesa Phase I flow model described in this report provides, through the flow fields derived from alternative hydrostratigraphic framework models (HFMs) and recharge models, one part of the data required to compute the contaminant boundary. Other components include the simplified source term model, which incorporates uncertainty and variability in the factors that control radionuclide release from an underground nuclear test (SNJV, 2004a), and the transport model with the concomitant parameter uncertainty as described in Shaw (2003). The uncertainty in all the above model components will be evaluated to produce the final contaminant boundary. This report documents the development of the groundwater flow model for the Central and Western Pahute Mesa CAUs.
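One way to read the ensemble-based contaminant boundary described above is as an exceedance-probability map over many flow and transport realizations. The sketch below is a toy illustration under assumed grid, concentration, and threshold values; the 5% exceedance rule is one possible reading of the 95%-confidence requirement, not the documented UGTA procedure.

```python
import numpy as np

# Hedged toy sketch: combine an ensemble of transport realizations (standing
# in for alternative HFMs and recharge models) into a probabilistic
# contaminant boundary. Grid, concentrations, and thresholds are assumptions.

rng = np.random.default_rng(0)
n_real, ny, nx = 200, 20, 20
mcl = 20.0  # SDWA-type standard (illustrative units)

# Fake ensemble: concentration decays away from a source at cell (0, 0),
# with realization-to-realization variability mimicking framework uncertainty.
y, x = np.mgrid[0:ny, 0:nx]
dist = np.hypot(x, y)
conc = rng.lognormal(mean=0.0, sigma=0.5, size=(n_real, 1, 1)) * 100.0 * np.exp(-dist / 5.0)

# Probability (across realizations) that each cell exceeds the standard.
p_exceed = (conc > mcl).mean(axis=0)

# One reading of "95% confidence": include every cell whose exceedance
# probability is at least 5%.
boundary_mask = p_exceed >= 0.05
```

The real analysis propagates source-term and transport-parameter uncertainty through the flow fields; this sketch only shows the final percentile bookkeeping.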

  3. Deep resistivity structure of Yucca Flat, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Asch, Theodore H.; Rodriguez, Brian D.; Sampson, Jay A.; Wallin, Erin L.; Williams, Jackie M.

    2006-01-01

    The Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office are addressing groundwater contamination resulting from historical underground nuclear testing through the Environmental Management program and, in particular, the Underground Test Area project. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area adjacent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey, supported by the DOE and NNSA-NSO, collected and processed data from 51 magnetotelluric (MT) and audio-magnetotelluric (AMT) stations at the Nevada Test Site in and near Yucca Flat to assist in characterizing the pre-Tertiary geology in that area. The primary purpose was to refine the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (Late Devonian to Mississippian-age siliciclastic rocks assigned to the Eleana Formation and Chainman Shale) in the Yucca Flat area. The MT and AMT data have been released in separate USGS Open-File Reports. The Nevada Test Site magnetotelluric data interpretation presented in this report includes the results of detailed two-dimensional (2-D) resistivity modeling for each profile (including alternative interpretations) and gross inferences on the three-dimensional (3-D) character of the geology beneath each station. The character, thickness, and lateral extent of the Chainman Shale and Eleana Formation that comprise the Upper Clastic Confining Unit are generally well determined in the upper 5 km. Inferences can be made regarding the presence of the Lower Clastic Confining Unit at depths below 5 km.
Large fault structures such as the CP Thrust fault, the Carpetbag fault, and the Yucca fault that cross Yucca Flat are also discernible, as are other smaller faults. The subsurface electrical resistivity distribution and inferred geologic structures determined by this investigation should help constrain the hydrostratigraphic framework model that is under development.

  4. Hydrostratigraphic and structural controls on streamflow generation in the Chuska Mountains, Navajo Nation, AZ/NM

    NASA Astrophysics Data System (ADS)

    Tsinnajinnie, L.; Frisbee, M. D.; Wilson, J. L.

    2017-12-01

    A conceptual model of hydrostratigraphic and structural influences on 3D streamflow generation processes is tested in the Whiskey Creek watershed located in the Chuska Mountains of the Navajo Nation along the northern NM/AZ border. The role of hydrostratigraphy and structure in groundwater processes has been well studied. However, the influences of heterogeneity due to geologic structure and stratigraphy of mountain blocks on 3D streamflow generation have received less attention. Three-dimensional flow in mountainous watersheds, such as Saguache Creek (CO) and Rio Hondo (NM), contributes significant amounts of groundwater from deep circulation to streamflow. This fully 3D conceptual model is fundamentally different from watersheds characterized as 2D, those dominated by surface and shallow subsurface runoff, because 3D watersheds can have much longer flowpaths and mean residence times (up to 1000s of years). In contrast to Saguache Creek (volcanic bedrock) and Rio Hondo (crystalline metamorphic), the bedrock geology of the watersheds draining the Chuska Mountains is composed primarily of sedimentary bedrock capped by extrusive volcanics. We test this conceptual model using a combination of stream gauging, tritium analyses, and endmember mixing analysis (EMMA) on the general ion chemistry and stable isotope composition of water samples collected in 2013-2016. Springs that emerge from the Chuska Sandstone are tritium dead, indicative of a large component of pre-bomb-pulse water in discharge and of deeper 3D flow. EMMA indicates that most streamflow is generated from groundwater emerging from the Chuska Sandstone. Gaining/losing conditions in Whiskey Creek are strongly related to hydrostratigraphy, as evidenced by a transition from gaining conditions largely found in the Chuska Sandstone to losing conditions where the underlying Chinle Formation outcrops.
Although tritium in Whiskey Creek suggests that 3D interactions are present, hydrostratigraphic and structural controls may limit the occurrence of longer residence times and longer flow paths. Mountainous watersheds that fit this 3D, hydrostratigraphically and structurally controlled conceptual model will exhibit different responses to perturbations, such as climate change, than watersheds that fit existing 2D and 3D conceptual models.

  5. The Silent Canyon caldera complex: a three-dimensional model based on drill-hole stratigraphy and gravity inversion

    USGS Publications Warehouse

    McKee, Edwin H.; Hildenbrand, Thomas G.; Anderson, Megan L.; Rowley, Peter D.; Sawyer, David A.

    1999-01-01

    The structural framework of Pahute Mesa, Nevada, is dominated by the Silent Canyon caldera complex, a buried, multiple-collapse caldera complex. Using the boundary surface between low-density Tertiary volcanogenic rocks and denser granitic and weakly metamorphosed sedimentary rocks (basement) as the outer fault surfaces for the modeled collapse caldera complex, it is postulated that the caldera complex collapsed on steeply dipping arcuate faults two, possibly three, times following eruption of at least two major ash-flow tuffs. The caldera and most of its eruptive products are now deeply buried below the surface of Pahute Mesa. Relatively low-density rocks in the caldera complex produce one of the largest gravity lows in the western conterminous United States. Gravity modeling defines a steep-sided, cup-shaped depression as much as 6,000 meters (19,800 feet) deep that is surrounded and floored by denser rocks. The steeply dipping surface located between the low-density basin fill and the higher density external rocks is considered to be the surface of the ring faults of the multiple calderas. Extrapolation of this surface upward to the outer, or topographic, rim of the Silent Canyon caldera complex defines the upper part of the caldera collapse structure. Rock units within and outside the Silent Canyon caldera complex are combined into seven hydrostratigraphic units based on their predominant hydrologic characteristics. The caldera structures and other faults on Pahute Mesa are used with the seven hydrostratigraphic units to make a three-dimensional geologic model of Pahute Mesa using the "EarthVision" (Dynamic Graphics, Inc.) modeling computer program. This method allows graphic representation of the geometry of the rocks and produces computer-generated cross sections, isopach maps, and three-dimensionally oriented diagrams. These products have been created to aid in visualizing and modeling the ground-water flow system beneath Pahute Mesa.
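The link between the gravity low and the depth of the caldera fill can be illustrated, to first order, with the infinite-slab approximation: fill thickness ≈ Δg / (2πGΔρ). The anomaly magnitude and density contrast below are assumed illustrative values, not the figures used in the actual inversion, which modeled a fully three-dimensional boundary surface.

```python
import math

# First-order sketch: infinite-slab (Bouguer) estimate of fill thickness
# from a gravity low, thickness = dg / (2*pi*G*drho). The 60-mGal anomaly
# and 400 kg/m^3 density contrast are assumed values for illustration only.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
dg = 60e-5             # 60 mGal expressed in m/s^2
drho = 400.0           # density contrast, kg/m^3 (fill vs. basement)

thickness_m = dg / (2 * math.pi * G * drho)  # on the order of a few kilometers
```

A slab estimate of this kind only bounds the problem; the steep ring-fault geometry described above requires full 3D forward modeling.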

  6. Bedrock geology and hydrostratigraphy of the Edwards and Trinity aquifers within the Driftwood and Wimberley 7.5-minute quadrangles, Hays and Comal Counties, Texas

    USGS Publications Warehouse

    Clark, Allan K.; Morris, Robert R.

    2017-11-16

    The Edwards and Trinity aquifers are major sources of water in south-central Texas and are both classified as major aquifers by the State of Texas. The population in Hays and Comal Counties is rapidly growing, increasing demands on the area’s water resources. To help effectively manage the water resources in the area, refined maps and descriptions of the geologic structures and hydrostratigraphic units of the aquifers are needed. This report presents the detailed 1:24,000-scale bedrock hydrostratigraphic map as well as names and descriptions of the geologic and hydrostratigraphic units of the Driftwood and Wimberley 7.5-minute quadrangles in Hays and Comal Counties, Tex. Hydrostratigraphically, the rocks exposed in the study area represent a section of the upper confining unit to the Edwards aquifer, the Edwards aquifer, the upper zone of the Trinity aquifer, and the middle zone of the Trinity aquifer. In the study area, the Edwards aquifer is composed of the Georgetown Formation and the rocks forming the Edwards Group. The Trinity aquifer is composed of the rocks forming the Trinity Group. The Edwards and Trinity aquifers are karstic with high secondary porosity along bedding and fractures. The Del Rio Clay is a confining unit above the Edwards aquifer and does not supply appreciable amounts of water to wells in the study area. The hydrologic connection between the Edwards and Trinity aquifers and the various hydrostratigraphic units is complex because the aquifer system is a combination of the original Cretaceous depositional environment, bioturbation, primary and secondary porosity, diagenesis, and fracturing of the area from Miocene faulting. All of these factors have resulted in development of modified porosity, permeability, and transmissivity within and between the aquifers.
Faulting produced highly fractured areas, which allowed rapid infiltration of water and subsequently formed solutionally enhanced fractures, bedding planes, channels, and caves that are highly permeable and transmissive. Because of faulting, the juxtaposition of hydrostratigraphic units has resulted in areas of interconnection between the Edwards and Trinity aquifers.

  7. Modeling coastal aquifers in a Mediterranean area: the example of Taranto gulf (southern Italy)

    NASA Astrophysics Data System (ADS)

    De Filippis, Giovanna; Giudici, Mauro; Negri, Sergio; Margiotta, Stefano; Cattaneo, Laura; Vassena, Chiara

    2015-04-01

    Water resources stored in coastal aquifers are of strategic relevance for several regions throughout the world and in particular in the Mediterranean basin. They are extremely important in areas characterized by heavy urbanization and active industrial or touristic systems, where the need for fresh water is very acute and, sometimes, they are the only water resources available. This in turn can lead to the phenomenon of seawater intrusion because of aquifer overexploitation to satisfy the demand of an increasing population in coastal plains. Furthermore, karstic aquifers are well known for their specific vulnerability to natural and human-induced contamination, due to their particular characteristics such as thin soils, point recharge in dolines and swallow holes, and increased hydraulic conductivity. Within this framework, the Taranto gulf is an example of paramount importance. In fact, the presence of a wide industrial area close to the city of Taranto and the numerous maritime and military activities in the harbor area favored the increase of population density in the 20th century. Moreover, they constitute factors of great concern for the protection of groundwater quality and quantity, in particular for the presence of the highly vulnerable basins of Mar Piccolo and Mar Grande. In this area, groundwater resources are stored in a karst multilayered aquifer, which is very complex from the hydrostratigraphic point of view. Furthermore, the presence of highly water-demanding activities makes the seawater intrusion phenomenon very serious, especially along the coastline. In order to characterize groundwater dynamics in the study area, we discuss the hydraulic relationships between the different hydrostratigraphic units and between the sea and the aquifer system by developing a numerical groundwater model to test and refine the preliminary conceptual model and estimate the most uncertain hydraulic parameters.
To achieve these objectives, we used different data-sets to characterize the study area from the hydrostratigraphic point of view and to identify the source terms and the groundwater outflows (i.e., submarine and subaerial freshwater springs). For the numerical simulations, the computer code YAGMod, which was originally developed to perform 3D groundwater flow simulation with a simplified treatment of unsaturated/saturated conditions and the effects of strong aquifer exploitation, has been upgraded to the case of a variable density flow. This research activity is part of the research program RITMARE (The Italian Research for the Sea), within which a subprogram is specifically dedicated to the problem of the protection and preservation of groundwater quality in Italian coastal aquifers and in particular, among the others, in the Taranto area.
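A minimal starting point for the variable-density seawater-intrusion problem mentioned above is the Ghyben-Herzberg approximation, sketched below with typical freshwater and seawater densities. These are generic values, not Taranto site data, and a full variable-density code such as the upgraded YAGMod is of course far more elaborate.

```python
# Hedged sketch: the Ghyben-Herzberg approximation places the freshwater-
# saltwater interface at depth z = rho_f / (rho_s - rho_f) * h below sea
# level for a freshwater head h above sea level. The densities below are
# typical textbook values, not measurements from the Taranto aquifer.

def interface_depth(head_m: float, rho_f: float = 1000.0, rho_s: float = 1025.0) -> float:
    """Depth (meters below sea level) of the freshwater-saltwater interface."""
    return rho_f / (rho_s - rho_f) * head_m

z = interface_depth(0.5)  # 0.5 m of freshwater head -> ~20 m interface depth
```

The roughly 40:1 leverage of head on interface depth is why even modest overexploitation near the coast can drive substantial intrusion.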

  8. Three-dimensional mapping of equiprobable hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirley, C.; Pohlmann, K.; Andricevic, R.

    1996-09-01

    Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.
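The first step described above, indicator-transforming geophysical tool signals and modeling semivariograms from the indicators, can be sketched as follows. The log values and threshold are synthetic assumptions for illustration; a real workflow would fit a variogram model to these experimental points before running ISIM3D.

```python
import numpy as np

# Hedged sketch: indicator-transform a geophysical log at a threshold and
# compute an experimental semivariogram of the indicators along the borehole.
# The log values and threshold are synthetic, not Frenchman Flat data.

def indicator_semivariogram(values, threshold, max_lag):
    """gamma(h) = 0.5 * mean((i[k+h] - i[k])^2) for the indicator i = values > threshold."""
    i = (np.asarray(values) > threshold).astype(float)
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((i[h:] - i[:-h]) ** 2) for h in lags])
    return lags, gamma

log = [10, 12, 30, 35, 33, 11, 9, 28, 31, 12]   # synthetic tool response
lags, gamma = indicator_semivariogram(log, threshold=20, max_lag=3)
```

For a binary indicator the semivariogram is bounded by 0.5, which is why the sill of the fitted model is tied to the indicator proportion.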

  9. A refined characterization of the alluvial geology of Yucca Flat and its effect on bulk hydraulic conductivity

    USGS Publications Warehouse

    Phelps, G.A.; Halford, K.J.

    2011-01-01

    In Yucca Flat, on the Nevada National Security Site in southern Nevada, the migration of radionuclides from tests located in the alluvial deposits into the Paleozoic carbonate aquifer involves passage through a thick, heterogeneous section of late Tertiary and Quaternary alluvial sediments. An understanding of the lateral and vertical changes in the material properties of the alluvial sediments will aid in the further development of the hydrogeologic framework and the delineation of hydrostratigraphic units and hydraulic properties required for simulating groundwater flow in the Yucca Flat area. Previously published geologic models for the alluvial sediments within Yucca Flat are based on extensive examination and categorization of drill-hole data, combined with a simple, data-driven interpolation scheme. The U.S. Geological Survey, in collaboration with Stanford University, is researching improvements to the modeling of the alluvial section, incorporating prior knowledge of geologic structure into the interpolation method and estimating the uncertainty of the modeled hydrogeologic units.

  10. Using 3D geological modelling and geochemical mixing models to characterise alluvial aquifer recharge sources in the upper Condamine River catchment, Queensland, Australia.

    PubMed

    Martinez, Jorge L; Raiber, Matthias; Cendón, Dioni I

    2017-01-01

    The influence of mountain front recharge on the water balance of alluvial valley aquifers located in upland catchments of the Condamine River basin in Queensland, Australia, is investigated through the development of an integrated hydrogeological framework. A combination of three-dimensional (3D) geological modelling, hydraulic gradient maps, multivariate statistical analyses and hydrochemical mixing calculations is proposed for the identification of hydrochemical end-members and quantification of the relative contributions of each end-member to alluvial aquifer recharge. The recognised end-members correspond to diffuse recharge and lateral groundwater inflows from three hydrostratigraphic units directly connected to the alluvial aquifer. This approach allows mapping zones of potential inter-aquifer connectivity and areas of groundwater mixing between underlying units and the alluvium. Mixing calculations using samples collected under baseflow conditions reveal that lateral contribution from a regional volcanic aquifer system represents the majority (41%) of inflows to the alluvial aquifer. Diffuse recharge contribution (35%) and inflow from two sedimentary bedrock hydrostratigraphic units (collectively 24%) comprise the remainder of major recharge sources. A detailed geochemical assessment of alluvial groundwater evolution along a selected flowpath of a representative subcatchment of the Condamine River basin confirms mixing as a key process responsible for observed spatial variations in hydrochemistry. Dissolution of basalt-related minerals and dolomite, CO2 uptake, ion exchange, precipitation of clay minerals, and evapotranspiration further contribute to the hydrochemical evolution of groundwater in the upland alluvial aquifer. This study highlights the benefits of undertaking an integrated approach that combines multiple independent lines of evidence.
The proposed methods can be applied to investigate processes associated with inter-aquifer mixing, including groundwater contamination resulting from depressurisation of underlying geological units hydraulically connected to the shallower water reservoirs. Copyright © 2016 Elsevier B.V. All rights reserved.
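The mixing calculation described above can be sketched as a small constrained least-squares problem: given end-member tracer compositions, solve for fractions that reproduce a sample and sum to one. The tracer names, compositions, and sample below are synthetic stand-ins, not Condamine basin data; the 41/35/24 split from the abstract is reused only to construct the synthetic sample.

```python
import numpy as np

# Hedged sketch: estimate end-member mixing fractions from conservative
# tracers. End-member compositions and the sample are synthetic assumptions.

def mixing_fractions(endmembers, sample, weight=1e3):
    """Least-squares solve E @ f = c with the constraint sum(f) = 1 appended
    as a heavily weighted extra equation."""
    E = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    c = np.append(sample, weight * 1.0)
    f, *_ = np.linalg.lstsq(E, c, rcond=None)
    return f

# Columns: volcanic aquifer, diffuse recharge, sedimentary bedrock (assumed).
# Rows: two hypothetical conservative tracers (concentrations in mg/L).
E = np.array([[30.0, 5.0, 80.0],
              [20.0, 3.0, 50.0]])
sample = E @ np.array([0.41, 0.35, 0.24])   # synthetic sample with a known mix
f = mixing_fractions(E, sample)
```

With more tracers than end-members the system becomes overdetermined and the residual gives a check on whether the chosen end-members can actually explain the sample.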

  11. Geochemical Models of Water-Quality Changes During Aquifer Storage Recovery (ASR) Cycle Tests, Phase 1: Geochemical Models Using Existing Data

    DTIC Science & Technology

    2006-09-01

    Richardson, in review). Figure 1 shows the lithostratigraphic setting for Eocene through Miocene strata, and the occurrence of hydrostratigraphic units of...basal Hawthorn unit lies unconformably on lithologies informally called “Eocene limestones,” which consist of Suwannee Limestone, Ocala Limestone

  12. New insights into the hydrostratigraphy of the High Plains aquifer from three-dimensional visualizations based on well records

    USGS Publications Warehouse

    Macfarlane, P.A.

    2009-01-01

    Regional aquifers in thick sequences of continentally derived heterolithic deposits, such as the High Plains of the North American Great Plains, are difficult to characterize hydrostratigraphically because of their framework complexity and the lack of high-quality subsurface information from drill cores and geophysical logs. However, using a database of carefully evaluated drillers' and sample logs and commercially available visualization software, it is possible to qualitatively characterize these complex frameworks based on the concept of relative permeability. Relative permeability is the permeable fraction of a deposit expressed as a percentage of its total thickness. In this methodology, uncemented coarse and fine sediments are arbitrarily set at relative permeabilities of 100% and 0%, respectively, with allowances made for log entries containing descriptions of mixed lithologies, heterolithic strata, and cementation. To better understand the arrangement of high- and low-permeability domains within the High Plains aquifer, a pilot study was undertaken in southwest Kansas to create three-dimensional visualizations of relative permeability using a database of >3000 logs. Aggregate relative permeability ranges up to 99% with a mean of 51%. Laterally traceable, thick domains of >80% relative permeability embedded within a lower relative permeability matrix strongly suggest that preferred pathways for lateral and vertical water transmission exist within the aquifer. Similarly, domains with relative permeabilities of <45% are traceable laterally over appreciable distances in the subsurface and probably act as leaky confining layers. This study shows that the aquifer does not consist solely of local, randomly distributed, hydrostratigraphic units, as suggested by previous studies. © 2009 Geological Society of America.
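The relative-permeability bookkeeping defined above (permeable thickness as a percentage of total logged thickness, with 100% for uncemented coarse sediment and 0% for fine sediment) can be sketched directly. The per-lithology weights below are illustrative stand-ins for the paper's actual allowances for mixed and cemented lithologies.

```python
# Hypothetical permeability weights per lithology class; the study's
# actual allowances for mixed lithologies and cementation may differ.
WEIGHTS = {
    "sand and gravel": 1.0,   # uncemented coarse sediment -> 100%
    "sandy clay": 0.5,        # mixed lithology -> partial credit
    "cemented gravel": 0.25,  # cementation reduces the permeable fraction
    "clay": 0.0,              # uncemented fine sediment -> 0%
}

def relative_permeability(intervals):
    """Aggregate relative permeability of one well log, in percent.

    intervals: list of (top_depth_m, bottom_depth_m, lithology) entries.
    Returns permeable thickness as a percentage of total logged thickness.
    """
    total = permeable = 0.0
    for top, bottom, lith in intervals:
        thickness = bottom - top
        total += thickness
        permeable += thickness * WEIGHTS.get(lith, 0.0)
    return 100.0 * permeable / total

log = [(0, 10, "clay"), (10, 35, "sand and gravel"), (35, 45, "sandy clay")]
print(relative_permeability(log))  # → 66.66... (30 m permeable of 45 m)
```

Computing this value for thousands of logs, then gridding the results, is what makes the three-dimensional visualizations of high- and low-permeability domains possible.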

  13. Statistical robustness of machine-learning estimates for characterizing a groundwater-surface water system, Southland, New Zealand

    NASA Astrophysics Data System (ADS)

    Friedel, M. J.; Daughney, C.

    2016-12-01

    The development of a successful surface-groundwater management strategy depends on the quality of data provided for analysis. This study evaluates the statistical robustness when using a modified self-organizing map (MSOM) technique to estimate missing values for three hypersurface models: synoptic groundwater-surface water hydrochemistry, time-series of groundwater-surface water hydrochemistry, and mixed-survey (combination of groundwater-surface water hydrochemistry and lithologies) hydrostratigraphic unit data. These models of increasing complexity are developed and validated based on observations from the Southland region of New Zealand. In each case, the estimation method is sufficiently robust to cope with groundwater-surface water hydrochemistry vagaries due to sample size and extreme data insufficiency, even when >80% of the data are missing. The estimation of surface water hydrochemistry time series values enabled the evaluation of seasonal variation, and the imputation of lithologies facilitated the evaluation of hydrostratigraphic controls on groundwater-surface water interaction. The robust statistical results for groundwater-surface water models of increasing data complexity provide justification to apply the MSOM technique in other regions of New Zealand and abroad.

  14. Recent advances in the hydrostratigraphy of Paleozoic bedrock in the midwestern United States

    USGS Publications Warehouse

    Bradbury, K.R.; Runkel, Anthony C.

    2011-01-01

    Recent hydrostratigraphic research has substantially improved understanding of the relatively undeformed Paleozoic bedrock that forms the most widely used aquifers in Minnesota and Wisconsin. Ongoing evaluation of the Cambrian Eau Claire Formation in southern Wisconsin has led to its recognition as a major regional aquitard. Subsurface logs indicate that its thickness ranges from absent to <75 m, and parts of the formation nonetheless yield significant amounts of water to wells. A key part of modern aquitard hydrogeology is the integration of multi-level hydraulic head measurements into hydrostratigraphic analysis. In south-central Wisconsin, regional groundwater withdrawals from the confined Mount Simon aquifer have created a regional cone of depression. Regional groundwater modeling has demonstrated that this relatively thin unit exerts a major control on regional groundwater flow in the ~300-m-thick bedrock aquifer system and that it is critical in protecting deep wells from contamination.

  15. Hydrostratigraphic analysis of the MADE site with full-resolution GPR and direct-push hydraulic profiling

    USGS Publications Warehouse

    Dogan, M.; Van Dam, R. L.; Bohling, Geoffrey C.; Butler, J.J.; Hyndman, D.W.

    2011-01-01

    Full-resolution 3D Ground-Penetrating Radar (GPR) data were combined with high-resolution hydraulic conductivity (K) data from vertical Direct-Push (DP) profiles to characterize a portion of the highly heterogeneous MAcro Dispersion Experiment (MADE) site. This is an important first step to better understand the influence of aquifer heterogeneities on observed anomalous transport. Statistical evaluation of DP data indicates non-normal distributions that have much higher similarity within each GPR facies than between facies. The analysis of GPR and DP data provides high-resolution estimates of the 3D geometry of hydrostratigraphic zones, which can then be populated with stochastic K fields. The lack of such estimates has been a significant limitation for testing and parameterizing a range of novel transport theories at sites where the traditional advection-dispersion model has proven inadequate. © 2011 by the American Geophysical Union.

  16. Coastal groundwater discharge for the U.S. East and Gulf Coasts calculated with three-dimensional groundwater flow models

    NASA Astrophysics Data System (ADS)

    Befus, K. M.; Kroeger, K. D.; Smith, C. G.; Swarzenski, P. W.

    2017-12-01

    Fresh groundwater discharge to coastal environments contributes to the physical and chemical conditions of coastal waters. At regional scales, groundwater fluxes remain poorly constrained, representing uncertainty in both water and chemical budgets that has implications for downstream ecosystem health and for how human activities alter coastal hydrologic processes. Coastal groundwater discharges remain widely unconstrained due to the interconnectedness of highly heterogeneous hydrogeologic frameworks and hydrologic conditions. We use regional-scale, three-dimensional groundwater flow models with the best available hydrostratigraphic framework data to calculate the magnitude of groundwater discharging from coastal aquifers to coastal waterbodies along the eastern U.S. In addition, we constrain the inland areas that contribute to coastal groundwater discharges using particle tracking. We find that 27 km3/yr of groundwater enters coastal waters of the eastern U.S. and Gulf of Mexico, supplied by a contributing area of over 175,000 km2. The contributing areas to coastal groundwater discharge extended kilometers inland and often were supplied by recharge occurring tens of kilometers inland. These results suggest that coastal groundwater discharges rely on larger contributing areas and potentially transport more dissolved constituents than previously calculated, which are important factors for constraining the role of groundwater in coastal chemical budgets and its impacts on coastal ecosystems.

  17. Physical stratigraphy and hydrostratigraphy of Upper Cretaceous and Paleocene sediments, Burke and Screven Counties, Georgia

    USGS Publications Warehouse

    Falls, W.F.; Baum, J.S.; Prowell, D.C.

    1997-01-01

    Six geologic units are recognized in the Cretaceous and the Paleocene sediments of eastern Burke and Screven Counties in Georgia on the basis of lithologic, geophysical, and paleontologic data collected from three continuously cored testholes in Georgia and one testhole in South Carolina. The six geologic units are separated by regional unconformities and are designated from oldest to youngest as the Cape Fear Formation, the Middendorf Formation, the Black Creek Group (undivided), and the Steel Creek Formation in the Upper Cretaceous section, and the Ellenton and the Snapp Formations in the Paleocene section. The geologic units provide a spatial and temporal framework for the identification and correlation of a basal confining unit beneath the Midville aquifer system and five aquifers and five confining units in the Dublin and the Midville aquifer systems. The Dublin aquifer system is divided hydrostratigraphically into the Millers Pond, the upper Dublin, and the lower Dublin aquifers. The Midville aquifer system is divided hydrostratigraphically into the upper and the lower Midville aquifers. The fine-grained sediments of the Millers Pond, the lower Dublin, and the lower Midville confining units are nonmarine deposits and are present in the upper part of the Snapp Formation, the Black Creek Group (undivided), and the Middendorf Formation, respectively. Hydrologic data for specific sets of monitoring wells at the Savannah River Site in South Carolina and the Millers Pond site in Georgia confirm that these three units are leaky confining units and locally impede vertical ground-water flow between adjacent aquifers. The fine-grained sediments of the upper Dublin and the upper Midville confining units are marine-deltaic deposits of the Ellenton Formation and the Black Creek Group (undivided), respectively. Hydrologic data confirm that the upper Dublin confining unit regionally impedes vertical ground-water flow on both sides of the Savannah River. 
The upper Midville confining unit impedes vertical ground-water flow in the middle and downdip parts of the study area and is a leaky confining unit in the updip part of the study area. Recognition of the upper Dublin confining unit as a regional confining unit between the Millers Pond and the upper Dublin aquifers also confirms that the Millers Pond aquifer is a separate hydrologic unit from the rest of the Dublin aquifer system. This multi-aquifer framework increases the vertical hydrostratigraphic resolution of hydraulic properties and gradients in the Dublin and Midville aquifer systems for the investigation of ground-water flow beneath the Savannah River in the vicinity of the U.S. Department of Energy Savannah River Site.

  18. Numerical modeling of groundwater flow in the coastal aquifer system of Taranto (southern Italy)

    NASA Astrophysics Data System (ADS)

    De Filippis, Giovanna; Giudici, Mauro; Negri, Sergio; Margiotta, Stefano; Cattaneo, Laura; Vassena, Chiara

    2014-05-01

    The Mediterranean region is characterized by a strong development of coastal areas with a high concentration of water-demanding human activities, resulting in weakly controlled withdrawals of groundwater which accentuate the saltwater intrusion phenomenon. The worsening of groundwater quality is a huge problem especially for those regions, like Salento (southern Italy), where a karst aquifer system represents the most important water resource because of the lack of a well-developed surface-water supply. In this setting, the first 2D numerical model describing the groundwater flow in the karst aquifer of Salento peninsula was developed by Giudici et al. [1] at the regional scale and then improved by De Filippis et al. [2]. In particular, the estimate of the saturated thickness of the deep aquifer highlighted that the Taranto area is particularly sensitive to the phenomenon of seawater intrusion, both for the specific hydrostratigraphic configuration and for the presence of highly water-demanding industrial activities. These remarks motivate a research project which is part of the research program RITMARE (The Italian Research for the Sea), within which a subprogram is specifically dedicated to the problem of the protection and preservation of groundwater quality in Italian coastal aquifers and in particular, among others, in the Taranto area. In this context, the CINFAI operative unit aims at providing a contribution to the characterization of groundwater in the study area. The specific objectives are: a. the reconstruction of the groundwater dynamics (i.e., the preliminary identification of a conceptual model for the aquifer system and the subsequent modeling of groundwater flow in a multilayered system which is very complex from the hydrostratigraphical point of view); b. the characterization of groundwater outflows through submarine and subaerial springs and the water exchanges with the shallow coastal water bodies (e.g. 
Mar Piccolo) and the off-shore sea; c. the modeling of seawater intrusion in the coastal aquifer system. The first objective is achieved through the analysis of hydrostratigraphic reconstructions obtained from different data sets: well logs, published geological field maps, studies for the characterization of contaminated sites. The hydrostratigraphic setup is merged with maps of land use, hydraulic head maps, data on water extraction and source discharge, in order to identify the conceptual model. For the numerical simulations, the computer code YAGMod, which was originally developed to perform 3D groundwater flow simulation with a simplified treatment of unsaturated/saturated conditions and the effects of strong aquifer exploitation (i.e., high well pumping rates), is extended to the case of a variable density flow. The results will be compared with those obtained with other modeling software (e.g., Tough2). [1] Giudici M., Margiotta S., Mazzone F., Negri S., Vassena C., 2012. Modelling Hydrostratigraphy and groundwater flow of a fractured and karst aquifer in a Mediterranean basin (Salento peninsula, southeastern Italy), Environmental Earth Sciences. doi: 10.1007/s12665-012-1631-1 [2] De Filippis G., Giudici M., Margiotta S., Mazzone F., Negri S., Vassena C., 2013. Numerical modeling of the groundwater flow in the fractured and karst aquifer of the Salento peninsula (Southern Italy), Acque Sotterranee, 2:17-28. doi: 10.7343/AS-016-013-0040

  19. Hydrostratigraphy of Tree Island Cores from Water Conservation Area 3

    USGS Publications Warehouse

    McNeill, Donald F.; Cunningham, Kevin J.

    2003-01-01

    Cores and borehole-geophysical logs collected on and around two tree islands in Water Conservation Area 3 have been examined to develop a stratigraphic framework for these ecosystems. Especially important is the potential for the exchange of ground water and surface water within these features. The hydrostratigraphic results from this study document the lithologic nature of the foundation of the tree islands, the distribution of porous intervals, the potential for paleotopographic influence on their formation, and the importance of low-permeability, subaerial-exposure horizons on the vertical exchange of ground water and surface water. (Figure 1 shows the location of Tree Islands 3AS3 and 3BS1.) Results from this hydrostratigraphic study indicate that subtle differences occur in lithofacies and topography between the on-island and off-island subsurface geologic records. Specifics are described herein. Firstly, at both tree-island sites, the top of the limestone bedrock is slightly elevated beneath the head of the tree islands relative to the off-island core sites and the tail of the tree islands, which suggests that bedrock 'highs' acted as 'seeds' for the development of the tree islands of this study and possibly many others. Secondly, examination of the recovered core and the caliper logs tentatively suggest that the elevated limestone beneath the tree islands may have a preferentially more porous framework relative to limestone beneath the adjacent areas, possibly providing a ground-water-to-surface-water connection that sustains the tree island system. 
Finally, because the elevation of the top of the limestone bedrock at the head of Tree Island 3AS3 is slightly higher than the surrounding upper surface of the peat, and because the wetland peats have a lower hydraulic conductivity than the limestone bedrock (Miami Limestone and Fort Thompson Formation), it is possible that there is a head difference between surface water of the wetlands and the ground water in underlying limestone bedrock.

  20. Modeling complex aquifer systems: a case study in Baton Rouge, Louisiana (USA)

    NASA Astrophysics Data System (ADS)

    Pham, Hai V.; Tsai, Frank T.-C.

    2017-05-01

    This study targets two challenges in groundwater model development: grid generation and model calibration for aquifer systems that are fluvial in origin. Realistic hydrostratigraphy can be developed using a large quantity of well log data to capture the complexity of an aquifer system. However, generating valid groundwater model grids to be consistent with the complex hydrostratigraphy is non-trivial. Model calibration can also become intractable for groundwater models that intend to match the complex hydrostratigraphy. This study uses the Baton Rouge aquifer system, Louisiana (USA), to illustrate a technical need to cope with grid generation and model calibration issues. A grid generation technique is introduced based on indicator kriging to interpolate 583 wireline well logs in the Baton Rouge area to derive a hydrostratigraphic architecture with fine vertical discretization. Then, an upscaling procedure is developed to determine a groundwater model structure with 162 layers that captures facies geometry in the hydrostratigraphic architecture. To handle model calibration for such a large model, this study utilizes a derivative-free optimization method in parallel computing to complete parameter estimation in a few months. The constructed hydrostratigraphy indicates the Baton Rouge aquifer system is fluvial in origin. The calibration result indicates hydraulic conductivity for Miocene sands is higher than that for Pliocene to Holocene sands and indicates the Baton Rouge fault and the Denham Springs-Scotlandville fault to be low-permeability leaky aquifers. The modeling result shows significantly low groundwater level in the "2,000-foot" sand due to heavy pumping, indicating potential groundwater upward flow from the "2,400-foot" sand.
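The upscaling step described above, from a finely discretized hydrostratigraphic architecture to a layered model structure, can be caricatured for a single borehole column: collapse the fine vertical cells into a small number of model layers by taking the dominant facies in each layer. The real procedure (indicator kriging of 583 well logs, 162 layers, facies-geometry preservation) is far more elaborate; this is only a toy sketch of the idea.

```python
from collections import Counter

def upscale_column(fine_facies, n_layers):
    """Collapse a finely discretized vertical facies column into n_layers
    model layers by taking the dominant (most frequent) facies per layer.

    fine_facies: list of facies labels, ordered top to bottom.
    Returns one facies label per upscaled model layer.
    """
    n = len(fine_facies)
    layers = []
    for k in range(n_layers):
        lo = k * n // n_layers          # integer split keeps layers even
        hi = (k + 1) * n // n_layers
        layers.append(Counter(fine_facies[lo:hi]).most_common(1)[0][0])
    return layers

# Hypothetical 12-cell column interpolated from well logs.
column = ["sand"] * 5 + ["clay"] * 6 + ["sand"] * 1
print(upscale_column(column, 3))  # → ['sand', 'clay', 'clay']
```

Applying such a rule cell-column by cell-column over a 3-D grid is what turns a fine hydrostratigraphic architecture into a tractable layered groundwater model structure.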

  1. A Hierarchical Multi-Model Approach for Uncertainty Segregation, Prioritization and Comparative Evaluation of Competing Modeling Propositions

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Elshall, A. S.; Hanor, J. S.

    2012-12-01

    Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between abstract constructs, such as mathematical expressions, and empirical observations, such as field data, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied the HBMA to a study of hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through the indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainties. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures with respect to the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models. 
The results showed that by segregating different sources of uncertainty, HBMA analysis provided insights on uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide weighted representation of the competing propositions for each uncertain model component based on the background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.
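The weighting machinery behind BMA-style averaging can be sketched in a few lines: convert per-model scores into posterior probabilities via a softmax, then marginalize over one source of uncertainty to move a level up the hierarchy. The scores, proposition names, and equal-prior assumption below are hypothetical; the HBMA weighting in the study is more involved.

```python
import math
from collections import defaultdict

def bma_weights(scores):
    """Convert model scores (e.g. -0.5 * BIC) into posterior model
    probabilities via a softmax, assuming equal prior weights -- a
    simplified stand-in for the HBMA weighting scheme."""
    m = max(scores.values())              # subtract max for numerical safety
    w = {k: math.exp(s - m) for k, s in scores.items()}
    z = sum(w.values())
    return {k: v / z for k, v in w.items()}

# Hypothetical models keyed by (variogram, fault-scenario) propositions.
scores = {("exp", "faultA"): -3.1, ("exp", "faultB"): -4.0,
          ("sph", "faultA"): -2.5, ("sph", "faultB"): -5.2}
weights = bma_weights(scores)

# Marginalize over the fault propositions to weight the variogram models,
# i.e. move one level up the BMA hierarchy.
variogram_weight = defaultdict(float)
for (variogram, _fault), w in weights.items():
    variogram_weight[variogram] += w
print(dict(variogram_weight))
```

Repeating the marginalization level by level is what lets the hierarchical framework report how much each source of uncertainty contributes to the spread of predictions.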

  2. Generation of 3-D hydrostratigraphic zones from dense airborne electromagnetic data to assess groundwater model prediction error

    USGS Publications Warehouse

    Christensen, Nikolaj K; Minsley, Burke J.; Christensen, Steen

    2017-01-01

    We present a new methodology to combine spatially dense high-resolution airborne electromagnetic (AEM) data and sparse borehole information to construct multiple plausible geological structures using a stochastic approach. The method developed allows for quantification of the performance of groundwater models built from different geological realizations of structure. Multiple structural realizations are generated using geostatistical Monte Carlo simulations that treat sparse borehole lithological observations as hard data and dense geophysically derived structural probabilities as soft data. Each structural model is used to define 3-D hydrostratigraphical zones of a groundwater model, and the hydraulic parameter values of the zones are estimated by using nonlinear regression to fit hydrological data (hydraulic head and river discharge measurements). Use of the methodology is demonstrated for a synthetic domain having structures of categorical deposits consisting of sand, silt, or clay. It is shown that using dense AEM data with the methodology can significantly improve the estimated accuracy of the sediment distribution as compared to when borehole data are used alone. It is also shown that this use of AEM data can improve the predictive capability of a calibrated groundwater model that uses the geological structures as zones. However, such structural models will always contain errors because even with dense AEM data it is not possible to perfectly resolve the structures of a groundwater system. It is shown that when using such erroneous structures in a groundwater model, they can lead to biased parameter estimates and biased model predictions, therefore impairing the model's predictive capability.
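The hard/soft conditioning at the heart of the stochastic simulations above can be caricatured for a single model cell: a borehole lithology observation is honored exactly (hard data), while elsewhere the AEM-derived facies probabilities condition a categorical draw (soft data). The probabilities and facies names below are hypothetical, and a real sequential indicator simulation also conditions on previously simulated neighbors.

```python
import random

def simulate_cell(hard_value, soft_probs):
    """Draw a facies for one model cell.

    hard_value : lithology from a borehole at this cell, or None.
    soft_probs : dict of facies -> probability from geophysics (sums to 1).
    Hard data are reproduced exactly; soft data weight the random draw.
    """
    if hard_value is not None:
        return hard_value
    r, acc = random.random(), 0.0
    for facies, p in soft_probs.items():
        acc += p
        if r <= acc:
            return facies
    return facies  # guard against floating-point rounding at acc ~ 1

random.seed(0)
aem_probs = {"sand": 0.7, "silt": 0.2, "clay": 0.1}  # hypothetical soft data
print(simulate_cell("clay", aem_probs))  # borehole cell → 'clay'
realization = [simulate_cell(None, aem_probs) for _ in range(5)]
print(realization)
```

Running such draws over every cell, many times, yields the multiple structural realizations whose groundwater models are then calibrated and compared.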

  3. Generation of 3-D hydrostratigraphic zones from dense airborne electromagnetic data to assess groundwater model prediction error

    NASA Astrophysics Data System (ADS)

    Christensen, N. K.; Minsley, B. J.; Christensen, S.

    2017-02-01

    We present a new methodology to combine spatially dense high-resolution airborne electromagnetic (AEM) data and sparse borehole information to construct multiple plausible geological structures using a stochastic approach. The method developed allows for quantification of the performance of groundwater models built from different geological realizations of structure. Multiple structural realizations are generated using geostatistical Monte Carlo simulations that treat sparse borehole lithological observations as hard data and dense geophysically derived structural probabilities as soft data. Each structural model is used to define 3-D hydrostratigraphical zones of a groundwater model, and the hydraulic parameter values of the zones are estimated by using nonlinear regression to fit hydrological data (hydraulic head and river discharge measurements). Use of the methodology is demonstrated for a synthetic domain having structures of categorical deposits consisting of sand, silt, or clay. It is shown that using dense AEM data with the methodology can significantly improve the estimated accuracy of the sediment distribution as compared to when borehole data are used alone. It is also shown that this use of AEM data can improve the predictive capability of a calibrated groundwater model that uses the geological structures as zones. However, such structural models will always contain errors because even with dense AEM data it is not possible to perfectly resolve the structures of a groundwater system. It is shown that when using such erroneous structures in a groundwater model, they can lead to biased parameter estimates and biased model predictions, therefore impairing the model's predictive capability.

  4. A conceptual model of the hydrogeologic framework, geochemistry, and groundwater-flow system of the Edwards-Trinity and related aquifers in the Pecos County region, Texas

    USGS Publications Warehouse

    Bumgarner, Johnathan R.; Stanton, Gregory P.; Teeple, Andrew; Thomas, Jonathan V.; Houston, Natalie A.; Payne, Jason; Musgrove, MaryLynn

    2012-01-01

    A conceptual model of the hydrogeologic framework, geochemistry, and groundwater-flow system of the Edwards-Trinity and related aquifers, which include the Pecos Valley, Igneous, Dockum, Rustler, and Capitan Reef aquifers, was developed as the second phase of a groundwater availability study in the Pecos County region in west Texas. The first phase of the study was to collect and compile groundwater, surface-water, water-quality, geophysical, and geologic data in the area. The third phase of the study involves a numerical groundwater-flow model of the Edwards-Trinity aquifer in order to simulate groundwater conditions based on various groundwater-withdrawal scenarios. Resource managers plan to use the results of the study to establish management strategies for the groundwater system. The hydrogeologic framework is composed of the hydrostratigraphy, structural features, and hydraulic properties of the groundwater system. Well and geophysical logs were interpreted to define the top and base surfaces of the Edwards-Trinity aquifer units. Elevations of the top and base of the Edwards-Trinity aquifer generally decrease from the southwestern part of the study area to the northeast. The thicknesses of the Edwards-Trinity aquifer units were calculated using the interpolated top and base surfaces of the hydrostratigraphic units. Some of the thinnest sections of the aquifer were in the eastern part of the study area and some of the thickest sections were in the Pecos, Monument Draw, and Belding-Coyanosa trough areas. Normal-fault zones, which formed as growth and collapse features as sediments were deposited along the margins of more resistant rocks and as overlying sediments collapsed into the voids created by the dissolution of Permian-age evaporite deposits, were delineated based on the interpretation of hydrostratigraphic cross sections. 
The lowest aquifer transmissivity values were measured in the eastern part of the study area; the highest transmissivity values were measured in a faulted area of the Monument Draw trough. Hydraulic conductivity values generally exhibited the same trends as the transmissivity values. Groundwater-quality data and groundwater-level data were used in context with the hydrogeologic framework to assess the chemical characteristics of water from different sources, regional groundwater-flow paths, recharge sources, the mixing of water from different sources, and discharge in the study area. Groundwater-level altitudes generally decrease from southwest to northeast and regional groundwater flow is from areas of recharge south and west to the north and northeast. Four principal sources of recharge to the Edwards-Trinity aquifer were identified: (1) regional flow that originated as recharge northwest of the study area, (2) runoff from the Barilla, Davis, and Glass Mountains, (3) return flow from irrigation, and (4) upwelling from deeper aquifers. Results indicated Edwards-Trinity aquifer water in the study area was dominated by mineralized, regional groundwater flow that most likely recharged during the cooler, wetter climates of the Pleistocene with variable contributions of recent, local recharge. Groundwater generally flows into the down-dip extent of the Edwards-Trinity aquifer where it discharges into overlying or underlying aquifer units, discharges from springs, discharges to the Pecos River, follows a regional flow path east out of the study area, or is withdrawn by groundwater wells. Structural features such as mountains, troughs, and faults play a substantial role in the distribution of recharge, local and regional groundwater flow, spring discharge, and aquifer interaction.

  5. Integrating borehole logs and aquifer tests in aquifer characterization

    USGS Publications Warehouse

    Paillet, Frederick L.; Reese, R.S.

    2000-01-01

    Integration of lithologic logs, geophysical logs, and hydraulic tests is critical in characterizing heterogeneous aquifers. Typically only a limited number of aquifer tests can be performed, and these need to be designed to provide hydraulic properties for the principal aquifers in the system. This study describes the integration of logs and aquifer tests in the development of a hydrostratigraphic model for the surficial aquifer system in and around Big Cypress National Preserve in eastern Collier County, Florida. Borehole flowmeter tests provide qualitative permeability profiles in most of 26 boreholes drilled in the study area. Flow logs indicate the depth of transmissive units, which are correlated across the study area. Comparison to published studies in adjacent areas indicates that the main limestone aquifer of the Tamiami Formation in the study area corresponds with the gray limestone aquifer in western Dade County and the water table and lower Tamiami Aquifer in western Collier County. Four strategically located, multiwell aquifer tests are used to quantify the qualitative permeability profiles provided by the flowmeter log analysis. The hydrostratigraphic model based on these results defines the main aquifer in the central part of the study area as unconfined to semiconfined with a transmissivity as high as 30,000 m2/day. The aquifer decreases in transmissivity to less than 10,000 m2/day in some parts of western Collier County, and becomes confined to the east and northeast of the study area, where transmissivity decreases to below 5,000 m2/day.
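One common way to turn a qualitative flowmeter profile into quantitative layer properties, in the spirit of the log/aquifer-test integration described above, is to apportion the transmissivity from a multiwell test across the transmissive intervals in proportion to each interval's share of measured borehole inflow. The numbers below are hypothetical, not values from the Big Cypress study.

```python
def apportion_transmissivity(total_T, inflow_fractions):
    """Distribute a pumping-test transmissivity (m2/day) across the
    transmissive intervals seen by a borehole flowmeter, in proportion
    to each interval's share of the measured inflow."""
    assert abs(sum(inflow_fractions) - 1.0) < 1e-6, "fractions must sum to 1"
    return [total_T * f for f in inflow_fractions]

# Hypothetical: aquifer-test T = 30,000 m2/day; the flow log shows three
# intervals contributing 60%, 30%, and 10% of total inflow.
layer_T = apportion_transmissivity(30000.0, [0.6, 0.3, 0.1])
print(layer_T)  # → [18000.0, 9000.0, 3000.0]
```

Dividing each interval's transmissivity by its thickness then yields layer hydraulic conductivities for the hydrostratigraphic model.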

  6. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    PubMed

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that considers only model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from saltwater intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve a design reliability above 90 percent. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. Although the injection rate can be reduced by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
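    The BMA-CC reliability computation can be illustrated with a small Monte Carlo sketch: each candidate hydrostratigraphic model contributes its own probability of satisfying the hydraulic constraint, weighted by its posterior model weight. The samplers and weights below are hypothetical stand-ins for the three groundwater models, not values from the study:

```python
import random

def bma_reliability(models, weights, n_draws=10_000, seed=1):
    """Estimate design reliability under Bayesian model averaging:
    each structural model contributes its constraint-satisfaction
    probability, weighted by its posterior model weight. `models` maps
    each model to a sampler returning True when the hydraulic-barrier
    constraint is met for one parameter draw (a sketch; the samplers
    stand in for full groundwater-model runs)."""
    rng = random.Random(seed)
    rel = 0.0
    for sampler, w in zip(models, weights):
        hits = sum(sampler(rng) for _ in range(n_draws))
        rel += w * hits / n_draws
    return rel

# Three hypothetical hydrostratigraphic models that satisfy the
# constraint with probability 0.95, 0.90, and 0.70 respectively.
samplers = [lambda r, p=p: r.random() < p for p in (0.95, 0.90, 0.70)]
rel = bma_reliability(samplers, [0.5, 0.3, 0.2])
```

Ignoring the less-likely structural models (as traditional CC programming effectively does) would report only the best model's 0.95, which is how structure uncertainty leads to overestimated reliability.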

  7. Groundwater resources of the East Mountain area, Bernalillo, Sandoval, Santa Fe, and Torrance Counties, New Mexico, 2005

    USGS Publications Warehouse

    Bartolino, James R.; Anderholm, Scott K.; Myers, Nathan C.

    2010-01-01

    The groundwater resources of about 400 square miles of the East Mountain area of Bernalillo, Sandoval, Santa Fe, and Torrance Counties in central New Mexico were evaluated by using groundwater levels, water-quality analyses, and updated geologic mapping. Substantial development in the study area (population increased by 11,000, or 50 percent, from 1990 through 2000) has raised concerns about the effects of growth on water resources. The last comprehensive examination of the water resources of the study area was done in 1980; this study examines a slightly different area and incorporates data collected in the intervening 25 years. The East Mountain area is geologically and hydrologically complex: in addition to the geologic units, such features as the Sandia Mountains, Tijeras and Gutierrez Faults, Tijeras syncline and anticline, and the Estancia Basin affect the movement, availability, and quality of groundwater. The stratigraphic units were separated into eight hydrostratigraphic units, each having distinct hydraulic and chemical properties. Overall, the major hydrostratigraphic units are the Madera-Sandia and Abo-Yeso; however, other units are the primary source of supply in some areas. Despite the eight previously defined hydrostratigraphic units, water-level contours were drawn on the generalized regional potentiometric map assuming all hydrostratigraphic units are connected and function as a single aquifer system. Groundwater originates as infiltration of precipitation in upland areas (the Sandia, Manzano, and Manzanita Mountains, and the Ortiz Porphyry Belt) and moves downgradient into the Tijeras Graben, Tijeras Canyon, San Pedro synclinorium, and the Hagan, Estancia, and Espanola Basins. The study area was divided into eight groundwater areas defined on the basis of geologic, hydrologic, and geochemical information: Tijeras Canyon, Cedar Crest, Tijeras Graben, Estancia Basin, San Pedro Creek, Ortiz Porphyry Belt, Hagan Basin, and Upper Sandia Mountains. See the report for the unabridged abstract.

  8. Using Geophysics to Define Hydrostratigraphic Units in the Edwards and Trinity Aquifers, Texas

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; Blome, C. D.; Clark, A. K.; Kress, W.; Smith, D. V.

    2007-05-01

    Airborne and ground geophysical surveys conducted in Uvalde, Medina, and northern Bexar counties, Texas, can be used to define and characterize hydrostratigraphic units of the Edwards and Trinity aquifers. Airborne magnetic surveys have defined numerous Cretaceous intrusive stocks and laccoliths, mainly in Uvalde County, that influence local hydrology and perhaps regional ground-water flow paths. Depositional environments in the aquifers can be classified as shallow water platforms (San Marcos Platform, Edwards Group), shoal and reef facies (Devils River Trend, Devils River Formation), and deeper water basins (Maverick Basin; West Nueces, McKnight, and Salmon Peak Formations). Detailed airborne and ground electromagnetic surveys have been conducted over the Edwards aquifer catchment zone (exposed Trinity aquifer rocks), recharge zone (exposed Edwards aquifer rocks), and artesian zone (confined Edwards) in the Seco Creek area (northeast Uvalde and Medina Counties; Devils River Trend). These geophysical survey data have been used to divide the Edwards exposed within the Balcones fault zone into upper and lower hydrostratigraphic units. Although both units have high electrical resistivity, the upper unit has slightly lower resistivity than the lower unit. The Georgetown Formation, at the top of the Edwards Group, has a moderate resistivity. The formations that comprise the upper confining units to the Edwards aquifer rocks have varying resistivities. The Eagle Ford and Del Rio Groups (mainly clays) have very low resistivities and are excellent electrical marker beds in the Seco Creek area. The Buda Limestone is characterized by high resistivities. Moderate resistivities characterize the Austin Group rocks (mainly chalk). The older Trinity aquifer, underlying the Edwards aquifer rocks, is characterized by less limestone (electrically resistive, low-conductivity units) and greater quantities of mudstones (electrically conductive, low-resistivity units). In the western area (Devils River Trend and Maverick Basin) of the Trinity aquifer system there are well-defined collapse units and features that are marked by moderate resistivities bracketed by resistive limestone and conductive mudstone of the Glen Rose Limestone. In the central part of the aquifer (San Marcos Platform), the Trinity's lithologies are divided into upper and lower units with further subdivisions into hydrostratigraphic units. These hydrostratigraphic units are well mapped by an airborne electromagnetic survey in Bexar County. Electrical properties of the Edwards aquifer also vary across the fresh-saline water interface where ground and borehole electrical surveys have been conducted. The saline-saturated Edwards is predictably more conductive than the fresh-water saturated rocks. Similar fresh-saline water interfaces exist within the upper confining units of the Edwards aquifer (Carrizo-Wilcox aquifer) and the Trinity aquifer rocks.
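    The mapping from resistivity to hydrostratigraphic unit described in this abstract amounts to a threshold classification. A toy sketch, with hypothetical cutoff values (real cutoffs would be calibrated against borehole logs and the airborne survey data):

```python
def classify_resistivity(rho_ohm_m):
    """Map an electrical resistivity reading (ohm-m) to the qualitative
    unit classes described above. Thresholds are hypothetical, chosen
    only to illustrate the clay < chalk < limestone ordering."""
    if rho_ohm_m < 20:        # clays, e.g. Eagle Ford and Del Rio Groups
        return "conductive confining unit"
    elif rho_ohm_m < 100:     # chalk/marl, e.g. Austin Group, Georgetown
        return "moderate-resistivity unit"
    else:                     # limestone, e.g. Edwards Group, Buda
        return "resistive aquifer unit"
```

A real workflow would also have to account for the saline-saturated Edwards, where high pore-fluid salinity lowers resistivity independently of lithology.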

  9. High-resolution characterization of chemical heterogeneity in an alluvial aquifer

    USGS Publications Warehouse

    Schulmeister, M.K.; Healey, J.M.; McCall, G.W.; Birk, S.; Butler, J.J.

    2002-01-01

    The high-resolution capabilities of direct-push technology were exploited to develop new insights into the hydrochemistry at the margin of an alluvial aquifer. Hydrostratigraphic controls on groundwater flow and contaminant loading were revealed through the combined use of direct-push electrical conductivity (EC) logging and geochemical profiling. Vertical and lateral variations in groundwater chemistry were consistent with sedimentary features indicated by EC logs, and supported a conceptual model of recharge along the floodplain margin.

  10. High-resolution characterization of chemical heterogeneity in an alluvial aquifer

    USGS Publications Warehouse

    Schulmeister, M.K.; Healey, J.M.; Butler, J.J.; McCall, G.W.; Birk, S.

    2002-01-01

    The high-resolution capabilities of direct push technology were exploited to develop new insights into the hydrochemistry at the margin of an alluvial aquifer. Hydrostratigraphic controls on groundwater flow and contaminant loading were revealed through the combined use of direct push electrical conductivity (EC) logging and geochemical profiling. Vertical and lateral variations in groundwater chemistry were consistent with sedimentary features indicated by EC logs, and were supported by a conceptual model of recharge along the flood plain margin.

  11. A new technology for determining transport parameters in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conca, J.L.; Wright, J.

    The UFA Method can directly and rapidly measure transport parameters for any porous medium over a wide range of water contents and conditions. UFA results for subsurface sediments at a mixed-waste disposal site at the Hanford Site in Washington State provided the data necessary for detailed hydrostratigraphic mapping, subsurface flux and recharge distributions, and subsurface chemical mapping. Seven hundred unsaturated conductivity measurements, along with pristine pore water extractions, were obtained in only six months using the UFA. These data are used to provide realistic information for conceptual models, predictive models, and restoration strategies.
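    The steady-state centrifuge principle behind measurements of this kind can be sketched as follows: when the matric-potential gradient is negligible, the driving gradient is the centrifugal acceleration expressed in multiples of gravity, giving K = q·g/(ω²·r). This is a simplified illustration with made-up numbers, not the UFA instrument's full data-reduction procedure:

```python
import math

def unsaturated_K(flux_m_s, omega_rad_s, radius_m, g=9.81):
    """Steady-state centrifuge estimate of unsaturated hydraulic
    conductivity: with the matric-potential gradient negligible, the
    dimensionless driving gradient is w^2 * r / g, so K = q * g / (w^2 * r).
    (A sketch of the general principle; values below are illustrative,
    not Hanford data.)"""
    return flux_m_s * g / (omega_rad_s ** 2 * radius_m)

# Hypothetical run: 600 rpm rotor, sample at 0.1 m radius,
# imposed flux density of 1e-7 m/s.
omega = 600 * 2 * math.pi / 60          # rpm -> rad/s
K = unsaturated_K(1e-7, omega, 0.1)     # on the order of 1e-9 m/s
```

The point of the centrifuge is visible in the numbers: the effective gradient here is roughly 40 times gravity, so very low conductivities can be measured in hours rather than months.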

  12. Petroleum hydrogeology of the Great Hungarian Plain, Eastern Pannonian Basin, Hungary

    NASA Astrophysics Data System (ADS)

    Almasi, Istvan

    The results of a regional scale hydrogeological investigation conducted in the Great Hungarian Plain, Eastern Pannonian Basin, for the purposes of petroleum exploration are presented. Two regional aquitards and three regional aquifers were determined in the poorly-to-well consolidated clastic basin fill of Neogene-Quaternary age and the indurated basement of pre-Neogene age. The fluid-potential field was mapped using measured values of stabilised water level and pore pressure. Two regional fluid flow regimes were recognised: an upper gravity-driven flow regime, and a lower overpressured regime, where super-hydrostatic pore pressures of 1--35 MPa are encountered. The transition between the two flow regimes does not correlate with any particular hydrostratigraphic boundary or elevation range. Apparently, its position and nature are controlled by the morphology of the rigid basement, and locally by the permeability contrasts within the overlying hydrostratigraphic units. Local hydrostratigraphic breaches and conduit faults facilitate hydraulic communication across the regional aquitards. The basin is hydraulically continuous. The mapped groundwater flow directions do not match the predictions of compactional flow models. At two gas-fields, up to 10 MPa overpressures are probably caused by buoyancy forces. Transient overpressures cannot be maintained over geologic time in the basin, due to the rocks' low hydraulic resistance. Regional tectonic compressive stress, probably with a Recent increase in intensity, offers a new and plausible explanation for the distribution pattern of overpressures in the Great Hungarian Plain. Gravity-driven groundwater flow plays a decisive role in petroleum migration and entrapment. Compactional flow models can explain the present-day position of several known petroleum accumulations within the overpressured regime. However, most accumulations are also associated with particular fluid-potential anomaly patterns of the actual flow field, which also suggest the possibility of petroleum remigration toward the graben centres and upward. The geothermal characteristics show that pure conduction is the dominant regional heat transfer mechanism within the entire basin. The encountered advective thermal anomalies correlate well with fluid-potential anomalies observed in both fluid flow regimes, as well as with certain petroleum accumulations. Toth's (1980) hydraulic theory of petroleum migration was found applicable in a deforming Neogene sedimentary basin, the Great Hungarian Plain.* *This dissertation is a compound document (paper copy plus CD). The CD requires Adobe Acrobat and Microsoft Office.
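    The super-hydrostatic overpressures quoted above (1--35 MPa) are defined relative to a hydrostatic column. A minimal worked example, assuming freshwater density for the hydrostatic term (basin brines would raise it somewhat):

```python
def overpressure_MPa(pore_pressure_MPa, depth_m, rho=1000.0, g=9.81):
    """Super-hydrostatic overpressure: measured pore pressure minus the
    hydrostatic pressure of a water column to that depth.
    (Illustrative numbers; freshwater density assumed.)"""
    hydrostatic = rho * g * depth_m / 1e6   # Pa -> MPa
    return pore_pressure_MPa - hydrostatic

# A hypothetical measured pore pressure of 55 MPa at 3,000 m depth:
# hydrostatic is ~29.4 MPa, so the overpressure is ~25.6 MPa,
# well inside the 1-35 MPa range reported for the basin.
dp = overpressure_MPa(55.0, 3000.0)
```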

  13. Helicopter electromagnetic and magnetic geophysical survey data, portions of the North Platte and South Platte Natural Resources Districts, western Nebraska, May 2009

    USGS Publications Warehouse

    Smith, B.D.; Abraham, J.D.; Cannia, J.C.; Minsley, B.J.; Deszcz-Pan, M.; Ball, L.B.

    2010-01-01

    This report is a release of digital data from a helicopter electromagnetic and magnetic survey that was conducted during June 2009 in areas of western Nebraska as part of a joint hydrologic study by the North Platte Natural Resources District (NRD), South Platte NRD, and U.S. Geological Survey (USGS). Flight lines for the survey totaled 937 line kilometers (582 line miles). The objective of the contracted survey, conducted by Fugro Airborne, Ltd., was to improve the understanding of the relation between surface-water and groundwater systems critical to developing groundwater models used in management programs for water resources. A unique aspect of the survey is the flight line layout. One set of flight lines was flown in a zig-zag pattern extending along the length of the previously collected airborne data. The success of this survey design depended on a well-understood regional hydrogeologic framework and model developed by the Cooperative Hydrologic Study of the Platte River Basin and on the airborne geophysical data collected in 2008; resistivity variations along lines could be related to this framework. In addition to these lines, more traditional surveys, consisting of parallel flight lines separated by about 400 meters, were carried out for three blocks in the North Platte NRD, the South Platte NRD, and the Crescent Lakes area. These surveys helped to establish the spatial variations of the resistivity of hydrostratigraphic units. An additional survey was flown over the Crescent Lake area. The objective of this survey, funded by the USGS Office of Groundwater, was to map shallow hydrogeologic features of the southwestern part of the Sand Hills that contain a mix of fresh to saline lakes.

  14. Reducing Uncertainty in the Distribution of Hydrogeologic Units within Volcanic Composite Units of Pahute Mesa Using High-Resolution 3-D Resistivity Methods, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Rodriguez, Brian D.; Sweetkind, Don; Burton, Bethany L.

    2010-01-01

    The U.S. Department of Energy (DOE) and the National Nuclear Security Administration (NNSA), through their Nevada Site Office (NSO), are addressing groundwater contamination resulting from historical underground nuclear testing through the Environmental Management program and, in particular, the Underground Test Area (UGTA) project. From 1951 to 1992, 828 underground nuclear tests were conducted at the Nevada Test Site (NTS) northwest of Las Vegas (DOE UGTA, 2003). Most of these tests were conducted hundreds of feet above the groundwater table; however, more than 200 of the tests were near, or within, the water table. This underground testing was limited to specific areas of the NTS including Pahute Mesa, Rainier Mesa/Shoshone Mountain, Frenchman Flat, and Yucca Flat. Volcanic composite units make up much of the area within the Pahute Mesa Corrective Action Unit (CAU) at the NTS, Nevada. Many of these volcanic composite units extend throughout and south of the primary areas of past underground testing at Pahute and Rainier Mesas. As situated, these units likely influence the rate and direction of groundwater flow and radionuclide transport. Currently, these units are poorly resolved in terms of their hydrologic properties, introducing large uncertainties into current CAU-scale flow and transport models. In 2007, the U.S. Geological Survey (USGS), in cooperation with DOE and NNSA-NSO, acquired three-dimensional (3-D) tensor magnetotelluric data at the NTS in Area 20 of the Pahute Mesa CAU. A total of 20 magnetotelluric recording stations were established at about 600-m spacing on a 3-D array and were tied to well ER20-6 and other nearby well control (fig. 1). The purpose of this survey was to determine whether closely spaced 3-D resistivity measurements can be used to characterize the distribution of shallow (600- to 1,500-m-depth range) devitrified rhyolite lava-flow aquifers (LFA) and zeolitic tuff confining units (TCU) in areas of limited drill hole control on Pahute Mesa within the Calico Hills zeolitic volcanic composite unit (VCU), an important hydrostratigraphic unit in Area 20. The resistivity response was evaluated and compared with existing well data and hydrogeologic unit tops from the current Pahute Mesa framework model. In 2008, the USGS processed and inverted the magnetotelluric data into a 3-D resistivity model. We interpreted nine depth slices and four west-east profile cross sections of the 3-D resistivity inversion model. This report documents the geologic interpretation of the 3-D resistivity model. Expectations are that spatial variations in the electrical properties of the Calico Hills zeolitic VCU can be detected and mapped with 3-D resistivity, and that these changes correlate to differences in rock permeability. With regard to LFA and TCU, electrical resistivity and permeability are typically related: tuff confining units will typically have low electrical resistivity and low permeability, whereas LFA will have higher electrical resistivity and zones of higher fracture-related permeability. If expectations are shown to be correct, the method can be utilized by the UGTA scientists to refine the hydrostratigraphic unit (HSU) framework in an effort to more accurately predict radionuclide transport away from test areas on Pahute and Rainier Mesas.

  15. Magnetotelluric Data, Central Yucca Flat, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T.H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for Central Yucca Flat, Profile 1, as shown in figure 1. No interpretation of the data is included here.
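    The depth of investigation that makes MT suitable for imaging deep pre-Tertiary confining units follows from the electromagnetic skin depth, which grows as the sounding frequency decreases. A standard half-space approximation:

```python
import math

def mt_skin_depth_m(resistivity_ohm_m, frequency_hz):
    """Electromagnetic skin depth (the e-folding attenuation depth that
    controls the depth of investigation of an MT/AMT sounding) over a
    uniform half-space: delta ~ 503 * sqrt(rho / f) meters."""
    return 503.0 * math.sqrt(resistivity_ohm_m / frequency_hz)

# For a 100 ohm-m section: an AMT-band frequency of 10 Hz probes
# roughly the upper 1.6 km, while an MT-band frequency of 0.01 Hz
# reaches tens of kilometers.
shallow = mt_skin_depth_m(100.0, 10.0)
deep = mt_skin_depth_m(100.0, 0.01)
```

This frequency dependence is why combined MT and AMT acquisition, as in these surveys, can cover both the basin fill and the deep pre-Tertiary section.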

  16. Magnetotelluric Data, North Central Yucca Flat, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T.H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for north central Yucca Flat, Profile 7, as shown in Figure 1. No interpretation of the data is included here.

  17. Magnetotelluric Data, Northern Frenchman Flat, Nevada Test Site Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T. H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for Frenchman Flat Profile 3, as shown in Figure 1. No interpretation of the data is included here.

  18. Magnetotelluric Data, Across Quartzite Ridge, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T.H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT soundings across Quartzite Ridge, Profiles 5, 6a, and 6b, as shown in Figure 1. No interpretation of the data is included here.

  19. Magnetotelluric Data, Southern Yucca Flat, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T.H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for Southern Yucca Flat, Profile 4, as shown in Figure 1. No interpretation of the data is included here.

  20. Magnetotelluric Data, Northern Yucca Flat, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J.M. Williams; B.D. Rodriguez, and T.H. Asch

    2005-11-23

    Nuclear weapons are integral to the defense of the United States. The U.S. Department of Energy, as the steward of these devices, must continue to gauge the efficacy of the individual weapons. This could be accomplished by occasional testing at the Nevada Test Site (NTS) in Nevada, northwest of Las Vegas. Yucca Flat Basin is one of the testing areas at the NTS. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Yucca Flat Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS) collected and processed magnetotelluric (MT) and audio-magnetotelluric (AMT) data at the Nevada Test Site in and near Yucca Flat to help characterize this pre-Tertiary geology. That work will help to define the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal has been to define the upper clastic confining unit (UCCU) in the Yucca Flat area. Interpretation will include a three-dimensional (3-D) character analysis and two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for Profile 2 (fig. 1), located in the northern Yucca Flat area. No interpretation of the data is included here.

  1. Model Package Report: Central Plateau Vadose Zone Geoframework Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Springer, Sarah D.

    The purpose of the Central Plateau Vadose Zone (CPVZ) Geoframework model (GFM) is to provide a reasonable, consistent, and defensible three-dimensional (3D) representation of the vadose zone beneath the Central Plateau at the Hanford Site to support the Composite Analysis (CA) vadose zone contaminant fate and transport models. The GFM is a 3D representation of the subsurface geologic structure. From this 3D geologic model, exported results in the form of points, surfaces, and/or volumes are used as inputs to populate and assemble the various numerical model architectures, providing a 3D-layered grid that is consistent with the GFM. The objective of this report is to define the process used to produce a hydrostratigraphic model for the vadose zone beneath the Hanford Site Central Plateau and the corresponding CA domain.

  2. New insights from well responses to fluctuations in barometric pressure

    USGS Publications Warehouse

    Butler, J.J.; Jin, W.; Mohammed, G.A.; Reboulet, E.C.

    2011-01-01

    Hydrologists have long recognized that changes in barometric pressure can produce changes in water levels in wells. The barometric response function (BRF) has proven to be an effective means to characterize this relationship; we show here how it can also be utilized to glean valuable insights into semi-confined aquifer systems. The form of the BRF indicates the degree of aquifer confinement, while a comparison of BRFs between wells sheds light on hydrostratigraphic continuity. A new approach for estimating hydraulic properties of aquitards from BRFs has been developed and verified. The BRF is not an invariant characteristic of a well; in unconfined or semi-confined aquifers, it can change with conditions in the vadose zone. Field data from a long-term research site demonstrate the hydrostratigraphic insights that can be gained from monitoring water levels and barometric pressure. Such insights should be of value for a wide range of practical applications. © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.
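    The zero-lag end-member of the barometric response function is the barometric efficiency: the fraction of a barometric head change that appears, with opposite sign, as a water-level change in the well. A minimal least-squares sketch on synthetic data; a full BRF analysis would regress on lagged barometric changes as well:

```python
def barometric_efficiency(dB, dW):
    """Estimate barometric efficiency as the negated slope of the
    least-squares fit of water-level changes dW on barometric-pressure
    changes dB, both expressed as equivalent head. This is the zero-lag
    end-member of the barometric response function (synthetic data below)."""
    n = len(dB)
    mb = sum(dB) / n
    mw = sum(dW) / n
    cov = sum((b - mb) * (w - mw) for b, w in zip(dB, dW))
    var = sum((b - mb) ** 2 for b in dB)
    return -cov / var

# Synthetic confined-aquifer record: the water level falls 0.4 m of
# head per 1 m rise in barometric head (BE = 0.4), with no noise.
dB = [0.0, 0.1, -0.05, 0.2, -0.1]
dW = [-0.4 * b for b in dB]
be = barometric_efficiency(dB, dW)
```

A BE near 1 indicates strong confinement, while a BE that drifts with vadose-zone conditions is the signature of the semi-confined behavior the abstract describes.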

  3. Core drilling provides information about Santa Fe Group aquifer system beneath Albuquerque's West Mesa

    USGS Publications Warehouse

    Allen, B.D.; Connell, S.D.; Hawley, J.W.; Stone, B.D.

    1998-01-01

    Core samples from the upper ~1,500 ft of the Santa Fe Group in the Albuquerque West Mesa area provide a first-hand look at the sediments and at subsurface stratigraphic relationships in this important part of the basin-fill aquifer system. Two major hydrostratigraphic subunits, consisting of a lower coarse-grained, sandy interval and an overlying fine-grained, interbedded silty sand and clay interval, lie beneath the water table at the 98th St core hole. Borehole electrical conductivity measurements reproduce major textural changes observed in the recovered cores and support subsurface correlations of hydrostratigraphic units in the Santa Fe Group aquifer system based on geophysical logs. Comparison of electrical logs from the core hole and from nearby city wells reveals laterally consistent lithostratigraphic patterns over much of the metropolitan area west of the Rio Grande that may be used to delineate structural and related stratigraphic features that have a direct bearing on the availability of ground water.

  4. Application of frequency- and time-domain electromagnetic surveys to characterize hydrostratigraphy and landfill construction at the Amargosa Desert Research Site, Beatty, Nevada

    USGS Publications Warehouse

    White, Eric A.; Day-Lewis, Frederick D.; Johnson, Carole D.; Lane, John W.

    2016-01-01

    In 2014 and 2015, the U.S. Geological Survey (USGS) conducted frequency-domain electromagnetic (FDEM) surveys at the USGS Amargosa Desert Research Site (ADRS), approximately 17 kilometers (km) south of Beatty, Nevada. The FDEM surveys were conducted within and adjacent to a closed low-level radioactive waste disposal site located at the ADRS. FDEM surveys were conducted on a grid of north-south and east-west profiles to assess the locations and boundaries of historically recorded waste-disposal trenches. In 2015, the USGS conducted time-domain electromagnetic (TDEM) soundings along a profile adjacent to the disposal site (landfill) in cooperation with the U.S. Environmental Protection Agency (USEPA) to assess the thickness and characteristics of the underlying deep unsaturated zone and the hydrostratigraphy of the underlying saturated zone. FDEM survey results indicate the general location and extent of the waste-disposal trenches and reveal potential differences in material properties and the type and concentration of waste in several areas of the landfill. The TDEM surveys provide information on the underlying hydrostratigraphy and characteristics of the unsaturated zone that inform the site conceptual model and support an improved understanding of the hydrostratigraphic framework. Additional work is needed to interpret the TDEM results in the context of the local and regional structural geology.

  5. Nonpoint Source Solute Transport Normal to Aquifer Bedding in Heterogeneous, Markov Chain Random Fields

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Harter, T.; Sivakumar, B.

    2005-12-01

    Facies-based geostatistical models have become important tools for the stochastic analysis of flow and transport processes in heterogeneous aquifers. However, little is known about the dependency of these processes on the parameters of facies-based geostatistical models. This study examines nonpoint source solute transport normal to the major bedding plane in the presence of interconnected high-conductivity (coarse-textured) facies in the aquifer medium, and the dependence of the transport behavior upon the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute travel time probability distribution functions (pdfs) for solute flux from the water table to the bottom boundary (production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models with horizontal to vertical facies mean length anisotropy ratios, ek, from 25:1 to 300:1, and with a wide range of facies volume proportions (e.g., from 5% to 95% coarse-textured facies). Predictions of travel time pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer, the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and, to a lesser degree, the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, travel time pdfs are not log-normally distributed as is often assumed. Also, macrodispersive behavior (variance of the travel time pdf) was found not to be a unique function of the conductivity variance. The skewness of the travel time pdf varied from negatively skewed to strongly positively skewed within the parameter range examined. We also show that the Markov chain approach may give significantly different travel time pdfs when compared to the more commonly used Gaussian random field approach, even though the first- and second-order moments in the geostatistical distribution of the lnK field are identical. The choice of the appropriate geostatistical model is therefore critical in the assessment of nonpoint source transport.

  6. Nonpoint source solute transport normal to aquifer bedding in heterogeneous, Markov chain random fields

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Harter, Thomas; Sivakumar, Bellie

    2006-06-01

    Facies-based geostatistical models have become important tools for analyzing flow and mass transport processes in heterogeneous aquifers. Yet little is known about the relationship between these latter processes and the parameters of facies-based geostatistical models. In this study, we examine the transport of a nonpoint source solute normal (perpendicular) to the major bedding plane of an alluvial aquifer medium that contains multiple geologic facies, including interconnected, high-conductivity (coarse textured) facies. We also evaluate the dependence of the transport behavior on the parameters of the constitutive facies model. A facies-based Markov chain geostatistical model is used to quantify the spatial variability of the aquifer system's hydrostratigraphy. It is integrated with a groundwater flow model and a random walk particle transport model to estimate the solute traveltime probability density function (pdf) for solute flux from the water table to the bottom boundary (the production horizon) of the aquifer. The cases examined include two-, three-, and four-facies models, with mean length anisotropy ratios for horizontal to vertical facies, ek, from 25:1 to 300:1 and with a wide range of facies volume proportions (e.g., from 5 to 95% coarse-textured facies). Predictions of traveltime pdfs are found to be significantly affected by the number of hydrostratigraphic facies identified in the aquifer. Those predictions of traveltime pdfs also are affected by the proportions of coarse-textured sediments, the mean length of the facies (particularly the ratio of length to thickness of coarse materials), and, to a lesser degree, the juxtapositional preference among the hydrostratigraphic facies. In transport normal to the sedimentary bedding plane, traveltime is not lognormally distributed as is often assumed. Also, macrodispersive behavior (variance of the traveltime) is found not to be a unique function of the conductivity variance. 
For the parameter range examined, the third moment of the traveltime pdf varies from negatively skewed to strongly positively skewed. We also show that the Markov chain approach may give significantly different traveltime distributions when compared to the more commonly used Gaussian random field approach, even when the first- and second-order moments in the geostatistical distribution of the lnK field are identical. The choice of the appropriate geostatistical model is therefore critical in the assessment of nonpoint source transport, and uncertainty about that choice must be considered in evaluating the results.
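
    The facies-based Markov chain models used in the two studies above describe how one facies succeeds another with a transition-probability matrix whose diagonal terms set mean facies thickness and whose stationary distribution sets facies volume proportions. A minimal one-dimensional (vertical) sketch, with an entirely hypothetical three-facies transition matrix, illustrates both relationships:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical vertical transition-probability matrix for three facies:
# 0 = coarse sand/gravel, 1 = loamy sand, 2 = silt/clay.
# Rows sum to 1; the diagonal controls mean facies thickness,
# since E[run length] = 1 / (1 - p_ii) cells.
P = np.array([
    [0.90, 0.05, 0.05],
    [0.10, 0.80, 0.10],
    [0.05, 0.15, 0.80],
])

def simulate_column(P, n_cells, rng):
    """Simulate a vertical facies sequence cell by cell."""
    states = np.empty(n_cells, dtype=int)
    states[0] = rng.integers(P.shape[0])
    for i in range(1, n_cells):
        states[i] = rng.choice(P.shape[0], p=P[states[i - 1]])
    return states

column = simulate_column(P, 50_000, rng)

# Empirical facies proportions converge to the stationary distribution
# of P (the eigenvector of P.T with eigenvalue 1).
props = np.bincount(column, minlength=3) / column.size
w, v = np.linalg.eig(P.T)
stationary = np.real(v[:, np.argmax(np.real(w))])
stationary /= stationary.sum()
print(np.round(props, 2), np.round(stationary, 2))
```

    Conditional simulation codes such as the studies' facies models extend this idea to 3D, with separate mean lengths in the strike, dip, and vertical directions and with juxtapositional preferences encoded in the off-diagonal terms.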

  7. Vulnerability assessment of groundwater-dependent ecosystems based on integrated groundwater flow model construction

    NASA Astrophysics Data System (ADS)

    Tóth, Ádám; Simon, Szilvia; Galsa, Attila; Havril, Timea; Monteiro Santos, Fernando A.; Müller, Imre; Mádl-Szőnyi, Judit

    2017-04-01

    Groundwater-dependent ecosystems (GDEs) are highly influenced by the amount of groundwater, the seasonal variation of precipitation and consequent water table fluctuation, and anthropogenic activities. They can be regarded as natural surface manifestations of flowing groundwater. The preservation of the environment and biodiversity of these GDEs is an important issue worldwide; however, water management policies and action plans cannot be formulated in the absence of proper hydrogeological knowledge. The concept of gravity-driven regional groundwater flow can aid the understanding of flow patterns and the interpretation of environmental processes and conditions. Where the required well data are unavailable, a geological-hydrogeological numerical model of the study area cannot be constructed from borehole information alone. In this case, spatially continuous geophysical data can support groundwater flow model building: systematically combined geophysical methods can provide model input. Integration of lithostratigraphic, electrostratigraphic, and hydrostratigraphic information can aid groundwater flow model construction: hydrostratigraphic units and their hydraulic behaviour, boundaries, and geometry can be obtained. Groundwater-related natural manifestations, such as GDEs, can then be explained with the help of the revealed flow pattern and field mapping of features. Integrated groundwater flow model construction for assessing the vulnerability of GDEs is presented via a case study of the geologically complex Tihany Peninsula, Hungary, with the aims of understanding the background and occurrence of groundwater-related environmental phenomena and surface water-groundwater interaction, and of revealing the potential effects of anthropogenic activity and climate change. In spite of the area's important and protected status, a fluid flow model that could support water management and nature protection policy had not previously been constructed.
    The 3D groundwater flow model, which was based on the scarce geologic information and the electromagnetic geophysical results, resolved the question of subsurface hydraulic connection between GDEs. Moreover, the gravity-driven regional groundwater flow concept helped to interpret the hydraulically nested flow systems (local and intermediate). The numerical simulation was validated against natural surface conditions and phenomena. Consequently, the position of wetlands, their vegetation types, discharge features, and induced landslides were explained as environmental imprints of groundwater. Anthropogenic activities and climate change have a great impact on groundwater. Since the GDEs are fed by local flow systems, the impacts of climate change and anthropogenic activities could be notable; the highly vulnerable wetlands therefore have to be a focus of water management and nature conservation policy.

  8. Geologic framework and hydrostratigraphy of the Edwards and Trinity aquifers within northern Bexar and Comal Counties, Texas

    USGS Publications Warehouse

    Clark, Allan K.; Golab, James A.; Morris, Robert R.

    2016-11-28

    During 2014–16, the U.S. Geological Survey, in cooperation with the Edwards Aquifer Authority, documented the geologic framework and hydrostratigraphy of the Edwards and Trinity aquifers within northern Bexar and Comal Counties, Texas. The Edwards and Trinity aquifers are major sources of water for agriculture, industry, and urban and rural communities in south-central Texas. Both the Edwards and Trinity are classified as major aquifers by the State of Texas. The purpose of this report is to present the geologic framework and hydrostratigraphy of the Edwards and Trinity aquifers within northern Bexar and Comal Counties, Tex. The report includes a detailed 1:24,000-scale hydrostratigraphic map, names, and descriptions of the geology and hydrostratigraphic units (HSUs) in the study area. The scope of the report is focused on the geologic framework and hydrostratigraphy of the outcrops and hydrostratigraphy of the Edwards and Trinity aquifers within northern Bexar and Comal Counties, Tex. In addition, parts of the adjacent upper confining unit to the Edwards aquifer are included. The study area, approximately 866 square miles, is within the outcrops of the Edwards and Trinity aquifers and overlying confining units (Washita, Eagle Ford, Austin, and Taylor Groups) in northern Bexar and Comal Counties, Tex. The rocks within the study area are sedimentary and range in age from Early to Late Cretaceous. The Miocene-age Balcones fault zone is the primary structural feature within the study area. The fault zone is an extensional system of faults that generally trends southwest to northeast in south-central Texas. The faults have normal throw, are en echelon, and are mostly downthrown to the southeast. The Early Cretaceous Edwards Group rocks were deposited in an open marine to supratidal flats environment during two marine transgressions. The Edwards Group is composed of the Kainer and Person Formations.
    Following tectonic uplift, subaerial exposure, and erosion near the end of Early Cretaceous time, the area of present-day south-central Texas was again submerged during the Late Cretaceous by a marine transgression resulting in deposition of the Georgetown Formation of the Washita Group. The Early Cretaceous Edwards Group, which overlies the Trinity Group, is composed of mudstone to boundstone, dolomitic limestone, argillaceous limestone, evaporite, shale, and chert. The Kainer Formation is subdivided into (bottom to top) the basal nodular, dolomitic, Kirschberg Evaporite, and grainstone members. The Person Formation is subdivided into (bottom to top) the regional dense, leached and collapsed (undivided), and cyclic and marine (undivided) members. Hydrostratigraphically, the rocks exposed in the study area represent a section of the upper confining unit to the Edwards aquifer, the Edwards aquifer, the upper zone of the Trinity aquifer, and the middle zone of the Trinity aquifer. The Pecan Gap Formation (Taylor Group), Austin Group, Eagle Ford Group, Buda Limestone, and Del Rio Clay are generally considered to be the upper confining unit to the Edwards aquifer. The Edwards aquifer was subdivided into HSUs I to VIII. The Georgetown Formation of the Washita Group contains HSU I. The Person Formation of the Edwards Group contains HSUs II (cyclic and marine members [Kpcm], undivided), III (leached and collapsed members [Kplc], undivided), and IV (regional dense member [Kprd]), and the Kainer Formation of the Edwards Group contains HSUs V (grainstone member [Kkg]), VI (Kirschberg Evaporite Member [Kkke]), VII (dolomitic member [Kkd]), and VIII (basal nodular member [Kkbn]). The Trinity aquifer is separated into upper, middle, and lower aquifer units (hereinafter referred to as “zones”). The upper zone of the Trinity aquifer is in the upper member of the Glen Rose Limestone.
    The middle zone of the Trinity aquifer is formed in the lower member of the Glen Rose Limestone, Hensell Sand, and Cow Creek Limestone. The regionally extensive Hammett Shale forms a confining unit between the middle and lower zones of the Trinity aquifer. The lower zone of the Trinity aquifer consists of the Sligo and Hosston Formations, which do not crop out in the study area. The upper zone of the Trinity aquifer is subdivided into five informal HSUs (top to bottom): cavernous, Camp Bullis, upper evaporite, fossiliferous, and lower evaporite. The middle zone of the Trinity aquifer is composed of the (top to bottom) Bulverde, Little Blanco, Twin Sisters, Doeppenschmidt, Rust, Honey Creek, Hensell, and Cow Creek HSUs. The underlying Hammett HSU is a regional confining unit between the middle and lower zones of the Trinity aquifer. The lower zone of the Trinity aquifer is not exposed in the study area. Groundwater recharge and flow paths in the study area are influenced not only by the hydrostratigraphic characteristics of the individual HSUs but also by faults, fractures, and geologic structure. Faulting associated with the Balcones fault zone (1) might affect groundwater flow paths by forming a barrier to flow that results in water moving parallel to the fault plane, (2) might affect groundwater flow paths by increasing flow across the fault because of fracturing and juxtaposing porous and permeable units, or (3) might have no effect on the groundwater flow paths. The hydrologic connection between the Edwards and Trinity aquifers and the various HSUs is complex. The complexity of the aquifer system is a combination of the original depositional history, bioturbation, primary and secondary porosity, diagenesis, and fracturing of the area from faulting. All of these factors have resulted in development of modified porosity, permeability, and transmissivity within and between the aquifers.
Faulting produced highly fractured areas that have allowed for rapid infiltration of water and subsequently formed solutionally enhanced fractures, bedding planes, channels, and caves that are highly permeable and transmissive. The juxtaposition resulting from faulting has resulted in areas of interconnectedness between the Edwards and Trinity aquifers and the various HSUs that form the aquifers.

  9. Fluid Exchange Across the Seafloor of the Continental Shelf in the South Atlantic Bight

    NASA Astrophysics Data System (ADS)

    White, S. M.; Wilson, A. M.; Moore, W. S.; Smoak, E. A.; George, C.

    2014-12-01

    Increasing evidence suggests that saline submarine groundwater discharges from the seafloor in volumes that rival river discharge, but this discharge occurs far from shore, spread regionally across the continental shelves. The very limited observational data suggest that saline discharge occurs via long-term regional flow systems and rapid flushing of porewaters from sandy sediment during storm events. This study aims to overcome the paucity of available observational constraints on characterizing regional-scale fluid exchange on passive margin continental shelves. We are developing a detailed hydrostratigraphic framework based on 200 km of CHIRP seismic lines 5-20 km offshore from Charleston, SC and 13 sediment cores up to 6.5 m long. This survey revealed varying thicknesses (0-15 m) of sediment overlying Cretaceous limestone basement, and a filled paleochannel fluvial system. We have installed 3 sets of nested wells and an additional 10 temperature-gradient arrays to observe a wide variety of environments across the shelf. The wells and thermal arrays have been recently installed in the upper 5 m of the sediment, to allow monitoring of pressure and temperature. The wells will also be sampled for Ra tracers and nutrient concentrations. The combination of wells and survey data will allow us to estimate rates of submarine groundwater discharge via hydraulic gradients and by using heat and geochemical tracers. We have developed a numerical model to invert thermal data to estimate both long-term regional groundwater flow and rapid flushing associated with storm events.
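
    Inverting thermal data for groundwater flux, as described above, typically rests on a steady one-dimensional conduction-advection model of the temperature profile; the classic Bredehoeft and Papadopulos (1965) type-curve solution is one such model. The sketch below implements that solution; the boundary temperatures and Peclet number are illustrative values, not measurements from this study.

```python
import math

def temperature_profile(z, L, T_top, T_bottom, peclet):
    """Steady 1-D temperature between depths 0 and L with vertical
    groundwater flow (Bredehoeft-Papadopulos solution):
      (T(z) - T_top) / (T_bottom - T_top) = expm1(Pe*z/L) / expm1(Pe)
    Pe > 0 for downward flow, Pe < 0 for upward flow (discharge).
    Reduces to a linear, purely conductive profile as Pe -> 0."""
    if abs(peclet) < 1e-9:
        return T_top + (T_bottom - T_top) * z / L
    return T_top + (T_bottom - T_top) * math.expm1(peclet * z / L) / math.expm1(peclet)

# Upward discharge (Pe = -2) bows the profile toward the bottom
# temperature; fitting the observed bow yields a seepage-rate estimate.
for z in (0.0, 2.5, 5.0):
    print(z, round(temperature_profile(z, 5.0, 10.0, 20.0, -2.0), 2))
```

    In a field application the Peclet number is fit to the observed profile in each thermal array, and the seepage rate follows from Pe, the sediment thermal conductivity, and the fluid heat capacity; transient extensions of the same balance handle storm-driven flushing.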

  10. H-Area Seepage Basins groundwater monitoring report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-03-01

    During fourth quarter 1992, the groundwater at the H-Area Seepage Basins (HASB) was monitored in compliance with South Carolina Hazardous Waste Management Regulations, R61-79.265, Subpart F. Samples were collected from 130 wells that monitor the three separate hydrostratigraphic units that make up the uppermost aquifer beneath the HASB. A detailed description of the uppermost aquifer is included in the Resource Conservation and Recovery Act Part B Post-Closure Care Permit Application for the H-Area Hazardous Waste Management Facility submitted to the South Carolina Department of Health and Environmental Control in December 1990. Historically, as well as currently, tritium, nitrate, total alpha-emitting radium, gross alpha, and mercury have been the primary constituents observed above final Primary Drinking Water Standards (PDWS) in groundwater at the HASB. Isoconcentration/isoactivity maps included in this report indicate both the concentration/activity and extent of the primary contaminants in each of the three hydrostratigraphic units during first and fourth quarter 1992. Water-level maps indicate that the groundwater flow rates and directions at the HASB have remained relatively constant since the basins ceased to be active in 1988.

  11. H-Area Seepage Basins groundwater monitoring report. Fourth quarter 1992 and 1992 summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-03-01

    During fourth quarter 1992, the groundwater at the H-Area Seepage Basins (HASB) was monitored in compliance with South Carolina Hazardous Waste Management Regulations, R61-79.265, Subpart F. Samples were collected from 130 wells that monitor the three separate hydrostratigraphic units that make up the uppermost aquifer beneath the HASB. A detailed description of the uppermost aquifer is included in the Resource Conservation and Recovery Act Part B Post-Closure Care Permit Application for the H-Area Hazardous Waste Management Facility submitted to the South Carolina Department of Health and Environmental Control in December 1990. Historically, as well as currently, tritium, nitrate, total alpha-emitting radium, gross alpha, and mercury have been the primary constituents observed above final Primary Drinking Water Standards (PDWS) in groundwater at the HASB. Isoconcentration/isoactivity maps included in this report indicate both the concentration/activity and extent of the primary contaminants in each of the three hydrostratigraphic units during first and fourth quarter 1992. Water-level maps indicate that the groundwater flow rates and directions at the HASB have remained relatively constant since the basins ceased to be active in 1988.

  12. Boise Hydrogeophysical Research Site: Control Volume/Test Cell and Community Research Asset

    NASA Astrophysics Data System (ADS)

    Barrash, W.; Bradford, J.; Malama, B.

    2008-12-01

    The Boise Hydrogeophysical Research Site (BHRS) is a research wellfield or field-scale test facility developed in a shallow, coarse, fluvial aquifer with the objectives of supporting: (a) development of cost-effective, non- or minimally invasive quantitative characterization and imaging methods in heterogeneous aquifers using hydrologic and geophysical techniques; (b) examination of fundamental relationships and processes at multiple scales; (c) testing of theories and models for groundwater flow and solute transport; and (d) education and training of students in multidisciplinary subsurface science and engineering. The design of the wells and the wellfield supports modular use and reoccupation of wells for a wide range of single-well, cross-hole, multiwell, and multilevel hydrologic, geophysical, and combined hydrologic-geophysical experiments. Efforts to date by Boise State researchers and collaborators have largely focused on: (a) establishing the 3D distributions of geologic, hydrologic, and geophysical parameters, which can then be used as the basis for jointly inverting hard and soft data to return the 3D K distribution, and (b) developing subsurface measurement and imaging methods, including tomographic characterization and imaging methods. At this point the hydrostratigraphic framework of the BHRS is known to be a hierarchical multi-scale system that includes layers and lenses recognized with geologic, hydrologic, radar, seismic, and EM methods; details are now emerging that may allow 3D deterministic characterization of zones and/or material variations at the meter scale in the central wellfield. Also, the site design and subsurface framework have supported a variety of testing configurations for joint hydrologic and geophysical experiments.
Going forward we recognize the opportunity to increase the R&D returns from use of the BHRS with additional infrastructure (especially for monitoring the vadose zone and surface water-groundwater interactions), more collaborative activity, and greater access to site data. Our broader goal of becoming more available as a research asset for the scientific community also supports the long-term business plan of increasing funding opportunities to maintain and operate the site.

  13. Map Showing Geology and Hydrostratigraphy of the Edwards Aquifer Catchment Area, Northern Bexar County, South-Central Texas

    USGS Publications Warehouse

    Clark, Amy R.; Blome, Charles D.; Faith, Jason R.

    2009-01-01

    Rock units forming the Edwards and Trinity aquifers in northern Bexar County, Texas, are exposed within all or parts of seven 7.5-minute quadrangles: Bulverde, Camp Bullis, Castle Hills, Helotes, Jack Mountain, San Geronimo, and Van Raub. The Edwards aquifer is the most prolific ground-water source in Bexar County, whereas the Trinity aquifer supplies water for residential, commercial, and industrial uses in areas north of San Antonio. The geologic map of northern Bexar County shows the distribution of informal hydrostratigraphic members of the Edwards Group and the underlying upper member of the Glen Rose Limestone. Exposures of the Glen Rose Limestone, which alone forms the Trinity aquifer, cover approximately 467 km2 in the county. This study also describes and names five informal hydrostratigraphic members that constitute the upper member of the Glen Rose Limestone; these include, in descending order, the Caverness, Camp Bullis, Upper evaporite, Fossiliferous, and Lower evaporite members. This study improves our understanding of the hydrogeologic connection between the two aquifers because it describes the geology that controls the infiltration of surface water and the subsurface flow of ground water from the catchment area (outcropping Trinity aquifer rocks) to the Edwards water-bearing exposures.

  14. Helicopter Electromagnetic and Magnetic Geophysical Survey Data for Portions of the North Platte River and Lodgepole Creek, Nebraska, June 2008

    USGS Publications Warehouse

    Smith, Bruce D.; Abraham, Jared D.; Cannia, James C.; Hill, Patricia

    2009-01-01

    This report is a release of digital data from a helicopter electromagnetic and magnetic survey that was conducted during June 2008 in areas of western Nebraska as part of a joint hydrologic study by the North Platte Natural Resource District, South Platte Natural Resource District, and U.S. Geological Survey. The objective of the contracted survey, conducted by Fugro Airborne, Ltd., was to improve the understanding of the relationship between surface water and groundwater systems critical to developing the groundwater models used in management programs for water resources. The survey covered 1,375 line km (854 line mi). A unique aspect of this survey is the flight line layout. One set of flight lines was flown paralleling each side of the east-west trending North Platte River and Lodgepole Creek. The survey also included widely separated (10 km) perpendicular north-south lines. The success of this survey design depended on a well-understood regional hydrogeologic framework and model developed by the Cooperative Hydrologic Study of the Platte River Basin. Resistivity variations along lines could be related to this framework. In addition to these lines, more traditional surveys consisting of parallel flight lines separated by about 270 m were carried out for one block in each of the drainages. These surveys helped to establish the spatial variations of the resistivity of hydrostratigraphic units. The electromagnetic equipment consisted of six different coil-pair orientations that measured resistivity at discrete frequencies from about 400 Hz to about 140,000 Hz. The electromagnetic data along flight lines were converted to electrical resistivity. The resulting line data were converted to geo-referenced grids and maps, which are included with this report. In addition to the electromagnetic data, total field magnetic data and digital elevation data were collected.
Data released in this report consist of data along flight lines, digital grids, and digital maps of the apparent resistivity and total magnetic field. The depth range of the subsurface investigation for the electromagnetic survey (estimated as deep as 60 m) is comparable to the depth of shallow aquifers. The geophysical data and hydrologic information from U.S. Geological Survey and cooperator studies are being used by resource managers to develop groundwater resource plans for the area. In addition, data will be used to refine hydrologic models in western Nebraska.
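
    The frequency range quoted above (≈400 Hz to ≈140,000 Hz) controls the depth of investigation roughly through the electromagnetic skin depth of the ground. A short sketch of that standard relationship follows; the 30 ohm-m ground resistivity is an assumed, illustrative value, not a result from this survey.

```python
import math

def skin_depth_m(resistivity_ohm_m, frequency_hz):
    """Electromagnetic skin depth (m) in a uniform half-space:
    delta = sqrt(2 * rho / (omega * mu0)) ~= 503 * sqrt(rho / f)."""
    mu0 = 4e-7 * math.pi                 # permeability of free space (H/m)
    omega = 2.0 * math.pi * frequency_hz
    return math.sqrt(2.0 * resistivity_ohm_m / (omega * mu0))

# Higher frequencies sense shallower ground; lower frequencies deeper.
for f in (400.0, 140_000.0):
    print(f"{f:>9.0f} Hz: skin depth ~ {skin_depth_m(30.0, f):6.1f} m")
```

    The practical depth of investigation is a fraction of the skin depth and also depends on coil geometry and noise, which is consistent with the ~60 m investigation depth stated for this survey.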

  15. Sub-crop geologic map of pre-Tertiary rocks in the Yucca Flat and northern Frenchman Flat areas, Nevada Test Site, southern Nevada

    USGS Publications Warehouse

    Cole, James C.; Harris, Anita G.; Wahl, Ronald R.

    1997-01-01

    This map displays interpreted structural and stratigraphic relations among the Paleozoic and older rocks of the Nevada Test Site region beneath the Miocene volcanic rocks and younger alluvium in the Yucca Flat and northern Frenchman Flat basins. These interpretations are based on a comprehensive examination and review of data for more than 77 drillholes that penetrated part of the pre-Tertiary basement beneath these post-middle Miocene structural basins. Biostratigraphic data from conodont fossils were newly obtained for 31 of these holes, and a thorough review of all prior microfossil paleontologic data is incorporated in the analysis. Subsurface relationships are interpreted in light of a revised regional geologic framework synthesized from detailed geologic mapping in the ranges surrounding Yucca Flat, from comprehensive stratigraphic studies in the region, and from additional detailed field studies on and around the Nevada Test Site. All available data indicate the subsurface geology of Yucca Flat is considerably more complicated than previous interpretations have suggested. The western part of the basin, in particular, is underlain by relics of the eastward-vergent Belted Range thrust system that are folded back toward the west and thrust by local, west-vergent contractional structures of the CP thrust system. Field evidence from the ranges surrounding the north end of Yucca Flat indicates that two significant strike-slip faults track southward beneath the post-middle Miocene basin fill, but their subsurface traces cannot be closely defined from the available evidence. In contrast, the eastern part of the Yucca Flat basin is interpreted to be underlain by a fairly simple north-trending, broad syncline in the pre-Tertiary units.
    Far fewer data are available for the northern Frenchman Flat basin, but regional analysis indicates the pre-Tertiary structure there should also be relatively simple and not affected by thrusting. This new interpretation has implications for ground water flow through pre-Tertiary rocks beneath the Yucca Flat and northern Frenchman Flat areas, and has consequences for ground water modeling and model validation. Our data indicate that the Mississippian Chainman Shale is not a laterally extensive confining unit in the western part of the basin because it is folded back onto itself by the convergent structures of the Belted Range and CP thrust systems. Early and Middle Paleozoic limestone and dolomite are present beneath most of both basins and, regardless of structural complications, are interpreted to form a laterally continuous and extensive carbonate aquifer. The structural culmination that marks the French Peak accommodation zone along the topographic divide between the two basins provides a lateral pathway through highly fractured rock between the volcanic aquifers of Yucca Flat and the regional carbonate aquifer. This pathway may accelerate the migration of ground-water contaminants introduced by underground nuclear testing toward discharge areas beyond the Nevada Test Site boundaries. Predictive three-dimensional models of hydrostratigraphic units and ground-water flow in the pre-Tertiary rocks of subsurface Yucca Flat are likely to be unrealistic due to the extreme structural complexities. The interpretation of hydrologic and geochemical data obtained from monitoring wells will be difficult to extrapolate through the flow system until more is known about the continuity of hydrostratigraphic units.

  16. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle-tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and in transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
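The convolution-integral approach described above can be sketched as follows: particle-tracking travel times define a transfer function that is convolved with a time-varying source release to obtain downgradient concentrations. This is a minimal illustration with invented numbers, not the UGTA project's actual source terms or flow fields.

```python
import numpy as np

# Hypothetical sketch: travel times from a particle-tracking run define a
# breakthrough (transfer) function g(t), which is convolved with a source
# release history to give concentration at a downgradient location.
rng = np.random.default_rng(42)

dt = 1.0                                # years per time step
t = np.arange(0, 200, dt)               # simulation horizon

# Travel-time distribution from a hypothetical particle-tracking run
travel_times = rng.lognormal(mean=3.0, sigma=0.5, size=10_000)
g, _ = np.histogram(travel_times, bins=np.arange(0, 200 + dt, dt), density=True)

# Hypothetical source release rate (exponentially decaying, arbitrary units)
release = np.exp(-t / 30.0)

# Discrete convolution integral: c(t) = sum over tau of release(tau) g(t - tau) dtau
conc = np.convolve(release, g)[: t.size] * dt

print(f"peak concentration {conc.max():.4f} at t = {t[conc.argmax()]:.0f} yr")
```

Each Monte Carlo realization would redo this convolution with perturbed source and travel-time inputs, which is far cheaper than re-running a full transport model.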

  17. Role of the sedimentary structure of the urban vadose zone (URVAZO) on the transfer of heavy metals of an urban stormwater basin

    NASA Astrophysics Data System (ADS)

    Angulo-Jaramillo, R.; Winiarski, T.; Goutaland, D.; Lassabatere, L.

    2009-12-01

    Stormwater infiltration basins have become a common alternative to traditional stormwater pipe networks in urban areas. They are often built in permeable subsurface soils (the urban vadose zone, URVAZO), such as alluvial deposits. These sedimentary deposits are highly heterogeneous and generate preferential flow paths that may cause rapid or non-uniform transport of contaminants to great depths. Understanding how heterogeneities in the subsurface vadose zone control contaminant transport and fluid flow to the aquifer remains a challenge in urban hydrology. Indeed, urban stormwater may contain pollutants that can contaminate either soil or groundwater. The aim of this study is to evaluate the role of the lithological heterogeneity of a glaciofluvial deposit underlying an urban infiltration basin in linking water flow and heavy-metal retention. A trench wall (14 m long by 3 m deep) was exposed by excavating the glaciofluvial formation. Through a hydrogeophysical approach based on sedimentary structural units and in situ hydraulic characterization (Beerkan tests), a realistic 2D hydrostratigraphic model was defined. The trench was sampled on nine vertical sections of 1.5 m length, with ten samples per vertical section following each lithofacies; a total of 90 samples were analyzed. Coarse (mechanical sieving) and fine (laser diffraction) particle-size distributions were determined, and concentrations of Pb, Cu, Zn, and organic matter (OM) were measured in three replicates for each sample. Principal component analysis shows a strong correlation between metal concentration and lithofacies. This hydrostratigraphic model was implemented in the finite element program Hydrus2D. The soil heterogeneity affects the water-content field under slightly saturated conditions, as it induces capillary barrier effects.
These capillary barrier effects may generate water accumulation in some lithofacies overlying matrix-free gravel; they lead to lateral flow patterns known as funneled flows. Knowledge of the geometry (orientation, dip) at the structural scale is therefore a prerequisite for evaluating the preferential flow paths. Such funneled flows may also explain how part of the silt fraction derives from colloidal migration through the vadose zone. The use of coupled water-geochemical transfer models allows us to formulate hypotheses that aid understanding of the principal hydrogeochemical processes in the urban vadose zone.
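A minimal sketch of why matrix-free gravel beneath sand can act as a capillary barrier, using the van Genuchten-Mualem model with assumed parameters (not the study's measured Beerkan values): at moderate suction the gravel's unsaturated conductivity falls far below the sand's, despite its much higher saturated conductivity.

```python
import numpy as np

# Illustrative van Genuchten-Mualem sketch; ks, alpha, and n below are
# generic textbook-style values, not measurements from the studied deposit.
def mualem_k(h, ks, alpha, n):
    """Unsaturated hydraulic conductivity K(h), with h = suction head in m."""
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * h) ** n) ** (-m)           # effective saturation
    return ks * np.sqrt(se) * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

h = 0.30  # 30 cm suction, typical of slightly unsaturated conditions

k_sand   = mualem_k(h, ks=1e-5, alpha=3.0,  n=2.0)   # sandy lithofacies
k_gravel = mualem_k(h, ks=1e-3, alpha=20.0, n=3.0)   # matrix-free gravel

print(f"K_sand = {k_sand:.2e} m/s, K_gravel = {k_gravel:.2e} m/s")
```

With these assumed parameters the gravel, although 100 times more conductive when saturated, is orders of magnitude less conductive at 30 cm of suction, so water perches on the interface and is funneled laterally.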

  18. Hydrostratigraphic interpretation of test-hole and borehole geophysical data, Kimball, Cheyenne, and Deuel Counties, Nebraska, 2011-12

    USGS Publications Warehouse

    Hobza, Christopher M.; Sibray, Steven S.

    2014-01-01

    Recently (2004) adopted legislation in Nebraska requires a sustainable balance between long-term supplies and uses of surface water and groundwater and requires Natural Resources Districts to understand the effect of groundwater use on surface-water systems when developing a groundwater-management plan. The South Platte Natural Resources District (SPNRD) is located in the southern Nebraska Panhandle and overlies the nationally important High Plains aquifer. Declines in water levels have been documented, and more stringent regulations have been enacted to ensure the supply of groundwater will be sufficient to meet the needs of future generations. Because an improved understanding of the hydrogeologic characteristics of this aquifer system is needed to ensure sustainability of groundwater withdrawals, the U.S. Geological Survey, in cooperation with the SPNRD, Conservation and Survey Division of the University of Nebraska-Lincoln, and the Nebraska Environmental Trust, began a hydrogeologic study of the SPNRD to describe the lithology and thickness of the High Plains aquifer. This report documents these characteristics at 29 new test holes, 28 of which were drilled to the base of the High Plains aquifer. Herein the High Plains aquifer is considered to include all hydrologically connected units of Tertiary and Quaternary age. The depth to the base of aquifer was interpreted to range from 37 to 610 feet in 28 of the 29 test holes. At some locations, particularly northern Kimball County, the base-of-aquifer surface was difficult to interpret from drill-cutting samples and borehole geophysical logs. The depth to the base of aquifer determined for test holes drilled for this report was compared with the base-of-aquifer surface interpreted by previous researchers. In general, there were greater differences between the base-of-aquifer elevations reported herein and those in previous studies for areas north of Lodgepole Creek compared to areas south of Lodgepole Creek.
The largest difference was at test hole 5-SP-11, where an Ogallala-filled paleovalley previously had been interpreted based on relatively sparse test-hole data west of 5-SP-11. The base of aquifer near test hole 5-SP-11 reported herein is approximately 230 feet higher in elevation than previously interpreted. Among other test holes that are likely to have been drilled in Ogallala-filled paleovalleys, the greatest difference in the interpreted base of aquifer was for test hole 7-CC-11, northeast of Potter, Nebraska, where the base of aquifer is 180 feet deeper than previously interpreted. Interpretation of test-hole and borehole geophysical data for 29 additional test holes will improve resource managers’ understanding of the hydrogeologic characteristics, including aquifer thickness. Aquifer thickness, which is related to total water in storage, is not well quantified in the north and south tablelands. The additional hydrostratigraphic interpretations provided in this report will improve the hydrogeologic framework used in current (2014) and future groundwater models, which are the basis for many water-management decisions.

  19. Hydrostratigraphic mapping of the Milford-Souhegan glacial drift aquifer, and effects of hydrostratigraphy on transport of PCE, Operable Unit 1, Savage Superfund Site, Milford, New Hampshire

    USGS Publications Warehouse

    Harte, Philip T.

    2010-01-01

    The Savage Municipal Well Superfund site in the Town of Milford, New Hampshire, is underlain by a 0.5-square-mile plume (as mapped in 1994) of volatile organic compounds (VOCs), most of which consist of tetrachloroethylene (PCE). The plume occurs mostly within highly transmissive stratified-drift deposits but also extends into underlying till and bedrock. The plume has been divided into two areas called Operable Unit 1 (OU1), which contains the primary source area, and Operable Unit 2 (OU2), which is defined as the extended plume area outside of OU1. The OU1 remedial system includes a low-permeability barrier wall that encircles the highest detected concentrations of PCE and a series of injection and extraction wells to contain and remove contaminants. The barrier wall likely penetrates the full thickness of the sand and gravel; in many places, it also penetrates the full thickness of the underlying basal till and sits atop bedrock. From 1998 to 2004, PCE concentrations decreased by an average of 80 percent at most wells outside the barrier wall. However, inside the barrier, PCE concentrations greater than 10,000 micrograms per liter (μg/L) still existed in 2008; these areas of recalcitrant PCE present challenges to successful remediation. The U.S. Geological Survey (USGS), in cooperation with the New Hampshire Department of Environmental Services (NHDES) and the U.S. Environmental Protection Agency (USEPA), Region 1, is studying the solute transport of VOCs (primarily PCE) in contaminated groundwater in the unconsolidated sediments (overburden) of the Savage site and specifically assisting in the evaluation of the effectiveness of remedial operations in the OU1 area.
As part of this effort, the USGS analyzed the subsurface stratigraphy to help understand hydrostratigraphic controls on remediation. A combination of lithologic logging, borehole natural gamma-ray and electromagnetic (EM) induction logging, and test drilling identified 11 primary hydrostratigraphic units in OU1. These 11 units consist of several well-sorted sandy layers with some gravel that are separated by poorly sorted cobble layers with a fine-grained matrix. Collectively these units represent glacial sediments deposited by localized ice-margin fluctuations. For the most part, the units are semi-planar, particularly the cobble units, and are truncated by an undulating bedrock surface. The lowermost unit is a basal till that ranges in thickness from zero to greater than 10 feet and mantles the bedrock surface. The 11 units have different lithologic and hydraulic characteristics. The hydraulic conductivity of the well-sorted sand and gravel units is typically greater than that of the poorly sorted cobble units and the basal till. Hydraulic conductivities range from 5 to greater than 500 feet per day. Lateral and vertical variations in lithology and hydraulic conductivity are inferred from variations in borehole natural gamma-ray counts and estimates of hydraulic conductivity. Comparison of the hydrostratigraphic units with the spatial distribution of PCE concentrations suggests that solute transport away from source areas is primarily lateral within the permeable sandy units in the middle to lower parts of the aquifer. Along the centerline of the interior barrier area, the highest PCE concentrations are in the sandy units to the east of suspected source areas.

  20. Highly parameterized model calibration with cloud computing: an example of regional flow model calibration in northeast Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Hayley, Kevin; Schumacher, J.; MacMillan, G. J.; Boutin, L. C.

    2014-05-01

    Expanding groundwater datasets collected by automated sensors, and improved groundwater databases, have caused a rapid increase in the calibration data available for groundwater modeling projects. Improved methods of subsurface characterization have increased the need for model complexity to represent geological and hydrogeological interpretations. The larger calibration datasets and the need for meaningful predictive uncertainty analysis have both increased the degree of parameterization necessary during model calibration. Due to these competing demands, modern groundwater modeling efforts require a massive degree of parallelization in order to remain computationally tractable. A methodology for the calibration of highly parameterized, computationally expensive models using the Amazon EC2 cloud computing service is presented. The calibration of a regional-scale model of groundwater flow in Alberta, Canada, is provided as an example. The model covers a 30,865-km2 domain and includes 28 hydrostratigraphic units. Aquifer properties were calibrated to more than 1,500 static hydraulic head measurements and 10 years of measurements during industrial groundwater use. Three regionally extensive aquifers were parameterized (with spatially variable hydraulic conductivity fields), as was the areal recharge boundary condition, leading to 450 adjustable parameters in total. The PEST-based model calibration was parallelized on up to 250 computing nodes located on Amazon's EC2 servers.
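The parallelization pattern behind this kind of PEST-style calibration can be sketched as follows. Each candidate parameter set requires an independent forward model run, so runs are farmed out to workers (cloud nodes in the study; a local thread pool here). The "forward model", parameter ranges, and observations below are all illustrative stand-ins, not the Alberta flow model.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

rng = np.random.default_rng(0)
obs_heads = rng.normal(300.0, 10.0, size=50)        # synthetic observations

def forward_model(params):
    """Stand-in forward run: predicted heads from a parameter vector."""
    k, recharge = params
    return 290.0 + 5.0 * np.log10(k) + 100.0 * recharge + np.zeros(50)

def objective(params):
    """Sum of squared residuals, the quantity a PEST-style run minimizes."""
    resid = obs_heads - forward_model(params)
    return float(resid @ resid)

# Candidate parameter sets: (hydraulic conductivity, recharge), both assumed
candidates = [(10.0 ** rng.uniform(-1, 2), rng.uniform(0.0, 0.2))
              for _ in range(16)]

# Evaluate candidates in parallel; 4 workers stand in for the EC2 nodes
with ThreadPoolExecutor(max_workers=4) as pool:
    phis = list(pool.map(objective, candidates))

best = candidates[int(np.argmin(phis))]
print(f"best phi = {min(phis):.1f} for K = {best[0]:.2f}, R = {best[1]:.3f}")
```

Because each objective evaluation is independent, throughput scales nearly linearly with worker count, which is what makes a 250-node cloud deployment attractive for 450-parameter problems.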

  1. Sublake geologic structure from high-resolution seismic-reflection data from four sinkhole lakes in the Lake Wales Ridge, Central Florida

    USGS Publications Warehouse

    Tihansky, A.B.; Arthur, J.D.; DeWitt, D.W.

    1996-01-01

    Seismic-reflection profiles from Lake Wales, Blue Lake, Lake Letta, and Lake Apthorp, located along the Lake Wales Ridge in central Florida, provide local detail within the regional hydrogeologic framework as described by litho- and hydrostratigraphic cross sections. Lakes located within the mantled karst region have long been considered to be sinkhole lakes, originating from subsidence activity. High-resolution seismic-reflection data confirm this origin for these four lakes. The geologic framework of the Lake Wales Ridge has proven to be a suitable geologic setting for continuous high-resolution seismic-reflection profiling in lakes; however, the nature of the lake-bottom sediments largely controls the quality of the seismic data. In lakes with significant organic-rich bottom deposits, the interpretable record was limited to areas where organic deposits were minimal. In lakes with clean, sandy bottoms, the seismic-reflection methods were highly successful in obtaining data that can be correlated with sublake subsidence features. These techniques are useful in examining sublake geology and providing a better understanding of how confining units are affected by subsidence in a region where their continuity is of significant importance to local lake hydrology. Although local geologic control around each lake generally corresponds to the regional geologic framework, local deviations from regional geologic trends occur in sublake areas affected by subsidence activity. Each of the four lakes examined represents a unique set of geologic controls and provides some degree of structural evidence of subsidence activity.
Sublake geologic structures identified include: (1) marginal lake sediments dipping into bathymetric lows, (2) lateral discontinuity of confining units including sags and breaches, (3) the disruption and reworking of overlying unconsolidated siliciclastic sediments as they subside into the underlying irregular limestone surface, and (4) sublake regions where confining units appear to remain intact and unaffected by nearby subsidence activity. Each lake likely is underlain by several piping features rather than one large subsidence feature.
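Reflector depths in profiles like these come from two-way travel time and assumed interval velocities. A small illustrative conversion is sketched below; the velocities are generic values for fresh water and unconsolidated sediment, not the study's calibrated ones.

```python
def reflector_depth(twt_ms, water_depth_m, v_water=1480.0, v_sed=1600.0):
    """Depth below lake surface of a reflector, from two-way travel time.

    twt_ms: total two-way travel time in milliseconds.
    The one-way time is split between the water column and the sediments;
    velocities are assumed, illustrative values in m/s.
    """
    twt_s = twt_ms / 1000.0
    t_water = 2.0 * water_depth_m / v_water          # two-way time in water
    t_sed = max(twt_s - t_water, 0.0)                # remainder in sediment
    return water_depth_m + v_sed * t_sed / 2.0

# e.g. a reflector at 40 ms two-way time beneath 10 m of water
print(f"{reflector_depth(40.0, 10.0):.1f} m below lake surface")
```

This is why sag or breach geometries in confining units can be mapped quantitatively: picked travel times along the profile convert directly to reflector depths once velocities are assigned.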

  2. 3D Geospatial Models for Visualization and Analysis of Groundwater Contamination at a Nuclear Materials Processing Facility

    NASA Astrophysics Data System (ADS)

    Stirewalt, G. L.; Shepherd, J. C.

    2003-12-01

    Analysis of hydrostratigraphy and of uranium and nitrate contamination in groundwater at a former nuclear materials processing facility in Oklahoma was undertaken employing 3-dimensional (3D) geospatial modeling software. The models constructed played an important role in the regulatory decision process of the U.S. Nuclear Regulatory Commission (NRC) because they enabled visualization of temporal variations in contaminant concentrations and plume geometry. Three aquifer systems occur at the site, composed of water-bearing fractured shales separated by indurated sandstone aquitards. The uppermost terrace groundwater system (TGWS) aquifer is composed of terrace and alluvial deposits and a basal shale. The shallow groundwater system (SGWS) aquifer is made up of three shale units and two sandstones. It is separated from the overlying TGWS and underlying deep groundwater system (DGWS) aquifer by sandstone aquitards. Spills of nitric acid solutions containing uranium and radioactive decay products around the main processing building (MPB), leakage from storage ponds west of the MPB, and leaching of radioactive materials from discarded equipment and waste containers contaminated both the TGWS and SGWS aquifers during facility operation between 1970 and 1993. Constructing 3D geospatial property models for analysis of groundwater contamination at the site involved use of EarthVision (EV), 3D geospatial modeling software developed by Dynamic Graphics, Inc. of Alameda, CA. A viable 3D geohydrologic framework model was initially constructed so property data could be spatially located relative to subsurface geohydrologic units. The framework model contained three hydrostratigraphic zones, equivalent to the TGWS, SGWS, and DGWS aquifers in which groundwater samples were collected, separated by two sandstone aquitards.
Groundwater data collected in the three aquifer systems since 1991 indicated high concentrations of uranium (>10,000 micrograms/liter) and nitrate (>500 milligrams/liter) around the MPB and elevated nitrate (>2,000 milligrams/liter) around storage ponds. Vertical connectivity was suggested between the TGWS and SGWS, while the DGWS appeared relatively isolated from the overlying aquifers. Lateral movement of uranium was also suggested over time. For example, lateral migration in the TGWS is suggested along a shallow depression in the bedrock surface trending south-southwest from the southwest corner of the MPB. Another pathway atop the buried bedrock surface, trending west-northwest from the MPB and partially reflected by current surface topography, suggested lateral migration of nitrate in the SGWS. Lateral movement of nitrate in the SGWS was also indicated north, south, and west of the largest storage pond. Definition of contaminant plume movement over time is particularly important for assessing direction and rate of migration and the potential need for preventive measures to control contamination of groundwater outside facility property lines. The 3D geospatial property models proved invaluable for visualizing and analyzing variations in subsurface uranium and nitrate contamination in space and time within and between the three aquifers at the site. The models were an exceptional visualization tool for illustrating the extent, volume, and quantitative amounts of uranium and nitrate contamination in the subsurface to regulatory decision-makers in regard to site decommissioning issues, including remediation concerns, providing a perspective not possible to achieve with traditional 2D maps. The geohydrologic framework model provides a conceptual model for consideration in flow and transport analyses.
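The gridding step behind such property models, distributing sparse well measurements through a hydrostratigraphic framework, can be illustrated with simple inverse-distance weighting. EarthVision uses its own gridding algorithms, so this is a generic stand-in, and the well locations and concentrations below are invented.

```python
import numpy as np

def idw(xy_samples, values, xy_targets, power=2.0):
    """Inverse-distance-weighted estimate at each target point."""
    d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)                 # guard against division by zero
    w = 1.0 / d ** power
    return (w @ values) / w.sum(axis=1)

# Hypothetical uranium concentrations (micrograms/liter) at four wells
wells = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
u_conc = np.array([12000.0, 800.0, 600.0, 50.0])

# Three grid points at increasing distance from the high-concentration well
grid = np.array([[10.0, 10.0], [50.0, 50.0], [90.0, 90.0]])
est = idw(wells, u_conc, grid)
print(np.round(est, 1))
```

Interpolating concentrations zone by zone within the framework model, rather than across aquitards, is what keeps the estimated plume geometry hydrostratigraphically plausible.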

  3. Quantifying Subsurface Water and Heat Distribution and its Linkage with Landscape Properties in Terrestrial Environment using Hydro-Thermal-Geophysical Monitoring and Coupled Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Dafflon, B.; Tran, A. P.; Wainwright, H. M.; Hubbard, S. S.; Peterson, J.; Ulrich, C.; Williams, K. H.

    2015-12-01

    Quantifying water and heat fluxes in the subsurface is crucial for managing water resources and for understanding the terrestrial ecosystem, where hydrological properties drive a variety of biogeochemical processes across a large range of spatial and temporal scales. Here, we present the development of an advanced monitoring strategy in which hydro-thermal-geophysical datasets are continuously acquired and then used in a novel inverse modeling framework to estimate the hydraulic and thermal parameters that control heat and water dynamics in the subsurface and further influence surface processes such as evapotranspiration and vegetation growth. The measured and estimated soil properties are also used to investigate co-interaction between subsurface and surface dynamics by using above-ground aerial imaging. The value of this approach is demonstrated at two different sites: one in the polygonal Arctic tundra, where water and heat dynamics have a strong impact on freeze-thaw processes, vegetation, and biogeochemical processes, and one in a floodplain along the Colorado River, where hydrological fluxes between compartments of the system (surface, vadose zone, and groundwater) drive biogeochemical transformations. Results show that the developed strategy, using geophysical, point-scale, and aerial measurements, succeeds in delineating the spatial distribution of hydrostratigraphic units having distinct physicochemical properties, in monitoring and quantifying water and heat distribution at high resolution together with its linkage to vegetation, geomorphology, and weather conditions, and in estimating hydraulic and thermal parameters for enhanced predictions of water and heat fluxes as well as evapotranspiration. Further, in the Colorado floodplain, results suggest that periodic infiltration pulses act as a key hot moment controlling soil hydrological and biogeochemical functioning.
In the Arctic, results show the strong linkage between soil water content, thermal parameters, thaw-layer thickness, and vegetation distribution. Overall, the results of these efforts demonstrate the value of coupling various datasets at high spatial and temporal resolution to improve predictive understanding of subsurface and surface dynamics.
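The physical basis for using temperature as a tracer of water flux, which underlies the coupled hydro-thermal inversion described above, is captured by the classical steady-state conduction-advection solution of Bredehoeft and Papadopulos (1965). The sketch below uses that solution with assumed, generic parameters, not values from either field site.

```python
import numpy as np

def temperature_profile(z, L, t_top, t_bot, q,
                        rho_c_w=4.18e6,       # J/m^3/K, volumetric heat capacity of water
                        kappa=1.5):           # W/m/K, bulk thermal conductivity (assumed)
    """Steady temperature at depth z (0..L) for downward Darcy flux q (m/s)."""
    pe = rho_c_w * q * L / kappa              # thermal Peclet number
    if abs(pe) < 1e-12:                       # pure-conduction limit: linear profile
        return t_top + (t_bot - t_top) * z / L
    return t_top + (t_bot - t_top) * (np.expm1(pe * z / L) / np.expm1(pe))

z, L = 10.0, 20.0
t_cond = temperature_profile(z, L, 10.0, 12.0, q=0.0)
t_adv = temperature_profile(z, L, 10.0, 12.0, q=1e-7)   # roughly 3 m/yr downward
print(f"conduction only: {t_cond:.2f} C, with downward flux: {t_adv:.2f} C")
```

Downward flow drags the cooler surface temperature deeper, bowing the profile away from the linear conductive case; fitting that curvature is one way a joint inversion recovers the water flux.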

  4. Vertical groundwater flow in Permo-Triassic sediments underlying two cities in the Trent River Basin (UK)

    NASA Astrophysics Data System (ADS)

    Taylor, R. G.; Cronin, A. A.; Trowsdale, S. A.; Baines, O. P.; Barrett, M. H.; Lerner, D. N.

    2003-12-01

    The vertical component of groundwater flow that is responsible for advective penetration of contaminants in sandstone aquifers is poorly understood. This lack of knowledge is of particular concern in urban areas, where abstraction disrupts natural groundwater flow regimes and there is an increased density of contaminant sources. Vertical hydraulic gradients that control vertical groundwater flow were investigated using bundled multilevel piezometers and a double-packer assembly in dedicated boreholes constructed to depths of between 50 and 92 m below ground level in Permo-Triassic sediments underlying two cities within the Trent River Basin of central England (Birmingham, Nottingham). The hydrostratigraphy of the Permo-Triassic sediments, indicated by geophysical logging and hydraulic (packer) testing, demonstrates considerable control over observed vertical hydraulic gradients and, hence, vertical groundwater flow. The direction and magnitude of vertical hydraulic gradients recorded in multilevel piezometers and packers are broadly complementary and range, within error, from +0.1 to -0.7. Groundwater is generally found to flow vertically toward transmissive zones within the hydrostratigraphical profile, though urban abstraction from the Sherwood Sandstone aquifer also influences observed vertical hydraulic gradients. Bulk downward Darcy velocities at two locations affected by abstraction are estimated to be on the order of several metres per year. Consistency in the distribution of hydraulic head with depth in Permo-Triassic sediments is observed over a one-year period and adds support to the deduction of hydrostratigraphic control over vertical groundwater flow.
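Bulk downward Darcy velocities of the kind quoted above follow directly from Darcy's law, q = K * i. The values below are illustrative orders of magnitude for a sandstone, not the measured Birmingham or Nottingham data.

```python
# Assumed, illustrative values (not from the study):
K_v = 0.05          # vertical hydraulic conductivity, m/day
gradient = 0.1      # downward vertical hydraulic gradient (within the reported 0.1-0.7 range)

# Darcy flux (bulk velocity through the whole cross-section)
q = K_v * gradient  # m/day
print(f"Darcy velocity = {q * 365.25:.1f} m/yr")

# The advective (seepage) velocity through the pores is faster by 1/porosity
porosity = 0.25
v = q / porosity
print(f"seepage velocity = {v * 365.25:.1f} m/yr")
```

With these assumptions the Darcy velocity comes out near 2 m/yr, consistent with the "several metres per year" order of magnitude reported; contaminant fronts move at the faster seepage velocity.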

  5. A regional groundwater-flow model for sustainable groundwater-resource management in the south Asian megacity of Dhaka, Bangladesh

    NASA Astrophysics Data System (ADS)

    Islam, Md Bayzidul; Firoz, A. B. M.; Foglia, Laura; Marandi, Andres; Khan, Abidur Rahman; Schüth, Christoph; Ribbe, Lars

    2017-05-01

    The water resources that supply most of the megacities in the world are under increased pressure because of land transformation, population growth, rapid urbanization, and climate-change impacts. Dhaka, in Bangladesh, is one of the largest of 22 growing megacities in the world, and it depends mainly on groundwater for all kinds of water needs. A regional groundwater-flow model built with MODFLOW-2005 was used to simulate the interaction between aquifers and rivers in steady-state and transient conditions during the period 1981-2013, to assess the impact of development and climate change on the regional groundwater resources. Detailed hydro-stratigraphic units are described on the basis of 150 lithology logs, and a three-dimensional model of the upper 400 m of the Greater Dhaka area was constructed. The results explain how the total abstraction (2.9 million m3/d) in the Dhaka megacity, which has caused regional cones of depression, is balanced by recharge and induced river leakage. The simulated outcome shows the general trend of groundwater flow in the sedimentary Holocene aquifers under a variety of hydrogeological conditions, which will assist in the future development of a rational and sustainable management approach.

  6. Multi-scale approach for 3D hydrostratigraphic and groundwater flow modelling of Milan (Northern Italy) urban aquifers.

    NASA Astrophysics Data System (ADS)

    De Caro, Mattia; Crosta, Giovanni; Frattini, Paolo; Perico, Roberta

    2017-04-01

    During the last century, urban groundwater was heavily exploited for public and industrial supply. As the water demands of industry have fallen, many cities are now experiencing rising groundwater levels, with consequent concerns about localized flooding of basements, reduction of soil bearing capacities under foundations, soil internal erosion, and the mobilization of contaminants. The city of Milan (Northern Italy) draws water for domestic and industrial purposes from aquifers beneath the urban area. The rate of abstraction has varied during the last century, depending upon the number of inhabitants and the development of industrial activities. Groundwater abstraction rose to a maximum of about 350 million m3/yr in the middle 1970s and has since decreased to about 230 million m3/yr at present. This has caused a water-table rise at an average rate of about 1 m/yr, inducing infiltration and flooding of deep constructions (e.g., building foundations and basements, underground subway excavations). Starting from a large hydrostratigraphic database (8,628 borehole logs), a multi-scale approach was used for the reconstruction of the aquifer geometry (unconfined and semi-confined) at regional scale. First, a three-level hierarchical classification of the lithologies (lithofacies, hydrofacies, aquifer groups) was adopted. Then, several 2D cross-sections were interpreted with the Target for ArcGIS exploration software. The interpretation of cross-sections was based on the characteristics of the depositional environments of the analysed aquifers (from meandering plain to proximal outwash deposits), on the position of Quaternary deposits, and on the distribution of geochemical parameters (i.e., indicator contaminants and major ions). Finally, the aquifer boundary surfaces were interpolated with standard algorithms.
The hydraulic properties of the analysed aquifers were estimated through analyses of available step-drawdown tests (the Theis solution with superposition) for the semi-confined aquifers and through empirical relationships based on grain-size distribution data for the unconfined aquifers. Finally, 3D finite element groundwater flow models were developed at both regional and local "metropolitan" scales. The regional model covers an area of 3,135.15 km2, while the local model comprises the Milan metropolitan area with an extent of 457 km2. Both models were discretized into a triangular finite element mesh with local refinement near pumping wells. The element size ranges from 30 to 350 meters for the regional model and from 2 to 200 meters for the local model. Calibration was performed by comparison with the available water-level data for different years (1994 to 2016). The calibrated permeability values are consistent with the estimated ones, and the sensitivity analysis on hydraulic parameters suggests a minor influence of the aquiclude layer separating the two aquifers. The challenge is to provide the basis for new types of applied outputs so that they may better inform strategic planning options, ground investigation, and abstraction strategies.
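A common empirical grain-size relation of the kind mentioned above is the Hazen formula, K = C * d10^2, with K in cm/s and d10 the grain diameter (mm) at which 10 percent of the sample is finer. It is used here as an assumed stand-in for whatever relation the study actually applied, with illustrative d10 values.

```python
def hazen_k(d10_mm, c=1.0):
    """Hydraulic conductivity in cm/s from effective grain size d10 (mm).

    Hazen's empirical relation; C is typically about 0.8-1.2 for clean sands
    with d10 between roughly 0.1 and 3 mm.
    """
    return c * d10_mm ** 2

# Hypothetical d10 values for two outwash lithofacies
k_fine_sand   = hazen_k(0.15)    # cm/s
k_coarse_sand = hazen_k(0.60)

print(f"fine sand: {k_fine_sand:.4f} cm/s, coarse sand: {k_coarse_sand:.3f} cm/s")
```

Because K scales with the square of d10, modest textural differences between lithofacies translate into order-of-magnitude conductivity contrasts, which is why grain-size data alone can usefully parameterize the unconfined aquifer.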

  7. Analysis of Geologic CO2 Sequestration at Farnham Dome, Utah, USA

    NASA Astrophysics Data System (ADS)

    Lee, S.; Han, W.; Morgan, C.; Lu, C.; Esser, R.; Thorne, D.; McPherson, B.

    2008-12-01

    The Farnham Dome in east-central Utah is an elongated, Laramide-age anticline along the northern plunge of the San Rafael uplift and the western edge of the Uinta Basin. We are helping design a proposed field demonstration of commercial-scale geologic CO2 sequestration, including injection of 2.9 million tons of CO2 over four years. The Farnham Dome pilot site stratigraphy includes a stacked system of saline formations alternating with low-permeability units. Facilitating the potential sequestration demonstration is a natural CO2 reservoir at depth, the Jurassic-age Navajo formation, which contains an estimated 50 million tons of natural CO2. The sequestration test design includes two deep formations suitable for supercritical CO2 injection, the Jurassic-age Wingate sandstone and the Permian-age White Rim sandstone. We developed a site-specific geologic model based on available geophysical well logs and formation-tops data for use with numerical simulation. The current geologic model is limited to an area of approximately 6.5 x 4.5 km and is 2.5 km thick; it contains 12 stacked formations starting with the White Rim formation at the bottom (>5,000 feet below ground level) and extending to the Jurassic Curtis formation at the top of the model grid. With the detail of the geologic model, we are able to estimate the Farnham Dome CO2 capacity at approximately 36.5 million tons within a 5-mile radius of a single injection well. Numerical simulations of multiphase, non-isothermal CO2 injection and flow suggest that the injected CO2 plume will not intersect nearby fault zones mapped in previous geologic studies. Our simulations also examine and compare the competing roles of different trapping mechanisms, including hydrostratigraphic, residual gas, solubility, and mineralization trapping.
Previous studies of soil gas flux at the surface of the fault zones yield no significant evidence of CO2 leakage from the natural reservoir at Farnham Dome, and thus we use these simulations to evaluate what factors make this natural reservoir so effective for CO2 storage. Our characterization and simulation efforts are producing a CO2 sequestration framework that incorporates production and capacity estimation, area-of-review, injectivity, and trapping mechanisms. Likewise, mitigation and monitoring strategies have been formulated from the site characterization and modeling results.
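The capacity figure above comes from the authors' detailed geologic model and reservoir simulations, which are not reproduced here. As a rough plausibility sketch only, a static volumetric estimate of CO2 capacity within a fixed radius of an injection well can be written as below; every parameter value (net thickness, porosity, CO2 density, storage efficiency) is an assumption for illustration, not a value from the study.

```python
import math

def co2_capacity_tonnes(radius_m, net_thickness_m, porosity,
                        co2_density_kg_m3, efficiency):
    """Static volumetric capacity: pore volume inside a cylindrical
    area-of-review, times CO2 density, times a storage-efficiency factor."""
    area_m2 = math.pi * radius_m ** 2
    pore_volume_m3 = area_m2 * net_thickness_m * porosity
    return pore_volume_m3 * co2_density_kg_m3 * efficiency / 1000.0  # kg -> t

# Illustrative inputs only: a 5-mile (~8047 m) radius and assumed reservoir values.
capacity = co2_capacity_tonnes(radius_m=8047.0, net_thickness_m=60.0,
                               porosity=0.15, co2_density_kg_m3=700.0,
                               efficiency=0.02)
print(f"{capacity / 1e6:.1f} million tonnes")
```

A dynamic simulation of the kind used in the study accounts for pressure buildup, plume migration, and trapping mechanisms, which a static estimate like this cannot.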

  8. Hydrostratigraphic interpretation of test-hole and surface geophysical data, Elkhorn and Loup River Basins, Nebraska, 2008 to 2011

    USGS Publications Warehouse

    Hobza, Christopher M.; Bedrosian, Paul A.; Bloss, Benjamin R.

    2012-01-01

    The Elkhorn-Loup Model (ELM) was begun in 2006 to understand the effect of various groundwater-management scenarios on surface-water resources. During phase one of the ELM study, a lack of subsurface geological information was identified as a data gap. Test holes drilled to the base of the aquifer in the ELM study area are spaced as much as 25 miles apart, especially in areas of the western Sand Hills. Given the variable character of the hydrostratigraphic units that compose the High Plains aquifer system, substantial variation in aquifer thickness and characteristics can exist between test holes. To improve the hydrogeologic understanding of the ELM study area, the U.S. Geological Survey, in cooperation with the Nebraska Department of Natural Resources, multiple Natural Resources Districts participating in the ELM study, and the University of Nebraska-Lincoln Conservation and Survey Division, described the subsurface lithology at six test holes drilled in 2010 and concurrently collected borehole geophysical data to identify the base of the High Plains aquifer system. A total of 124 time-domain electromagnetic (TDEM) soundings of resistivity were collected at and between selected test-hole locations during 2008-11 as a quick, non-invasive means of identifying the base of the High Plains aquifer system. Test-hole drilling and geophysical logging indicated the base-of-aquifer elevation was less variable in the central ELM area than in previously reported results from the western part of the ELM study area, where deeper paleochannels were eroded into the Brule Formation. In total, more than 435 test holes were examined and compared with the modeled-TDEM soundings. 
Even where present, individual stratigraphic units could not always be identified in modeled-TDEM sounding results if sufficient resistivity contrast was not evident; however, in general, the base of aquifer [top of the aquifer confining unit (ACU)] is one of the best-resolved results from the TDEM-based models, and estimates of the base-of-aquifer elevation are in good agreement with those from existing test-hole data. Differences between ACU elevations based on modeled-TDEM and test-hole data ranged from 2 to 113 feet (0.6 to 34 meters). The modeled resistivity results reflect the eastward thinning of Miocene-age and older stratigraphic units, and generally allowed confident identification of the accompanying change in the stratigraphic unit forming the ACU. Differences between the elevation of the top of the Ogallala estimated from the modeled-TDEM resistivity and that from the test-hole data ranged from 11 to 251 feet (3.4 to 77 meters), with two-thirds of model results being within 60 feet of the test-hole contact elevation. The modeled-TDEM soundings also provided information regarding the distribution of Plio-Pleistocene gravel deposits, which had an average thickness of 100 feet (30 meters) in the study area; however, in many cases the contact between the Plio-Pleistocene deposits and the overlying Quaternary deposits cannot be reliably distinguished using TDEM soundings alone because of insufficient thickness or resistivity contrast.
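The comparison described above, between contact elevations picked from modeled-TDEM soundings and those logged in nearby test holes, amounts to simple paired differencing. A minimal sketch (all elevation values below are invented for illustration, not taken from the report):

```python
def elevation_differences(modeled_ft, test_hole_ft):
    """Absolute elevation difference (feet) for each modeled/test-hole pair."""
    return [abs(m - t) for m, t in zip(modeled_ft, test_hole_ft)]

# Hypothetical paired picks of a stratigraphic contact (feet above datum):
modeled = [2410.0, 2388.0, 2451.0, 2575.0]
test_hole = [2395.0, 2401.0, 2449.0, 2502.0]

diffs = elevation_differences(modeled, test_hole)
within_60_ft = sum(d <= 60.0 for d in diffs) / len(diffs)
print(diffs, within_60_ft)
```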

  9. Aquifer Vulnerability Assessment Based on Sequence Stratigraphic and ³⁹Ar Transport Modeling.

    PubMed

    Sonnenborg, Torben O; Scharling, Peter B; Hinsby, Klaus; Rasmussen, Erik S; Engesgaard, Peter

    2016-03-01

    A large-scale groundwater flow and transport model is developed for a deep-seated (100 to 300 m below ground surface) sedimentary aquifer system. The model is based on a three-dimensional (3D) hydrostratigraphic model, building on a sequence stratigraphic approach. The flow model is calibrated against observations of hydraulic head and stream discharge, while the credibility of the transport model is evaluated against measurements of ³⁹Ar from deep wells using alternative parameterizations of dispersivity and effective porosity. The directly simulated 3D mean age distributions and vertical fluxes are used to visualize the two-dimensional (2D)/3D age and flux distribution along transects and at the top plane of individual aquifers. The simulation results are used to assess the vulnerability of the aquifer system, which has generally been assumed to be protected by thick overlying clayey units and has therefore been proposed as a future reservoir for drinking water supply. The results indicate that on a regional scale these deep-seated aquifers are not as protected from modern surface water contamination as expected because significant leakage to the deeper aquifers occurs. The complex distribution of local and intermediate groundwater flow systems, controlled by the distribution of the river network as well as the topographical variation (Tóth 1963), provides the possibility for modern water to be found in even the deepest aquifers. © 2015, National Ground Water Association.
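The transport model above simulates ³⁹Ar directly in 3D. For intuition only, the simplest lumped interpretation of a ³⁹Ar measurement is a piston-flow age from radioactive decay (³⁹Ar half-life about 269 years); this sketch ignores the dispersion and mixing that the paper's transport model explicitly accounts for.

```python
import math

AR39_HALF_LIFE_YR = 269.0  # 39Ar half-life in years

def ar39_piston_flow_age(fraction_modern):
    """Apparent age assuming pure piston flow: A = A0 * exp(-lambda * t),
    so t = -(T_half / ln 2) * ln(A / A0)."""
    return -AR39_HALF_LIFE_YR / math.log(2.0) * math.log(fraction_modern)

print(round(ar39_piston_flow_age(0.5)))   # one half-life: 269 years
print(round(ar39_piston_flow_age(0.25)))  # two half-lives: 538 years
```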

  10. Preliminary geologic framework developed for a proposed environmental monitoring study of a deep, unconventional Marcellus Shale drill site, Washington County, Pennsylvania

    USGS Publications Warehouse

    Stamm, Robert G.

    2018-06-08

    Background: In the fall of 2011, the U.S. Geological Survey (USGS) was afforded an opportunity to participate in an environmental monitoring study of the potential impacts of a deep, unconventional Marcellus Shale hydraulic fracturing site. The drill site of the prospective case study is the “Range Resources MCC Partners L.P. Units 1-5H” location (also referred to as the “RR–MCC” drill site), located in Washington County, southwestern Pennsylvania. Specifically, the USGS was approached to provide a geologic framework that would (1) provide geologic parameters for the proposed area of a localized groundwater circulation model, and (2) provide potential information for the siting of both shallow and deep groundwater monitoring wells located near the drill pad and the deviated drill legs. The lead organization of the prospective case study of the RR–MCC drill site was the Groundwater and Ecosystems Restoration Division (GWERD) of the U.S. Environmental Protection Agency. Aside from the USGS, additional partners/participants were to include the Department of Energy, the Pennsylvania Geological Survey, the Pennsylvania Department of Environmental Protection, and the developer Range Resources LLC. During the initial cooperative phase, GWERD, with input from the participating agencies, drafted a Quality Assurance Project Plan (QAPP) that set out the objectives, tasks, sampling and analytical procedures, and documentation of results. Later in 2012, the proposed cooperative agreement between the aforementioned partners and the associated land owners for a monitoring program at the drill site was not executed. The prospective case study of the RR–MCC site was therefore terminated, and neither the installation of groundwater monitoring wells nor the collection of nearby soil, stream-sediment, and surface-water samples took place. Prior to the completion of the QAPP and the termination of the prospective case study, the geologic framework study was rapidly conducted and nearly completed.
This was done for three principal reasons. First, there was an immediate need to know the distribution of the relatively undisturbed surface to near-surface bedrock geology and unconsolidated materials for the collection of baseline surface data prior to drill-site development (drill-pad access road, drill-pad leveling) and later during monitoring associated with well drilling, well development, and well production. Second, it was necessary to know the bedrock geology to support the siting of (1) multiple shallow groundwater monitoring wells (possibly as many as four) surrounding and located immediately adjacent to the drill pad, and (2) deep groundwater monitoring wells (possibly two) located at a distance from the drill pad, with one possibly sited along one of the deviated production drill legs. Lastly, the framework geology would provide the lateral extent, thickness, lithology, and expected discontinuities of geologic units (to be parsed or grouped as hydrostratigraphic units) and regional structural trends as inputs to the groundwater model. This report describes the methodology of geologic data accumulation and aggregation, and its integration into a geographic information system (GIS) based program. The GIS program allows the data to be exported in various formats (shapefiles [.shp], database files [.dbf], and Keyhole Markup Language files [.KML]) for use in surface and subsurface geologic site characterization, in sampling strategies, and as inputs for groundwater modeling.

  11. Effects of different boundary conditions on the simulation of groundwater flow in a multi-layered coastal aquifer system (Taranto Gulf, southern Italy)

    NASA Astrophysics Data System (ADS)

    De Filippis, Giovanna; Foglia, Laura; Giudici, Mauro; Mehl, Steffen; Margiotta, Stefano; Negri, Sergio L.

    2017-11-01

    The evaluation of the accuracy or reasonableness of numerical models of groundwater flow is a complex task, owing to uncertainties in hydrodynamic properties and boundary conditions and to the scarcity of good-quality field data. To assess model reliability, different calibration techniques are combined to evaluate the effects of different kinds of boundary conditions on groundwater flow in a coastal multi-layered aquifer in southern Italy. In particular, direct and indirect approaches to inverse modeling were combined through the calibration of one of the most uncertain parameters, the hydraulic conductivity of the deep karst hydrostratigraphic unit. The methodology proposed here, applied to a real case study, confirmed that the selection of boundary conditions is among the most critical and difficult aspects of characterizing a groundwater system for conceptual analysis or numerical simulation. The practical tests conducted in this study show that incorrect specification of boundary conditions prevents an acceptable match between the model response to hydraulic stresses and the behavior of the natural system. Such effects have a negative impact on the applicability of numerical modeling to simulate groundwater dynamics in complex hydrogeological settings. This is particularly important for management of the aquifer system investigated in this work, which represents the only available freshwater resource of the study area and is threatened by overexploitation and saltwater intrusion.
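As a toy illustration of why boundary conditions matter so much, consider one-dimensional steady flow in a homogeneous aquifer: swapping the downgradient boundary from a fixed head to a no-flow condition changes the entire head distribution. This is a didactic sketch with invented numbers, not a model of the Taranto Gulf system.

```python
def solve_heads(n, left_head, right_bc, iters=20000):
    """Jacobi iteration for 1D steady-state heads (Laplace equation).
    right_bc is ('head', value) for a fixed head or ('noflow',) for dh/dx = 0."""
    h = [left_head] * n
    for _ in range(iters):
        new = h[:]
        for i in range(1, n - 1):
            new[i] = 0.5 * (h[i - 1] + h[i + 1])
        new[-1] = right_bc[1] if right_bc[0] == 'head' else new[-2]
        h = new
    return h

fixed_head = solve_heads(11, 100.0, ('head', 90.0))  # head gradient, active flow
no_flow = solve_heads(11, 100.0, ('noflow',))        # static system, flat heads
print(fixed_head[5], no_flow[5])
```

With the fixed-head boundary the solution is a linear gradient (midpoint head 95); with the no-flow boundary and no recharge the system is static and heads stay flat at 100.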

  12. Phase I Flow and Transport Model Document for Corrective Action Unit 97: Yucca Flat/Climax Mine, Nevada National Security Site, Nye County, Nevada, Revision 1 with ROTCs 1 and 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Robert

    The Underground Test Area (UGTA) Corrective Action Unit (CAU) 97, Yucca Flat/Climax Mine, in the northeast part of the Nevada National Security Site (NNSS) requires environmental corrective action activities to assess contamination resulting from underground nuclear testing. These activities are necessary to comply with the UGTA corrective action strategy (referred to as the UGTA strategy). The corrective action investigation phase of the UGTA strategy requires the development of groundwater flow and contaminant transport models whose purpose is to identify the lateral and vertical extent of contaminant migration over the next 1,000 years. In particular, the goal is to calculate the contaminant boundary, which is defined as a probabilistic model-forecast perimeter and a lower hydrostratigraphic unit (HSU) boundary that delineate the possible extent of radionuclide-contaminated groundwater from underground nuclear testing. Because of structural uncertainty in the contaminant boundary, a range of potential contaminant boundaries was forecast, resulting in an ensemble of contaminant boundaries. The contaminant boundary extent is determined by the volume of groundwater that has at least a 5 percent chance of exceeding the radiological standards of the Safe Drinking Water Act (SDWA) (CFR, 2012).

  13. Magnetotelluric Data, Mid Valley, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Williams, Jackie M.; Wallin, Erin L.; Rodriguez, Brian D.; Lindsey, Charles R.; Sampson, Jay A.

    2007-01-01

    Introduction: The United States Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office (NSO) are addressing ground-water contamination resulting from historical underground nuclear testing through the Environmental Management (EM) program and, in particular, the Underground Test Area (UGTA) project. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Rainier Mesa/Shoshone Mountain Corrective Action Unit (CAU). During 2003, the U.S. Geological Survey (USGS), in cooperation with the DOE and NNSA-NSO, collected and processed data at the Nevada Test Site in and near Yucca Flat (YF) to help define the character, thickness, and lateral extent of the pre-Tertiary confining units. We collected 51 magnetotelluric (MT) and audio-magnetotelluric (AMT) stations for that research. In early 2005 we extended that research with 26 additional MT data stations, located on and near Rainier Mesa and Shoshone Mountain (RM-SM). The new stations extended the area of the hydrogeologic study previously conducted in Yucca Flat. This work was done to help refine what is known about the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal was to define the upper clastic confining unit (UCCU). The UCCU comprises late Devonian to Mississippian siliciclastic rocks assigned to the Eleana Formation and Chainman Shale. The UCCU underlies the Yucca Flat area and extends westward towards Shoshone Mountain, southward to Buckboard Mesa, and northward to Rainier Mesa. Late in 2005 we collected another 14 MT stations in Mid Valley and in northern Yucca Flat basin.
That work was done to better determine the extent and thickness of the UCCU near the southeastern RM-SM CAU boundary with the southwestern YF CAU, and also in the northern YF CAU. The purpose of this report is to release the MT data at those 14 stations. No interpretation of the data is included here.

  14. Geologic Characterization of Young Alluvial Basin-Fill Deposits from Drill-Hole Data in Yucca Flat, Nye County, Nevada

    USGS Publications Warehouse

    Sweetkind, Donald S.; Drake II, Ronald M.

    2007-01-01

    Yucca Flat is a topographic and structural basin in the northeastern part of the Nevada Test Site in Nye County, Nevada, that has been the site of numerous underground nuclear tests; many of these tests occurred within the young alluvial basin-fill deposits. The migration of radionuclides to the Paleozoic carbonate aquifer involves passage through this thick, heterogeneous section of Tertiary and Quaternary rock. An understanding of the lateral and vertical changes in the material properties of young alluvial basin-fill deposits will aid in the further development of the hydrogeologic framework and the delineation of hydrostratigraphic units and hydraulic properties required for simulating ground-water flow in the Yucca Flat area. This report by the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, presents data and interpretation regarding the three-dimensional variability of the shallow alluvial aquifers in areas of testing at Yucca Flat, data that are potentially useful in the understanding of the subsurface flow system. This report includes a summary and interpretation of alluvial basin-fill stratigraphy in the Yucca Flat area based on drill-hole data from 285 selected drill holes. Spatial variations in lithology and grain size of the Neogene basin-fill sediments can be established when data from numerous drill holes are considered together. Lithologic variations are related to different depositional environments within the basin such as alluvial fan, channel, basin axis, and playa deposits.

  15. Geologic Characterization of Young Alluvial Basin-Fill Deposits from Drill Hole Data in Yucca Flat, Nye County, Nevada

    USGS Publications Warehouse

    Sweetkind, Donald S.; Drake II, Ronald M.

    2007-01-01

    Yucca Flat is a topographic and structural basin in the northeastern part of the Nevada Test Site (NTS) in Nye County, Nevada, that has been the site of numerous underground nuclear tests; many of these tests occurred within the young alluvial basin-fill deposits. The migration of radionuclides to the Paleozoic carbonate aquifer involves passage through this thick, heterogeneous section of Tertiary and Quaternary rock. An understanding of the lateral and vertical changes in the material properties of young alluvial basin-fill deposits will aid in the further development of the hydrogeologic framework and the delineation of hydrostratigraphic units and hydraulic properties required for simulating ground-water flow in the Yucca Flat area. This report by the U.S. Geological Survey, in cooperation with the U.S. Department of Energy, presents data and interpretation regarding the three-dimensional variability of the shallow alluvial aquifers in areas of testing at Yucca Flat, data that are potentially useful in the understanding of the subsurface flow system. This report includes a summary and interpretation of alluvial basin-fill stratigraphy in the Yucca Flat area based on drill hole data from 285 selected drill holes. Spatial variations in lithology and grain size of the Neogene basin-fill sediments can be established when data from numerous drill holes are considered together. Lithologic variations are related to different depositional environments within the basin including alluvial fan, channel, basin axis, and playa deposits.

  16. Constructive Epistemic Modeling: A Hierarchical Bayesian Model Averaging Method

    NASA Astrophysics Data System (ADS)

    Tsai, F. T. C.; Elshall, A. S.

    2014-12-01

    Constructive epistemic modeling is the idea that our understanding of a natural system through a scientific model is a mental construct that continually develops through learning about and from the model. Using the hierarchical Bayesian model averaging (HBMA) method [1], this study shows that segregating different uncertain model components through a BMA tree of posterior model probabilities, model prediction, within-model variance, between-model variance and total model variance serves as a learning tool [2]. First, the BMA tree of posterior model probabilities permits the comparative evaluation of the candidate propositions of each uncertain model component. Second, systemic model dissection is imperative for understanding the individual contribution of each uncertain model component to the model prediction and variance. Third, the hierarchical representation of the between-model variance facilitates the prioritization of the contribution of each uncertain model component to the overall model uncertainty. We illustrate these concepts using the groundwater modeling of a siliciclastic aquifer-fault system. The sources of uncertainty considered are from geological architecture, formation dip, boundary conditions and model parameters. The study shows that the HBMA analysis helps in advancing knowledge about the model rather than forcing the model to fit a particular understanding or merely averaging several candidate models. [1] Tsai, F. T.-C., and A. S. Elshall (2013), Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation. Water Resources Research, 49, 5520-5536, doi:10.1002/wrcr.20428. [2] Elshall, A.S., and F. T.-C. Tsai (2014). Constructive epistemic modeling of groundwater flow with geological architecture and boundary condition uncertainty under Bayesian paradigm, Journal of Hydrology, 517, 105-119, doi: 10.1016/j.jhydrol.2014.05.027.
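The variance segregation at the heart of HBMA can be sketched in a few lines: at each node of the BMA tree, the predictive mean is a posterior-probability-weighted average over candidate propositions, and the total variance splits into a weighted within-model term plus a between-model term. All numbers below are invented for illustration.

```python
def bma_combine(predictions, variances, weights):
    """One node of a BMA tree: posterior-probability-weighted combination."""
    assert abs(sum(weights) - 1.0) < 1e-9
    mean = sum(w * p for w, p in zip(weights, predictions))
    within = sum(w * v for w, v in zip(weights, variances))  # E[Var]
    between = sum(w * (p - mean) ** 2
                  for w, p in zip(weights, predictions))     # Var[E]
    return mean, within, between, within + between

# Three hypothetical candidate propositions for one uncertain model component
# (e.g. geological architecture), with posterior probabilities as weights:
mean, within, between, total = bma_combine(predictions=[12.0, 15.0, 11.0],
                                           variances=[2.0, 3.0, 1.5],
                                           weights=[0.5, 0.3, 0.2])
print(mean, within, between, total)
```

Applying the same decomposition recursively, level by level, gives the hierarchical segregation of uncertainty sources that the HBMA papers describe.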

  17. 3D geological modeling for complex aquifer system conception and groundwater storage assessment: Case of Sisseb El Alem Nadhour Saouaf basin, northeastern Tunisia

    NASA Astrophysics Data System (ADS)

    Hamdi, Mohamed; Zagrarni, Mohamed Faouzi; Djamai, Najib; Jerbi, Hamza; Goita, Kalifa; Tarhouni, Jamila

    2018-07-01

    With the water-table drop, managers have become deeply concerned about the future sustainability of the groundwater resources of the Sisseb El Alem Nadhour Saouaf aquifer (SANS). In order to understand the groundwater flow dynamics and to assess the functioning of the aquifer system, a three-dimensional (3D) regional geological model of the SANS basin was constructed. The 3D geological model was developed by combining 2D seismic-reflection profiles, calibrated by wireline logging data from oil wells, hydraulic wells, and geological field sections. The 3D geological model shows that the Oligo-Neogene and Eocene aquifers in the study area exhibit important variations in geometry and cumulative thickness and are affected by intensive fracturing. The modeled stratigraphic units were combined with hydraulic properties to estimate the groundwater storage. The estimated storage in 2016 was around 11 × 10⁹ m³, down from 16 × 10⁹ m³ in 1971; thus, 30% of the previously stored groundwater was consumed in 45 years. A variable spatial distribution of storativity was also demonstrated, ranging from 1 × 10⁶ to 3.4 × 10⁶ m³/km². These results demonstrate the value of hydro-geophysical investigation and numerical modeling for depicting hydrostratigraphic trends, and suggest that the fate of groundwater resources in the SANS aquifer is more a matter of the spatial disparity of groundwater storage than of total quantity.
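The depletion figure quoted above follows directly from the two storage estimates; a one-line consistency check:

```python
def depletion_percent(initial_m3, current_m3):
    """Fraction of initial groundwater storage consumed, as a percentage."""
    return 100.0 * (initial_m3 - current_m3) / initial_m3

loss = depletion_percent(16e9, 11e9)  # 1971 vs 2016 storage estimates
print(f"{loss:.0f}%")  # 31%, i.e. roughly the 30% cited in the abstract
```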

  18. Magnetotelluric Data, Rainier Mesa/Shoshone Mountain, Nevada Test Site, Nevada.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackie M. Williams; Jay A. Sampson; Brian D. Rodriguez

    2006-11-03

    The United States Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office (NSO) are addressing ground-water contamination resulting from historical underground nuclear testing through the Environmental Management (EM) program and, in particular, the Underground Test Area (UGTA) project. From 1951 to 1992, 828 underground nuclear tests were conducted at the Nevada Test Site northwest of Las Vegas. Most of these tests were conducted hundreds of feet above the ground-water table; however, more than 200 of the tests were near or within the water table. This underground testing was limited to specific areas of the Nevada Test Site, including Pahute Mesa, Rainier Mesa/Shoshone Mountain, Frenchman Flat, and Yucca Flat. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology, and its effects on ground-water flow. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Rainier Mesa/Shoshone Mountain Corrective Action Unit (Bechtel Nevada, 2006). During 2005, the U.S. Geological Survey (USGS), in cooperation with the DOE and NNSA-NSO, collected and processed data from twenty-six magnetotelluric (MT) and audio-magnetotelluric (AMT) sites at the Nevada Test Site. The 2005 data stations were located on and near Rainier Mesa and Shoshone Mountain to assist in characterizing the pre-Tertiary geology in those areas. These new stations extend the area of the hydrogeologic study previously conducted in Yucca Flat. This work will help refine what is known about the character, thickness, and lateral extent of pre-Tertiary confining units.
In particular, a major goal has been to define the upper clastic confining unit (UCCU – late Devonian to Mississippian-age siliciclastic rocks assigned to the Eleana Formation and Chainman Shale) from the Yucca Flat area and west towards Shoshone Mountain, to Buckboard Mesa in the south, and onto Rainier Mesa in the north. Subsequent interpretation will include a three-dimensional (3-D) character analysis and a two-dimensional (2-D) resistivity model. The purpose of this report is to release the MT sounding data for the twenty-six stations shown in figure 1. No interpretation of the data is included here.

  19. The 2016 groundwater flow model for Dane County, Wisconsin

    USGS Publications Warehouse

    Parsen, Michael J.; Bradbury, Kenneth R.; Hunt, Randall J.; Feinstein, Daniel T.

    2016-01-01

    A new groundwater flow model for Dane County, Wisconsin, replaces an earlier model developed in the 1990s by the Wisconsin Geological and Natural History Survey (WGNHS) and the U.S. Geological Survey (USGS). This modeling study was conducted cooperatively by the WGNHS and the USGS with funding from the Capital Area Regional Planning Commission (CARPC). Although the overall conceptual model of the groundwater system remains largely unchanged, the incorporation of newly acquired high-quality datasets, recent research findings, and improved modeling and calibration techniques have led to the development of a more detailed and sophisticated model representation of the groundwater system. The new model is three-dimensional and transient, and conceptualizes the county’s hydrogeology as a 12-layer system including all major unlithified and bedrock hydrostratigraphic units and two high-conductivity horizontal fracture zones. Beginning from the surface down, the model represents the unlithified deposits as two distinct model layers (1 and 2). A single layer (3) simulates the Ordovician sandstone and dolomite of the Sinnipee, Ancell, and Prairie du Chien Groups. Sandstone of the Jordan Formation (layer 4) and silty dolostone of the St. Lawrence Formation (layer 5) each comprise separate model layers. The underlying glauconitic sandstone of the Tunnel City Group makes up three distinct layers: an upper aquifer (layer 6), a fracture feature (layer 7), and a lower aquifer (layer 8). The fracture layer represents a network of horizontal bedding-plane fractures that serve as a preferential pathway for groundwater flow. The model simulates the sandstone of the Wonewoc Formation as an upper aquifer (layer 9) with a bedding-plane fracture feature (layer 10) at its base. The Eau Claire aquitard (layer 11) includes shale beds within the upper portion of the Eau Claire Formation. 
This layer, along with overlying bedrock units, is mostly absent in the preglacially eroded valleys along the Yahara River valley and in northeastern Dane County. Layer 12 represents the Mount Simon sandstone as the lowermost model layer. It directly overlies the Precambrian crystalline basement rock, whose top surface forms the lower boundary of the model. The model uses the USGS MODFLOW-NWT finite-difference code, a standalone version of MODFLOW-2005 that incorporates the Newton (NWT) solver. MODFLOW-NWT improves the handling of unconfined conditions by smoothing the transition from wet to dry cells. The model explicitly simulates groundwater–surface-water interaction with streamflow routing and lake-level fluctuation. Model input included published and unpublished hydrogeologic data from recent estimates of aquifer hydraulic conductivities. A spatial groundwater recharge distribution was obtained from a recent GIS-based, soil-water-balance model for Dane County. Groundwater withdrawals from pumping were simulated for 572 wells across the entire model domain, which includes Dane County and portions of seven neighboring counties—Columbia, Dodge, Green, Iowa, Jefferson, Lafayette, and Rock. These wells withdrew an average of 60 million gallons per day (mgd) over the 5-year period from 2006 through 2010. Within Dane County, 385 wells were simulated with an average withdrawal rate of 52 mgd. Model calibration used the parameter estimation code PEST, and calibration targets included heads, stream and spring flows, lake levels, and borehole flows. Steady-state calibration focused on the period 2006 through 2010; the transient calibration focused on the 7-week drought period from late May through July 2012.
This model represents a significant step forward from previous work because of its finer grid resolution, improved hydrostratigraphic discretization, transient capabilities, and more sophisticated representation of surface-water features and multi-aquifer wells. Potential applications of the model include evaluation of potential sites for and impacts of new high-capacity wells, development of wellhead protection plans, evaluating the effects of changing land use and climate on groundwater, and quantifying the relationships between groundwater and surface water.
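For quick reference, the 12-layer conceptualization described above can be written out as a simple lookup table (unit names paraphrased from the text; this is a restatement for the reader, not actual model input):

```python
# Layer numbering follows the description in the abstract above.
DANE_COUNTY_LAYERS = {
    1: "unlithified deposits (upper)",
    2: "unlithified deposits (lower)",
    3: "Sinnipee/Ancell/Prairie du Chien sandstone and dolomite",
    4: "Jordan Formation sandstone",
    5: "St. Lawrence Formation silty dolostone",
    6: "Tunnel City Group upper aquifer",
    7: "Tunnel City Group bedding-plane fracture feature",
    8: "Tunnel City Group lower aquifer",
    9: "Wonewoc Formation upper aquifer",
    10: "Wonewoc Formation basal bedding-plane fracture feature",
    11: "Eau Claire aquitard (shale beds)",
    12: "Mount Simon sandstone",
}

fracture_layers = [k for k, v in DANE_COUNTY_LAYERS.items() if "fracture" in v]
print(fracture_layers)  # the two high-conductivity fracture zones: [7, 10]
```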

  20. Analysis of alluvial hydrostratigraphy using indicator geostatistics, with examples from Santa Clara Valley, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-03-01

    Current trends in hydrogeology seek to enlist sedimentary concepts in the interpretation of permeability structures. However, existing conceptual models of alluvial deposition tend to inadequately account for the heterogeneity caused by complex sedimentological and external factors. This dissertation presents three analyses of alluvial hydrostratigraphy using indicator geostatistics. This approach empirically acknowledges both the random and structured qualities of alluvial structures at scales relevant to site investigations. The first analysis introduces the indicator approach, whereby binary values are assigned to borehole-log intervals on the basis of inferred relative permeability; it presents a case study of indicator variography at a well-documented ground-water contamination site, and uses indicator kriging to interpolate an aquifer-aquitard sequence in three dimensions. The second analysis develops an alluvial-architecture context for interpreting semivariograms, and performs comparative variography for a suite of alluvial sites in Santa Clara Valley, California. The third analysis investigates the use of a water well perforation indicator for assessing large-scale hydrostratigraphic structures within relatively deep production zones.
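The indicator approach summarized in the first analysis reduces to two steps: binary coding of borehole-log intervals by inferred relative permeability, then empirical semivariography of the indicator values. A minimal sketch (the lithology log and the permeable-class membership below are invented for illustration):

```python
def indicator_code(lithologies, permeable=frozenset({"sand", "gravel"})):
    """Binary indicator: 1 for inferred high-permeability intervals, else 0."""
    return [1 if lith in permeable else 0 for lith in lithologies]

def indicator_semivariogram(values, lag):
    """Empirical semivariance of equally spaced indicator data at one lag."""
    pairs = [(values[i], values[i + lag]) for i in range(len(values) - lag)]
    return sum((a - b) ** 2 for a, b in pairs) / (2.0 * len(pairs))

log = ["clay", "sand", "sand", "gravel", "clay", "clay", "sand", "clay"]
ind = indicator_code(log)
print(ind)                              # [0, 1, 1, 1, 0, 0, 1, 0]
print(indicator_semivariogram(ind, 1))  # 4 mismatched pairs / (2 * 7)
```

Repeating the lag calculation over a range of separations produces the indicator semivariogram whose structure (nugget, range, sill) the dissertation interprets in terms of alluvial architecture.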

  1. UGTA Photograph Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NSTec Environmental Restoration

    One of the advantages of the Nevada Test Site (NTS) is that most of the geologic and hydrologic features such as hydrogeologic units (HGUs), hydrostratigraphic units (HSUs), and faults, which are important aspects of flow and transport modeling, are exposed at the surface somewhere in the vicinity of the NTS and thus are available for direct observation. However, due to access restrictions and the remote locations of many of the features, most Underground Test Area (UGTA) participants cannot observe these features directly in the field. Fortunately, National Security Technologies, LLC, geologists and their predecessors have photographed many of these features through the years. During fiscal year 2009, work was done to develop an online photograph database for use by the UGTA community. Photographs were organized, compiled, and imported into Adobe® Photoshop® Elements 7. The photographs were then assigned keyword tags such as alteration type, HGU, HSU, location, rock feature, rock type, and stratigraphic unit. Some fully tagged photographs were then selected and uploaded to the UGTA website. This online photograph database provides easy access for all UGTA participants and can help “ground truth” their analytical and modeling tasks. It also provides new participants a resource to more quickly learn the geology and hydrogeology of the NTS.

  2. Simulating groundwater flow and runoff for the Oro Moraine aquifer system. Part II. Automated calibration and mass balance calculations

    NASA Astrophysics Data System (ADS)

    Beckers, J.; Frind, E. O.

    2001-03-01

    A steady-state groundwater model of the Oro Moraine aquifer system in Central Ontario, Canada, is developed. The model is used to identify the role of baseflow in the water balance of the Minesing Swamp, a 70 km² wetland of international significance. Lithologic descriptions are used to develop a hydrostratigraphic conceptual model of the aquifer system. The numerical model uses long-term averages to represent temporal variations of the flow regime and includes a mechanism to redistribute recharge in response to near-surface geologic heterogeneity. The model is calibrated to water level and streamflow measurements through inverse modeling. Observed baseflow and runoff quantities validate the water mass balance of the numerical model and provide information on the fraction of the water surplus that contributes to groundwater flow. The inverse algorithm is used to compare alternative model zonation scenarios, illustrating the power of non-linear regression in calibrating complex aquifer systems. The adjoint method is used to identify sensitive recharge areas for groundwater discharge to the Minesing Swamp. Model results suggest that nearby urban development will have a significant impact on baseflow to the swamp. Although the direct baseflow contribution makes up only a small fraction of the total inflow to the swamp, it provides an important steady influx of water over relatively large portions of the wetland. Urban development will also impact baseflow to the headwaters of local streams. The model provides valuable insight into crucial characteristics of the aquifer system although definite conclusions regarding details of its water budget are difficult to draw given current data limitations. The model therefore also serves to guide future data collection and studies of sub-areas within the basin.

  3. Summary of the San Juan structural basin regional aquifer-system analysis, New Mexico, Colorado, Arizona, and Utah

    USGS Publications Warehouse

    Levings, G.W.; Kernodle, J.M.; Thorn, C.R.

    1996-01-01

    Ground-water resources are the only source of water in most of the San Juan structural basin and are mainly used for municipal, industrial, domestic, and stock purposes. Industrial use increased dramatically during the late 1970's and early 1980's because of increased exploration and development of uranium and coal resources. The San Juan structural basin is a northwest-trending, asymmetric structural depression at the eastern edge of the Colorado Plateau. The basin contains as much as 14,000 feet of sedimentary rocks overlying a Precambrian basement complex. The sedimentary rocks dip basinward from the basin margins toward the troughlike structural center, or deepest part of the basin. Rocks of Triassic age were selected as the lower boundary for the study. The basin is well defined by structural boundaries in many places with structural relief of as much as 20,000 feet reported. Faulting is prevalent in parts of the basin with displacement of several thousand feet along major faults. The regional aquifers in the basin generally are coincident with the geologic units that have been mapped. Data on the hydrologic properties of the regional aquifers are minimal. Most data were collected on those aquifers associated with uranium and coal resource production. These data are summarized in table format in the report. The regional flow system throughout most of the basin has been affected by the production of oil or gas and subsequent disposal of produced brine. To date, more than 26,000 oil- or gas-test holes have been drilled in the basin, the majority penetrating no deeper than the bottom of the Cretaceous rocks. The general water chemistry of the regional aquifers is based on available data. The depositional environments are the major factor controlling the quality of water in the units. The dominant ions are generally sodium, bicarbonate, and sulfate.
A detailed geochemical study of three sandstone aquifers--Morrison, Dakota, and Gallup--was undertaken in the northwestern part of the study area. Results of this study indicate that water chemistry changed in individual wells over short periods of time, not expected in a regional flow system. The chemistry of the water is affected by mixing of recharge, ion filtrate, or very dilute ancient water, and by leakage of saline water. The entire system of ground-water flow and its controlling factors has been defined as the conceptual model. A steady-state, three-dimensional ground-water flow model was constructed to simulate modern predevelopment flow in the post-Jurassic rocks of the regional flow system. In the ground-water flow model, 14 geologic units or combinations of geologic units were considered to be regional aquifers, and 5 geologic units or combinations of geologic units were considered to be regional confining units. The model simulated flow in 12 layers (hydrostratigraphic units) and used harmonic-mean vertical leakance to indirectly simulate aquifer connection across 3 other hydrostratigraphic confining units in addition to coupling the 12 units.
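
The harmonic-mean vertical leakance used to couple model layers across confining units can be illustrated with the standard resistance-in-series form. The values below are hypothetical, not inputs from the San Juan model:

```python
# Vertical leakance between two model layers separated by a confining unit,
# in the spirit of the MODFLOW-style formulation mentioned above.
# Units: vertical conductivity Kv in m/d, thickness b in m, leakance in 1/d.
def vertical_leakance(b_upper, kv_upper, b_conf, kv_conf, b_lower, kv_lower):
    """Leakance = 1 / total vertical resistance from the midpoint of the
    upper layer, through the confining unit, to the midpoint of the lower layer."""
    resistance = (0.5 * b_upper / kv_upper) + (b_conf / kv_conf) + (0.5 * b_lower / kv_lower)
    return 1.0 / resistance

leak = vertical_leakance(b_upper=50.0, kv_upper=0.1, b_conf=10.0, kv_conf=1e-4,
                         b_lower=80.0, kv_lower=0.05)
print(f"{leak:.2e} per day")   # 9.90e-06 per day: the confining unit dominates
```

Because the terms add as resistances, the least conductive interval controls the result, which is why thin confining units can be represented by leakance alone rather than as explicit model layers.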

  4. Modeling Effects of Groundwater Basin Closure, and Reversal of Closure, on Groundwater Quality

    NASA Astrophysics Data System (ADS)

    Pauloo, R.; Guo, Z.; Fogg, G. E.

    2017-12-01

    Population growth, the expansion of agriculture, and climate uncertainties have accelerated groundwater pumping and overdraft in aquifers worldwide. In many agricultural basins, a water budget may be stable or not in overdraft, yet disconnected ground and surface water bodies can contribute to the formation of a "closed" basin, where water principally exits the basin as evapotranspiration. Although declining water quality associated with increasing Total Dissolved Solids (TDS) has been documented in aquifers across the United States in the past half century, connections between water quality declines and significant changes in hydrologic budgets leading to closed basin formation remain poorly understood. Preliminary results from an analysis with a regional-scale mixing model of the Tulare Lake Basin in California indicate that groundwater salinization resulting from open to closed basin conversion can operate on a decades-to-century time scale. The only way to reverse groundwater salinization caused by basin closure is to refill the basin and change the hydrologic budget sufficiently for natural groundwater discharge to resume. 3D flow and transport modeling, including the effects of heterogeneity based on a hydrostratigraphic facies model, is used to explore rates and time scales of groundwater salinization and its reversal under different water and land management scenarios. The modeling is also used to ascertain the extent to which local and regional heterogeneity need to be included in order to appropriately upscale the advection-dispersion equation in a basin-scale groundwater quality management model. Results imply that persistent managed aquifer recharge may slow groundwater salinization, and complete reversal may be possible at sufficiently high water tables.
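
A minimal sketch of the closed-basin salinization mechanism, assuming a constant-storage basin where evapotranspiration is the only outflow and exports no salt. All numbers are hypothetical, not Tulare Lake Basin values:

```python
# In a closed basin, ET removes water but leaves dissolved solids behind,
# so basin-average TDS climbs as long as inflow keeps importing salt.
storage = 100.0      # groundwater storage, km^3 (assumed constant)
tds = 300.0          # initial basin-average TDS, mg/L
recharge = 1.0       # km^3/yr of inflow carrying dissolved solids
tds_in = 500.0       # TDS of inflow water, mg/L
years = 100

salt_mass = storage * tds            # relative units (km^3 * mg/L)
for _ in range(years):
    salt_mass += recharge * tds_in   # salt enters; ET removes only water
tds_final = salt_mass / storage
print(tds_final)   # 800.0 after a century of closed-basin operation
```

Even this zero-dimensional sketch shows the decades-to-century time scale: salinity climbs linearly until natural discharge resumes and begins exporting salt again.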

  5. Multiscale solute transport upscaling for a three-dimensional hierarchical porous medium

    NASA Astrophysics Data System (ADS)

    Zhang, Mingkan; Zhang, Ye

    2015-03-01

    A laboratory-generated hierarchical, fully heterogeneous aquifer model (FHM) provides a reference for developing and testing an upscaling approach that integrates large-scale connectivity mapping with flow and transport modeling. Based on the FHM, three hydrostratigraphic models (HSMs) that capture lithological (static) connectivity at different resolutions are created, each corresponding to a sedimentary hierarchy. Under increasing system lnK variances (0.1, 1.0, 4.5), flow upscaling is first conducted to calculate equivalent hydraulic conductivity for individual connectivity (or unit) of the HSMs. Given the computed flow fields, an instantaneous, conservative tracer test is simulated by all models. For the HSMs, two upscaling formulations are tested based on the advection-dispersion equation (ADE), implementing space- versus time-dependent macrodispersivity. Comparing flow and transport predictions of the HSMs against those of the reference model, HSMs capturing connectivity at increasing resolutions are more accurate, although upscaling errors increase with system variance. Results suggest: (1) by explicitly modeling connectivity, an enhanced degree of freedom in representing dispersion can improve the ADE-based upscaled models by capturing non-Fickian transport of the FHM; (2) when connectivity is sufficiently resolved, the type of data conditioning used to model transport becomes less critical. Data conditioning, however, is influenced by the prediction goal; (3) when the aquifer is weakly to moderately heterogeneous, the upscaled models adequately capture the transport simulation of the FHM, despite the existence of hierarchical heterogeneity at smaller scales.
When the aquifer is strongly heterogeneous, the upscaled models become less accurate because lithological connectivity cannot adequately capture preferential flows; (4) three-dimensional transport connectivities of the hierarchical aquifer differ quantitatively from those analyzed for two-dimensional systems. This article was corrected on 7 MAY 2015. See the end of the full text for details.
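
The flow-upscaling step, computing an equivalent conductivity for each unit, is constrained by the classical Wiener bounds: the arithmetic mean (flow parallel to layering) and the harmonic mean (flow across layering). A sketch with illustrative conductivities, not values from the study:

```python
import numpy as np

# Equivalent conductivity of a layered block with equal-thickness layers.
k = np.array([10.0, 1.0, 0.1, 5.0])        # m/d, one value per layer
k_arith = k.mean()                          # upper bound (flow along layers)
k_harm = len(k) / np.sum(1.0 / k)           # lower bound (flow across layers)
k_geom = np.exp(np.mean(np.log(k)))         # common estimate for 2-D isotropic media
print(round(k_harm, 3), round(k_geom, 3), round(k_arith, 3))   # 0.354 1.495 4.025
```

The spread between the bounds grows with the lnK variance, which is one way to see why upscaling errors in the study increase as the system variance rises from 0.1 to 4.5.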

  6. Hydrogeologic framework, ground-water geochemistry, and assessment of nitrogen yield from base flow in two agricultural watersheds, Kent County, Maryland

    USGS Publications Warehouse

    Bachman, L.J.; Krantz, D.E.; Böhlke, John Karl

    2002-01-01

    Hydrostratigraphic and geochemical data collected in two adjacent watersheds on the Delmarva Peninsula, in Kent County, Maryland, indicate that shallow subsurface stratigraphy is an important factor that affects the concentrations of nitrogen in ground water discharging as stream base flow. The flux of nitrogen from shallow aquifers can contribute substantially to the eutrophication of streams and estuaries, degrading water quality and aquatic habitats. The information presented in this report includes a hydrostratigraphic framework for the Locust Grove study area, analyses and interpretation of ground-water chemistry, and an analysis of nutrient yields from stream base flow. An understanding of the processes by which ground-water nitrogen discharges to streams is important for optimal management of nutrients in watersheds in which ground-water discharge is an appreciable percentage of total streamflow. The U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency (USEPA), collected and analyzed hydrostratigraphic and geochemical data in support of ground-water flow modeling by the USEPA. The adjacent watersheds of Morgan Creek and Chesterville Branch have similar topography and land use; however, reported nitrogen concentrations are generally 6 to 10 milligrams per liter in Chesterville Branch but only 2 to 4 milligrams per liter in Morgan Creek. Ground water in the surficial aquifer in the recharge areas of both streams has high concentrations of nitrate (greater than 10 milligrams per liter as N) and dissolved oxygen.
One component of the ground water discharging to Morgan Creek typically is anoxic and contains virtually no dissolved nitrate; most of the ground water discharging to Chesterville Branch is oxygenated and contains moderately high concentrations of nitrate. The surficial aquifer in the study area is composed of the deeply weathered sands and gravels of the Pensauken Formation (the Columbia aquifer) and the underlying glauconitic sands of the upper Aquia Formation (the Aquia aquifer). The lower 6 to 9 meters of the Aquia Formation is a low-permeability silt-clay with abundant glauconite. The Aquia confining layer underlies the Columbia-Aquia surficial aquifer throughout the study area. The sediment redox transition identified in cores, which occurs in the upper 0.5 to 1 meter of the Aquia confining layer, is thought to be a site for subsurface denitrification of ground water. The first confined aquifer is composed of the glauconitic sands in the upper 9 to 11 meters of the Hornerstown Formation. The Hornerstown aquifer is underlain by 10 to 15 meters of glauconitic silt-clay at the base of the Hornerstown Formation (the Hornerstown confining layer), and 5 meters of low-permeability clay in the underlying Severn Formation. The Aquia and Hornerstown Formations dip and thicken to the southeast, and the Aquia confining layer subcrops shallowly (within 5 meters of the land surface) in a band that strikes southwest to northeast across the northern edge of the study area. The surficial aquifer is very thin (generally less than 5 meters) north of Morgan Creek, and the alluvial valley of Morgan Creek has incised into the top of the Aquia confining layer. In contrast, the Aquia confining layer lies 22 meters below Chesterville Branch, and the surficial aquifer approaches 30 meters in thickness (away from the creek). Chemically reduced iron sulfides and glauconite in the Aquia confining layer are likely substrates for denitrification of nitrate in ground water.
Evidence from the dissolved concentrations of nitrate, sulfate, iron, argon, and nitrogen gas, and stable nitrogen isotopes supports the interpretation that ground water flowing near the top of the Aquia confining layer, or through the confined Hornerstown aquifer, has undergone denitrification. This process appears to have the greatest effect on ground-water chemistry north of Morgan Creek, where the surficial aquifer is thin and a greater percentage of the ground water contacts the Aquia confining layer. The base-flow discharges of total nitrogen from the two watersheds are of similar magnitude, although Chesterville Branch has somewhat higher loads (29,000 kilograms of nitrogen per year) than Morgan Creek (20,000 kilograms of nitrogen per year), even though Morgan Creek has a larger drainage area and a greater discharge of water. The base-flow yield of nitrogen (load per unit area) in Chesterville Branch (median of 0.058 grams per second per square kilometer at the outlet) is more than twice that of Morgan Creek (median of 0.022 grams per second per square kilometer at the outlet), reflecting the higher concentration of nitrate in ground water discharging to Chesterville Branch. Total nitrogen concentrations tend to decrease downstream in Chesterville Branch and increase downstream in Morgan Creek. The downstream trend in Chesterville Branch may be affected by instream nitrogen uptake and denitrification, and an increasing proportion of older, denitrified ground water in downstream discharge.
The downstream trends in Morgan Creek may be affected by inflow from tributaries, downstream changes in the source of discharge water, and downstream changes in the riparian zone, which could affect the processes and degree of denitrification. Although these two watersheds appear to have landscape features (such as topography, land use, and soils) that would produce similar nitrogen discharges, a more detailed examination of landscape features indicates that Chesterville Branch has soils that are slightly better drained, tributary stream outlets at higher altitudes, and a slightly higher percentage of agricultural land. All of these factors have been related to higher nitrogen yields. Nonetheless, most of the data support the interpretation that hydrostratigraphy has the greatest effect in producing the difference in nitrogen yields between the two watersheds.
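
The reported loads and yields can be cross-checked with simple unit conversion. The implied drainage areas below are derived estimates for illustration, not figures from the report:

```python
# Yield = load / drainage area, so area = load / yield once units agree.
SECONDS_PER_YEAR = 365.25 * 86400

def implied_area_km2(load_kg_per_yr, yield_g_s_km2):
    load_g_s = load_kg_per_yr * 1000.0 / SECONDS_PER_YEAR   # kg/yr -> g/s
    return load_g_s / yield_g_s_km2

chesterville = implied_area_km2(29000, 0.058)   # higher load, higher yield
morgan = implied_area_km2(20000, 0.022)         # lower load, much lower yield
print(round(chesterville, 1), round(morgan, 1))   # 15.8 28.8
```

The arithmetic is internally consistent with the abstract: Morgan Creek's implied drainage area is roughly twice Chesterville Branch's, matching the statement that Morgan Creek is larger yet yields less nitrogen per unit area.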

  7. Magnetotelluric Data, Mid Valley, Nevada Test Site, Nevada.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackie M. Williams; Erin L. Wallin; Brian D. Rodriguez

    2007-08-15

    The United States Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office (NSO) are addressing ground-water contamination resulting from historical underground nuclear testing through the Environmental Management (EM) program and, in particular, the Underground Test Area (UGTA) project. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Rainier Mesa/Shoshone Mountain Corrective Action Unit (CAU) (Bechtel Nevada, 2006). During 2003, the U.S. Geological Survey (USGS), in cooperation with the DOE and NNSA-NSO, collected and processed data at the Nevada Test Site in and near Yucca Flat (YF) to help define the character, thickness, and lateral extent of the pre-Tertiary confining units. We collected 51 magnetotelluric (MT) and audio-magnetotelluric (AMT) stations for that research (Williams and others, 2005a, 2005b, 2005c, 2005d, 2005e, 2005f). In early 2005 we extended that research with 26 additional MT data stations (Williams and others, 2006), located on and near Rainier Mesa and Shoshone Mountain (RM-SM). The new stations extended the area of the hydrogeologic study previously conducted in Yucca Flat. This work was done to help refine what is known about the character, thickness, and lateral extent of pre-Tertiary confining units. In particular, a major goal was to define the upper clastic confining unit (UCCU). The UCCU comprises late Devonian to Mississippian siliciclastic rocks assigned to the Eleana Formation and Chainman Shale. The UCCU underlies the Yucca Flat area and extends westward towards Shoshone Mountain, southward to Buckboard Mesa, and northward to Rainier Mesa.
Late in 2005 we collected another 14 MT stations in Mid Valley and in northern Yucca Flat basin. That work was done to better determine the extent and thickness of the UCCU near the southeastern RM-SM CAU boundary with the southwestern YF CAU, and also in the northern YF CAU. The purpose of this report is to release the MT data at those 14 stations shown in figure 1. No interpretation of the data is included here.
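
The MT method's sensitivity to deep confining units follows from two standard relations, sketched below with textbook formulas rather than values from this data release:

```python
import math

# Apparent resistivity from the MT impedance magnitude, and the skin depth
# that controls how deep a given sounding frequency "sees".
MU0 = 4e-7 * math.pi   # magnetic permeability of free space, H/m

def apparent_resistivity(z_mag, freq_hz):
    """rho_a = |Z|^2 / (omega * mu0), with Z in SI units (ohm)."""
    omega = 2.0 * math.pi * freq_hz
    return z_mag ** 2 / (omega * MU0)

def skin_depth_m(rho_ohm_m, freq_hz):
    """delta ~ 503 * sqrt(rho / f) meters, the standard MT approximation."""
    return 503.0 * math.sqrt(rho_ohm_m / freq_hz)

print(round(skin_depth_m(100.0, 1.0)))   # 5030 m: low frequencies probe deep structure
print(round(apparent_resistivity(2e-3, 1.0) / apparent_resistivity(1e-3, 1.0), 1))   # 4.0
```

Because skin depth grows as frequency drops, long-period MT stations can resolve resistivity contrasts at the depths of the pre-Tertiary clastic confining units that wells rarely reach.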

  8. Deep Vadose Zone Flow and Transport Behavior at T-Tunnel Complex, Rainier Mesa, Nevada National Security Site

    NASA Astrophysics Data System (ADS)

    Parashar, R.; Reeves, D. M.

    2010-12-01

    Rainier Mesa, a tuffaceous plateau on the Nevada National Security Site, has been the location of numerous subsurface nuclear tests conducted in a series of tunnel complexes located approximately 450 m below the top of the mesa and 500 m above the regional groundwater flow system. The tunnels were constructed near the middle of an 800 m Tertiary sequence of faulted, low-permeability welded and non-welded bedded, vitric, and zeolitized tuff units. Water levels from wells in the vicinity of the T-tunnel complex indicate the presence of a perched saturation zone located approximately 100 m above the T-tunnel complex. This upper zone of saturation extends downward through most of the Tertiary sequence. The groundwater table is located at an elevation of 1300 m within a thrust sheet of Paleozoic carbonates, corresponding to the lower carbonate aquifer hydrostratigraphic unit (LCA3). The LCA3 is considered to be hydraulically connected to the Death Valley regional flow system. The objective of this project is to simulate complex downward patterns of fluid flow and radionuclide transport from the T-tunnel complex through the matrix and fault networks of the Tertiary tuff units to the water table. We developed an improved fracture characterization and mapping methodology consisting of displacement-length scaling relationships, simulation of realistic fault networks based on site-specific data, and novel fracture network upscaling techniques that preserve fracture network flow and transport properties on a coarse continuum grid. The upscaling method for fracture continua is based on concepts from the discrete fracture network modeling approach, which honors network connectivity and anisotropy of sparse networks better than other established methods such as the tensor approach.
Extensive flow simulations in the dual-continuum framework demonstrate that the characteristics of fault networks strongly influence the saturation profile and the formation of perched zones, although the faults may not conduct a large amount of flow compared to the matrix continua. The simulated results are very sensitive to the distribution of fracture aperture, the density of the network, and the spatial pattern of fracture clustering. The faults provide rapid pathways for radionuclide transport, and the conceptual modeling of diffusional mass transfer between matrix and fracture continua plays a vital role in predicting the overall behavior of the breakthrough curve.
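
The strong sensitivity to fracture aperture follows from the cubic law for flow between parallel plates, a standard relation rather than a site-specific parameterization:

```python
# Cubic law: the transmissivity of a single smooth parallel-plate fracture
# scales with aperture cubed, so small aperture changes dominate the flow field.
RHO_G_OVER_MU = 9810.0 / 1.0e-3   # rho*g/mu for water at ~20 C, 1/(m*s)

def fracture_transmissivity(aperture_m):
    """T = (rho*g/mu) * b^3 / 12, per unit fracture width (m^2/s)."""
    return RHO_G_OVER_MU * aperture_m ** 3 / 12.0

# Doubling the aperture gives an eightfold increase in transmissivity:
ratio = fracture_transmissivity(2e-4) / fracture_transmissivity(1e-4)
print(round(ratio, 6))   # 8.0
```

This b³ scaling is why, in simulations like those described above, the aperture distribution and fracture clustering control both the perched-zone geometry and the early breakthrough behavior.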

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wainwright, Haruko M.; Flores Orozco, Adrian; Bucker, Matthias

    In floodplain environments, a naturally reduced zone (NRZ) is considered to be a common biogeochemical hot spot, having distinct microbial and geochemical characteristics. Although important for understanding their role in mediating floodplain biogeochemical processes, mapping the subsurface distribution of NRZs over the dimensions of a floodplain is challenging, as conventional wellbore data are typically spatially limited and the distribution of NRZs is heterogeneous. In this work, we present an innovative methodology for the probabilistic mapping of NRZs within a three-dimensional (3-D) subsurface domain using induced polarization imaging, which is a noninvasive geophysical technique. Measurements consist of surface geophysical surveys and drilling-recovered sediments at the U.S. Department of Energy field site near Rifle, CO (USA). Inversion of surface time domain-induced polarization (TDIP) data yielded 3-D images of the complex electrical resistivity, in terms of magnitude and phase, which are associated with mineral precipitation and other lithological properties. By extracting the TDIP data values colocated with wellbore lithological logs, we found that the NRZs have a different distribution of resistivity and polarization from the other aquifer sediments. To estimate the spatial distribution of NRZs, we developed a Bayesian hierarchical model to integrate the geophysical and wellbore data. In addition, the resistivity images were used to estimate hydrostratigraphic interfaces under the floodplain. Validation results showed that the integration of electrical imaging and wellbore data using a Bayesian hierarchical model was capable of mapping spatially heterogeneous interfaces and NRZ distributions, thereby providing a minimally invasive means to parameterize a hydrobiogeochemical model of the floodplain.

  10. Superfund record of decision (EPA Region 7): St. Louis Airport/HHS/Futura Coatings Co., St. Louis County, MO, August 27, 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1999-03-01

    This document presents the selected remedial action for the cleanup of wastes related to Manhattan Engineering District/Atomic Energy Commission (MED/AEC) operations in accessible soils and ground water at the St. Louis Downtown Site (SLDS). The main components of the selected remedial action are: excavation and off-site disposal of approximately 65,000 cubic meters (85,000 cubic yards) (in situ) of contaminated soil; and no remedial action for ground water beneath the site. Perimeter monitoring of the ground water in the Mississippi River alluvial aquifer, designated as the hydrostratigraphic B Unit, will be performed, and the need for ground water remediation will be evaluated as part of the periodic reviews performed for the site.

  11. Detailed Geological Modelling in Urban Areas focused on Structures relevant to the Near Surface Groundwater Flow in the context of Climatic Changes

    NASA Astrophysics Data System (ADS)

    Bach, T.; Pallesen, T. M.; Jensen, N. P.; Mielby, S.; Sandersen, P.; Kristensen, M.

    2015-12-01

    This case demonstrates a practical example from the city of Odense (DK), where new geological modeling techniques have been developed and used in the software GeoScene3D to create a detailed voxel model of the anthropogenic layer. The voxel model has been combined with a regional hydrostratigraphic layer model. The case is part of a pilot project partly financed by VTU (Foundation for Development of Technology in the Danish Water Sector) and involves many different data types, such as borehole information, geophysical data, and human-related elements (landfill, pipelines, basements, roadbeds, etc.). In the last few years, there has been increased focus on detailed geological modeling in urban areas. The models serve as important input to hydrological models. This focus is partly due to climate change, as high-intensity rainfalls are seen more often than in the past, and water recharge is a topic of concern as well. In urban areas, this raises new challenges. A high level of detailed geological knowledge is needed for the uppermost zone of the soil, which is typically problematic due to practical limitations, especially when using geological layer models. Furthermore, to achieve this level of detail, all relevant available data have to be used in the modeling process. Human activity has deeply changed the soil layers, e.g., through constructions such as roadbeds, buildings with basements, pipelines, and landfill. These elements can act as barriers or pathways for near-surface groundwater flow and can contribute to local flooding or to the mobilization and transport of contaminants. A geological voxel model is built from small boxes (voxels). Each voxel can contain several parameters, e.g., lithology, transmissivity, or contaminant concentration. Human-related elements can be implemented using tools that give the modeler advanced options for building detailed small-scale models. This case demonstrates the workflow and the resulting geological model for the pilot area.
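
A voxel model of the kind described is, at its simplest, a regular 3-D grid of attributed cells. The sketch below uses hypothetical grid dimensions and codes, not the Odense/GeoScene3D data model:

```python
import numpy as np

# Each cell carries several attributes: here a lithology code and a flag
# marking human-made elements that can act as barriers or pathways.
nx, ny, nz = 40, 30, 10
lithology = np.zeros((nx, ny, nz), dtype=np.int8)    # 0 = sand by default
lithology[:, :, :2] = 3                              # code 3 = anthropogenic fill, top 2 layers
anthropogenic = np.zeros(lithology.shape, dtype=bool)
anthropogenic[10:14, 5:25, :2] = True                # e.g. a buried roadbed volume

fill_fraction = (lithology == 3).mean()
print(round(fill_fraction, 2))   # 0.2 (top 2 of 10 layers)
```

Storing attributes per voxel, rather than as bounding surfaces, is what lets small human-made bodies like basements and pipe trenches be represented explicitly inside an otherwise layered model.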

  12. Assessing groundwater availability in a folded carbonate aquifer through the development of a numerical model

    NASA Astrophysics Data System (ADS)

    Di Salvo, Cristina; Romano, Emanuele; Guyennon, Nicolas; Bruna Petrangeli, Anna; Preziosi, Elisabetta

    2015-04-01

    The study of aquifer systems from a quantitative point of view is fundamental for adopting water management plans that aim at preserving water resources and reducing environmental risks related to groundwater level and discharge changes. This is also what the European Union Water Framework Directive (WFD, 2000/60/EC) states, holding the development of numerical models as a key aspect of groundwater management. The objective of this research is to (i) define a methodology for modeling a complex hydrogeological structure in a structurally folded carbonate area and (ii) estimate the concurrent effects of exploitation and climate changes on groundwater availability through the implementation of a 3D groundwater flow model. This study concerns the Monte Coscerno karst aquifer, located in the Apennine chain in Central Italy in the Nera River Valley. This aquifer is planned to be exploited in the near future for water supply. Negative trends of precipitation in Central Italy have been reported in relation to global climate changes, which are expected to affect the availability of recharge to carbonate aquifers throughout the region. A great concern is the combined impact of climate change and groundwater exploitation; hence, scenarios are needed that take into account the effect of possible temperature and precipitation trends on recharge rates. Following a previous experience with model conceptualization and long-term simulation of groundwater flow, an integrated three-dimensional groundwater model has been developed for the Monte Coscerno aquifer. In a previous paper (Preziosi et al., 2014) the spatial distribution of recharge to this aquifer was estimated through the Thornthwaite-Mather model at a daily time step, using as inputs past precipitation and temperature values (1951-2013) as well as soil and landscape properties. In this paper the numerical model development is described.
On the basis of well logs from private consulting companies and literature cross sections, the multilayer aquifer was conceptualized as five folded hydrostratigraphic units: three main carbonate aquifers separated by two aquitards, which can be locally discontinuous, leading to a complicated flow pattern. In general the vertical leakance is upward, from the basal aquifer to the unconfined uppermost aquifer. As shown by the increasing discharge from north to south, the Nera River acts as the main sink of the study area, gaining groundwater as it cuts through the folded terrain. The numerical model was implemented using the MODFLOW-2000 code and extends over an area of 235 km² with a grid spacing of 100 meters in each of the 5 layers. Model calibration was achieved by comparing the model results with observed streamflow of the Nera River (8-10 measurements per year during 1991-1993 and 1996-2012), which on the basis of the river hydrograph at gaging locations is considered to be derived entirely from groundwater. The effects of climate variation on groundwater discharge to the river in the past 60 years are analyzed. Key issues related to the elaboration of a numerical model of a folded structure are also described.
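
The daily-time-step recharge estimate cited above rests on a soil-water bucket of the Thornthwaite-Mather type. The following is a schematic single step under simplified assumptions, not the published implementation:

```python
# One day of a simplified soil-water balance: precipitation fills the bucket,
# ET drains it up to demand, and any surplus above capacity becomes recharge.
def daily_step(storage, precip, pet, capacity):
    """Return (new_storage, actual_et, recharge) for one day; all values in mm."""
    storage += precip
    aet = min(pet, storage)                   # ET limited by available soil water
    storage -= aet
    recharge = max(0.0, storage - capacity)   # surplus above capacity percolates
    storage = min(storage, capacity)
    return storage, aet, recharge

s, aet, r = daily_step(storage=90.0, precip=25.0, pet=3.0, capacity=100.0)
print(s, aet, r)   # 100.0 3.0 12.0
```

Run over a daily climate series (1951-2013 in the cited work), the accumulated surplus terms give the spatially distributed recharge that the MODFLOW model takes as input.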

  13. Hierarchical Bayesian method for mapping biogeochemical hot spots using induced polarization imaging

    DOE PAGES

    Wainwright, Haruko M.; Flores Orozco, Adrian; Bucker, Matthias; ...

    2016-01-29

    In floodplain environments, a naturally reduced zone (NRZ) is considered to be a common biogeochemical hot spot, having distinct microbial and geochemical characteristics. Although important for understanding their role in mediating floodplain biogeochemical processes, mapping the subsurface distribution of NRZs over the dimensions of a floodplain is challenging, as conventional wellbore data are typically spatially limited and the distribution of NRZs is heterogeneous. In this work, we present an innovative methodology for the probabilistic mapping of NRZs within a three-dimensional (3-D) subsurface domain using induced polarization imaging, which is a noninvasive geophysical technique. Measurements consist of surface geophysical surveys and drilling-recovered sediments at the U.S. Department of Energy field site near Rifle, CO (USA). Inversion of surface time domain-induced polarization (TDIP) data yielded 3-D images of the complex electrical resistivity, in terms of magnitude and phase, which are associated with mineral precipitation and other lithological properties. By extracting the TDIP data values colocated with wellbore lithological logs, we found that the NRZs have a different distribution of resistivity and polarization from the other aquifer sediments. To estimate the spatial distribution of NRZs, we developed a Bayesian hierarchical model to integrate the geophysical and wellbore data. In addition, the resistivity images were used to estimate hydrostratigraphic interfaces under the floodplain. Validation results showed that the integration of electrical imaging and wellbore data using a Bayesian hierarchical model was capable of mapping spatially heterogeneous interfaces and NRZ distributions, thereby providing a minimally invasive means to parameterize a hydrobiogeochemical model of the floodplain.
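
The core Bayesian update behind such a hierarchical model can be sketched for a single cell with Gaussian class likelihoods. All parameters below are hypothetical illustrations, not fitted values from the study:

```python
import math

# Posterior probability that a cell is an NRZ given an observed polarization
# (phase) value: Bayes' rule with two Gaussian class-conditional likelihoods.
def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_nrz(phase_mrad, prior_nrz=0.2, mu_nrz=15.0, mu_bg=5.0, sigma=3.0):
    like_nrz = gauss_pdf(phase_mrad, mu_nrz, sigma)   # NRZs polarize strongly
    like_bg = gauss_pdf(phase_mrad, mu_bg, sigma)     # background sediments less so
    num = prior_nrz * like_nrz
    return num / (num + (1 - prior_nrz) * like_bg)

print(round(posterior_nrz(14.0), 3))   # 0.955: high polarization -> NRZ likely
print(round(posterior_nrz(4.0), 3))    # ~0: background response -> NRZ unlikely
```

The hierarchical part of the published approach goes further, letting the class parameters themselves carry uncertainty informed by the wellbore logs; the single-cell update above is only the innermost step.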

  14. Evaluating the importance of characterizing soil structure and horizons in parameterizing a hydrologic process model

    USGS Publications Warehouse

    Mirus, Benjamin B.

    2015-01-01

    Incorporating the influence of soil structure and horizons into parameterizations of distributed surface-water/groundwater models remains a challenge. Often, only a single soil unit is employed, and soil-hydraulic properties are assigned based on textural classification, without evaluating the potential impact of these simplifications. This study uses a distributed physics-based model to assess the influence of soil horizons and structure on effective parameterization, testing the viability of two established and widely used hydrogeologic methods for simulating runoff and variably saturated flow through layered soils: (1) accounting for vertical heterogeneity by combining hydrostratigraphic units with contrasting hydraulic properties into homogeneous, anisotropic units, and (2) using established pedotransfer functions based on soil texture alone to estimate water retention and conductivity, without accounting for the influence of pedon structures and hysteresis. The viability of the latter method for capturing the seasonal transition from runoff-dominated to evapotranspiration-dominated regimes is also tested. For the cases tested, event-based simulations using simplified vertical heterogeneity did not capture the state-dependent anisotropy and complex combinations of runoff-generation mechanisms resulting from permeability contrasts in layered hillslopes with complex topography. Continuous simulations using pedotransfer functions that do not account for the influence of soil structure and hysteresis generally over-predicted runoff, leading to the propagation of substantial water-balance errors. Analysis suggests that identifying a dominant hydropedological unit provides the most acceptable simplification of subsurface layering, and that modified pedotransfer functions with steeper soil-water retention curves might adequately capture the influence of soil structure and hysteresis on hydrologic response in headwater catchments.

  15. Investigating Vertical Mixing Between Two Carbonate Aquifers Using a Multiport Well, Central Texas

    NASA Astrophysics Data System (ADS)

    Kromann, J.; Wong, C. I.; Hunt, B.; Smith, B.; Banner, J. L.

    2011-12-01

    Determining the occurrence and extent of mixing between vertically adjacent aquifers is critical to dual-aquifer management, and can be challenging because of variable well depths and uncertainty as to the hydrostratigraphic sources of groundwater. This study uses a multiport monitor well to investigate the degree of aquifer mixing between the overlying Edwards aquifer and underlying Trinity aquifer in central Texas. The results will inform dual-aquifer management, as the Trinity aquifer is being developed as an alternative water source to the Edwards aquifer due to pumping limits and projections of increasing water demand. Water levels from isolated hydrostratigraphic units (n = 19) were measured monthly in the well as climate conditions transitioned from wet to dry (September 2010 to May 2011). Groundwater was sampled over a two-week interval in May to June 2011. At the start of the monitoring interval, water levels were high in the Edwards and the uppermost units of the Trinity relative to the rest of the Trinity units; under dry conditions they decreased from about 635 to 585 ft-msl, whereas water levels in the lowermost Trinity declined less, from about 630 to 620 ft-msl. Two zones separating the Edwards and lowermost Trinity showed almost no head change during this period. The water-level variations between the two aquifers suggest that: i) vertical flow potential from the Edwards to the Trinity occurs during dry conditions, ii) the uppermost stratigraphic units of the Trinity and Edwards are mixing, and iii) portions of the Trinity behave as an aquitard, providing hydrologic separation between the Edwards and lowermost Trinity units. Groundwater samples indicate the presence of three distinct hydrochemical facies: Ca-HCO3 (Edwards), Ca-HCO3-SO4 (lowermost Trinity), and Ca-SO4 (Trinity-Glen Rose Fm), suggesting little vertical flow and mixing.
Covariation between groundwater 87Sr/86Sr values and SO4 concentrations from the Edwards and lowermost Trinity units can be accounted for by a two-end-member fluid-mixing model that uses one unit from the Edwards and one from the lowermost Trinity as end members. This may indicate that 87Sr/86Sr values and SO4 concentrations are controlled by varying extents of mixing between the two units. Groundwater from units in the Glen Rose Formation (between the Edwards and lowermost Trinity units) cannot be accounted for by this mixing process due to elevated SO4 concentrations likely associated with dissolution of evaporites; 87Sr/86Sr values of evaporites recovered from the well are consistent with the 87Sr/86Sr values of groundwater from these Glen Rose units. Although the geochemical model results suggest possible mixing between the Edwards and Trinity aquifers, water-level variations and the presence of distinct hydrochemical facies indicate that vertical flow between the Edwards and Trinity is limited to the uppermost units of the Trinity. This study suggests that the Edwards aquifer and lowermost Trinity units are not likely in hydrologic communication and that independent management may be possible.
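The two-end-member mixing model mentioned above can be sketched as follows: a conservative tracer (here SO4) fixes the mixing fraction, and the 87Sr/86Sr ratio of the mixture is weighted by each end member's Sr concentration. Every numeric value in the example is a hypothetical placeholder, not data from the study.

```python
def mixing_fraction(c_sample, c_end1, c_end2):
    """Fraction of end member 1 implied by a conservative tracer concentration."""
    return (c_sample - c_end2) / (c_end1 - c_end2)

def mixed_isotope_ratio(f, sr_conc1, ratio1, sr_conc2, ratio2):
    """87Sr/86Sr of a two-end-member mixture.

    Isotope ratios do not mix linearly by volume; they are weighted by each
    end member's Sr concentration.
    """
    sr_mix = f * sr_conc1 + (1.0 - f) * sr_conc2
    return (f * sr_conc1 * ratio1 + (1.0 - f) * sr_conc2 * ratio2) / sr_mix

# Hypothetical end members: lowermost Trinity (SO4 = 600 mg/L, Sr = 8 mg/L,
# 87Sr/86Sr = 0.7076) and Edwards (SO4 = 30 mg/L, Sr = 0.2 mg/L, 0.7078).
# A sample at SO4 = 315 mg/L then implies a 50/50 mixture.
f = mixing_fraction(315.0, 600.0, 30.0)
r = mixed_isotope_ratio(f, 8.0, 0.7076, 0.2, 0.7078)
```

A sample whose (SO4, 87Sr/86Sr) pair falls off the curve traced by this model, as with the Glen Rose units above, cannot be explained by mixing of the two end members alone.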

  16. Completion Report for Model Evaluation Well ER-5-5: Corrective Action Unit 98: Frenchman Flat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NSTec Underground Test Area and Boreholes Programs and Operations

    2013-01-18

    Model Evaluation Well ER-5-5 was drilled for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office in support of Nevada Environmental Management Operations at the Nevada National Security Site (formerly known as the Nevada Test Site). The well was drilled in July and August 2012 as part of a model evaluation well program in the Frenchman Flat area of Nye County, Nevada. The primary purpose of the well was to provide detailed geologic, hydrogeologic, chemical, and radiological data that can be used to test and build confidence in the applicability of the Frenchman Flat Corrective Action Unit flow and transport models for their intended purpose. In particular, this well was designed to obtain data to evaluate model forecasts of contaminant migration from the upgradient underground nuclear test MILK SHAKE (conducted in Emplacement Hole U-5k in 1968), forecasts considered uncertain because of the unknown extent of a basalt lava-flow aquifer present in this area. Well ER-5-5 is expected to provide information to refine the Phase II Frenchman Flat hydrostratigraphic framework model, if necessary, as well as to support future groundwater flow and transport modeling. The 31.1-centimeter (cm) diameter hole was drilled to a total depth of 331.3 meters (m). The completion string, set at the depth of 317.2 m, consists of 16.8-cm stainless-steel casing hanging from 19.4-cm carbon-steel casing. The 16.8-cm stainless-steel casing has one slotted interval open to the basalt lava-flow aquifer and limited intervals of the overlying and underlying alluvial aquifer. A piezometer string was also installed in the annulus between the completion string and the borehole wall. The piezometer is composed of 7.3-cm stainless-steel tubing suspended from 6.0-cm carbon-steel tubing. The piezometer string was landed at 319.2 m, to monitor the basalt lava-flow aquifer. 
Data collected during and shortly after hole construction include composite drill-cuttings samples collected every 3.0 m, various geophysical logs, preliminary water quality measurements, and water-level measurements. The well penetrated 331.3 m of Quaternary–Tertiary alluvium, including an intercalated layer of saturated basalt lava rubble. No well development or hydrologic testing was conducted in this well immediately after completion; however, a preliminary water level was measured in the piezometer string at the depth of 283.4 m on September 25, 2012. No tritium above the minimum detection limit of the field instruments was detected in this hole. Future well development, sampling, and hydrologic testing planned for this well will provide more accurate hydrologic information for this site. The stratigraphy, general lithology, and water level were as expected, though the expected basalt lava-flow aquifer proved to be basalt rubble rather than the dense, fractured lava that was modeled. The lack of tritium transport is likely due to the difference in hydraulic properties of the basalt lava-flow rubble encountered in the well, compared to those of the fractured aquifer used in the flow and transport models.

  17. External Peer Review Team Report Underground Testing Area Subproject for Frenchman Flat, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sam Marutzky

    2010-09-01

    An external peer review was conducted to review the groundwater models used in the corrective action investigation stage of the Underground Test Area (UGTA) subproject to forecast zones of potential contamination in 1,000 years for the Frenchman Flat area. The goal of the external peer review was to provide technical evaluation of the studies and to assist in assessing the readiness of the UGTA subproject to progress to monitoring activities for further model evaluation. The external peer review team consisted of six independent technical experts with expertise in geology, hydrogeology, groundwater modeling, and radiochemistry. The peer review team was tasked with addressing the following questions: 1. Are the modeling approaches, assumptions, and model results for Frenchman Flat consistent with the use of modeling studies as a decision tool for resolution of environmental and regulatory requirements? 2. Do the modeling results adequately account for uncertainty in models of flow and transport in the Frenchman Flat hydrological setting? a. Are the models of sufficient scale/resolution to adequately predict contaminant transport in the Frenchman Flat setting? b. Have all key processes been included in the model? c. Are the methods used to forecast contaminant boundaries from the transport modeling studies reasonable and appropriate? d. Are the assessments of uncertainty technically sound and consistent with state-of-the-art approaches currently used in the hydrological sciences? 3. Are the datasets and modeling results adequate for a transition to Corrective Action Unit monitoring studies—the next stage in the UGTA strategy for Frenchman Flat? The peer review team is of the opinion that, with some limitations, the modeling approaches, assumptions, and model results are consistent with the use of modeling studies for resolution of environmental and regulatory requirements. 
The peer review team further finds that the modeling studies have accounted for uncertainty in models of flow and transport in the Frenchman Flat except for a few deficiencies described in the report. Finally, the peer review team concludes that the UGTA subproject has explored a wide range of variations in assumptions, methods, and data, and should proceed to the next stage with an emphasis on monitoring studies. The corrective action strategy, as described in the Federal Facility Agreement and Consent Order, states that the groundwater flow and transport models for each corrective action unit will consider, at a minimum, the following: • Alternative hydrostratigraphic framework models of the modeling domain. • Uncertainty in the radiological and hydrological source terms. • Alternative models of recharge. • Alternative boundary conditions and groundwater flows. • Multiple permissive sets of calibrated flow models. • Probabilistic simulations of transport using plausible sets of alternative framework and recharge models, and boundary and groundwater flows from calibrated flow models. • Ensembles of forecasts of contaminant boundaries. • Sensitivity and uncertainty analyses of model outputs. The peer review team finds that these minimum requirements have been met. While the groundwater modeling and uncertainty analyses have been quite detailed, the peer review team has identified several modeling-related issues that should be addressed in the next phase of the corrective action activities: • Evaluating and using water-level gradients from the pilot wells at the Area 5 Radioactive Waste Management Site in model calibration. • Re-evaluating the use of geochemical age-dating data to constrain model calibrations. • Developing water budgets for the alluvial and upper volcanic aquifer systems in Frenchman Flat. 
• Considering modeling approaches in which calculated groundwater flow directions near the water table are not predetermined by model boundary conditions and areas of recharge, all of which are very uncertain. • Evaluating the effect of local-scale variations in hydraulic conductivity on the calculated contaminant boundaries. • Evaluating the effects of non-steady-state flow conditions on calculated contaminant boundaries, including the effects of long-term declines in water levels, climatic change, and disruption of the groundwater system by potential earthquake faulting along either of the two major controlling fault zones in the flow system (the Cane Spring and Rock Valley faults). • Considering the use of less-complex modeling approaches. • Evaluating the large change in water levels in the vicinity of the Frenchman Flat playa and developing a conceptual model to explain these water-level changes. • Developing a long-term groundwater-level monitoring program for Frenchman Flat with regular monitoring of water levels at key monitoring wells. Despite these reservations, the peer review team strongly believes that the UGTA subproject should proceed to the next stage.

  18. Utilizing Temperature and Resistivity Data as a Way to Characterize Water and Solute Movement and Groundwater-Surface Water Interaction in Variably Saturated Porous Media

    NASA Astrophysics Data System (ADS)

    Scotch, C.; Murgulet, D.; Hay, R.

    2012-12-01

    This study utilizes a multidisciplinary approach to better analyze the extent to which groundwater and surface water interact in the Oso Creek watershed of South Texas using temperature data, electrical resistivity, and numerical modeling methods. The three primary objectives of this study are to: (1) identify primary areas of streambed groundwater-surface water interaction using temperature time series and resistivity soundings; (2) improve understanding of solute flow and groundwater, surface water, and sediment interaction in a semiarid, urban coastal area; (3) improve our understanding of groundwater contribution to contaminant transport and discharge to the bays and estuaries and ultimately the Gulf of Mexico. Temperature data were acquired with temperature loggers at 15-minute intervals over a one-year period (June 11, 2009, to May 18, 2010) from 17 monitoring sites along Oso Creek and its tributaries. Each monitoring site consisted of four temperature loggers spaced at equal vertical intervals from the stream surface down to a depth of one meter. Furthermore, groundwater temperatures and water levels were collected from wells adjacent to the temperature monitoring sites. In order to fulfill the objectives of this study, existing hydrogeologic, stratigraphic, and other ancillary data are being integrated into a finite difference model developed using the USGS VS2DT software for the Oso Creek Watershed. The model will be calibrated using existing temperature and water level data, and a resistivity component will also be added to assure accuracy of the model and temperature data by helping to identify varying lithologies and water conductivities. Compiling a time series of temperature data and incorporating available hydrostratigraphic, geomorphologic, and water level data will enable the development of a comprehensive database. 
This database is necessary to develop the detailed flow model that will enable an understanding of the extent of groundwater surface water interaction and their associated flow regimes.

  19. Saltwater intrusion in the surficial aquifer system of the Big Cypress Basin, southwest Florida, and a proposed plan for improved salinity monitoring

    USGS Publications Warehouse

    Prinos, Scott T.

    2013-01-01

    The installation of drainage canals, poorly cased wells, and water-supply withdrawals have led to saltwater intrusion in the primary water-use aquifers in southwest Florida. Increasing population and water use have exacerbated this problem. Installation of water-control structures, well-plugging projects, and regulation of water use have slowed saltwater intrusion, but the chloride concentration of samples from some of the monitoring wells in this area indicates that saltwater intrusion continues to occur. In addition, rising sea level could increase the rate and extent of saltwater intrusion. The existing saltwater intrusion monitoring network was examined and found to lack the necessary organization, spatial distribution, and design to properly evaluate saltwater intrusion. The most recent hydrogeologic framework of southwest Florida indicates that some wells may be open to multiple aquifers or have an incorrect aquifer designation. Some of the sampling methods being used could result in poor-quality data. Some older wells are badly corroded, obstructed, or damaged and may not yield useable samples. Saltwater in some of the canals is in close proximity to coastal well fields. In some instances, saltwater occasionally occurs upstream from coastal salinity control structures. These factors lead to an incomplete understanding of the extent and threat of saltwater intrusion in southwest Florida. A proposed plan to improve the saltwater intrusion monitoring network in the South Florida Water Management District’s Big Cypress Basin describes improvements in (1) network management, (2) quality assurance, (3) documentation, (4) training, and (5) data accessibility. The plan describes improvements to hydrostratigraphic and geospatial network coverage that can be accomplished using additional monitoring, surface geophysical surveys, and borehole geophysical logging. Sampling methods and improvements to monitoring well design are described in detail. 
Geochemical analyses that provide insights concerning the sources of saltwater in the aquifers are described. The requirement to abandon inactive wells is discussed.

  20. Description and Evaluation of Numerical Groundwater Flow Models for the Edwards Aquifer, South-Central Texas

    USGS Publications Warehouse

    Lindgren, Richard J.; Taylor, Charles J.; Houston, Natalie A.

    2009-01-01

    A substantial number of public water system wells in south-central Texas withdraw groundwater from the karstic, highly productive Edwards aquifer. However, the use of numerical groundwater flow models to aid in the delineation of contributing areas for public water system wells in the Edwards aquifer is problematic because of the complex hydrogeologic framework and the presence of conduit-dominated flow paths in the aquifer. The U.S. Geological Survey, in cooperation with the Texas Commission on Environmental Quality, evaluated six published numerical groundwater flow models (all deterministic) that have been developed for the Edwards aquifer San Antonio segment or Barton Springs segment, or both. This report describes the models developed and evaluates each with respect to accessibility and ease of use, range of conditions simulated, accuracy of simulations, agreement with dye-tracer tests, and limitations of the models. These models are (1) GWSIM model of the San Antonio segment, a FORTRAN computer-model code that pre-dates the development of MODFLOW; (2) MODFLOW conduit-flow model of San Antonio and Barton Springs segments; (3) MODFLOW diffuse-flow model of San Antonio and Barton Springs segments; (4) MODFLOW Groundwater Availability Modeling [GAM] model of the Barton Springs segment; (5) MODFLOW recalibrated GAM model of the Barton Springs segment; and (6) MODFLOW-DCM (dual conductivity model) conduit model of the Barton Springs segment. The GWSIM model code is not commercially available, is limited in its application to the San Antonio segment of the Edwards aquifer, and lacks the ability of MODFLOW to easily incorporate newly developed processes and packages to better simulate hydrologic processes. MODFLOW is a widely used and tested code for numerical modeling of groundwater flow, is well documented, and is in the public domain. These attributes make MODFLOW a preferred code with regard to accessibility and ease of use. 
The MODFLOW conduit-flow model incorporates improvements over previous models by using (1) a user-friendly interface, (2) updated computer codes (MODFLOW-96 and MODFLOW-2000), (3) a finer grid resolution, (4) less-restrictive boundary conditions, (5) an improved discretization of hydraulic conductivity, (6) more accurate estimates of pumping stresses, (7) a long transient simulation period (54 years, 1947-2000), and (8) a refined representation of high-permeability zones or conduits. All of the models except the MODFLOW-DCM conduit model have limitations resulting from the use of Darcy's law to simulate groundwater flow in a karst aquifer system where non-Darcian, turbulent flow might actually dominate. The MODFLOW-DCM conduit model is an improvement in the ability to simulate karst-like flow conditions in conjunction with porous-media-type matrix flow. However, the MODFLOW-DCM conduit model has had limited application and testing and currently (2008) lacks commercially available pre- and post-processors. The MODFLOW conduit-flow and diffuse-flow Edwards aquifer models are limited by the lack of calibration for the northern part of the Barton Springs segment (Travis County) and their reliance on the use of the calibrated hydraulic conductivity and storativity values from the calibrated Barton Springs segment GAM model. The major limitation of the Barton Springs segment GAM and recalibrated GAM models is that they were calibrated to match measured water levels and springflows for a restrictive range of hydrologic conditions, with each model having different hydraulic conductivity and storativity values appropriate to the hydrologic conditions that were simulated. 
The need for two different sets of hydraulic conductivity and storativity values increases the uncertainty associated with the accuracy of either set of values, illustrates the non-uniqueness of the model solution, and, probably most importantly, demonstrates the limitations of using a one-layer model to represent the heterogeneous hydrostratigraphy.

  1. The Use of a Geomorphometric Classification to Estimate Subsurface Heterogeneity in the Unconsolidated Sediments of Mountain Watersheds

    NASA Astrophysics Data System (ADS)

    Cairns, D.; Byrne, J. M.; Jiskoot, H.; McKenzie, J. M.; Johnson, D. L.

    2013-12-01

    Groundwater controls many aspects of water quantity and quality in mountain watersheds. Groundwater recharge and flow originating in mountain watersheds are often difficult to quantify due to challenges in the characterization of the local geology, as subsurface data are sparse and difficult to collect. Remote sensing data are more readily available and are beneficial for the characterization of watershed hydrodynamics. We present an automated geomorphometric model to identify the approximate spatial distribution of geomorphic features, and to segment each of these features based on relative hydrostratigraphic differences. A digital elevation model (DEM) dataset and predefined indices are used as inputs in a mountain watershed. The model uses periglacial, glacial, fluvial, slope evolution and lacustrine processes to identify regions that are subsequently delineated using morphometric principles. A 10 m cell size DEM from the headwaters of the St. Mary River watershed in Glacier National Park, Montana, was considered sufficient for this research. Morphometric parameters extracted from the DEM that were found to be useful for the calibration of the model were elevation, slope, flow direction, flow accumulation, and surface roughness. Algorithms were developed to utilize these parameters and delineate the distributions of bedrock outcrops, periglacial landscapes, alluvial channels, fans and outwash plains, glacial depositional features, talus slopes, and other mass wasted material. Theoretical differences in sedimentation and hydrofacies associated with each of the geomorphic features were used to segment the watershed into units reflecting similar hydrogeologic properties such as hydraulic conductivity and thickness. The results of the model were verified by comparing the distribution of geomorphic features with published geomorphic maps. 
Although differences in semantics between the datasets complicated the comparison, a consensus mapping yielded a Dice coefficient of 0.65. The results can be used to assist in groundwater model calibration, or to estimate spatial differences in near-surface groundwater behaviour. Verification of the geomorphometric model would be augmented by evaluating its success after use in the calibration of the groundwater simulation. These results may also be used directly in momentum-based equations to create a stochastic routing routine beneath the soil interface for a hydrometeorological model.
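The Dice coefficient quoted above measures the overlap between two classified regions; a minimal implementation over sets of grid-cell identifiers (the cell labels below are illustrative, not data from the study):

```python
def dice_coefficient(region_a, region_b):
    """Dice similarity between two classified regions: 2|A∩B| / (|A| + |B|).

    Equals 1.0 for identical regions and 0.0 for disjoint ones.
    """
    a, b = set(region_a), set(region_b)
    if not a and not b:
        return 1.0  # two empty regions agree trivially
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical example: two maps of a geomorphic feature sharing half their cells.
score = dice_coefficient({1, 2, 3, 4}, {3, 4, 5, 6})  # 2*2 / (4+4) = 0.5
```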

  2. The sensitivity to land use and climate variations in the whitewater river basin, kansas, USA: Closing the water budget using groundwater models

    NASA Astrophysics Data System (ADS)

    Beeson, P.; Duffy, C.; Springer, E.

    2003-04-01

    A water budget was developed using groundwater models to assess the impact of land use and climate variability on the Whitewater River Basin located in southeastern Kansas within the ARM-SGP as part of the DOE Water Cycle Pilot Study. The Whitewater River Basin has an area of 1,100 km2, an elevation range of 380-470 m above mean sea level, and an average annual precipitation of 858 mm. Time series and geospatial analysis are used to identify significant spatial structure and dominant temporal modes in the watershed runoff and groundwater response. Space-time analyses confirmed the hydrogeologic conceptual model developed from the hydrostratigraphic information provided by existing geologic studies and over 2,000 wells located in the area. The groundwater-surface water interactions are identified by time series analysis of stream discharge, precipitation, temperature, and water levels in wells. Singular spectrum analysis suggests a two-layer leaky perched system with strong influences of daily, monthly, seasonal, and interannual oscillations. The geospatial analysis identifies the important length scales, and the time series analysis the corresponding time scales, which must be incorporated in the model. The fine-scale layering, which creates the leaky perched top layer, was represented using an anisotropy ratio (Kh/Kv). This ratio was determined from select well data to be 100, with the vertical conductivity calculated as the harmonic mean and the horizontal conductivity as the arithmetic mean of layer values. MODFLOW is used to assess the importance of groundwater when attempting to close the water budget. The R-squared value between MODFLOW-predicted and observed head values for the watershed was 0.85, indicating a good fit. Mean recharge was estimated to be approximately 17 percent of total annual precipitation. The approach presented here is an initial attempt to examine the importance of groundwater in the water budget of a relatively small river basin.
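The upscaling rule described above, vertical conductivity from the harmonic mean and horizontal conductivity from the arithmetic mean of layered values, can be sketched as follows, assuming thickness-weighted means over a hypothetical layer stack (the numbers are illustrative, not the study's well data):

```python
def effective_conductivities(layers):
    """Upscale a stack of (thickness, K) layers to equivalent homogeneous values.

    Horizontal flow runs parallel to the layers -> thickness-weighted arithmetic
    mean; vertical flow crosses them in series -> thickness-weighted harmonic mean.
    """
    total = sum(b for b, _ in layers)
    kh = sum(b * k for b, k in layers) / total
    kv = total / sum(b / k for b, k in layers)
    return kh, kv

# Hypothetical stack: a 1 m permeable sand (K = 10) over a 1 m clayey layer (K = 0.1).
kh, kv = effective_conductivities([(1.0, 10.0), (1.0, 0.1)])
anisotropy_ratio = kh / kv  # the low-K layer dominates kv, so kh/kv >> 1
```

Even this two-layer example gives kh/kv of about 25, illustrating how interbedded low-conductivity layers produce the large anisotropy ratios (here, 100) used to represent fine-scale layering in a single model layer.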

  3. Hydrostratigraphic interpretation of test-hole and geophysical data, Upper Loup River Basin, Nebraska, 2008-10

    USGS Publications Warehouse

    Hobza, Christopher M.; Asch, Theodore H.; Bedrosian, Paul A.

    2011-01-01

    Test-hole drilling has indicated greater variation in the base-of-aquifer elevation in the western part of the upper Loup study area than in the eastern part, reflecting a number of deep paleovalleys incised into the Brule Formation of the White River Group. TDEM measurements within the upper Loup study area were shown to be effective as virtual boreholes in mapping the base of the aquifer. TDEM estimates of the base of the aquifer were in good agreement with existing test-hole data and improved the interpreted elevation and topology of the base of the aquifer. In 2010, AMT data were collected along a profile, approximately 12 miles (19 kilometers) in length, along Whitman Road, in Grant and Cherry Counties. The AMT results along Whitman Road indicated substantial variability in the elevation of the base of the High Plains aquifer and in the distribution of highly permeable zones within the aquifer.

  4. Sensitivity models and design protocol for partitioning tracer tests in alluvial aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, M.; Butler, G.W.; Jackson, R.E.

    1997-11-01

    Zones of dense, nonaqueous phase liquids (DNAPLs) are difficult to characterize as to their volume, composition, and spatial distribution using conventional ground-water extraction and soil-sampling methods. Such incompletely characterized sites have negative consequences for those responsible for their remedial design, e.g., the uncertainties in the optimal placement of ground-water extraction wells and in the duration of remediation. However, the recent use of the partitioning interwell tracer test (PITT) to characterize DNAPL zones at sites in New Mexico [unsaturated alluvium] and in Ohio, Texas, and Utah [saturated alluvium] demonstrates that the volume and spatial distribution of residual DNAPL can be determined with accuracy. The PITT involves injection of a suite of tracers which reversibly partition to different degrees between the DNAPL and the ground water or soil air, resulting in the chromatographic separation of the tracer signals observed at the extraction well(s). The design of a PITT requires careful consideration of the hydrostratigraphic, hydraulic, and certain geochemical properties of the alluvium being tested. A three-dimensional, numerical model of a heterogeneous alluvial aquifer containing DNAPL has been developed for use with the UTCHEM simulator to demonstrate partitioning tracer testing and to address questions that are frequently raised in its application. The simulations include (1) the estimation of DNAPL volume for the simple case where only residual DNAPL is present in heterogeneous alluvium, (2) sensitivity studies to demonstrate the effect of increasingly low residual DNAPL saturation on the tracer signal, and (3) the effect of free-phase DNAPL on the estimation of the volume of DNAPL present. Furthermore, the potential interference of sedimentary organic carbon as a DNAPL surrogate on the tracer signal is considered and shown to be readily resolved by the careful choice of tracers.
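The chromatographic separation a PITT exploits is conventionally summarized by a retardation factor. The sketch below shows the standard first-moment analysis (not code from the study): R is the ratio of mean arrival times of the partitioning and conservative tracers, and the swept-zone DNAPL saturation follows from local-equilibrium partitioning with coefficient K. All numeric values are hypothetical.

```python
def dnapl_saturation(t_partitioning, t_conservative, partition_coeff):
    """Average DNAPL saturation in the tracer-swept zone of a PITT.

    R = t_p / t_c is the retardation of the partitioning tracer relative to the
    conservative tracer; S_n = (R - 1) / (R - 1 + K) follows from equilibrium
    partitioning with coefficient K. Identical arrival times (R = 1) mean no
    DNAPL was contacted.
    """
    r = t_partitioning / t_conservative
    return (r - 1.0) / (r - 1.0 + partition_coeff)

# Hypothetical test: partitioning tracer arrives at 12 days vs 10 days for the
# conservative tracer, with K = 50 -> a small residual saturation of ~0.4%.
s_n = dnapl_saturation(12.0, 10.0, 50.0)
```

Multiplying the estimated saturation by the swept pore volume gives the DNAPL volume estimate; the sensitivity studies above probe how small a saturation still produces a measurable arrival-time separation.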

  5. Characterizing Structural and Stratigraphic Heterogeneities in a Faulted Aquifer Using Pump Tests with an Array of Westbay Multilevel Monitoring Wells

    NASA Astrophysics Data System (ADS)

    Johnson, B.; Zhurina, E. N.

    2001-12-01

    We are developing and assessing field testing and analysis methodologies for quantitative characterization of aquifer heterogeneities using data measured in an array of multilevel monitoring wells (MLW) during pumping and recovery well tests. We have developed a unique field laboratory to determine the permeability field in a 20 m by 40 m by 70 m volume in the fault-partitioned, siliciclastic Hickory aquifer system in central Texas. The site incorporates both stratigraphic variations and a normal fault system that partially offsets the aquifer and impedes cross-fault flow. We constructed a high-resolution geologic model of the site based upon 1050 m of core and a suite of geophysical logs from eleven, closely spaced (3-10 m), continuously cored boreholes to depths of 125 m. Westbay multilevel monitoring systems installed in eight holes provide 94 hydraulically isolated measurement zones and 25 injection zones. A good geologic model is critical to proper installation of the MLW. Packers are positioned at all significant fault piercements and selected, laterally extensive, clay-rich strata. Packers in adjacent MLW bracket selected hydrostratigraphic intervals. Pump tests utilized two uncased, fully penetrating irrigation wells that straddle the fault system and are in close proximity (7 to 65 m) to the MLW. Pumping and recovery transient pressure histories were measured in 85 zones using pressure transducers with a resolution of 55 Pa (0.008 psi). The hydraulic response is that of an anisotropic, unconfined aquifer. The transient pressure histories vary significantly from zone to zone in a single MLW as well as between adjacent MLW. Derivative plots are especially useful for differentiating details of pressure histories. Based on the geologic model, the derivative curve of a zone reflects its absolute vertical position, vertical stratigraphic position, and proximity to either a fault or significant stratigraphic heterogeneity. 
Additional forward modeling is needed to assist qualitative interpretation of response curves. Prior geologic knowledge appears critical. Quantitative interpretation of the transient pressure histories requires utilizing a numerical aquifer response model coupled with a geophysical inversion algorithm.
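The derivative-plot diagnostic described above can be illustrated with a short, self-contained sketch; the synthetic drawdown data and the flat late-time derivative (characteristic of infinite-acting radial flow) are illustrative assumptions, not the study's measurements:

```python
import numpy as np

def log_derivative(t, p):
    """Pressure derivative dp/d(ln t), the quantity shown in
    diagnostic (derivative) plots of well-test pressure histories."""
    return np.gradient(p, np.log(t))

# Synthetic late-time drawdown, p = a*ln(t) + b (illustrative only)
t = np.logspace(0, 4, 50)      # time
p = 2.5 * np.log(t) + 1.0      # drawdown, arbitrary units

dp = log_derivative(t, p)
# For p = a*ln(t) + b the derivative plateaus at a,
# the classic signature of infinite-acting radial flow.
print(round(float(dp[25]), 3))  # -> 2.5
```

Departures from such a plateau, of the kind the abstract attributes to faults or stratigraphic heterogeneity, show up as slopes or humps on the derivative curve.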

  6. A comparison of item response models for accuracy and speed of item responses with applications to adaptive testing.

    PubMed

    van Rijn, Peter W; Ali, Usama S

    2017-05-01

We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied linear and adaptive testing modes to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures. © 2017 The British Psychological Society.
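The shared marginal accuracy model the abstract mentions, the two-parameter logistic (2PL) model, can be written out in a few lines; the parameter values below are arbitrary examples:

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) item response function:
    probability of a correct response for ability theta,
    item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# When ability equals difficulty, the probability is 0.5
# regardless of the discrimination parameter.
print(p_correct_2pl(0.0, a=1.2, b=0.0))  # -> 0.5
```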

  7. A complete categorization of multiscale models of infectious disease systems.

    PubMed

    Garira, Winston

    2017-12-01

    Modelling of infectious disease systems has entered a new era in which disease modellers are increasingly turning to multiscale modelling to extend traditional modelling frameworks into new application areas and to achieve higher levels of detail and accuracy in characterizing infectious disease systems. In this paper we present a categorization framework for categorizing multiscale models of infectious disease systems. The categorization framework consists of five integration frameworks and five criteria. We use the categorization framework to give a complete categorization of host-level immuno-epidemiological models (HL-IEMs). This categorization framework is also shown to be applicable in categorizing other types of multiscale models of infectious diseases beyond HL-IEMs through modifying the initial categorization framework presented in this study. Categorization of multiscale models of infectious disease systems in this way is useful in bringing some order to the discussion on the structure of these multiscale models.

  8. A UML profile for framework modeling.

    PubMed

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

The current standard Unified Modeling Language (UML) cannot adequately model framework flexibility and extensibility because it lacks appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that customizes UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams can be automatically mapped to the corresponding implementation diagrams. The presented profile was shown to make framework modeling more straightforward and frameworks therefore easier to understand and instantiate.

  9. Not ''just'' pump and treat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angleberger, K; Bainer, R W

    2000-12-12

The Lawrence Livermore National Laboratory (LLNL) has been consistently improving the site cleanup methods by adopting new philosophies, strategies and technologies to address constrained or declining budgets, lack of useable space due to a highly industrialized site, and significant technical challenges. As identified in the ROD, the preferred remedy at the LLNL Livermore Site is pump and treat, although LLNL has improved this strategy to bring the remediation of the ground water to closure as soon as possible. LLNL took the logical progression from a pump and treat system to the philosophy of ''Smart Pump and Treat'' coupled with the concepts of ''Hydrostratigraphic Unit Analysis,'' ''Engineered Plume Collapse,'' and ''Phased Source Remediation,'' which led to the development of new, more cost-effective technologies which have accelerated the attainment of cleanup goals significantly. Modeling is also incorporated to constantly develop new, cost-effective methodologies to accelerate cleanup and communicate the progress of cleanup to stakeholders. In addition, LLNL improved on the efficiency and flexibility of ground water treatment facilities. Ground water cleanup has traditionally relied on costly and obtrusive fixed treatment facilities. LLNL has designed and implemented various portable ground water treatment units to replace the fixed facilities; the application of each type of facility is determined by the amount of ground water flow and contaminant concentrations. These treatment units have allowed for aggressive ground water cleanup, increased cleanup flexibility, and reduced capital and electrical costs. After a treatment unit has completed ground water cleanup at one location, it can easily be moved to another location for additional ground water cleanup.

  10. Use of Annotations for Component and Framework Interoperability

    NASA Astrophysics Data System (ADS)

    David, O.; Lloyd, W.; Carlson, J.; Leavesley, G. H.; Geter, F.

    2009-12-01

The popular programming languages Java and C# provide annotations, a form of meta-data construct. Software frameworks for web integration, web services, database access, and unit testing now take advantage of annotations to reduce the complexity of APIs and the quantity of integration code between the application and framework infrastructure. Adopting annotation features in frameworks has been observed to lead to cleaner and leaner application code. The USDA Object Modeling System (OMS) version 3.0 fully embraces the annotation approach and additionally defines a meta-data standard for components and models. In version 3.0, framework/model integration previously accomplished using API calls is now achieved using descriptive annotations. This enables the framework to provide additional functionality non-invasively, such as implicit multithreading and auto-documenting capabilities, while achieving a significant reduction in the size of the model source code. Using a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework. Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. To study the effectiveness of an annotation-based framework approach relative to other modeling frameworks, a framework-invasiveness study was conducted to evaluate the effects of framework design on model code quality. A monthly water balance model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. 
In a next step, the PRMS model was implemented in OMS 3.0 and is currently being implemented for water supply forecasting in the western United States at the USDA NRCS National Water and Climate Center. PRMS is a component based modular precipitation-runoff model developed to evaluate the impacts of various combinations of precipitation, climate, and land use on streamflow and general basin hydrology. The new OMS 3.0 PRMS model source code is more concise and flexible as a result of using the new framework’s annotation based approach. The fully annotated components are now providing information directly for (i) model assembly and building, (ii) dataflow analysis for implicit multithreading, (iii) automated and comprehensive model documentation of component dependencies, physical data properties, (iv) automated model and component testing, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks. As a prototype example, model code annotations were used to generate binding and mediation code to allow the use of OMS 3.0 model components within the OpenMI context.
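By way of illustration only, the annotation-driven wiring described above can be mimicked in Python with decorators and class-level metadata; the names `execute`, `inputs`, and `outputs` are hypothetical stand-ins, not the actual OMS 3.0 Java annotations:

```python
# Hypothetical decorator/metadata analogy to annotation-based components;
# the names below are illustrative, not the actual OMS 3.0 Java API.

def execute(fn):
    fn._is_execute = True   # mark the method the framework should invoke
    return fn

class WaterBalance:
    """A component whose declared metadata, rather than framework API
    calls, tells a framework how to wire and run it."""
    inputs = {"precip": float, "temp": float}   # declared inputs
    outputs = {"runoff": float}                 # declared outputs

    @execute
    def run(self, precip, temp):
        # trivial monthly water-balance stand-in
        return {"runoff": max(precip - 0.1 * temp, 0.0)}

# A mini "framework" discovers the entry point from metadata alone
comp = WaterBalance()
entry = next(getattr(comp, n) for n in dir(comp)
             if getattr(getattr(comp, n), "_is_execute", False))
print(entry(precip=50.0, temp=10.0))  # -> {'runoff': 49.0}
```

The component itself never calls the framework, which is the non-invasiveness the abstract measures: delete the decorator machinery and `run` is still plain, reusable code.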

  11. DEVELOP MULTI-STRESSOR, OPEN ARCHITECTURE MODELING FRAMEWORK FOR ECOLOGICAL EXPOSURE FROM SITE TO WATERSHED SCALE

    EPA Science Inventory

    A number of multimedia modeling frameworks are currently being developed. The Multimedia Integrated Modeling System (MIMS) is one of these frameworks. A framework should be seen as more of a multimedia modeling infrastructure than a single software system. This infrastructure do...

  12. Deep Resistivity Structure of Mid Valley, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Wallin, Erin L.; Rodriguez, Brian D.; Williams, Jackie M.

    2009-01-01

    The U.S. Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) at their Nevada Site Office (NSO) are addressing ground-water contamination resulting from historical underground nuclear testing through the Environmental Management (EM) program and, in particular, the Underground Test Area (UGTA) project. From 1951 to 1992, 828 underground nuclear tests were conducted at the Nevada Test Site northwest of Las Vegas (DOE UGTA, 2003). Most of these tests were conducted hundreds of feet above the ground-water table; however, more than 200 of the tests were near, or within, the water table. This underground testing was limited to specific areas of the Nevada Test Site including Pahute Mesa, Rainier Mesa/Shoshone Mountain (RM-SM), Frenchman Flat, and Yucca Flat. One issue of concern is the nature of the somewhat poorly constrained pre-Tertiary geology and its effects on ground-water flow in the area subsequent to a nuclear test. Ground-water modelers would like to know more about the hydrostratigraphy and geologic structure to support a hydrostratigraphic framework model that is under development for the Rainier Mesa/Shoshone Mountain (RM-SM) Corrective Action Unit (CAU) (National Security Technologies, 2007). During 2003, the U.S. Geological Survey (USGS), in cooperation with the DOE and NNSA-NSO collected and processed data at the Nevada Test Site in and near Yucca Flat (YF) to help define the character, thickness, and lateral extent of the pre-Tertiary confining units. We collected 51 magnetotelluric (MT) and audio-magnetotelluric (AMT) stations for that research (Williams and others, 2005a, 2005b, 2005c, 2005d, 2005e, and 2005f). In early 2005 we extended that research with 26 additional MT data stations (Williams and others, 2006) located on and near Rainier Mesa and Shoshone Mountain (RM-SM). 
The new stations extended the area of the hydrogeologic study previously conducted in Yucca Flat, further refining what is known about the pre-Tertiary confining units. In particular, a major goal was to define the extent of the upper clastic confining unit (UCCU). The UCCU is composed of late Devonian to Mississippian siliciclastic rocks assigned to the Eleana Formation and Chainman Shale (National Security Technologies, 2007). The UCCU underlies the Yucca Flat area and extends southwestward toward Shoshone Mountain, westward toward Buckboard Mesa, and northwestward toward Rainier Mesa. Late in 2005 we collected data at an additional 14 MT stations in Mid Valley, CP Hills, and northern Yucca Flat. That work was done to better determine the extent and thickness of the UCCU near the boundary between the southeastern RM-SM CAU and the southwestern YF CAU, and also in the northern YF CAU. The MT data have been released in a separate U.S. Geological Survey report (Williams and others, 2007). The Nevada Test Site magnetotelluric data interpretation presented in this report includes the results of detailed two-dimensional (2-D) resistivity modeling for each profile and inferences on the three-dimensional (3-D) character of the geology within the region.

  13. Narrative review of frameworks for translating research evidence into policy and practice.

    PubMed

    Milat, Andrew J; Li, Ben

    2017-02-15

    A significant challenge in research translation is that interested parties interpret and apply the associated terms and conceptual frameworks in different ways. The purpose of this review was to: a) examine different research translation frameworks; b) examine the similarities and differences between the frameworks; and c) identify key strengths and weaknesses of the models when they are applied in practice. The review involved a keyword search of PubMed. The search string was (translational research OR knowledge translation OR evidence to practice) AND (framework OR model OR theory) AND (public health OR health promotion OR medicine). Included studies were published in English between January 1990 and December 2014, and described frameworks, models or theories associated with research translation. The final review included 98 papers, and 41 different frameworks and models were identified. The most frequently applied knowledge translation framework in the literature was RE-AIM, followed by the knowledge translation continuum or 'T' models, the Knowledge to Action framework, the PARiHS framework, evidence based public health models, and the stages of research and evaluation model. The models identified in this review stem from different fields, including implementation science, basic and medical sciences, health services research and public health, and propose different but related pathways to closing the research-practice gap.

  14. USEEIO Framework Demo

    EPA Science Inventory

The code base for creating versions of the USEEIO model and USEEIO-like models is called the USEEIO Modeling Framework. The framework is built in a combination of R and Python languages. This demonstration provides a brief overview and introduction into the framework.

  15. Model and Interoperability using Meta Data Annotations

    NASA Astrophysics Data System (ADS)

    David, O.

    2011-12-01

Software frameworks and architectures need metadata to efficiently support model integration. Modelers have to know the context of a model, often stepping into modeling semantics and auxiliary information usually not provided in a concise structure and universal format consumable by a range of (modeling) tools. XML often seems the obvious solution for capturing metadata, but its wide adoption to facilitate model interoperability is limited by XML schema fragmentation, complexity, and verbosity outside of a data-automation process. Ontologies seem to overcome those shortcomings; however, the practical significance of their use remains to be demonstrated. OMS version 3 took a different approach to metadata representation. The fundamental building block of a modular model in OMS is a software component representing a single physical process, calibration method, or data access approach. Here, programming language features known as annotations or attributes were adopted. Within other (non-modeling) frameworks it has been observed that annotations lead to cleaner and leaner application code. Framework-supported model integration, traditionally accomplished using Application Programming Interface (API) calls, is now achieved using descriptive code annotations. Fully annotated components for various hydrological and Ag-system models now provide information directly for (i) model assembly and building, (ii) data flow analysis for implicit multi-threading or visualization, (iii) automated and comprehensive model documentation of component dependencies and physical data properties, (iv) automated model and component testing, calibration, and optimization, and (v) automated audit-traceability to account for all model resources leading to a particular simulation result. Such a non-invasive methodology leads to models and modeling components with only minimal dependencies on the modeling framework but a strong reference to their originating code. 
Since models and modeling components are not directly bound to the framework by the use of specific APIs and/or data types, they can more easily be reused both within the framework and outside of it. While providing all those capabilities, a significant reduction in the size of the model source code was achieved. To assess the benefit of annotations for a modeler, a framework-invasiveness study was conducted comparing the annotation-based approach with other modeling frameworks and libraries and evaluating the effects of framework design on model code quality. A typical hydrological model was implemented across several modeling frameworks and several software metrics were collected. The metrics selected were measures of non-invasive design methods for modeling frameworks from a software engineering perspective. It appears that the use of annotations positively impacts several software quality measures. Experience to date has demonstrated the multi-purpose value of using annotations. Annotations are also a feasible and practical method to enable interoperability among models and modeling frameworks.

  16. A Framework for Developing the Structure of Public Health Economic Models.

    PubMed

    Squires, Hazel; Chilcott, James; Akehurst, Ronald; Burr, Jennifer; Kelly, Michael P

    2016-01-01

A conceptual modeling framework is a methodology that assists modelers through the process of developing a model structure. Public health interventions tend to operate in dynamically complex systems. Modeling public health interventions requires broader considerations than clinical ones. Inappropriately simple models may lead to poor validity and credibility, resulting in suboptimal allocation of resources. This article presents the first conceptual modeling framework for public health economic evaluation. The framework presented here was informed by literature reviews of the key challenges in public health economic modeling and existing conceptual modeling frameworks; qualitative research to understand the experiences of modelers when developing public health economic models; and piloting a draft version of the framework. The conceptual modeling framework comprises four key principles of good practice and a proposed methodology. The key principles are that 1) a systems approach to modeling should be taken; 2) a documented understanding of the problem is imperative before and alongside developing and justifying the model structure; 3) strong communication with stakeholders and members of the team throughout model development is essential; and 4) a systematic consideration of the determinants of health is central to identifying the key impacts of public health interventions. The methodology consists of four phases: phase A, aligning the framework with the decision-making process; phase B, identifying relevant stakeholders; phase C, understanding the problem; and phase D, developing and justifying the model structure. Key areas for further research involve evaluation of the framework in diverse case studies and the development of methods for modeling individual and social behavior. This approach could improve the quality of public health economic models, supporting efficient allocation of scarce resources. 
Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. [A preliminary study on the forming quality of titanium alloy removable partial denture frameworks fabricated by selective laser melting].

    PubMed

    Liu, Y F; Yu, H; Wang, W N; Gao, B

    2017-06-09

Objective: To evaluate the processing accuracy, internal quality and suitability of titanium alloy removable partial denture (RPD) frameworks fabricated by the selective laser melting (SLM) technique, and to provide reference for clinical application. Methods: The plaster model of one clinical patient was used as the working model; it was scanned and reconstructed into a digital working model, and a RPD framework was designed on it. Then, eight corresponding RPD frameworks were fabricated using the SLM technique. A three-dimensional (3D) optical scanner was used to obtain the 3D data of the frameworks, and the data were compared with the original computer aided design (CAD) model to evaluate processing precision. Traditional cast pure titanium frameworks were used as the control group, and internal quality was analyzed by X-ray examination. Finally, the fitness of the frameworks was examined on the plaster model. Results: The overall average deviation of the titanium alloy RPD frameworks fabricated by SLM technology was (0.089±0.076) mm, and the root mean square error was 0.103 mm. No visible pores, cracks or other internal defects were detected in the frameworks. The frameworks seated completely on the plaster model, their tissue surfaces fitted well, and there was no obvious movement. Conclusions: The titanium alloy RPD framework fabricated by SLM technology is of good quality.
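The accuracy statistics reported (mean deviation with standard deviation, and root mean square error of scan-to-CAD distances) can be computed as follows; the deviation values are invented for illustration and are not the study's data:

```python
import math

def deviation_stats(devs):
    """Mean, standard deviation, and RMS of point-wise
    scan-to-CAD deviations (mm)."""
    n = len(devs)
    mean = sum(devs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in devs) / n)
    rms = math.sqrt(sum(d * d for d in devs) / n)
    return mean, sd, rms

# Invented deviation values for illustration -- not the study's data
devs = [0.05, 0.12, 0.08, 0.15, 0.03]
mean, sd, rms = deviation_stats(devs)
print(round(mean, 3), round(rms, 3))  # -> 0.086 0.097
```

Note that RMS is always at least the mean of the absolute deviations, which is why the study's RMS error (0.103 mm) exceeds its mean deviation (0.089 mm).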

  18. Theories, models and frameworks used in capacity building interventions relevant to public health: a systematic review.

    PubMed

    Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather

    2017-11-28

There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined; included papers focused on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks, or guidelines; were described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare; and explicitly or implicitly mentioned a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, categorizing theoretical foundations according to which theory, model and/or framework was used and whether the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks to support capacity building interventions relevant to public health organizations. 
It provides public health practitioners with a menu of potentially usable theories, models and frameworks to support capacity building efforts. The findings also support the need for the use of theories, models or frameworks to be intentional, explicitly identified and referenced, and for their application to the capacity building intervention to be clearly outlined.

  19. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. 
(2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
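Goal (2), wrapping a BMI-equipped model with a small framework adapter, can be sketched as follows; `SimpleBMI` and `FrameworkAdapter` are simplified stand-ins, not the actual BMI specification or any framework's native interface:

```python
class SimpleBMI:
    """Simplified stand-in for a framework-agnostic Basic Model
    Interface: initialize / update / get_value."""
    def initialize(self):
        self.time = 0.0
        self.value = 0.0
    def update(self):
        self.time += 1.0
        self.value += 0.5
    def get_value(self, name):
        return {"time": self.time, "value": self.value}[name]

class FrameworkAdapter:
    """Hypothetical 'universal adapter': maps a framework's native
    component calls onto the model's BMI methods."""
    def __init__(self, model):
        self.model = model
        model.initialize()
    def step(self):             # framework's native advance call
        self.model.update()
    def output(self, name):     # framework's native output call
        return self.model.get_value(name)

adapter = FrameworkAdapter(SimpleBMI())
adapter.step()
adapter.step()
print(adapter.output("value"))  # -> 1.0
```

The point of the design is that the adapter, not the model, absorbs each framework's interface conventions, so one BMI implementation can serve many frameworks.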

  20. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of a model changes from one to another, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
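The model independence the abstract claims, one search technique reused across model types by swapping only the fitness function and neighborhood, can be sketched as a best-improvement local search (all names are illustrative, not the framework's API):

```python
def local_search(fitness, start, neighbors, max_steps=100):
    """Model-independent best-improvement search: it needs only a
    fitness function and a neighbor generator, never the model type."""
    best, best_f = start, fitness(start)
    for _ in range(max_steps):
        f, cand = min((fitness(c), c) for c in neighbors(best))
        if f >= best_f:
            break               # local optimum reached
        best, best_f = cand, f
    return best, best_f

# Switching "model types" means swapping only the fitness function;
# the search code above is reused unchanged (illustrative target x == 42).
branch_distance = lambda x: abs(x - 42)
neighbors = lambda x: [x - 3, x - 1, x + 1, x + 3]
x, f = local_search(branch_distance, 0, neighbors)
print(x, f)  # -> 42 0
```

Factoring the search loop out this way is the redundancy reduction the framework targets: the algorithm is written once, and each model type supplies only its own fitness and neighborhood definitions.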

  1. Science for informed decision: A 3D unified conceptual model of the Milk River Transboundary Aquifer (Alberta-Montana)

    NASA Astrophysics Data System (ADS)

    Rivera, A.; Pétré, M.

    2013-12-01

The Milk River transboundary aquifer straddles southern Alberta (Canada) and northern Montana (United States), in a semi-arid region considered water short. This confined sandstone aquifer is a source for municipal supply and agricultural uses on the Canadian side, as well as for secondary oil recovery on the US side of the border. The extensive use of this resource since the mid-1950s has led to a dramatic drop in the water level in some places, and concerns about the durability of the resource have arisen. The Milk River aquifer has been the object of many studies during the 20th century; however, most of them were limited by the US-Canada border, preventing a sound understanding of the global dynamics of the aquifer. The objectives of this transboundary study are to better understand the dynamics of the Milk River aquifer, following its natural limits, in order to make recommendations for its sustainable management and good governance by the two international jurisdictions, as recommended in the UNGA resolution 63/124 on the Law of Transboundary Aquifers. Since 2009, the Milk River transboundary aquifer has been part of the inventory of the UNESCO ISARM-Americas initiative, which encourages riparian states to work cooperatively toward mutually beneficial and sustainable aquifer development. However, the use of this shared resource is not governed by any international agreement or convention between the USA and Canada. Stakeholders from the two countries have been involved, at various levels of jurisdiction (municipal, provincial, state, federal), to establish a strong cooperation. In these contexts, models can constitute useful tools for informed decisions. In the case of the Milk River aquifer, models could support scientists and managers from both countries in avoiding potential tensions linked to the water shortage context in this region. Models can determine the conditions of overexploitation and provide an assessment of a sustainable yield. 
A unified conceptual model of the Milk River Aquifer has been built. This model follows the natural limits of the aquifer and is not interrupted by the US-Canada border. The conceptual model covers many aspects such as the hydrostratigraphic 3D model, the groundwater flow, the recharge and discharge areas, the hydrogeological parameters, the pumping and observation wells, and the transboundary aspects. This model covers circa 55,000 km². The study area is limited to the north/northeast and southeast by gas fields. This unified conceptual model will form the basis for a future 3D numerical hydrogeological model of groundwater flow in the Milk River Aquifer across the Canada-US border.

  2. Business model framework applications in health care: A systematic review.

    PubMed

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-11-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  3. A Modular Simulation Framework for Assessing Swarm Search Models

    DTIC Science & Technology

    2014-09-01

    Author: Blake M. Wanier. Numerical studies demonstrate the ability to leverage the developed simulation and analysis framework to investigate three canonical swarm search models ... as benchmarks for future exploration of more sophisticated swarm search scenarios. Subject terms: Swarm Search, Search Theory, Modeling Framework.

  4. EPA'S NEW EMISSIONS MODELING FRAMEWORK

    EPA Science Inventory

    EPA's Office of Air Quality Planning and Standards is building a new Emissions Modeling Framework that will solve many of the long-standing difficulties of emissions modeling. The goals of the Framework are to (1) prevent bottlenecks and errors caused by emissions modeling activi...

  5. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    ERIC Educational Resources Information Center

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  6. Rethinking modeling framework design: object modeling system 3.0

    USDA-ARS?s Scientific Manuscript database

    The Object Modeling System (OMS) is a framework for environmental model development, data provisioning, testing, validation, and deployment. It provides a bridge for transferring technology from the research organization to the program delivery agency. The framework provides a consistent and efficie...

  7. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    ERIC Educational Resources Information Center

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized DINA (G-DINA) model framework, a recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…
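The DINA model generalized by this framework has a compact item response function: an examinee answers correctly with probability 1 − slip when all skills the item requires are mastered, and with probability guess otherwise. A minimal sketch (Q-vector, skill profiles, and guess/slip values are invented for illustration):

```python
import numpy as np

def dina_prob(alpha, q, guess, slip):
    """DINA item response probability.

    alpha: (K,) examinee skill-mastery vector (0/1)
    q:     (K,) item Q-vector of required skills (0/1)
    guess: P(correct) without all required skills
    slip:  P(incorrect) despite mastering all required skills
    """
    # eta = 1 only if every skill the item requires is mastered
    eta = int(np.all(alpha >= q))
    return (1 - slip) ** eta * guess ** (1 - eta)

# Hypothetical item requiring skills 1 and 3
q = np.array([1, 0, 1])
master = np.array([1, 1, 1])   # masters both required skills
partial = np.array([1, 1, 0])  # lacks skill 3

print(dina_prob(master, q, guess=0.2, slip=0.1))   # 0.9
print(dina_prob(partial, q, guess=0.2, slip=0.1))  # 0.2
```

The G-DINA framework relaxes the all-or-nothing `eta` into a more general function of the required attributes, which is what makes retrofitting assessments like PISA feasible.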

  8. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    ERIC Educational Resources Information Center

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  9. The Foundations Framework for Developing and Reporting New Models of Care for Multimorbidity

    PubMed Central

    Stokes, Jonathan; Man, Mei-See; Guthrie, Bruce; Mercer, Stewart W.; Salisbury, Chris; Bower, Peter

    2017-01-01

    PURPOSE Multimorbidity challenges health systems globally. New models of care are urgently needed to better manage patients with multimorbidity; however, there is no agreed framework for designing and reporting models of care for multimorbidity and their evaluation. METHODS Based on findings from a literature search to identify models of care for multimorbidity, we developed a framework to describe these models. We illustrate the application of the framework by identifying the focus and gaps in current models of care, and by describing the evolution of models over time. RESULTS Our framework describes each model in terms of its theoretical basis and target population (the foundations of the model) and of the elements of care implemented to deliver the model. We categorized elements of care into 3 types: (1) clinical focus, (2) organization of care, and (3) support for model delivery. Application of the framework identified a limited use of theory in model design and a strong focus on some patient groups (elderly, high users) more than others (younger patients, deprived populations). We found changes in elements with time, with a decrease in models implementing home care and an increase in models offering extended appointments. CONCLUSIONS By encouraging greater clarity about the underpinning theory and target population, and by categorizing the wide range of potentially important elements of an intervention to improve care for patients with multimorbidity, the framework may be useful in designing and reporting models of care and help advance the currently limited evidence base. PMID:29133498

  10. BioASF: a framework for automatically generating executable pathway models specified in BioPAX.

    PubMed

    Haydarlou, Reza; Jacobsen, Annika; Bonzanni, Nicola; Feenstra, K Anton; Abeln, Sanne; Heringa, Jaap

    2016-06-15

    Biological pathways play a key role in most cellular functions. To better understand these functions, diverse computational and cell biology researchers use biological pathway data for various analysis and modeling purposes. For specifying these biological pathways, a community of researchers has defined BioPAX and provided various tools for creating, validating and visualizing BioPAX models. However, a generic software framework for simulating BioPAX models is missing. Here, we attempt to fill this gap by introducing a generic simulation framework for BioPAX. The framework explicitly separates the execution model from the model structure as provided by BioPAX, with the advantage that the modelling process becomes more reproducible and intrinsically more modular; this ensures natural biological constraints are satisfied upon execution. The framework is based on the principles of discrete event systems and multi-agent systems, and is capable of automatically generating a hierarchical multi-agent system for a given BioPAX model. To demonstrate the applicability of the framework, we simulated two types of biological network models: a gene regulatory network modeling the haematopoietic stem cell regulators and a signal transduction network modeling the Wnt/β-catenin signaling pathway. We observed that the results of the simulations performed using our framework were entirely consistent with the simulation results reported by the researchers who developed the original models in a proprietary language. The framework, implemented in Java, is open source and its source code, documentation and tutorial are available at http://www.ibi.vu.nl/programs/BioASF. Contact: j.heringa@vu.nl.
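BioASF itself is a Java multi-agent framework; purely to illustrate the discrete-event idea behind it, a toy event-driven activation cascade can be sketched in a few lines (gene names, rules, and delays are invented, not from the paper):

```python
import heapq

def simulate(events, rules, horizon=10.0):
    """Discrete-event execution of a toy activation cascade.

    events: initial list of (time, gene) activation events
    rules:  gene -> (target_gene, delay) activation rule
    """
    queue = list(events)
    heapq.heapify(queue)
    active, log = set(), []
    while queue:
        t, gene = heapq.heappop(queue)       # always process earliest event
        if t > horizon or gene in active:
            continue
        active.add(gene)
        log.append((t, gene))
        if gene in rules:                    # activation propagates after a delay
            target, delay = rules[gene]
            heapq.heappush(queue, (t + delay, target))
    return log

# Hypothetical regulators: A activates B after 1.5 time units, B activates C after 2.0
log = simulate([(0.0, "A")], {"A": ("B", 1.5), "B": ("C", 2.0)})
print(log)  # [(0.0, 'A'), (1.5, 'B'), (3.5, 'C')]
```

In the real framework each pathway entity would be an agent with its own behaviour rather than an entry in a shared rule table, but the event-queue core is the same principle.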

  11. Comparison and Contrast of Two General Functional Regression Modeling Frameworks

    PubMed Central

    Morris, Jeffrey S.

    2017-01-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502

  12. Comparison and Contrast of Two General Functional Regression Modeling Frameworks.

    PubMed

    Morris, Jeffrey S

    2017-02-01

    In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable.

  13. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, here, we report a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating the metadata model based on architecture content framework (ACF), DoDAF metamodel (DM2) and Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationship among metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed in the paper.

  14. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can feasibly learn the relationships between biochemical reactants qualitatively and make the models replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.
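The quantitative half of this approach, tuning kinetic rates by simulated annealing, can be sketched on a toy one-parameter decay model (all values, including the target rate of 0.8, are illustrative and not from the paper):

```python
import math
import random

def model(k, t):            # toy first-order decay, stand-in for a biochemical model
    return math.exp(-k * t)

times = [0.5, 1.0, 2.0, 4.0]
target = [model(0.8, t) for t in times]      # synthetic "observed" behaviour

def cost(k):                                 # sum of squared errors vs target
    return sum((model(k, t) - y) ** 2 for t, y in zip(times, target))

def anneal(k0, temp=1.0, cooling=0.95, steps=2000, seed=1):
    rng = random.Random(seed)
    k, best = k0, k0
    for _ in range(steps):
        cand = abs(k + rng.gauss(0, 0.1))    # perturb the rate constant
        d = cost(cand) - cost(k)
        if d < 0 or rng.random() < math.exp(-d / temp):
            k = cand                         # accept better, or worse with prob e^(-d/T)
        if cost(k) < cost(best):
            best = k
        temp *= cooling                      # cool the temperature
    return best

print(anneal(0.1))  # typically close to the true rate 0.8
```

In the framework described, the structure optimized by this inner loop would itself come from the evolution-strategy-based qualitative learner.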

  15. Koopman Operator Framework for Time Series Modeling and Analysis

    NASA Astrophysics Data System (ADS)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
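One standard route to Koopman spectral properties directly from data is dynamic mode decomposition (DMD); a minimal sketch (not the authors' implementation) estimates the eigenvalues of the linear operator mapping one snapshot to the next:

```python
import numpy as np

def dmd_eigs(X, rank=None):
    """Koopman/DMD eigenvalues from a snapshot matrix X (states x time)."""
    X1, X2 = X[:, :-1], X[:, 1:]                 # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    r = rank or len(s)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    # Low-rank approximation of the operator A with X2 ~ A X1
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    return np.linalg.eigvals(A_tilde)

# A pure rotation: the identified eigenvalues should lie on the unit circle
theta = 0.3
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = np.array([1.0, 0.0])
snaps = [x]
for _ in range(20):
    x = A @ x
    snaps.append(x)
eigs = dmd_eigs(np.array(snaps).T)
print(np.abs(eigs))  # both magnitudes ~ 1 (neutrally stable oscillation)
```

The spectra (and modes) recovered this way are the kind of "model form" invariants on which distances for classification and clustering can be defined.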

  16. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic-supported matching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  17. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic-supported matching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  18. A modelling framework to simulate foliar fungal epidemics using functional–structural plant models

    PubMed Central

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-01-01

    Background and Aims Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional–structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Methods Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant–environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Key Results Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. 
Conclusions This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both previously developed models for individual aspects of pathosystems and new ones. Complex models are deconstructed into separate ‘knowledge sources’ originating from different specialist areas of expertise and these can be shared and reassembled into multidisciplinary models. The framework thus provides a beneficial tool for a potential diverse and dynamic research community. PMID:24925323
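As a rough illustration of the kind of generic lesion/dispersal-unit data structure described, the core objects can be caricatured as follows (class names, the growth rule, and the emission threshold are all invented, not taken from the OpenAlea implementation):

```python
class DispersalUnit:
    """A propagule emitted by a lesion, to be carried by a dispersal model."""
    def __init__(self, position=None):
        self.position = position

class Lesion:
    """A fungal lesion growing on a plant organ."""
    def __init__(self, age=0.0, area=0.0):
        self.age, self.area = age, area

    def update(self, dt, microclimate):
        # Toy growth rule: wetter microclimate -> faster lesion expansion
        self.age += dt
        self.area += dt * microclimate["wetness"] * 0.1

    def emit(self):
        # Toy rule: mature lesions (age > 1) produce one dispersal unit
        return [DispersalUnit()] if self.age > 1.0 else []

# Three time steps with a fully wet canopy surface
leaf_lesions = [Lesion()]
for _ in range(3):
    for lesion in leaf_lesions:
        lesion.update(dt=1.0, microclimate={"wetness": 1.0})
units = [u for lesion in leaf_lesions for u in lesion.emit()]
print(len(units))  # 1
```

In the actual framework, the `update`/`emit` protocol is what lets contrasting pathosystem models plug into the same canopy and microclimate objects.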

  19. Documentation for the MODFLOW 6 framework

    USGS Publications Warehouse

    Hughes, Joseph D.; Langevin, Christian D.; Banta, Edward R.

    2017-08-10

    MODFLOW is a popular open-source groundwater flow model distributed by the U.S. Geological Survey. Growing interest in surface and groundwater interactions, local refinement with nested and unstructured grids, karst groundwater flow, solute transport, and saltwater intrusion has led to the development of numerous MODFLOW versions. Oftentimes, these different MODFLOW versions are incompatible with one another. This report describes a new MODFLOW framework called MODFLOW 6 that is designed to support multiple models and multiple types of models. The framework is written in Fortran using a modular object-oriented design. The primary framework components include the simulation (or main program), Timing Module, Solutions, Models, Exchanges, and Utilities. The first version of the framework focuses on numerical solutions, numerical models, and numerical exchanges. This focus on numerical models allows multiple numerical models to be tightly coupled at the matrix level.
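The component layout the report describes (Simulation, Solutions, Models, Exchanges) can be caricatured in a few classes; MODFLOW 6 itself is object-oriented Fortran, and the names and methods below are purely illustrative, not the framework's API:

```python
class Model:
    """A numerical model that contributes terms to its solution's matrix."""
    def __init__(self, name):
        self.name = name

    def formulate(self):
        return f"{self.name} terms"

class Exchange:
    """Couples two models; in MODFLOW 6 this happens at the matrix level."""
    def __init__(self, m1, m2):
        self.m1, self.m2 = m1, m2

class Solution:
    """Assembles and solves all attached models together each time step
    (exchanges are held here but the actual matrix coupling is omitted)."""
    def __init__(self):
        self.models, self.exchanges = [], []

    def solve(self, step):
        return [m.formulate() for m in self.models]

class Simulation:
    """Main program: owns the timing loop and the solutions."""
    def __init__(self, nsteps):
        self.nsteps, self.solutions = nsteps, []

    def run(self):
        for step in range(self.nsteps):
            for solution in self.solutions:
                solution.solve(step)

sim = Simulation(nsteps=2)
sol = Solution()
sol.models = [Model("gwf-1"), Model("gwf-2")]
sol.exchanges = [Exchange(*sol.models)]
sim.solutions.append(sol)
sim.run()
print(sol.solve(0))  # ['gwf-1 terms', 'gwf-2 terms']
```

The point of the layout is the one the abstract makes: because both models sit in one Solution, they can be coupled tightly in a single matrix rather than iterated loosely.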

  20. A conceptual modeling framework for discrete event simulation using hierarchical control structures.

    PubMed

    Furian, N; O'Sullivan, M; Walker, C; Vössner, S; Neubacher, D

    2015-08-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM's applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model's system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example.
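The separation this framework emphasizes, system behavior versus dispatching and control policies, can be illustrated with a minimal event-list simulation in which the policy is a swappable function (a generic DES sketch, not the HCCM framework itself):

```python
import heapq

def run_des(arrivals, service_time, policy):
    """Single server; `policy` decides which queued job is dispatched next."""
    events = [(t, "arrival", j) for j, t in enumerate(arrivals)]
    heapq.heapify(events)
    queue, busy, order = [], False, []

    def start(t):
        nonlocal busy
        j = policy(queue)            # control/dispatching decision, separated out
        queue.remove(j)
        order.append(j)
        busy = True
        heapq.heappush(events, (t + service_time, "done", j))

    while events:
        t, kind, j = heapq.heappop(events)
        if kind == "arrival":
            queue.append(j)
            if not busy:
                start(t)
        else:                        # "done": server frees up, dispatch again
            busy = False
            if queue:
                start(t)
    return order

# Same system behaviour, two different control policies
fifo = lambda q: q[0]
lifo = lambda q: q[-1]
print(run_des([0.0, 0.1, 0.2], 1.0, fifo))  # [0, 1, 2]
print(run_des([0.0, 0.1, 0.2], 1.0, lifo))  # [0, 2, 1]
```

Making the policy an explicit, named object in the conceptual model, rather than burying it in the event logic, is the kind of structured representation the framework argues for.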

  1. A framework for modelling the complexities of food and water security under globalisation

    NASA Astrophysics Data System (ADS)

    Dermody, Brian J.; Sivapalan, Murugesu; Stehfest, Elke; van Vuuren, Detlef P.; Wassen, Martin J.; Bierkens, Marc F. P.; Dekker, Stefan C.

    2018-01-01

    We present a new framework for modelling the complexities of food and water security under globalisation. The framework sets out a method to capture regional and sectoral interdependencies and cross-scale feedbacks within the global food system that contribute to emergent water use patterns. The framework integrates aspects of existing models and approaches in the fields of hydrology and integrated assessment modelling. The core of the framework is a multi-agent network of city agents connected by infrastructural trade networks. Agents receive socio-economic and environmental constraint information from integrated assessment models and hydrological models respectively and simulate complex, socio-environmental dynamics that operate within those constraints. The emergent changes in food and water resources are aggregated and fed back to the original models with minimal modification of the structure of those models. It is our conviction that the framework presented can form the basis for a new wave of decision tools that capture complex socio-environmental change within our globalised world. In doing so they will contribute to illuminating pathways towards a sustainable future for humans, ecosystems and the water they share.

  2. Public Acceptance and User Response to ATIS Products and Services: Modeling Framework and Data Requirements

    DOT National Transportation Integrated Search

    1993-12-01

    This report presents a comprehensive modeling framework for user responses to Advanced Traveler Information Systems (ATIS) services and identifies the data needs for the validation of such a framework. The authors present overviews of the framework b...

  3. Models of Recognition, Repetition Priming, and Fluency: Exploring a New Framework

    ERIC Educational Resources Information Center

    Berry, Christopher J.; Shanks, David R.; Speekenbrink, Maarten; Henson, Richard N. A.

    2012-01-01

    We present a new modeling framework for recognition memory and repetition priming based on signal detection theory. We use this framework to specify and test the predictions of 4 models: (a) a single-system (SS) model, in which one continuous memory signal drives recognition and priming; (b) a multiple-systems-1 (MS1) model, in which completely…
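The single-system (SS) idea, one continuous memory-strength signal plus task-specific noise driving both recognition and priming, can be sketched numerically; the parameter values below are invented for illustration, not the authors' fits:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
f = rng.normal(0.5, 1.0, n)          # shared memory strength for studied items
jd = rng.normal(0.0, 0.5, n)         # noise specific to the recognition task
jp = rng.normal(0.0, 0.5, n)         # noise specific to the priming task

recognition = f + jd                 # recognition judgement signal
priming_rt = 700 - 30 * (f + jp)     # stronger memory -> faster identification RT

old_called = recognition > 0.5       # recognition decision criterion
# SS prediction: items called "old" show more priming (faster RTs) because
# both measures share the single strength signal f
print(priming_rt[old_called].mean() < priming_rt[~old_called].mean())  # True
```

Multiple-systems variants break this dependency by giving recognition and priming uncorrelated (or only partially correlated) strength signals, which is exactly the contrast the framework is built to test.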

  4. A methodological framework to support the initiation, design and institutionalization of participatory modeling processes in water resources management

    NASA Astrophysics Data System (ADS)

    Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan

    2018-01-01

    Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.

  5. Developing a theoretical framework for complex community-based interventions.

    PubMed

    Angeles, Ricardo N; Dolovich, Lisa; Kaczorowski, Janusz; Thabane, Lehana

    2014-01-01

    Applying existing theories to research, in the form of a theoretical framework, is necessary to advance knowledge from what is already known toward the next steps to be taken. This article proposes a guide on how to develop a theoretical framework for complex community-based interventions using the Cardiovascular Health Awareness Program as an example. Developing a theoretical framework starts with identifying the intervention's essential elements. Subsequent steps include the following: (a) identifying and defining the different variables (independent, dependent, mediating/intervening, moderating, and control); (b) postulating mechanisms how the independent variables will lead to the dependent variables; (c) identifying existing theoretical models supporting the theoretical framework under development; (d) scripting the theoretical framework into a figure or sets of statements as a series of hypotheses, if/then logic statements, or a visual model; (e) content and face validation of the theoretical framework; and (f) revising the theoretical framework. In our example, we combined the "diffusion of innovation theory" and the "health belief model" to develop our framework. Using the Cardiovascular Health Awareness Program as the model, we demonstrated a stepwise process of developing a theoretical framework. The challenges encountered are described, and an overview of the strategies employed to overcome these challenges is presented.

  6. Technology-induced errors. The current use of frameworks and models from the biomedical and life sciences literatures.

    PubMed

    Borycki, E M; Kushniruk, A W; Bellwood, P; Brender, J

    2012-01-01

    The objective of this paper is to examine the extent, range and scope to which frameworks, models and theories dealing with technology-induced error have arisen in the biomedical and life sciences literature as indexed by Medline®. To better understand the state of work in the area of technology-induced error involving frameworks, models and theories, the authors conducted a search of Medline® using selected key words identified from seminal articles in this research area. Articles were reviewed and those pertaining to frameworks, models or theories dealing with technology-induced error were further reviewed by two researchers. All articles from Medline® from its inception to April of 2011 were searched using the above outlined strategy. 239 citations were returned. Each of the abstracts for the 239 citations was reviewed by two researchers. Eleven articles met the criteria based on abstract review. These 11 articles were downloaded for further in-depth review. The majority of the articles obtained describe frameworks and models with reference to theories developed in other literatures outside of healthcare. The papers were grouped into several areas. It was found that articles drew mainly from three literatures: 1) the human factors literature (including human-computer interaction and cognition), 2) the organizational behavior/sociotechnical literature, and 3) the software engineering literature. A variety of frameworks and models were found in the biomedical and life sciences literatures. These frameworks and models drew upon and extended frameworks, models and theoretical perspectives that have emerged in other literatures. These frameworks and models are informing an emerging line of research in health and biomedical informatics involving technology-induced errors in healthcare.

  7. Template-Based Geometric Simulation of Flexible Frameworks

    PubMed Central

    Wells, Stephen A.; Sartbaeva, Asel

    2012-01-01

    Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055

  8. A Flexible Modeling Framework For Hydraulic and Water Quality Performance Assessment of Stormwater Green Infrastructure

    EPA Science Inventory

    A flexible framework has been created for modeling multi-dimensional hydrological and water quality processes within stormwater green infrastructures (GIs). The framework models a GI system using a set of blocks (spatial features) and connectors (interfaces) representing differen...

  9. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...

  10. Space-Time Processing for Tactical Mobile Ad Hoc Networks

    DTIC Science & Technology

    2008-08-01

    ...a vision for multiple concurrent communication settings, i.e., a many-to-many framework where multi-packet transmissions (MPTs) and multi-packet receptions are possible... We have introduced the first unified modeling framework for the computation of fundamental limits on capacity-delay tradeoffs... extending the multi-packet modelling framework to account for the use of multi-packet reception (MPR) in ad hoc networks with MPT...

  11. A conceptual modeling framework for discrete event simulation using hierarchical control structures

    PubMed Central

    Furian, N.; O’Sullivan, M.; Walker, C.; Vössner, S.; Neubacher, D.

    2015-01-01

    Conceptual Modeling (CM) is a fundamental step in a simulation project. Nevertheless, it is only recently that structured approaches towards the definition and formulation of conceptual models have gained importance in the Discrete Event Simulation (DES) community. As a consequence, frameworks and guidelines for applying CM to DES have emerged and discussion of CM for DES is increasing. However, both the organization of model-components and the identification of behavior and system control from standard CM approaches have shortcomings that limit CM’s applicability to DES. Therefore, we discuss the different aspects of previous CM frameworks and identify their limitations. Further, we present the Hierarchical Control Conceptual Modeling framework that pays more attention to the identification of a model’s system behavior, control policies and dispatching routines and their structured representation within a conceptual model. The framework guides the user step-by-step through the modeling process and is illustrated by a worked example. PMID:26778940

  12. Advances in the spatially distributed AgES-W model: parallel computation, Java Connection Framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    USDA-ARS?s Scientific Manuscript database

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  13. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  14. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  15. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance

    PubMed Central

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y.; Fairchild, Geoffrey; Hyman, James M.; Kiang, Richard; Morse, Andrew P.; Pancerella, Carmen M.; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models. PMID:26820405

  16. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the Use of Epidemiological Models for Infectious Disease Surveillance.

    PubMed

    Margevicius, Kristen J; Generous, Nicholas; Abeyta, Esteban; Althouse, Ben; Burkom, Howard; Castro, Lauren; Daughton, Ashlynn; Del Valle, Sara Y; Fairchild, Geoffrey; Hyman, James M; Kiang, Richard; Morse, Andrew P; Pancerella, Carmen M; Pullum, Laura; Ramanathan, Arvind; Schlegelmilch, Jeffrey; Scott, Aaron; Taylor-McCabe, Kirsten J; Vespignani, Alessandro; Deshpande, Alina

    2016-01-01

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  17. The Biosurveillance Analytics Resource Directory (BARD): Facilitating the use of epidemiological models for infectious disease surveillance

    DOE PAGES

    Margevicius, Kristen J.; Generous, Nicholas; Abeyta, Esteban; ...

    2016-01-28

    Epidemiological modeling for infectious disease is important for disease management and its routine implementation needs to be facilitated through better description of models in an operational context. A standardized model characterization process that allows selection or making manual comparisons of available models and their results is currently lacking. A key need is a universal framework to facilitate model description and understanding of its features. Los Alamos National Laboratory (LANL) has developed a comprehensive framework that can be used to characterize an infectious disease model in an operational context. The framework was developed through a consensus among a panel of subject matter experts. In this paper, we describe the framework, its application to model characterization, and the development of the Biosurveillance Analytics Resource Directory (BARD; http://brd.bsvgateway.org/brd/), to facilitate the rapid selection of operational models for specific infectious/communicable diseases. We offer this framework and associated database to stakeholders of the infectious disease modeling field as a tool for standardizing model description and facilitating the use of epidemiological models.

  18. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

    The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with realistic 3D oceanographic models of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and, more recently, posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
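    The Lagrangian core of such a framework can be illustrated with a minimal sketch (this is not IBMlib's actual interface; the solid-body-rotation flow below is a hypothetical stand-in for an oceanographic velocity field):

```python
import numpy as np

def advect(positions, velocity_field, dt, n_steps):
    """Forward-Euler Lagrangian advection of particles.

    positions: (n, 2) array of particle (x, y) coordinates.
    velocity_field: callable mapping an (n, 2) array to (n, 2) velocities.
    Returns the full trajectory as an (n_steps + 1, n, 2) array.
    """
    traj = [positions.copy()]
    for _ in range(n_steps):
        positions = positions + dt * velocity_field(positions)
        traj.append(positions.copy())
    return np.stack(traj)

# Hypothetical solid-body-rotation flow as a stand-in for ocean currents.
def rotation(p):
    return np.column_stack([-p[:, 1], p[:, 0]])

start = np.array([[1.0, 0.0]])
path = advect(start, rotation, dt=0.01, n_steps=628)  # roughly one revolution
```

    A production framework would replace the analytic flow with interpolated velocities from a circulation model and attach an individual-level biological state (growth, mortality, behavior) to each particle.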

  19. Modeling Synergistic Drug Inhibition of Mycobacterium tuberculosis Growth in Murine Macrophages

    DTIC Science & Technology

    2011-01-01

    important application of metabolic network modeling is the ability to quantitatively model metabolic enzyme inhibition and predict bacterial growth...describe the extensions of this framework to model drug-induced growth inhibition of M. tuberculosis in macrophages. Mathematical framework: Fig. 1 shows...starting point, we used the previously developed iNJ661v model to represent the metabolic...Fig. 1: Mathematical framework: a set of coupled models used to

  20. Hydrogeology of a Biosolids-Application Site Near Deer Trail, Colorado, 1993-99

    USGS Publications Warehouse

    Yager, Tracy J.B.; Arnold, L. Rick

    2003-01-01

    This report presents hydrogeology data and interpretations resulting from two studies related to biosolids applications at the Metro Wastewater Reclamation District property near Deer Trail, Colorado, done by the U.S. Geological Survey in cooperation with the Metro Wastewater Reclamation District: (1) a 1993-99 study of hydrology and water quality for the Metro Wastewater Reclamation District central property and (2) a 1999 study of regional bedrock-aquifer structure and local ground-water recharge. Biosolids were applied as a fertilizer during late 1993 through 1999. The 1993 Metro Wastewater Reclamation District property boundary constitutes the study area, but hydrogeologic structure maps for a much larger area are included in the report. The study area is located on the eastern margin of the Denver Basin, a bowl-shaped sequence of sedimentary rocks. The uppermost bedrock formations in the vicinity of the study area consist of the Pierre Shale, the Fox Hills Sandstone, and the Laramie Formation, parts of which comprise the Laramie-Fox Hills hydrostratigraphic unit and thus, where saturated, the Laramie-Fox Hills aquifer. In the vicinity of the study area, the Laramie-Fox Hills hydrostratigraphic unit dips gently to the northwest, crops out, and is partially eroded. The Laramie-Fox Hills aquifer is either absent or not fully saturated within the Metro Wastewater Reclamation District properties, although this aquifer is the principal aquifer used for domestic supply in the vicinity of the study area. Yield was small from two deep monitoring wells in the Laramie-Fox Hills aquifer within the study area. Depth to water in these wells was about 110 and 150 feet below land surface, and monthly water levels fluctuated 0.5 foot or less. Alluvial aquifers also are present in the unconsolidated sand and loess deposits in the valleys of the study area. 
Interactions of the deeper parts of the Laramie-Fox Hills aquifer with shallow ground water in the study area include a general close hydraulic connection between alluvial and bedrock aquifers, recharge of the Cottonwood Creek and much of the Muddy Creek alluvial aquifers by the bedrock aquifer, and possible recharge of the bedrock aquifer by a Rattlesnake Creek tributary. Some areas of shallow ground water were recharged by infiltration from rain or ponds, but other areas likely were recharged by other ground water. Data for shallow ground water indicate that ground-water recharge takes less than a day at some sites to about 40 years at another site. Depth to shallow ground water in the study area ranged from about 2 feet to about 37 feet below land surface. Shallow ground-water levels likely were affected by evapotranspiration. Ground water is present in shallow parts of the bedrock aquifer or in alluvial aquifers in four drainage basins: Badger Creek, Cottonwood Creek, Muddy Creek, and Rattlesnake Creek. These drainage basins generally contained only ephemeral streams, which flow only after intense rain.

  1. An Integrated Modeling Framework Forecasting Ecosystem Services: Application to the Albemarle Pamlico Basins, NC and VA (USA)

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  2. An Integrated Modeling Framework Forecasting Ecosystem Services--Application to the Albemarle Pamlico Basins, NC and VA (USA) and Beyond

    EPA Science Inventory

    We demonstrate an Integrated Modeling Framework that predicts the state of freshwater ecosystem services within the Albemarle-Pamlico Basins. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standa...

  3. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    NASA Technical Reports Server (NTRS)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  4. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically, environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making, applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced-order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
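    ZEM itself is written in Julia; the surrogate-modeling idea it describes — training a cheap support vector regression on runs of an expensive process model — can be sketched as follows (the "expensive model" here is a hypothetical one-dimensional function, not a ZEM component):

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical "expensive" process model; a real one might be a
# groundwater-transport simulation taking hours per run.
def expensive_model(x):
    return np.sin(3 * x) + 0.5 * x

# Train the surrogate on a small set of full-model evaluations.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 2, size=(40, 1))
y_train = expensive_model(X_train).ravel()

surrogate = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X_train, y_train)

# The cheap surrogate now stands in for the full model in iterative analyses
# such as calibration or decision analysis.
X_new = np.linspace(0, 2, 5).reshape(-1, 1)
approx = surrogate.predict(X_new)
exact = expensive_model(X_new).ravel()
```

    The payoff is that each surrogate evaluation costs microseconds, so optimization or uncertainty analysis that would need thousands of full-model runs becomes tractable.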

  5. Model-Based Reasoning in Upper-division Lab Courses

    NASA Astrophysics Data System (ADS)

    Lewandowski, Heather

    2015-05-01

    Modeling, which includes developing, testing, and refining models, is a central activity in physics. Well-known examples from AMO physics include everything from the Bohr model of the hydrogen atom to the Bose-Hubbard model of interacting bosons in a lattice. Modeling, while typically considered a theoretical activity, is most fully represented in the laboratory where measurements of real phenomena intersect with theoretical models, leading to refinement of models and experimental apparatus. However, experimental physicists use models in complex ways and the process is often not made explicit in physics laboratory courses. We have developed a framework to describe the modeling process in physics laboratory activities. The framework attempts to abstract and simplify the complex modeling process undertaken by expert experimentalists. The framework can be applied to understand typical processes such as the modeling of measurement tools, modeling ``black boxes,'' and signal processing. We demonstrate that the framework captures several important features of model-based reasoning in a way that can reveal common student difficulties in the lab and guide the development of curricula that emphasize modeling in the laboratory. We also use the framework to examine troubleshooting in the lab and guide students to effective methods and strategies.

  6. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
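    One of the listed components, orbital propagation, can be illustrated with a minimal two-body sketch (the framework's actual propagators and force models are not described in this abstract; this is a generic leapfrog integrator with invented initial conditions):

```python
import numpy as np

MU_EARTH = 398600.4418  # km^3/s^2, Earth's standard gravitational parameter

def propagate(r, v, dt, n_steps):
    """Leapfrog (kick-drift-kick) propagation of a two-body orbit.

    r: position vector in km; v: velocity vector in km/s; dt in seconds.
    """
    def accel(r):
        return -MU_EARTH * r / np.linalg.norm(r) ** 3
    for _ in range(n_steps):
        v = v + 0.5 * dt * accel(r)
        r = r + dt * v
        v = v + 0.5 * dt * accel(r)
    return r, v

# Invented circular orbit at 7000 km radius: v = sqrt(mu/r) ~ 7.55 km/s
r0 = np.array([7000.0, 0.0, 0.0])
v0 = np.array([0.0, np.sqrt(MU_EARTH / 7000.0), 0.0])
r1, v1 = propagate(r0, v0, dt=10.0, n_steps=100)
```

    A leapfrog scheme is chosen here because it approximately conserves orbital energy over long integrations; an operational SSA propagator would add perturbations such as Earth oblateness, drag, and third-body forces.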

  7. Frameworks for Assessing the Quality of Modeling and Simulation Capabilities

    NASA Astrophysics Data System (ADS)

    Rider, W. J.

    2012-12-01

    The importance of assuring quality in modeling and simulation has spawned several frameworks for structuring the examination of quality. The format and content of these frameworks provides an emphasis, completeness and flow to assessment activities. I will examine four frameworks that have been developed and describe how they can be improved and applied to a broader set of high consequence applications. Perhaps the first of these frameworks was known as CSAU [Boyack] (code scaling, applicability and uncertainty), used for nuclear reactor safety and endorsed by the United States Nuclear Regulatory Commission (USNRC). This framework was shaped by nuclear safety practice, and the practical structure needed after the Three Mile Island accident. It incorporated the dominant experimental program, the dominant analysis approach, and concerns about the quality of modeling. The USNRC gave it the force of law that made the nuclear industry take it seriously. After the cessation of nuclear weapons testing, the United States began a program of examining the reliability of these weapons without testing. This program utilizes science including theory, modeling, simulation and experimentation to replace the underground testing. The emphasis on modeling and simulation necessitated attention on the quality of these simulations. Sandia developed the PCMM (predictive capability maturity model) to structure this attention [Oberkampf]. PCMM divides simulation into six core activities to be examined and graded relative to the needs of the modeling activity. NASA [NASA] has built yet another framework in response to the tragedy of the space shuttle accidents. Finally, Ben-Haim and Hemez focus upon modeling robustness and predictive fidelity in another approach. These frameworks are similar, and applied in a similar fashion. The adoption of these frameworks at Sandia and NASA has been slow and arduous because the force of law has not assisted acceptance.
All existing frameworks are incomplete and need to be extended by incorporating elements from the others, as well as new elements related to how models are solved and how the model will be applied. I will describe this merger of approaches and how it should be applied. The problems in adoption are related to basic human nature: no one likes to be graded, or told they are not sufficiently quality oriented. Rather than engage in an adversarial role, I suggest that the frameworks be viewed as collaborative tools, used to structure collaborations that help modeling and simulation efforts achieve high quality. The framework provides a comprehensive setting of modeling and simulation themes that should be explored in providing high quality. W. Oberkampf, M. Pilch, and T. Trucano, Predictive Capability Maturity Model for Computational Modeling and Simulation, SAND2007-5948, 2007. B. Boyack, Quantifying Reactor Safety Margins Part 1: An Overview of the Code Scaling, Applicability, and Uncertainty Evaluation Methodology, Nuc. Eng. Design, 119, pp. 1-15, 1990. National Aeronautics and Space Administration, Standard for Models and Simulations, NASA-STD-7009, 2008. Y. Ben-Haim and F. Hemez, Robustness, fidelity and prediction-looseness of models, Proc. R. Soc. A (2012) 468, 227-244.

  8. PACS/information systems interoperability using Enterprise Communication Framework.

    PubMed

    alSafadi, Y; Lord, W P; Mankovich, N J

    1998-06-01

    Interoperability among healthcare applications goes beyond connectivity to allow components to exchange structured information and work together in a predictable, coordinated fashion. To facilitate building an interoperability infrastructure, an Enterprise Communication Framework (ECF) was developed by the members of the Andover Working Group for Healthcare Interoperability (AWG-OHI). The ECF consists of four models: 1) Use Case Model, 2) Domain Information Model (DIM), 3) Interaction Model, and 4) Message Model. To realize this framework, a software component called the Enterprise Communicator (EC) is used. In this paper, we will demonstrate the use of the framework in interoperating a picture archiving and communication system (PACS) with a radiology information system (RIS).

  9. Modeling Nonlinear Change via Latent Change and Latent Acceleration Frameworks: Examining Velocity and Acceleration of Growth Trajectories

    ERIC Educational Resources Information Center

    Grimm, Kevin; Zhang, Zhiyong; Hamagami, Fumiaki; Mazzocco, Michele

    2013-01-01

    We propose the use of the latent change and latent acceleration frameworks for modeling nonlinear growth in structural equation models. Moving to these frameworks allows for the direct identification of "rates of change" and "acceleration" in latent growth curves--information available indirectly through traditional growth…

  10. The Smoothed Dirichlet Distribution: Understanding Cross-Entropy Ranking in Information Retrieval

    DTIC Science & Technology

    2006-07-01

    Unigram language modeling is a successful probabilistic framework for Information Retrieval (IR) that uses...the Relevance model (RM), a state-of-the-art model for IR in the language modeling framework that uses the same cross-entropy as its ranking function...In addition, the SD-based classifier provides more flexibility than RM in modeling documents owing to a consistent generative framework. We
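    The cross-entropy ranking function referred to above is, up to a query-dependent constant, the standard query-likelihood score of the language modeling approach to IR. A minimal sketch with Dirichlet smoothing (toy corpus and smoothing parameter chosen for illustration; this is the baseline technique, not the thesis's Smoothed Dirichlet model itself):

```python
import math
from collections import Counter

def score(query, doc, collection, mu=10.0):
    """Query log-likelihood under a Dirichlet-smoothed document language
    model. Ranking documents by this score is equivalent (up to a
    query-dependent constant) to ranking by negative cross-entropy between
    the query model and the smoothed document model. Real systems typically
    use a much larger mu (~2000); mu=10 suits this tiny toy corpus."""
    d, c = Counter(doc), Counter(collection)
    dlen, clen = len(doc), len(collection)
    s = 0.0
    for w in query:
        p = (d[w] + mu * c[w] / clen) / (dlen + mu)  # smoothed P(w | doc)
        s += math.log(p)
    return s

docs = {
    "d1": "the quick brown fox jumps over the lazy dog".split(),
    "d2": "information retrieval with language models".split(),
}
collection = [w for d in docs.values() for w in d]
query = "language retrieval".split()
ranked = sorted(docs, key=lambda k: score(query, docs[k], collection),
                reverse=True)
```

    Smoothing with collection statistics is what keeps unseen query terms from zeroing out a document's score, which is the role the smoothed distribution plays in this framework.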

  11. Modelling Participatory Geographic Information System for Customary Land Conflict Resolution

    NASA Astrophysics Data System (ADS)

    Gyamera, E. A.; Arko-Adjei, A.; Duncan, E. E.; Kuma, J. S. Y.

    2017-11-01

    Since land contributes about 73 % of most countries' Gross Domestic Product (GDP), attention to land rights has increased tremendously worldwide. Conflicts over land have therefore become part of the major problems associated with land administration. However, the conventional mechanisms for land conflict resolution do not provide satisfactory results to disputants due to various factors. This study sought to develop a framework for using Participatory Geographic Information System (PGIS) for customary land conflict resolution. The framework was modelled using Unified Modelling Language (UML). The PGIS framework, called the butterfly model, consists of three units, namely a Social Unit (SU), a Technical Unit (TU) and a Decision Making Unit (DMU). The name butterfly model was adopted for the framework based on its features and properties. The framework is therefore recommended for adoption for land conflict resolution in customary areas.

  12. New model framework and structure and the commonality evaluation model. [concerning unmanned spacecraft projects

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The development of a framework and structure for shuttle era unmanned spacecraft projects and the development of a commonality evaluation model is documented. The methodology developed for model utilization in performing cost trades and comparative evaluations for commonality studies is discussed. The model framework consists of categories of activities associated with the spacecraft system's development process. The model structure describes the physical elements to be treated as separate identifiable entities. Cost estimating relationships for subsystem and program-level components were calculated.

  13. An Illustrative Guide to the Minerva Framework

    NASA Astrophysics Data System (ADS)

    Flom, Erik; Leonard, Patrick; Hoeffel, Udo; Kwak, Sehyun; Pavone, Andrea; Svensson, Jakob; Krychowiak, Maciej; Wendelstein 7-X Team Collaboration

    2017-10-01

    Modern physics experiments require tracking and modelling data and their associated uncertainties on a large scale, as well as the combined implementation of multiple independent data streams for sophisticated modelling and analysis. The Minerva Framework offers a centralized, user-friendly method of large-scale physics modelling and scientific inference. Currently used by teams at multiple large-scale fusion experiments including the Joint European Torus (JET) and Wendelstein 7-X (W7-X), the Minerva framework provides a forward-model friendly architecture for developing and implementing models for large-scale experiments. One aspect of the framework involves so-called data sources, which are nodes in the graphical model. These nodes are supplied with engineering and physics parameters. When end-user level code calls a node, it is checked network-wide against its dependent nodes for changes since its last implementation and returns version-specific data. Here, a filterscope data node is used as an illustrative example of the Minerva Framework's data management structure and its further application to Bayesian modelling of complex systems. This work has been carried out within the framework of the EUROfusion Consortium and has received funding from the Euratom research and training programme 2014-2018 under Grant Agreement No. 633053.

  14. The Goddard Snow Radiance Assimilation Project: An Integrated Snow Radiance and Snow Physics Modeling Framework for Snow/cold Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard C.; Foster, J.; Hall, D.; Riggs, G.

    2006-01-01

    Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.
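    The core idea above — updating the model state directly from an observed radiance via a forward model, rather than inverting a retrieval first — can be sketched with a scalar Kalman-style analysis step (the linear forward model and all numbers below are invented for illustration; real snow radiative transfer models are strongly nonlinear):

```python
# Scalar Kalman analysis step: update a snow water equivalent (SWE)
# estimate from an observed brightness temperature Tb via a hypothetical
# linear forward model Tb = a * swe + b.

def assimilate(swe_prior, var_prior, tb_obs, var_obs, a, b):
    h = a                                        # Jacobian of the forward model
    innovation = tb_obs - (a * swe_prior + b)    # observed minus predicted Tb
    gain = var_prior * h / (h * var_prior * h + var_obs)
    swe_post = swe_prior + gain * innovation     # analysis state
    var_post = (1 - gain * h) * var_prior        # analysis uncertainty
    return swe_post, var_post

# Invented numbers: prior SWE 100 mm with variance 400 mm^2; forward model
# Tb = -0.5 * swe + 260 K; observed Tb = 205 K with variance 4 K^2.
swe, var = assimilate(swe_prior=100.0, var_prior=400.0,
                      tb_obs=205.0, var_obs=4.0, a=-0.5, b=260.0)
```

    Because the observed radiance is colder than predicted and the forward model's sensitivity is negative, the analysis raises the SWE estimate and shrinks its variance; a full assimilation system applies the same update with vector states, ensemble-estimated covariances, and a nonlinear radiative transfer forward model.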

  15. Framework for assessing key variable dependencies in loose-abrasive grinding and polishing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J.S.; Aikens, D.M.; Brown, N.J.

    1995-12-01

    This memo describes a framework for identifying all key variables that determine the figuring performance of loose-abrasive lapping and polishing machines. This framework is intended as a tool for prioritizing R&D issues, assessing the completeness of process models and experimental data, and for providing a mechanism to identify any assumptions in analytical models or experimental procedures. Future plans for preparing analytical models or performing experiments can refer to this framework in establishing the context of the work.

  16. Model-based reasoning in the physics laboratory: Framework and initial results

    NASA Astrophysics Data System (ADS)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  17. Generalized Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew

    2004-01-01

    A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…

  18. [Computer aided design and rapid manufacturing of removable partial denture frameworks].

    PubMed

    Han, Jing; Lü, Pei-jun; Wang, Yong

    2010-08-01

    To introduce a method of digital modeling and fabricating removable partial denture (RPD) frameworks using self-developed software for RPD design and a rapid manufacturing system. The three-dimensional data of two partially dentate dental casts were obtained using a three-dimensional cross-section scanner. A self-developed software package for RPD design was used to decide the path of insertion and to design the different components of the RPD frameworks. The components included occlusal rest, clasp, lingual bar, polymeric retention framework and maxillary major connector. The design procedure for the components was as follows: first, determine the outline of the component; second, build the tissue surface of the component using the scanned data within the outline; third, use a preset cross section to produce the polished surface. Finally, the different RPD components were modeled respectively and connected by minor connectors to form an integrated RPD framework. The finished data were imported into a self-developed selective laser melting (SLM) machine and metal frameworks were fabricated directly. RPD frameworks for the two scanned dental casts were modeled with this self-developed program and metal RPD frameworks were successfully fabricated using the SLM method. The finished metal frameworks fit well on the plaster models. The self-developed computer aided design and computer aided manufacture (CAD-CAM) system for RPD design and fabrication has completely independent intellectual property rights. It provides a new method of manufacturing metal RPD frameworks.

  19. Calibration and Propagation of Uncertainty for Independence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham

    This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented in the context of the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.

  20. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services, together with an active model for the concrete implementation of the framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and to plan the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that (1) the proposed active service framework is efficient for timely and automated flood monitoring; (2) the active model, SMDSA, provides a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and (3) as much preliminary work as possible should be done to take full advantage of the active service framework and the active model.
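    The active warning component rests on a publish-subscribe pattern. A minimal sketch follows; the topic names and the 8.0 m threshold are invented, and the paper's implementation uses the OGC Sensor Event Service, not this code.

```python
# A minimal publish-subscribe sketch of the "active warning" idea: gauge
# readings are published to a topic, and an alert topic is notified only
# when a (hypothetical) threshold is crossed.

class EventBus:
    def __init__(self):
        self.subscribers = {}          # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, value):
        for cb in self.subscribers.get(topic, []):
            cb(value)

alerts = []
bus = EventBus()
# Active warning: re-publish to the alert topic only above the threshold.
bus.subscribe("water_level", lambda v: v > 8.0 and bus.publish("flood_alert", v))
bus.subscribe("flood_alert", alerts.append)

for reading in [6.2, 7.9, 8.4, 9.1]:   # simulated gauge readings (m)
    bus.publish("water_level", reading)

print(alerts)                           # [8.4, 9.1]
```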

  1. GiPSi: a framework for open source/open architecture software development for organ-level surgical simulation.

    PubMed

    Cavuşoğlu, M Cenk; Göktekin, Tolga G; Tendick, Frank

    2006-04-01

    This paper presents the architectural details of an evolving open source/open architecture software framework for developing organ-level surgical simulations. Our goal is to facilitate shared development of reusable models, to accommodate heterogeneous models of computation, and to provide a framework for interfacing multiple heterogeneous models. The framework provides an application programming interface for interfacing dynamic models defined over spatial domains. It is specifically designed to be independent of the specifics of the modeling methods used, and therefore facilitates seamless integration of heterogeneous models and processes. Furthermore, each model has separate geometries for visualization, simulation, and interfacing, allowing the model developer to choose the most natural geometric representation for each case. Input/output interfaces for visualization and haptics for real-time interactive applications have also been provided.
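    The interfacing idea can be sketched as follows: dynamic models are driven through a uniform interface, each carrying separate geometries for simulation, visualization, and interfacing. The class and method names below are hypothetical, not GiPSi's actual C++ API.

```python
# A hypothetical sketch of the kind of API described: every dynamic model
# exposes time stepping plus purpose-specific geometries, so heterogeneous
# models can be advanced side by side behind one interface.

from abc import ABC, abstractmethod

class DynamicModel(ABC):
    @abstractmethod
    def step(self, dt: float) -> None:
        """Advance the model's internal state by dt seconds."""

    @abstractmethod
    def geometry(self, purpose: str):
        """Return the geometry used for 'simulation', 'visualization',
        or 'interfacing'; the three may differ in resolution."""

class MassSpringTissue(DynamicModel):
    def __init__(self):
        self.t = 0.0
        self._geoms = {"simulation": "coarse-mesh",
                       "visualization": "fine-mesh",
                       "interfacing": "boundary-mesh"}

    def step(self, dt):
        self.t += dt                    # placeholder dynamics

    def geometry(self, purpose):
        return self._geoms[purpose]

# A framework can drive any conforming model without knowing its internals:
models = [MassSpringTissue()]
for m in models:
    m.step(0.01)
```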

  2. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review existing models for various aspects of gene regulation and discuss the pros and cons of each model. In addition, network inference algorithms are surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions, and the connections and differences among the algorithms are discussed. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  3. Generic framework for mining cellular automata models on protein-folding simulations.

    PubMed

    Diaz, N.; Tischer, I.

    2016-05-13

    Cellular automata model identification is an important way of building simplified simulation models. In this study, we describe a generic architectural framework to ease the development process of new metaheuristic-based algorithms for cellular automata model identification in protein-folding trajectories. Our framework was developed by a methodology based on design patterns that allow an improved experience for new algorithms development. The usefulness of the proposed framework is demonstrated by the implementation of four algorithms, able to obtain extremely precise cellular automata models of the protein-folding process with a protein contact map representation. Dynamic rules obtained by the proposed approach are discussed, and future use for the new tool is outlined.
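    The core identification task can be illustrated in miniature: given one observed step of a 1-D binary automaton, read off the rule table from each neighborhood-to-next-state transition. The example below is a toy with a noise-free trajectory; realistic protein-folding trajectories require the metaheuristic search described in the paper.

```python
# Toy cellular automata model identification: recover (part of) an
# elementary CA rule table from a single observed transition.

def step(state, rule):
    """Apply an elementary CA rule (dict: 3-tuple -> 0/1), periodic bounds."""
    n = len(state)
    return [rule[(state[(i - 1) % n], state[i], state[(i + 1) % n])]
            for i in range(n)]

def identify_rule(before, after):
    """Infer the partial rule table consistent with one observed step."""
    n = len(before)
    rule = {}
    for i in range(n):
        nbhd = (before[(i - 1) % n], before[i], before[(i + 1) % n])
        rule[nbhd] = after[i]
    return rule

# Rule 110 as ground truth on a small ring of cells.
rule110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
           (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}
before = [0, 1, 1, 0, 1, 0, 0, 1]
after = step(before, rule110)
learned = identify_rule(before, after)
# Every learned transition agrees with the generating rule:
assert all(rule110[k] == v for k, v in learned.items())
```

Only neighborhoods that actually occur in the trajectory are recovered, which is why longer trajectories (and search over candidate rules) are needed in practice.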

  4. First-Order Frameworks for Managing Models in Engineering Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia M.; Lewis, Robert Michael

    2000-01-01

    Approximation/model management optimization (AMMO) is a rigorous methodology for attaining solutions of high-fidelity optimization problems with minimal expense in high-fidelity function and derivative evaluation. First-order AMMO frameworks allow for a wide variety of models and underlying optimization algorithms. Recent demonstrations with aerodynamic optimization achieved three-fold savings in terms of high-fidelity function and derivative evaluation in the case of variable-resolution models and five-fold savings in the case of variable-fidelity physics models. The savings are problem dependent, but certain trends are beginning to emerge. We give an overview of the first-order frameworks, current computational results, and the scope of applicability of the first-order frameworks.
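    The first-order consistency at the heart of such frameworks can be sketched as follows: a cheap model is additively corrected so that it matches the high-fidelity value and derivative at the current iterate. The functions below are invented stand-ins for expensive analyses, not AMMO's actual models.

```python
# First-order additive model correction: the corrected low-fidelity model
# agrees with the high-fidelity model to first order at the point x0,
# which is the consistency condition first-order AMMO frameworks rely on.

def f_hi(x):                  # stand-in for an expensive analysis
    return x**4

def df_hi(x):
    return 4 * x**3

def f_lo(x):                  # cheap surrogate
    return x**2

def df_lo(x):
    return 2 * x

def corrected_lo(x, x0):
    """Additive correction matching f_hi and f_hi' at x0."""
    alpha = f_hi(x0) - f_lo(x0)            # value mismatch at x0
    beta = df_hi(x0) - df_lo(x0)           # slope mismatch at x0
    return f_lo(x) + alpha + beta * (x - x0)

x0 = 1.5
# Zeroth-order consistency at the current iterate:
assert abs(corrected_lo(x0, x0) - f_hi(x0)) < 1e-12
# First-order consistency, checked by central difference:
h = 1e-6
slope = (corrected_lo(x0 + h, x0) - corrected_lo(x0 - h, x0)) / (2 * h)
assert abs(slope - df_hi(x0)) < 1e-4
```

An optimizer can then minimize the cheap corrected model inside a trust region, re-correcting at each new iterate, which is where the reported evaluation savings come from.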

  5. GeoFramework: A Modeling Framework for Solid Earth Geophysics

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.

    2003-12-01

    As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and make modeling software available to non-modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general-purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks, and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of existing codes together. The following codes are now being reengineered within the context of Pyre: Tecton, a 3-D FE visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D code based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES, an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and global model of mantle convection and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.

  6. Systematic narrative review of decision frameworks to select the appropriate modelling approaches for health economic evaluations.

    PubMed

    Tsoi, B; O'Reilly, D; Jegathisawaran, J; Tarride, J-E; Blackhouse, G; Goeree, R

    2015-06-17

    In constructing or appraising a health economic model, an early consideration is whether the modelling approach selected is appropriate for the given decision problem. Frameworks and taxonomies that distinguish between modelling approaches can help make this decision more systematic, and this study aims to identify and compare the decision frameworks proposed to date on this topic. A systematic review was conducted to identify frameworks from peer-reviewed and grey literature sources. The following databases were searched: OVID Medline and EMBASE; Wiley's Cochrane Library and Health Economic Evaluation Database; PubMed; and ProQuest. Eight decision frameworks were identified, each focused on a different set of modelling approaches and employing a different collection of selection criteria. The selection criteria can be categorized as either: (i) structural features (i.e. technical elements that are factual in nature) or (ii) practical considerations (i.e. context-dependent attributes). The most commonly mentioned structural features were population resolution (i.e. aggregate vs. individual) and interactivity (i.e. static vs. dynamic). Furthermore, understanding the needs of the end-users and stakeholders was frequently incorporated as a criterion within these frameworks. There is presently no universally accepted framework for selecting an economic modelling approach. Rather, each highlights different criteria that may be of importance when determining whether a modelling approach is appropriate. Further discussion is thus necessary, as the modelling approach selected will impact the validity of the underlying economic model and have downstream implications on its efficiency, transparency and relevance to decision-makers.

  7. Multicriteria framework for selecting a process modelling language

    NASA Astrophysics Data System (ADS)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM), since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the large number of available modelling languages and the lack of guidelines for evaluating and comparing them so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
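    The combination of evaluation criteria with a multicriteria method can be illustrated with a bare additive-weighting sketch. The criteria, weights, and scores below are invented for illustration; the paper combines SEQUAL with a full MCDA method, not this simple weighted sum.

```python
# A deliberately simple additive-weighting sketch of language selection:
# score candidate modelling languages against weighted criteria and rank.

criteria = {"expressiveness": 0.4, "analysability": 0.35, "tool_support": 0.25}

candidates = {
    "BPMN":       {"expressiveness": 9, "analysability": 6, "tool_support": 9},
    "Petri nets": {"expressiveness": 7, "analysability": 9, "tool_support": 6},
    "EPC":        {"expressiveness": 6, "analysability": 5, "tool_support": 7},
}

def score(lang_scores):
    """Weighted additive value of one candidate."""
    return sum(w * lang_scores[c] for c, w in criteria.items())

ranking = sorted(candidates, key=lambda name: score(candidates[name]),
                 reverse=True)
print(ranking[0])   # highest-scoring language under these invented weights
```

Real MCDA methods go beyond this additive form (e.g. handling criteria interdependence and decision-maker preference elicitation), which is part of what the proposed framework contributes.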

  8. State Event Models for the Formal Analysis of Human-Machine Interactions

    NASA Technical Reports Server (NTRS)

    Combefis, Sebastien; Giannakopoulou, Dimitra; Pecheur, Charles

    2014-01-01

    The work described in this paper was motivated by our experience with applying a framework for formal analysis of human-machine interactions (HMI) to a realistic model of an autopilot. The framework is built around a formally defined conformance relation called "full-control" between an actual system and the mental model according to which the system is operated. Systems are well-designed if they can be described by relatively simple, full-control, mental models for their human operators. For this reason, our framework supports automated generation of minimal full-control mental models for HMI systems, where both the system and the mental models are described as labelled transition systems (LTS). The autopilot that we analysed has been developed in the NASA Ames HMI prototyping tool ADEPT. In this paper, we describe how we extended the models that our HMI analysis framework handles to allow adequate representation of ADEPT models. We then provide a property-preserving reduction from these extended models to LTSs, to enable application of our LTS-based formal analysis algorithms. Finally, we briefly discuss the analyses we were able to perform on the autopilot model with our extended framework.
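    The LTS representation used by such frameworks can be sketched in a few lines: transitions as (state, action, state) triples, plus simple checks that bear on whether a small mental model can track the machine. The states and actions below are invented, not the ADEPT autopilot's.

```python
# A toy labelled transition system (LTS): states, actions, transitions.
# is_deterministic checks that no (state, action) pair leads to two
# different targets, one basic prerequisite for a simple mental model.

transitions = [
    ("IDLE",  "engage",    "HOLD"),
    ("HOLD",  "set_alt",   "CLIMB"),
    ("CLIMB", "reach_alt", "HOLD"),
    ("HOLD",  "disengage", "IDLE"),
]

def is_deterministic(trans):
    seen = {}
    for src, action, dst in trans:
        if seen.setdefault((src, action), dst) != dst:
            return False
    return True

def reachable(trans, start):
    """All states reachable from start, by simple graph exploration."""
    frontier, visited = [start], {start}
    while frontier:
        s = frontier.pop()
        for src, _, dst in trans:
            if src == s and dst not in visited:
                visited.add(dst)
                frontier.append(dst)
    return visited
```

Conformance relations like full-control are defined over exactly this kind of structure, comparing the actions the operator believes are available with those the system actually accepts in each state.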

  9. Physiome-model-based state-space framework for cardiac deformation recovery.

    PubMed

    Wong, Ken C L; Zhang, Heye; Liu, Huafeng; Shi, Pengcheng

    2007-11-01

    To more reliably recover cardiac information from noise-corrupted, patient-specific measurements, it is essential to employ meaningful constraining models and adopt appropriate optimization criteria to couple the models with the measurements. Although biomechanical models have been extensively used for myocardial motion recovery with encouraging results, the passive nature of such constraints limits their ability to fully account for the deformation caused by active forces of the myocytes. To overcome such limitations, we propose to adopt a cardiac physiome model as the prior constraint for cardiac motion analysis. The cardiac physiome model comprises an electric wave propagation model, an electromechanical coupling model, and a biomechanical model, which are connected through a cardiac system dynamics for a more complete description of the macroscopic cardiac physiology. Embedded within a multiframe state-space framework, the uncertainties of the model and the patient's measurements are systematically dealt with to arrive at optimal cardiac kinematic estimates and possibly beyond. Experiments have been conducted to compare our proposed cardiac-physiome-model-based framework with the solely biomechanical-model-based framework. The results show that our proposed framework recovers more accurate cardiac deformation from synthetic data and obtains more sensible estimates from real magnetic resonance image sequences. With the active components introduced by the cardiac physiome model, cardiac deformations recovered from patient's medical images are more physiologically plausible.

  10. R-SWAT-FME user's guide

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shu-Guang

    2012-01-01

    R program language-Soil and Water Assessment Tool-Flexible Modeling Environment (R-SWAT-FME) (Wu and Liu, 2012) is a comprehensive modeling framework that adopts an R package, Flexible Modeling Environment (FME) (Soetaert and Petzoldt, 2010), for the Soil and Water Assessment Tool (SWAT) model (Arnold and others, 1998; Neitsch and others, 2005). This framework provides the functionalities of parameter identifiability, model calibration, and sensitivity and uncertainty analysis with instant visualization. This user's guide shows how to apply this framework for a customized SWAT project.

  11. Modeling asset price processes based on mean-field framework

    NASA Astrophysics Data System (ADS)

    Ieda, Masashi; Shiino, Masatoshi

    2011-12-01

    We propose a model of the dynamics of financial assets based on the mean-field framework. This framework allows us to construct a model that includes the interaction among the financial assets, reflecting the market structure. Our study takes a microscopic approach to modeling the financial market. To demonstrate the effectiveness of our model concretely, we provide a case study: the pricing problem of the European call option with short-time memory noise.
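    A mean-field interaction of the kind described can be sketched with an Euler-Maruyama simulation in which each asset's drift is pulled toward the instantaneous cross-sectional mean. All parameter values below are arbitrary illustrations, not the paper's model specification.

```python
# Euler-Maruyama simulation of N log-prices with a mean-field drift term:
# each asset is attracted toward the cross-sectional mean with strength
# kappa, a simple way to encode interaction through the market structure.

import random

def simulate(n_assets=50, n_steps=250, dt=1 / 250,
             kappa=2.0, sigma=0.2, seed=7):
    rng = random.Random(seed)
    x = [rng.gauss(0.0, 0.05) for _ in range(n_assets)]  # initial log-prices
    for _ in range(n_steps):
        mean = sum(x) / n_assets                         # mean-field term
        x = [xi + kappa * (mean - xi) * dt
                + sigma * dt**0.5 * rng.gauss(0.0, 1.0)
             for xi in x]
    return x

final = simulate()
spread = max(final) - min(final)
```

In the large-N limit each asset interacts with the population only through this mean, which is what makes mean-field models analytically tractable compared with general pairwise-interaction models.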

  12. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE PAGES

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; ...

    2018-02-20

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
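    The population dynamics described by such a master equation can be caricatured with a Monte Carlo simulation: each convective cell grows or decays with fixed per-step probabilities, new cells appear under steady forcing, and the cloud-base mass flux is a nonlinear function of cell area. All probabilities and the flux law below are invented placeholders, not STOMP's calibrated values.

```python
# Monte Carlo caricature of a birth-death population of convective cells.
# Cells grow, shrink (dying below a size floor), or persist each step;
# steady large-scale forcing injects new cells; the diagnosed cloud-base
# mass flux is a nonlinear (area**1.5) function of cell area.

import random

def evolve(n_steps=200, p_grow=0.30, p_decay=0.35, births=3, seed=1):
    rng = random.Random(seed)
    cells = [1.0] * 10                     # cell areas (arbitrary units)
    flux_series = []
    for _ in range(n_steps):
        nxt = []
        for area in cells:
            r = rng.random()
            if r < p_grow:
                nxt.append(area * 1.5)     # cell grows
            elif r < p_grow + p_decay:
                if area > 0.5:
                    nxt.append(area / 1.5) # cell shrinks; dies below 0.5
            else:
                nxt.append(area)           # cell unchanged
        nxt.extend([1.0] * births)         # new cells from steady forcing
        cells = nxt
        flux_series.append(sum(a ** 1.5 for a in cells))
    return flux_series

flux = evolve()
```

With decay slightly favored over growth plus a steady birth rate, the population fluctuates around a statistical steady state rather than growing without bound, the regime in which master-equation closures are meaningful.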

  13. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    NASA Astrophysics Data System (ADS)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng

    2018-02-01

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  14. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM-simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.

  15. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) model metadata describing the external features of a model (e.g., variable name, unit, computational grid), and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, uncovering their metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute, and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing the model interface using BMI, as well as providing a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute, and find the dependencies of BMI-enabled web service models. Using the revised EMELI, an example will be presented on integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
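    The BMI pattern that makes such coupling possible can be sketched in plain Python: a model exposes a small, uniform control-and-query interface so a framework (or a web-service wrapper) can drive it without knowing its internals. The method names below follow the spirit of BMI but are simplified, and the linear-reservoir model is an invented stand-in.

```python
# A toy hydrologic model behind a BMI-style interface: a framework can
# initialize it, step it forward, and query its state and metadata
# through the same few calls it would use for any other conforming model.

class LinearReservoirBMI:
    def initialize(self, config):
        self.k = config.get("recession_coefficient", 0.1)
        self.storage = config.get("initial_storage", 100.0)
        self.time = 0.0

    def update(self, dt=1.0):
        outflow = self.k * self.storage      # linear-reservoir outflow
        self.storage -= outflow * dt
        self.time += dt

    def get_value(self, name):
        return {"storage": self.storage, "time": self.time}[name]

    def get_output_var_names(self):
        return ["storage", "time"]           # metadata a framework can query

# A coupling framework (or web-service wrapper) drives the model generically:
model = LinearReservoirBMI()
model.initialize({"recession_coefficient": 0.2})
for _ in range(5):
    model.update()
```

Wrapping each method as a web endpoint turns this into the self-describing web-service model the abstract describes: the client needs only the interface, never the model's source.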

  16. A software engineering perspective on environmental modeling framework design: The object modeling system

    USDA-ARS?s Scientific Manuscript database

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  17. A framework for modeling uncertainty in regional climate change

    EPA Science Inventory

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  18. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    ERIC Educational Resources Information Center

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  19. Stress distribution in Co-Cr implant frameworks after laser or TIG welding.

    PubMed

    de Castro, Gabriela Cassaro; de Araújo, Cleudmar Amaral; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Nóbilo, Mauro Antônio de Arruda

    2013-01-01

    Lack of passivity has been associated with biomechanical problems in implant-supported prostheses. The aim of this study was to evaluate, by photoelasticity, the passivity of three techniques for fabricating an implant framework from a Co-Cr alloy. The model was obtained from a steel die simulating an edentulous mandible with 4 external hexagon analog implants with a standard platform. On this model, five frameworks were fabricated for each group: monoblock frameworks (control) and laser- and TIG-welded frameworks. The photoelastic model was made from a flexible epoxy resin. For the photoelastic analysis, the frameworks were bolted onto the model and the maximum shear stress was measured at 34 selected points around the implants and 5 points in the middle of the model. The stresses were compared across the photoelastic model: between the right, left, and center regions and between the cervical and apical regions. The values were subjected to two-way ANOVA and Tukey's test (α=0.05). There was no significant difference among the groups and studied areas (p>0.05). It was concluded that the stresses generated around the implants were similar for all techniques.

  20. Model-theoretic framework for sensor data fusion

    NASA Astrophysics Data System (ADS)

    Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.

    1993-09-01

    The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available databases and knowledge bases. To achieve this goal, two subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
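As a toy illustration of the operator classes described (not the authors' actual system), the three operator types can be sketched as composable functions: a transformation operator reshapes raw data into a model, an abstraction operator reduces it to symbols, and a fusion operator combines two sensor models. The thresholds and labels below are invented.

```python
def transform(scan):
    """Transformation operator: raw range scan -> indexed point model."""
    return [(i, r) for i, r in enumerate(scan)]

def abstract(points, thresh=5.0):
    """Abstraction operator: point model -> symbolic 'near'/'far' model."""
    return ["near" if r < thresh else "far" for _, r in points]

def fuse(range_labels, intensity_labels):
    """Fusion operator: combine range and intensity sensor models."""
    return list(zip(range_labels, intensity_labels))
```

Expressing each processing stage as an operator on models is what gives the traceability of information flow the abstract claims: every intermediate model can be inspected.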

  1. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
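A heavily simplified, scalar sketch of the two-level idea follows: a linear dynamical system carries latent state between observation windows, while each window of irregular samples is summarized (here by a plain average, standing in for the Gaussian process level). Coefficients and windows are illustrative, not from the paper.

```python
def smooth_window(times, values):
    # stand-in for Gaussian process regression over irregular samples
    return sum(values) / len(values)

def lds_step(state, a=0.9, b=0.1, obs=None):
    # latent transition x_{t+1} = a*x_t + b*obs (illustrative coefficients)
    return a * state + (b * obs if obs is not None else 0.0)

def predict(windows, init_state=0.0):
    """One-step-ahead predictions: predict each window's summary from the
    latent state, then update the state with the observed summary."""
    state = init_state
    preds = []
    for times, values in windows:
        preds.append(state)
        state = lds_step(state, obs=smooth_window(times, values))
    return preds
```

The point of the hierarchy is that prediction only ever touches the low-dimensional latent state, so series of varied length and sampling density can share one transition model.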

  2. A Framework of Operating Models for Interdisciplinary Research Programs in Clinical Service Organizations

    ERIC Educational Resources Information Center

    King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette

    2008-01-01

    A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…

  3. A general modeling framework for describing spatially structured population dynamics

    USGS Publications Warehouse

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. 
Working within a common framework also reduces the chance that comparative analyses are colored by model details rather than general principles.
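A minimal sketch of the network representation described above: nodes carry abundance, directed weighted edges carry per-step movement proportions, and dynamics proceed in discrete time. Node names, rates, and proportions below are invented for illustration.

```python
def step(abundance, edges, growth):
    """One discrete time step of a network population model.

    abundance: {node: population}
    edges:     {(src, dst): proportion of src's population that moves}
    growth:    {node: per-step growth rate} (defaults to 1.0)
    """
    new = {n: 0.0 for n in abundance}
    # individuals that stay at each node
    for n, x in abundance.items():
        staying = 1.0 - sum(w for (s, d), w in edges.items() if s == n)
        new[n] += x * staying
    # individuals that move along each directed edge
    for (s, d), w in edges.items():
        new[d] += abundance[s] * w
    # node-local growth applied after movement
    return {n: new[n] * growth.get(n, 1.0) for n in new}
```

Metapopulations, migration, and nomadism then differ only in the edge weights and how they vary through time, which is the unification the framework is after.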

  4. Entity-Centric Abstraction and Modeling Framework for Transportation Architectures

    NASA Technical Reports Server (NTRS)

    Lewe, Jung-Ho; DeLaurentis, Daniel A.; Mavris, Dimitri N.; Schrage, Daniel P.

    2007-01-01

    A comprehensive framework for representing transportation architectures is presented. After discussing a series of preceding perspectives and formulations, the intellectual underpinning of the novel framework, an entity-centric abstraction of transportation, is described. The entities include endogenous and exogenous factors, and functional expressions are offered that relate these and their evolution. The end result is a Transportation Architecture Field which permits analysis of future concepts from a holistic perspective. A simulation model which stems from the framework is presented and exercised, producing results which quantify improvements in air transportation due to advanced aircraft technologies. Finally, a modeling hypothesis and its accompanying criteria are proposed to test further use of the framework for evaluating new transportation solutions.

  5. Clinical Knowledge Governance Framework for Nationwide Data Infrastructure Projects.

    PubMed

    Wulff, Antje; Haarbrandt, Birger; Marschollek, Michael

    2018-01-01

    The availability of semantically enriched and interoperable clinical information models is crucial for reusing data collected once across institutions, as aspired to in the German HiGHmed project. Funded by the Federal Ministry of Education and Research, this nationwide data infrastructure project adopts the openEHR approach for semantic modelling. Here, strong governance is required to define high-quality and reusable models. Objective: design of a clinical knowledge governance framework for openEHR modelling in cross-institutional settings like HiGHmed. Methods: analysis of successful practices from international projects, published ideas on archetype governance, and our own modelling experience, as well as modelling of BPMN processes. We designed a framework by presenting archetype variations, roles and responsibilities, IT support, and modelling workflows. Our framework has great potential to make the openEHR modelling effort manageable. Because practical experience is still rare, our work is well placed to prospectively evaluate the benefits of such structured governance approaches.

  6. A new fit-for-purpose model testing framework: Decision Crash Tests

    NASA Astrophysics Data System (ADS)

    Tolson, Bryan; Craig, James

    2016-04-01

    Decision-makers in water resources are often burdened with selecting appropriate multi-million dollar strategies to mitigate the impacts of climate or land use change. Unfortunately, the suitability of existing hydrologic simulation models to accurately inform decision-making is in doubt because the testing procedures used to evaluate model utility (i.e., model validation) are insufficient. For example, many authors have noted that the Klemeš Crash Tests (KCTs), the classic model validation procedures from Klemeš (1986) that Andréassian et al. (2009) renamed as KCTs, have yet to become common practice in hydrology. Furthermore, Andréassian et al. (2009) claim that the progression of hydrological science requires widespread use of KCTs and the development of new crash tests. Existing simulation (not forecasting) model testing procedures such as KCTs look backwards (checking for consistency between simulations and past observations) rather than forwards (explicitly assessing whether the model is likely to support future decisions). We propose a fundamentally different, forward-looking, decision-oriented hydrologic model testing framework based upon the concept of fit-for-purpose model testing that we call Decision Crash Tests, or DCTs. Key DCT elements are (i) the model purpose (i.e., the decision the model is meant to support) must be identified so that model outputs can be mapped to management decisions, and (ii) the framework evaluates not just the selected hydrologic model but the entire suite of model-building decisions associated with model discretization, calibration, etc. The framework is constructed to directly and quantitatively evaluate model suitability. The DCT framework is applied to a model-building case study on the Grand River in Ontario, Canada. 
A hypothetical binary decision scenario is analysed (upgrade or do not upgrade the existing flood control structure) under two different sets of model-building decisions. In one case, we show that the set of model-building decisions has a low probability of correctly supporting the upgrade decision. In the other case, we show evidence suggesting another set of model-building decisions has a high probability of correctly supporting the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model-building methods. Continued progress in hydrologic modelling requires clear evidence to motivate researchers to disregard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.
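The core scoring idea behind a decision crash test can be sketched as follows: map each ensemble member's simulated flood peak to a binary decision and score how often the decision matches the correct one. The thresholds, peak values, and labels here are hypothetical, not taken from the Grand River study.

```python
def decisions(peaks, capacity):
    """Map simulated flood peaks to binary management decisions:
    upgrade the structure if the peak exceeds its capacity."""
    return ["upgrade" if p > capacity else "keep" for p in peaks]

def decision_support_probability(ensemble_peaks, capacity, true_decision):
    """Fraction of ensemble members whose mapped decision matches the
    decision that is (assumed) correct."""
    d = decisions(ensemble_peaks, capacity)
    return d.count(true_decision) / len(d)
```

Note how the score says nothing about goodness of fit to past observations: a model set could pass a backward-looking test and still score poorly here, which is exactly the distinction the DCT framework draws.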

  7. Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Briggs, Jeffery L.

    2008-01-01

    The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so, ROSE frees the modeler to develop a library of standard modeling processes, such as Design of Experiments, optimizers, parameter studies, and sensitivity studies, which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well-defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
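The separation of execution process from model can be sketched in a few lines; the class and method names below are invented for illustration and do not reproduce the actual ROSE API. Any object exposing `run(inputs) -> output` can be driven by any generic process, here a parameter study.

```python
class ParameterStudy:
    """Generic execution process: sweeps one input of any model."""

    def __init__(self, model):
        self.model = model

    def execute(self, name, values, base=None):
        base = dict(base or {})
        return [self.model.run({**base, name: v}) for v in values]

class Parabola:
    """Stand-in 'model' honoring the assumed run(inputs) interface."""

    def run(self, inputs):
        x = inputs["x"]
        return x * x
```

For example, `ParameterStudy(Parabola()).execute("x", [1, 2, 3])` returns `[1, 4, 9]`; swapping in an optimizer or sensitivity study touches only the driver, never the model.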

  8. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  9. Incorporating physically-based microstructures in materials modeling: Bridging phase field and crystal plasticity frameworks

    DOE PAGES

    Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...

    2016-04-25

    Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.

  10. Enhancing a socio-hydrological modelling framework through field observations: a case study in India

    NASA Astrophysics Data System (ADS)

    den Besten, Nadja; Pande, Saket; Savenije, Huub H. G.

    2016-04-01

    Recently a smallholder socio-hydrological modelling framework was proposed and deployed to understand the dynamics underlying the agrarian crisis in the Maharashtra state of India. It was found that cotton and sugarcane smallholders who lack irrigation and storage techniques are most susceptible to distress. This study further extends the application of the modelling framework to other crops that are abundant in the state of Maharashtra, such as paddy, jowar, and soyabean, to assess whether the conclusions on the possible causes behind smallholder distress still hold. Further, fieldwork will be undertaken in March 2016 in the district of Pune. During the fieldwork, 50 smallholders will be interviewed, and the socio-hydrological assumptions behind the hydrology and capital equations, and the corresponding closure relationships incorporated in the current model, will be put to the test. Besides testing the assumptions, the questionnaires will be used to better understand the hydrological reality of the smallholders in terms of water usage and storage capacity. In combination with historical records of the smallholders' socio-economic data acquired over the last thirty years, available through several NGOs in the region, the socio-hydrological realism of the modelling framework will be enhanced. The preliminary outcomes of a desktop study show the possibilities of a water-centric modelling framework in understanding the constraints on smallholder farming. The results and methods described can be a first step guiding subsequent research on the modelling framework: a start in testing the framework in multiple rural locations around the globe.

  11. Groundwater Flow Model of Göksu Delta Coastal Aquifer System

    NASA Astrophysics Data System (ADS)

    Erdem Dokuz, Uǧur; Çelik, Mehmet; Arslan, Şebnem; Engin, Hilal

    2016-04-01

    Like many other coastal areas, Göksu Delta (Mersin-Silifke, southern Turkey) is a preferred place for human settlement, especially due to its productive farmlands and water resources. The water-dependent ecosystem in Göksu Delta hosts about 332 plant species and 328 bird species besides serving human use. Göksu Delta has been declared a Special Environmental Protection Zone, a Wildlife Protection Area, and a Ramsar Wetland of International Importance. Unfortunately, a rising population and agricultural and industrial activities are degrading water resources in both quality and quantity. This problem also exists for other wetlands around the world. It is necessary to prepare water management plans that take global warming into account in order to protect water resources for the next generations. The most efficient tool for developing such groundwater management strategies is a groundwater flow model. To this end, groundwater modeling studies were carried out for the Göksu Delta coastal aquifer system. As a first and most important step in all groundwater modeling studies, the geological and hydrogeological settings of the study area were investigated. Göksu Delta, like many other deltaic environments, has a complex structure because it was formed with the sediments transported by the Göksu River throughout the Quaternary period and shaped through transgression-regression cycles. Because of this complex structure and the lack of observation wells penetrating deep enough to indicate the total thickness of the delta, it was impossible to characterize the hydrogeological setting correctly. Therefore, six wells were drilled to construct the conceptual hydrogeological model of the Göksu Delta coastal aquifer system. On the basis of these drilling studies and slug tests conducted across Göksu Delta, the hydrostratigraphic units of the delta system were obtained. 
According to the conceptual hydrogeological model of the Göksu Delta coastal aquifer system, the delta is bounded by limestones to the north and northwest and reaches up to 250 m in thickness in the southern part. Moreover, a combined aquifer system of confined and unconfined layers has developed within the delta. The groundwater flow direction is towards the south and southeast, to the Mediterranean Sea. Data from this study were used to calibrate the flow model under steady-state and transient conditions using MODFLOW. According to the calibrated model, the alluvium aquifer is recharged primarily by the limestone aquifer and partially by the Göksu River. Discharge from the aquifer is generally towards the Mediterranean Sea and, in the southern part of the delta, partly to the Göksu River. Transient calibration of the model for the year 2012 indicates that the Göksu Delta groundwater system is extremely sensitive to groundwater exploitation for agricultural purposes.

  12. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  13. A Theoretically Consistent Framework for Modelling Lagrangian Particle Deposition in Plant Canopies

    NASA Astrophysics Data System (ADS)

    Bailey, Brian N.; Stoll, Rob; Pardyjak, Eric R.

    2018-06-01

    We present a theoretically consistent framework for modelling Lagrangian particle deposition in plant canopies. The primary focus is on describing the probability of particles encountering canopy elements (i.e., potential deposition); the framework provides a consistent means for including the effects of imperfect deposition through any appropriate sub-model for deposition efficiency. Some aspects of the framework draw upon an analogy to radiation propagation through a turbid medium to develop the model theory. The present method is compared against one of the most commonly used heuristic Lagrangian frameworks, namely that originally developed by Legg and Powell (Agricultural Meteorology, 1979, Vol. 20, 47-67), which is shown to be theoretically inconsistent. A recommendation is made to discontinue the use of this heuristic approach in favour of the theoretically consistent framework developed herein, which is no more difficult to apply under equivalent assumptions. The proposed framework has the additional advantage that it can be applied to arbitrary canopy geometries given readily measurable parameters describing vegetation structure.
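Under the turbid-medium analogy, the probability that a particle encounters a canopy element along its path can be sketched as an exponential decay of "survival", just as for radiation attenuated by foliage. The projection factor and leaf area densities below are illustrative assumptions, not values from the paper.

```python
import math

def encounter_probability(path, G=0.5):
    """Probability of encountering a canopy element along a trajectory.

    path: list of (leaf_area_density, ds) segments along the particle path
    G:    projection factor of the foliage (illustrative default)

    Non-encounter decays as exp(-optical_depth), as for radiation in a
    turbid medium, so the encounter probability is the complement.
    """
    optical_depth = sum(G * a * ds for a, ds in path)
    return 1.0 - math.exp(-optical_depth)
```

An imperfect-deposition sub-model would then multiply this potential-deposition probability by a deposition efficiency, which is the separation of concerns the framework argues for.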

  14. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  15. A smoothed particle hydrodynamics framework for modelling multiphase interactions at meso-scale

    NASA Astrophysics Data System (ADS)

    Li, Ling; Shen, Luming; Nguyen, Giang D.; El-Zein, Abbas; Maggi, Federico

    2018-01-01

    A smoothed particle hydrodynamics (SPH) framework is developed for modelling multiphase interactions at meso-scale, including the liquid-solid interaction induced deformation of the solid phase. With an inter-particle force formulation that mimics the inter-atomic force in molecular dynamics, the proposed framework includes the long-range attractions between particles, and more importantly, the short-range repulsive forces to avoid particle clustering and instability problems. Three-dimensional numerical studies have been conducted to demonstrate the capabilities of the proposed framework to quantitatively replicate the surface tension of water, to model the interactions between immiscible liquids and solid, and more importantly, to simultaneously model the deformation of solid and liquid induced by the multiphase interaction. By varying inter-particle potential magnitude, the proposed SPH framework has successfully simulated various wetting properties ranging from hydrophobic to hydrophilic surfaces. The simulation results demonstrate the potential of the proposed framework to genuinely study complex multiphase interactions in wet granular media.
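The pair force described, long-range attraction plus a stronger short-range repulsion, can be sketched as a Lennard-Jones-like function of inter-particle distance. The exponents, equilibrium spacing, and magnitude here are illustrative, not the paper's calibrated SPH values.

```python
def pair_force(r, r0=1.0, A=1.0):
    """Molecular-dynamics-style pair force between two SPH particles.

    Positive values push particles apart, negative values pull them
    together. The (r0/r)**4 repulsion dominates for r < r0, preventing
    the particle clustering and instability the abstract mentions,
    while the (r0/r)**2 attraction dominates for r > r0.
    """
    return A * ((r0 / r) ** 4 - (r0 / r) ** 2)
```

Varying `A` between particle species is the analogue of varying the inter-particle potential magnitude, which is how the study tunes wetting from hydrophobic to hydrophilic.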

  16. Sequentially Executed Model Evaluation Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    Provides a message passing framework between generic input, model, and output drivers, and specifies an API for developing such drivers. Also provides batch and real-time controllers which step the model and I/O through the time domain (or other discrete domain), and sample I/O drivers. This is a library framework and does not, itself, solve any problems or execute any modeling. The SeMe framework aids in the development of models which operate on sequential information, such as time series, where evaluation is based on prior results combined with new data for each iteration. It has applications in quality monitoring and was developed as part of the CANARY-EDS software, where real-time water quality data are analyzed for anomalies.
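The driver-and-controller pattern described might look like the following sketch. The interface names are assumptions, not the actual SeMe API, and the running-mean model is a stand-in for an evaluation that combines prior results with new data at each step.

```python
class ListInput:
    """Generic input driver backed by an in-memory list."""
    def __init__(self, data):
        self.data = data
    def read(self, step):
        return self.data[step]

class RunningMeanModel:
    """Sequential model: each evaluation uses prior results plus new data."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def step(self, value):
        self.n += 1
        self.mean += (value - self.mean) / self.n   # incremental mean update
        return self.mean

class ListOutput:
    """Generic output driver collecting results."""
    def __init__(self):
        self.results = []
    def write(self, value):
        self.results.append(value)

def batch_controller(inp, model, out, steps):
    """Steps the model and I/O through the discrete (time) domain."""
    for t in range(steps):
        out.write(model.step(inp.read(t)))
```

A real-time controller would differ only in where `read` blocks; the model and output drivers stay unchanged, which is the point of the separation.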

  17. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  18. An ontology for component-based models of water resource systems

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Goodall, Jonathan L.

    2013-08-01

    Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.
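To illustrate the kind of coupling question such an ontology can answer (point ii above), here is a toy triple store in Python. The predicate and model names are invented for illustration; the real WRC ontology is expressed in OWL, not Python.

```python
# Hypothetical component metadata as subject-predicate-object triples.
triples = {
    ("SnowModel",   "hasOutputVariable", "snowmelt_flux"),
    ("RunoffModel", "hasInputVariable",  "snowmelt_flux"),
    ("SnowModel",   "hasTimeStep",       "daily"),
    ("RunoffModel", "hasTimeStep",       "daily"),
}

def can_couple(provider, consumer, kb):
    """A coupling is plausible when some output variable of the provider
    matches an input variable of the consumer."""
    outs = {o for s, p, o in kb if s == provider and p == "hasOutputVariable"}
    ins = {o for s, p, o in kb if s == consumer and p == "hasInputVariable"}
    return bool(outs & ins)
```

With unambiguous, machine-readable metadata like this, such checks can run across modeling frameworks instead of relying on each framework's private conventions.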

  19. Predevelopment Water-Level Contours for Aquifers in the Rainier Mesa and Shoshone Mountain area of the Nevada Test Site, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph M. Fenelon; Randell J. Laczniak; and Keith J. Halford

    2008-06-24

    Contaminants introduced into the subsurface of the Nevada Test Site at Rainier Mesa and Shoshone Mountain by underground nuclear testing are of concern to the U.S. Department of Energy and regulators responsible for protecting human health and safety. Although contaminants were introduced into low-permeability rocks above the regional flow system, the potential for contaminant movement away from the underground test areas and into the accessible environment is greatest by ground-water transport. The primary hydrologic control on this transport is evaluated and examined through a series of contour maps developed to represent the water-level distribution within each of the major aquifers underlying the area. Aquifers were identified and their extents delineated by merging and analyzing multiple hydrostratigraphic framework models developed by other investigators from existing geologic information. The contoured water-level distribution in each major aquifer was developed from a detailed evaluation and assessment of available water-level measurements. Multiple spreadsheets that accompany this report provide pertinent water-level and geologic data by well or drill hole. Aquifers are mapped, presented, and discussed in general terms as being one of three aquifer types—volcanic aquifer, upper carbonate aquifer, or lower carbonate aquifer. Each of these aquifer types was subdivided and mapped as independent continuous and isolated aquifers, based on the continuity of its component rock. Ground-water flow directions, as related to the transport of test-generated contaminants, were developed from water-level contours and are presented and discussed for each of the continuous aquifers. Contoured water-level altitudes vary across the study area and range from more than 5,000 feet in the volcanic aquifer beneath a recharge area in the northern part of the study area to less than 2,450 feet in the lower carbonate aquifer in the southern part of the study area.
Variations in water-level altitudes within any single continuous aquifer range from a few hundred feet in a lower carbonate aquifer to just more than 1,100 feet in a volcanic aquifer. Flow directions throughout the study area are dominantly southward with minor eastward or westward deviations. Primary exceptions are westward flow in the northern part of the volcanic aquifer and eastward flow in the eastern part of the lower carbonate aquifer. Northward flow in the upper and lower carbonate aquifers in the northern part of the study area is possible but cannot be substantiated because data are lacking. Interflow between continuous aquifers is evaluated and mapped to define major flow paths. These flow paths delineate tributary flow systems, which converge to form the regional ground-water flow system. The implications of these tributary flow paths in controlling transport away from the underground test areas at Rainier Mesa and Shoshone Mountain are discussed. The obvious data gaps contributing to uncertainties in the delineation of aquifers and development of water-level contours are identified and evaluated.

  20. Predevelopment Water-Level Contours for Aquifers in the Rainier Mesa and Shoshone Mountain area of the Nevada Test Site, Nye County, Nevada

    USGS Publications Warehouse

    Fenelon, Joseph M.; Laczniak, Randell J.; Halford, Keith J.

    2008-01-01

    Contaminants introduced into the subsurface of the Nevada Test Site at Rainier Mesa and Shoshone Mountain by underground nuclear testing are of concern to the U.S. Department of Energy and regulators responsible for protecting human health and safety. Although contaminants were introduced into low-permeability rocks above the regional flow system, the potential for contaminant movement away from the underground test areas and into the accessible environment is greatest by ground-water transport. The primary hydrologic control on this transport is evaluated and examined through a series of contour maps developed to represent the water-level distribution within each of the major aquifers underlying the area. Aquifers were identified and their extents delineated by merging and analyzing multiple hydrostratigraphic framework models developed by other investigators from existing geologic information. The contoured water-level distribution in each major aquifer was developed from a detailed evaluation and assessment of available water-level measurements. Multiple spreadsheets that accompany this report provide pertinent water-level and geologic data by well or drill hole. Aquifers are mapped, presented, and discussed in general terms as being one of three aquifer types—volcanic aquifer, upper carbonate aquifer, or lower carbonate aquifer. Each of these aquifer types was subdivided and mapped as independent continuous and isolated aquifers, based on the continuity of its component rock. Ground-water flow directions, as related to the transport of test-generated contaminants, were developed from water-level contours and are presented and discussed for each of the continuous aquifers. Contoured water-level altitudes vary across the study area and range from more than 5,000 feet in the volcanic aquifer beneath a recharge area in the northern part of the study area to less than 2,450 feet in the lower carbonate aquifer in the southern part of the study area.
Variations in water-level altitudes within any single continuous aquifer range from a few hundred feet in a lower carbonate aquifer to just more than 1,100 feet in a volcanic aquifer. Flow directions throughout the study area are dominantly southward with minor eastward or westward deviations. Primary exceptions are westward flow in the northern part of the volcanic aquifer and eastward flow in the eastern part of the lower carbonate aquifer. Northward flow in the upper and lower carbonate aquifers in the northern part of the study area is possible but cannot be substantiated because data are lacking. Interflow between continuous aquifers is evaluated and mapped to define major flow paths. These flow paths delineate tributary flow systems, which converge to form the regional ground-water flow system. The implications of these tributary flow paths in controlling transport away from the underground test areas at Rainier Mesa and Shoshone Mountain are discussed. The obvious data gaps contributing to uncertainties in the delineation of aquifers and development of water-level contours are identified and evaluated.

  1. lazar: a modular predictive toxicology framework

    PubMed Central

    Maunz, Andreas; Gütlein, Martin; Rautenberg, Micha; Vorgrimmler, David; Gebele, Denis; Helma, Christoph

    2013-01-01

    lazar (lazy structure–activity relationships) is a modular framework for predictive toxicology. Similar to the read across procedure in toxicological risk assessment, lazar creates local QSAR (quantitative structure–activity relationship) models for each compound to be predicted. Model developers can choose between a large variety of algorithms for descriptor calculation and selection, chemical similarity indices, and model building. This paper presents a high level description of the lazar framework and discusses the performance of example classification and regression models. PMID:23761761
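
The local-model idea can be sketched in a few lines: for a query compound, collect training compounds above a similarity cutoff and let them vote, weighted by similarity. This is a toy illustration of read-across-style prediction, not lazar's actual algorithm; the fingerprints and labels are invented.

```python
# Sketch of a "local model" prediction in the spirit of lazar: for each query
# compound, find structurally similar training compounds and predict from
# their activities, weighted by similarity. Fingerprints here are plain sets
# of made-up structural feature IDs; real systems use chemical descriptors.

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity of two feature sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def predict(query_fp, training, threshold=0.3):
    """Similarity-weighted vote among neighbors above a similarity cutoff."""
    votes = {}
    for fp, activity in training:
        sim = tanimoto(query_fp, fp)
        if sim >= threshold:
            votes[activity] = votes.get(activity, 0.0) + sim
    if not votes:
        return None  # outside the applicability domain: refuse to predict
    return max(votes, key=votes.get)

training = [
    ({1, 2, 3, 4}, "toxic"),
    ({1, 2, 3, 5}, "toxic"),
    ({7, 8, 9},    "non-toxic"),
]
label = predict({1, 2, 3, 6}, training)
```

Returning `None` when no neighbor clears the cutoff mirrors an important read-across property: the model declines to predict outside its applicability domain.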

  2. Control of Distributed Parameter Systems

    DTIC Science & Technology

    1990-08-01

    variant of the general Lotka-Volterra model for interspecific competition. The variant described the emergence of one subpopulation from another as a... A unified approximation framework for parameter estimation in general linear PDE models has been completed. This framework has provided the theoretical basis for a number of

  3. Multi-Fidelity Framework for Modeling Combustion Instability

    DTIC Science & Technology

    2016-07-27

    generated from the reduced-domain dataset. Evaluations of the framework are performed based on simplified test problems for a model rocket combustor showing...

  4. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  5. A Framework for Cloudy Model Optimization and Database Storage

    NASA Astrophysics Data System (ADS)

    Calvén, Emilia; Helton, Andrew; Sankrit, Ravi

    2018-01-01

    We present a framework for producing Cloudy photoionization models of the nebular emission from novae ejecta and storing a subset of the results in SQL database format for later usage. The database can be searched for models best fitting observed spectral line ratios. Additionally, the framework includes an optimization feature that can be used in tandem with the database to search for and improve on models by creating new Cloudy models while varying the parameters. The database search and optimization can be used to explore the structures of nebulae by deriving their properties from the best-fit models. The goal is to provide the community with a large database of Cloudy photoionization models, generated from parameters reflecting conditions within novae ejecta, that can be easily fitted to observed spectral lines, either by directly accessing the database through the framework code or via a website made for this purpose.
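
Such a model store can be sketched as a small relational table keyed on model parameters and queried for the closest predicted line ratio. The schema, parameter names, and values below are invented for illustration; they are not Cloudy's actual inputs or outputs.

```python
# Minimal sketch of storing photoionization-model outputs in a SQL database
# and retrieving the model whose predicted line ratio best matches an
# observed value. All numbers are invented for illustration.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE models (
    id INTEGER PRIMARY KEY,
    hden REAL,        -- hydrogen density parameter (log cm^-3)
    temp REAL,        -- blackbody temperature (log K)
    ratio_oiii REAL   -- predicted [O III] 5007/4363 line ratio
)""")
db.executemany("INSERT INTO models (hden, temp, ratio_oiii) VALUES (?, ?, ?)",
               [(6.0, 4.9, 45.0), (7.0, 5.1, 80.0), (8.0, 5.3, 120.0)])

observed = 85.0
best = db.execute(
    "SELECT id, hden, temp FROM models "
    "ORDER BY ABS(ratio_oiii - ?) LIMIT 1", (observed,)).fetchone()
```

In a real deployment the best-fit row would seed the optimizer, which generates new models around those parameters and inserts their results back into the table.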

  6. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  7. Testing of stack-unit/aquifer sensitivity analysis using contaminant plume distribution in the subsurface of Savannah River Site, South Carolina, USA

    USGS Publications Warehouse

    Rine, J.M.; Shafer, J.M.; Covington, E.; Berg, R.C.

    2006-01-01

    Published information on the correlation and field-testing of the technique of stack-unit/aquifer sensitivity mapping with documented subsurface contaminant plumes is rare. The inherent characteristic of stack-unit mapping, which makes it a superior technique to other analyses that amalgamate data, is the ability to deconstruct the sensitivity analysis on a unit-by-unit basis. An aquifer sensitivity map, delineating the relative sensitivity of the Crouch Branch aquifer of the Administrative/Manufacturing Area (A/M) at the Savannah River Site (SRS) in South Carolina, USA, incorporates six hydrostratigraphic units, surface soil units, and relevant hydrologic data. When this sensitivity map is compared with the distribution of the contaminant tetrachloroethylene (PCE), PCE is present within the Crouch Branch aquifer within an area classified as highly sensitive, even though the PCE was primarily released on the ground surface within areas classified with low aquifer sensitivity. This phenomenon is explained through analysis of the aquifer sensitivity map, the groundwater potentiometric surface maps, and the plume distributions within the area on a unit-by-unit basis. The results of this correlation show how the paths of the PCE plume are influenced by both the geology and the groundwater flow. © Springer-Verlag 2006.

  8. La prospection geothermique de surface au Maroc: hydrodynamisme, anomalies thermiques et indices de surfaceGeothermal prospecting in Morocco: hydrodynamics, thermal anomalies and surface indices

    NASA Astrophysics Data System (ADS)

    Zarhloule, Y.; Lahrache, A.; Ben Abidate, L.; Khattach, D.; Bouri, S.; Boukdir, A.; Ben Dhia, H.

    2001-05-01

    Shallow geothermal prospecting (<700 m) has been performed in four zones in Morocco for which few deep data are available: northwestern basin, northeastern basin, Tadla Basin and Agadir Basin. These areas are different geologically and hydrogeologically. The temperature data from 250 wells at depths between 15 and 500 m have been analysed in order to estimate the natural geothermal gradient in these areas, to determine the principal thermal anomalies, to identify the main thermal indices and to characterise the recharge, discharge and potential mixing limits of the aquifers. The hydrostratigraphical study of each basin revealed several potential reservoir layers, of which the Turonian carbonate aquifer (Tadla and Agadir Basins) and the Liassic aquifer (Moroccan northwestern and northeastern basins) are the most important hot water reservoirs in Morocco. The recharge zones of each aquifer are characterised by high topography, high water potential, shallow cold water, low geothermal gradient and negative anomalies. The discharge zones are characterised by low topography, low piezometric level, high geothermal gradient, high temperature with hot springs and positive anomalies. The main thermal indices and the principal thermal anomalies that coincide with the artesian zones of the Turonian and Liassic aquifers have been identified.

  9. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    NASA Astrophysics Data System (ADS)

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-08-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained.

  10. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A Bayesian geostatistical parameter estimation approach

    USGS Publications Warehouse

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotopes (18O/16O ratios), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained.

  11. Translation from UML to Markov Model: A Performance Modeling Framework

    NASA Astrophysics Data System (ADS)

    Khan, Razib Hayat; Heegaard, Poul E.

    Performance engineering focuses on the quantitative investigation of a system's behavior during the early phases of the system development life cycle. Bearing this in mind, we delineate a performance modeling framework for communication-system applications that translates high-level UML notation into a Continuous Time Markov Chain (CTMC) model and solves that model for relevant performance metrics. The framework uses UML collaborations, activity diagrams, and deployment diagrams to generate the performance model: collaborations and activity diagrams capture the system dynamics as reusable specification building blocks, while the deployment diagram identifies the components of the system. The collaborations show how the reusable building blocks compose the service components through input and output pins, and a mapping is then defined between the collaborations and the system components identified in the deployment diagram. Moreover, the UML models are annotated with performance-related quality of service (QoS) information, which is necessary for solving the performance model for the relevant metrics. The applicability of the proposed framework is demonstrated by modeling a communication system.
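
The final step of such a pipeline, solving the CTMC for its stationary distribution, can be sketched directly. The generator matrix below is a toy two-state example, not one derived from any UML model.

```python
# Sketch of solving a small continuous-time Markov chain (CTMC) for its
# steady-state distribution: Q is the generator matrix (rows sum to zero);
# we solve pi @ Q = 0 subject to sum(pi) = 1 by Gaussian elimination.

def ctmc_steady_state(Q):
    n = len(Q)
    # Build A x = b from the transposed balance equations, replacing the
    # last equation with the normalization constraint sum(pi) = 1.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    b = [0.0] * n
    A[n - 1] = [1.0] * n
    b[n - 1] = 1.0
    # Plain Gaussian elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

# Toy model: state 0 = idle, state 1 = busy; rate 1.0 into busy, 2.0 back.
Q = [[-1.0, 1.0],
     [2.0, -2.0]]
pi = ctmc_steady_state(Q)
```

Performance metrics such as utilization then fall out of the stationary vector (here, the long-run fraction of time the system is busy is `pi[1]`).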

  12. Bayesian calibration for electrochemical thermal model of lithium-ion cells

    NASA Astrophysics Data System (ADS)

    Tagade, Piyush; Hariharan, Krishnan S.; Basu, Suman; Verma, Mohan Kumar Singh; Kolake, Subramanya Mayya; Song, Taewon; Oh, Dukjin; Yeo, Taejung; Doo, Seokgwang

    2016-07-01

    The pseudo-two-dimensional electrochemical thermal (P2D-ECT) model contains many parameters that are difficult to evaluate experimentally. Estimating these parameters is challenging because of the computational cost and the transient nature of the model. Because of incomplete physical understanding, the issue is aggravated at extreme conditions such as low-temperature (LT) operation. This paper presents a Bayesian calibration framework for estimation of the P2D-ECT model parameters. The framework uses a matrix variate Gaussian process representation to obtain a computationally tractable formulation for calibration of the transient model. Performance of the framework is investigated for calibration of the P2D-ECT model across a range of temperatures (333 K to 263 K) and operating protocols. In the absence of complete physical understanding, the framework also quantifies structural uncertainty in the calibrated model. This information is used by the framework to test the validity of new physical phenomena before their incorporation in the model. This capability is demonstrated by introducing temperature dependence of the Bruggeman coefficient and lithium plating formation at LT. With the new physics incorporated, the calibrated P2D-ECT model accurately predicts the cell voltage with high confidence. The accurate predictions are used to obtain new insights into low-temperature lithium-ion cell behavior.

  13. NoSQL Based 3D City Model Management System

    NASA Astrophysics Data System (ADS)

    Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.

    2014-04-01

    To manage increasingly complicated 3D city models, a framework based on a NoSQL database is proposed in this paper. The framework supports import and export of 3D city models according to international standards such as CityGML, KML/COLLADA and X3D. We also suggest and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operations. For visualization, a multi-representation 3D city structure, CityTree, is implemented within the framework to support dynamic LODs based on the user viewpoint. Also, the proposed framework is easily extensible and supports geoindexes to speed up querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
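
The map-reduce treatment of bulk geometry can be sketched as map, shuffle, and reduce phases over building records. The records, attribute names, and the per-district aggregate below are invented for illustration.

```python
# Toy sketch of the map-reduce idea for bulk geometry analysis: each mapper
# emits a (key, value) pair per building, and reducers aggregate per key.
from functools import reduce
from collections import defaultdict

buildings = [
    {"district": "north", "height": 22.0},
    {"district": "north", "height": 35.5},
    {"district": "south", "height": 12.0},
]

def mapper(b):
    return (b["district"], b["height"])

def shuffle(pairs):
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reducer(values):
    return reduce(max, values)  # tallest building per district

grouped = shuffle(map(mapper, buildings))
tallest = {k: reducer(v) for k, v in grouped.items()}
```

Because each mapper touches one record and each reducer touches one key, both phases parallelize naturally across a NoSQL cluster, which is the point of treating the geometry this way.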

  14. A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction

    NASA Astrophysics Data System (ADS)

    Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf

    2018-07-01

    Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real time. Unfortunately, low-cost depth sensors are prone to undesirable estimation noise in depth measurements, which results in depth outliers or surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame rates are required for high-quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter that inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high-quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
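
The core of such a fusion scheme, a weighted running average with a regularization term, can be sketched per voxel. The update rule and parameter values below are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of weighted running-average depth fusion with a simple
# regularization term pulling each voxel toward the mean of its neighbors.
# The lambda value and numbers are invented for illustration.

def fuse_voxel(d_old, w_old, d_new, w_new, neighbor_mean, lam=0.1):
    """Blend the stored value with a new measurement, then regularize."""
    fused = (w_old * d_old + w_new * d_new) / (w_old + w_new)
    fused = (1.0 - lam) * fused + lam * neighbor_mean  # smoothness prior
    return fused, w_old + w_new

d, w = 1.00, 4.0          # current voxel estimate and accumulated weight
d, w = fuse_voxel(d, w, d_new=1.10, w_new=1.0, neighbor_mean=1.0)
```

Folding the smoothness prior into every update is what lets noise be suppressed at fusion time rather than by accumulating many frames, so steady motion and high frame rates are no longer prerequisites.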

  15. Using Bayesian regression to test hypotheses about relationships between parameters and covariates in cognitive models.

    PubMed

    Boehm, Udo; Steingroever, Helen; Wagenmakers, Eric-Jan

    2018-06-01

    Important tools in the advancement of cognitive science are quantitative models that represent different cognitive variables in terms of model parameters. To evaluate such models, their parameters are typically tested for relationships with behavioral and physiological variables that are thought to reflect specific cognitive processes. However, many models do not come equipped with the statistical framework needed to relate model parameters to covariates. Instead, researchers often revert to classifying participants into groups depending on their values on the covariates, and subsequently comparing the estimated model parameters between these groups. Here we develop a comprehensive solution to the covariate problem in the form of a Bayesian regression framework. Our framework can be easily added to existing cognitive models and allows researchers to quantify the evidential support for relationships between covariates and model parameters using Bayes factors. Moreover, we present a simulation study that demonstrates the superiority of the Bayesian regression framework to the conventional classification-based approach.
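
One common way to quantify such evidence is the Savage-Dickey density ratio, which compares posterior and prior density at the null value of the regression slope. The grid-based sketch below is a simplified illustration with invented data, not the authors' implementation.

```python
# Grid-based sketch of the Savage-Dickey density ratio for a regression
# slope: BF01 = posterior density at slope 0 / prior density at slope 0.
# Data, prior, and grid resolution are all illustrative.
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical covariate values and parameter estimates (e.g. per-participant
# drift rates), assuming y = beta * x + noise with known noise sd = 1.
xs = [-1.0, -0.5, 0.0, 0.5, 1.0]
ys = [-0.9, -0.6, 0.1, 0.4, 1.1]

grid = [i / 1000.0 for i in range(-4000, 4001)]       # candidate slopes
prior = [normal_pdf(b, 0.0, 1.0) for b in grid]       # N(0, 1) prior on slope
like = [math.prod(normal_pdf(y, b * x, 1.0) for x, y in zip(xs, ys)) for b in grid]
post = [p * l for p, l in zip(prior, like)]
norm = sum(post) * 0.001                              # grid-cell width
post = [p / norm for p in post]

i0 = grid.index(0.0)
bf01 = post[i0] / prior[i0]   # < 1 means evidence for a nonzero slope
```

Here the data are consistent with a slope near 1, so the posterior thins out at zero and `bf01` falls below 1, quantifying support for a covariate-parameter relationship.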

  16. Model Diagnostics for the Department of Energy's Accelerated Climate Modeling for Energy (ACME) Project

    NASA Astrophysics Data System (ADS)

    Smith, B.

    2015-12-01

    In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal to speed Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley Leadership Computing Facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers now can generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several.
Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs, and web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the ACME overall user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.

  17. Field Markup Language: biological field representation in XML.

    PubMed

    Chang, David; Lovell, Nigel H; Dokos, Socrates

    2007-01-01

    With an ever increasing number of biological models available on the internet, a standardized modeling framework is required to allow information to be accessed or visualized. Based on the Physiome Modeling Framework, the Field Markup Language (FML) is being developed to describe and exchange field information for biological models. In this paper, we describe the basic features of FML, its supporting application framework and its ability to incorporate CellML models to construct tissue-scale biological models. As a typical application example, we present a spatially-heterogeneous cardiac pacemaker model which utilizes both FML and CellML to describe and solve the underlying equations of electrical activation and propagation.
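
A field description of this general kind can be sketched in XML. The element and attribute names below are invented for illustration and do not reflect the actual FML schema.

```python
# Sketch of describing a spatial field in XML, in the spirit of a field
# markup language: a mesh of nodes plus field values and an interpolation
# rule. Element and attribute names are hypothetical, not the FML schema.
import xml.etree.ElementTree as ET

field = ET.Element("field", name="membrane_potential", units="mV")
mesh = ET.SubElement(field, "mesh", dimension="1")
for i, x in enumerate([0.0, 0.5, 1.0]):
    ET.SubElement(mesh, "node", id=str(i), x=str(x))
values = ET.SubElement(field, "values", interpolation="linear")
values.text = "-80.0 -75.5 -60.2"

xml_text = ET.tostring(field, encoding="unicode")
```

Separating the mesh geometry from the field values is what lets a cell-level model (e.g. from CellML) supply the values while the field description carries the tissue-scale geometry.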

  18. Hydrogeophysics and remote sensing for the design of hydrogeological conceptual models in hard rocks - Sardón catchment (Spain)

    NASA Astrophysics Data System (ADS)

    Francés, Alain P.; Lubczynski, Maciek W.; Roy, Jean; Santos, Fernando A. M.; Mahmoudzadeh Ardekani, Mohammad R.

    2014-11-01

    Hard rock aquifers are highly heterogeneous and hydrogeologically complex. To contribute to the design of hydrogeological conceptual models of hard rock aquifers, we propose a multi-technique methodology based on a downward approach that combines remote sensing (RS), non-invasive hydrogeophysics and hydrogeological field data acquisition. The proposed methodology is particularly suitable for data-scarce areas. It was applied in the pilot research area of the Sardón catchment (80 km2) located west of Salamanca (Spain). The area was selected because of its hard-rock hydrogeology, semi-arid climate and scarcity of groundwater resources. The proposed methodology consisted of three main steps. First, we detected the main hydrogeological features at the catchment scale by processing: (i) a high-resolution digital terrain model to map lineaments and to outline fault zones; and (ii) high-resolution, multispectral satellite QuickBird and WorldView-2 images to map the outcropping granite. Second, we characterized at the local scale the hydrogeological features identified in the first step with: (i) ground penetrating radar (GPR) to assess groundwater table depth, complementing the available monitoring network data; (ii) 2D electrical resistivity tomography (ERT) and frequency domain electromagnetics (FDEM) to retrieve the hydrostratigraphy along selected survey transects; and (iii) magnetic resonance soundings (MRS) to retrieve the hydrostratigraphy and aquifer parameters at the selected survey sites. In the third step, we drilled 5 boreholes (25 to 48 m deep) and performed slug tests to verify the hydrogeophysical interpretation and to calibrate the MRS parameters. Finally, we compiled and integrated all acquired data to define the geometry and parameters of the Sardón aquifer at the catchment scale. In line with a general conceptual model of hard rock aquifers, we identified two main hydrostratigraphic layers: a saprolite layer and a fissured layer.
Both layers were intersected and drained by fault zones that control the hydrogeology of the catchment. The spatial discontinuities of the saprolite layer were well defined by RS techniques, while the subsurface geometry and aquifer parameters were defined by hydrogeophysics. The GPR method was able to detect the shallow water table at depths between 1 and 3 m below ground surface. The hydrostratigraphy and parameterization of the fissured layer remained uncertain because the ERT and FDEM geophysical methods were quantitatively inconclusive, while MRS detectability was restricted by low volumetric water content. The proposed multi-technique methodology, integrating cost-efficient RS, hydrogeophysics and hydrogeological field investigations, allowed us to characterize geometrically and parametrically the Sardón hard rock aquifer system, facilitating the design of a hydrogeological conceptual model of the area.

  19. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Treesearch

    Tyler Jon Smith; Lucy Amanda Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  20. Digging into the corona: A modeling framework trained with Sun-grazing comet observations

    NASA Astrophysics Data System (ADS)

    Jia, Y. D.; Pesnell, W. D.; Bryans, P.; Downs, C.; Liu, W.; Schwartz, S. J.

    2017-12-01

    Images of comets diving into the low corona have been captured a few times in the past decade. Structures visible at various wavelengths during these encounters indicate a strong variation of the ambient conditions of the corona. We combine three numerical models: a global coronal model, a particle transportation model, and a cometary plasma interaction model into one framework to model the interaction of such Sun-grazing comets with plasma in the low corona. In our framework, cometary vapors are ionized via multiple channels and then captured by the coronal magnetic field. In seconds, these ions are further ionized into their highest charge state, which is revealed by certain coronal emission lines. Constrained by observations, we apply our framework to trace back to the local conditions of the ambient corona, and their spatial/time variation over a broad range of scales. Once trained by multiple stages of the comet's journey in the low corona, we illustrate how this framework can leverage these unique observations to probe the structure of the solar corona and solar wind.

  1. Colaborated Architechture Framework for Composition UML 2.0 in Zachman Framework

    NASA Astrophysics Data System (ADS)

    Hermawan; Hastarista, Fika

    2016-01-01

    The Zachman Framework (ZF) is the enterprise architecture framework most widely adopted in Enterprise Information System (EIS) development. In this study, a Collaborated Architecture Framework (CAF) was developed to combine ZF with Unified Modeling Language (UML) 2.0 modeling. The CAF provides a composition of the ZF matrix in which each cell consists of a Model Driven Architecture (MDA) built from various UML models and Software Requirement Specification (SRS) documents. The modeling was implemented to develop an Enterprise Resource Planning (ERP) system. Because the ERP covers a large number of applications with complex relations, an Agile Model Driven Design (AMDD) approach was used to transform the MDA into application-module components efficiently and accurately. Through the use of the CAF, the needs of all stakeholders involved across the stages of the Rational Unified Process (RUP) were met, and high satisfaction with the functional features of the ERP software at PT. Iglas (Persero) Gresik was achieved.

  2. Automatic Earth observation data service based on reusable geo-processing workflow

    NASA Astrophysics Data System (ADS)

    Chen, Nengcheng; Di, Liping; Gong, Jianya; Yu, Genong; Min, Min

    2008-12-01

    A common Sensor Web data service framework for Geo-Processing Workflow (GPW) is presented as part of the NASA Sensor Web project. The framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node and a BPEL engine. An abstract model designer is used to design the top-level GPW model, a model instantiation service is used to generate the concrete BPEL, and a BPEL execution engine runs it. The framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A scenario for an EO-1 Sensor Web data service for fire classification is used to test the feasibility of the proposed framework. The execution time and influences of the service framework are evaluated. The experiments show that this framework can improve the quality of services for sensor data retrieval and processing.
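
The node chain described above (abstract GPW model, then instantiation, then execution) can be sketched as a minimal pipeline. The step names and operations below are illustrative stand-ins, not the actual NASA Sensor Web services:

```python
# A toy geo-processing workflow: an abstract model is an ordered list of
# step names; "instantiation" binds each name to a concrete operation
# (standing in for concrete BPEL generation); execution chains the outputs.

def acquire(_):          # data service node: fetch raw sensor values
    return [3.1, 4.0, 5.2]

def calibrate(values):   # data processing node: apply a gain correction
    return [v * 0.98 for v in values]

def classify(values):    # presentation node: threshold into fire/clear
    return ["fire" if v > 4.0 else "clear" for v in values]

REGISTRY = {"acquire": acquire, "calibrate": calibrate, "classify": classify}

def instantiate(abstract_model):
    """Turn a list of step names into a list of concrete callables."""
    return [REGISTRY[name] for name in abstract_model]

def execute(concrete_workflow, seed=None):
    data = seed
    for step in concrete_workflow:
        data = step(data)
    return data

result = execute(instantiate(["acquire", "calibrate", "classify"]))
```

In the real framework each step would be a networked service invocation orchestrated by BPEL rather than a local function call.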

  3. Automated numerical simulation of biological pattern formation based on visual feedback simulation framework

    PubMed Central

    Sun, Mingzhu; Xu, Hui; Zeng, Xingjuan; Zhao, Xin

    2017-01-01

    There are various fantastic biological phenomena in biological pattern formation. Mathematical modeling using reaction-diffusion partial differential equation systems is employed to study the mechanism of pattern formation. However, model parameter selection is both difficult and time consuming. In this paper, a visual feedback simulation framework is proposed to calculate the parameters of a mathematical model automatically based on the basic principle of feedback control. In the simulation framework, the simulation results are visualized, and the image features are extracted as the system feedback. Then, the unknown model parameters are obtained by comparing the image features of the simulation image and the target biological pattern. Considering two typical applications, the visual feedback simulation framework is applied to fulfill pattern formation simulations for vascular mesenchymal cells and lung development. In the simulation framework, the spot, stripe, labyrinthine patterns of vascular mesenchymal cells, the normal branching pattern and the branching pattern lacking side branching for lung branching are obtained in a finite number of iterations. The simulation results indicate that it is easy to achieve the simulation targets, especially when the simulation patterns are sensitive to the model parameters. Moreover, this simulation framework can expand to other types of biological pattern formation. PMID:28225811
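
The feedback loop the abstract describes (simulate, extract an image feature, compare with the target, update the parameters) can be reduced to a toy sketch. The "simulator" and stripe-count feature below are stand-ins for the paper's reaction-diffusion model and image processing, not its code:

```python
import math

def simulate_feature(param, width=200):
    # stand-in "simulation + feature extraction": count sign changes of a
    # sinusoidal pattern across the domain, roughly a stripe count
    signs = [math.sin(param * x / width * 2 * math.pi) >= 0
             for x in range(width)]
    return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

def fit_parameter(target_feature, p0=1.0, gain=0.1, max_iter=200):
    """Proportional feedback on the feature error, as in feedback control."""
    p = p0
    for _ in range(max_iter):
        err = target_feature - simulate_feature(p)
        if err == 0:
            return p            # simulated pattern matches the target
        p += gain * err
    return p

p_star = fit_parameter(target_feature=12)
```

The same loop structure applies when the feature is a spot count or branch count and the update rule is more sophisticated than a proportional gain.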

  5. Modules: A New Tool in the Emissions Modeling Framework

    DOT National Transportation Integrated Search

    2017-08-14

    The Emissions Modeling Framework (EMF) is used by various organizations, including the US Environmental Protection Agency, to manage their emissions inventories, projections, and emissions modeling scenarios. Modules are a new tool under develo...

  6. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. 
These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
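
A standard query interface of the kind UCVM provides can be sketched as follows; the two toy models and their property values are invented for illustration and do not reflect the real UCVM API or any actual velocity model:

```python
# Each registered model maps a query point to material properties
# (Vp, Vs, density); the framework layers a regional 3D model over a
# 1D background model and answers from the first model covering the point.

def background_1d(lat, lon, depth_m):
    # toy 1D model: properties depend on depth only
    vs = 500.0 + 0.5 * depth_m
    return {"vp": 1.8 * vs, "vs": vs, "density": 2000.0 + 0.1 * depth_m}

def regional_3d(lat, lon, depth_m):
    # toy regional model defined only inside a lat/lon box
    if 33.0 <= lat <= 35.0 and -119.0 <= lon <= -117.0:
        vs = 400.0 + 0.6 * depth_m
        return {"vp": 1.7 * vs, "vs": vs, "density": 1900.0 + 0.1 * depth_m}
    return None  # point outside this model's coverage

def query(models, lat, lon, depth_m):
    """Return properties from the first model that covers the point."""
    for model in models:
        props = model(lat, lon, depth_m)
        if props is not None:
            return props
    raise ValueError("no model covers the query point")

props = query([regional_3d, background_1d], 34.0, -118.0, 100.0)
```

The point of the uniform signature is that meshing or simulation code never needs to know which underlying model, format, or projection answered the query.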

  7. Modeling spray drift and runoff-related inputs of pesticides to receiving water.

    PubMed

    Zhang, Xuyang; Luo, Yuzhou; Goh, Kean S

    2018-03-01

    Pesticides move to surface water via various pathways including surface runoff, spray drift and subsurface flow. Little is known about the relative contributions of surface runoff and spray drift in agricultural watersheds. This study develops a modeling framework to address the contribution of spray drift to the total loadings of pesticides in receiving water bodies. The modeling framework consists of a GIS module for identifying drift potential, the AgDRIFT model for simulating spray drift, and the Soil and Water Assessment Tool (SWAT) for simulating various hydrological and landscape processes including surface runoff and transport of pesticides. The modeling framework was applied to the Orestimba Creek Watershed, California. Monitoring data collected from daily samples were used for model evaluation. Pesticide mass deposition to Orestimba Creek ranged from 0.08 to 6.09% of the applied mass. Monitoring data suggest that surface runoff was the major pathway for pesticides entering water bodies, accounting for 76% of the annual loading; the remaining 24% came from spray drift. The results from the modeling framework showed 81 and 19%, respectively, for runoff and spray drift. Spray drift contributed over half of the mass loading during summer months. The slightly lower spray drift contribution as predicted by the modeling framework was mainly due to SWAT's under-prediction of pesticide mass loading during summer and over-prediction of the loading during winter. Although model simulations were associated with various sources of uncertainty, the overall performance of the modeling framework was satisfactory as evaluated by multiple statistics: for simulation of daily flow, the Nash-Sutcliffe Efficiency Coefficient (NSE) ranged from 0.61 to 0.74 and the percent bias (PBIAS) < 28%; for daily pesticide loading, NSE = 0.18 and PBIAS = -1.6%. 
This modeling framework will be useful for assessing the relative exposure from pesticides related to spray drift and runoff in receiving waters and the design of management practices for mitigating pesticide exposure within a watershed. Published by Elsevier Ltd.
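
The two evaluation statistics cited (NSE and PBIAS) are standard and simple to compute from paired observed/simulated daily series; a minimal implementation:

```python
def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - sum((o-s)^2) / sum((o-mean(o))^2).
    1.0 is a perfect fit; 0.0 means no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def pbias(obs, sim):
    """Percent bias: 100 * sum(o - s) / sum(o). With this convention a
    negative value indicates over-prediction (sign conventions vary)."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 2.1, 3.1, 4.1]   # uniformly over-predicting by 0.1
```

For this toy series NSE is 0.992 and PBIAS is -4.0%, matching the sign convention used when the abstract reports PBIAS = -1.6% for pesticide loading.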

  8. Clinical time series prediction: towards a hierarchical dynamical system framework

    PubMed Central

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding the patient's condition, the dynamics of a disease, the effects of various patient management interventions, and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines the advantages of two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patients in the training set, and then applying the model to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. 
Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. PMID:25534671
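
The two-level idea (local summaries of irregular samples, plus a linear dynamical system over the summaries) can be caricatured in a few lines. The window means below are a crude stand-in for the paper's Gaussian-process posteriors, and the scalar AR(1) fit stands in for the learned linear dynamical system:

```python
def window_means(samples, window=24.0):
    """samples: (time, value) pairs, irregularly spaced; one mean per window."""
    buckets = {}
    for t, v in samples:
        buckets.setdefault(int(t // window), []).append(v)
    return [sum(vs) / len(vs) for _, vs in sorted(buckets.items())]

def fit_ar1(z):
    """Least-squares fit of the transition z[t+1] = a*z[t] + b."""
    x, y = z[:-1], z[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# irregular sample times (hours); value drifts upward window by window
samples = [(t, 10.0 + 0.5 * (t // 24)) for t in
           [1.0, 5.5, 20.0, 26.0, 30.3, 49.1, 61.0, 71.9, 80.2, 95.0]]
z = window_means(samples)      # one summary per 24 h window
a, b = fit_ar1(z)
next_mean = a * z[-1] + b      # one-step-ahead prediction
```

The real framework replaces the window mean with a Gaussian-process posterior over each segment and the scalar AR(1) with a multivariate linear dynamical system, but the division of labor is the same.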

  9. High Spatial Resolution Multi-Organ Finite Element Modeling of Ventricular-Arterial Coupling

    PubMed Central

    Shavik, Sheikh Mohammad; Jiang, Zhenxiang; Baek, Seungik; Lee, Lik Chuan

    2018-01-01

    While it has long been recognized that bi-directional interaction between the heart and the vasculature plays a critical role in the proper functioning of the cardiovascular system, a comprehensive study of this interaction has largely been hampered by a lack of modeling framework capable of simultaneously accommodating high-resolution models of the heart and vasculature. Here, we address this issue and present a computational modeling framework that couples finite element (FE) models of the left ventricle (LV) and aorta to elucidate ventricular—arterial coupling in the systemic circulation. We show in a baseline simulation that the framework predictions of (1) LV pressure—volume loop, (2) aorta pressure—diameter relationship, (3) pressure—waveforms of the aorta, LV, and left atrium (LA) over the cardiac cycle are consistent with the physiological measurements found in healthy human. To develop insights of ventricular-arterial interactions, the framework was then used to simulate how alterations in the geometrical or, material parameter(s) of the aorta affect the LV and vice versa. We show that changing the geometry and microstructure of the aorta model in the framework led to changes in the functional behaviors of both LV and aorta that are consistent with experimental observations. On the other hand, changing contractility and passive stiffness of the LV model in the framework also produced changes in both the LV and aorta functional behaviors that are consistent with physiology principles. PMID:29551977

  10. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework to enable future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics in fundamental concepts of previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generate data reflecting the effectiveness of the mission-control team in supporting the completion of mission objectives and in detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the output in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  11. Tracking Skill Acquisition with Cognitive Diagnosis Models: A Higher-Order, Hidden Markov Model with Covariates

    ERIC Educational Resources Information Center

    Wang, Shiyu; Yang, Yan; Culpepper, Steven Andrew; Douglas, Jeffrey A.

    2018-01-01

    A family of learning models that integrates a cognitive diagnostic model and a higher-order, hidden Markov model in one framework is proposed. This new framework includes covariates to model skill transition in the learning environment. A Bayesian formulation is adopted to estimate parameters from a learning model. The developed methods are…

  12. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. 
If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
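
A BMI-style component can be sketched as a class exposing the control and description functions described above. This toy linear-reservoir model is modeled on, but deliberately much smaller than, the actual CSDMS Basic Model Interface; the variable name and units are illustrative:

```python
class LinearReservoir:
    """Toy hydrologic component: storage drains at a constant rate k."""

    # --- control functions ------------------------------------------
    def initialize(self, config):
        self.k = config.get("k", 0.1)                 # 1/day
        self.storage = config.get("storage", 100.0)   # mm
        self.time = 0.0
        self.dt = 1.0                                 # days

    def update(self):
        outflow = self.k * self.storage
        self.storage -= outflow * self.dt
        self.time += self.dt

    def finalize(self):
        pass

    # --- description functions --------------------------------------
    def get_output_var_names(self):
        return ["soil_water__storage"]

    def get_var_units(self, name):
        return {"soil_water__storage": "mm"}[name]

    def get_value(self, name):
        return {"soil_water__storage": self.storage}[name]

    def get_time_step(self):
        return self.dt

model = LinearReservoir()
model.initialize({"k": 0.1, "storage": 100.0})
for _ in range(3):        # the caller, not the model, drives time stepping
    model.update()
model.finalize()
```

Because the caller owns the time loop and can query names, units, and time step, a framework can advance several such components in lockstep and mediate their differences without touching model internals.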

  13. Communication: Introducing prescribed biases in out-of-equilibrium Markov models

    NASA Astrophysics Data System (ADS)

    Dixit, Purushottam D.

    2018-03-01

    Markov models are often used in modeling complex out-of-equilibrium chemical and biochemical systems. However, their predictions often disagree with experiments, and a systematic framework is needed to update existing Markov models to make them consistent with constraints derived from experiments. Here, we present a framework based on the principle of maximum relative path entropy (minimum Kullback-Leibler divergence) to update Markov models using stationary state and dynamical trajectory-based constraints. We illustrate the framework using a biochemical model network of growth factor-based signaling. We also show how to find the closest detailed balanced Markov model to a given Markov model. Further applications and generalizations are discussed.
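
For a discrete distribution with a single mean constraint, the minimum-KL (maximum relative entropy) update reduces to exponential tilting of the prior with a Lagrange multiplier. This is a drastic simplification of the paper's path-space formulation, shown only to make the update principle concrete:

```python
import math

def tilt(p, E, target, lo=-50.0, hi=50.0):
    """Minimum-KL update of prior p subject to <E> = target.

    The solution is q_i proportional to p_i * exp(lam * E_i); lam is found
    by bisection, since the tilted mean is monotone increasing in lam.
    """
    def mean_under(lam):
        w = [pi * math.exp(lam * ei) for pi, ei in zip(p, E)]
        z = sum(w)
        return sum(wi * ei for wi, ei in zip(w, E)) / z

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_under(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [pi * math.exp(lam * ei) for pi, ei in zip(p, E)]
    z = sum(w)
    return [wi / z for wi in w]

# uniform prior over four states; push the mean of E from 1.5 up to 2.0
q = tilt(p=[0.25, 0.25, 0.25, 0.25], E=[0.0, 1.0, 2.0, 3.0], target=2.0)
```

In the path-entropy setting the same structure appears, with trajectories in place of states and transition rates reweighted by the multipliers.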

  14. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. 
regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.
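
The semantic matching step can be illustrated with two hypothetical components that advertise inputs and outputs by standard name. The names below imitate the CSDMS Standard Names style but are invented for this example:

```python
# Each component declares its variables as {standard_name: units}; the
# framework pairs a downstream input with an upstream output when the
# standard names agree, and flags inputs that still need a data source
# or a framework service component.

snow_component = {
    "outputs": {"snowpack__melt_volume_flux": "m s-1"},
    "inputs":  {"atmosphere_bottom_air__temperature": "deg_C"},
}
runoff_component = {
    "outputs": {"land_surface_water__runoff_volume_flux": "m s-1"},
    "inputs":  {"snowpack__melt_volume_flux": "m s-1",
                "soil__saturated_hydraulic_conductivity": "m s-1"},
}

def match(upstream, downstream):
    matched, unmatched = {}, []
    for name in downstream["inputs"]:
        if name in upstream["outputs"]:
            matched[name] = upstream["outputs"][name]
        else:
            unmatched.append(name)
    return matched, unmatched

matched, unmatched = match(snow_component, runoff_component)
```

A real framework would additionally compare units (invoking a unit-conversion service on mismatch) and grid/time metadata before wiring the variables together.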

  15. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increase are insufficient to meet projected future food demand. Furthermore, with projected reductions in arable land, decreasing water availability, and the increasing impact of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole-plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an interconnected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high-performance and parallel computing. We are currently designing a user-friendly interface that will make this tool equally accessible to biologists and computer scientists. 
Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  16. National water, food, and trade modeling framework: The case of Egypt.

    PubMed

    Abdelkader, A; Elshorbagy, A; Tuninetti, M; Laio, F; Ridolfi, L; Fahmy, H; Hoekstra, A Y

    2018-10-15

    This paper introduces a modeling framework for the analysis of real and virtual water flows at national scale. The framework has two components: (1) a national water model that simulates agricultural, industrial and municipal water uses, and available water and land resources; and (2) an international virtual water trade model that captures national virtual water exports and imports related to trade in crops and animal products. This National Water, Food & Trade (NWFT) modeling framework is applied to Egypt, a water-poor country and the world's largest importer of wheat. Egypt's food and water gaps and the country's food (virtual water) imports are estimated over a baseline period (1986-2013) and projected up to 2050 based on four scenarios. Egypt's food and water gaps are growing rapidly as a result of steep population growth and limited water resources. The NWFT modeling framework shows the nexus of population dynamics, water uses by different sectors, and their compounding effects on Egypt's food gap and water self-sufficiency. The sensitivity analysis reveals that, for solving Egypt's water and food problem, non-water-based solutions such as educational, health, and awareness programs aimed at lowering population growth will be an essential addition to traditional water resources development. Both the national and the global models project similar trends in Egypt's food gap. The NWFT modeling framework can be easily adapted to other nations and regions. Copyright © 2018. Published by Elsevier B.V.
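
The demand/supply bookkeeping behind a national water gap can be caricatured in a few lines. All figures below are illustrative round numbers, not the paper's Egyptian data or scenarios:

```python
# Per-capita demand times a geometrically growing population is met first
# from renewable supply; the remainder is the "gap" that must be covered
# by virtual water imports (embedded in food trade).

def project_gap(pop0, growth, per_capita_demand, supply, years):
    """Yearly water gap (km3/yr) over a projection horizon."""
    gaps = []
    pop = pop0
    for _ in range(years):
        demand = pop * per_capita_demand
        gaps.append(max(0.0, demand - supply))
        pop *= 1.0 + growth
    return gaps

# e.g. 95 M people, 2%/yr growth, 800 m3/person/yr demand (= 800e-9 km3),
# 60 km3/yr renewable supply -- all hypothetical
gaps = project_gap(pop0=95e6, growth=0.02, per_capita_demand=800e-9,
                   supply=60.0, years=10)
```

Even this toy version reproduces the qualitative point of the sensitivity analysis: with fixed supply, the gap trajectory is driven almost entirely by the population growth rate.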

  17. A modeling framework for exposing risks in complex systems.

    PubMed

    Sharit, J

    2000-08-01

    This article introduces and develops a modeling framework for exposing risks in the form of human errors and adverse consequences in high-risk systems. The modeling framework is based on two components: a two-dimensional theory of accidents in systems developed by Perrow in 1984, and the concept of multiple system perspectives. The theory of accidents differentiates systems on the basis of two sets of attributes. One set characterizes the degree to which systems are interactively complex; the other emphasizes the extent to which systems are tightly coupled. The concept of multiple perspectives provides alternative descriptions of the entire system that serve to enhance insight into system processes. The usefulness of these two model components derives from a modeling framework that cross-links them, enabling a variety of work contexts to be exposed and understood that would otherwise be very difficult or impossible to identify. The model components and the modeling framework are illustrated in the case of a large and comprehensive trauma care system. In addition to its general utility in the area of risk analysis, this methodology may be valuable in applications of current methods of human and system reliability analysis in complex and continually evolving high-risk systems.
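
Perrow's two-dimensional scheme can be encoded trivially as a quadrant lookup; the numeric scores and the 0.5 thresholds below are illustrative inventions, since Perrow's axes are qualitative:

```python
def perrow_quadrant(interactive_complexity, coupling):
    """Place a system in Perrow's (1984) 2x2 scheme.

    Each argument is a score in [0, 1]; 0.5 splits each axis.
    """
    complexity = "complex" if interactive_complexity >= 0.5 else "linear"
    tightness = "tight" if coupling >= 0.5 else "loose"
    return f"{complexity} interactions / {tightness} coupling"

# e.g. a nuclear plant is the canonical complex, tightly coupled case,
# while an assembly line has linear interactions with tight coupling
plant = perrow_quadrant(0.9, 0.9)
line = perrow_quadrant(0.2, 0.8)
```

In the article's usage, the quadrant a system occupies indicates which classes of error and accident dynamics the multiple-perspectives analysis should probe.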

  18. Modeling Philosophies and Applications

    EPA Pesticide Factsheets

    All models begin with a framework and a set of assumptions and limitations that go along with that framework. In terms of hydraulic fracturing ("fracing") and risk assessment (RA), there are several places where models and parameters must be chosen to complete hazard identification.

  19. A FEniCS-based programming framework for modeling turbulent flow by the Reynolds-averaged Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Mortensen, Mikael; Langtangen, Hans Petter; Wells, Garth N.

    2011-09-01

    Finding an appropriate turbulence model for a given flow case usually calls for extensive experimentation with both models and numerical solution methods. This work presents the design and implementation of a flexible, programmable software framework for assisting with numerical experiments in computational turbulence. The framework targets Reynolds-averaged Navier-Stokes models, discretized by finite element methods. The novel implementation makes use of Python and the FEniCS package, the combination of which leads to compact and reusable code, where model- and solver-specific code resemble closely the mathematical formulation of equations and algorithms. The presented ideas and programming techniques are also applicable to other fields that involve systems of nonlinear partial differential equations. We demonstrate the framework in two applications and investigate the impact of various linearizations on the convergence properties of nonlinear solvers for a Reynolds-averaged Navier-Stokes model.
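
    FEniCS itself is not reproduced here, but the abstract's closing point, that the choice of linearization affects the convergence of nonlinear solvers, can be illustrated on a scalar model problem. A minimal pure-Python sketch (the equation u = cos(u) is a stand-in, not a RANS model) comparing fixed-point (Picard) iteration with Newton's method:

```python
# Compare two linearizations of the nonlinear problem F(u) = u - cos(u) = 0:
# Picard (fixed-point) iteration converges linearly; Newton quadratically.
import math

def picard(u=0.0, tol=1e-10, max_it=200):
    """Fixed-point iteration u <- cos(u)."""
    for it in range(1, max_it + 1):
        u_new = math.cos(u)
        if abs(u_new - u) < tol:
            return u_new, it
        u = u_new
    return u, max_it

def newton(u=0.0, tol=1e-10, max_it=200):
    """Newton iteration: solve F'(u) du = -F(u), update u."""
    for it in range(1, max_it + 1):
        du = -(u - math.cos(u)) / (1.0 + math.sin(u))
        u += du
        if abs(du) < tol:
            return u, it
    return u, max_it

u_p, n_p = picard()
u_n, n_n = newton()
print(n_p, n_n)   # Newton reaches the tolerance in far fewer iterations
```

    Both iterations find the same root, but the iteration counts differ by an order of magnitude, which is why a programmable framework for swapping linearizations is useful.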

  20. Evidence-Based Leadership Development: The 4L Framework

    ERIC Educational Resources Information Center

    Scott, Shelleyann; Webber, Charles F.

    2008-01-01

    Purpose: This paper aims to use the results of three research initiatives to present the life-long learning leader 4L framework, a model for leadership development intended for use by designers and providers of leadership development programming. Design/methodology/approach: The 4L model is a conceptual framework that emerged from the analysis of…

  1. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    ERIC Educational Resources Information Center

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…

  2. A Framework for Studying Minority Youths' Transitions to Fatherhood: The Case of Puerto Rican Adolescents

    ERIC Educational Resources Information Center

    Erkut, Sumru; Szalacha, Laura A.; Coll, Cynthia Garcia

    2005-01-01

    A theoretical framework is proposed for studying minority young men's involvement with their babies that combines the integrative model of minority youth development and a life course developmental perspective with Lamb's revised four-factor model of father involvement. This framework posits a relationship between demographic and family background…

  3. Improving component interoperability and reusability with the java connection framework (JCF): overview and application to the ages-w environmental model

    USDA-ARS?s Scientific Manuscript database

    Environmental modeling framework (EMF) design goals are multi-dimensional and often include many aspects of general software framework development. Many functional capabilities offered by current EMFs are closely related to interoperability and reuse aspects. For example, an EMF needs to support dev...

  4. Alternative Frameworks for the Study of Man.

    ERIC Educational Resources Information Center

    Markova, Ivana

    1979-01-01

    Two frameworks for the study of man are discussed. The Cartesian model views man as a physical object. A dialectic framework, with the emphasis on the self, grew out of nineteenth century romanticism and reflects the theories of Hegel. Both models have had an effect on social psychology and the study of interpersonal communication. (BH)

  5. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    ERIC Educational Resources Information Center

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  6. An introduction to the multisystem model of knowledge integration and translation.

    PubMed

    Palmer, Debra; Kramlich, Debra

    2011-01-01

    Many nurse researchers have designed strategies to assist health care practitioners to move evidence into practice. While many have been identified as "models," most do not have a conceptual framework. They are unidirectional, complex, and difficult for novice research users to understand. These models have focused on empirical knowledge and ignored the importance of practitioners' tacit knowledge. The Communities of Practice conceptual framework allows for the integration of tacit and explicit knowledge into practice. This article describes the development of a new translation model, the Multisystem Model of Knowledge Integration and Translation, supported by the Communities of Practice conceptual framework.

  7. The Inter-Sectoral Impact Model Intercomparison Project (ISI–MIP): Project framework

    PubMed Central

    Warszawski, Lila; Frieler, Katja; Huber, Veronika; Piontek, Franziska; Serdeczny, Olivia; Schewe, Jacob

    2014-01-01

    The Inter-Sectoral Impact Model Intercomparison Project offers a framework to compare climate impact projections in different sectors and at different scales. Consistent climate and socio-economic input data provide the basis for a cross-sectoral integration of impact projections. The project is designed to enable quantitative synthesis of climate change impacts at different levels of global warming. This report briefly outlines the objectives and framework of the first, fast-tracked phase of the Inter-Sectoral Impact Model Intercomparison Project, based on global impact models, and provides an overview of the participating models, input data, and scenario set-up. PMID:24344316

  8. Combined Use of GIS, Hydrostratigraphic, Geochemical, and Multi-Isotope Analysis for Groundwater Preservation and Development in a Complex Karst Setting

    NASA Astrophysics Data System (ADS)

    Murgulet, D.; Cook, M. R.

    2011-12-01

    The complex stratigraphy and geologic structure characteristic of the fractured karst aquifers underlying an urban part of the north-central Alabama Valley and Ridge setting make the development and protection of groundwater sources difficult. In this area, population growth accompanied by increased impervious surfaces, storm-water runoff, contaminants, subsidence, and pumping rates has placed the groundwater resource at risk. The potential for aquifer recharge and the flow conditions were evaluated in order to determine the current and future alternative water sources available in this area. Geochemical and multi-isotope techniques were coupled with hydrostratigraphic and geomorphic spatial (GIS) analyses to determine the primary mechanisms controlling recharge and flow and to evaluate seasonal impacts on groundwater resources and recharge environments. Groundwater samples, collected in summer and fall (2010) from wells developed in the Bangor Limestone and Tuscumbia Fort Payne aquifers (north-central Alabama), were analyzed for major ions; stable isotopes of oxygen (δ¹⁸O), hydrogen (δD), and carbon (δ¹³C); and anthropogenic tracers such as chlorofluorocarbons (CFCs) and sulphur hexafluoride (SF₆). Stable isotope investigations suggest that recharge occurs under relatively closed conditions, with fast percolation rates over short periods (characteristic of karst aquifers) and low evaporation rates during the colder seasons. The average δ¹³C value (−11.4±2‰ PDB, n=9) lies near the combined average δ¹³C values of soil CO₂ and the carbonate. Therefore, the groundwater δ¹³C signature is mainly controlled by two factors: soil CO₂ and carbonate dissolution. 
Static water levels decrease over the summer, causing drawdowns (2 to 5.2 meters) in all the production wells and a slight shift of the δ¹⁸O and δD values toward more positive compositions (summer: δ¹⁸O −5.1±0.1 to −5.7±0.1‰ VSMOW, n=11, and δD −25.0±1 to −30.6±1‰ VSMOW, n=11; fall: δ¹⁸O −4.8±0.1 to −5.4±0.1‰ VSMOW, n=9, and δD −25.4±1 to −27.4±1‰ VSMOW, n=9). Thus, during the summer, while groundwater levels were dropping, the aquifers were replenished with less mineralized waters (specific conductance: 235 to 194 μS/cm, n=8). The higher specific conductance values characteristic of the Bangor Limestone aquifer (290 μS/cm, n=4) are correlated with younger ages (19±2 years, n=2), suggesting faster groundwater travel times compared to the Tuscumbia Fort Payne aquifer (157 μS/cm, n=5; 23.8±2 years, n=4). Generally, the highest water levels and greatest groundwater ages are characteristic of the Tuscumbia Fort Payne aquifer, suggestive of longer travel times and higher recharge rates. In contrast, the Bangor Limestone aquifer experiences shorter residence times and lower water levels and, therefore, lower recharge rates. The distribution of recharge areas and the geochemical analyses reveal a more localized source of recharge for the Bangor Limestone aquifer (within the delineated potential aquifer area residing on the outcrop) and a more distant source for the Tuscumbia Fort Payne aquifer.
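
    The two-endmember interpretation of the δ¹³C average can be checked with a back-of-the-envelope mixing calculation. The endmember values below are assumed, typical literature figures (soil CO₂ from C3 vegetation near −23‰, marine carbonate near 0‰), not values reported in this study:

```python
# Linear two-endmember mixing of carbon isotope signatures.
# Endmember values are assumed typical figures, not data from the study.

def mix_d13c(d13c_soil, d13c_carbonate, f_soil=0.5):
    """delta-13C of a mixture with fraction f_soil of soil-derived carbon."""
    return f_soil * d13c_soil + (1.0 - f_soil) * d13c_carbonate

d13c = mix_d13c(-23.0, 0.0)
print(d13c)   # -11.5 permil, consistent with the reported -11.4 +/- 2 permil
```

    An equal-parts mixture of those endmembers lands within the reported uncertainty, which is why the abstract attributes the signature to soil CO₂ plus carbonate dissolution.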

  9. Role of different types of solid models in hydrodynamic modeling and their effects on groundwater protection processes

    NASA Astrophysics Data System (ADS)

    Bódi, Erika; Buday, Tamás; McIntosh, Richard William

    2013-04-01

    Defining extraction-modified flow patterns with hydrodynamic models is a pivotal question in preserving groundwater resources with regard to both quality and quantity. Modeling is the first step in groundwater protection; its main result is the determination of the protection area as a function of the amount of extracted water. Solid models have significant effects on hydrodynamic models, since the latter are built on them. Owing to legislative regulations, certain restrictions must be applied within protection areas, which has firm consequences for economic activities. Hungarian regulations give no clear instructions for establishing either geological or hydrodynamic models; modeling itself, however, is an obligation. Choosing the modeling method is a key consideration for the subsequent numerical calculations and is decisive for the shape and size of the groundwater protection area. The geometry of the hydrodynamic model layers is derived from the solid model. There are different geological approaches, including lithological and sequence stratigraphic classifications; furthermore, in the case of regional models, formation-based hydrostratigraphic units are also applicable. Lithological classification is based on assigning and mapping lithotypes. When the geometry (e.g., tectonic characteristics) of the research area is not known, horizontal bedding is assumed, the validity of which cannot be assessed from lithology alone. If the geological correlation is based on sequence stratigraphic studies, the cyclicity of sediment deposition is also considered. This method is more integrated, as numerous parameters (e.g., electrofacies) are taken into consideration when studying the geological conditions, ensuring more reliable modeling. Layers of sequence stratigraphic models can be either lithologically homogeneous or they may comprise larger sedimentary cycles containing several lithological units. 
The advantage of this is that the model can more easily handle pinching-out lithological units and lenticular bodies, whereas most hydrodynamic software packages cannot handle flow units tied to such model layers. The interpretation of tectonic disturbance is similar. In Hungary, groundwater is extracted mainly from Pleistocene and Pannonian aquifers, the sediments of which were deposited in the ancient Pannonian Lake. When the basin lost its open-marine connection, eustasy no longer had direct effects on facies changes, so subsidence and sediment supply became the main factors. Various basin-filling facies developed, including alluvial plain facies, different delta facies types, and pelitic deep-basin facies. Creating solid models with sequence stratigraphic methods requires more raw data, a genetic approach, and more working hours; hence this method is seldom used in practice. Lithology-based models can be transformed into sequence stratigraphic models by extending the database (e.g., by acquiring additional survey data). In environments where the resulting models differ significantly, notable changes can occur in the supply directions and in the groundwater travel times of the two models, even under equal extraction terms. Our study aims to call attention to the consequences of using different solid models for typical depositional systems of the Great Hungarian Plain and to their effects on groundwater protection.

  10. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    ERIC Educational Resources Information Center

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  11. Evolution of 3-D geologic framework modeling and its application to groundwater flow studies

    USGS Publications Warehouse

    Blome, Charles D.; Smith, David V.

    2012-01-01

    In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.

  12. THE EPA MULTIMEDIA INTEGRATED MODELING SYSTEM SOFTWARE SUITE

    EPA Science Inventory

    The U.S. EPA is developing a Multimedia Integrated Modeling System (MIMS) framework that will provide a software infrastructure or environment to support constructing, composing, executing, and evaluating complex modeling studies. The framework will include (1) common software ...

  13. Conceptual modeling framework to support development of site-specific selenium criteria for Lake Koocanusa, Montana, U.S.A., and British Columbia, Canada

    USGS Publications Warehouse

    Jenni, Karen E.; Naftz, David L.; Presser, Theresa S.

    2017-10-16

    The U.S. Geological Survey, working with the Montana Department of Environmental Quality and the British Columbia Ministry of the Environment and Climate Change Strategy, has developed a conceptual modeling framework that can be used to provide structured and scientifically based input to the Lake Koocanusa Monitoring and Research Working Group as they consider potential site-specific selenium criteria for Lake Koocanusa, a transboundary reservoir located in Montana and British Columbia. This report describes that modeling framework, provides an example of how it can be applied, and outlines possible next steps for implementing the framework.

  14. A Framework for the Optimization of Discrete-Event Simulation Models

    NASA Technical Reports Server (NTRS)

    Joshi, B. D.; Unal, R.; White, N. H.; Morris, W. D.

    1996-01-01

    With the growing use of computer modeling and simulation in all aspects of engineering, the scope of traditional optimization has to be extended to include simulation models. Optimizing via stochastic simulation models raises some unique issues: the optimization procedure has to explicitly account for the randomness inherent in the stochastic measures predicted by the model. This paper outlines a general-purpose framework for the optimization of terminating discrete-event simulation models. The methodology combines a chance-constraint approach for problem formulation with standard statistical estimation and analysis techniques. The applicability of the optimization framework is illustrated by minimizing the operation and support resources of a launch vehicle through a simulation model.
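
    The chance-constraint idea can be sketched in a few lines: choose the smallest resource level whose simulated probability of meeting a deadline clears a target level. The "simulation" below is a made-up stand-in, not the paper's launch-vehicle model, and all parameters are illustrative assumptions:

```python
# Chance-constrained selection over a stochastic simulation (toy stand-in).
import random

random.seed(42)

def simulate_turnaround(resources, workload=100.0):
    """One stochastic replication: turnaround time shrinks with resources."""
    return workload / resources + random.expovariate(1.0 / 5.0)

def meets_chance_constraint(resources, deadline=30.0, alpha=0.9, n_reps=2000):
    """Monte Carlo estimate of P(turnaround <= deadline), compared to alpha."""
    hits = sum(simulate_turnaround(resources) <= deadline for _ in range(n_reps))
    return hits / n_reps >= alpha

# Smallest resource level satisfying the chance constraint.
best = next(r for r in range(1, 21) if meets_chance_constraint(r))
print(best)
```

    Because the constraint is estimated from replications, a full treatment would also attach a confidence interval to the estimated probability, as the statistical-analysis part of the framework suggests.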

  15. Conceptual models for cumulative risk assessment.

    PubMed

    Linder, Stephen H; Sexton, Ken

    2011-12-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive "family" of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects.

  16. Conceptual Models for Cumulative Risk Assessment

    PubMed Central

    Sexton, Ken

    2011-01-01

    In the absence of scientific consensus on an appropriate theoretical framework, cumulative risk assessment and related research have relied on speculative conceptual models. We argue for the importance of theoretical backing for such models and discuss 3 relevant theoretical frameworks, each supporting a distinctive “family” of models. Social determinant models postulate that unequal health outcomes are caused by structural inequalities; health disparity models envision social and contextual factors acting through individual behaviors and biological mechanisms; and multiple stressor models incorporate environmental agents, emphasizing the intermediary role of these and other stressors. The conclusion is that more careful reliance on established frameworks will lead directly to improvements in characterizing cumulative risk burdens and accounting for disproportionate adverse health effects. PMID:22021317

  17. A mixed model framework for teratology studies.

    PubMed

    Braeken, Johan; Tuerlinckx, Francis

    2009-10-01

    A mixed model framework is presented to model the characteristic multivariate binary anomaly data as provided in some teratology studies. The key features of the model are the incorporation of covariate effects, a flexible random effects distribution by means of a finite mixture, and the application of copula functions to better account for the relation structure of the anomalies. The framework is motivated by data of the Boston Anticonvulsant Teratogenesis study and offers an integrated approach to investigate substantive questions, concerning general and anomaly-specific exposure effects of covariates, interrelations between anomalies, and objective diagnostic measurement.

  18. Moral judgment as information processing: an integrative review.

    PubMed

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others' behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment.

  19. Toward a consistent modeling framework to assess multi-sectoral climate impacts.

    PubMed

    Monier, Erwan; Paltsev, Sergey; Sokolov, Andrei; Chen, Y-H Henry; Gao, Xiang; Ejaz, Qudsia; Couzo, Evan; Schlosser, C Adam; Dutkiewicz, Stephanie; Fant, Charles; Scott, Jeffery; Kicklighter, David; Morris, Jennifer; Jacoby, Henry; Prinn, Ronald; Haigh, Martin

    2018-02-13

    Efforts to estimate the physical and economic impacts of future climate change face substantial challenges. To enrich the currently popular approaches to impact analysis, which involve evaluating a damage function or making multi-model comparisons based on a limited number of standardized scenarios, we propose integrating a geospatially resolved physical representation of impacts into a coupled human-Earth system modeling framework. Large internationally coordinated exercises cannot easily respond to new policy targets, and the implementation of standard scenarios across models, institutions, and research communities can yield inconsistent estimates. Here, we argue for a shift toward the use of a self-consistent integrated modeling framework to assess climate impacts, and discuss ways the integrated assessment modeling community can move in this direction. We then demonstrate the capabilities of such a modeling framework by conducting a multi-sectoral assessment of climate impacts under a range of consistent and integrated economic and climate scenarios that are responsive to new policies and business expectations.

  20. Integrated city as a model for a new wave urban tourism

    NASA Astrophysics Data System (ADS)

    Ariani, V.

    2018-03-01

    Cities are major players among urban tourism destinations. The growth of mass urban tourism sharpens competition among cities with similar characteristics. A new framework model for new-wave urban tourism is therefore crucial to give tourists a richer experience and to add value for the city itself. The integrated city is the answer for creating a new model of an urban tourism destination. The purpose of this preliminary research is to define an integrated-city framework for urban tourism development. It provides a rationale for tourism planners pursuing an innovative approach, competitive advantages, and a general urban tourism destination model. The methodology applied in this research includes a desk survey, a literature review, and a focus group discussion. A conceptual framework is proposed, discussed, and exemplified. The framework model adopts a place-based approach to the tourism destination and suggests an integrated-city model for urban tourism development. This model is a tool for strategy making in re-inventing the integrated city as an urban tourism destination.

  1. A conceptual framework for a long-term economic model for the treatment of attention-deficit/hyperactivity disorder.

    PubMed

    Nagy, Balázs; Setyawan, Juliana; Coghill, David; Soroncz-Szabó, Tamás; Kaló, Zoltán; Doshi, Jalpa A

    2017-06-01

    Models incorporating long-term outcomes (LTOs) are not available to assess the health economic impact of attention-deficit/hyperactivity disorder (ADHD). Our objective was to develop a conceptual modelling framework capable of assessing the long-term economic impact of ADHD therapies. The literature was reviewed, and a conceptual structure for the long-term model was outlined with attention to disease characteristics and the potential impact of treatment strategies. The proposed model has four layers: i) a multi-state short-term framework to differentiate between ADHD treatments; ii) multiple states merged into three core health states associated with LTOs; iii) a series of sub-models in which particular LTOs are depicted; and iv) outcomes collected to be either used directly for economic analyses or translated into other relevant measures. This conceptual model provides a framework for assessing relationships between short- and long-term outcomes of the disease and its treatment, and for estimating the economic impact of ADHD treatments throughout the course of the disease.

  2. Experimental analysis of chaotic neural network models for combinatorial optimization under a unifying framework.

    PubMed

    Kwok, T; Smith, K A

    2000-09-01

    The aim of this paper is to study both the theoretical and experimental properties of chaotic neural network (CNN) models for solving combinatorial optimization problems. Previously we have proposed a unifying framework which encompasses the three main model types, namely, Chen and Aihara's chaotic simulated annealing (CSA) with decaying self-coupling, Wang and Smith's CSA with decaying timestep, and the Hopfield network with chaotic noise. Each of these models can be represented as a special case under the framework for certain conditions. This paper combines the framework with experimental results to provide new insights into the effect of the chaotic neurodynamics of each model. By solving the N-queen problem of various sizes with computer simulations, the CNN models are compared in different parameter spaces, with optimization performance measured in terms of feasibility, efficiency, robustness and scalability. Furthermore, characteristic chaotic neurodynamics crucial to effective optimization are identified, together with a guide to choosing the corresponding model parameters.
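
    The decaying self-coupling mentioned above (the Chen-Aihara chaotic simulated annealing family) can be sketched for a single neuron: chaotic search early on, gradually annealed as the self-coupling term z shrinks. The parameter values below are illustrative assumptions, not the paper's experimental settings:

```python
# Single chaotic neuron with decaying self-coupling (Chen-Aihara style).
# Parameter values are illustrative assumptions.
import math

def sigmoid(u, eps=0.004):
    """Steep output function x = 1 / (1 + exp(-u/eps))."""
    return 1.0 / (1.0 + math.exp(-u / eps))

def run_csa(steps=400, k=0.9, i0=0.65, z0=0.08, beta=0.01):
    """Iterate internal state y; self-coupling z decays by factor (1 - beta)."""
    y, z = 0.283, z0
    xs = []
    for _ in range(steps):
        x = sigmoid(y)
        y = k * y - z * (x - i0)   # damped state minus decaying self-feedback
        z *= (1.0 - beta)          # annealing: chaos weakens over time
        xs.append(x)
    return xs

xs = run_csa()
print(xs[0], xs[-1])   # neuron outputs stay in (0, 1) throughout
```

    In the full optimization setting, a network of such neurons encodes the N-queen constraints in its coupling weights; the decay schedule beta is one of the parameters whose choice the paper's comparison is meant to guide.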

  3. Moral judgment as information processing: an integrative review

    PubMed Central

    Guglielmo, Steve

    2015-01-01

    How do humans make moral judgments about others’ behavior? This article reviews dominant models of moral judgment, organizing them within an overarching framework of information processing. This framework poses two distinct questions: (1) What input information guides moral judgments? and (2) What psychological processes generate these judgments? Information Models address the first question, identifying critical information elements (including causality, intentionality, and mental states) that shape moral judgments. A subclass of Biased Information Models holds that perceptions of these information elements are themselves driven by prior moral judgments. Processing Models address the second question, and existing models have focused on the relative contribution of intuitive versus deliberative processes. This review organizes existing moral judgment models within this framework and critically evaluates them on empirical and theoretical grounds; it then outlines a general integrative model grounded in information processing, and concludes with conceptual and methodological suggestions for future research. The information-processing framework provides a useful theoretical lens through which to organize extant and future work in the rapidly growing field of moral judgment. PMID:26579022

  4. A FRAMEWORK FOR FINE-SCALE COMPUTATIONAL FLUID DYNAMICS AIR QUALITY MODELING AND ANALYSIS

    EPA Science Inventory

    This paper discusses a framework for fine-scale CFD modeling that may be developed to complement the present Community Multi-scale Air Quality (CMAQ) modeling system which itself is a computational fluid dynamics model. A goal of this presentation is to stimulate discussions on w...

  5. Conceptual Modeling Framework for E-Area PA HELP Infiltration Model Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dyer, J. A.

    A conceptual modeling framework based on the proposed E-Area Low-Level Waste Facility (LLWF) closure cap design is presented for conducting Hydrologic Evaluation of Landfill Performance (HELP) model simulations of intact and subsided cap infiltration scenarios for the next E-Area Performance Assessment (PA).

  6. Graphical Means for Inspecting Qualitative Models of System Behaviour

    ERIC Educational Resources Information Center

    Bouwer, Anders; Bredeweg, Bert

    2010-01-01

    This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are combined into model fragments and scenarios.…

  7. Investigating Experimental Effects within the Framework of Structural Equation Modeling: An Example with Effects on Both Error Scores and Reaction Times

    ERIC Educational Resources Information Center

    Schweizer, Karl

    2008-01-01

    Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…

  8. Overview of the Special Issue: A Multi-Model Framework to Achieve Consistent Evaluation of Climate Change Impacts in the United States

    EPA Science Inventory

    The Climate Change Impacts and Risk Analysis (CIRA) project establishes a new multi-model framework to systematically assess the impacts, economic damages, and risks from climate change in the United States. The primary goal of this framework is to estimate how climate change impac...

  9. Understanding Illinois Principals' Concerns Implementing Charlotte Danielson's Framework for Teaching as a Model for Evaluation

    ERIC Educational Resources Information Center

    Mckenna, George Tucker

    2017-01-01

    The purpose of this study is to determine the levels of concern of Illinois principals regarding the adoption of an evaluation system modeled after Charlotte Danielson's Framework for Teaching. Principal demographics and involvement in the use of and professional development surrounding Charlotte Danielson's Framework for Teaching were studied for…

  10. A Response to the Review of the Community of Inquiry Framework

    ERIC Educational Resources Information Center

    Akyol, Zehra; Arbaugh, J. Ben; Cleveland-Innes, Marti; Garrison, D. Randy; Ice, Phil; Richardson, Jennifer C.; Swan, Karen

    2009-01-01

    The Community of Inquiry (CoI) framework has become a prominent model of teaching and learning in online and blended learning environments. Considerable research has been conducted which employs the framework with promising results, resulting in wide use to inform the practice of online and blended teaching and learning. For the CoI model to…

  11. A framework to analyze emissions implications of ...

    EPA Pesticide Factsheets

    Future-year emissions depend strongly on the evolution of the economy, technology, and current and future regulatory drivers. A scenario framework was adopted to analyze various technology development pathways and societal changes, while considering existing regulations and future regulatory uncertainty, and to evaluate the resulting emissions growth patterns. The framework integrates EPA's energy systems model with an economic input-output (I/O) life cycle assessment model. The EPAUS9r MARKAL database is assembled from a set of technologies to represent the U.S. energy system within the MARKAL bottom-up, technology-rich energy modeling framework. The general state of the economy and the consequent demands for goods and services from these sectors are taken exogenously in MARKAL. It is important to characterize exogenous inputs about the economy to appropriately represent the industrial sector outlook for each of the scenarios and case studies evaluated. An economic input-output (I/O) model of the US economy is constructed to link up with MARKAL. The I/O model enables the user to change input requirements (e.g. energy intensity) for different sectors or the share of consumer income expended on a given good. This gives end-users a mechanism for modeling change in the two dimensions of technological progress and consumer preferences that define the future scenarios. The framework will then be extended to include an environmental I/O framework to track life cycle emissions associated
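
    At the heart of any I/O linkage of this kind is the classic Leontief calculation: total sectoral output x solves x = Ax + d for a technical-coefficient matrix A and final demand d. A minimal two-sector sketch with made-up coefficients (not EPAUS9r data):

```python
# Toy Leontief input-output calculation: solve (I - A) x = d by Cramer's rule.
# The 2-sector technical coefficients and demands are made-up numbers.

def leontief_2x2(a, d):
    """Total output x satisfying x = A x + d for a 2x2 coefficient matrix."""
    m = [[1.0 - a[0][0], -a[0][1]],
         [-a[1][0], 1.0 - a[1][1]]]
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    x0 = (m[1][1] * d[0] - m[0][1] * d[1]) / det
    x1 = (-m[1][0] * d[0] + m[0][0] * d[1]) / det
    return [x0, x1]

A = [[0.2, 0.3],   # sector-1 inputs per unit of each sector's output
     [0.1, 0.4]]   # sector-2 inputs per unit of each sector's output
demand = [100.0, 200.0]
print(leontief_2x2(A, demand))   # -> approximately [266.7, 377.8]
```

    Changing a coefficient of A (e.g., lowering a sector's energy intensity) changes the implied total outputs, which is the mechanism the framework uses to propagate technology and preference scenarios into MARKAL demands.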

  12. Establishing a Numerical Modeling Framework for Hydrologic Engineering Analyses of Extreme Storm Events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xiaodong; Hossain, Faisal; Leung, L. Ruby

    In this study a numerical modeling framework for simulating extreme storm events was established using the Weather Research and Forecasting (WRF) model. Such a framework is necessary for the derivation of engineering parameters such as probable maximum precipitation that are the cornerstone of large water management infrastructure design. Here this framework was built based on a heavy storm that occurred in Nashville (USA) in 2010, and verified using two other extreme storms. To achieve the optimal setup, several combinations of model resolutions, initial/boundary conditions (IC/BC), cloud microphysics and cumulus parameterization schemes were evaluated using multiple metrics of precipitation characteristics. The evaluation suggests that WRF is most sensitive to the IC/BC option. Simulation generally benefits from finer resolutions up to 5 km. At the 15 km level, NCEP2 IC/BC produces better results, while NAM IC/BC performs best at the 5 km level. The recommended model configuration from this study is: NAM or NCEP2 IC/BC (depending on data availability), 15 km or 15 km-5 km nested grids, Morrison microphysics and Kain-Fritsch cumulus schemes. Validation of the optimal framework suggests that these options are good starting choices for modeling extreme events similar to the test cases. This optimal framework is proposed in response to emerging engineering demands of extreme storm event forecasting and analyses for design, operations and risk assessment of large water infrastructure.
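
    The configuration search described above can be sketched as a scored sweep over candidate setups. The gauge observations, simulated totals, and candidate list below are invented stand-ins (no actual WRF output is involved); only the selection mechanics are illustrated, using a single RMSE metric in place of the paper's multiple metrics.

```python
import math

# Scored sweep over candidate configurations. All numbers are made up for
# illustration; a real evaluation would compare WRF output against gauges.
observed = [12.0, 30.5, 8.2]  # storm-total precipitation at three gauges (mm)

simulated = {  # (ic_bc, grid) -> modeled totals at the same gauges
    ("NCEP2", "15km"):     [11.0, 28.0, 9.0],
    ("NAM",   "15km-5km"): [12.5, 29.8, 8.0],
    ("NARR",  "15km"):     [7.0, 22.0, 14.0],
}

def rmse(sim, obs):
    return math.sqrt(sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs))

scores = {cfg: rmse(sim, observed) for cfg, sim in simulated.items()}
best = min(scores, key=scores.get)  # configuration with the lowest RMSE
```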

  13. System modeling with the DISC framework: evidence from safety-critical domains.

    PubMed

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice.

  14. On the Conditioning of Machine-Learning-Assisted Turbulence Modeling

    NASA Astrophysics Data System (ADS)

    Wu, Jinlong; Sun, Rui; Wang, Qiqi; Xiao, Heng

    2017-11-01

    Recently, several researchers have demonstrated that machine learning techniques can be used to improve the RANS-modeled Reynolds stress by training on available databases of high-fidelity simulations. However, obtaining an improved mean velocity field remains an unsolved challenge, restricting the predictive capability of current machine-learning-assisted turbulence modeling approaches. In this work we define a condition number to evaluate the model conditioning of data-driven turbulence modeling approaches, and propose a stability-oriented machine learning framework to model Reynolds stress. Two canonical flows, the flow in a square duct and the flow over periodic hills, are investigated to demonstrate the predictive capability of the proposed framework. The satisfactory prediction of the mean velocity field for both flows demonstrates the predictive capability of the proposed framework for machine-learning-assisted turbulence modeling. By showing that the prediction of the mean flow field can be improved, the proposed stability-oriented machine learning framework bridges the gap between existing machine-learning-assisted turbulence modeling approaches and the predictive capability demanded of turbulence models in real applications.
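
    The role of the condition number can be seen in a stripped-down linear analogue: if the mean-flow solve is u = A^-1 f, a small relative error in the forcing (standing in for an imperfect learned Reynolds-stress term) is amplified by up to cond(A). The 2x2 matrices below are toy examples, not actual RANS operators.

```python
import numpy as np

# Linear analogue of model conditioning: solve u = A^{-1} f, perturb f by df,
# and measure how much the relative error is amplified. Toy matrices only.
f = np.array([1.0, 1.0])
df = np.array([1e-4, -1e-4])  # small error in the "modeled" term

amplification = {}
for name, A in {"well": np.array([[2.0, 0.0], [0.0, 1.0]]),
                "ill":  np.array([[1.0, 1.0], [1.0, 1.0001]])}.items():
    u = np.linalg.solve(A, f)
    du = np.linalg.solve(A, f + df) - u          # induced mean-flow error
    rel_in = np.linalg.norm(df) / np.linalg.norm(f)
    rel_out = np.linalg.norm(du) / np.linalg.norm(u)
    amplification[name] = rel_out / rel_in       # bounded by np.linalg.cond(A)
```

    The ill-conditioned operator turns a 1e-4 relative input error into an O(1) output error, which is exactly why an accurate Reynolds-stress model can still yield a poor mean velocity field.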

  15. A framework for predicting impacts on ecosystem services ...

    EPA Pesticide Factsheets

    Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. The framework introduced here represents an ongoing initiative supported by the National Institute of Mathematical and Biological Synthesis (NIMBioS; http://www.nimbi

  16. Learning in the model space for cognitive fault diagnosis.

    PubMed

    Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin

    2014-01-01

    The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
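
    The core idea, fitting a small model to each sliding-window signal segment and then discriminating in the space of fitted models, can be sketched as follows. The AR(1) fit, the synthetic signals, and the fixed threshold are simplifying assumptions; the paper uses richer (reservoir) models and a learned one-class boundary.

```python
# "Learning in the model space" sketch: fit a tiny AR(1) model per sliding
# window and flag segments whose fitted model is far from the healthy ones.
def fit_ar1(segment):
    # least-squares AR(1) coefficient for x[t] ~ a * x[t-1]
    num = sum(segment[t] * segment[t - 1] for t in range(1, len(segment)))
    den = sum(x * x for x in segment[:-1])
    return num / den

healthy = [0.9 ** t for t in range(50)]   # decaying "healthy" dynamics
faulty  = [0.5 ** t for t in range(50)]   # different dynamics = simulated fault

window = 10
models = [fit_ar1(healthy[i:i + window]) for i in range(0, 40, window)]
reference = sum(models) / len(models)     # centroid of the healthy models

def is_faulty(segment, threshold=0.1):
    # distance in model space, here simply |a - a_reference|
    return abs(fit_ar1(segment) - reference) > threshold
```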

  17. Operationalising a model framework for consumer and community participation in health and medical research

    PubMed Central

    Saunders, Carla; Crossing, Sally; Girgis, Afaf; Butow, Phyllis; Penman, Andrew

    2007-01-01

    The Consumers' Health Forum of Australia and the National Health and Medical Research Council have recently developed a Model Framework for Consumer and Community Participation in Health and Medical Research in order to better align health and medical research with community need, and to improve the impact of research. Model frameworks may have little impact on what goes on in practice unless relevant organisations actively make use of them. Philanthropic and government bodies have lately reported involving consumers in more meaningful or collaborative ways. This paper describes how a large charity organisation, which funds a significant proportion of Australian cancer research, operationalised the model framework using a unique approach, demonstrating that it is both possible and reasonable for research to take account of public values. PMID:17592651

  18. Development of a software framework for data assimilation and its applications for streamflow forecasting in Japan

    NASA Astrophysics Data System (ADS)

    Noh, S. J.; Tachikawa, Y.; Shiiba, M.; Yorozu, K.; Kim, S.

    2012-04-01

    Data assimilation methods have received increased attention as means of uncertainty assessment and of enhancing forecasting capability in various areas. Despite their potential, software frameworks applicable to probabilistic approaches and data assimilation are still limited, because most hydrologic modeling software is based on a deterministic approach. In this study, we developed a hydrological modeling framework for sequential data assimilation, called MPI-OHyMoS. MPI-OHyMoS allows users to develop their own element models and to easily build a total simulation system model for hydrological simulations. Unlike a process-based modeling framework, this software framework benefits from its object-oriented design, which flexibly represents hydrological processes without any change to the main library. Sequential data assimilation based on particle filters is available for any hydrologic model built on MPI-OHyMoS, considering various sources of uncertainty originating from input forcing, parameters and observations. The particle filters implement a Bayesian learning process in which the propagation of all uncertainties is carried out by a suitable selection of randomly generated particles, without any assumptions about the nature of the distributions. In MPI-OHyMoS, ensemble simulations are parallelized and can take advantage of high-performance computing (HPC) systems. We applied this software framework to short-term streamflow forecasting for several catchments in Japan using a distributed hydrologic model. Uncertainty in model parameters and in remotely sensed rainfall data such as X-band or C-band radar is estimated and mitigated through the sequential data assimilation.
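
    The sequential assimilation step that MPI-OHyMoS parallelizes can be reduced to a minimal sampling-importance-resampling particle filter. The scalar AR(1) state transition, noise levels, and observations below are synthetic assumptions; a real application would wrap a distributed hydrologic model instead.

```python
import math, random
random.seed(0)

# Minimal sampling-importance-resampling particle filter on a toy scalar state.
N = 500
particles = [random.gauss(0.0, 1.0) for _ in range(N)]

def step(x):                              # state transition with process noise
    return 0.8 * x + random.gauss(0.0, 0.3)

def likelihood(x, obs, sigma=0.5):        # Gaussian observation model
    return math.exp(-0.5 * ((x - obs) / sigma) ** 2)

estimates = []
for obs in [1.8, 1.5, 1.1, 0.9]:          # synthetic "streamflow" observations
    particles = [step(x) for x in particles]          # predict
    w = [likelihood(x, obs) for x in particles]       # weight by the data
    total = sum(w)
    w = [wi / total for wi in w]
    estimates.append(sum(wi * xi for wi, xi in zip(w, particles)))
    # resample proportionally to the weights to avoid degeneracy
    particles = random.choices(particles, weights=w, k=N)
```

    No distributional assumptions enter beyond the chosen transition and observation models; the weighted particles themselves carry the posterior, which is what makes the approach attractive for non-Gaussian hydrologic uncertainty.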

  19. Implementing Restricted Maximum Likelihood Estimation in Structural Equation Models

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.

    2013-01-01

    Structural equation modeling (SEM) is now a generic modeling framework for many multivariate techniques applied in the social and behavioral sciences. Many statistical models can be considered either as special cases of SEM or as part of the latent variable modeling framework. One popular extension is the use of SEM to conduct linear mixed-effects…

  20. Using the Bifocal Modeling Framework to Resolve "Discrepant Events" between Physical Experiments and Virtual Models in Biology

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Fuhrmann, Tamar; Salehi, Shima

    2016-01-01

    In this paper, we investigate an approach to supporting students' learning in science through a combination of physical experimentation and virtual modeling. We present a study that utilizes a scientific inquiry framework, which we call "bifocal modeling," to link student-designed experiments and computer models in real time. In this…

  1. The intersection of disability and healthcare disparities: a conceptual framework.

    PubMed

    Meade, Michelle A; Mahmoudi, Elham; Lee, Shoou-Yih

    2015-01-01

    This article provides a conceptual framework for understanding healthcare disparities experienced by individuals with disabilities. While health disparities are the result of factors deeply rooted in culture, lifestyle, socioeconomic status, and accessibility of resources, healthcare disparities are a subset of health disparities that reflect differences in access to and quality of healthcare and can be viewed as the inability of the healthcare system to adequately address the needs of specific population groups. This article uses a narrative method to identify and critique the main conceptual frameworks that have been used in analyzing disparities in healthcare access and quality, and evaluates those frameworks in the context of healthcare for individuals with disabilities. Specific models that are examined include the Aday and Andersen Model, the Grossman Utility Model, the Institute of Medicine (IOM)'s models of Access to Healthcare Services and Healthcare Disparities, and the Cultural Competency model. While existing frameworks advance understanding of disparities in healthcare access and quality, they fall short when applied to individuals with disabilities. Specific deficits include a lack of attention to cultural and contextual factors (Aday and Andersen framework), unrealistic assumptions regarding equal access to resources (Grossman's utility model), a lack of recognition or inclusion of concepts of structural accessibility (IOM model of Healthcare Disparities), and an exclusive emphasis on the supply side of the healthcare equation (Cultural Competency model). In response to identified gaps in the literature and shortcomings of current conceptualizations, an integrated model of disability and healthcare disparities is put forth. We analyzed models of access to care and of disparities in healthcare in order to build an integrated, cohesive conceptual framework that addresses issues related to access to healthcare among individuals with disabilities. The Model of Healthcare Disparities and Disability (MHDD) provides a framework for conceptualizing how healthcare disparities impact disability and, specifically, how a mismatch between personal and environmental factors may result in reduced healthcare access and quality, which in turn may lead to reduced functioning, activity, and participation among individuals with impairments and chronic health conditions. Researchers, health providers, policy makers, and community advocacy groups engaged in devising interventions aimed at reducing healthcare disparities would benefit from these discussions. Implications for Rehabilitation: Evaluates the main models of healthcare disparity and disability to create an integrated framework. Provides a comprehensive conceptual model of healthcare disparity that specifically targets issues related to individuals with disabilities. Conceptualizes how personal and environmental factors interact to produce disparities in access to healthcare and healthcare quality. Recognizes and targets modifiable factors to reduce disparities between and within individuals with disabilities.

  2. Material and morphology parameter sensitivity analysis in particulate composite materials

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyu; Oskay, Caglar

    2017-12-01

    This manuscript presents a novel parameter sensitivity analysis framework for damage and failure modeling of particulate composite materials subjected to dynamic loading. The proposed framework employs global sensitivity analysis to study the variance in the failure response as a function of model parameters. In view of the computational complexity of performing thousands of detailed microstructural simulations to characterize sensitivities, Gaussian process (GP) surrogate modeling is incorporated into the framework. In order to capture the discontinuity in response surfaces, the GP models are integrated with a support vector machine classification algorithm that identifies the discontinuities within response surfaces. The proposed framework is employed to quantify variability and sensitivities in the failure response of polymer bonded particulate energetic materials under dynamic loads to material properties and morphological parameters that define the material microstructure. Particular emphasis is placed on the identification of sensitivity to interfaces between the polymer binder and the energetic particles. The proposed framework has been demonstrated to identify the most consequential material and morphological parameters under vibrational and impact loads.
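
    The Gaussian-process surrogate at the heart of the framework can be sketched in plain NumPy for a one-dimensional input. The RBF kernel, length scale, noise level, and the sine function standing in for the failure response are all illustrative assumptions, and the SVM discontinuity classifier is omitted.

```python
import numpy as np

# One-dimensional GP surrogate (posterior mean only) in plain NumPy.
def rbf(a, b, length=0.2):
    # squared-exponential kernel between two 1-D point sets
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

x_train = np.linspace(0.0, 1.0, 8)       # sampled parameter values
y_train = np.sin(2 * np.pi * x_train)    # stand-in for a smooth failure response
noise = 1e-6                             # jitter for numerical stability

K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)      # weights of the posterior mean

def predict(x_new):
    return rbf(x_new, x_train) @ alpha   # surrogate prediction at new points
```

    A surrogate like this replaces thousands of detailed microstructural simulations in the variance-based sensitivity loop; near a discontinuity, the classifier decides which smooth branch's surrogate to query.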

  3. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  4. Price responsiveness of demand for cigarettes: does rationality matter?

    PubMed

    Laporte, Audrey

    2006-01-01

    Meta-analysis is applied to aggregate-level studies that model the demand for cigarettes using static, myopic, or rational addiction frameworks in an attempt to synthesize key findings in the literature and to identify determinants of the variation in reported price elasticity estimates across studies. The results suggest that the rational addiction framework produces statistically similar estimates to the static framework but that studies that use the myopic framework tend to report more elastic price effects. Studies that applied panel data techniques or controlled for cross-border smuggling reported more elastic price elasticity estimates, whereas the use of instrumental variable techniques and time trends or time dummy variables produced less elastic estimates. The finding that myopic models produce different estimates than either of the other two model frameworks underscores that careful attention must be given to time series properties of the data.
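
    The pooling step of such a meta-analysis is, at its simplest, fixed-effect inverse-variance weighting. The elasticity estimates and standard errors below are invented for illustration, not values from the reviewed studies, and the actual analysis additionally models study-level covariates such as the addiction framework used.

```python
# Fixed-effect inverse-variance pooling of study-level price elasticities.
# All numbers are illustrative assumptions.
studies = [            # (price elasticity estimate, standard error)
    (-0.45, 0.10),     # e.g. a myopic-addiction-model study
    (-0.35, 0.08),     # e.g. a static-model study
    (-0.38, 0.12),     # e.g. a rational-addiction-model study
]

weights = [1.0 / se ** 2 for _, se in studies]
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
# Moderators (model framework, panel data, instrumental variables, ...) would
# enter as covariates in a meta-regression on these study-level estimates.
```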

  5. Open data models for smart health interconnected applications: the example of openEHR.

    PubMed

    Demski, Hans; Garde, Sebastian; Hildebrand, Claudia

    2016-10-22

    Smart Health is a concept that combines networking, intelligent data processing, and the fusion of patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application to the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development, and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed; they provide standards-based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.

  6. A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson; Feng, Zhe; Plant, Robert S.

    A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The approach follows the non-equilibrium statistical mechanical approach through a master equation. The aim is to represent the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of each model is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and mass flux is a non-linear function of convective cell area, mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and the CPM-simulated mass flux variability under diurnally varying forcing. Besides its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to be capable of providing alternative, non-equilibrium closure formulations for spectral mass flux parameterizations.
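
    A master-equation description of this kind can be realized with a simple birth-death (Gillespie-style) simulation: cells trigger at a forcing-dependent rate and decay in proportion to the current population. The rates below are illustrative choices, not parameters fitted to the CPOL observations, and real STOMP variants add cell growth, decay probabilities, and size dependence.

```python
import random
random.seed(1)

# Birth-death realization of a convective-cell population under steady forcing.
birth = 2.0     # cell births per unit time (forcing-dependent in STOMP)
death = 0.1     # per-cell decay rate, so the mean cell lifetime is 10 units

n, t, t_end = 0, 0.0, 200.0
while t < t_end:
    total = birth + death * n            # total event rate
    t += random.expovariate(total)       # exponential wait to the next event
    if random.random() < birth / total:
        n += 1                           # a new convective cell triggers
    else:
        n -= 1                           # an existing cell decays
# The stationary population is Poisson-distributed with mean birth/death = 20.
```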

  7. Comparison of methods for the analysis of relatively simple mediation models.

    PubMed

    Rijnhart, Judith J M; Twisk, Jos W R; Chinapaw, Mai J M; de Boer, Michiel R; Heymans, Martijn W

    2017-09-01

    Statistical mediation analysis is an often-used method in trials to unravel the pathways underlying the effect of an intervention on a particular outcome variable. Throughout the years, several methods have been proposed, such as ordinary least squares (OLS) regression, structural equation modeling (SEM), and the potential outcomes framework. Most applied researchers do not know that these methods are mathematically equivalent when applied to mediation models with a continuous mediator and outcome variable. Therefore, the aim of this paper was to demonstrate the similarities between OLS regression, SEM, and the potential outcomes framework in three mediation models: 1) a crude model, 2) a confounder-adjusted model, and 3) a model with an interaction term for exposure-mediator interaction. We performed a secondary data analysis of a randomized controlled trial that included 546 schoolchildren. In our data example, the mediator and outcome variable were both continuous. We compared the estimates of the total, direct, and indirect effects, the proportion mediated, and 95% confidence intervals (CIs) for the indirect effect across OLS regression, SEM, and the potential outcomes framework. OLS regression, SEM, and the potential outcomes framework yielded the same effect estimates in the crude mediation model, the confounder-adjusted mediation model, and the mediation model with an interaction term for exposure-mediator interaction. Since OLS regression, SEM, and the potential outcomes framework yield the same results in three mediation models with a continuous mediator and outcome variable, researchers can continue using the method that is most convenient to them.
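
    The crude mediation model can be checked on simulated data with nothing but OLS: the total effect c decomposes exactly into the direct effect c' plus the a*b indirect effect. The data-generating coefficients below are arbitrary assumptions, not estimates from the trial.

```python
import random
random.seed(42)

# OLS mediation on simulated data: total effect c, direct effect c', and
# indirect effect a*b. True coefficients (0.5, 0.3, 0.2) are arbitrary.
n = 2000
x = [random.gauss(0, 1) for _ in range(n)]                   # exposure
m = [0.5 * xi + random.gauss(0, 1) for xi in x]              # mediator
y = [0.3 * mi + 0.2 * xi + random.gauss(0, 1)
     for xi, mi in zip(x, m)]                                # outcome

def center(v):
    mu = sum(v) / len(v)
    return [vi - mu for vi in v]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

xc, mc, yc = center(x), center(m), center(y)
a_path = dot(xc, mc) / dot(xc, xc)        # m ~ x
c_total = dot(xc, yc) / dot(xc, xc)       # y ~ x (crude/total effect)

# y ~ x + m via the 2x2 normal equations on centered data
sxx, sxm, smm = dot(xc, xc), dot(xc, mc), dot(mc, mc)
sxy, smy = dot(xc, yc), dot(mc, yc)
det = sxx * smm - sxm * sxm
c_direct = (smm * sxy - sxm * smy) / det  # c'
b_path = (sxx * smy - sxm * sxy) / det    # b
indirect = a_path * b_path                # a*b

# In linear OLS the decomposition c == c' + a*b holds exactly.
```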

  8. Sequence modelling and an extensible data model for genomic database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Peter Wei-Der

    1992-01-01

    The Human Genome Project (HGP) plans to sequence the human genome by the beginning of the next century. It will generate DNA sequences of more than 10 billion bases and complex marker sequences (maps) of more than 100 million markers. All of this information will be stored in database management systems (DBMSs). However, existing data models do not have the abstraction mechanism for modelling sequences, and existing DBMSs do not have operations for complex sequences. This work addresses the problem of sequence modelling in the context of the HGP and the more general problem of an extensible object data model that can incorporate the sequence model as well as existing and future data constructs and operators. First, we proposed a general sequence model that is application and implementation independent. This model is used to capture the sequence information found in the HGP at the conceptual level. In addition, abstract and biological sequence operators are defined for manipulating the modelled sequences. Second, we combined many features of semantic and object-oriented data models into an extensible framework, which we called the "Extensible Object Model", to address the need for a modelling framework that incorporates the sequence data model with other types of data constructs and operators. This framework is based on the conceptual separation between constructors and constraints. We then used this modelling framework to integrate the constructs for the conceptual sequence model. The Extensible Object Model is also defined with a graphical representation, which is useful as a tool for database designers. Finally, we defined a query language to support this model and implemented a query processor to demonstrate the feasibility of the extensible framework and the usefulness of the conceptual sequence model.
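
    Two of the abstract and biological sequence operators described above can be illustrated in a few lines; the operator names here are illustrative, not the ones defined in the Extensible Object Model.

```python
# Minimal sequence operators: subsequence extraction (an abstract operator)
# and reverse complement (a biological one).
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def subsequence(seq, start, end):
    # abstract operator: extract the half-open range [start, end)
    return seq[start:end]

def reverse_complement(seq):
    # biological operator: complement each base and reverse the strand
    return "".join(COMPLEMENT[base] for base in reversed(seq))

fragment = subsequence("ATGGCCATTG", 0, 6)
rc = reverse_complement(fragment)
```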

  10. On accommodating spatial interactions in a Generalized Heterogeneous Data Model (GHDM) of mixed types of dependent variables.

    DOT National Transportation Integrated Search

    2015-12-01

    We develop an econometric framework for incorporating spatial dependence in integrated model systems of latent variables and multidimensional mixed data outcomes. The framework combines Bhat's Generalized Heterogeneous Data Model (GHDM) with a spat...

  11. Parsing multiple processes of high temperature impacts on corn/soybean yield using a newly developed CLM-APSIM modeling framework

    NASA Astrophysics Data System (ADS)

    Peng, B.; Guan, K.; Chen, M.

    2016-12-01

    Future agricultural production faces the grand challenge of higher temperatures under climate change. There are multiple physiological and metabolic processes through which high temperature affects crop yield. Specifically, we consider the following major processes: (1) direct temperature effects on photosynthesis and respiration; (2) accelerated growth rates and the consequent shortening of the growing season; (3) heat stress during the reproductive stage (flowering and grain-filling); (4) high-temperature-induced increases in atmospheric water demand. In this work, we use a newly developed modeling framework (CLM-APSIM) to simulate corn and soybean growth and explicitly parse the above four processes. By combining the strength of CLM in modeling surface biophysical processes (e.g., hydrology and energy balance) and biogeochemistry (e.g., photosynthesis and carbon-nitrogen interactions) with that of APSIM in modeling crop phenology and reproductive stress, the newly developed CLM-APSIM modeling framework enables us to diagnose the impacts of high-temperature stress through different processes at various crop phenology stages. Ground measurements from the advanced SoyFACE facility at the University of Illinois are used here to calibrate, validate, and improve the CLM-APSIM modeling framework at the site level. We finally use the CLM-APSIM modeling framework to project crop yield for the whole US Corn Belt under different climate scenarios.
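
    Two of the four pathways, thermal-time accumulation (which shortens the growing season as temperatures rise) and reproductive-stage heat stress, can be sketched with simple daily-temperature rules. The base and critical temperatures are illustrative values, not calibrated CLM-APSIM parameters.

```python
# Thermal-time accumulation and a reproductive-stage heat-stress count from
# daily temperature extremes. Thresholds are illustrative assumptions.
T_BASE = 10.0   # base temperature for thermal-time (GDD) accumulation
T_CRIT = 34.0   # threshold above which flowering-stage heat stress is flagged

def gdd(tmin, tmax):
    # daily growing degree days from the mean of the temperature extremes
    return max(0.0, (tmin + tmax) / 2.0 - T_BASE)

daily = [(18, 30), (20, 33), (22, 36), (21, 35)]  # (tmin, tmax) while flowering
thermal_time = sum(gdd(lo, hi) for lo, hi in daily)
heat_stress_days = sum(1 for _, hi in daily if hi > T_CRIT)
# Warmer days accumulate thermal time faster (shortening the season), while
# days above T_CRIT additionally stress the reproductive stage.
```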

  12. A framework for scalable parameter estimation of gene circuit models using structural information.

    PubMed

    Kuwahara, Hiroyuki; Fan, Ming; Wang, Suojin; Gao, Xin

    2013-07-01

    Systematic and scalable parameter estimation is a key to construct complex gene regulatory models and to ultimately facilitate an integrative systems biology approach to quantitatively understand the molecular mechanisms underpinning gene regulation. Here, we report a novel framework for efficient and scalable parameter estimation that focuses specifically on modeling of gene circuits. Exploiting the structure commonly found in gene circuit models, this framework decomposes a system of coupled rate equations into individual ones and efficiently integrates them separately to reconstruct the mean time evolution of the gene products. The accuracy of the parameter estimates is refined by iteratively increasing the accuracy of numerical integration using the model structure. As a case study, we applied our framework to four gene circuit models with complex dynamics based on three synthetic datasets and one time series microarray data set. We compared our framework to three state-of-the-art parameter estimation methods and found that our approach consistently generated higher quality parameter solutions efficiently. Although many general-purpose parameter estimation methods have been applied for modeling of gene circuits, our results suggest that the use of more tailored approaches to use domain-specific information may be a key to reverse engineering of complex biological systems. http://sfb.kaust.edu.sa/Pages/Software.aspx. Supplementary data are available at Bioinformatics online.
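
    The decomposition idea, fitting each gene product's rate equation separately against the mean trajectory of that product, can be sketched for a single birth-degradation equation dx/dt = k - d*x. The closed-form trajectory, noise-free synthetic data, and grid search below stand in for the framework's iterative numerical integration.

```python
import math

# Fit (k, d) of dx/dt = k - d*x, whose solution from x(0)=0 is
# x(t) = (k/d) * (1 - exp(-d*t)). Synthetic noise-free "mean trajectory" data.
k_true, d_true = 2.0, 0.5
times = [0.1 * i for i in range(30)]
data = [(k_true / d_true) * (1 - math.exp(-d_true * t)) for t in times]

def sse(k, d):
    pred = [(k / d) * (1 - math.exp(-d * t)) for t in times]
    return sum((p - y) ** 2 for p, y in zip(pred, data))

# Coarse grid search over (k, d); a real estimator would refine iteratively.
best = min(((k / 10, d / 10) for k in range(1, 41) for d in range(1, 21)),
           key=lambda kd: sse(*kd))
```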

  13. A flexible framework for process-based hydraulic and water ...

    EPA Pesticide Factsheets

    Background: Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. While popular, GI models are generally relatively simplistic. However, GI model predictions are being relied upon by many municipalities and state and local agencies to make decisions about grey versus green infrastructure improvement planning. Adding complexity to GI modeling frameworks may preclude their use in simpler urban planning situations. Therefore, the goal here was to develop a sophisticated yet flexible tool that design engineers and researchers can use to capture and explore the effect of design factors and media properties on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool capable of accurately simulating GI system components and the specific biophysical processes affecting contaminants, such as reactions and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. Framework Features: The process-based model framework developed here can be used to model a diverse range of GI practices such as green roofs, retention ponds, bioretention, infiltration trenches, permeable pavement and
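    Whatever the GI practice, the hydraulic core of such a component model reduces to a storage bucket with inflow, infiltration, evapotranspiration, and overflow. The sketch below is an illustrative simplification with invented parameters, not the EPA framework's formulation.

```python
def bioretention_step(storage, rain, et, k_sat, depth_max, dt):
    """One time step of a minimal storage-bucket GI model (units: mm, hours).

    Rain fills the storage, infiltration drains it at up to the saturated
    rate k_sat, evapotranspiration removes more, and anything above the
    ponding capacity depth_max leaves as overflow to the drainage network.
    """
    storage += rain * dt
    infiltration = min(storage, k_sat * dt)
    storage -= infiltration
    storage = max(0.0, storage - et * dt)
    overflow = max(0.0, storage - depth_max)
    storage -= overflow
    return storage, infiltration, overflow

# a 100 mm cloudburst into an empty cell: most of it overflows
state = bioretention_step(0.0, rain=100.0, et=0.0, k_sat=10.0,
                          depth_max=30.0, dt=1.0)
```

    Chaining such steps per component, and routing one component's overflow into the next, is the kind of composition the framework's flexibility is meant to support.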

  14. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    NASA Astrophysics Data System (ADS)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise the limitations and uncertainties associated with components, and how these uncertainties might propagate throughout modelling frameworks, can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole-of-ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge of the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) a lack of observations to assess and advance modelling efforts; and (iii) an inability to predict with confidence natural ecosystem variability and longer-term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties, and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use.
We suggest research directions required to address these uncertainties, and caution against overconfident predictions. A full appreciation of uncertainty makes clear that complete comprehension of, and robust certainty about, these systems is not feasible. A key research direction is therefore the development of management systems that are robust to this unavoidable uncertainty.
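    The compounding of uncertainty across linked components, the "1+1 > 2" of the title, can be made concrete with a Monte Carlo sketch in which each component in a chain contributes an independent 10% multiplicative error (all numbers invented for illustration):

```python
import numpy as np

def propagate(models, samples):
    """Push an ensemble of samples through a chain of model components,
    letting each component's own error inflate the spread."""
    for model in models:
        samples = model(samples)
    return samples

rng = np.random.default_rng(0)
x = rng.normal(1.0, 0.1, size=100_000)  # ~10% uncertainty entering the chain
noisy = lambda s: s * rng.normal(1.0, 0.1, size=s.shape)  # component with 10% error
out = propagate([noisy, noisy], x)      # two further components downstream
```

    The relative spread of `out` grows to roughly sqrt(3) x 10%, about 17%: independent component errors combine in quadrature, so a framework that ignores upstream uncertainty systematically understates the total.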

  15. Design of a framework for modeling, integration and simulation of physiological models.

    PubMed

    Erson, E Zeynep; Cavuşoğlu, M Cenk

    2012-09-01

    Multiscale modeling and integration of physiological models carry challenges due to the complex nature of physiological processes. High coupling within and among scales presents a significant challenge in constructing and integrating multiscale physiological models. In order to deal with such challenges in a systematic way, there is a significant need for an information technology framework, together with related analytical and computational tools, that will facilitate integration of models and simulations of complex biological systems. The Physiological Model Simulation, Integration and Modeling Framework (Phy-SIM) is an information technology framework providing the tools to facilitate development, integration and simulation of integrated models of human physiology. Phy-SIM brings software-level solutions to the challenges raised by the complex nature of physiological systems. The aim of Phy-SIM, and of this paper, is to lay a foundation with new approaches such as information flow and modular representation of physiological models. The ultimate goal is to enhance the development of both the models and the integration approaches of multiscale physiological processes, and thus this paper focuses on the design approaches that would achieve such a goal. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  16. Research and Design of the Three-tier Distributed Network Management System Based on COM/COM+ and DNA

    NASA Astrophysics Data System (ADS)

    Liang, Likai; Bi, Yushen

    Considering the distributed network management system's demands for high distributivity, extensibility, and reusability, a framework model of a three-tier distributed network management system based on COM/COM+ and DNA is proposed, adopting software component technology and the N-tier application software framework design approach. We also give a concrete design plan for each layer of this model. Finally, we discuss the internal running process of each layer in the distributed network management system's framework model.

  17. Design and Application of an Ontology for Component-Based Modeling of Water Systems

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2012-12-01

    Many Earth system modeling frameworks have adopted an approach of componentizing models so that a large model can be assembled by linking a set of smaller model components. These model components can then be more easily reused, extended, and maintained by a large group of model developers and end users. While there has been a notable increase in component-based model frameworks in the Earth sciences in recent years, there has been less work on creating framework-agnostic metadata and ontologies for model components. Well defined model component metadata is needed, however, to facilitate sharing, reuse, and interoperability both within and across Earth system modeling frameworks. To address this need, we have designed an ontology for the water resources community named the Water Resources Component (WRC) ontology in order to advance the application of component-based modeling frameworks across water related disciplines. Here we present the design of the WRC ontology and demonstrate its application for integration of model components used in watershed management. First we show how the watershed modeling system Soil and Water Assessment Tool (SWAT) can be decomposed into a set of hydrological and ecological components that adopt the Open Modeling Interface (OpenMI) standard. Then we show how the components can be used to estimate nitrogen losses from land to surface water for the Baltimore Ecosystem study area. Results of this work are (i) a demonstration of how the WRC ontology advances the conceptual integration between components of water related disciplines by handling the semantic and syntactic heterogeneity present when describing components from different disciplines and (ii) an investigation of a methodology by which large models can be decomposed into a set of model components that can be well described by populating metadata according to the WRC ontology.

  18. An Integrated Modeling Framework Forecasting Ecosystem Exposure-- A Systems Approach to the Cumulative Impacts of Multiple Stressors

    NASA Astrophysics Data System (ADS)

    Johnston, J. M.

    2013-12-01

    Freshwater habitats provide fishable, swimmable and drinkable resources and are a nexus of geophysical and biological processes. These processes in turn influence the persistence and sustainability of populations, communities and ecosystems. Climate change and land-use change encompass numerous stressors of potential exposure, including the introduction of toxic contaminants, invasive species, and disease, in addition to physical drivers such as temperature and hydrologic regime. A systems approach that includes the scientific and technologic basis of assessing the health of ecosystems is needed to effectively protect human health and the environment. The Integrated Environmental Modeling Framework 'iemWatersheds' has been developed as a consistent and coherent means of forecasting the cumulative impact of co-occurring stressors. The Framework consists of three facilitating technologies: Data for Environmental Modeling (D4EM) that automates the collection and standardization of input data; the Framework for Risk Assessment of Multimedia Environmental Systems (FRAMES) that manages the flow of information between linked models; and the Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) that provides post-processing and analysis of model outputs, including uncertainty and sensitivity analysis.
Five models are linked within the Framework to provide multimedia simulation capabilities for hydrology and water quality processes: the Soil Water Assessment Tool (SWAT) predicts surface water and sediment runoff and associated contaminants; the Watershed Mercury Model (WMM) predicts mercury runoff and loading to streams; the Water quality Analysis and Simulation Program (WASP) predicts water quality within the stream channel; the Habitat Suitability Index (HSI) model scores physicochemical habitat quality for individual fish species; and the Bioaccumulation and Aquatic System Simulator (BASS) predicts fish growth, population dynamics and bioaccumulation of toxic substances. The capability of the Framework to address cumulative impacts will be demonstrated for freshwater ecosystem services and mountaintop mining.

  19. Mechanisms, Monitoring and Modeling Earth Fissure generation and Fault activation due to subsurface Fluid exploitation (M3EF3): A UNESCO-IGCP project in partnership with the UNESCO-IHP Working Group on Land Subsidence

    NASA Astrophysics Data System (ADS)

    Teatini, P.; Carreon-Freyre, D.; Galloway, D. L.; Ye, S.

    2015-12-01

    Land subsidence due to groundwater extraction was recently mentioned as one of the most urgent threats to sustainable development in the latest UNESCO IHP-VIII (2014-2020) strategic plan. Although advances have been made in understanding, monitoring, and predicting subsidence, the influence of differential vertical compaction, horizontal displacements, and hydrostratigraphic and structural features in groundwater systems on localized near-surface ground ruptures is still poorly understood. The nature of ground failure may range from fissuring, i.e., formation of an open crack, to faulting, i.e., differential offset of the opposite sides of the failure plane. Ground ruptures associated with differential subsidence have been reported from many alluvial basins in semiarid and arid regions, e.g. China, India, Iran, Mexico, Saudi Arabia, Spain, and the United States. These ground ruptures strongly impact urban, industrial, and agricultural infrastructures, and affect socio-economic and cultural development. Leveraging previous collaborations, this year the UNESCO Working Group on Land Subsidence began the scientific cooperative project M3EF3 in collaboration with the UNESCO International Geosciences Programme (IGCP n.641; www.igcp641.org) to improve understanding of the processes involved in ground rupturing associated with the exploitation of subsurface fluids, and to facilitate the transfer of knowledge regarding sustainable groundwater management practices in vulnerable aquifer systems. The project is developing effective tools to help manage geologic risks associated with these types of hazards, and formulating recommendations pertaining to the sustainable use of subsurface fluid resources for urban and agricultural development in susceptible areas. The partnership between the UNESCO IHP and IGCP is ensuring that multiple scientific competencies required to optimally investigate earth fissuring and faulting caused by groundwater withdrawals are being employed.

  20. Improving the Hydro-stratigraphic Model of the Oxnard Forebay, Ventura County, California, using Transient Electromagnetic Surveying

    NASA Astrophysics Data System (ADS)

    Quady, Maura Colleen

    2013-01-01

    To characterize the hydro-stratigraphy of an area, drilling and well logs provide high-resolution electrical resistivity data, albeit for limited areas (points). The expense of drilling indirectly leads to sparse data, and it is necessary to assume lateral homogeneity between wells when creating stratigraphic maps. Unfortunately, this assumption may not hold in complex depositional and tectonically active settings. The goal of this study is to fill in data gaps between wells in a groundwater basin in order to better characterize the hydro-stratigraphy under existing and potential sites for managed aquifer recharge. Basins in the southern California study area have been used for decades to recharge surface water to an upper aquifer system; this work also addresses whether the local hydro-stratigraphy favors surface infiltration as a means to recharge water to the lower aquifer system. Here, transient electromagnetic (TEM) soundings, a surface geophysical method, are correlated with nearby down-hole resistivity and lithology well logs for grain-size interpretations of the subsurface in unsaturated conditions. Grain size is used as a proxy for permeability (hydraulic conductivity), with resistivity contrasts highlighting variations in the media that would affect groundwater flow in both vertical and horizontal directions. Results suggest that a nearly horizontal, extensive, low-permeability layer exists in the area and that only a few noted locations are favorable for surface-to-lower aquifer system recharge. Furthermore, zones of higher permeability deeper than the upper aquifer system are discontinuous and isolated among lower-permeability zones. However, the TEM profiles show areas where lower-permeability zones are thin, and where alternatives to surface percolation methods could be explored.
In addition, the survey adds information about the transition between the upper and lower aquifer systems, and adds detail to the topography of the base of freshwater. Finally, this work effectively decreases the interpolation distance between data points of wellbores, and when viewed in sequence the TEM profiles present a 3D depiction of basin hydro-stratigraphy.

  1. An Epistemological Analysis of the Evolution of Didactical Activities in Teaching-Learning Sequences: The Case of Fluids. Special Issue

    ERIC Educational Resources Information Center

    Psillos, D.; Tselfes, Vassilis; Kariotoglou, Petros

    2004-01-01

    In the present paper we propose a theoretical framework for an epistemological modelling of teaching-learning (didactical) activities, which draws on recent studies of scientific practice. We present and analyse the framework, which includes three categories: namely, Cosmos-Evidence-Ideas (CEI). We also apply this framework in order to model a…

  2. An Overview of Models of Speaking Performance and Its Implications for the Development of Procedural Framework for Diagnostic Speaking Tests

    ERIC Educational Resources Information Center

    Zhao, Zhongbao

    2013-01-01

    This paper aims at developing a procedural framework for the development and validation of diagnostic speaking tests. The researcher reviews the current available models of speaking performance, analyzes the distinctive features and then points out the implications for the development of a procedural framework for diagnostic speaking tests. On…

  3. A scalable delivery framework and a pricing model for streaming media with advertisements

    NASA Astrophysics Data System (ADS)

    Al-Hadrusi, Musab; Sarhan, Nabil J.

    2008-01-01

    This paper presents a delivery framework for streaming media with advertisements and an associated pricing model. The delivery model combines the benefits of periodic broadcasting and stream merging. The advertisements' revenues are used to subsidize the price of the media content, with the price determined by the total ad viewing time. Moreover, this paper presents an efficient ad allocation scheme and three modified scheduling policies that are well suited to the proposed delivery framework. Furthermore, we study the effectiveness of the delivery framework and the various scheduling policies through extensive simulation, in terms of numerous metrics including customer defection probability, average number of ads viewed per client, price, arrival rate, profit, and revenue.

  4. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  5. Mediation Analysis in a Latent Growth Curve Modeling Framework

    ERIC Educational Resources Information Center

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  6. Theories and Frameworks for Online Education: Seeking an Integrated Model

    ERIC Educational Resources Information Center

    Picciano, Anthony G.

    2017-01-01

    This article examines theoretical frameworks and models that focus on the pedagogical aspects of online education. After a review of learning theory as applied to online education, a proposal for an integrated "Multimodal Model for Online Education" is provided based on pedagogical purpose. The model attempts to integrate the work of…

  7. Using subject-specific three-dimensional (3D) anthropometry data in digital human modelling: case study in hand motion simulation.

    PubMed

    Tsao, Liuxing; Ma, Liang

    2016-11-01

    Digital human modelling enables ergonomists and designers to consider ergonomic concerns and design alternatives in a timely and cost-efficient manner in the early stages of design. However, the reliability of the simulation could be limited due to the percentile-based approach used in constructing the digital human model. To enhance the accuracy of the size and shape of the models, we proposed a framework to generate digital human models using three-dimensional (3D) anthropometric data. The 3D scan data from specific subjects' hands were segmented based on the estimated centres of rotation. The segments were then driven in forward kinematics to perform several functional postures. The constructed hand models were then verified, thereby validating the feasibility of the framework. The proposed framework helps generate accurate subject-specific digital human models, which can be utilised to guide product design and workspace arrangement. Practitioner Summary: Subject-specific digital human models can be constructed under the proposed framework based on three-dimensional (3D) anthropometry. This approach enables more reliable digital human simulation to guide product design and workspace arrangement.
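    The forward-kinematics step, driving segments about estimated centres of rotation, reduces in the planar case to chaining rotations along the segment lengths. The sketch below is a 2D illustration with invented dimensions, not the authors' 3D scan-based hand model.

```python
import numpy as np

def finger_points(segment_lengths, joint_angles):
    """Planar forward kinematics: each segment's heading is the sum of all
    joint angles proximal to it, and joint positions accumulate from the base."""
    points = [np.zeros(2)]
    heading = 0.0
    for length, angle in zip(segment_lengths, joint_angles):
        heading += angle
        tip = points[-1] + length * np.array([np.cos(heading), np.sin(heading)])
        points.append(tip)
    return np.array(points)

# three phalanges (lengths in mm) flexed 30 degrees at each joint
pts = finger_points([40.0, 25.0, 18.0], np.radians([30.0, 30.0, 30.0]))
```

    Replacing the invented lengths with segment lengths measured from a subject's 3D scan is what turns this generic chain into a subject-specific model.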

  8. Qualitative analysis of a discrete thermostatted kinetic framework modeling complex adaptive systems

    NASA Astrophysics Data System (ADS)

    Bianca, Carlo; Mogno, Caterina

    2018-01-01

    This paper deals with the derivation of a new discrete thermostatted kinetic framework for the modeling of complex adaptive systems subjected to external force fields (nonequilibrium system). Specifically, in order to model nonequilibrium stationary states of the system, the external force field is coupled to a dissipative term (thermostat). The well-posedness of the related Cauchy problem is investigated thus allowing the new discrete thermostatted framework to be suitable for the derivation of specific models and the related computational analysis. Applications to crowd dynamics and future research directions are also discussed within the paper.

  9. Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments

    EPA Science Inventory

    The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...

  10. A Bivariate Generalized Linear Item Response Theory Modeling Framework to the Analysis of Responses and Response Times.

    PubMed

    Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J

    2015-01-01

    A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times, and these are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
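    The skeleton of such a model is a joint log-likelihood: a two-parameter logistic (2PL) measurement model for the responses, a lognormal model for the response times, and a cross-relation through which ability enters the time model. The sketch below assumes a linear cross-relation; the parameter names and this particular parameterization are illustrative, not the paper's notation.

```python
import numpy as np

def bglirt_loglik(theta, tau, resp, logrt, a, b, lam, nu, sigma):
    """Joint log-likelihood for one person under an illustrative B-GLIRT variant.

    theta: ability; tau: speed; resp: 0/1 item responses; logrt: log response
    times; a, b: item discriminations and difficulties; nu: item time
    intensities; lam: cross-relation weight linking ability into the time model.
    """
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # 2PL success probabilities
    ll_resp = np.sum(resp * np.log(p) + (1 - resp) * np.log(1 - p))
    mu = nu - tau - lam * theta                  # linear cross-relation
    ll_time = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                     - (logrt - mu) ** 2 / (2 * sigma**2))
    return ll_resp + ll_time
```

    Setting lam = 0 decouples the two measurement models, which illustrates how restricting the cross-relation recovers simpler psychometric models as special cases.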

  11. A penalized framework for distributed lag non-linear models.

    PubMed

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
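    The penalized idea is easiest to see in the linear (DLM) special case mentioned above: lag coefficients are estimated by least squares with a second-difference penalty that enforces a smooth lag structure. This is a numpy sketch with invented data; the full DLNM adds a second, non-linear dimension and the GAM machinery described in the paper.

```python
import numpy as np

def penalized_lag_fit(x, y, max_lag, lam):
    """Distributed-lag fit minimizing ||y - X b||^2 + lam * ||D2 b||^2, where
    X holds lagged copies of x and D2 is the second-difference operator
    (a P-spline-type smoothness penalty on the lag curve)."""
    n = len(x)
    n_lags = max_lag + 1
    X = np.column_stack([np.r_[np.zeros(k), x[:n - k]] for k in range(n_lags)])
    D2 = np.diff(np.eye(n_lags), n=2, axis=0)    # second-difference penalty matrix
    return np.linalg.solve(X.T @ X + lam * (D2.T @ D2), X.T @ y)

# invented exposure series whose effect decays smoothly over 5 lags
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
true = np.array([1.0, 0.6, 0.3, 0.1, 0.0])
y = sum(c * np.r_[np.zeros(k), x[:500 - k]] for k, c in enumerate(true))
y = y + 0.1 * rng.standard_normal(500)
b = penalized_lag_fit(x, y, max_lag=4, lam=10.0)  # recovers a smooth lag curve
```

    Increasing lam trades fidelity for smoothness of the lag curve; in the GAM formulation this trade-off is selected automatically rather than fixed by hand.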

  12. Multi-level multi-task learning for modeling cross-scale interactions in nested geospatial data

    USGS Publications Warehouse

    Yuan, Shuai; Zhou, Jiayu; Tan, Pang-Ning; Fergus, Emi; Wagner, Tyler; Sorrano, Patricia

    2017-01-01

    Predictive modeling of nested geospatial data is a challenging problem as the models must take into account potential interactions among variables defined at different spatial scales. These cross-scale interactions, as they are commonly known, are particularly important to understand relationships among ecological properties at macroscales. In this paper, we present a novel, multi-level multi-task learning framework for modeling nested geospatial data in the lake ecology domain. Specifically, we consider region-specific models to predict lake water quality from multi-scaled factors. Our framework enables distinct models to be developed for each region using both its local and regional information. The framework also allows information to be shared among the region-specific models through their common set of latent factors. Such information sharing helps to create more robust models especially for regions with limited or no training data. In addition, the framework can automatically determine cross-scale interactions between the regional variables and the local variables that are nested within them. Our experimental results show that the proposed framework outperforms all the baseline methods in at least 64% of the regions for 3 out of 4 lake water quality datasets evaluated in this study. Furthermore, the latent factors can be clustered to obtain a new set of regions that is more aligned with the response variables than the original regions that were defined a priori from the ecology domain.
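    The sharing mechanism can be caricatured with a mean-shrunk multi-task ridge in which each region's coefficients are pulled toward a shared vector, so that data-poor regions borrow strength from data-rich ones. This deliberately simplifies the paper's latent-factor formulation; all names and numbers below are invented.

```python
import numpy as np

def multitask_ridge(Xs, ys, lam=1.0, n_iter=20):
    """Multi-task ridge: each region's weights are shrunk toward a shared
    vector (a stand-in for the paper's shared latent factors), alternating
    closed-form updates of the region weights and of the shared vector."""
    d = Xs[0].shape[1]
    shared = np.zeros(d)
    for _ in range(n_iter):
        Ws = [np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * shared)
              for X, y in zip(Xs, ys)]
        shared = np.mean(Ws, axis=0)  # the information shared across regions
    return Ws, shared

# data-rich region 1 (200 lakes) and data-poor region 2 (5 lakes), same true effects
rng = np.random.default_rng(2)
w_true = np.array([1.0, 2.0])
X1 = rng.standard_normal((200, 2)); y1 = X1 @ w_true + 0.1 * rng.standard_normal(200)
X2 = rng.standard_normal((5, 2));   y2 = X2 @ w_true + 0.1 * rng.standard_normal(5)
Ws, shared = multitask_ridge([X1, X2], [y1, y2])
```

    With only 5 observations, region 2's stand-alone estimate would be unstable; the shrinkage toward the shared vector is what stabilizes it, mirroring the paper's point about regions with limited or no training data.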

  13. A nursing-specific model of EPR documentation: organizational and professional requirements.

    PubMed

    von Krogh, Gunn; Nåden, Dagfinn

    2008-01-01

    To present the Norwegian KPO documentation model (quality assurance, problem solving, and caring), along with the requirements and the multiple electronic patient record (EPR) functions the model is designed to address. The model's professional substance, a conceptual framework for nursing practice, was developed by examining, reorganizing, and completing existing frameworks. The model's methodology, an information management system, was developed using an expert group. Both model elements were clinically tested over a period of 1 year. The model is designed for nursing documentation in step with statutory, organizational, and professional requirements. Complete documentation is arranged for by incorporating the Nursing Minimum Data Set. Systematic and comprehensive documentation is arranged for by establishing categories as provided in the model's framework domains. Consistent documentation is arranged for by incorporating NANDA-I Nursing Diagnoses, the Nursing Interventions Classification, and the Nursing Outcomes Classification. The model can be used as a tool in cooperation with vendors to ensure the interests of the nursing profession are met when developing EPR solutions in healthcare. It can also provide clinicians with a framework for documentation in step with legal and organizational requirements while retaining the ability to record all aspects of clinical nursing.

  14. Argumentation in Science Education: A Model-based Framework

    NASA Astrophysics Data System (ADS)

    Böttcher, Florian; Meisert, Anke

    2011-02-01

    The goal of this article is threefold. First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model that explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model, if necessary in relation to alternative models. Second, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach is presented in comparison with other analytical models to demonstrate the explicatory power and depth of the model-based perspective. Primarily, Toulmin's framework for structurally analysing arguments is contrasted with the approach presented here. It is demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second, more complex argumentative sequence is analysed according to the proposed analytical scheme to give a broader impression of its potential in practical use.

  15. Modeling of ultrasonic processes utilizing a generic software framework

    NASA Astrophysics Data System (ADS)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be considered, so it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper, a generic but simple software framework is presented that allows arbitrary partial models to be coupled via slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models, which are then coupled using the framework. The load model is composed of spring-damper elements parameterized by experimental results. In the second example, the ideal mounting position is determined for an oscillator used in ultrasonic-assisted machining of stone. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece's material characteristics are presented. For both applications, input and output variables are defined to meet the requirements of the framework's interface.
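    The coupling pattern described, partial models as slaves behind well-defined interfaces with a master for coordination, can be sketched generically. The class names, signal names, and the toy oscillator/load models below are invented, not the paper's software.

```python
class Slave:
    """A partial model: declares named input/output signals and a step rule."""
    def __init__(self, name, inputs, outputs, step_fn):
        self.name, self.inputs, self.outputs = name, inputs, outputs
        self._step_fn = step_fn

    def step(self, signals, dt):
        # gather declared inputs from the bus, write declared outputs back
        outputs = self._step_fn({k: signals[k] for k in self.inputs}, dt)
        signals.update(zip(self.outputs, outputs))

class Master:
    """Coordinates the slaves by routing named signals over a shared bus."""
    def __init__(self, slaves):
        self.slaves = slaves

    def run(self, signals, dt, n_steps):
        for _ in range(n_steps):
            for slave in self.slaves:
                slave.step(signals, dt)
        return signals

# toy coupling: an oscillator's velocity feeds a load that feeds back a force
oscillator = Slave("oscillator", ["force"], ["velocity"],
                   lambda ins, dt: [0.1 * (1.0 - ins["force"])])
load = Slave("load", ["velocity"], ["force"],
             lambda ins, dt: [0.5 * ins["velocity"]])
state = Master([oscillator, load]).run({"force": 0.0, "velocity": 0.0}, 1e-3, 10)
```

    Because each slave only sees its declared signals, a partial model (e.g., a different load model) can be swapped out without touching the others, which is the point of the master/slave decomposition.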

  16. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited.
This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes. - Highlights: • Proposed a physics-informed framework to quantify uncertainty in RANS simulations. • Framework incorporates physical prior knowledge and observation data. • Based on a rigorous Bayesian framework yet fully utilizes physical model. • Applicable for many complex physical systems beyond turbulent flows.
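    The iterative ensemble Kalman update at the heart of such a framework can be illustrated with a minimal stochastic-EnKF sketch. This is generic notation, not the authors' implementation; the observation operator, dimensions, and noise levels below are placeholders:

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_cov, rng):
    """One stochastic ensemble Kalman update: nudge each ensemble member
    toward perturbed observations using sample covariances."""
    X = ensemble                                # shape (n_members, n_state)
    n = X.shape[0]
    A = X - X.mean(axis=0)                      # state anomalies
    HX = np.array([obs_op(x) for x in X])       # members in observation space
    HA = HX - HX.mean(axis=0)                   # observation-space anomalies
    Pxy = A.T @ HA / (n - 1)                    # state/obs cross-covariance
    Pyy = HA.T @ HA / (n - 1) + obs_cov         # innovation covariance
    K = Pxy @ np.linalg.inv(Pyy)                # Kalman gain
    obs_pert = obs + rng.multivariate_normal(
        np.zeros(len(obs)), obs_cov, size=n)    # perturbed observations
    return X + (obs_pert - HX) @ K.T            # updated ensemble
```

    In the paper's setting the "state" would be the parameterized Reynolds-stress discrepancy and the observations sparse velocity data; here the update is shown on an abstract state vector only.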

  17. A modeling framework for the establishment and spread of invasive species in heterogeneous environments.

    PubMed

    Lustig, Audrey; Worner, Susan P; Pitt, Joel P W; Doscher, Crile; Stouffer, Daniel B; Senay, Senait D

    2017-10-01

    Natural and human-induced events are continuously altering the structure of our landscapes and as a result impacting the spatial relationships between individual landscape elements and the species living in the area. Yet, only recently has the influence of the surrounding landscape on invasive species spread started to be considered. The scientific community increasingly recognizes the need for a broader modeling framework that focuses on cross-study comparisons at different spatiotemporal scales. Using two illustrative examples, we introduce a general modeling framework that allows for a systematic investigation of the effect of habitat change on invasive species establishment and spread. The essential parts of the framework are (i) a mechanistic spatially explicit model (a modular dispersal framework-MDIG) that allows population dynamics and dispersal to be modeled in a geographical information system (GIS), (ii) a landscape generator that allows replicated landscape patterns with partially controllable spatial properties to be generated, and (iii) landscape metrics that depict the essential aspects of landscape with which dispersal and demographic processes interact. The modeling framework provides functionality for a wide variety of applications ranging from predictions of the spatiotemporal spread of real species and comparison of potential management strategies, to theoretical investigation of the effect of habitat change on population dynamics. Such a framework makes it possible to quantify how small-grain landscape characteristics, such as habitat size and habitat connectivity, interact with life-history traits to determine the dynamics of invasive species spread in fragmented landscapes. As such, it will give deeper insights into species traits and landscape features that lead to establishment and spread success, and may be key to preventing new incursions and to developing efficient monitoring, surveillance, control or eradication programs.

  18. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
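    The simplest rule with which such a framework can be instantiated is the classical non-circular assume-guarantee rule: if component M1 satisfies property P under assumption A, and component M2 satisfies A unconditionally, then the composed system satisfies P. Schematically (generic notation, not tied to the paper's exact formalism):

```latex
\frac{\langle A \rangle \, M_1 \, \langle P \rangle
      \qquad
      \langle \mathit{true} \rangle \, M_2 \, \langle A \rangle}
     {\langle \mathit{true} \rangle \, M_1 \parallel M_2 \, \langle P \rangle}
```

    In learning-based instantiations, the assumption A is constructed incrementally (e.g., with the L* algorithm) and refined from counterexamples until both premises hold or a genuine violation of P is found.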

  19. Simulation Framework to Estimate the Performance of CO2 and O2 Sensing from Space and Airborne Platforms for the ASCENDS Mission Requirements Analysis

    NASA Technical Reports Server (NTRS)

    Plitau, Denis; Prasad, Narasimha S.

    2012-01-01

    The Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission recommended by the NRC Decadal Survey has a desired accuracy of 0.3% in carbon dioxide mixing ratio (XCO2) retrievals, requiring careful selection and optimization of the instrument parameters. NASA Langley Research Center (LaRC) is investigating the 1.57 micron carbon dioxide band as well as the 1.26-1.27 micron oxygen bands for our proposed ASCENDS mission requirements investigation. Simulation studies are underway for these bands to select optimum instrument parameters. The simulations are based on a multi-wavelength lidar modeling framework being developed at NASA LaRC to predict the performance of CO2 and O2 sensing from space and airborne platforms. The modeling framework consists of a lidar simulation module and a line-by-line calculation component with interchangeable lineshape routines to test the performance of alternative lineshape models in the simulations. As an option, the line-by-line radiative transfer model (LBLRTM) program may also be used for line-by-line calculations. The modeling framework is being used to perform error analysis, establish optimum measurement wavelengths, and identify the best lineshape models to be used in CO2 and O2 retrievals. Several additional programs for HITRAN database management and related simulations are planned to be included in the framework. The description of the modeling framework with selected results of the simulation studies for CO2 and O2 sensing is presented in this paper.

  20. Episodic Laryngeal Breathing Disorders: Literature Review and Proposal of Preliminary Theoretical Framework.

    PubMed

    Shembel, Adrianna C; Sandage, Mary J; Verdolini Abbott, Katherine

    2017-01-01

    The purposes of this literature review were (1) to identify and assess frameworks for clinical characterization of episodic laryngeal breathing disorders (ELBD) and their subtypes, (2) to integrate concepts from these frameworks into a novel theoretical paradigm, and (3) to provide a preliminary algorithm to classify clinical features of ELBD for future study of its clinical manifestations and underlying pathophysiological mechanisms. This is a literature review. Peer-reviewed literature from 1983 to 2015 pertaining to models for ELBD was searched using Pubmed, Ovid, Proquest, Cochrane Database of Systematic Reviews, and Google Scholar. Theoretical models for ELBD were identified, evaluated, and integrated into a novel comprehensive framework. Consensus across three salient models provided a working definition and inclusionary criteria for ELBD within the new framework. Inconsistencies and discrepancies within the models provided an analytic platform for future research. Comparison among three conceptual models-(1) Irritable larynx syndrome, (2) Dichotomous triggers, and (3) Periodic occurrence of laryngeal obstruction-showed that the models uniformly consider ELBD to involve episodic laryngeal obstruction causing dyspnea. The models differed in their description of the source of dyspnea, in their inclusion of corollary behaviors, in their inclusion of other laryngeal-based behaviors (e.g., cough), and in the types of triggers. The proposed integrated theoretical framework for ELBD provides a preliminary systematic platform for the identification of key clinical feature patterns indicative of ELBD and associated clinical subgroups. This algorithmic paradigm should evolve with better understanding of this spectrum of disorders and its underlying pathophysiological mechanisms. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  1. A modeling framework for evaluating streambank stabilization practices for reach-scale sediment reduction

    USDA-ARS?s Scientific Manuscript database

    Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...

  2. A Conceptual Framework Curriculum Evaluation Electrical Engineering Education

    ERIC Educational Resources Information Center

    Imansari, Nurulita; Sutadji, Eddy

    2017-01-01

    This evaluation is a conceptual framework that has been analyzed in the hope that it can help research related to an evaluation of the curriculum. The model of evaluation used was the CIPPO model. The CIPPO model consists of "context," "input," "process," "product," and "outcomes." On the dimension of the…

  3. A modeling framework for characterizing near-road air pollutant concentration at community scales

    EPA Science Inventory

    In this study, we combine information from transportation network, traffic emissions, and dispersion model to develop a framework to inform exposure estimates for traffic-related air pollutants (TRAPs) with a high spatial resolution. A Research LINE source dispersion model (R-LIN...

  4. HexSim - A general purpose framework for spatially-explicit, individual-based modeling

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications. This talk will focus on a subset of those ap...

  5. A development framework for semantically interoperable health information systems.

    PubMed

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems, as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  6. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models on a big scale: simulations over large areas at high resolution. The heterogeneity of available computing hardware such as multi-core CPUs, GPUs or supercomputers potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, parallel algorithm design and the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs to have built-in capabilities to make full usage of the available hardware. Developing such a framework that provides understandable code for domain scientists while being runtime efficient poses several challenges for its developers. For example, optimisations can be performed on individual operations or the whole model, or tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware, whatever combination of model building blocks scientists use.
We present our ongoing work on developing parallel algorithms for spatio-temporal modelling: 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks, and 2) the parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between programming and domain language) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without having to make any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving remaining algorithms such as flow operations over digital elevation maps, and further potential improvements like enhancing disk I/O.
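    The kind of block-wise parallelism applied to local (cell-by-cell) raster operations can be sketched in a few lines of generic Python. This illustrates only the idea, not PCRaster/Fern code; the operation and block count are arbitrary:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def local_op(block):
    """An arbitrary cell-wise (local) operation; NumPy releases the GIL
    inside the ufuncs, so threads can make real progress."""
    return np.sqrt(block) + 1.0

def parallel_local(raster, n_blocks=4):
    """Split a raster into row blocks and apply local_op concurrently."""
    blocks = np.array_split(raster, n_blocks, axis=0)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(local_op, blocks))
    return np.vstack(results)
```

    Local operations need no halo exchange between blocks; focal (window) operations would additionally require each block to carry a one-cell overlap, which is where much of the engineering effort in such libraries lies.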

  7. Dynamic motion planning of 3D human locomotion using gradient-based optimization.

    PubMed

    Kim, Hyung Joo; Wang, Qian; Rahmatalla, Salam; Swan, Colby C; Arora, Jasbir S; Abdel-Malek, Karim; Assouline, Jose G

    2008-06-01

    Since humans can walk with an infinite variety of postures and limb movements, there is no unique solution to the modeling problem to predict human gait motions. Accordingly, we test herein the hypothesis that the redundancy of human walking mechanisms makes solving for human joint profiles and force time histories an indeterminate problem best solved by inverse dynamics and optimization methods. A new optimization-based human-modeling framework is thus described for predicting three-dimensional human gait motions on level and inclined planes. The basic unknowns in the framework are the joint motion time histories of a 25-degree-of-freedom human model and its six global degrees of freedom. The joint motion histories are calculated by minimizing an objective function such as deviation of the trunk from upright posture that relates to the human model's performance. A variety of important constraints are imposed on the optimization problem, including (1) satisfaction of dynamic equilibrium equations by requiring the model's zero moment point (ZMP) to lie within the instantaneous geometrical base of support, (2) foot collision avoidance, (3) limits on ground-foot friction, and (4) vanishing yawing moment. Analytical forms of objective and constraint functions are presented and discussed for the proposed human-modeling framework in which the resulting optimization problems are solved using gradient-based mathematical programming techniques. When the framework is applied to the modeling of bipedal locomotion on level and inclined planes, acyclic human walking motions that are smooth and realistic as opposed to less natural robotic motions are obtained. The aspects of the modeling framework requiring further investigation and refinement, as well as potential applications of the framework in biomechanics, are discussed.
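    The structure of such a problem (minimize a posture-deviation objective subject to a stability constraint) can be sketched as a toy gradient-based program. The joint variables and the stand-in "ZMP" function below are hypothetical, not the authors' 25-degree-of-freedom model:

```python
import numpy as np
from scipy.optimize import minimize

q_upright = np.zeros(3)          # hypothetical upright joint configuration

def objective(q):
    """Deviation of posture from upright (sum of squared joint angles)."""
    return np.sum((q - q_upright) ** 2)

def zmp(q):
    """Stand-in scalar 'zero moment point' location; purely illustrative."""
    return 0.05 * np.sum(np.sin(q))

half_base = 0.1                  # half-width of the geometrical base of support
constraints = [
    {"type": "ineq", "fun": lambda q: half_base - zmp(q)},  # ZMP <= +0.1
    {"type": "ineq", "fun": lambda q: zmp(q) + half_base},  # ZMP >= -0.1
]

# gradient-based solve (SLSQP) from a non-upright starting posture
res = minimize(objective, x0=np.array([0.5, -0.3, 0.2]),
               method="SLSQP", constraints=constraints)
```

    The actual framework optimizes joint motion time histories under many more constraints (foot collision avoidance, friction limits, vanishing yawing moment); this sketch shows only the mathematical-programming shape of one such problem.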

  8. Framework for non-coherent interface models at finite displacement jumps and finite strains

    NASA Astrophysics Data System (ADS)

    Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn

    2016-05-01

    This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, anisotropic effects are excluded. A novel, extended constitutive framework which is consistent with the above mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration - or equivalently - the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
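    The fiber restriction and its proposed extension can be written schematically as follows (generic notation; the paper's own symbols may differ):

```latex
% Fiber model: the traction is collinear with the displacement jump [[u]]
\boldsymbol{t} \;=\; g\!\left(\lVert \llbracket \boldsymbol{u} \rrbracket \rVert\right)\,
\frac{\llbracket \boldsymbol{u} \rrbracket}{\lVert \llbracket \boldsymbol{u} \rrbracket \rVert}

% Extended hyperelastic framework: the energy depends on the jump and the
% surface deformation gradient; tractions follow by differentiation
\psi \;=\; \psi\!\left(\llbracket \boldsymbol{u} \rrbracket,\; \boldsymbol{F}_s\right)
\qquad\Longrightarrow\qquad
\boldsymbol{t} \;=\; \frac{\partial \psi}{\partial \llbracket \boldsymbol{u} \rrbracket}
```

    Formulating the energy in terms of invariants of these arguments is what, per the abstract, makes thermodynamical consistency, balance of angular momentum, and frame indifference hold automatically.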

  9. Combining Unsupervised and Supervised Classification to Build User Models for Exploratory Learning Environments

    ERIC Educational Resources Information Center

    Amershi, Saleema; Conati, Cristina

    2009-01-01

    In this paper, we present a data-based user modeling framework that uses both unsupervised and supervised classification to build student models for exploratory learning environments. We apply the framework to build student models for two different learning environments and using two different data sources (logged interface and eye-tracking data).…

  10. A Unified Framework for Complex Networks with Degree Trichotomy Based on Markov Chains.

    PubMed

    Hui, David Shui Wing; Chen, Yi-Chao; Zhang, Gong; Wu, Weijie; Chen, Guanrong; Lui, John C S; Li, Yingtao

    2017-06-16

    This paper establishes a Markov chain model as a unified framework for describing the evolution processes in complex networks. The unique feature of the proposed model is its capability in addressing the formation mechanism that can reflect the "trichotomy" observed in degree distributions, based on which closed-form solutions can be derived. Important special cases of the proposed unified framework are the classical models, including Poisson, exponential, and power-law distributed networks. Both simulation and experimental results demonstrate a good match of the proposed model with real datasets, showing its superiority over the classical models. Implications of the model for various applications, including citation analysis, online social networks, and vehicular network design, are also discussed in the paper.
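    Although the paper's unified Markov chain model is more general, its power-law special case can be reproduced with a short preferential-attachment simulation (a standard Barabasi-Albert-style sketch, not the authors' formulation):

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a network of n nodes; each new node attaches m edges with
    probability proportional to existing degree (preferential attachment)."""
    random.seed(seed)
    degrees = [0] * n
    # endpoint pool: each node appears once per incident edge, so uniform
    # sampling from the pool is degree-proportional sampling
    pool = []
    # start from a small clique of m + 1 nodes
    for i in range(m + 1):
        for j in range(i):
            degrees[i] += 1; degrees[j] += 1
            pool.extend([i, j])
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:         # m distinct degree-weighted targets
            targets.add(random.choice(pool))
        for t in targets:
            degrees[new] += 1; degrees[t] += 1
            pool.extend([new, t])
    return degrees
```

    The resulting degree sequence is heavy-tailed (a few hubs, many low-degree nodes), the regime the paper's "trichotomy" covers alongside Poisson and exponential cases.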

  11. Quantitative and qualitative assessment of the groundwater system behavior to support Brownfield regeneration of Hunedoara (Romania) former steel production site

    NASA Astrophysics Data System (ADS)

    Gogu, R.; Gaitanaru, D.; Ciugulea, O.; Boukhemacha, M. A.; Bica, I.

    2012-04-01

    Located in the western part of Romania, the study area is the Hunedoara former steel industry site. The current contamination status of the subsurface shows a real threat due to the contribution of more than 100 years of steel production, ironworks operations, coke products generation, and recovery of recycling materials. Analyses performed in 2007 indicated high contamination with heavy metals such as copper, lead, cadmium, manganese, and chromium. As the contamination of the soil and groundwater is severe, brownfield regeneration of this site is essential for sustainable land management. Intelligent remediation techniques such as phytoremediation and soil washing with recycled solutions could be applied. However, these techniques can be correctly chosen and applied only if a reliable picture of the hydrological, geological, hydrogeological, and pedological settings exists and the contamination mechanisms are deeply understood. As a consequence, the development of a groundwater flow and contaminant transport model for this area is compulsory. Hunedoara County has a complex geological structure, formed by crystalline-Mesozoic units belonging to the Southern Carpathians and by sedimentary-volcanic units of the Western Carpathians. The site area is shaped by the presence of alluvial deposits from the Upper Holocene. Lithologically, a sandy formation covered by a thick layer of clay is located at depths below 10 m. The two strata cover an extended carbonate medium. The main aquifer is represented by a groundwater body located under the clay layer. The groundwater table of the superficial aquifer is located at about 10 m depth. The one-layer groundwater flow model simulating aquifer behavior covers about 1.2 km2. Its conceptual model relies on a 3D geological model built using 7 geological cross-sections of the studied domain.
Detailed geological data were provided by direct-push core sampling correlated with penetration time and with electrical conductivity tests. An important role in the spatial distribution of the contaminants is played by the hydro-stratigraphical features of the site. In situ testing of hydraulic conductivity was performed by injecting water under a specified pressure (4-5 bar) into the aquifer. In a preliminary stage, the interpretation provides a relative profile of hydraulic conductivity. By means of several slug tests, these results are translated into absolute values of hydraulic conductivity. The calibrated flow model represents the first step in the quantitative assessment of the groundwater parameters. Correlating the surface and soil distribution of the pollutants, a multi-component transport model is currently being set up in order to quantify the spatial distribution of the contaminated area.

  12. Family Environment and Childhood Obesity: A New Framework with Structural Equation Modeling

    PubMed Central

    Huang, Hui; Wan Mohamed Radzi, Che Wan Jasimah bt; Salarzadeh Jenatabadi, Hashem

    2017-01-01

    The main purpose of the current article is to introduce a framework of the complexity of childhood obesity based on the family environment. A conceptual model that quantifies the relationships and interactions among parental socioeconomic status, family food security level, child’s food intake and certain aspects of parental feeding behaviour is presented using the structural equation modeling (SEM) concept. Structural models are analysed in terms of the direct and indirect connections among latent and measurement variables that lead to the child weight indicator. To illustrate the accuracy, fit, reliability and validity of the introduced framework, real data collected from 630 families from Urumqi (Xinjiang, China) were considered. The framework includes two categories of data comprising the normal body mass index (BMI) range and obesity data. The comparison analysis between two models provides some evidence that in obesity modeling, obesity data must be extracted from the dataset and analysis must be done separately from the normal BMI range. This study may be helpful for researchers interested in childhood obesity modeling based on family environment. PMID:28208833

  13. Family Environment and Childhood Obesity: A New Framework with Structural Equation Modeling.

    PubMed

    Huang, Hui; Wan Mohamed Radzi, Che Wan Jasimah Bt; Salarzadeh Jenatabadi, Hashem

    2017-02-13

    The main purpose of the current article is to introduce a framework of the complexity of childhood obesity based on the family environment. A conceptual model that quantifies the relationships and interactions among parental socioeconomic status, family food security level, child's food intake and certain aspects of parental feeding behaviour is presented using the structural equation modeling (SEM) concept. Structural models are analysed in terms of the direct and indirect connections among latent and measurement variables that lead to the child weight indicator. To illustrate the accuracy, fit, reliability and validity of the introduced framework, real data collected from 630 families from Urumqi (Xinjiang, China) were considered. The framework includes two categories of data comprising the normal body mass index (BMI) range and obesity data. The comparison analysis between two models provides some evidence that in obesity modeling, obesity data must be extracted from the dataset and analysis must be done separately from the normal BMI range. This study may be helpful for researchers interested in childhood obesity modeling based on family environment.

  14. Determination of sample size for higher volatile data using new framework of Box-Jenkins model with GARCH: A case study on gold price

    NASA Astrophysics Data System (ADS)

    Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah

    2017-09-01

    The Box-Jenkins model with GARCH has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10200 observations). Each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size to forecast the gold price using the framework of the hybrid model is 1250 observations, a 5-year sample. Hence, the empirical results of the model selection criteria and 1-step-ahead forecasting evaluations suggest that the latest 12.25% (5 years) of the 10200 observations is sufficient for the Box-Jenkins - GARCH model, with forecasting performance similar to that obtained using the full 41-year dataset.
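    The sample-size question can be illustrated outside the full Box-Jenkins - GARCH machinery with a pure-numpy AR(1) sketch: fit the model on trailing windows of a chosen length and score rolling 1-step-ahead forecasts. This is illustrative only; the paper's hybrid model and gold-price data are not reproduced here:

```python
import numpy as np

def ar1_coef(x):
    """OLS estimate of the AR(1) coefficient of a zero-mean series."""
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))

def one_step_rmse(series, window):
    """Refit AR(1) on a trailing window at each step; return the RMSE of
    the resulting rolling 1-step-ahead forecasts."""
    errs = []
    for t in range(window, len(series) - 1):
        phi = ar1_coef(series[t - window:t + 1])
        errs.append(series[t + 1] - phi * series[t])
    return float(np.sqrt(np.mean(np.square(errs))))

# simulate an AR(1) series with known coefficient 0.8
rng = np.random.default_rng(42)
x = np.zeros(3000)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.normal()
```

    Comparing `one_step_rmse` across window lengths mimics the paper's procedure of evaluating forecasting performance for each candidate sample size; beyond some window length, extra data stops paying off.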

  15. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models, and consequently combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to support personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
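    The switching idea itself is simple to sketch: at each step, score every candidate model on its recent one-step errors and forecast with the current best. The toy "population" and "persistence" models below are illustrative stand-ins, not the authors' algorithm:

```python
import numpy as np

def adaptive_forecast(series, models, window=10):
    """At each step, pick the model with the lowest mean squared one-step
    error over the last `window` points and forecast series[t] with it."""
    preds = []
    for t in range(window + 1, len(series)):
        scores = []
        for model in models:
            errs = [(series[s] - model(series[:s])) ** 2
                    for s in range(t - window, t)]
            scores.append(np.mean(errs))
        best = models[int(np.argmin(scores))]
        preds.append(best(series[:t]))      # prediction for series[t]
    return np.array(preds)                  # aligned with series[window+1:]

# toy candidates: a fixed "population" forecast vs. patient persistence
population_model = lambda h: 0.0            # population-level constant
persistence_model = lambda h: h[-1]         # last observed value
```

    In the paper's setting the candidate pool would contain population, patient-specific and short-term individualized multivariate models rather than these two scalar toys; the selection-by-recent-error loop is the common structure.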

  16. Framework of distributed coupled atmosphere-ocean-wave modeling system

    NASA Astrophysics Data System (ADS)

    Wen, Yuanqiao; Huang, Liwen; Deng, Jian; Zhang, Jinfeng; Wang, Sisi; Wang, Lijun

    2006-05-01

    In order to research the interactions between the atmosphere and ocean, as well as their important role in the intensive weather systems of coastal areas, and to improve the forecasting of hazardous weather processes in coastal areas, a coupled atmosphere-ocean-wave modeling system has been developed. The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For the purpose of flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, computational layer and service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.

  17. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE PAGES

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul; ...

    2017-12-20

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.

  18. pyomo.dae: a modeling and automatic discretization framework for optimization with differential and algebraic equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicholson, Bethany; Siirola, John D.; Watson, Jean-Paul

    We describe pyomo.dae, an open source Python-based modeling framework that enables high-level abstract specification of optimization problems with differential and algebraic equations. The pyomo.dae framework is integrated with the Pyomo open source algebraic modeling language, and is available at http://www.pyomo.org. One key feature of pyomo.dae is that it does not restrict users to standard, predefined forms of differential equations, providing a high degree of modeling flexibility and the ability to express constraints that cannot be easily specified in other modeling frameworks. Other key features of pyomo.dae are the ability to specify optimization problems with high-order differential equations and partial differential equations, defined on restricted domain types, and the ability to automatically transform high-level abstract models into finite-dimensional algebraic problems that can be solved with off-the-shelf solvers. Moreover, pyomo.dae users can leverage existing capabilities of Pyomo to embed differential equation models within stochastic and integer programming models and mathematical programs with equilibrium constraint formulations. Collectively, these features enable the exploration of new modeling concepts, discretization schemes, and the benchmarking of state-of-the-art optimization solvers.

  19. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    NASA Astrophysics Data System (ADS)

    Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta

    2018-05-01

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high-level and forward-looking modeling framework was developed. The components of the framework are establishment-period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance events, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and a specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework is discussed. The long-term impacts of grass buffer strips (an agricultural BMP) and bioretention systems (an urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were identified in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution, under the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well, with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.
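    The Bayesian matching step lends itself to a toy sketch: below, a conjugate normal-normal update combines a prior belief about mean BMP efficiency with observed efficiencies. The normal form and every number are assumptions made for illustration, not the paper's actual model.

```python
def update_efficiency(prior_mean, prior_var, observed, obs_var):
    """Conjugate normal-normal update: combine a prior on mean BMP efficiency
    with observed efficiencies; posterior variance shrinks as data accumulate."""
    n = len(observed)
    obs_mean = sum(observed) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

# Hypothetical prior: buffer strips remove ~60% of total phosphorus;
# hypothetical field observations suggest somewhat less.
mean, var = update_efficiency(0.60, 0.04, [0.45, 0.50, 0.55], 0.01)
```

    The posterior mean is pulled from the prior toward the data, and the posterior variance is smaller than the prior's, mirroring how additional monitoring data tighten the simulated efficiency distribution.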

  20. The Development of a Conceptual Framework and Tools to Assess Undergraduates' Principled Use of Models in Cellular Biology

    PubMed Central

    Merritt, Brett; Urban-Lurain, Mark; Parker, Joyce

    2010-01-01

    Recent science education reform has been marked by a shift away from a focus on facts toward deep, rich, conceptual understanding. This requires assessment that also focuses on conceptual understanding rather than recall of facts. This study outlines our development of a new assessment framework and tool (a taxonomy) that, unlike existing frameworks and tools, is grounded firmly in the critical role that models play in science. It also provides instructors with a resource for assessing students' ability to reason about models that are central to the organization of key scientific concepts. We describe preliminary data arising from the application of our tool to exam questions used by instructors of a large-enrollment cell and molecular biology course over a 5-yr period during which our framework and the assessment tool were increasingly used. Students became increasingly able to describe and manipulate models of the processes and systems being studied in this course, as measured by assessment items. However, their ability to apply these models in new contexts did not improve. Finally, we discuss the implications of our results and future directions for our research. PMID:21123691

  1. A framework for predicting impacts on ecosystem services from (sub)organismal responses to chemicals.

    PubMed

    Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille

    2017-04-01

    Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC.

  2. Classification framework for partially observed dynamical systems

    NASA Astrophysics Data System (ADS)

    Shen, Yuan; Tino, Peter; Tsaneva-Atanasova, Krasimira

    2017-04-01

    We present a general framework for classifying partially observed dynamical systems based on the idea of learning in the model space. In contrast to the existing approaches using point estimates of model parameters to represent individual data items, we employ posterior distributions over model parameters, thus taking into account in a principled manner the uncertainty due to both the generative (observational and/or dynamic noise) and observation (sampling in time) processes. We evaluate the framework on two test beds: a biological pathway model and a stochastic double-well system. Crucially, we show that the classification performance is not impaired when the model structure used for inferring posterior distributions is much more simple than the observation-generating model structure, provided the reduced-complexity inferential model structure captures the essential characteristics needed for the given classification task.
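    The idea of learning in the model space can be sketched in a few lines: each data item is represented by a posterior over a model parameter rather than a point estimate, and classification compares items in units of the combined posterior uncertainty. The one-parameter normal posterior below is an illustrative stand-in for the paper's inferential models; all names and numbers are invented.

```python
import math

def posterior(samples, sigma2=1.0):
    """Posterior over a latent mean under a flat prior: N(sample mean, sigma2/n)."""
    n = len(samples)
    return sum(samples) / n, sigma2 / n

def classify(item, references):
    """Assign the item to the reference class whose parameter posterior is
    nearest, measuring distance in units of the combined posterior spread."""
    mi, vi = posterior(item)
    best, best_d = None, float("inf")
    for label, ref in references.items():
        mr, vr = posterior(ref)
        d = abs(mi - mr) / math.sqrt(vi + vr)
        if d < best_d:
            best, best_d = label, d
    return best

# Two reference "systems", each summarized by its inferred parameter posterior.
refs = {"slow": [0.0, 0.1, -0.1, 0.05], "fast": [2.0, 2.1, 1.9, 2.05]}
label = classify([1.9, 2.0, 2.1], refs)
```

    Because distances are scaled by posterior variance, a sparsely observed item is not over-confidently assigned, which is the principled treatment of uncertainty the abstract emphasizes.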

  3. Improved Hypoxia Modeling for Nutrient Control Decisions in the Gulf of Mexico

    NASA Technical Reports Server (NTRS)

    Habib, Shahid; Pickering, Ken; Tzortziou, Maria; Maninio, Antonio; Policelli, Fritz; Stehr, Jeff

    2011-01-01

    The Gulf of Mexico Modeling Framework is a suite of coupled models linking the deposition and transport of sediment and nutrients to subsequent biogeochemical processes and the resulting effect on concentrations of dissolved oxygen in the coastal waters of Louisiana and Texas. Here, we examine the potential benefits of using multiple NASA remote sensing data products within this Modeling Framework for increasing the accuracy of the models and their utility for nutrient control decisions in the Gulf of Mexico. Our approach is divided into three components: evaluation and improvement of (a) the precipitation input data, (b) atmospheric constituent concentrations in EPA's air quality/deposition model, and (c) the calculation of algal biomass, organic carbon and suspended solids within the water quality/eutrophication models of the framework.

  4. A hybrid model of cell cycle in mammals.

    PubMed

    Behaegel, Jonathan; Comet, Jean-Paul; Bernot, Gilles; Cornillon, Emilien; Delaunay, Franck

    2016-02-01

    Time plays an essential role in many biological systems, especially in the cell cycle. Many models of biological systems rely on differential equations, but parameter identification is an obstacle to using differential frameworks. In this paper, we present a new hybrid modeling framework that extends René Thomas' discrete modeling. The core idea is to associate with each qualitative state "celerities" allowing us to compute the time spent in each state. This hybrid framework is illustrated by building a 5-variable model of the mammalian cell cycle. Its parameters are determined by applying formal methods on the underlying discrete model and by constraining parameters using timing observations on the cell cycle. This first hybrid model exhibits the most important known behaviors of the cell cycle, including the quiescent phase and endoreplication.
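    The celerity construction admits a small numeric sketch: if each qualitative state has unit width, the time spent there is the reciprocal of the celerity's magnitude, so traversal times fall out of the discrete model. The state names and rates below are hypothetical, not the paper's calibrated values.

```python
def traversal_time(path, celerities):
    """Time to traverse a sequence of qualitative states: with unit-width
    states, dwell time in a state is 1 / |celerity| of that state."""
    return sum(1.0 / abs(celerities[state]) for state in path)

# Hypothetical celerities for the four classic cell-cycle phases.
period = traversal_time(["G1", "S", "G2", "M"],
                        {"G1": 0.5, "S": 1.0, "G2": 2.0, "M": 4.0})
```

    Constraining such celerities with observed phase durations is, in spirit, how timing observations restrict the hybrid model's parameters.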

  5. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework

    PubMed Central

    Nahum-Shani, Inbal; Hekler, Eric B.; Spruijt-Metz, Donna

    2016-01-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs)--suites of interventions that adapt over time to an individual's changing status and circumstances, with the goal of addressing the individual's need for support whenever that need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs. PMID:26651462

  6. Predictive Models and Computational Embryology

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  7. Multi-Scale Multi-Domain Model | Transportation Research | NREL

    Science.gov Websites

    Macroscopic design factors and highly dynamic environmental conditions significantly influence the design of affordable, long-lasting, high-performing, and safe large battery systems. NREL's Multi-Scale Multi-Domain (MSMD) model provides a framework that quantifies these impacts along the battery's electrical/thermal pathways.

  8. Flow resistance interactions on hillslopes with heterogeneous attributes: Effects on runoff hydrograph characteristics

    USDA-ARS?s Scientific Manuscript database

    An improved modeling framework for capturing the effects of dynamic resistance to overland flow is developed for intensively managed landscapes. The framework builds on the WEPP model but it removes the limitations of the “equivalent” plane and static roughness assumption. The enhanced model therefo...

  9. An Exploration of the Factors Influencing the Adoption of an IS Governance Framework

    ERIC Educational Resources Information Center

    Parker, Sharon L.

    2013-01-01

    This research explored IT governance framework adoption, leveraging established IS theories. It applied both the technology acceptance model (TAM) and the technology-organization-environment (TOE) framework. The study consisted of developing a model utilizing TOE and TAM and deriving relevant hypotheses. Interviews with a group of practitioners…

  10. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2012-12-12

    The goal of this study is to develop a new bottom-up industrial-sector energy-modeling framework for addressing least-cost regional and global carbon-reduction strategies. The framework improves on the capabilities and limitations of existing models and, as an alternative, allows trading across regions and countries.

  11. Order-Constrained Bayes Inference for Dichotomous Models of Unidimensional Nonparametric IRT

    ERIC Educational Resources Information Center

    Karabatsos, George; Sheu, Ching-Fan

    2004-01-01

    This study introduces an order-constrained Bayes inference framework useful for analyzing data containing dichotomous scored item responses, under the assumptions of either the monotone homogeneity model or the double monotonicity model of nonparametric item response theory (NIRT). The framework involves the implementation of Gibbs sampling to…

  12. A geostatistical extreme-value framework for fast simulation of natural hazard events

    PubMed Central

    Stephenson, David B.

    2016-01-01

    We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
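    The marginal model can be sketched with inverse-CDF sampling from a generalized Pareto distribution, the extreme-value family named above. This toy omits the generalized additive model for the parameters and the Student's t-process for spatial dependence, and its parameter values are invented.

```python
import random

def gpd_quantile(p, sigma, xi):
    """Inverse CDF of the generalized Pareto distribution (shape xi != 0)."""
    return sigma / xi * ((1.0 - p) ** (-xi) - 1.0)

def simulate_exceedances(n, sigma, xi, seed=0):
    """Draw synthetic threshold exceedances (e.g., peak gusts above a site
    threshold) by inverse-CDF sampling of uniform variates."""
    rng = random.Random(seed)
    return [gpd_quantile(rng.random(), sigma, xi) for _ in range(n)]

gusts = simulate_exceedances(500, sigma=10.0, xi=0.2)
```

    Because each simulated year only requires cheap uniform draws pushed through a closed-form quantile function, very long synthetic records become computationally affordable, which is the efficiency argument the abstract makes.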

  13. Robust Decision-making Applied to Model Selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hemez, Francois M.

    2012-08-06

    The scientific and engineering communities are relying more and more on numerical models to simulate increasingly complex phenomena. Selecting a model, from among a family of models that meets the simulation requirements, presents a challenge to modern-day analysts. To address this concern, a framework is adopted anchored in info-gap decision theory. The framework proposes to select models by examining the trade-offs between prediction accuracy and sensitivity to epistemic uncertainty. The framework is demonstrated on two structural engineering applications by asking the following question: Which model, of several numerical models, approximates the behavior of a structure when parameters that define each of those models are unknown? One observation is that models that are nominally more accurate are not necessarily more robust, and their accuracy can deteriorate greatly depending upon the assumptions made. It is posited that, as reliance on numerical models increases, establishing robustness will become as important as demonstrating accuracy.

  14. Linking service quality, customer satisfaction, and behavioral intention.

    PubMed

    Woodside, A G; Frey, L L; Daly, R T

    1989-12-01

    Based on the service quality and script theory literature, a framework of relationships among service quality, customer satisfaction, and behavioral intention for service purchases is proposed. Specific models are developed from the general framework and the models are applied and tested for the highly complex and divergent consumer service of overnight hospital care. Service quality, customer satisfaction, and behavioral intention data were collected from recent patients of two hospitals. The findings support the specific models and general framework. Implications for theory, service marketing, and future research are discussed.

  15. Decision support models for solid waste management: Review and game-theoretic approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karmperis, Athanasios C., E-mail: athkarmp@mail.ntua.gr; Army Corps of Engineers, Hellenic Army General Staff, Ministry of Defence; Aravossis, Konstantinos

    Highlights: ► The most commonly used decision support frameworks for solid waste management are reviewed. ► The LCA, CBA and MCDM models are presented and their strengths, weaknesses, similarities and possible combinations are analyzed. ► The game-theoretic approach in a solid waste management context is presented. ► The waste management bargaining game is introduced as a specific decision support framework. ► Cooperative and non-cooperative game-theoretic approaches to decision support for solid waste management are discussed. - Abstract: This paper surveys decision support models that are commonly used in the solid waste management area. Most models are developed within three decision support frameworks: life-cycle assessment, cost–benefit analysis and multi-criteria decision-making. These frameworks are reviewed and their strengths and weaknesses as well as their critical issues are analyzed, while their possible combinations and extensions are also discussed. Furthermore, the paper presents how cooperative and non-cooperative game-theoretic approaches can be used for the purpose of modeling and analyzing decision-making in situations with multiple stakeholders. Specifically, since a waste management model is sustainable when considering not only environmental and economic but also social aspects, the waste management bargaining game is introduced as a specific decision support framework in which future models can be developed.

  16. Structured statistical models of inductive reasoning.

    PubMed

    Kemp, Charles; Tenenbaum, Joshua B

    2009-01-01

    Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet both goals and describes 4 applications of the framework: a taxonomic model, a spatial model, a threshold model, and a causal model. Each model makes probabilistic inferences about the extensions of novel properties, but the priors for the 4 models are defined over different kinds of structures that capture different relationships between the categories in a domain. The framework therefore shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.

  17. V&V framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: 1) program planning based on expert elicitation of the modeling physics requirements, 2) experimental design for model assessment, 3) uncertainty quantification for experimental observations and computational model simulations, and 4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  18. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and its output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  19. Three-dimensional hydrogeologic framework model for use with a steady-state numerical ground-water flow model of the Death Valley regional flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Faunt, Claudia C.; D'Agnese, Frank A.

    2002-01-01

    The U.S. Geological Survey, in cooperation with the Department of Energy and other Federal, State, and local agencies, is evaluating the hydrogeologic characteristics of the Death Valley regional ground-water flow system. The ground-water flow system covers an area of about 100,000 square kilometers from latitude 35° to 38°15′ North and longitude 115° to 118° West, with the flow system proper comprising about 45,000 square kilometers. The Death Valley regional ground-water flow system is one of the larger flow systems within the Southwestern United States and includes in its boundaries the Nevada Test Site, Yucca Mountain, and much of Death Valley. Part of this study includes the construction of a three-dimensional hydrogeologic framework model to serve as the foundation for the development of a steady-state regional ground-water flow model. The digital framework model provides a computer-based description of the geometry and composition of the hydrogeologic units that control regional flow. The framework model of the region was constructed by merging two previous framework models constructed for the Yucca Mountain Project and the Environmental Restoration Program Underground Test Area studies at the Nevada Test Site. The hydrologic characteristics of the region result from a currently arid climate and complex geology. Interbasinal regional ground-water flow occurs through a thick carbonate-rock sequence of Paleozoic age, a locally thick volcanic-rock sequence of Tertiary age, and basin-fill alluvium of Tertiary and Quaternary age. Throughout the system, deep and shallow ground-water flow may be controlled by extensive and pervasive regional and local faults and fractures. The framework model was constructed using data from several sources to define the geometry of the regional hydrogeologic units.
These data sources include (1) a 1:250,000-scale hydrogeologic-map compilation of the region; (2) regional-scale geologic cross sections; (3) borehole information; and (4) gridded surfaces from a previous three-dimensional geologic model. In addition, digital elevation model data were used in conjunction with these data to define ground-surface altitudes. These data, properly oriented in three dimensions by using geographic information systems, were combined and gridded to produce the upper surfaces of the hydrogeologic units used in the flow model. The final geometry of the framework model is constructed as a volumetric model by incorporating the intersections of these gridded surfaces and by applying fault truncation rules to structural features from the geologic map and cross sections. The cells defining the geometry of the hydrogeologic framework model can be assigned several attributes such as lithology, hydrogeologic unit, thickness, and top and bottom altitudes.
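    The surface-stacking step can be sketched on a toy grid: gridded unit-top elevations are clipped so deeper tops never rise above shallower ones, and unit thicknesses follow by subtraction. This simple non-crossing rule is only a stand-in for the full fault-truncation rules described above; the grid and elevations are invented.

```python
def truncate_tops(tops):
    """Enforce stratigraphic order on gridded unit-top elevations (shallowest
    first): each deeper top is clipped at the top of the unit above it."""
    out = [list(tops[0])]
    for surf in tops[1:]:
        out.append([min(a, b) for a, b in zip(surf, out[-1])])
    return out

def thicknesses(tops, base):
    """Unit thickness at each grid node: a unit's top minus the top below it."""
    stacked = tops + [base]
    return [[t - b for t, b in zip(stacked[i], stacked[i + 1])]
            for i in range(len(tops))]

# Three hypothetical unit tops over a two-node grid; the second unit's top
# locally rises above the first and gets truncated (thickness 0 there).
tops = truncate_tops([[100, 100], [90, 105], [50, 60]])
thick = thicknesses(tops, [0, 0])
```

    A zero thickness after truncation marks where a unit pinches out, analogous to how the volumetric model records the subsurface extent of each hydrostratigraphic unit.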

  20. Applying air pollution modelling within a multi-criteria decision analysis framework to evaluate UK air quality policies

    NASA Astrophysics Data System (ADS)

    Chalabi, Zaid; Milojevic, Ai; Doherty, Ruth M.; Stevenson, David S.; MacKenzie, Ian A.; Milner, James; Vieno, Massimo; Williams, Martin; Wilkinson, Paul

    2017-10-01

    A decision support system for evaluating UK air quality policies is presented. It combines the output from a chemistry transport model, a health impact model and other impact models within a multi-criteria decision analysis (MCDA) framework. As a proof-of-concept, the MCDA framework is used to evaluate and compare idealized emission reduction policies in four sectors (combustion in energy and transformation industries, non-industrial combustion plants, road transport and agriculture) and across six outcomes or criteria (mortality, health inequality, greenhouse gas emissions, biodiversity, crop yield and air quality legal compliance). To illustrate a realistic use of the MCDA framework, the relative importance of the criteria was elicited from a number of stakeholders acting as proxy policy makers. In the prototype decision problem, we show that reducing emissions from industrial combustion (followed very closely by road transport and agriculture) is more advantageous than equivalent reductions from the other sectors when all the criteria are taken into account. Extensions of the MCDA framework to support policy makers in practice are discussed.
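    A weighted-sum aggregation is the simplest MCDA scheme and serves as a sketch of how elicited criterion weights combine with modeled policy outcomes. The criteria values and weights below are made up for illustration and do not reproduce the paper's elicitation or sectors in full.

```python
def mcda_rank(policies, weights):
    """Weighted-sum MCDA: criterion values are assumed normalized to [0, 1]
    with higher = better; returns the best policy and all scores."""
    scores = {name: sum(weights[c] * v for c, v in criteria.items())
              for name, criteria in policies.items()}
    return max(scores, key=scores.get), scores

# Hypothetical elicited weights and normalized outcomes for two sectors.
weights = {"mortality": 0.5, "ghg": 0.3, "crop_yield": 0.2}
policies = {
    "industrial_combustion": {"mortality": 0.8, "ghg": 0.7, "crop_yield": 0.6},
    "road_transport":        {"mortality": 0.7, "ghg": 0.6, "crop_yield": 0.9},
}
best, scores = mcda_rank(policies, weights)
```

    Real MCDA applications often add normalization, sensitivity analysis over the elicited weights, or non-compensatory rules, but the weight-times-value core is the same.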

  1. A Micro-Level Data-Calibrated Agent-Based Model: The Synergy between Microsimulation and Agent-Based Modeling.

    PubMed

    Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee

    2018-01-01

    Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework that enables scientists and modelers to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrate how ALife techniques can be used by researchers in relation to social issues and policies.
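    A minimal sketch of the microsimulation/agent-based hybrid: agents carry an age, survive each simulated year with an age-dependent probability (standing in for rates derived from census data), and survivors age by one year. All rates, ages, and names here are hypothetical.

```python
import random

def project_agents(ages, survival, years, seed=0):
    """One-year agent-based steps: each agent survives with an age-dependent
    probability; survivors age by one year."""
    rng = random.Random(seed)
    for _ in range(years):
        ages = [a + 1 for a in ages if rng.random() < survival(a)]
    return ages

# Hypothetical schedule: high survival before age 75, none afterwards.
def survive(age):
    return 0.99 if age < 75 else 0.0

cohort = project_agents([30] * 5 + [74] * 5, survive, 10)
```

    Richer versions condition each agent's transitions on individual attributes drawn from microdata (the microsimulation feature) and let agents interact (the agent-based feature), which is how the artificial society supports pension-cost projections.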

  2. Development of an "Alert Framework" Based on the Practices in the Medical Front.

    PubMed

    Sakata, Takuya; Araki, Kenji; Yamazaki, Tomoyoshi; Kawano, Koichi; Maeda, Minoru; Kushima, Muneo; Araki, Sanae

    2018-05-09

    At the University of Miyazaki Hospital (UMH), we have accumulated and semantically structured a vast amount of medical information since the activation of the electronic health record system approximately 10 years ago. With this medical information, we have decided to develop an alert system for aiding in medical treatment. The purpose of this investigation is not only to integrate an alert framework into the electronic health record system, but also to formulate a method for modeling this knowledge. A trial alert framework was developed for the staff in various occupational categories at the UMH. Based on the findings of subsequent interviews, a more detailed and upgraded alert framework was constructed, resulting in the final model. Based on our current findings, an alert framework was developed with four major items. Analysis of the medical practices from the trial model indicated that there are four major risk patterns that trigger the alert. Furthermore, the current alert framework contains detailed definitions which are easily substituted into the database, leading to easy implementation within the electronic health records.

  3. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    NASA Astrophysics Data System (ADS)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following an explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  4. Modular modelling with Physiome standards

    PubMed Central

    Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.

    2016-01-01

    Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks, but relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples, including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales.
To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233

  5. An Integrated Ensemble-Based Operational Framework to Predict Urban Flooding: A Case Study of Hurricane Sandy in the Passaic and Hackensack River Basins

    NASA Astrophysics Data System (ADS)

    Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.

    2016-12-01

    Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area, including the New Jersey Transit's main storage and maintenance facility. 
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risk compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
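
    The ensemble post-processing step described above can be sketched in a few lines: given peak-stage forecasts from 125 members at one gauge, summarize them into a central estimate, an uncertainty band, and a threshold-exceedance probability. The member values, gauge, and 3.5 m flood stage below are invented for illustration, not taken from the H3E system.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical peak-stage forecasts (m) from 125 ensemble members at one gauge.
members = rng.normal(loc=3.2, scale=0.5, size=125)

# Summarize the ensemble into guidance: central estimate, uncertainty band,
# and the probability of exceeding a flood threshold.
median = np.median(members)
p05, p95 = np.percentile(members, [5, 95])
flood_stage = 3.5  # illustrative threshold (m)
prob_flood = np.mean(members > flood_stage)

print(f"median {median:.2f} m, 90% band [{p05:.2f}, {p95:.2f}] m, "
      f"P(stage > {flood_stage} m) = {prob_flood:.2f}")
```

    In an operational setting the same summary would be computed per grid cell or river reach at each forecast lead time.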

  6. A Framework for the Study of Emotions in Organizational Contexts.

    ERIC Educational Resources Information Center

    Fiebig, Greg V.; Kramer, Michael W.

    1998-01-01

    Approaches the study of emotions in organizations holistically, based on a proposed framework. Provides descriptive data that suggests the presence of the framework's major elements. States that future examination of emotions based on this framework should assist in understanding emotions, which are frequently ignored in a rational model. (PA)

  7. Testing a Conceptual Change Model Framework for Visual Data

    ERIC Educational Resources Information Center

    Finson, Kevin D.; Pedersen, Jon E.

    2015-01-01

    An emergent data analysis technique was employed to test the veracity of a conceptual framework constructed around visual data use and instruction in science classrooms. The framework incorporated all five key components Vosniadou (2007a, 2007b) described as existing in a learner's schema: framework theory, presuppositions, conceptual domains,…

  8. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges

    PubMed Central

    Amarasingham, Ruben; Audet, Anne-Marie J.; Bates, David W.; Glenn Cohen, I.; Entwistle, Martin; Escobar, G. J.; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M.; Shahi, Anand; Stewart, Walter F.; Steyerberg, Ewout W.; Xie, Bin

    2016-01-01

    Context: The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Objectives: Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. Methods: To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. Findings: The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). 
The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models. PMID:27141516

  9. Consensus Statement on Electronic Health Predictive Analytics: A Guiding Framework to Address Challenges.

    PubMed

    Amarasingham, Ruben; Audet, Anne-Marie J; Bates, David W; Glenn Cohen, I; Entwistle, Martin; Escobar, G J; Liu, Vincent; Etheredge, Lynn; Lo, Bernard; Ohno-Machado, Lucila; Ram, Sudha; Saria, Suchi; Schilling, Lisa M; Shahi, Anand; Stewart, Walter F; Steyerberg, Ewout W; Xie, Bin

    2016-01-01

    The recent explosion in available electronic health record (EHR) data is motivating a rapid expansion of electronic health care predictive analytic (e-HPA) applications, defined as the use of electronic algorithms that forecast clinical events in real time with the intent to improve patient outcomes and reduce costs. There is an urgent need for a systematic framework to guide the development and application of e-HPA to ensure that the field develops in a scientifically sound, ethical, and efficient manner. Building upon earlier frameworks of model development and utilization, we identify the emerging opportunities and challenges of e-HPA, propose a framework that enables us to realize these opportunities, address these challenges, and motivate e-HPA stakeholders to both adopt and continuously refine the framework as the applications of e-HPA emerge. To achieve these objectives, 17 experts with diverse expertise including methodology, ethics, legal, regulation, and health care delivery systems were assembled to identify emerging opportunities and challenges of e-HPA and to propose a framework to guide the development and application of e-HPA. The framework proposed by the panel includes three key domains where e-HPA differs qualitatively from earlier generations of models and algorithms (Data Barriers, Transparency, and Ethics) and areas where current frameworks are insufficient to address the emerging opportunities and challenges of e-HPA (Regulation and Certification; and Education and Training). The following list of recommendations summarizes the key points of the framework: Data Barriers: Establish mechanisms within the scientific community to support data sharing for predictive model development and testing. Transparency: Set standards around e-HPA validation based on principles of scientific transparency and reproducibility. Ethics: Develop both individual-centered and society-centered risk-benefit approaches to evaluate e-HPA. Regulation and Certification: Construct a self-regulation and certification framework within e-HPA. Education and Training: Make significant changes to medical, nursing, and paraprofessional curricula by including training for understanding, evaluating, and utilizing predictive models.

  10. The Effect of Framework Design on Stress Distribution in Implant-Supported FPDs: A 3-D FEM Study

    PubMed Central

    Eraslan, Oguz; Inan, Ozgur; Secilmis, Asli

    2010-01-01

    Objectives: The biomechanical behavior of the superstructure plays an important role in the functional longevity of dental implants. However, information about the influence of framework design on stresses transmitted to the implants and supporting tissues is limited. The purpose of this study was to evaluate the effects of framework designs on stress distribution at the supporting bone and supporting implants. Methods: In this study, the three-dimensional (3D) finite element stress analysis method was used. Three types of 3D mathematical models simulating three different framework designs for implant-supported 3-unit posterior fixed partial dentures were prepared with supporting structures. Convex (1), concave (2), and conventional (3) pontic framework designs were simulated. A 300-N static vertical occlusal load was applied on the node at the center of occlusal surface of the pontic to calculate the stress distributions. As a second condition, frameworks were directly loaded to evaluate the effect of the framework design clearly. The Solidworks/Cosmosworks structural analysis programs were used for finite element modeling/analysis. Results: The analysis of the von Mises stress values revealed that maximum stress concentrations were located at the loading areas for all models. The pontic side marginal edges of restorations and the necks of implants were other stress concentration regions. There was no clear difference among models when the restorations were loaded at occlusal surfaces. When the veneering porcelain was removed, and load was applied directly to the framework, there was a clear increase in stress concentration with a concave design on supporting implants and bone structure. Conclusions: The present study showed that the use of a concave design in the pontic frameworks of fixed partial dentures increases the von Mises stress levels on implant abutments and supporting bone structure. 
However, the veneering porcelain element reduces the effect of the framework and compensates for design weaknesses. PMID:20922156

  11. CSDMS2.0: Computational Infrastructure for Community Surface Dynamics Modeling

    NASA Astrophysics Data System (ADS)

    Syvitski, J. P.; Hutton, E.; Peckham, S. D.; Overeem, I.; Kettner, A.

    2012-12-01

    The Community Surface Dynamics Modeling System (CSDMS) is an NSF-supported, international and community-driven program that seeks to transform the science and practice of earth-surface dynamics modeling. CSDMS integrates a diverse community of more than 850 geoscientists representing 360 international institutions (academic, government, industry) from 60 countries and is supported by a CSDMS Interagency Committee (22 Federal agencies) and a CSDMS Industrial Consortium (18 companies). CSDMS presently distributes more than 200 Open Source models and modeling tools, provides access to high performance computing clusters in support of developing and running models, and offers a suite of products for education and knowledge transfer. CSDMS software architecture employs frameworks and services that convert stand-alone models into flexible "plug-and-play" components to be assembled into larger applications. CSDMS2.0 will support model applications within a web browser, on a wider variety of computational platforms, and on other high performance computing clusters to ensure robustness and sustainability of the framework. Conversion of stand-alone models into "plug-and-play" components will employ automated wrapping tools. Methods for quantifying model uncertainty are being adapted as part of the modeling framework. Benchmarking data are being incorporated into the CSDMS modeling framework to support model inter-comparison. Finally, a robust mechanism for ingesting and utilizing semantic mediation databases is being developed within the Modeling Framework. 
Six new community initiatives are being pursued: 1) an earth-ecosystem modeling initiative to capture ecosystem dynamics and ensuing interactions with landscapes, 2) a geodynamics initiative to investigate the interplay among climate, geomorphology, and tectonic processes, 3) an Anthropocene modeling initiative, to incorporate mechanistic models of human influences, 4) a coastal vulnerability modeling initiative, with emphasis on deltas and their multiple threats and stressors, 5) a continental margin modeling initiative, to capture extreme oceanic and atmospheric events generating turbidity currents in the Gulf of Mexico, and 6) a CZO Focus Research Group, to develop compatibility between CSDMS architecture and protocols and Critical Zone Observatory-developed models and data.

  12. Monitoring and modeling as a continuing learning process: the use of hydrological models in a general probabilistic framework.

    NASA Astrophysics Data System (ADS)

    Baroni, G.; Gräff, T.; Reinstorf, F.; Oswald, S. E.

    2012-04-01

    Nowadays, uncertainty and sensitivity analysis are considered basic tools for the assessment of hydrological models and the evaluation of the most important sources of uncertainty. In this context, several methods have been developed and applied in different hydrological conditions over the last decades. However, most studies have investigated mainly the influence of parameter uncertainty on the simulated outputs, and few approaches have also considered other sources of uncertainty, i.e., input data and model structure. Moreover, several constraints arise when spatially distributed parameters are involved. To overcome these limitations, a general probabilistic framework based on Monte Carlo simulations and the Sobol method has been proposed. In this study, the general probabilistic framework was applied at field scale using a 1D physically based hydrological model (SWAP). Furthermore, the framework was extended to catchment scale in combination with a spatially distributed hydrological model (SHETRAN). The models are applied at two different experimental sites in Germany: a relatively flat cropped field close to Potsdam (Brandenburg) and a small mountainous catchment with agricultural land use (Schaefertal, Harz Mountains). For both cases, input and parameters are considered the major sources of uncertainty. Evaluation of the models was based on soil moisture measured at plot scale at different depths and, for the catchment site, also on daily discharge values. The study shows how the framework can take into account the various sources of uncertainty, i.e., input data, parameters (either in scalar or spatially distributed form) and model structure. The framework can be used in a loop in order to optimize further monitoring activities used to improve the performance of the model. In the particular applications, the results show how the sources of uncertainty are specific for each process considered. 
The influence of the input data, as well as the presence of compensating errors, becomes clear from the different processes simulated.
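
    The variance-based sensitivity machinery behind such a framework can be illustrated with a toy model. The sketch below implements a generic Saltelli-type pick-and-freeze estimator of first-order Sobol indices; the two-input linear test function is an assumption for illustration, not the SWAP or SHETRAN setup used in the study.

```python
import numpy as np

def sobol_first_order(f, k, n, rng):
    """Estimate first-order Sobol indices of f over k uniform [0,1] inputs
    using a Saltelli-style pick-and-freeze Monte Carlo estimator."""
    A = rng.random((n, k))
    B = rng.random((n, k))
    fA, fB = f(A), f(B)
    var = np.var(fA)
    S = np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]  # freeze input i at B's values
        S[i] = np.mean(fB * (f(ABi) - fA)) / var
    return S

# Toy model: the output variance splits 1:4 between the two inputs,
# so the analytic first-order indices are S1 = 0.2 and S2 = 0.8.
f = lambda X: X[:, 0] + 2.0 * X[:, 1]
S = sobol_first_order(f, k=2, n=100_000, rng=np.random.default_rng(0))
print(S)  # approximately [0.2, 0.8]
```

    For a distributed model, each Monte Carlo evaluation of `f` would be a full model run, which is why such analyses are usually limited to a modest number of uncertain inputs.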

  13. A framework for global river flood risk assessment

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.

    2012-04-01

    There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using (a) global forcing data sets of the current (or, in scenario mode, future) climate; (b) a global hydrological model; (c) a global flood routing model; and (d), importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. 
Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
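
    The second component described above, combining a hazard probability distribution with an impact model, amounts to computing an expected annual damage. The sketch below does this for one grid cell; the return periods, depths, depth-damage curve, and exposure value are all invented for illustration.

```python
import numpy as np

# Hypothetical hazard and exposure for one grid cell: flood depths (m) for a
# set of return periods, a simple depth-damage curve, and an asset value.
return_periods = np.array([2.0, 5.0, 10.0, 50.0, 100.0, 500.0])  # years
depths = np.array([0.0, 0.2, 0.5, 1.2, 1.8, 2.6])                # flood depth (m)
exposure = 1.0e6                                                 # asset value

def damage_fraction(depth, max_depth=3.0):
    """Piecewise-linear depth-damage curve: 0 at 0 m, 1 at max_depth and above."""
    return np.clip(depth / max_depth, 0.0, 1.0)

# Risk as expected annual damage (EAD): integrate damage over the annual
# exceedance probability p = 1/T with the trapezoidal rule.
p = 1.0 / return_periods                  # decreasing: 0.5 ... 0.002
d = damage_fraction(depths) * exposure    # increasing with return period
ead = np.sum(0.5 * (d[:-1] + d[1:]) * (p[:-1] - p[1:]))
print(f"expected annual damage: {ead:,.0f}")
```

    A global assessment repeats this calculation over every downscaled hazard cell, with exposure taken from population, land use, or GDP maps.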

  14. A Framework for Curriculum Research.

    ERIC Educational Resources Information Center

    Kimpston, Richard D.; Rogers, Karen B.

    1986-01-01

    A framework for generating curriculum research is proposed from a synthesis of Dunkin and Biddle's model of teaching variables with Beauchamp's "curriculum system" planning functions. The framework systematically defines variables that delineate curriculum planning processes. (CJH)

  15. The Predictive Relationship among the Community of Inquiry Framework, Perceived Learning and Online, and Graduate Students' Course Grades in Online Synchronous and Asynchronous Courses

    ERIC Educational Resources Information Center

    Rockinson-Szapkiw, Amanda J.; Wendt, Jillian; Wighting, Mervyn; Nisbet, Deanna

    2016-01-01

    The Community of Inquiry framework has been widely supported by research to provide a model of online learning that informs the design and implementation of distance learning courses. However, the relationship between elements of the CoI framework and perceived learning warrants further examination as a predictive model for online graduate student…

  16. A revised Self- and Family Management Framework.

    PubMed

    Grey, Margaret; Schulman-Green, Dena; Knafl, Kathleen; Reynolds, Nancy R

    2015-01-01

    Research on self- and family management of chronic conditions has advanced over the past 6 years, but the use of simple frameworks has hampered the understanding of the complexities involved. We sought to update our previously published model with new empirical, synthetic, and theoretical work. We used synthesis of previous studies to update the framework. We propose a revised framework that clarifies facilitators and barriers, processes, proximal outcomes, and distal outcomes of self- and family management and their relationships. We offer the revised framework as a model that can be used in studies aimed at advancing self- and family management science. The use of the framework to guide studies would allow for the design of studies that can address more clearly how self-management interventions work and under what conditions. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Implementing Value-Based Payment Reform: A Conceptual Framework and Case Examples.

    PubMed

    Conrad, Douglas A; Vaughn, Matthew; Grembowski, David; Marcus-Smith, Miriam

    2016-08-01

    This article develops a conceptual framework for implementation of value-based payment (VBP) reform and then draws on that framework to systematically examine six distinct multi-stakeholder coalition VBP initiatives in three different regions of the United States. The VBP initiatives deploy the following payment models: reference pricing, "shadow" primary care capitation, bundled payment, pay for performance, shared savings within accountable care organizations, and global payment. The conceptual framework synthesizes prior models of VBP implementation. It describes how context, project objectives, payment and care delivery strategies, and the barriers and facilitators to translating strategy into implementation affect VBP implementation and value for patients. We next apply the framework to six case examples of implementation, and conclude by discussing the implications of the case examples and the conceptual framework for future practice and research. © The Author(s) 2015.

  18. Prognostic residual mean flow in an ocean general circulation model and its relation to prognostic Eulerian mean flow

    DOE PAGES

    Saenz, Juan A.; Chen, Qingshan; Ringler, Todd

    2015-05-19

    Recent work has shown that taking the thickness-weighted average (TWA) of the Boussinesq equations in buoyancy coordinates results in exact equations governing the prognostic residual mean flow where eddy–mean flow interactions appear in the horizontal momentum equations as the divergence of the Eliassen–Palm flux tensor (EPFT). It has been proposed that, given the mathematical tractability of the TWA equations, the physical interpretation of the EPFT, and its relation to potential vorticity fluxes, the TWA is an appropriate framework for modeling ocean circulation with parameterized eddies. The authors test the feasibility of this proposition and investigate the connections between the TWA framework and the conventional framework used in models, where Eulerian mean flow prognostic variables are solved for. Using the TWA framework as a starting point, this study explores the well-known connections between vertical transfer of horizontal momentum by eddy form drag and eddy overturning by the bolus velocity, used by Greatbatch and Lamb and Gent and McWilliams to parameterize eddies. After implementing the TWA framework in an ocean general circulation model, we verify our analysis by comparing the flows in an idealized Southern Ocean configuration simulated using the TWA and conventional frameworks with the same mesoscale eddy parameterization.

  19. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection †

    PubMed Central

    Delaney, Declan T.; O’Hare, Gregory M. P.

    2016-01-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks. PMID:27916929

  20. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection.

    PubMed

    Delaney, Declan T; O'Hare, Gregory M P

    2016-12-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.
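
    The selection logic the framework describes, a performance model per candidate solution and a choice of the best predictor, can be sketched as follows. The linear models, feature names, and coefficients are hypothetical, not the simulation-trained models from the article.

```python
# Candidate network solutions, each with a (hypothetical) performance model
# predicting a QoS metric such as packet delivery ratio from environment
# features. The framework picks the solution whose model predicts best.
solutions = {
    "solution_A": {"bias": 0.90, "weights": {"node_density": -0.02, "interference": -0.10}},
    "solution_B": {"bias": 0.80, "weights": {"node_density": 0.01, "interference": -0.02}},
}

def predict(model, env):
    """Evaluate a simple linear performance model on environment features."""
    return model["bias"] + sum(w * env[k] for k, w in model["weights"].items())

def select_solution(env):
    """Return the name of the solution with the highest predicted metric."""
    return max(solutions, key=lambda name: predict(solutions[name], env))

# A dense, noisy deployment favours the interference-tolerant solution_B;
# a small, clean deployment favours solution_A.
print(select_solution({"node_density": 5.0, "interference": 3.0}))  # solution_B
print(select_solution({"node_density": 0.0, "interference": 0.0}))  # solution_A
```

    Harmonised testing procedures, as the article argues, would make the per-solution models directly comparable in such a selection step.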

  1. Predictive Models and Computational Toxicology (II IBAMTOX)

    EPA Science Inventory

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  2. Multimodal Speaker Diarization.

    PubMed

    Noulas, A; Englebienne, G; Krose, B J A

    2012-01-01

    We present a novel probabilistic framework that fuses information coming from the audio and video modality to perform speaker diarization. The proposed framework is a Dynamic Bayesian Network (DBN) that is an extension of a factorial Hidden Markov Model (fHMM) and models the people appearing in an audiovisual recording as multimodal entities that generate observations in the audio stream, the video stream, and the joint audiovisual space. The framework is very robust to different contexts, makes no assumptions about the location of the recording equipment, and does not require labeled training data as it acquires the model parameters using the Expectation Maximization (EM) algorithm. We apply the proposed model to two meeting videos and a news broadcast video, all of which come from publicly available data sets. The results acquired in speaker diarization are in favor of the proposed multimodal framework, which outperforms the single modality analysis results and improves over the state-of-the-art audio-based speaker diarization.

  3. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including protection of electric power, roadway, and hospital networks.

  4. A theoretical framework for psychiatric nursing practice.

    PubMed

    Onega, L L

    1991-01-01

    Traditionally, specific theoretical frameworks which are congruent with psychiatric nursing practice have been poorly articulated. The purpose of this paper is to identify and discuss a philosophical base, a theoretical framework, application to psychiatric nursing, and issues related to psychiatric nursing knowledge development and practice. A philosophical framework that is likely to be congruent with psychiatric nursing, which is based on the nature of human beings, health, psychiatric nursing and reality, is identified. Aaron Antonovsky's Salutogenic Model is discussed and applied to psychiatric nursing. This model provides a helpful way for psychiatric nurses to organize their thinking processes and ultimately improve the health care services that they offer to their clients. Goal setting and nursing interventions using this model are discussed. Additionally, application of the use of Antonovsky's model is made to nursing research areas such as hardiness, uncertainty, suffering, empathy and literary works. Finally, specific issues related to psychiatric nursing are addressed.

  5. Mechanochemical models of processive molecular motors

    NASA Astrophysics Data System (ADS)

    Lan, Ganhui; Sun, Sean X.

    2012-05-01

    Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High resolution single molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework where quantitative understanding of motor behaviour can be developed based on the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or to an ensemble of motors such as in muscle contraction. Indeed, for hundreds of myosins, our framework can be reduced to the Huxley-Simmons description of muscle movement in the mean-field limit.

  6. Modeling Complex Biological Flows in Multi-Scale Systems using the APDEC Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trebotich, D

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  7. Modeling complex biological flows in multi-scale systems using the APDEC framework

    NASA Astrophysics Data System (ADS)

    Trebotich, David

    2006-09-01

    We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA-laden fluids as "bead-rod" polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.

  8. Short-Term Global Horizontal Irradiance Forecasting Based on Sky Imaging and Pattern Recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Feng, Cong; Cui, Mingjian

    Accurate short-term forecasting is crucial for solar integration in the power grid. In this paper, a classification forecasting framework based on pattern recognition is developed for 1-hour-ahead global horizontal irradiance (GHI) forecasting. Three sets of models in the forecasting framework are trained by the data partitioned from the preprocessing analysis. The first two sets of models forecast GHI for the first four daylight hours of each day. Then the GHI values in the remaining hours are forecasted by an optimal machine learning model determined based on a weather pattern classification model in the third model set. The weather pattern is determined by a support vector machine (SVM) classifier. The developed framework is validated by the GHI and sky imaging data from the National Renewable Energy Laboratory (NREL). Results show that the developed short-term forecasting framework outperforms the persistence benchmark by 16% in terms of the normalized mean absolute error and 25% in terms of the normalized root mean square error.
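The two benchmark metrics reported above are straightforward to reproduce. A minimal sketch, assuming a naive 1-hour persistence baseline and normalization by the mean of the observed GHI (the abstract does not state its normalization constant, so that choice is an assumption):

```python
import numpy as np

def persistence_forecast(ghi):
    # naive 1-hour-ahead forecast: next hour equals the current hour
    return ghi[:-1]

def nmae(actual, forecast, norm):
    return float(np.mean(np.abs(actual - forecast)) / norm)

def nrmse(actual, forecast, norm):
    return float(np.sqrt(np.mean((actual - forecast) ** 2)) / norm)

# hypothetical hourly GHI observations (W/m^2)
ghi = np.array([100.0, 300.0, 520.0, 640.0, 610.0, 450.0, 220.0])
actual, forecast = ghi[1:], persistence_forecast(ghi)
norm = actual.mean()  # normalization constant is an assumption
skill_mae = nmae(actual, forecast, norm)
skill_rmse = nrmse(actual, forecast, norm)
```

Any candidate model beating these persistence scores by the quoted margins would reproduce the paper's headline comparison.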

  9. Stochastic and Deterministic Models for the Metastatic Emission Process: Formalisms and Crosslinks.

    PubMed

    Gomez, Christophe; Hartung, Niklas

    2018-01-01

    Although the detection of metastases radically changes prognosis of and treatment decisions for a cancer patient, clinically undetectable micrometastases hamper a consistent classification into localized or metastatic disease. This chapter discusses mathematical modeling efforts that could help to estimate the metastatic risk in such a situation. We focus on two approaches: (1) a stochastic framework describing metastatic emission events at random times, formalized via Poisson processes, and (2) a deterministic framework describing the micrometastatic state through a size-structured density function in a partial differential equation model. Three aspects are addressed in this chapter. First, a motivation for the Poisson process framework is presented and modeling hypotheses and mechanisms are introduced. Second, we extend the Poisson model to account for secondary metastatic emission. Third, we highlight an inherent crosslink between the stochastic and deterministic frameworks and discuss its implications. For increased accessibility the chapter is split into an informal presentation of the results using a minimum of mathematical formalism and a rigorous mathematical treatment for more theoretically interested readers.
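The stochastic framework can be illustrated with a minimal simulation. The sketch below draws emission times of a homogeneous Poisson process with a constant rate, a deliberate simplification: in the models discussed, the emission intensity typically depends on primary tumour size.

```python
import numpy as np

def emission_times(rate, horizon, rng):
    """Sample metastatic emission times on [0, horizon] for a homogeneous
    Poisson process: inter-emission waiting times are exponential(1/rate)."""
    times = []
    t = rng.exponential(1.0 / rate)
    while t <= horizon:
        times.append(t)
        t += rng.exponential(1.0 / rate)
    return np.array(times)

rng = np.random.default_rng(0)
# with rate 2 emissions/year over 5 years, counts average rate * horizon = 10
counts = [len(emission_times(2.0, 5.0, rng)) for _ in range(2000)]
```

Replacing the constant rate with a size-dependent intensity recovers the nonhomogeneous processes the chapter formalizes.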

  10. Framework for the Parametric System Modeling of Space Exploration Architectures

    NASA Technical Reports Server (NTRS)

    Komar, David R.; Hoffman, Jim; Olds, Aaron D.; Seal, Mike D., II

    2008-01-01

    This paper presents a methodology for performing architecture definition and assessment prior to, or during, program formulation that utilizes a centralized, integrated architecture modeling framework operated by a small, core team of general space architects. This framework, known as the Exploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), enables: 1) a significantly larger fraction of an architecture trade space to be assessed in a given study timeframe; and 2) the complex element-to-element and element-to-system relationships to be quantitatively explored earlier in the design process. Discussion of the methodology advantages and disadvantages with respect to the distributed study team approach typically used within NASA to perform architecture studies is presented along with an overview of EXAMINE s functional components and tools. An example Mars transportation system architecture model is used to demonstrate EXAMINE s capabilities in this paper. However, the framework is generally applicable for exploration architecture modeling with destinations to any celestial body in the solar system.

  11. Nowcasting Ground Magnetic Perturbations with the Space Weather Modeling Framework

    NASA Astrophysics Data System (ADS)

    Welling, D. T.; Toth, G.; Singer, H. J.; Millward, G. H.; Gombosi, T. I.

    2015-12-01

    Predicting ground-based magnetic perturbations is a critical step towards specifying and predicting geomagnetically induced currents (GICs) in high voltage transmission lines. Currently, the Space Weather Modeling Framework (SWMF), a flexible modeling framework for simulating the multi-scale space environment, is being transitioned from research to operational use (R2O) by NOAA's Space Weather Prediction Center. Upon completion of this transition, the SWMF will provide localized dB/dt predictions using real-time solar wind observations from L1 and the F10.7 proxy for EUV as model input. This presentation describes the operational SWMF setup and summarizes the changes made to the code to enable R2O progress. The framework's algorithm for calculating ground-based magnetometer observations will be reviewed. Metrics from data-model comparisons will be shown to illustrate predictive capabilities. Early data products, such as the regional-K index and grids of virtual magnetometer stations, will be presented. Finally, early successes will be shared, including the code's ability to reproduce the recent March 2015 St. Patrick's Day Storm.
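The regional-K products mentioned above reduce a time series of ground perturbations to a 0-9 quasi-logarithmic index. A minimal sketch of that reduction, using hypothetical threshold values (operational lower limits are station-specific and differ from these):

```python
import numpy as np

# hypothetical lower limits (nT) for K = 1..9; real limits vary by station
K_LIMITS = (5, 10, 20, 40, 70, 120, 200, 330, 500)

def k_index(db_horizontal, limits=K_LIMITS):
    """Map the range of horizontal perturbations in one 3-hour window to 0-9."""
    window_range = float(np.max(db_horizontal) - np.min(db_horizontal))
    return sum(window_range >= lim for lim in limits)
```

Applied to every station (real or virtual) on a grid, this yields the kind of regional index map described in the presentation.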

  12. Health level 7 development framework for medication administration.

    PubMed

    Kim, Hwa Sun; Cho, Hune

    2009-01-01

    We propose the creation of a standard data model for medication administration activities through the development of a clinical document architecture using the Health Level 7 Development Framework process based on an object-oriented analysis and the development method of Health Level 7 Version 3. Medication administration is the most common activity performed by clinical professionals in healthcare settings. A standardized information model and structured hospital information system are necessary to achieve evidence-based clinical activities. A virtual scenario is used to demonstrate the proposed method of administering medication. We used the Health Level 7 Development Framework and other tools to create the clinical document architecture, which allowed us to illustrate each step of the Health Level 7 Development Framework in the administration of medication. We generated an information model of the medication administration process as one clinical activity. It should become a fundamental conceptual model for understanding international-standard methodology by healthcare professionals and nursing practitioners with the objective of modeling healthcare information systems.

  13. TLS and photogrammetry for the modeling of a historic wooden framework

    NASA Astrophysics Data System (ADS)

    Koehl, M.; Viale, M.

    2012-04-01

    The building studied is located in the center of Andlau, France. This mansion, built in 1582, was the residence of the Lords of Andlau from the XVIth century until the French Revolution. Its architecture represents the Renaissance style of the XVIth century, in particular through its volutes and the spiral staircase inside the polygonal turret. In January 2005, the municipality of Andlau became the owner of this Seigneury, which is intended to host the future Heritage Interpretation Center (HIC); a museum is also going to be created there. Three attic levels of this building are going to be restored and insulated; the historic framework will thereby be masked and the last three levels will no longer be accessible. In this context, our lab was asked to model the framework in order to support diagnoses and to consolidate knowledge of this type of historic framework. Beyond a virtual visualization, we provided other applications, in particular an accurate 3D model of the framework for animations, as the foundation of a historical information system, and for supplying the future museum and HIC with digital data. The project contains different phases: data acquisition, model creation and data structuring, creation of an interactive model, and integration into a historic information system. All levels of the attic were acquired: a Trimble GX 3D scanner and, in part, a Trimble CX scanner were used, in particular for acquiring data in the highest part of the framework. The various scans were directly georeferenced in the field using control points, then merged into a single point cloud covering the whole structure. Several panoramic photos were also taken to create a virtual tour of the framework and the surroundings of the Seigneury.
The purpose of the project was to supply a 3D model allowing the creation of scenographies and interactive content to be integrated into an informative display. In this way, the public can easily visualize the framework, manipulate the 3D model, and discover the construction and the various parts of the historical wooden structure. The raw point cloud cannot be used for this kind of application, so an exploitable model had to be created from the data it supplies. Several parameters had to be taken into account: the level of detail of the 3D model, the time needed to model all the beams, the size of the final files and, finally, the type of texture applied. The idea was to implement a workflow reconciling these various criteria; several methods were tested. This project led to a range of solutions (3D models of the complete framework, a virtual tour, interactive 3D models, video animations) that allow an uninitiated public to take advantage of 3D material and software often reserved for professionals. The work was completed by a comparison between a theoretical model of the framework and a more detailed model of the current state, which allowed us to make diagnoses, study the movements of the structure over time, and supply important data for rehabilitation and renovation operations.
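Georeferencing individual scans before merging them, as described above, amounts to applying a rigid-body transform to each point cloud. A minimal numpy sketch with a hypothetical yaw rotation and translation (a real registration would derive these parameters from the surveyed control points):

```python
import numpy as np

def rigid_transform(points, yaw_deg, translation):
    """Rotate an N x 3 point cloud about the vertical axis, then translate."""
    a = np.radians(yaw_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T + np.asarray(translation)

scan_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])        # already georeferenced
scan_b_local = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # scanner-local frame
scan_b = rigid_transform(scan_b_local, 90.0, [1.0, 0.0, 0.0])
merged = np.vstack([scan_a, scan_b])  # single point cloud covering both scans
```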

  14. A Framework for Understanding Physics Students' Computational Modeling Practices

    NASA Astrophysics Data System (ADS)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code.
Analysis of this video data showed that the students' programming practices were highly influenced by their existing physics content knowledge, particularly their knowledge of analytic procedures. While this existing knowledge was often applied in inappropriate circumstances, the students were still able to display a considerable amount of understanding of the physics content and of analytic solution procedures. These observations could not be adequately accommodated by the existing literature of programming comprehension. In extending the resource framework to the task of computational modeling, I model students' practices in terms of three important elements. First, a knowledge base includes resources for understanding physics, math, and programming structures. Second, a mechanism for monitoring and control describes students' expectations as being directed towards numerical, analytic, qualitative or rote solution approaches, which can be influenced by the problem representation. Third, a set of solution approaches---many of which were identified in this study---describes what aspects of the knowledge base students use and how they use that knowledge to enact their expectations. This framework allows us as researchers to track student discussions and pinpoint the source of difficulties. This work opens up many avenues of potential research. First, this framework gives researchers a vocabulary for extending Resource Theory to other domains of instruction, such as modeling how physics students use graphs. Second, this framework can be used as the basis for modeling expert physicists' programming practices. Important instructional implications also follow from this research. Namely, as we broaden the use of computational modeling in the physics classroom, our instructional practices should focus on helping students understand the step-by-step nature of programming in contrast to the already salient analytic procedures.

  15. Toward a unified approach to dose-response modeling in ecotoxicology.

    PubMed

    Ritz, Christian

    2010-01-01

    This study reviews dose-response models that are used in ecotoxicology. The focus lies on clarification of differences and similarities between models, and as a side effect, their different guises in ecotoxicology are unravelled. A look at frequently used dose-response models reveals major discrepancies, among other things in naming conventions. Therefore, there is a need for a unified view on dose-response modeling in order to improve the understanding of it and to facilitate communication and comparison of findings across studies, thus realizing its full potential. This study attempts to establish a general framework that encompasses most dose-response models that are of interest to ecotoxicologists in practice. The framework includes commonly used models such as the log-logistic and Weibull models, but also features entire suites of models as found in various guidance documents. An outline on how the proposed framework can be implemented in statistical software systems is also provided.
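As an illustration of the common ground the study argues for, the four-parameter log-logistic model it mentions can be written and fitted in a few lines. A sketch with synthetic, noise-free data; parameter names follow the common b/c/d/e convention (slope, lower limit, upper limit, ED50):

```python
import numpy as np
from scipy.optimize import curve_fit

def ll4(x, b, c, d, e):
    """Four-parameter log-logistic dose-response curve."""
    return c + (d - c) / (1.0 + (x / e) ** b)

dose = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0, 50.0])
response = ll4(dose, 2.0, 0.05, 1.0, 3.0)  # synthetic, noise-free data
popt, _ = curve_fit(ll4, dose, response, p0=[1.0, 0.0, 1.0, 1.0])
```

Fixing c = 0 and d = 1 recovers a two-parameter special case, which is one way the "entire suites of models" in guidance documents arise from a single parent form.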

  16. The ASCA Model and a Multi-Tiered System of Supports: A Framework to Support Students of Color with Problem Behavior

    ERIC Educational Resources Information Center

    Belser, Christopher T.; Shillingford, M. Ann; Joe, J. Richelle

    2016-01-01

    The American School Counselor Association (ASCA) National Model and a multi-tiered system of supports (MTSS) both provide frameworks for systematically solving problems in schools, including student behavior concerns. The authors outline a model that integrates overlapping elements of the National Model and MTSS as a support for marginalized…

  17. A general framework for parametric survival analysis.

    PubMed

    Crowther, Michael J; Lambert, Paul C

    2014-12-30

    Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
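The analytic/numerical point can be seen with the simplest parametric case. For a Weibull baseline, the hazard h(t) = λγt^(γ−1) integrates in closed form to H(t) = λt^γ, so survival follows without quadrature; a sketch (with illustrative parameter values) checking the closed form against numerical integration:

```python
import numpy as np
from scipy.integrate import quad

lam, gam = 0.2, 1.5  # illustrative Weibull parameters

def hazard(t):
    return lam * gam * t ** (gam - 1.0)

def cum_hazard(t):
    return lam * t ** gam  # closed-form integral of the hazard

t = 4.0
H_numeric, _ = quad(hazard, 0.0, t)      # numerical integration for comparison
survival = np.exp(-cum_hazard(t))        # S(t) = exp(-H(t))
```

The spline-based models in the paper generalize this idea: modelling on the log hazard scale while keeping an analytic cumulative hazard beyond the boundary knots.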

  18. Parameterization models for pesticide exposure via crop consumption.

    PubMed

    Fantke, Peter; Wieland, Peter; Juraske, Ronnie; Shaddick, Gavin; Itoiz, Eva Sevigné; Friedrich, Rainer; Jolliet, Olivier

    2012-12-04

    An approach for estimating human exposure to pesticides via consumption of six important food crops is presented that can be used to extend multimedia models applied in health risk and life cycle impact assessment. We first assessed the variation of model output (pesticide residues per kg applied) as a function of model input variables (substance, crop, and environmental properties) including their possible correlations using matrix algebra. We identified five key parameters responsible for between 80% and 93% of the variation in pesticide residues, namely time between substance application and crop harvest, degradation half-lives in crops and on crop surfaces, overall residence times in soil, and substance molecular weight. Partition coefficients also play an important role for fruit trees and tomato (Kow), potato (Koc), and lettuce (Kaw, Kow). Focusing on these parameters, we develop crop-specific models by parameterizing a complex fate and exposure assessment framework. The parametric models thereby reflect the framework's physical and chemical mechanisms and predict pesticide residues in harvest using linear combinations of crop, crop surface, and soil compartments. Parametric model results correspond well with results from the complex framework for 1540 substance-crop combinations, with total deviations between a factor of 4 (potato) and a factor of 66 (lettuce). Predicted residues also correspond well with experimental data previously used to evaluate the complex framework. Pesticide mass in harvest can finally be combined with reduction factors accounting for food processing to estimate human exposure from crop consumption. All parametric models can be easily implemented into existing assessment frameworks.
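The dominant parameters identified above combine naturally in a first-order dissipation sketch: residue at harvest falls exponentially with the time between application and harvest, governed by a degradation half-life. This is a deliberate simplification of the multi-compartment framework, not the paper's parametric model itself:

```python
import math

def residue_at_harvest(applied_mass, half_life_days, days_to_harvest):
    """First-order decay between application and harvest:
    m(t) = m0 * exp(-ln(2) * t / t_half)."""
    k = math.log(2.0) / half_life_days
    return applied_mass * math.exp(-k * days_to_harvest)
```

With a 10-day half-life, for example, residue after 30 days is one eighth of the applied mass; the crop-specific models add the compartment transfers and partition-coefficient effects on top of this backbone.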

  19. Groundwater recharge and agricultural contamination

    USGS Publications Warehouse

    Böhlke, J.K.

    2002-01-01

    Agriculture has had direct and indirect effects on the rates and compositions of groundwater recharge and aquifer biogeochemistry. Direct effects include dissolution and transport of excess quantities of fertilizers and associated materials and hydrologic alterations related to irrigation and drainage. Some indirect effects include changes in water–rock reactions in soils and aquifers caused by increased concentrations of dissolved oxidants, protons, and major ions. Agricultural activities have directly or indirectly affected the concentrations of a large number of inorganic chemicals in groundwater, for example NO₃⁻, N₂, Cl, SO₄²⁻, H⁺, P, C, K, Mg, Ca, Sr, Ba, Ra, and As, as well as a wide variety of pesticides and other organic compounds. For reactive contaminants like NO₃⁻, a combination of chemical, isotopic, and environmental-tracer analytical approaches might be required to resolve changing inputs from subsequent alterations as causes of concentration gradients in groundwater. Groundwater records derived from multi-component hydrostratigraphic data can be used to quantify recharge rates and residence times of water and dissolved contaminants, document past variations in recharging contaminant loads, and identify natural contaminant-remediation processes. These data indicate that many of the world's surficial aquifers contain transient records of changing agricultural contamination from the last half of the 20th century. The transient agricultural groundwater signal has important implications for long-term trends and spatial heterogeneity in discharge.

  20. Palaeoclimate Records in Dryland Dunes: Progress and Remaining Challenges Utilizing the Unsaturated Zone for Palaeomoisture Reconstruction.

    NASA Astrophysics Data System (ADS)

    Stone, A.

    2016-12-01

    Reconstructions of past rainfall in dryland regions underpin our understanding of the links between climatic forcing and palaeohydrological response. However, only a few proxies in drylands record palaeorainfall, or palaeomoisture, in a straightforward manner. The unsaturated zone (USZ) has very significant potential as a novel dryland palaeomoisture archive. The approach is simple, based on variations in the concentration of pore-moisture tracers with depth, representing a hydrostratigraphical record through time. The tracer input is meteoric, with the concentration of this tracer established in the near-surface zone as a function of the level of evapotranspiration before that pore-moisture is transmitted vertically down to the water table. This presentation will highlight key regions where hydrostratigraphies have been successfully applied in drylands. It will also set out challenges regarding the assumptions of the approach, with the intention of stimulating discussion regarding the future development of the unsaturated zone as a palaeoclimate archive over a range of timescales and resolutions. Depending on the rate of moisture flux and the depth of the unsaturated zone, dryland hydrostratigraphies may record (i) broad climatic shifts since the last interglacial at low temporal resolution or (ii) multi-millennial length palaeomoisture records with a decadal temporal resolution. USZ hydrostratigraphies may also contain a record of changes in the amount of infiltration (and groundwater recharge) caused by changes to land use.
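One common way to turn such a tracer profile into a time axis is a chloride-mass-balance style calculation: the tracer mass stored above a depth, divided by the atmospheric deposition rate, gives the pore-water age at that depth. A minimal sketch under idealized assumptions (steady deposition, purely vertical piston flow):

```python
import numpy as np

def tracer_age_profile(depth_m, cl_mg_per_l, theta, deposition_mg_m2_yr):
    """Age at each depth = cumulative Cl stored above it / deposition rate.

    depth_m: bottom of each layer (m); cl_mg_per_l: pore-water Cl concentration;
    theta: volumetric water content; deposition: atmospheric Cl flux.
    """
    dz = np.diff(depth_m, prepend=0.0)            # layer thicknesses (m)
    mass = cl_mg_per_l * theta * dz * 1000.0      # mg m^-2 (1 m^3 = 1000 L)
    return np.cumsum(mass) / deposition_mg_m2_yr  # years
```

Peaks and troughs in concentration then map onto drier and wetter intervals along this age axis, which is the hydrostratigraphy the abstract describes.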

  1. Lithostratigraphic controls on bedding-plane fractures and the potential for discrete groundwater flow through a siliciclastic sandstone aquifer, southern Wisconsin

    NASA Astrophysics Data System (ADS)

    Swanson, Susan K.

    2007-04-01

    Outcrop-analog studies of the Upper Cambrian Tunnel City Group sandstones in southern Wisconsin show the utility of lithostratigraphic information in hydrostratigraphic studies of siliciclastic sandstone aquifers. Recent work supports the lateral continuity of discrete groundwater flow through these sandstones. Lithologic description of the Reno Member of the Lone Rock Formation (Tunnel City Group) in outcrop and core reveals repeating sequences of three dominant lithofacies, including flat-pebble intraclast conglomerate with a glauconite-rich matrix; glauconitic and feldspathic subquartzose sandstone with horizontal-planar, low-angle, and hummocky lamination; and feldspathic subquartzose sandstone with dolomite-filled burrows. The vertically stacked Reno Member sequences have been interpreted as having a storm-related origin, and they are laterally continuous on the scale of an outcrop. Horizontal fracture locations correlate with bedding planes at contacts between lithofacies. They are most commonly associated with the base of the flat-pebble intraclast conglomerate or with partings along laminae and erosional surfaces in the laminated subquartzose sandstone lithofacies. Sequences show upward increases in natural gamma radiation due to increasing potassium feldspar content. The incorporation of the detailed lithostratigraphic information allows a more accurate interpretation of borehole natural gamma logs where the rocks are buried and saturated and clarifies the role of sedimentary structures in the distribution of features that might promote discrete flow through these rocks.

  2. A new framework to increase the efficiency of large-scale solar power plants.

    NASA Astrophysics Data System (ADS)

    Alimohammadi, Shahrouz; Kleissl, Jan P.

    2015-11-01

    A new framework to estimate the spatio-temporal behavior of solar power is introduced, which predicts the statistical behavior of power output at utility-scale photovoltaic (PV) power plants. The framework is based on spatio-temporal Gaussian process regression (Kriging) models, which incorporate satellite data with the UCSD version of the Weather Research and Forecasting model. This framework is designed to improve the efficiency of large-scale solar power plants. The results are validated against measurements from local pyranometer sensors, and improvements are observed in different scenarios.
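Kriging and Gaussian process regression are the same machinery. A self-contained numpy sketch of the interpolation step with a squared-exponential kernel; the sensor coordinates and clear-sky-index values below are hypothetical, and the spatio-temporal models in the abstract add a time dimension and satellite/WRF inputs on top of this:

```python
import numpy as np

def rbf(a, b, length_scale=1.0):
    """Squared-exponential covariance between two 1-D coordinate arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale ** 2)

def gp_predict(x_obs, y_obs, x_new, length_scale=1.0, jitter=1e-10):
    K = rbf(x_obs, x_obs, length_scale) + jitter * np.eye(len(x_obs))
    k_star = rbf(x_new, x_obs, length_scale)
    weights = np.linalg.solve(K, y_obs)
    mean = k_star @ weights
    var = 1.0 - np.sum(k_star * np.linalg.solve(K, k_star.T).T, axis=1)
    return mean, np.sqrt(np.maximum(var, 0.0))

# hypothetical clear-sky index at five sensor locations (km along a transect)
x = np.array([0.0, 1.0, 2.5, 4.0, 6.0])
y = np.array([0.90, 0.85, 0.40, 0.45, 0.80])
mean, std = gp_predict(x, y, np.array([3.0]))
```

The predictive standard deviation is what makes the approach attractive for plant operations: it quantifies where the irradiance field is poorly constrained by the sensor network.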

  3. Implementation of a 3d numerical model of a folded multilayer carbonate aquifer

    NASA Astrophysics Data System (ADS)

    Di Salvo, Cristina; Guyennon, Nicolas; Romano, Emanuele; Bruna Petrangeli, Anna; Preziosi, Elisabetta

    2016-04-01

    The main objective of this research is to present a case study of the numerical model implementation of a complex carbonate, structurally folded aquifer, with a finite difference, porous equivalent model. The case study aquifer (which extends over 235 km2 in the Apennine chain, Central Italy) provides a long-term average of 3.5 m3/s of good quality groundwater to the surface river network, sustaining the minimum vital flow, and is planned to be exploited in the coming years for public water supply. In the downstream part of the river in the study area, a "Site of Community Importance" includes the Nera River for its valuable aquatic fauna. However, the possible negative effects of the foreseen exploitation on groundwater-dependent ecosystems are a great concern, and model-grounded scenarios are needed. This multilayer aquifer was conceptualized as five hydrostratigraphic units: three main aquifers (the uppermost unconfined, the central and the deepest partly confined) separated by two locally discontinuous aquitards. The Nera River cuts through the two upper aquifers and acts as the main natural sink for groundwater. An equivalent porous medium approach was chosen. The complex tectonic structure of the aquifer requires several steps in defining the conceptual model: the presence of strongly dipping layers with very heterogeneous hydraulic conductivity results in different thicknesses of saturated portions; aquifers can have both unconfined and confined zones; and drying and rewetting must be allowed when considering recharge/discharge cycles. All these characteristics can be included in the conceptual and numerical model; however, because flow and head targets are scarce, over-parameterization of the model must be avoided.
Following the principle of parsimony, three steady-state numerical models were developed, starting from a simple model and then adding complexity: 2D (single layer), QUASI-3D (with a leakage term simulating flow through aquitards) and FULL-3D (with aquitards simulated explicitly and transient flow represented by 3D governing equations). At first, steady-state simulations were run under average seasonal recharge. To overcome dry-cell problems in the FULL-3D model, the Newton-Raphson formulation for MODFLOW-2005 was invoked. Steady-state calibration was achieved mainly using annual average flow at four springs along the Nera River streambed and average water-level data available from only two observation wells. Results show that a FULL-3D zoned model was required to match the observed distribution of river base flow. The FULL-3D model was then run in transient conditions (1990-2013) using monthly spatially distributed recharge estimated with the Thornthwaite-Mather method based on 60 years of climate data. The monitored flow of one spring used for public water supply served as proxy data to reconstruct the Nera River hydrograph; the proxy-based hydrograph was used to calibrate storage coefficients and to further adjust model parameters. Once calibrated, the model was run under different aquifer management scenarios (i.e., pumping wells planned to be active for water supply), and the related risks of spring discharge depletion and altered groundwater-surface water interaction were evaluated.
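The Thornthwaite-Mather recharge estimate mentioned above reduces to a monthly bucket model: precipitation minus potential evapotranspiration fills a soil store, and surplus beyond the store's capacity becomes recharge. A deliberately simplified sketch (the full method also tapers actual evapotranspiration as the soil dries, which is omitted here):

```python
def monthly_recharge(precip_mm, pet_mm, capacity_mm, store_mm=0.0):
    """Monthly soil-water bucket: surplus beyond capacity becomes recharge."""
    recharge = []
    for p, e in zip(precip_mm, pet_mm):
        store_mm = max(store_mm + p - e, 0.0)        # crude floor when drying
        surplus = max(store_mm - capacity_mm, 0.0)   # water the soil cannot hold
        recharge.append(surplus)
        store_mm -= surplus                          # store capped at capacity
    return recharge
```

Distributing such a balance over climate grid cells gives the spatially distributed monthly recharge used to drive the transient model.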

  4. A framework for global river flood risk assessments

    NASA Astrophysics Data System (ADS)

    Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.

    2012-08-01

    There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
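The risk indicator named above, annual expected damage, is obtained by integrating the damage-probability curve. A minimal sketch using trapezoidal integration over annual exceedance probabilities derived from return periods; the return periods and damage values below are hypothetical:

```python
import numpy as np

def expected_annual_damage(return_periods_yr, damages):
    """Integrate damage over annual exceedance probability p = 1/T."""
    p = 1.0 / np.asarray(return_periods_yr, dtype=float)
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                 # integrate from small to large p
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# hypothetical damages (million USD) at the 2-, 10-, and 100-year floods
ead = expected_annual_damage([2, 10, 100], [0.0, 100.0, 500.0])
```

Repeating the same integration with affected population or affected GDP in place of damage yields the other risk indicators the framework reports.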

  5. An examination of the spatial variability of the United States surface water balance using the Budyko relationship for current and projected climates

    NASA Astrophysics Data System (ADS)

    Ficklin, D. L.; Abatzoglou, J. T.

    2017-12-01

    The spatial variability in the balance between surface runoff (Q) and evapotranspiration (ET) is critical for understanding water availability. The Budyko framework suggests that this balance is solely a function of aridity. Observed deviations from this framework for individual watersheds, however, can vary significantly, resulting in uncertainty in using the Budyko framework in ungauged catchments and under future climate and land use scenarios. Here, we model the spatial variability in the partitioning of precipitation into Q and ET using a set of climatic, physiographic, and vegetation metrics for 211 near-natural watersheds across the contiguous United States (CONUS) within Budyko's framework through the free parameter ω. Using a generalized additive model, we found that precipitation seasonality, the ratio of soil water holding capacity to precipitation, topographic slope, and the fraction of precipitation falling as snow explained 81.2% of the variability in ω. This ω model applied to the Budyko framework explained 97% of the spatial variability in long-term Q for an independent set of near-natural watersheds. The developed ω model was also used to estimate the entire CONUS surface water balance for both contemporary and mid-21st century conditions. The contemporary CONUS surface water balance compared favorably to more sophisticated land-surface modeling efforts. For mid-21st century conditions, the model simulated an increase in the fraction of precipitation used by ET across the CONUS with declines in Q for much of the eastern CONUS and mountainous watersheds across the western US. The Budyko framework using the modeled ω lends itself to an alternative approach for assessing the potential response of catchment water balance to climate change to complement other approaches.
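The one-parameter Budyko curve referred to here is commonly written in Fu's form, ET/P = 1 + φ − (1 + φ^ω)^(1/ω) with aridity φ = PET/P. A minimal sketch of that curve and the resulting runoff follows; the precipitation and PET values are illustrative, and this is the standard curve form, not the paper's fitted ω model:

```python
def fu_et_ratio(aridity, omega):
    """Fu's one-parameter Budyko curve: ET/P as a function of
    aridity = PET/P and the free parameter omega (omega > 1)."""
    return 1.0 + aridity - (1.0 + aridity ** omega) ** (1.0 / omega)

def runoff(p, pet, omega):
    """Long-term mean annual runoff Q = P - ET under Fu's equation."""
    return p * (1.0 - fu_et_ratio(pet / p, omega))

# Illustrative values (not from the paper): P = 800 mm/yr, PET = 800 mm/yr.
print(round(runoff(800.0, 800.0, 2.0), 1))  # about 331.4 mm/yr
```

As ω grows, the curve approaches the water- and energy-limit envelope (ET tends to min(P, PET)), which is why ω absorbs the watershed characteristics the paper models with climate, soil, slope, and snow predictors.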

  6. Stochastic filtering for damage identification through nonlinear structural finite element model updating

    NASA Astrophysics Data System (ADS)

    Astroza, Rodrigo; Ebrahimian, Hamed; Conte, Joel P.

    2015-03-01

This paper describes a novel framework that combines advanced mechanics-based nonlinear (hysteretic) finite element (FE) models and stochastic filtering techniques to estimate unknown time-invariant parameters of nonlinear inelastic material models used in the FE model. Using input-output data recorded during earthquake events, the proposed framework updates the nonlinear FE model of the structure. The updated FE model can be directly used for damage identification and further used for damage prognosis. To update the unknown time-invariant parameters of the FE model, two alternative stochastic filtering methods are used: the extended Kalman filter (EKF) and the unscented Kalman filter (UKF). A three-dimensional, 5-story, 2-by-1 bay reinforced concrete (RC) frame is used to verify the proposed framework. The RC frame is modeled using fiber-section displacement-based beam-column elements with distributed plasticity and is subjected to the ground motion recorded at the Sylmar station during the 1994 Northridge earthquake. The results indicate that the proposed framework accurately estimates the unknown material parameters of the nonlinear FE model. The UKF outperforms the EKF when the relative root-mean-square errors of the recorded responses are compared. In addition, the results suggest that the convergence of the estimate of modeling parameters is smoother and faster when the UKF is utilized.
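The recursive filtering update can be illustrated in a deliberately minimal scalar, linear special case (estimating a time-invariant stiffness-like parameter theta from measurements y_k = theta * x_k), far simpler than the paper's nonlinear FE setting; all numbers are illustrative:

```python
# Minimal sketch of Kalman-type recursive estimation of a time-invariant
# parameter theta.  The measurement here is linear in theta, so H is exact;
# in the EKF of the paper, H would be the Jacobian of a nonlinear FE response.
true_theta = 2.5          # "unknown" stiffness to be recovered (illustrative)
theta, P = 1.0, 10.0      # initial estimate and its variance
R = 0.01                  # measurement-noise variance

for k in range(1, 21):
    x = 0.1 * k                          # known input (e.g. displacement)
    y = true_theta * x                   # measurement (noise-free for clarity)
    H = x                                # d(y)/d(theta)
    K = P * H / (H * P * H + R)          # Kalman gain
    theta = theta + K * (y - theta * x)  # innovation update
    P = (1.0 - K * H) * P                # covariance update

print(round(theta, 3))  # approaches the true value 2.5
```

Each cycle shrinks the parameter covariance P, so later measurements correct the estimate less, mirroring the convergence behavior the paper compares between EKF and UKF.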

  7. Frameworks for change in healthcare organisations: a formative evaluation of the NHS Change Model.

    PubMed

    Martin, Graham P; Sutton, Elizabeth; Willars, Janet; Dixon-Woods, Mary

    2013-08-01

Organisational change in complex healthcare systems is a multifaceted process. The English National Health Service recently introduced a 'Change Model' that seeks to offer an evidence-based framework for guiding change. We report findings from a formative evaluation of the NHS Change Model and make recommendations for those developing the Model and its users. The evaluation involved 28 interviews with managers and clinicians making use of the Change Model in relation to a variety of projects. Interviews were fully transcribed and were analysed using an approach based on the Framework method. Participants saw the Change Model as valuable and practically useful. Fidelity to core principles of the Model was variable: participants often altered the Model, especially when using it to orchestrate the work of others. In challenging organisational contexts, the Change Model was sometimes used to delegitimise opposition rather than identify shared purpose among different interest groups. Those guiding change may benefit from frameworks, guidance and toolkits to structure and inform their planning and activities. Participants' experiences suggested the Change Model has much potential. Further work on its design and on supporting materials may optimise the approach, but its utility rests in particular on organisational cultures that support faithful application. © The Author(s) 2013. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  8. A new framework to enhance the interpretation of external validation studies of clinical prediction models.

    PubMed

    Debray, Thomas P A; Vergouwe, Yvonne; Koffijberg, Hendrik; Nieboer, Daan; Steyerberg, Ewout W; Moons, Karel G M

    2015-03-01

It is widely acknowledged that the performance of diagnostic and prognostic prediction models should be assessed in external validation studies with independent data from samples that are "different but related" to the development sample. We developed a framework of methodological steps and statistical methods for analyzing and enhancing the interpretation of results from external validation studies of prediction models. We propose to quantify the degree of relatedness between development and validation samples on a scale ranging from reproducibility to transportability by evaluating their corresponding case-mix differences. We subsequently assess the models' performance in the validation sample and interpret the performance in view of the case-mix differences. Finally, we may adjust the model to the validation setting. We illustrate this three-step framework with a prediction model for diagnosing deep venous thrombosis using three validation samples with varying case mix. While one external validation sample merely assessed the model's reproducibility, two other samples assessed model transportability. The performance in all validation samples was adequate, and the model did not require extensive updating to correct for miscalibration or poor fit to the validation settings. The proposed framework enhances the interpretation of findings at external validation of prediction models. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  9. A dynamic water-quality modeling framework for the Neuse River estuary, North Carolina

    USGS Publications Warehouse

    Bales, Jerad D.; Robbins, Jeanne C.

    1999-01-01

As a result of fish kills in the Neuse River estuary in 1995, nutrient reduction strategies were developed for point and nonpoint sources in the basin. However, because of the interannual variability in the natural system and the resulting complex hydrologic-nutrient interactions, it is difficult to detect through a short-term observational program the effects of management activities on Neuse River estuary water quality and aquatic health. A properly constructed water-quality model can be used to evaluate some of the potential effects of management actions on estuarine water quality. Such a model can be used to predict estuarine response to present and proposed nutrient strategies under the same set of meteorological and hydrologic conditions, thus removing the vagaries of weather and streamflow from the analysis. A two-dimensional, laterally averaged hydrodynamic and water-quality modeling framework was developed for the Neuse River estuary by using previously collected data. Development of the modeling framework consisted of (1) computational grid development, (2) assembly of data for model boundary conditions and model testing, (3) selection of initial values of model parameters, and (4) limited model testing. The model domain extends from Streets Ferry to Oriental, N.C., and includes seven lateral embayments that have continual exchange with the mainstem of the estuary, three point-source discharges, and three tributary streams. Thirty-five computational segments represent the mainstem of the estuary, and the entire framework contains a total of 60 computational segments. Each computational cell is 0.5 meter thick; segment lengths range from 500 meters to 7,125 meters. Data that were used to develop the modeling framework were collected during March through October 1991 and represent the most comprehensive data set available prior to 1997.
Most of the data were collected by the North Carolina Division of Water Quality, the University of North Carolina Institute of Marine Sciences, and the U.S. Geological Survey. Limitations in the modeling framework were clearly identified. These limitations formed the basis for a set of suggestions to refine the Neuse River estuary water-quality model.

  10. Argumentation in Science Education: A Model-Based Framework

    ERIC Educational Resources Information Center

    Bottcher, Florian; Meisert, Anke

    2011-01-01

    The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons…

  11. Technical Assistance Model for Long-Term Systems Change: Three State Examples

    ERIC Educational Resources Information Center

    Kasprzak, Christina; Hurth, Joicey; Lucas, Anne; Marshall, Jacqueline; Terrell, Adriane; Jones, Elizabeth

    2010-01-01

    The National Early Childhood Technical Assistance Center (NECTAC) Technical Assistance (TA) Model for Long-Term Systems Change (LTSC) is grounded in conceptual frameworks in the literature on systems change and systems thinking. The NECTAC conceptual framework uses a logic model approach to change developed specifically for states' infant and…

  12. Characteristics and Conceptual Framework of the Easy-Play Model

    ERIC Educational Resources Information Center

    Lu, Chunlei; Steele, Kyle

    2014-01-01

    The Easy-Play Model offers a defined framework to organize games that promote an inclusive and enjoyable sport experience. The model can be implemented by participants playing sports in educational, recreational or social contexts with the goal of achieving an active lifestyle in an inclusive, cooperative and enjoyable environment. The Easy-Play…

  13. HEAVY-DUTY DIESEL VEHICLE MODAL EMISSION MODEL (HDDV-MEM): VOLUME I: MODAL EMISSION MODELING FRAMEWORK; VOLUME II: MODAL COMPONENTS AND OUTPUTS

    EPA Science Inventory

    This research outlines a proposed Heavy-Duty Diesel Vehicle Modal Emission Modeling Framework (HDDV-MEMF) for heavy-duty diesel-powered trucks and buses. The heavy-duty vehicle modal modules being developed under this research effort, although different, should be compatible wi...

  14. Learning situation models in a smart home.

    PubMed

    Brdiczka, Oliver; Crowley, James L; Reignier, Patrick

    2009-02-01

This paper addresses the problem of learning situation models for providing context-aware services. Context for modeling human behavior in a smart environment is represented by a situation model describing environment, users, and their activities. A framework for acquiring and evolving different layers of a situation model in a smart environment is proposed. Different learning methods are presented as part of this framework: role detection per entity, unsupervised extraction of situations from multimodal data, supervised learning of situation representations, and evolution of a predefined situation model with feedback. The situation model serves as a frame and support for the different methods, permitting the approach to remain within an intuitive declarative framework. The proposed methods have been integrated into a complete system for a smart home environment. The implementation is detailed, and two evaluations are conducted in the smart home environment. The obtained results validate the proposed approach.

  15. Hierarchical Bayesian Modeling of Fluid-Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.

    2017-11-01

In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We tested the framework on the Basel 2006 fluid-induced seismicity case study, showing that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
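The core rate model described, a nonhomogeneous Poisson process whose event rate is proportional to the fluid-injection rate, can be illustrated as follows. The proportionality constant and the injection profile are invented for illustration, not values from the study:

```python
import math

# Sketch of a nonhomogeneous Poisson model for induced seismicity:
#   lambda(t) = a * vdot(t),  so  E[N] = a * (total injected volume).
# The constant a and the daily injection profile are illustrative only.
a = 0.2                                  # events per m3 injected (illustrative)
vdot = [10.0, 20.0, 30.0, 20.0, 10.0]    # injection rate (m3/day)

expected_n = a * sum(vdot)               # integral of the rate over time

def poisson_pmf(n, mu):
    """Probability of observing exactly n events under a Poisson(mu) count."""
    return math.exp(-mu) * mu ** n / math.factorial(n)

print(expected_n)  # 18.0 expected events over the injection period
print(poisson_pmf(15, expected_n) > poisson_pmf(30, expected_n))  # True
```

In the hierarchical treatment of the paper, a itself is a random variable with a prior, so forecasts integrate over its posterior rather than plugging in a point value as this sketch does.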

  16. Introduction to the special section on mixture modeling in personality assessment.

    PubMed

    Wright, Aidan G C; Hallquist, Michael N

    2014-01-01

    Latent variable models offer a conceptual and statistical framework for evaluating the underlying structure of psychological constructs, including personality and psychopathology. Complex structures that combine or compare categorical and dimensional latent variables can be accommodated using mixture modeling approaches, which provide a powerful framework for testing nuanced theories about psychological structure. This special series includes introductory primers on cross-sectional and longitudinal mixture modeling, in addition to empirical examples applying these techniques to real-world data collected in clinical settings. This group of articles is designed to introduce personality assessment scientists and practitioners to a general latent variable framework that we hope will stimulate new research and application of mixture models to the assessment of personality and its pathology.
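The core mixture-modeling idea can be shown in a didactic sketch: a two-component, one-dimensional Gaussian mixture fit by expectation-maximization. This is a toy stand-in, not the dedicated latent-variable software typically used in this literature, and the data are invented:

```python
import math

def em_gmm_1d(x, iters=50):
    """Tiny 1-D, two-component Gaussian mixture fit by EM (didactic sketch)."""
    mu = [min(x), max(x)]        # crude initialization at the data extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for xi in x:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: update weights, means, and variances.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = sum(r[k] * (xi - mu[k]) ** 2
                         for r, xi in zip(resp, x)) / nk + 1e-6
    return pi, mu, var

# Two well-separated "subpopulations" (illustrative data).
data = [0.1, -0.2, 0.3, 0.0, -0.1, 9.8, 10.1, 10.2, 9.9, 10.0]
pi, mu, var = em_gmm_1d(data)
print([round(m, 1) for m in sorted(mu)])  # component means near 0 and 10
```

The E-step's responsibilities are the latent class-membership probabilities that, in personality assessment applications, identify which subpopulation a respondent most plausibly belongs to.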

  17. A Framework to Manage Information Models

    NASA Astrophysics Data System (ADS)

    Hughes, J. S.; King, T.; Crichton, D.; Walker, R.; Roberts, A.; Thieman, J.

    2008-05-01

    The Information Model is the foundation on which an Information System is built. It defines the entities to be processed, their attributes, and the relationships that add meaning. The development and subsequent management of the Information Model is the single most significant factor for the development of a successful information system. A framework of tools has been developed that supports the management of an information model with the rigor typically afforded to software development. This framework provides for evolutionary and collaborative development independent of system implementation choices. Once captured, the modeling information can be exported to common languages for the generation of documentation, application databases, and software code that supports both traditional and semantic web applications. This framework is being successfully used for several science information modeling projects including those for the Planetary Data System (PDS), the International Planetary Data Alliance (IPDA), the National Cancer Institute's Early Detection Research Network (EDRN), and several Consultative Committee for Space Data Systems (CCSDS) projects. The objective of the Space Physics Archive Search and Exchange (SPASE) program is to promote collaboration and coordination of archiving activity for the Space Plasma Physics community and ensure the compatibility of the architectures used for a global distributed system and the individual data centers. Over the past several years, the SPASE data model working group has made great progress in developing the SPASE Data Model and supporting artifacts including a data dictionary, XML Schema, and two ontologies. The authors have captured the SPASE Information Model in this framework. This allows the generation of documentation that presents the SPASE Information Model in object-oriented notation including UML class diagrams and class hierarchies. 
The modeling information can also be exported to semantic web languages such as OWL and RDF and written to XML Metadata Interchange (XMI) files for import into UML tools.

  18. Metadata mapping and reuse in caBIG.

    PubMed

    Kunz, Isaac; Lin, Ming-Chin; Frey, Lewis

    2009-02-05

This paper proposes that interoperability across biomedical databases can be improved by utilizing a repository of Common Data Elements (CDEs), UML model class-attributes and simple lexical algorithms to facilitate the building of domain models. This is examined in the context of an existing system, the National Cancer Institute (NCI)'s cancer Biomedical Informatics Grid (caBIG). The goal is to demonstrate the deployment of open source tools that can be used to effectively map models and enable the reuse of existing information objects and CDEs in the development of new models for translational research applications. This effort is intended to help developers reuse appropriate CDEs to enable interoperability of their systems when developing within the caBIG framework or other frameworks that use metadata repositories. The Dice (di-gram) and Dynamic algorithms are compared, and both show similar performance in matching UML model class-attributes to CDE class object-property pairs. With the algorithms used, the baselines for automatically finding the matches are reasonable for the data models examined. This suggests that automatic mapping of UML models and CDEs is feasible within the caBIG framework and potentially any framework that uses a metadata repository. This work opens up the possibility of using mapping algorithms to reduce cost and time required to map local data models to a reference data model such as those used within caBIG. This effort contributes to facilitating the development of interoperable systems within caBIG as well as other metadata frameworks. Such efforts are critical to address the need to develop systems to handle enormous amounts of diverse data that can be leveraged from new biomedical methodologies.
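A Dice coefficient over character di-grams, the kind of simple lexical measure named above, can be sketched as follows (the attribute and CDE names in the usage line are invented):

```python
def dice_similarity(a, b):
    """Dice coefficient over character bigrams (di-grams): a simple lexical
    similarity usable for matching model attribute names to CDE names."""
    bigrams = lambda s: {s[i:i + 2] for i in range(len(s) - 1)}
    x, y = bigrams(a.lower()), bigrams(b.lower())
    if not x or not y:
        return 0.0
    return 2.0 * len(x & y) / (len(x) + len(y))

# E.g. scoring a UML attribute name against a candidate CDE name
# (both names are hypothetical):
print(dice_similarity("patientAge", "patient_age_value"))  # 0.64
```

Thresholding such scores gives a cheap first-pass candidate list for human review, which is roughly the role simple lexical algorithms play in metadata mapping pipelines.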

  19. Modelling biological behaviours with the unified modelling language: an immunological case study and critique.

    PubMed

    Read, Mark; Andrews, Paul S; Timmis, Jon; Kumar, Vipin

    2014-10-06

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology.

  20. An interdisciplinary framework for participatory modeling design and evaluation—What makes models effective participatory decision tools?

    NASA Astrophysics Data System (ADS)

    Falconi, Stefanie M.; Palmer, Richard N.

    2017-02-01

Increased requirements for public involvement in water resources management (WRM) over the past century have stimulated the development of more collaborative decision-making methods. Participatory modeling (PM) uses computer models to inform and engage stakeholders in the planning process in order to influence collaborative decisions in WRM. Past evaluations of participatory models focused on process and final outcomes, yet were hindered by diversity of purpose and inconsistent documentation. This paper presents a two-stage framework for evaluating PM based on mechanisms for improving model effectiveness as participatory tools. Five dimensions characterize the "who, when, how, and why" of each participatory effort (stage 1). Models are evaluated as "boundary objects," a concept used to describe tools that bridge understanding and translate different bodies of knowledge to improve credibility, salience, and legitimacy (stage 2). This evaluation framework is applied to five existing case studies from the literature. Though the goals of participation can be diverse, the novel contribution of the proposed two-stage framework is its flexibility in evaluating a wide range of cases that differ in scope, modeling approach, and participatory context. Also, the evaluation criteria provide a structured vocabulary based on clear mechanisms that extend beyond previous process-based and outcome-based evaluations. Effective models are those that take advantage of mechanisms that facilitate dialogue and resolution and improve the accessibility and applicability of technical knowledge. Furthermore, the framework can help build more complete records and systematic documentation of evidence to help standardize the field of PM.

  1. Modelling biological behaviours with the unified modelling language: an immunological case study and critique

    PubMed Central

    Read, Mark; Andrews, Paul S.; Timmis, Jon; Kumar, Vipin

    2014-01-01

    We present a framework to assist the diagrammatic modelling of complex biological systems using the unified modelling language (UML). The framework comprises three levels of modelling, ranging in scope from the dynamics of individual model entities to system-level emergent properties. By way of an immunological case study of the mouse disease experimental autoimmune encephalomyelitis, we show how the framework can be used to produce models that capture and communicate the biological system, detailing how biological entities, interactions and behaviours lead to higher-level emergent properties observed in the real world. We demonstrate how the UML can be successfully applied within our framework, and provide a critique of UML's ability to capture concepts fundamental to immunology and biology more generally. We show how specialized, well-explained diagrams with less formal semantics can be used where no suitable UML formalism exists. We highlight UML's lack of expressive ability concerning cyclic feedbacks in cellular networks, and the compounding concurrency arising from huge numbers of stochastic, interacting agents. To compensate for this, we propose several additional relationships for expressing these concepts in UML's activity diagram. We also demonstrate the ambiguous nature of class diagrams when applied to complex biology, and question their utility in modelling such dynamic systems. Models created through our framework are non-executable, and expressly free of simulation implementation concerns. They are a valuable complement and precursor to simulation specifications and implementations, focusing purely on thoroughly exploring the biology, recording hypotheses and assumptions, and serve as a communication medium detailing exactly how a simulation relates to the real biology. PMID:25142524

  2. A Formal Theory for Modular ERDF Ontologies

    NASA Astrophysics Data System (ADS)

    Analyti, Anastasia; Antoniou, Grigoris; Damásio, Carlos Viegas

The success of the Semantic Web is impossible without some form of modularity, encapsulation, and access control. In an earlier paper, we extended RDF graphs with weak and strong negation, as well as derivation rules. The ERDF #n-stable model semantics of the extended RDF framework (ERDF) is defined, extending RDF(S) semantics. In this paper, we propose a framework for modular ERDF ontologies, called the modular ERDF framework, which enables collaborative reasoning over a set of ERDF ontologies, while support for hidden knowledge is also provided. In particular, the modular ERDF stable model semantics of modular ERDF ontologies is defined, extending the ERDF #n-stable model semantics. Our proposed framework supports local semantics and different points of view, local closed-world and open-world assumptions, and scoped negation-as-failure. Several complexity results are provided.

  3. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  4. Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification

    DTIC Science & Technology

    2014-09-18

Conceptual Modeling of a Quantum Key Distribution Simulation Framework Using the Discrete Event System Specification. DISSERTATION. Jeffrey D. Morris. Presented to the Faculty, Department of Systems...

  5. Exploring Conceptual Frameworks of Models of Atomic Structures and Periodic Variations, Chemical Bonding, and Molecular Shape and Polarity: A Comparison of Undergraduate General Chemistry Students with High and Low Levels of Content Knowledge

    ERIC Educational Resources Information Center

    Wang, Chia-Yu; Barrow, Lloyd H.

    2013-01-01

    The purpose of the study was to explore students' conceptual frameworks of models of atomic structure and periodic variations, chemical bonding, and molecular shape and polarity, and how these conceptual frameworks influence their quality of explanations and ability to shift among chemical representations. This study employed a purposeful sampling…

  6. Using framework-based synthesis for conducting reviews of qualitative studies.

    PubMed

    Dixon-Woods, Mary

    2011-04-14

    Framework analysis is a technique used for data analysis in primary qualitative research. Recent years have seen its being adapted to conduct syntheses of qualitative studies. Framework-based synthesis shows considerable promise in addressing applied policy questions. An innovation in the approach, known as 'best fit' framework synthesis, has been published in BMC Medical Research Methodology this month. It involves reviewers in choosing a conceptual model likely to be suitable for the question of the review, and using it as the basis of their initial coding framework. This framework is then modified in response to the evidence reported in the studies in the reviews, so that the final product is a revised framework that may include both modified factors and new factors that were not anticipated in the original model. 'Best fit' framework-based synthesis may be especially suitable in addressing urgent policy questions where the need for a more fully developed synthesis is balanced by the need for a quick answer. Please see related article: http://www.biomedcentral.com/1471-2288/11/29.

  7. Creating an outcomes framework.

    PubMed

    Doerge, J B

    2000-01-01

Four constructs used to build a framework for outcomes management for a large midwestern tertiary hospital are described in this article. A system framework outlining a model of clinical integration and population management based on Steven Shortell's work is discussed. This framework includes key definitions of high-risk patients, target groups, populations and community. Roles for each level of population management and how they were implemented in the health care system are described. A point-of-service framework centered on seven dimensions of care is the next construct, applied to each nursing unit. The third construct outlines the framework for role development. Three roles for nursing were created to implement strategies for target groups that are strategic disease categories; two of those roles are described in depth. The philosophy of nursing practice is centered on caring and existential advocacy. The final construct is the modification of the Dartmouth model as a common framework for outcomes. System applications of the scorecard and lessons learned in the 2-year process of implementation are shared.

  8. A system for environmental model coupling and code reuse: The Great Rivers Project

    NASA Astrophysics Data System (ADS)

    Eckman, B.; Rice, J.; Treinish, L.; Barford, C.

    2008-12-01

    As part of the Great Rivers Project, IBM is collaborating with The Nature Conservancy and the Center for Sustainability and the Global Environment (SAGE) at the University of Wisconsin, Madison to build a Modeling Framework and Decision Support System (DSS) designed to help policy makers and a variety of stakeholders (farmers, fish & wildlife managers, hydropower operators, et al.) to assess, come to consensus, and act on land use decisions representing effective compromises between human use and ecosystem preservation/restoration. Initially focused on Brazil's Paraguay-Parana, China's Yangtze, and the Mississippi Basin in the US, the DSS integrates data and models from a wide variety of environmental sectors, including water balance, water quality, carbon balance, crop production, hydropower, and biodiversity. In this presentation we focus on the modeling framework aspect of this project. In our approach to these and other environmental modeling projects, we see a flexible, extensible modeling framework infrastructure for defining and running multi-step analytic simulations as critical. In this framework, we divide monolithic models into atomic components with clearly defined semantics encoded via rich metadata representation. Once models and their semantics and composition rules have been registered with the system by their authors or other experts, non-expert users may construct simulations as workflows of these atomic model components. A model composition engine enforces rules/constraints for composing model components into simulations, to avoid the creation of Frankenmodels, models that execute but produce scientifically invalid results. A common software environment and common representations of data and models are required, as well as an adapter strategy for code written in e.g., Fortran or python, that still enables efficient simulation runs, including parallelization. 
Since each new simulation, as a new composition of model components, requires calibration of parameters (fudge factors) to produce scientifically valid results, we are also developing an autocalibration engine. Finally, visualization is a key element of this modeling framework strategy, both to convey complex scientific data effectively, and also to enable non-expert users to make full use of the relevant features of the framework. We are developing a visualization environment with a strong data model, to enable visualizations, model results, and data all to be handled similarly.
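
    The composition-rule idea above can be sketched minimally: each atomic component declares the semantic types it consumes and produces, and the engine refuses chains whose semantics do not line up. All names and types below are illustrative assumptions, not the IBM framework's actual API.

```python
# Hypothetical sketch of a model-composition rule check: components carry
# typed inputs/outputs, and a chain is allowed only when semantics match,
# preventing "Frankenmodels" that run but are scientifically invalid.
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    inputs: frozenset   # semantic types consumed, e.g. {"precip_mm_day"}
    outputs: frozenset  # semantic types produced

def can_chain(upstream, downstream):
    """Downstream is runnable only if every input it needs is produced upstream."""
    return downstream.inputs <= upstream.outputs

water_balance = Component("water_balance",
                          frozenset({"precip_mm_day"}),
                          frozenset({"runoff_mm_day", "soil_moisture"}))
water_quality = Component("water_quality",
                          frozenset({"runoff_mm_day"}),
                          frozenset({"nitrate_load"}))
crop_model = Component("crop_model",
                       frozenset({"soil_temperature"}),
                       frozenset({"yield_t_ha"}))
```

A real engine would also check units, grids, and time steps; the subset test stands in for all of those rules here.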

  9. SMILI☺: A Framework for Interfaces to Learning Data in Open Learner Models, Learning Analytics and Related Fields

    ERIC Educational Resources Information Center

    Bull, Susan; Kay, Judy

    2016-01-01

    The SMILI☺ (Student Models that Invite the Learner In) Open Learner Model Framework was created to provide a coherent picture of the many and diverse forms of Open Learner Models (OLMs). The aim was for SMILI☺ to provide researchers with a systematic way to describe, compare and critique OLMs. We expected it to highlight those areas where there…

  10. Sequential optimization of a terrestrial biosphere model constrained by multiple satellite based products

    NASA Astrophysics Data System (ADS)

    Ichii, K.; Kondo, M.; Wang, W.; Hashimoto, H.; Nemani, R. R.

    2012-12-01

    Various satellite-based spatial products such as evapotranspiration (ET) and gross primary productivity (GPP) are now produced by integration of ground and satellite observations. Effective use of these multiple satellite-based products in terrestrial biosphere models is an important step toward better understanding of terrestrial carbon and water cycles. However, due to the complexity of terrestrial biosphere models with a large number of model parameters, the application of these spatial data sets in terrestrial biosphere models is difficult. In this study, we established an effective but simple framework to refine a terrestrial biosphere model, Biome-BGC, using multiple satellite-based products as constraints. We tested the framework in the monsoon Asia region covered by AsiaFlux observations. The framework is based on hierarchical analysis (Wang et al. 2009) with model parameter optimization constrained by satellite-based spatial data. The Biome-BGC model is separated into several tiers to minimize the freedom of model parameter selection and maximize independence from the whole model. For example, the snow sub-model is first optimized using the MODIS snow cover product, followed by the soil water sub-model optimized using satellite-based ET (estimated by an empirical upscaling method, the Support Vector Regression (SVR) method; Yang et al. 2007), the photosynthesis model optimized using satellite-based GPP (also based on the SVR method), and the respiration and residual carbon cycle models optimized using biomass data. In an initial assessment, we found that most of the default sub-models (e.g. snow, water cycle and carbon cycle) showed large deviations from remote sensing observations. However, these biases were removed by applying the proposed framework. For example, gross primary productivity was initially underestimated in boreal and temperate forests and overestimated in tropical forests. However, the parameter optimization scheme successfully reduced these biases.
Our analysis shows that terrestrial carbon and water cycle simulations in monsoon Asia were greatly improved, and the use of multiple satellite observations with this framework is an effective way for improving terrestrial biosphere models.
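
    The tiered calibration idea above can be sketched in a few lines: each sub-model's parameters are fitted against one observational product while earlier tiers are held fixed. The toy snow sub-model, the grid search, and the synthetic "satellite" series are illustrative stand-ins; Biome-BGC's sub-models and the MODIS/SVR products are far richer.

```python
# Minimal sketch of tiered (hierarchical) sub-model calibration:
# fit one sub-model's parameter to one observed product at a time.
def snow_model(melt_rate, temps, swe0=100.0):
    """Toy snow sub-model: melt snow-water equivalent on warm days."""
    swe, out = swe0, []
    for t in temps:
        swe = max(swe - melt_rate * max(t, 0.0), 0.0)
        out.append(swe)
    return out

def fit_param(candidates, simulate, observed):
    """Grid-search the parameter minimizing the sum of squared errors."""
    def sse(p):
        return sum((s - o) ** 2 for s, o in zip(simulate(p), observed))
    return min(candidates, key=sse)

temps = [2.0, 5.0, 8.0, 6.0, 3.0]
obs_swe = snow_model(3.0, temps)   # pretend this is the satellite snow product
best = fit_param([1.0, 2.0, 3.0, 4.0],
                 lambda p: snow_model(p, temps), obs_swe)
```

With the snow tier fixed at `best`, the next tier (e.g. soil water against ET) would be fitted the same way, which is what keeps each tier's degrees of freedom small.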

  11. Carbon dioxide capture using covalent organic frameworks (COFs) type material-a theoretical investigation.

    PubMed

    Dash, Bibek

    2018-04-26

    The present work deals with a density functional theory (DFT) study of nitrogen-containing porous organic framework materials for CO2 capture. In this study, first-principles calculations were performed for CO2 adsorption using N-containing covalent organic framework (COF) models. Ab initio and DFT-based methods were used to characterize the N-containing porous model system based on its interaction energies upon complexing with CO2 and nitrogen gas. Binding energies (BEs) of CO2 and N2 molecules with the polymer framework were calculated with DFT methods. The hybrid B3LYP and second-order MP2 methods combined with the Pople 6-31G(d,p) and correlation-consistent basis sets cc-pVDZ, cc-pVTZ and aug-cc-pVDZ were used to calculate BEs. The effect of linker groups in the designed covalent organic framework model system on the CO2 and N2 interactions was studied using quantum calculations.
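
    Binding energies in such studies typically follow the supermolecular definition BE = E(framework···gas) − E(framework) − E(gas), where a more negative BE means stronger adsorption. The sketch below applies that definition; the hartree values are placeholders, not results from the paper.

```python
# Supermolecular binding energy from total electronic energies (hartree),
# converted to kJ/mol. Energies here are illustrative placeholders.
HARTREE_TO_KJ_MOL = 2625.4996  # CODATA conversion factor

def binding_energy_kj_mol(e_complex_h, e_framework_h, e_gas_h):
    """BE = E(complex) - E(framework) - E(gas); negative = bound."""
    return (e_complex_h - e_framework_h - e_gas_h) * HARTREE_TO_KJ_MOL

# Complex slightly more stable than the separated fragments:
be = binding_energy_kj_mol(-1234.5678, -1046.1000, -188.4600)
```

A production workflow would also apply a counterpoise correction for basis-set superposition error, which this sketch omits.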

  12. Environmental accounting for Arctic shipping - a framework building on ship tracking data from satellites.

    PubMed

    Mjelde, A; Martinsen, K; Eide, M; Endresen, Ø

    2014-10-15

    Arctic shipping is on the rise, leading to increased concern over the potential environmental impacts. To better understand the magnitude of influence on the Arctic environment, detailed modelling of emissions and environmental risks is essential. This paper describes a framework for environmental accounting. A cornerstone of the framework is the use of Automatic Identification System (AIS) ship tracking data from satellites. When merged with ship registers and other data sources, it enables unprecedented accuracy in modelling and geographical allocation of emissions and discharges. This paper presents results using two of the models in the framework: emissions of black carbon (BC) in the Arctic, which is of particular concern for climate change; and bunker fuels and wet bulk carriage in the Arctic, of particular concern for oil spills to the environment. Using the framework, a detailed footprint of Arctic shipping with regard to operational emissions and potential discharges is established.
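
    Activity-based inventories of the kind AIS data enables follow a simple pattern: per track segment, emissions = engine power × load factor × duration × emission factor. The segment values and the black-carbon emission factor below are illustrative assumptions, not the paper's inventory numbers.

```python
# Hypothetical activity-based emission accounting over AIS track segments.
def segment_emissions_kg(segments, ef_g_per_kwh):
    """segments: iterable of (power_kw, load_fraction, hours).
    Returns total emissions in kg for the given emission factor (g/kWh)."""
    total_g = sum(p * load * h * ef_g_per_kwh for p, load, h in segments)
    return total_g / 1000.0

voyage = [
    (9000.0, 0.80, 12.0),  # transit leg
    (9000.0, 0.25, 3.0),   # manoeuvring
]
bc_kg = segment_emissions_kg(voyage, ef_g_per_kwh=0.35)
```

The satellite AIS data supplies the per-segment positions and speeds; merging with ship registers supplies installed power and fuel type, from which load fractions and emission factors are derived.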

  13. Building health behavior models to guide the development of just-in-time adaptive interventions: A pragmatic framework.

    PubMed

    Nahum-Shani, Inbal; Hekler, Eric B; Spruijt-Metz, Donna

    2015-12-01

    Advances in wireless devices and mobile technology offer many opportunities for delivering just-in-time adaptive interventions (JITAIs), suites of interventions that adapt over time to an individual's changing status and circumstances with the goal of addressing the individual's need for support whenever this need arises. A major challenge confronting behavioral scientists aiming to develop a JITAI concerns the selection and integration of existing empirical, theoretical and practical evidence into a scientific model that can inform the construction of a JITAI and help identify scientific gaps. The purpose of this paper is to establish a pragmatic framework that can be used to organize existing evidence into a useful model for JITAI construction. This framework involves clarifying the conceptual purpose of a JITAI, namely the provision of just-in-time support via adaptation, as well as describing the components of a JITAI and articulating a list of concrete questions to guide the establishment of a useful model for JITAI construction. The proposed framework includes an organizing scheme for translating the relatively static scientific models underlying many health behavior interventions into a more dynamic model that better incorporates the element of time. This framework will help to guide the next generation of empirical work to support the creation of effective JITAIs.

  14. Robustness of movement models: can models bridge the gap between temporal scales of data sets and behavioural processes?

    PubMed

    Schlägel, Ulrike E; Lewis, Mark A

    2016-12-01

    Discrete-time random walks and their extensions are common tools for analyzing animal movement data. In these analyses, resolution of temporal discretization is a critical feature. Ideally, a model both mirrors the relevant temporal scale of the biological process of interest and matches the data sampling rate. Challenges arise when resolution of data is too coarse due to technological constraints, or when we wish to extrapolate results or compare results obtained from data with different resolutions. Drawing loosely on the concept of robustness in statistics, we propose a rigorous mathematical framework for studying movement models' robustness against changes in temporal resolution. In this framework, we define varying levels of robustness as formal model properties, focusing on random walk models with a spatially explicit component. With the new framework, we can investigate whether models can validly be applied to data across varying temporal resolutions and how we can account for these different resolutions in statistical inference results. We apply the new framework to movement-based resource selection models, demonstrating both analytical and numerical calculations, as well as a Monte Carlo simulation approach. While exact robustness is rare, the concept of approximate robustness provides a promising new direction for analyzing movement models.
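
    For the simplest case, a Gaussian random walk, the robustness property can be checked directly: subsampling positions at every k-th step yields another Gaussian walk whose step variance is k times larger, so the model family is closed under coarsening and parameters rescale predictably. The sketch below is an illustrative check of that fact, not a reimplementation of the paper's framework.

```python
# Coarsening a Gaussian random walk: step variance should scale with the
# coarsening factor k, illustrating "robustness" to temporal resolution.
import random

def gaussian_walk(n_steps, sigma, seed=0):
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        path.append(x)
    return path

def coarsen(path, k):
    """Keep every k-th position (simulating a coarser sampling rate)."""
    return path[::k]

def step_variance(path):
    steps = [b - a for a, b in zip(path, path[1:])]
    m = sum(steps) / len(steps)
    return sum((s - m) ** 2 for s in steps) / len(steps)

fine = gaussian_walk(200000, sigma=1.0)
coarse = coarsen(fine, 4)
ratio = step_variance(coarse) / step_variance(fine)  # ~4 in theory
```

For walks with more structure (e.g. directional persistence or resource selection), this closure under coarsening generally fails, which is exactly where the paper's notion of approximate robustness becomes useful.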

  15. A Bayesian framework based on a Gaussian mixture model and radial-basis-function Fisher discriminant analysis (BayGmmKda V1.1) for spatial prediction of floods

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Hoang, Nhat-Duc

    2017-09-01

    In this study, a probabilistic model, named BayGmmKda, is proposed for flood susceptibility assessment in a study area in central Vietnam. The new model is a Bayesian framework constructed from a combination of a Gaussian mixture model (GMM), radial-basis-function Fisher discriminant analysis (RBFDA), and a geographic information system (GIS) database. In the Bayesian framework, the GMM is used to model the data distribution of flood-influencing factors in the GIS database, whereas RBFDA is utilized to construct a latent variable that aims at enhancing the model performance. As a result, the posterior probabilistic output of the BayGmmKda model is used as a flood susceptibility index. Experimental results showed that the proposed hybrid framework is superior to other benchmark models, including the adaptive neuro-fuzzy inference system and the support vector machine. To facilitate the model implementation, a BayGmmKda software program has been developed in MATLAB. The BayGmmKda program can accurately establish a flood susceptibility map for the study region. Accordingly, local authorities can overlay this susceptibility map onto various land-use maps for the purpose of land-use planning or management.
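
    The Bayesian core of such a model can be sketched in one dimension: class-conditional densities of a flood-factor score are modelled with Gaussian mixtures, and the posterior probability of "flood" serves as the susceptibility index. The mixture parameters below are invented for illustration; the paper fits them to GIS data and adds an RBFDA latent variable, which this sketch omits.

```python
# Posterior flood susceptibility from class-conditional Gaussian mixtures.
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def gmm_pdf(x, components):
    """components: iterable of (weight, mu, sigma)."""
    return sum(w * gauss_pdf(x, mu, s) for w, mu, s in components)

flood_gmm = [(0.6, 2.0, 0.5), (0.4, 3.0, 0.8)]      # p(x | flood)
nonflood_gmm = [(0.7, -1.0, 1.0), (0.3, 0.5, 0.7)]  # p(x | non-flood)

def susceptibility(x, prior_flood=0.5):
    """Bayes' rule: posterior probability of flood given factor score x."""
    num = gmm_pdf(x, flood_gmm) * prior_flood
    den = num + gmm_pdf(x, nonflood_gmm) * (1.0 - prior_flood)
    return num / den

high = susceptibility(2.5)   # near the flood-class modes
low = susceptibility(-1.0)   # near the non-flood modes
```

Mapping `susceptibility` over every grid cell of the GIS database would produce the susceptibility map described in the abstract.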

  16. Framework Programmable Platform for the Advanced Software Development Workstation (FPP/ASDW). Demonstration framework document. Volume 2: Framework process description

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.

    1992-01-01

    In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.

  17. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    EPA Science Inventory

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study foc...

  18. Integrating human and natural systems in community psychology: an ecological model of stewardship behavior.

    PubMed

    Moskell, Christine; Allred, Shorna Broussard

    2013-03-01

    Community psychology (CP) research on the natural environment lacks a theoretical framework for analyzing the complex relationship between human systems and the natural world. We introduce other academic fields concerned with the interactions between humans and the natural environment, including environmental sociology and coupled human and natural systems. To demonstrate how the natural environment can be included within CP's ecological framework, we propose an ecological model of urban forest stewardship action. Although ecological models of behavior in CP have previously modeled health behaviors, we argue that these frameworks are also applicable to actions that positively influence the natural environment. We chose the environmental action of urban forest stewardship because cities across the United States are planting millions of trees and increased citizen participation in urban tree planting and stewardship will be needed to sustain the benefits provided by urban trees. We used the framework of an ecological model of behavior to illustrate multiple levels of factors that may promote or hinder involvement in urban forest stewardship actions. The implications of our model for the development of multi-level ecological interventions to foster stewardship actions are discussed, as well as directions for future research to further test and refine the model.

  19. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:
    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response
    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  20. Machine Learning-based discovery of closures for reduced models of dynamical systems

    NASA Astrophysics Data System (ADS)

    Pan, Shaowu; Duraisamy, Karthik

    2017-11-01

    Despite the successful application of machine learning (ML) in fields such as image processing and speech recognition, only a few attempts have been made toward employing ML to represent the dynamics of complex physical systems. Previous attempts mostly focus on parameter calibration or data-driven augmentation of existing models. In this work we present an ML framework to discover closure terms in reduced models of dynamical systems and provide insights into potential problems associated with data-driven modeling. Based on exact closure models for linear systems, we propose a general linear closure framework from the viewpoint of optimization. The framework is based on a trapezoidal approximation of the convolution (memory) term. Hyperparameters that need to be determined include the temporal length of the memory effect, the number of sampling points, and the dimensions of hidden states. To circumvent the explicit specification of the memory effect, a general framework inspired by neural networks is also proposed. We conduct both a priori and a posteriori evaluations of the resulting models on a number of non-linear dynamical systems. This work was supported in part by AFOSR under the project ``LES Modeling of Non-local effects using Statistical Coarse-graining'' with Dr. Jean-Luc Cambier as the technical monitor.
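
    The trapezoidal memory closure can be sketched directly: the closure term is a convolution integral c(t) = ∫₀ᵀ K(s) u(t − s) ds over a finite memory window, discretized with trapezoidal weights. The kernel and signal below are illustrative choices, not the paper's learned closures.

```python
# Trapezoidal-rule approximation of a memory (convolution) closure term.
import math

def closure_term(u_history, kernel, dt):
    """Convolve `kernel` with the most recent samples of the resolved state u.
    u_history[-1] is the current time; kernel[i] multiplies u(t - i*dt)."""
    n = len(kernel)
    recent = u_history[-n:][::-1]   # u(t), u(t-dt), ..., u(t-(n-1)dt)
    weights = [0.5 if i in (0, n - 1) else 1.0 for i in range(n)]  # trapezoid
    return dt * sum(w * k * u for w, k, u in zip(weights, kernel, recent))

dt = 0.1
kernel = [math.exp(-i * dt) for i in range(5)]   # decaying memory kernel
u_hist = [1.0] * 10                              # constant resolved state
c = closure_term(u_hist, kernel, dt)             # ~ 1 - exp(-0.4)
```

In the ML framework described above, the kernel values (and the memory length) would be the free parameters fitted by optimization rather than prescribed as here.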

  1. A MULTISCALE FRAMEWORK FOR THE STOCHASTIC ASSIMILATION AND MODELING OF UNCERTAINTY ASSOCIATED WITH NCF COMPOSITE MATERIALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehrez, Loujaine; Ghanem, Roger; McAuliffe, Colin

    A multiscale framework to construct stochastic macroscopic constitutive material models is proposed. A spectral projection approach, specifically polynomial chaos expansion, has been used to construct explicit functional relationships between the homogenized properties and input parameters from finer scales. A homogenization engine embedded in Multiscale Designer, software for composite materials, has been used for the upscaling process. The framework is demonstrated by constructing probabilistic models of the homogenized properties of a non-crimp fabric (NCF) laminate in terms of the input parameters together with the homogenized properties from finer scales.
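
    The spectral-projection step can be sketched as a regression-based polynomial chaos expansion: expand a homogenized property Y(ξ) in Hermite polynomials of a standard normal input and fit the coefficients by least squares. The "fine-scale model" g below is an invented stand-in; in the paper, Multiscale Designer's homogenization engine supplies the samples.

```python
# Regression-based polynomial chaos expansion with Hermite polynomials.
import random
import numpy as np

def hermite(n, x):
    """Probabilists' Hermite polynomial He_n(x) via the recurrence
    He_{k+1} = x*He_k - k*He_{k-1}."""
    if n == 0:
        return 1.0
    h_prev, h = 1.0, x
    for k in range(1, n):
        h_prev, h = h, x * h - k * h_prev
    return h

def fit_pce(samples, ys, degree):
    """Least-squares fit of PCE coefficients from (input, output) samples."""
    A = np.array([[hermite(n, x) for n in range(degree + 1)] for x in samples])
    coef, *_ = np.linalg.lstsq(A, np.array(ys), rcond=None)
    return coef

rng = random.Random(1)
xis = [rng.gauss(0.0, 1.0) for _ in range(200)]
ys = [2.0 + 0.5 * x + 0.1 * (x * x - 1.0) for x in xis]  # toy fine-scale model
coef = fit_pce(xis, ys, degree=2)
# coef[0] is the mean of the homogenized property; higher coefficients
# encode its variability.
```

Because Hermite polynomials are orthonormal under the standard normal measure, the variance of Y is simply the sum of the squared non-constant coefficients (times factorials), which is what makes the surrogate useful for uncertainty propagation.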

  2. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
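
    The smallest complete instance of the probabilistic framework the Review describes is conjugate Bayesian updating: represent uncertainty about a parameter as a distribution and revise it with data. The numbers below are illustrative.

```python
# Beta-Bernoulli conjugate update: uncertainty about a success rate is a
# Beta(alpha, beta) distribution, updated in closed form by observed counts.
def beta_update(alpha, beta, successes, failures):
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    return alpha / (alpha + beta)

# Start maximally uncertain (uniform prior), observe 7 successes, 3 failures.
a, b = beta_update(1.0, 1.0, 7, 3)
mean = beta_mean(a, b)   # posterior mean = 8/12
```

The same represent-and-update pattern, scaled up with approximate inference, underlies the probabilistic programming and automatic model discovery advances the Review surveys.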

  3. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  4. The Roy Adaptation Model: A Theoretical Framework for Nurses Providing Care to Individuals With Anorexia Nervosa.

    PubMed

    Jennings, Karen M

    Using a nursing theoretical framework to understand, elucidate, and propose nursing research is fundamental to knowledge development. This article presents the Roy Adaptation Model as a theoretical framework to better understand individuals with anorexia nervosa during acute treatment, and the role of nursing assessments and interventions in the promotion of weight restoration. Nursing assessments and interventions situated within the Roy Adaptation Model take into consideration that weight restoration does not occur in isolation but rather reflects an adaptive process within external and internal environments, and they have the potential to promote more holistic care.

  5. Airborne electromagnetic mapping of the base of aquifer in areas of western Nebraska

    USGS Publications Warehouse

    Abraham, Jared D.; Cannia, James C.; Bedrosian, Paul A.; Johnson, Michaela R.; Ball, Lyndsay B.; Sibray, Steven S.

    2012-01-01

    Airborne geophysical surveys of selected areas of the North and South Platte River valleys of Nebraska, including Lodgepole Creek valley, collected data to map aquifers and bedrock topography and thus improve the understanding of groundwater/surface-water relationships used in water-management decisions. Frequency-domain helicopter electromagnetic surveys, using a unique survey flight-line design, collected resistivity data that can be related to lithologic information for refinement of groundwater model inputs. To make the geophysical data useful to multidimensional groundwater models, numerical inversion converted the measured data into a depth-dependent subsurface resistivity model. The inverted resistivity model, along with sensitivity analyses and test-hole information, is used to identify hydrogeologic features such as bedrock highs and paleochannels, to improve estimates of groundwater storage. The two- and three-dimensional interpretations provide groundwater modelers and resource managers with a high-resolution hydrogeologic framework and a quantitative estimate of framework uncertainty. The new hydrogeologic frameworks improve understanding of the flow-path orientation by refining the location of paleochannels and associated base-of-aquifer highs. The improved base-of-aquifer configuration represents the hydrogeology at a level of detail not achievable with previously available data.

  6. Statistical label fusion with hierarchical performance models

    PubMed Central

    Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.

    2014-01-01

    Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
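
    A stripped-down version of the statistical-fusion idea can be sketched as follows: each rater carries a confusion matrix (its performance model), and votes are fused by accumulating per-rater likelihoods for each candidate label. The two-label confusion matrices below are invented, and the sketch is flat rather than hierarchical and skips the EM estimation the paper performs.

```python
# Performance-model-weighted label fusion (simplified, non-hierarchical).
def fuse(observed_labels, confusions, prior=(0.5, 0.5)):
    """observed_labels[r] is rater r's vote; confusions[r][true][obs] is
    P(rater r reports obs | true label). Returns the posterior over labels."""
    post = list(prior)
    for vote, conf in zip(observed_labels, confusions):
        post = [p * conf[t][vote] for t, p in enumerate(post)]
    z = sum(post)
    return [p / z for p in post]

good_rater = [[0.9, 0.1], [0.1, 0.9]]   # rarely confuses the two labels
poor_rater = [[0.6, 0.4], [0.4, 0.6]]   # barely better than chance
posterior = fuse([1, 0], [good_rater, poor_rater])  # good says 1, poor says 0
```

Because the reliable rater's vote outweighs the unreliable one, the fused posterior favors label 1; the hierarchical extension in the paper additionally lets confusions respect anatomical label groupings.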

  7. XAL Application Framework and Bricks GUI Builder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pelaia II, Tom

    2007-01-01

    The XAL [1] Application Framework is a framework for rapidly developing document based Java applications with a common look and feel along with many built-in user interface behaviors. The Bricks GUI builder consists of a modern application and framework for rapidly building user interfaces in support of true Model-View-Controller (MVC) compliant Java applications. Bricks and the XAL Application Framework allow developers to rapidly create quality applications.

  8. A comparison of fit of CNC-milled titanium and zirconia frameworks to implants.

    PubMed

    Abduo, Jaafar; Lyons, Karl; Waddell, Neil; Bennani, Vincent; Swain, Michael

    2012-05-01

    Computer numeric controlled (CNC) milling was proven to be predictable method to fabricate accurately fitting implant titanium frameworks. However, no data are available regarding the fit of CNC-milled implant zirconia frameworks. To compare the precision of fit of implant frameworks milled from titanium and zirconia and relate it to peri-implant strain development after framework fixation. A partially edentulous epoxy resin models received two Branemark implants in the areas of the lower left second premolar and second molar. From this model, 10 identical frameworks were fabricated by mean of CNC milling. Half of them were made from titanium and the other half from zirconia. Strain gauges were mounted close to the implants to qualitatively and quantitatively assess strain development as a result of framework fitting. In addition, the fit of the framework implant interface was measured using an optical microscope, when only one screw was tightened (passive fit) and when all screws were tightened (vertical fit). The data was statistically analyzed using the Mann-Whitney test. All frameworks produced measurable amounts of peri-implant strain. The zirconia frameworks produced significantly less strain than titanium. Combining the qualitative and quantitative information indicates that the implants were under vertical displacement rather than horizontal. The vertical fit was similar for zirconia (3.7 µm) and titanium (3.6 µm) frameworks; however, the zirconia frameworks exhibited a significantly finer passive fit (5.5 µm) than titanium frameworks (13.6 µm). CNC milling produced zirconia and titanium frameworks with high accuracy. The difference between the two materials in terms of fit is expected to be of minimal clinical significance. The strain developed around the implants was more related to the framework fit rather than framework material. © 2011 Wiley Periodicals, Inc.

  9. Sustained sensorimotor control as intermittent decisions about prediction errors: computational framework and application to ground vehicle steering.

    PubMed

    Markkula, Gustav; Boer, Erwin; Romano, Richard; Merat, Natasha

    2018-06-01

    A conceptual and computational framework is proposed for modelling of human sensorimotor control and is exemplified for the sensorimotor task of steering a car. The framework emphasises control intermittency and extends existing models by suggesting that the nervous system implements intermittent control using a combination of (1) motor primitives, (2) prediction of sensory outcomes of motor actions, and (3) evidence accumulation of prediction errors. It is shown that approximate but useful sensory predictions in the intermittent control context can be constructed without detailed forward models, as a superposition of simple prediction primitives, resembling neurobiologically observed corollary discharges. The proposed mathematical framework allows straightforward extension to intermittent behaviour from existing one-dimensional continuous models in the linear control and ecological psychology traditions. Empirical data from a driving simulator are used in model-fitting analyses to test some of the framework's main theoretical predictions: it is shown that human steering control, in routine lane-keeping and in a demanding near-limit task, is better described as a sequence of discrete stepwise control adjustments, than as continuous control. Results on the possible roles of sensory prediction in control adjustment amplitudes, and of evidence accumulation mechanisms in control onset timing, show trends that match the theoretical predictions; these warrant further investigation. The results for the accumulation-based model align with other recent literature, in a possibly converging case against the type of threshold mechanisms that are often assumed in existing models of intermittent control.
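
    The intermittent-control loop described above can be sketched minimally: prediction error is accumulated over time, and when the accumulator crosses a threshold a discrete stepwise adjustment is issued and the accumulator resets. The gain, threshold, and error trace below are illustrative, not fitted values from the driving-simulator data.

```python
# Evidence-accumulation trigger for discrete, stepwise control adjustments.
def intermittent_control(errors, threshold=1.0, gain=0.5, dt=0.1):
    """Return (time index, adjustment magnitude) for each triggered control
    action; between triggers no control is applied (intermittency)."""
    accumulator, adjustments = 0.0, []
    for i, e in enumerate(errors):
        accumulator += e * dt          # accumulate prediction error
        if abs(accumulator) >= threshold:
            adjustments.append((i, gain * accumulator))  # stepwise correction
            accumulator = 0.0          # reset after each discrete adjustment
    return adjustments

# A constant small error produces corrections intermittently, not continuously.
adj = intermittent_control([0.4] * 60)
```

Larger errors fill the accumulator faster, so adjustments come sooner and are scaled by the accumulated evidence, which is the qualitative pattern the model-fitting analyses test.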

  10. Architectural frameworks: defining the structures for implementing learning health systems.

    PubMed

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions (goals, scientific, social, technical, and ethical) commonly found in the LHS literature. The proposed architectural framework comprises six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making.
In this paper, we outline a high-level architectural framework grounded in conceptual and empirical LHS literature. Applying this architectural framework can guide the development and implementation of new LHSs and the evolution of existing ones, as it allows for clear and critical understanding of the types of decisions that underlie LHS operations. Further research is required to assess and refine its generalizability and methods.

  11. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework

    PubMed Central

    Talluto, Matthew V.; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C. Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A.; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-01-01

    Aim: Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Location: Eastern North America (as an example). Methods: Our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. Results: For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. Main conclusions: We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software.
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making. PMID:27499698

  12. Cross-scale integration of knowledge for predicting species ranges: a metamodeling framework.

    PubMed

    Talluto, Matthew V; Boulangeat, Isabelle; Ameztegui, Aitor; Aubin, Isabelle; Berteaux, Dominique; Butler, Alyssa; Doyon, Frédérik; Drever, C Ronnie; Fortin, Marie-Josée; Franceschini, Tony; Liénard, Jean; McKenney, Dan; Solarik, Kevin A; Strigul, Nikolay; Thuiller, Wilfried; Gravel, Dominique

    2016-02-01

    Current interest in forecasting changes to species ranges has resulted in a multitude of approaches to species distribution models (SDMs). However, most approaches include only a small subset of the available information, and many ignore smaller-scale processes such as growth, fecundity, and dispersal. Furthermore, different approaches often produce divergent predictions with no simple method to reconcile them. Here, we present a flexible framework for integrating models at multiple scales using hierarchical Bayesian methods. Using eastern North America as an example, our framework builds a metamodel that is constrained by the results of multiple sub-models and provides probabilistic estimates of species presence. We applied our approach to a simulated dataset to demonstrate the integration of a correlative SDM with a theoretical model. In a second example, we built an integrated model combining the results of a physiological model with presence-absence data for sugar maple (Acer saccharum), an abundant tree native to eastern North America. For both examples, the integrated models successfully included information from all data sources and substantially improved the characterization of uncertainty. For the second example, the integrated model outperformed the source models with respect to uncertainty when modelling the present range of the species. When projecting into the future, the model provided a consensus view of two models that differed substantially in their predictions. Uncertainty was reduced where the models agreed and was greater where they diverged, providing a more realistic view of the state of knowledge than either source model. We conclude by discussing the potential applications of our method and its accessibility to applied ecologists. In ideal cases, our framework can be easily implemented using off-the-shelf software. 
The framework has wide potential for use in species distribution modelling and can drive better integration of multi-source and multi-scale data into ecological decision-making.
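
    The uncertainty behaviour described above (reduced where the source models agree, inflated where they diverge) can be illustrated with a deliberately simplified stand-in for the paper's hierarchical Bayesian metamodel: precision-weighted pooling of two presence predictions on the logit scale, with a disagreement term added to the pooled variance. The function, its parameterization, and the variance-inflation term are illustrative assumptions, not the published model.

    ```python
    import math

    def logit(p):
        return math.log(p / (1.0 - p))

    def inv_logit(x):
        return 1.0 / (1.0 + math.exp(-x))

    def pool_predictions(p1, v1, p2, v2):
        """Pool two presence probabilities (p1, p2) with logit-scale
        variances (v1, v2) via precision weighting. A toy sketch of
        model integration, not the paper's hierarchical formulation."""
        w1, w2 = 1.0 / v1, 1.0 / v2
        mu = (w1 * logit(p1) + w2 * logit(p2)) / (w1 + w2)
        var = 1.0 / (w1 + w2)
        # Inflate variance by the squared disagreement so that uncertainty
        # grows where the source models diverge, as the abstract describes.
        var += (logit(p1) - logit(p2)) ** 2 / 4.0
        return inv_logit(mu), var
    ```

    With two agreeing models (e.g. both predicting 0.8), the pooled variance is smaller than either input variance; with divergent models (0.9 vs. 0.2), the disagreement term dominates and the pooled uncertainty is larger.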

  13. Structure, function, and behaviour of computational models in systems biology

    PubMed Central

    2013-01-01

    Background Systems Biology develops computational models in order to understand biological phenomena. The increasing number and complexity of such “bio-models” necessitate computer support for the overall modelling task. Computer-aided modelling has to be based on a formal semantic description of bio-models. But even if computational bio-models themselves are represented precisely in terms of mathematical expressions, their full meaning is not yet formally specified and is only described in natural language. Results We present a conceptual framework – the meaning facets – which can be used to rigorously specify the semantics of bio-models. A bio-model has a dual interpretation: On the one hand it is a mathematical expression which can be used in computational simulations (intrinsic meaning). On the other hand the model is related to the biological reality (extrinsic meaning). We show that in both cases this interpretation should be performed from three perspectives: the meaning of the model’s components (structure), the meaning of the model’s intended use (function), and the meaning of the model’s dynamics (behaviour). In order to demonstrate the strengths of the meaning facets framework we apply it to two semantically related models of the cell cycle. Thereby, we make use of existing approaches for computer representation of bio-models as much as possible and sketch the missing pieces. Conclusions The meaning facets framework provides a systematic in-depth approach to the semantics of bio-models. It can serve two important purposes: First, it specifies and structures the information which biologists have to take into account if they build, use and exchange models. Second, because it can be formalised, the framework is a solid foundation for any sort of computer support in bio-modelling. The proposed conceptual framework establishes a new methodology for modelling in Systems Biology and constitutes a basis for computer-aided collaborative research. PMID:23721297
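
    The dual interpretation (intrinsic/extrinsic) crossed with the three perspectives (structure, function, behaviour) yields a 2×3 grid of meaning facets. A minimal sketch of how such a grid might be encoded as a data structure; the class and field names are an illustrative assumption, not the published framework's notation:

    ```python
    from dataclasses import dataclass

    @dataclass
    class MeaningFacet:
        structure: str   # meaning of the model's components
        function: str    # meaning of the model's intended use
        behaviour: str   # meaning of the model's dynamics

    @dataclass
    class BioModelSemantics:
        """One record per bio-model: the intrinsic reading (the model as a
        mathematical object) and the extrinsic reading (the model against
        biological reality), each described from three perspectives."""
        intrinsic: MeaningFacet
        extrinsic: MeaningFacet

    # Hypothetical annotation of a cell-cycle model:
    cell_cycle = BioModelSemantics(
        intrinsic=MeaningFacet("ODE state variables", "numerical simulation",
                               "limit-cycle trajectories"),
        extrinsic=MeaningFacet("cyclin/CDK proteins", "explain cell division",
                               "periodic protein oscillations"),
    )
    ```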

  14. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25 × 25 m²). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and development of groundwater dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on a per-time-step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision making process in the 267 km2 Baakse Beek-Veengoot catchment in the east of the Netherlands. 
Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. Special focus in the project was on the role of uncertainty. How valid is the information that is generated by this modelling framework? What are the most important uncertainties of the input data, how do they affect the results of the model chain and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands and what kind of additional information is needed for adequate support on decision making.
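
    The online coupling described above, in which models exchange state every time step, can be sketched with toy components. The class names, state variables, and coefficients below are hypothetical stand-ins for MODFLOW, MetaSWAP and WOFOST, not the real model codes or their APIs; the point is only the per-time-step exchange pattern.

    ```python
    class CropModel:
        """Toy crop-growth component (WOFOST-like stand-in)."""
        def __init__(self):
            self.leaf_area = 0.1
        def step(self, soil_moisture):
            # Growth limited by available soil moisture.
            self.leaf_area += 0.05 * soil_moisture
            return self.leaf_area

    class VadoseZoneModel:
        """Toy unsaturated-zone component (MetaSWAP-like stand-in)."""
        def __init__(self):
            self.soil_moisture = 0.3
        def step(self, rainfall, leaf_area, head):
            transpiration = 0.02 * leaf_area
            recharge = max(0.0, rainfall - transpiration)
            # Soil moisture responds to recharge and to the groundwater head.
            self.soil_moisture = min(
                1.0, self.soil_moisture + recharge
                     - 0.01 * (self.soil_moisture - head))
            return self.soil_moisture, recharge

    class GroundwaterModel:
        """Toy saturated-flow component (MODFLOW-like stand-in)."""
        def __init__(self):
            self.head = 0.2
        def step(self, recharge):
            self.head += 0.1 * recharge - 0.005 * self.head
            return self.head

    def run_coupled(rain_series):
        """Online coupling: the three components exchange state each step."""
        crop, vadose, gw = CropModel(), VadoseZoneModel(), GroundwaterModel()
        heads = []
        for rain in rain_series:
            lai = crop.step(vadose.soil_moisture)      # crop sees soil moisture
            sm, rch = vadose.step(rain, lai, gw.head)  # vadose sees LAI and head
            heads.append(gw.step(rch))                 # groundwater sees recharge
        return heads
    ```

    Because each component reads the others' current state inside the loop, feedbacks (e.g. crop growth reducing recharge, which lowers heads) propagate within a time step rather than being fixed as one-way boundary conditions.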

  15. A Conceptual Framework for SAHRA Integrated Multi-resolution Modeling in the Rio Grande Basin

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Springer, E.; Wagener, T.; Brookshire, D.; Duffy, C.

    2004-12-01

    The sustainable management of water resources in a river basin requires an integrated analysis of the social, economic, environmental and institutional dimensions of the problem. Numerical models are commonly used for integration of these dimensions and for communication of the analysis results to stakeholders and policy makers. The National Science Foundation Science and Technology Center for Sustainability of semi-Arid Hydrology and Riparian Areas (SAHRA) has been developing integrated multi-resolution models to assess impacts of climate variability and land use change on water resources in the Rio Grande Basin. These models not only couple natural systems such as surface and ground waters, but will also include engineering, economic and social components that may be involved in water resources decision-making processes. This presentation will describe the conceptual framework being developed by SAHRA to guide and focus the multiple modeling efforts and to assist the modeling team in planning, data collection and interpretation, communication, evaluation, etc. One of the major components of this conceptual framework is a Conceptual Site Model (CSM), which describes the basin and its environment based on existing knowledge and identifies what additional information must be collected to develop technically sound models at various resolutions. The initial CSM is based on analyses of basin profile information that has been collected, including a physical profile (e.g., topographic and vegetative features), a man-made facility profile (e.g., dams, diversions, and pumping stations), and a land use and ecological profile (e.g., demographics, natural habitats, and endangered species). Based on the initial CSM, a Conceptual Physical Model (CPM) is developed to guide and evaluate the selection of a model code (or numerical model) for each resolution to conduct simulations and predictions. 
    A CPM identifies, conceptually, all the physical processes and engineering and socio-economic activities occurring (or to occur) in the real system that the corresponding numerical models are required to address, such as riparian evapotranspiration responses to vegetation change and groundwater pumping impacts on soil moisture contents. Simulation results from different resolution models and observations of the real system will then be compared to evaluate the consistency among the CSM, the CPMs, and the numerical models, and feedbacks will be used to update the models. In a broad sense, the evaluation of the models (conceptual or numerical), as well as the linkages between them, can be viewed as a part of the overall conceptual framework. As new data are generated and understanding improves, the models will evolve and the overall conceptual framework will be refined; its development is thus an ongoing process. We will describe the current state of this framework and the open questions that have to be addressed in the future.

  16. Unified Program Design: Organizing Existing Programming Models, Delivery Options, and Curriculum

    ERIC Educational Resources Information Center

    Rubenstein, Lisa DaVia; Ridgley, Lisa M.

    2017-01-01

    A persistent problem in the field of gifted education has been the lack of categorization and delineation of gifted programming options. To address this issue, we propose Unified Program Design as a structural framework for gifted program models. This framework defines gifted programs as the combination of delivery methods and curriculum models.…

  17. An Odds Ratio Approach for Detecting DDF under the Nested Logit Modeling Framework

    ERIC Educational Resources Information Center

    Terzi, Ragip; Suh, Youngsuk

    2015-01-01

    An odds ratio approach (ORA) under the framework of a nested logit model was proposed for evaluating differential distractor functioning (DDF) in multiple-choice items and was compared with an existing ORA developed under the nominal response model. The performances of the two ORAs for detecting DDF were investigated through an extensive…
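
    Although the abstract is truncated, the core quantity of any odds ratio approach is standard. A minimal sketch for a single distractor, assuming simple 2×2 group-by-selection counts rather than the nested logit formulation the authors actually use; the function name and the Wald-type confidence interval are illustrative:

    ```python
    import math

    def ddf_odds_ratio(a, b, c, d):
        """Odds ratio for distractor selection between groups.
        a, b: focal-group counts selecting / not selecting the distractor;
        c, d: reference-group counts. Returns (OR, 95% Wald CI on the OR).
        Values far from 1 would flag possible differential distractor
        functioning (DDF) in this simplified setting."""
        odds_ratio = (a / b) / (c / d)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        return odds_ratio, (lo, hi)
    ```

    For example, with 20 of 100 focal-group examinees and 10 of 100 reference-group examinees choosing the distractor, the odds ratio is (20/80)/(10/90) = 2.25.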

  18. A Model of Factors Determining Students' Ability to Interpret External Representations in Biochemistry

    ERIC Educational Resources Information Center

    Schonborn, Konrad J.; Anderson, Trevor R.

    2009-01-01

    The aim of this research was to develop a model of factors affecting students' ability to interpret external representations (ERs) in biochemistry. The study was qualitative in design and was guided by the modelling framework of Justi and Gilbert. Application of the process outlined by the framework, and consultation with relevant literature, led…

  19. Exploring Students' Visual Conception of Matter: Towards Developing a Teaching Framework Using Models

    ERIC Educational Resources Information Center

    Espinosa, Allen A.; Marasigan, Arlyne C.; Datukan, Janir T.

    2016-01-01

    This study explored how students visualise the states and classifications of matter with the use of scientific models. Misconceptions of students in using scientific models were also identified to formulate a teaching framework. To elicit data in the study, a Visual Conception Questionnaire was administered to thirty-four (34) first-year, general…

  20. Integrating Social Activity Theory and Critical Discourse Analysis: A Multilayered Methodological Model for Examining Knowledge Mediation in Mentoring

    ERIC Educational Resources Information Center

    Becher, Ayelet; Orland-Barak, Lily

    2016-01-01

    This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…
