Sample records for site characterization methodology

  1. SEMINAR PUBLICATION: SITE CHARACTERIZATION FOR SUBSURFACE REMEDIATION

    EPA Science Inventory

    This seminar publication provides a comprehensive approach to site characterization for subsurface remediation. Chapter 1 describes a methodology for integrating site characterization with subsurface remediation. The rest of the handbook is divided into three parts. Part I covers...

  2. Remote sensing for site characterization

    USGS Publications Warehouse

    Kuehn, Friedrich; King, Trude V.; Hoerig, Bernhard; Peters, Douglas C.

    2000-01-01

    This volume, Remote Sensing for Site Characterization, describes the feasibility of aircraft- and satellite-based methods of revealing environmental-geological problems. A balanced ratio between explanations of the methodological/technical side and presentations of case studies is maintained. The comparison of case studies from North America and Germany shows how the respective territorial conditions lead to distinct methodological approaches.

  3. FIELD EVALUATION OF IN-SITU BIODEGRADATION OF CHLORINATED ETHENES: PART I, METHODOLOGY AND FIELD SITE CHARACTERIZATION

    EPA Science Inventory

    Careful site characterization and implementation of quantitative monitoring methods are prerequisites for a convincing evaluation of enhanced biostimulation for aquifer restoration. This paper describes the characterization of a site at Moffett Naval Air Station, Mountain View, Ca...

  4. A FIELD EVALUATION OF IN-SITU BIODEGRADATION OF CHLORINATED ETHENES: PART I, METHODOLOGY AND FIELD SITE CHARACTERIZATION

    EPA Science Inventory

    Careful site characterization and implementation of quantitative monitoring methods are prerequisites for a convincing evaluation of enhanced biostimulation for aquifer restoration. This paper describes the characterization of a site at Moffett Naval Air Station, Mountain View, C...

  5. Highway runoff stormwater management potential (HRSMP) site characterization using NASA public domain imagery.

    DOT National Transportation Integrated Search

    2016-04-01

    The focus of this research project was the development of geospatial technology (GST) methodology to characterize and evaluate highway runoff stormwater management potential (HRSMP) sites in order to reduce their impact on properties, save lives ...

  6. Proceedings of the tenth annual DOE low-level waste management conference: Session 2: Site performance assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-12-01

    This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)

  7. Site characterization methodology for aquifers in support of bioreclamation activities. Volume 2: Borehole flowmeter technique, tracer tests, geostatistics and geology. Final report, August 1987-September 1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Young, S.C.

    1993-08-01

    This report discusses a field demonstration of a methodology for characterizing an aquifer's geohydrology in the detail required to design an optimum network of wells and/or infiltration galleries for bioreclamation systems. The project work was conducted on a 1-hectare test site at Columbus AFB, Mississippi. The technical report is divided into two volumes. Volume I describes the test site and the well network, the assumptions, the application of equations that define groundwater flow to a well, the results of three large-scale aquifer tests, and the results of 160 single-pump tests. Volume II describes the borehole flowmeter tests, the tracer tests, the geological investigations, the geostatistical analysis, and the guidelines for using groundwater models to design bioreclamation systems. Keywords: Site characterization, Hydraulic conductivity, Groundwater flow, Geostatistics, Geohydrology, Monitoring wells.
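
    A minimal, hypothetical sketch of how borehole flowmeter profiles are often converted to layer hydraulic conductivities (not code from the report): it assumes a measured cumulative upflow profile, a total pumping rate, a screened thickness, and a depth-averaged conductivity from a conventional aquifer test. Every variable name and value below is illustrative.

      # Hypothetical sketch: layer hydraulic conductivity from borehole flowmeter data.
      # K_i ~ K_avg * (dQ_i/dz_i) / (Q_total/B), the usual flowmeter scaling.
      import numpy as np

      depths = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])   # layer boundaries below screen top (m), illustrative
      q_cum = np.array([0.0, 0.8, 2.1, 2.9, 4.6, 6.0])      # cumulative upward flow at each boundary (L/min), illustrative
      Q_total = 6.0                                          # total pumping rate (L/min)
      B = depths[-1] - depths[0]                             # screened thickness (m)
      K_avg = 5.0e-5                                         # depth-averaged K from an aquifer test (m/s), illustrative

      dq = np.diff(q_cum)                 # inflow contributed by each layer
      dz = np.diff(depths)                # layer thicknesses
      K_layer = K_avg * (dq / dz) / (Q_total / B)

      for (z0, z1), k in zip(zip(depths[:-1], depths[1:]), K_layer):
          print(f"{z0:4.1f}-{z1:4.1f} m : K ~ {k:.2e} m/s")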

  8. Development of Hydrologic Characterization Methodology of Faults: Outline of the Project in Berkeley, California

    NASA Astrophysics Data System (ADS)

    Goto, J.; Miwa, T.; Tsuchi, H.; Karasaki, K.

    2009-12-01

    The Nuclear Waste Management Organization of Japan (NUMO), once volunteer municipalities come forward, will start a three-stage program for selecting a HLW and TRU waste repository site. Experience from site characterization programs around the world shows that the hydrologic properties of faults are among the most important parameters in the early stage of such a program. Numerous faults of interest are expected to exist in an investigation area of several tens of square kilometers. It is, however, impossible to characterize all of these faults within a limited time and budget. This raises the problem, for repository design and safety assessment, that unrealistic or overly conservative results may have to be accepted if a single model or parameter set is used for all the faults in the area. We therefore seek to develop an efficient and practical methodology for characterizing the hydrologic properties of faults. This project is a five-year program started in 2007, and comprises development of the basic methodology through a literature study and its verification through field investigations. The literature study aims to classify faults by correlating their geological features with hydraulic properties, to identify the most efficient technologies for fault characterization, and to develop a work flow diagram. The field investigation starts with selection of a site and fault(s), followed by analysis of existing site data, surface geophysics, geological mapping, trenching, water sampling, a series of borehole investigations, and modeling/analyses. Based on the results of the field investigations, we plan to develop a systematic hydrologic characterization methodology for faults. A classification method that correlates combinations of geological features (rock type, fault displacement, fault type, position in a fault zone, fracture zone width, damage zone width) with the widths of high-permeability zones around a fault zone was proposed through a survey of available documents from site characterization programs. The field investigation started in 2008 by selecting the Wildcat Fault, which cuts across the Lawrence Berkeley National Laboratory (LBNL) site, as the target. Analyses of site-specific data, surface geophysics, geological mapping and trenching have confirmed the approximate location and characteristics of the fault (see Session H48, Onishi, et al.). The plan for the remaining years includes borehole investigations at LBNL and another series of investigations in the northern part of the Wildcat Fault.

  9. A non-intrusive screening methodology for environmental hazard assessment at waste disposal sites for water resources protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, B.A.; Woldt, W.E.; Jones, D.D.

    The environmental and health risks posed by unregulated waste disposal sites are potential concerns of Pacific Rim regions and island areas because of the need to protect aquifers and other valuable water resources. A non-intrusive screening methodology to determine site characteristics, including possible soil and/or groundwater contamination, areal extent of waste, etc., is being developed and tested at waste disposal sites in Nebraska. This type of methodology would be beneficial to Pacific Rim regions in investigating and/or locating unknown or poorly documented contamination areas for hazard assessment and groundwater protection. Traditional assessment methods are generally expensive, time consuming, and potentially exacerbate the problem. Ideally, a quick and inexpensive assessment method to reliably characterize these sites is desired. Electromagnetic (EM) conductivity surveying and soil-vapor sampling techniques, combined with innovative three-dimensional geostatistical methods, are used to map the data to develop a site characterization of the subsurface and to aid in tracking any contaminant plumes. The EM data are analyzed to determine/estimate the extent and volume of waste and/or leachate. Soil-vapor data are analyzed to estimate a site's volatile organic compound (VOC) emission rate to the atmosphere. The combined information could then be incorporated as one part of an overall hazard assessment system.

  10. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
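
    The abstract gives no computational detail; purely as an illustration of the PROMETHEE II ranking step it builds on, the sketch below ranks hypothetical sites from crisp criterion scores with a linear preference function (no 2-tuple linguistic or possibilistic handling). Sites, scores, weights, and thresholds are invented.

      # Minimal PROMETHEE II sketch with a linear preference function (threshold p).
      # Sites, criteria scores, weights, and thresholds are hypothetical.
      import numpy as np

      scores = np.array([   # rows: sites A-D, columns: risk criteria (higher = worse, to be minimized)
          [0.8, 120.0, 3.0],
          [0.4,  90.0, 5.0],
          [0.6, 200.0, 2.0],
          [0.2,  60.0, 4.0],
      ])
      weights = np.array([0.5, 0.3, 0.2])
      p = np.array([0.5, 100.0, 3.0])      # preference thresholds per criterion

      n = scores.shape[0]
      phi_plus = np.zeros(n)
      phi_minus = np.zeros(n)
      for a in range(n):
          for b in range(n):
              if a == b:
                  continue
              # For minimization criteria, a is preferred to b when its score is lower.
              d = scores[b] - scores[a]
              pref = np.clip(d / p, 0.0, 1.0)          # linear preference function
              pi_ab = np.dot(weights, pref)
              phi_plus[a] += pi_ab / (n - 1)
              phi_minus[b] += pi_ab / (n - 1)

      phi_net = phi_plus - phi_minus
      ranking = np.argsort(-phi_net)
      print("ranking (best first):", ranking, "net flows:", np.round(phi_net, 3))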

  11. METHODOLOGY FOR SITING AMBIENT AIR MONITORS AT THE NEIGHBORHOOD SCALE

    EPA Science Inventory

    In siting a monitor to measure compliance with U.S. National Ambient Air Quality Standards for particulate matter (PM), there is a need to characterize variations in PM concentration within a neighborhood-scale region in order to achieve monitor siting objectives.

    We p...

  12. Highway runoff stormwater management potential (HRSMP) site characterization using NASA public domain imagery : research summary.

    DOT National Transportation Integrated Search

    2016-04-01

    The objectives of this research were to develop and utilize GST methodologies, including remote sensing, to characterize and determine the level of performance of stormwater management (SWM) facilities (BMPs), resulting in the reduction of highw...

  13. Wave Resource Characterization at US Wave Energy Converter (WEC) Test Sites

    NASA Astrophysics Data System (ADS)

    Dallman, A.; Neary, V. S.

    2016-02-01

    The US Department of Energy's (DOE) Marine and Hydrokinetic energy (MHK) Program is supporting a diverse research and development portfolio intended to accelerate commercialization of the marine renewable industry by improving technology performance, reducing market barriers, and lowering the cost of energy. Wave resource characterization at potential and existing wave energy converter (WEC) test sites and deployment locations contributes to this DOE goal by providing a catalogue of wave energy resource characteristics, met-ocean data, and site infrastructure information, developed utilizing a consistent methodology. The purpose of the catalogue is to enable the comparison of resource characteristics among sites to facilitate the selection of test sites that are most suitable for a developer's device and that best meet their testing needs and objectives. It also provides inputs for the design of WEC test devices and planning WEC tests, including the planning of deployment and operations and maintenance. The first edition included three sites: the Pacific Marine Energy Center (PMEC) North Energy Test Site (NETS) offshore of Newport, Oregon, the Kaneohe Bay Naval Wave Energy Test Site (WETS) offshore of Oahu, HI, and a potential site offshore of Humboldt Bay, CA (Eureka, CA). The second edition, recently completed, includes five additional sites: the Jennette's Pier Wave Energy Converter Test Site in North Carolina, the US Army Corps of Engineers (USACE) Field Research Facility (FRF), the PMEC Lake Washington site, the proposed PMEC South Energy Test Site (SETS), and the proposed CalWave Central Coast WEC Test Site. The operational sea states are included according to the IEC Technical Specification on wave energy resource assessment and characterization, with additional information on extreme sea states, weather windows, and representative spectra. The methodology and a summary of results will be discussed.
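
    One of the core quantities such a catalogue reports is the omnidirectional wave power of each sea state. As a small illustration (not taken from the catalogue), the sketch below evaluates the standard deep-water expression P = rho g^2 Hs^2 Te / (64 pi) for a few made-up sea states.

      # Deep-water omnidirectional wave power: P = rho * g^2 * Hs^2 * Te / (64 * pi)  [W per metre of crest]
      # Sea-state values below are illustrative, not from the DOE catalogue.
      import math

      RHO = 1025.0   # seawater density, kg/m^3
      G = 9.81       # gravity, m/s^2

      def wave_power_kw_per_m(hs_m: float, te_s: float) -> float:
          """Wave energy flux in kW per metre of wave crest for a given sea state."""
          p_w_per_m = RHO * G**2 * hs_m**2 * te_s / (64.0 * math.pi)
          return p_w_per_m / 1000.0

      sea_states = [(1.5, 7.0), (2.5, 9.0), (4.0, 11.0)]   # (Hs [m], Te [s]), hypothetical
      for hs, te in sea_states:
          print(f"Hs={hs} m, Te={te} s -> {wave_power_kw_per_m(hs, te):.1f} kW/m")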

  14. Seismic Hazard and Ground Motion Characterization at the Itoiz Dam (Northern Spain)

    NASA Astrophysics Data System (ADS)

    Rivas-Medina, A.; Santoyo, M. A.; Luzón, F.; Benito, B.; Gaspar-Escribano, J. M.; García-Jerez, A.

    2012-08-01

    This paper presents a new hazard-consistent ground motion characterization of the Itoiz dam site, located in Northern Spain. Firstly, we propose a methodology with different approximation levels to the expected ground motion at the dam site. Secondly, we apply this methodology taking into account the particular characteristics of the site and of the dam. Hazard calculations were performed following the Probabilistic Seismic Hazard Assessment method using a logic tree, which accounts for different seismic source zonings and different ground-motion attenuation relationships. The study was done in terms of peak ground acceleration and several spectral accelerations of periods coinciding with the fundamental vibration periods of the dam. In order to estimate these ground motions we consider two different dam conditions: when the dam is empty (T = 0.1 s) and when it is filled with water to its maximum capacity (T = 0.22 s). Additionally, seismic hazard analysis is done for two return periods: 975 years, related to the project earthquake, and 4,975 years, identified with an extreme event. Soil conditions were also taken into account at the site of the dam. Through the proposed methodology we deal with different forms of characterizing ground motion at the study site. In a first step, we obtain the uniform hazard response spectra for the two return periods. In a second step, a disaggregation analysis is done in order to obtain the controlling earthquakes that can affect the dam. Subsequently, we characterize the ground motion at the dam site in terms of specific response spectra for target motions defined by the expected values SA(T) for T = 0.1 and 0.22 s for the return periods of 975 and 4,975 years, respectively. Finally, synthetic acceleration time histories for earthquake events matching the controlling parameters are generated using the discrete wave-number method and subsequently analyzed. Because of the short relative distances between the controlling earthquakes and the dam site we considered finite sources in these computations. We conclude that directivity effects should be taken into account as an important variable in this kind of study of ground motion characteristics.
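
    The study's logic-tree PSHA is far more elaborate than anything reproducible here; purely to illustrate the hazard-curve integration step, the toy sketch below computes annual exceedance rates of PGA for a single areal source with a truncated Gutenberg-Richter recurrence and an invented lognormal ground-motion model. All rates, distances, and coefficients are hypothetical.

      # Toy probabilistic seismic hazard integration for one areal source.
      # Recurrence parameters, distances, and GMPE coefficients are hypothetical.
      import numpy as np
      from scipy.stats import norm

      nu = 0.05                                     # annual rate of M >= Mmin events
      m_min, m_max, beta = 4.5, 7.0, 2.0            # truncated exponential magnitude model
      mags = np.linspace(m_min, m_max, 60)
      f_m = beta * np.exp(-beta * (mags - m_min)) / (1 - np.exp(-beta * (m_max - m_min)))
      f_m /= np.trapz(f_m, mags)                    # normalize the discretized pdf

      dists = np.linspace(5.0, 100.0, 40)           # source-to-site distances (km)
      f_r = np.ones_like(dists) / np.trapz(np.ones_like(dists), dists)   # uniform distance pdf

      def gmpe_ln_pga(m, r):
          """Invented attenuation relation: mean ln(PGA in g)."""
          return -3.5 + 1.0 * m - 1.2 * np.log(r + 10.0)

      sigma_ln = 0.6
      pga_levels = np.logspace(-2, 0, 30)           # 0.01 g to 1 g

      rates = []
      for a in pga_levels:
          # P(PGA > a | m, r) over the magnitude-distance grid
          mm, rr = np.meshgrid(mags, dists, indexing="ij")
          p_exceed = norm.sf((np.log(a) - gmpe_ln_pga(mm, rr)) / sigma_ln)
          integrand = p_exceed * f_m[:, None] * f_r[None, :]
          rates.append(nu * np.trapz(np.trapz(integrand, dists, axis=1), mags))

      for a, lam in zip(pga_levels[::6], np.array(rates)[::6]):
          print(f"PGA > {a:.3f} g : annual rate {lam:.2e}  (return period ~ {1/lam:,.0f} yr)")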

  15. Accounting for geophysical information in geostatistical characterization of unexploded ordnance (UXO) sites.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saito, Hirotaka; Goovaerts, Pierre; McKenna, Sean Andrew

    2003-06-01

    Efficient and reliable unexploded ordnance (UXO) site characterization is needed for decisions regarding future land use. There are several types of data available at UXO sites and geophysical signal maps are one of the most valuable sources of information. Incorporation of such information into site characterization requires a flexible and reliable methodology. Geostatistics allows one to account for exhaustive secondary information (i.e., known at every location within the field) in many different ways. Kriging and logistic regression were combined to map the probability of occurrence of at least one geophysical anomaly of interest, such as UXO, from a limited number of indicator data. Logistic regression is used to derive the trend from a geophysical signal map, and kriged residuals are added to the trend to estimate the probabilities of the presence of UXO at unsampled locations (simple kriging with varying local means or SKlm). Each location is identified for further remedial action if the estimated probability is greater than a given threshold. The technique is illustrated using a hypothetical UXO site generated by a UXO simulator, and a corresponding geophysical signal map. Indicator data are collected along two transects located within the site. Classification performances are then assessed by computing proportions of correct classification, false positive, false negative, and Kappa statistics. Two common approaches, one of which does not take any secondary information into account (ordinary indicator kriging) and a variant of common cokriging (collocated cokriging), were used for comparison purposes. Results indicate that accounting for exhaustive secondary information improves the overall characterization of UXO sites if an appropriate methodology, SKlm in this case, is used.
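
    A rough sketch of the simple-kriging-with-varying-local-means (SKlm) idea described above, not the paper's implementation: a logistic-regression trend is derived from an exhaustive secondary signal and simple-kriged residuals of the indicator data are added back. All coordinates, signals, and variogram parameters below are synthetic.

      # SKlm sketch: logistic-regression trend from a geophysical signal + simple kriging of residuals.
      # All data, the exponential covariance, and its parameters are synthetic.
      import numpy as np
      from scipy.spatial.distance import cdist
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      xy = rng.uniform(0, 100, size=(40, 2))                 # sampled transect locations
      signal = rng.normal(size=40)                           # secondary geophysical signal at samples
      indicator = (signal + 0.5 * rng.normal(size=40) > 0).astype(int)   # 1 = anomaly present

      # 1) Trend: probability of anomaly given the signal (known exhaustively).
      trend_model = LogisticRegression().fit(signal.reshape(-1, 1), indicator)
      p_trend = trend_model.predict_proba(signal.reshape(-1, 1))[:, 1]
      residual = indicator - p_trend

      # 2) Simple kriging of residuals with an exponential covariance (residual mean taken as 0).
      sill, range_a, nugget = 0.15, 30.0, 0.02
      def cov(h):
          return np.where(h == 0, sill + nugget, sill * np.exp(-3.0 * h / range_a))

      C = cov(cdist(xy, xy))
      targets = rng.uniform(0, 100, size=(5, 2))             # unsampled locations to classify
      signal_t = rng.normal(size=5)                          # exhaustive signal is known there too
      c0 = cov(cdist(targets, xy))
      weights = np.linalg.solve(C, c0.T).T                   # simple-kriging weights

      p_hat = trend_model.predict_proba(signal_t.reshape(-1, 1))[:, 1] + weights @ residual
      p_hat = np.clip(p_hat, 0.0, 1.0)
      flag = p_hat > 0.4                                      # threshold for further remedial action
      print(np.round(p_hat, 2), flag)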

  16. Microseismic techniques for avoiding induced seismicity during fluid injection

    DOE PAGES

    Matzel, Eric; White, Joshua; Templeton, Dennise; ...

    2014-01-01

    The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  17. CHARACTERIZATION OF GROUNDWATER SAMPLES FROM SUPERFUND SITES BY GAS CHROMATOGRAPHY/MASS SPECTROMETRY AND LIQUID CHROMATOGRAPHY/MASS SPECTROMETRY.

    EPA Science Inventory

    Groundwater at or near Superfund sites often contains much organic matter, as indicated by total organic carbon (TOC) measurements. Analyses by standard GC and GC/MS methodology often miss the more polar or nonvolatile of these organic compounds. The identification of the highly p...

  18. Technical bases and guidance for the use of composite soil sampling for demonstrating compliance with radiological release criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vitkus, Timothy J.

    2012-04-24

    This guidance provides information on methodologies and the technical bases that licensees should consider for incorporating composite sampling strategies into final status survey (FSS) plans. In addition, this guidance also includes appropriate uses of composite sampling for generating the data for other decommissioning site investigations such as characterization or other preliminary site investigations.

  19. Development of a Probabilistic Tornado Wind Hazard Model for the Continental United States Volume I: Main Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boissonnade, A; Hossain, Q; Kimball, J

    Since the mid-1980s, assessment of the wind and tornado risks at the Department of Energy (DOE) high and moderate hazard facilities has been based on the straight wind/tornado hazard curves given in UCRL-53526 (Coats, 1985). These curves were developed using a methodology that utilized a model, developed by McDonald, for severe winds at sub-tornado wind speeds and a separate model, developed by Fujita, for tornado wind speeds. For DOE sites not covered in UCRL-53526, wind and tornado hazard assessments are based on the criteria outlined in DOE-STD-1023-95 (DOE, 1996), utilizing the methodology in UCRL-53526. Subsequent to the publication of UCRL-53526, in a study sponsored by the Nuclear Regulatory Commission (NRC), the Pacific Northwest Laboratory developed tornado wind hazard curves for the contiguous United States, NUREG/CR-4461 (Ramsdell, 1986). Because of the different modeling assumptions and underlying data used to develop the tornado wind information, the wind speeds at specified exceedance levels, at a given location, based on the methodology in UCRL-53526, are different than those based on the methodology in NUREG/CR-4461. In 1997, Lawrence Livermore National Laboratory (LLNL) was funded by the DOE to review the current methodologies for characterizing tornado wind hazards and to develop a state-of-the-art wind/tornado characterization methodology based on probabilistic hazard assessment techniques and current historical wind data. This report describes the process of developing the methodology and the database of relevant tornado information needed to implement the methodology. It also presents the tornado wind hazard curves obtained from the application of the method to DOE sites throughout the contiguous United States.

  20. Methodology of management of dredging operations II. Applications.

    PubMed

    Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D

    2006-04-01

    This paper presents a new methodology for the management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the strengths and complementarities of the previous approaches. The methodology has been applied at the Port of Dunkirk (France). A characterization of the sediments of this site allowed the Port to be divided into zones of probable sediment homogeneity. Moreover, sources of pollution have been identified, with a view to prevention. Options for improving the value of the dredged material have also been developed to meet regional needs, from a perspective of competitive and territorial intelligence. Their development has required a pooling of resources between professionals, research centres and local communities, following principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool has been used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.

  1. Central Heating Plant site characterization report, Marine Corps Combat Development Command, Quantico, Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-08-01

    This report presents the methodology and results of a characterization of the operation and maintenance (O&M) environment at the US Marine Corps (USMC) Quantico, Virginia, Central Heating Plant (CHP). This characterization is part of a program intended to provide the O&M staff with a computerized artificial intelligence (AI) decision support system that will assist the plant staff in more efficient operation of their plant. 3 refs., 12 figs.

  2. Characterization and management of electrical noise in the new Australian military HF communication network

    NASA Astrophysics Data System (ADS)

    Vyden, Bruce

    2000-03-01

    The Australian Defense Force's (ADF's) High Frequency (HF) communication network is soon to be replaced by a modernized system. Characterization of electrical noise at the receiver sites proposed for the new system is crucial to its performance. Consequently receiver site noise will be measured under the HF Modernization implementation contract that was awarded to Boeing Australia Ltd. Unfortunately the utility of the noise measurements is constrained by the uncertainties of both the ionosphere and atmosphere. This paper discusses some of the issues related to the methodology for measuring the noise and exposes some unresolved issues.

  3. Evaluation of Suitability of Selected Set of Department of Defense Military Bases and Department of Energy Facilities for Siting a Small Modular Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poore III, Willis P; Belles, Randy; Mays, Gary T

    This report summarizes the approach that ORNL developed for screening a sample set of US Department of Defense (DOD) military base sites and DOE sites for possible powering with an SMR; the methodology employed, including spatial modeling; and initial results for several sample sites. The objective in conducting this type of siting evaluation is to demonstrate the capability to characterize specific DOD and DOE sites to identify any particular issues associated with powering the sites with an SMR using OR-SAGE; it is not intended to be a definitive assessment per se as to the absolute suitability of any particular site.

  4. A NEW APPROACH AND METHODOLOGIES FOR CHARACTERIZING THE HYDROGEOLOGIC PROPERTIES OF AQUIFERS

    EPA Science Inventory

    In the authors' opinion, the ability of hydrologists to perform field measurements of aquifer hydraulic properties must be enhanced if we are to improve significantly our capacity to solve ground water contamination problems at Superfund and other sites. Therefore, the primar...

  5. NEW APPROACH AND METHODOLOGIES FOR CHARACTERIZING THE HYDROGEOLOGIC PROPERTIES OF AQUIFERS

    EPA Science Inventory

    In the authors' opinion, the ability of hydrologists to perform field measurements of aquifer hydraulic properties must be enhanced if we are to improve significantly our capacity to solve ground water contamination problems at Superfund and other sites. Therefore, the primary pur...

  6. A NEW APPROACH AND METHODOLOGIES FOR CHARACTERIZING THE HYDROGEOLOGIC PROPERTIES OF AQUIFERS

    EPA Science Inventory

    In the authors' opinion, the ability of hydrologists to perform field measurements of aquifer hydraulic properties must be enhanced if we are to improve significantly our capacity to solve ground water contamination problems at Superfund and other sites. Therefore, the primary pu...

  7. Precipitable water vapour content from ESR/SKYNET sun-sky radiometers: validation against GNSS/GPS and AERONET over three different sites in Europe

    NASA Astrophysics Data System (ADS)

    Campanelli, Monica; Mascitelli, Alessandra; Sanò, Paolo; Diémoz, Henri; Estellés, Victor; Federico, Stefano; Iannarelli, Anna Maria; Fratarcangeli, Francesca; Mazzoni, Augusto; Realini, Eugenio; Crespi, Mattia; Bock, Olivier; Martínez-Lozano, Jose A.; Dietrich, Stefano

    2018-01-01

    The estimation of the precipitable water vapour content (W) with high temporal and spatial resolution is of great interest to both meteorological and climatological studies. Several methodologies based on remote sensing techniques have been recently developed in order to obtain accurate and frequent measurements of this atmospheric parameter. Among them, the relative low cost and easy deployment of sun-sky radiometers, or sun photometers, operating in several international networks, allowed the development of automatic estimations of W from these instruments with high temporal resolution. However, the great problem of this methodology is the estimation of the sun-photometric calibration parameters. The objective of this paper is to validate a new methodology based on the hypothesis that the calibration parameters characterizing the atmospheric transmittance at 940 nm are dependent on vertical profiles of temperature, air pressure and moisture typical of each measurement site. To obtain the calibration parameters some simultaneously seasonal measurements of W, from independent sources, taken over a large range of solar zenith angle and covering a wide range of W, are needed. In this work yearly GNSS/GPS datasets were used for obtaining a table of photometric calibration constants and the methodology was applied and validated in three European ESR-SKYNET network sites, characterized by different atmospheric and climatic conditions: Rome, Valencia and Aosta. Results were validated against the GNSS/GPS and AErosol RObotic NETwork (AERONET) W estimations. In both validations the agreement was very high, with a percentage RMSD of about 6, 13 and 8 % for the GPS intercomparison at Rome, Aosta and Valencia, respectively, and of 8 % for the AERONET comparison in Valencia. Analysing the results by W classes, the present methodology was found to clearly improve W estimation at low W content when compared against AERONET in terms of percentage bias, bringing the agreement with GPS (taken as the reference) from a bias of 5.76 % to 0.52 %.
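
    As an illustration of the calibration idea only (not the ESR/SKYNET processing chain), the sketch below fits the two coefficients a and b of a water-vapour transmittance model T = exp(-a (m W)^b) against coincident GNSS/GPS precipitable water, then inverts the fitted model to retrieve W; all input arrays are fabricated.

      # Fit water-vapour transmittance coefficients a, b in T = exp(-a * (m*W)**b)
      # against coincident GNSS/GPS precipitable water. All numbers are fabricated.
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)
      m = rng.uniform(1.0, 5.0, 200)          # optical air mass over a range of solar zenith angles
      W = rng.uniform(0.3, 4.0, 200)          # GNSS/GPS precipitable water (cm)

      a_true, b_true = 0.15, 0.55             # "unknown" coefficients used to synthesize data
      T_obs = np.exp(-a_true * (m * W) ** b_true) * (1 + 0.01 * rng.normal(size=200))

      def transmittance(x, a, b):
          m_, w_ = x
          return np.exp(-a * (m_ * w_) ** b)

      (a_fit, b_fit), _ = curve_fit(transmittance, (m, W), T_obs, p0=(0.1, 0.5))
      print(f"a = {a_fit:.3f}, b = {b_fit:.3f}")

      # Once calibrated, W can be retrieved from a measured transmittance and air mass:
      W_retrieved = (-np.log(T_obs) / a_fit) ** (1.0 / b_fit) / m
      print("mean retrieval error (cm):", np.round(np.mean(np.abs(W_retrieved - W)), 3))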

  8. U.S. Department of Energy's site screening, site selection, and initial characterization for storage of CO2 in deep geological formations

    USGS Publications Warehouse

    Rodosta, T.D.; Litynski, J.T.; Plasynski, S.I.; Hickman, S.; Frailey, S.; Myer, L.

    2011-01-01

    The U.S. Department of Energy (DOE) is the lead Federal agency for the development and deployment of carbon sequestration technologies. As part of its mission to facilitate technology transfer and develop guidelines from lessons learned, DOE is developing a series of best practice manuals (BPMs) for carbon capture and storage (CCS). The "Site Screening, Site Selection, and Initial Characterization for Storage of CO2 in Deep Geological Formations" BPM is a compilation of best practices and includes flowchart diagrams illustrating the general decision making process for Site Screening, Site Selection, and Initial Characterization. The BPM integrates the knowledge gained from various programmatic efforts, with particular emphasis on the Characterization Phase through pilot-scale CO2 injection testing of the Validation Phase of the Regional Carbon Sequestration Partnership (RCSP) Initiative. Key geologic and surface elements that suitable candidate storage sites should possess are identified, along with example Site Screening, Site Selection, and Initial Characterization protocols for large-scale geologic storage projects located across diverse geologic and regional settings. This manual has been written as a working document, establishing a framework and methodology for proper site selection for CO2 geologic storage. This will be useful for future CO2 emitters, transporters, and storage providers. It will also be of use in informing local, regional, state, and national governmental agencies of best practices in proper sequestration site selection. Furthermore, it will educate the inquisitive general public on options and processes for geologic CO2 storage. In addition to providing best practices, the manual presents a geologic storage resource and capacity classification system. The system provides a "standard" to communicate storage and capacity estimates, uncertainty and project development risk, data guidelines and analyses for adequate site characterization, and guidelines for reporting estimates within the classification based on each project's status. 

  9. Thermodynamic analysis of water molecules at the surface of proteins and applications to binding site prediction and characterization.

    PubMed

    Beuming, Thijs; Che, Ye; Abel, Robert; Kim, Byungchan; Shanmugasundaram, Veerabahu; Sherman, Woody

    2012-03-01

    Water plays an essential role in determining the structure and function of all biological systems. Recent methodological advances allow for an accurate and efficient estimation of the thermodynamic properties of water molecules at the surface of proteins. In this work, we characterize these thermodynamic properties and relate them to various structural and functional characteristics of the protein. We find that high-energy hydration sites often exist near protein motifs typically characterized as hydrophilic, such as backbone amide groups. We also find that waters around alpha helices and beta sheets tend to be less stable than waters around loops. Furthermore, we find no significant correlation between the hydration-site free energy and the solvent-accessible surface area of the site. In addition, we find that the distribution of high-energy hydration sites on the protein surface can be used to identify the location of binding sites and that binding sites of druggable targets tend to have a greater density of thermodynamically unstable hydration sites. Using this information, we characterize the FKBP12 protein and show good agreement between fragment screening hit rates from NMR spectroscopy and hydration site energetics. Finally, we show that water molecules observed in crystal structures are less stable on average than bulk water as a consequence of the high degree of spatial localization, thereby resulting in a significant loss in entropy. These findings should help to better understand the characteristics of waters at the surface of proteins and are expected to lead to insights that can guide structure-based drug design efforts. Copyright © 2011 Wiley Periodicals, Inc.

  10. An Evaluation of Diet and Physical Activity Messaging in African American Churches

    ERIC Educational Resources Information Center

    Harmon, Brook E.; Blake, Christine E.; Thrasher, James F.; Hébert, James R.

    2014-01-01

    The use of faith-based organizations as sites to deliver diet and physical activity interventions is increasing. Methods to assess the messaging environment within churches are limited. Our research aimed to develop and test an objective assessment methodology to characterize health messages, particularly those related to diet and physical…

  11. Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site

    NASA Astrophysics Data System (ADS)

    Albarello, D.; Mucciarelli, M.

    A new approach to seismic hazard estimation is proposed, based on documentary data concerning the local history of seismic effects. The adopted methodology allows the use of "poor" data, such as macroseismic observations, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated on instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.

  12. Quantification and regionalization of groundwater recharge in South-Central Kansas: Integrating field characterization, statistical analysis, and GIS

    USGS Publications Warehouse

    Sophocleous, M.

    2000-01-01

    A practical methodology for recharge characterization was developed based on several years of field-oriented research at 10 sites in the Great Bend Prairie of south-central Kansas. This methodology combines the soil-water budget on a storm-by-storm, year-round basis with the resulting watertable rises. The estimated 1985-1992 average annual recharge was less than 50 mm/year, with a range from 15 mm/year (during the 1998 drought) to 178 mm/year (during the 1993 flood year). Most of this recharge occurs during the spring months. To regionalize these site-specific estimates, an additional methodology based on multiple (forward) regression analysis combined with classification and GIS overlay analyses was developed and implemented. The multiple regression analysis showed that the most influential variables were, in order of decreasing importance, total annual precipitation, average maximum springtime soil-profile water storage, average shallowest springtime depth to watertable, and average springtime precipitation rate. Therefore, four GIS (ARC/INFO) data "layers" or coverages were constructed for the study region based on these four variables, and each such coverage was classified into the same number of data classes to avoid biasing the results. The normalized regression coefficients were employed to weigh the class rankings of each recharge-affecting variable. This approach resulted in recharge zonations that agreed well with the site recharge estimates. During the "Great Flood of 1993," when rainfall totals exceeded normal levels by ~200% in the northern portion of the study region, the developed regionalization methodology was tested against such extreme conditions, and proved to be both practical, based on readily available or easily measurable data, and robust. It was concluded that the combination of multiple regression and GIS overlay analyses is a powerful and practical approach to regionalizing small samples of recharge estimates.
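
    A schematic of the regionalization step described above (not the author's GIS implementation): fit the multiple regression at the sites, normalize the coefficients, and use them to weight class rankings of gridded layers of the four variables. All grids and values below are synthetic.

      # Regionalization sketch: multiple regression on site data, then a weighted overlay of
      # classified grids of the four explanatory variables. All data are synthetic.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(2)

      # Site-scale regression: recharge vs. the four explanatory variables (10 sites, scaled units).
      X_sites = rng.uniform(size=(10, 4))   # [annual precip, spring soil storage, spring depth to watertable, spring precip rate]
      y_recharge = X_sites @ np.array([0.6, 0.25, 0.1, 0.05]) + 0.02 * rng.normal(size=10)
      coef = LinearRegression().fit(X_sites, y_recharge).coef_
      weights = np.abs(coef) / np.abs(coef).sum()      # normalized regression coefficients

      # Regional grids of the same four variables, each classified into 5 equal classes.
      grids = rng.uniform(size=(4, 50, 50))
      n_classes = 5
      ranked = np.stack([np.digitize(g, np.quantile(g, np.linspace(0, 1, n_classes + 1)[1:-1])) + 1
                         for g in grids])              # class ranks 1..5 per layer

      recharge_zone_score = np.tensordot(weights, ranked, axes=1)   # weighted overlay
      print("score range:", recharge_zone_score.min().round(2), "-", recharge_zone_score.max().round(2))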

  13. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  14. Insights from native mass spectrometry approaches for top- and middle- level characterization of site-specific antibody-drug conjugates

    PubMed Central

    Botzanowski, Thomas; Erb, Stéphane; Hernandez-Alba, Oscar; Ehkirch, Anthony; Colas, Olivier; Wagner-Rousset, Elsa; Rabuka, David; Beck, Alain; Drake, Penelope M.; Cianférani, Sarah

    2017-01-01

    Antibody-drug conjugates (ADCs) have emerged as a family of compounds with promise as efficient immunotherapies. First-generation ADCs were generated mostly via reactions on either lysine side-chain amines or cysteine thiol groups after reduction of the interchain disulfide bonds, resulting in heterogeneous populations with a variable number of drug loads per antibody. To control the position and the number of drug loads, new conjugation strategies aiming at the generation of more homogeneous site-specific conjugates have been developed. We report here the first multi-level characterization of a site-specific ADC by state-of-the-art mass spectrometry (MS) methods, including native MS and its hyphenation to ion mobility (IM-MS). We demonstrate the versatility of native MS methodologies for site-specific ADC analysis, with the unique ability to provide several critical quality attributes within one single run, along with a direct snapshot of ADC homogeneity/heterogeneity without extensive data interpretation. The capabilities of native IM-MS to directly access site-specific ADC conformational information are also highlighted. Finally, the potential of these techniques for assessing an ADC's heterogeneity/homogeneity is illustrated by comparing the analytical characterization of a site-specific DAR4 ADC to that of first-generation ADCs. Altogether, our results highlight the compatibility, versatility, and benefits of native MS approaches for the analytical characterization of all types of ADCs, including site-specific conjugates. Thus, we envision integrating native MS and IM-MS approaches, even in their latest state-of-the-art forms, into workflows that benchmark bioconjugation strategies. PMID:28406343
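
    One routine quantity extracted from such deconvoluted native-MS spectra is the average drug-to-antibody ratio (DAR). The minimal sketch below (not the authors' workflow) computes it from hypothetical relative peak intensities per drug load.

      # Average drug-to-antibody ratio (DAR) from deconvoluted native-MS peak intensities.
      # The intensity values per drug load are hypothetical.
      intensities = {0: 5.0, 1: 12.0, 2: 40.0, 3: 18.0, 4: 55.0, 6: 8.0, 8: 2.0}  # drug load -> relative intensity

      total = sum(intensities.values())
      dar_avg = sum(load * i for load, i in intensities.items()) / total
      distribution = {load: round(i / total, 3) for load, i in intensities.items()}

      print(f"average DAR = {dar_avg:.2f}")
      print("drug-load distribution:", distribution)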

  15. Assessing Freshwater Ecosystem Service Risk over Ecological, Socioeconomic, and Cultural Gradients: Problem Space Characterization and Methodology

    NASA Astrophysics Data System (ADS)

    Harmon, T. C.; Villamizar, S. R.; Conde, D.; Rusak, J.; Reid, B.; Astorga, A.; Perillo, G. M.; Piccolo, M. C.; Zilio, M.; London, S.; Velez, M.; Hoyos, N.; Escobar, J.

    2014-12-01

    Freshwater ecosystems and the services they provide are under increasing anthropogenic pressure at local (e.g., irrigation diversions, wastewater discharge) and global scales (e.g., climate change, global trading). The impact depends on an ecosystem's sensitivity, which is determined by its geophysical and ecological settings, and the population and activities in its surrounding watershed. Given the importance of ecosystem services, it is critical that we improve our ability to identify and understand changes in aquatic ecosystems, and translate them to risk of service loss. Furthermore, to inspire changes in human behavior, it is equally critical that we learn to communicate risk, and pose risk mitigation strategies, in a manner acceptable to a broad spectrum of stakeholders. Quantifying the nature and timing of the risk is difficult because (1) we often fail to understand the connection between anthropogenic pressures and the timing and extent of ecosystem changes; and (2) the concept of risk is inherently coupled to human perception, which generally differs with cultural and socio-economic conditions. In this study, we endeavor to assess aquatic ecosystem risks across an international array of six study sites. The challenge is to construct a methodology capable of capturing the marked biogeographical, socioeconomic, and cultural differences among the sites, which include: (1) Muskoka River watershed in humid continental Ontario, Canada; (2) Lower San Joaquin River, an impounded snow-fed river in semi-arid Central California; (3) Ciénaga Grande de Santa Marta, a tropical coastal lagoon in Colombia; (4) Senguer River basin in the semi-arid part of Argentina; (5) Laguna de Rocha watershed in humid subtropical Uruguay; and (6) Palomas Lake complex in oceanic Chilean Patagonia. Results will include a characterization of the experimental gradient over the six sites, an overview of the risk assessment methodology, and preliminary findings for several of the sites.

  16. Online catalog of world-wide test sites for the post-launch characterization and calibration of optical sensors

    USGS Publications Warehouse

    Chander, G.; Christopherson, J.B.; Stensaas, G.L.; Teillet, P.M.

    2007-01-01

    In an era when the number of Earth-observing satellites is rapidly growing and measurements from these sensors are used to answer increasingly urgent global issues, it is imperative that scientists and decision-makers can rely on the accuracy of Earth-observing data products. The characterization and calibration of these sensors are vital to achieve an integrated Global Earth Observation System of Systems (GEOSS) for coordinated and sustained observations of Earth. The U.S. Geological Survey (USGS), as a supporting member of the Committee on Earth Observation Satellites (CEOS) and GEOSS, is working with partners around the world to establish an online catalog of prime candidate test sites for the post-launch characterization and calibration of space-based optical imaging sensors. The online catalog provides easy public Web site access to this vital information for the global community. This paper describes the catalog, the test sites, and the methodologies to use the test sites. It also provides information regarding access to the online catalog and plans for further development of the catalog in cooperation with calibration specialists from agencies and organizations around the world. Through greater access to and understanding of these vital test sites and their use, the validity and utility of information gained from Earth remote sensing will continue to improve. Copyright IAF/IAA. All rights reserved.

  17. Diversity of Staphylococcus aureus strains colonizing various niches of the human body.

    PubMed

    Muenks, Carol E; Hogan, Patrick G; Wang, Jeffrey W; Eisenstein, Kimberly A; Burnham, Carey-Ann D; Fritz, Stephanie A

    2016-06-01

    As individuals may be colonized with multiple strains of Staphylococcus aureus at different body sites, the objectives of this study were to determine whether S. aureus polyclonal colonization exists within one body niche and the optimal sampling sites and culture methodology to capture the diversity of S. aureus strains in community-dwelling individuals. Swabs were collected from the nares, axillae, and inguinal folds of 3 children with community-associated S. aureus infections and 11 household contacts, all with known S. aureus colonization. S. aureus isolates were recovered from each body niche using 4 culture methods and evaluated for polyclonality using phenotypic and genotypic strain characterization methodologies. Within individuals, the mean (range) number of phenotypes and genotypes was 2.4 (1-4) and 3.1 (1-6), respectively. Six (43%) and 10 (71%) participants exhibited phenotypic and genotypic polyclonality within one body niche, respectively. Broth enrichment yielded the highest analytical sensitivity for S. aureus recovery, while direct plating to blood agar yielded the highest genotypic strain diversity. This study revealed S. aureus polyclonality within a single body niche. Culture methodology and sampling sites influenced the analytical sensitivity of S. aureus colonization detection and the robustness of phenotypic and genotypic strain recovery. Copyright © 2016 The British Infection Association. Published by Elsevier Ltd. All rights reserved.

  18. Ligand-Assisted Protein Structure (LAPS): An Experimental Paradigm for Characterizing Cannabinoid-Receptor Ligand-Binding Domains.

    PubMed

    Janero, David R; Korde, Anisha; Makriyannis, Alexandros

    2017-01-01

    Detailed characterization of the ligand-binding motifs and structure-function correlates of the principal GPCRs of the endocannabinoid-signaling system, the cannabinoid 1 (CB1R) and cannabinoid 2 (CB2R) receptors, is essential to inform the rational design of drugs that modulate CB1R- and CB2R-dependent biosignaling for therapeutic gain. We discuss herein an experimental paradigm termed "ligand-assisted protein structure" (LAPS) that affords a means of characterizing, at the amino acid level, CB1R and CB2R structural features key to ligand engagement and receptor-dependent information transmission. For this purpose, LAPS integrates three key disciplines and methodologies: (a) medicinal chemistry: design and synthesis of high-affinity, pharmacologically active probes as reporters capable of reacting irreversibly with particular amino acids at (or in the immediate vicinity of) the ligand-binding domain of the functionally active receptor; (b) molecular and cellular biology: introduction of discrete, conservative point mutations into the target GPCR and determination of their effect on probe binding and pharmacological activity; (c) analytical chemistry: identification of the site(s) of probe-GPCR interaction through focused, bottom-up, amino acid-level proteomic identification of the probe-receptor complex using liquid chromatography tandem mass spectrometry. Subsequent in silico methods including ligand docking and computational modeling provide supplementary data on the probe-receptor interaction as defined by LAPS. Examples of LAPS as applied to human CB2R orthosteric binding site characterization for a biarylpyrazole antagonist/inverse agonist and a classical cannabinoid agonist belonging to distinct chemical classes of cannabinergic compounds are given as paradigms for further application of this methodology to other therapeutic protein targets. LAPS is well positioned to complement other experimental and in silico methods in contemporary structural biology such as X-ray crystallography. © 2017 Elsevier Inc. All rights reserved.

  19. Advances in Structural Integrity Analysis Methods for Aging Metallic Airframe Structures with Local Damage

    NASA Technical Reports Server (NTRS)

    Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.

    2003-01-01

    Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity-induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
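
    The paper's methodology rests on small-crack theory and a plasticity-induced crack-closure model; purely as a simplified stand-in that shows the shape of such a calculation, the sketch below integrates a basic Paris-law crack-growth relation from an initial flaw at a rivet hole to a chosen final size, with invented material constants.

      # Simplified Paris-law fatigue-crack-growth integration (a stand-in, not the
      # small-crack / crack-closure methodology of the paper). Constants are invented.
      import math

      C = 1.0e-11          # Paris coefficient, (m/cycle)/(MPa*sqrt(m))^n
      n_exp = 3.0          # Paris exponent
      delta_sigma = 90.0   # cyclic stress range, MPa
      Y = 1.12             # geometry factor, treated as constant for this sketch

      a = 0.5e-3           # initial crack length, m
      a_final = 10.0e-3    # final crack length, m
      cycles = 0.0
      da = 1.0e-5          # integration step in crack length, m

      while a < a_final:
          delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # stress-intensity range, MPa*sqrt(m)
          dadN = C * delta_K ** n_exp                          # crack growth per cycle
          cycles += da / dadN
          a += da

      print(f"~{cycles:,.0f} cycles to grow from 0.5 mm to 10 mm (illustrative only)")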

  20. Soil Characterization and Site Response of Marine and Continental Environments

    NASA Astrophysics Data System (ADS)

    Contreras-Porras, R. S.; Huerta-Lopez, C. I.; Martinez-Cruzado, J. A.; Gaherty, J. B.; Collins, J. A.

    2009-05-01

    An in situ soil properties study was conducted to characterize both the site and the shallow layer sediments under marine and continental environments. Data from the SCoOBA (Sea of Cortez Ocean Bottom Array) seismic experiment and on-land ambient vibration measurements in the urban areas of Tijuana, B. C., and Ensenada, B. C., Mexico were used in the analysis. The goal of this investigation is to identify and analyze the effect of the physical/geotechnical properties of the ground on the site response upon seismic excitations in both marine and continental environments. The time series comprised earthquakes and background noise recorded within the interval of 10/2005 to 10/2006 in the Gulf of California (GoC) with very-broadband Ocean Bottom Seismographs (OBS), and ambient vibration measurements collected during different time periods in the Tijuana and Ensenada urban areas. The data processing and analysis was conducted by means of the H/V Spectral Ratios (HVSPR) of multi-component data, the Random Decrement Method (RDM), and Blind Deconvolution (BD). This study presents ongoing results of a long-term project to characterize the local site response of soil layers upon dynamic excitations using digital signal processing algorithms on time series, as well as a comparison of the results provided by these methodologies.
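
    As a compact illustration of the H/V spectral-ratio step mentioned above (not the project's processing chain), the sketch below estimates HVSR from three-component ambient-vibration records using Welch power spectra; the input traces here are synthetic noise.

      # H/V spectral ratio sketch from three-component ambient vibration records.
      # The traces here are synthetic noise; a real run would read field recordings.
      import numpy as np
      from scipy.signal import welch

      fs = 100.0                      # sampling rate, Hz
      rng = np.random.default_rng(3)
      ns, ew, ud = (rng.normal(size=60 * int(fs)) for _ in range(3))   # 60 s of noise per component

      f, p_ns = welch(ns, fs=fs, nperseg=1024)
      _, p_ew = welch(ew, fs=fs, nperseg=1024)
      _, p_ud = welch(ud, fs=fs, nperseg=1024)

      # Horizontal spectrum as the quadratic mean of the two horizontal components.
      h = np.sqrt((p_ns + p_ew) / 2.0)
      v = np.sqrt(p_ud)
      hvsr = h / v

      band = (f >= 0.5) & (f <= 20.0)
      f0 = f[band][np.argmax(hvsr[band])]
      print(f"apparent fundamental frequency ~ {f0:.2f} Hz (peak HVSR = {hvsr[band].max():.2f})")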

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities coupled with innovative deployment, processing, and analysis methodologies to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization and in full 3D finite difference modeling as well as statistical characterization of geological heterogeneity. Such capabilities coupled with a rapid field analysis methodology based on matched field processing are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project has been active, numerous and varied developments and milestones have been accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking and based on a field calibration to characterize geological heterogeneity was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capabilities to detect and locate in-tunnel explosions for mine safety and other applications.

  2. Rational analyses of information foraging on the web.

    PubMed

    Pirolli, Peter

    2005-05-06

    This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive models that approach the realization of those solutions. Navigation choice is modeled as a random utility model that uses spreading activation mechanisms that link proximal cues (information scent) that occur in Web browsers to internal user goals. Web-site leaving is modeled as an ongoing assessment by the Web user of the expected benefits of continuing at a Web site as opposed to going elsewhere. These cost-benefit assessments are also based on spreading activation models of information scent. Evaluations include a computational model of Web user behavior called Scent-Based Navigation and Information Foraging in the ACT Architecture, and the Law of Surfing, which characterizes the empirical distribution of the length of paths of visitors at a Web site. 2005 Lawrence Erlbaum Associates, Inc.
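
    A minimal sketch of the random-utility navigation-choice idea (not the SNIF-ACT model itself): link-choice probabilities are obtained from a softmax over information-scent scores, and site leaving is framed as a comparison against an assumed outside utility. The scores and temperature are made up.

      # Random-utility navigation choice: softmax over information-scent scores for the
      # links on a page. Scores and the noise temperature are illustrative.
      import math

      def choice_probabilities(scent_scores, temperature=1.0):
          """Probability of following each link given its information-scent utility."""
          exps = [math.exp(s / temperature) for s in scent_scores]
          z = sum(exps)
          return [e / z for e in exps]

      links = {"product specs": 2.3, "contact us": 0.4, "pricing": 1.8, "careers": 0.1}
      probs = choice_probabilities(list(links.values()), temperature=0.7)
      for (label, scent), p in zip(links.items(), probs):
          print(f"{label:13s} scent={scent:.1f} -> P(choose)={p:.2f}")

      # A leave-the-site decision can be framed the same way: continue while the best
      # available scent exceeds the (assumed) utility of going elsewhere.
      leave_utility = 0.8
      stay = max(links.values()) > leave_utility
      print("continue at this site:", stay)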

  3. Novel methodology to characterize electromagnetic exposure of the brain

    NASA Astrophysics Data System (ADS)

    Crespo-Valero, Pedro; Christopoulou, Maria; Zefferer, Marcel; Christ, Andreas; Achermann, Peter; Nikita, Konstantina S.; Kuster, Niels

    2011-01-01

    Due to the greatly non-uniform field distribution induced in brain tissues by radio frequency electromagnetic sources, the exposure of anatomical and functional regions of the brain may be a key issue in interpreting laboratory findings and epidemiological studies concerning endpoints related to the central nervous system. This paper introduces the Talairach atlas in characterization of the electromagnetic exposure of the brain. A hierarchical labeling scheme is mapped onto high-resolution human models. This procedure is fully automatic and allows identification of over a thousand different sites all over the brain. The electromagnetic absorption can then be extracted and interpreted in every region or combination of regions in the brain, depending on the characterization goals. The application examples show how this methodology enhances the dosimetry assessment of the brain based on results obtained by either finite difference time domain simulations or measurements delivered by test compliance dosimetry systems. Applications include, among others, the detailed dosimetric analysis of the exposure of the brain during cell phone use, improved design of exposure setups for human studies or medical diagnostic and therapeutic devices using electromagnetic fields or ultrasound.
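
    As a small illustration of the region-wise exposure extraction this methodology enables (not the authors' pipeline), the sketch below averages a gridded SAR volume over an integer label map of brain regions; both arrays and the region names are synthetic stand-ins.

      # Region-wise averaging of a SAR volume over an anatomical label map.
      # Both the SAR grid and the label volume are synthetic stand-ins.
      import numpy as np

      rng = np.random.default_rng(4)
      sar = rng.gamma(shape=2.0, scale=0.05, size=(40, 40, 40))     # W/kg, synthetic
      labels = rng.integers(0, 5, size=(40, 40, 40))                # 0 = outside brain, 1..4 = regions

      region_names = {1: "frontal", 2: "temporal", 3: "parietal", 4: "occipital"}
      flat_sar, flat_lab = sar.ravel(), labels.ravel()

      sums = np.bincount(flat_lab, weights=flat_sar, minlength=5)
      counts = np.bincount(flat_lab, minlength=5)
      mean_sar = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)

      for lab, name in region_names.items():
          print(f"{name:9s}: mean SAR = {mean_sar[lab]:.3f} W/kg over {counts[lab]} voxels")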

  4. Thermodynamics of camphor migration in cytochrome P450cam by atomistic simulations.

    PubMed

    Rydzewski, J; Nowak, W

    2017-08-10

    Understanding the mechanisms of ligand binding to enzymes is of paramount importance for the design of new drugs. Here, we report on the use of a novel biased molecular dynamics (MD) methodology to study the mechanism of camphor binding to cytochrome P450cam. Microsecond-long MD simulations allowed us to observe reaction coordinates characterizing ligand diffusion from the active site of cytochrome P450cam to solvent via three egress routes. These atomistic simulations were used to estimate thermodynamic quantities along the reaction coordinates and indicate diverse binding configurations. The results suggest that the diffusion of camphor along the pathway near the substrate recognition site (SRS) is thermodynamically preferred. In addition, we show that the diffusion near the SRS is triggered by a transition from a heterogeneous collection of closed ligand-bound conformers to the basin comprising the open conformations of cytochrome P450cam. The conformational change accompanying this switch is characterized by the retraction of the F and G helices and the disorder of the B' helix. These results are corroborated by experimental studies and provide detailed insight into ligand binding and conformational behavior of the cytochrome family. The presented methodology is general and can be applied to other ligand-protein systems.

  5. Incorporating advanced EMI technologies in operational munitions characterization surveys

    NASA Astrophysics Data System (ADS)

    Miller, Jonathan S.; Shubiditze, Fridon; Pasion, Leonard; Schultz, Gregory; Chung, Heesoo

    2011-06-01

    The presence of unexploded ordnance (UXO), discarded military munitions (DMM), and munitions constituents (MC) at both active and formerly used defense sites (FUDS) has created a necessity for production-level efforts to remove these munitions and explosives of concern (MEC). Ordnance and explosives (OE) and UXO removal operations typically employ electromagnetic induction (EMI) or magnetometer surveys to identify potential MEC hazards in previously determined areas of interest. A major cost factor in these operations is the significant allocation of resources for the excavation of harmless objects associated with fragmentation, scrap, or geological clutter. Recent advances in classification and discrimination methodologies, as well as the development of sensor technologies that fully exploit physics-based analysis, have demonstrated promise for significantly reducing the false alarm rate due to MEC related clutter. This paper identifies some of the considerations for and the challenges associated with implementing these discrimination methodologies and advanced sensor technologies in production-level surveys. Specifically, we evaluate the implications of deploying an advanced multi-axis EMI sensor at a variety of MEC sites, the discrimination methodologies that leverage the data produced by this sensor, and the potential for productivity increase that could be realized by incorporating this advanced technology as part of production protocol.

  6. Development of a Methodology for Hydrogeological Characterization of Faults: Progress of the Project in Berkeley, California

    NASA Astrophysics Data System (ADS)

    Goto, J.; Moriya, T.; Yoshimura, K.; Tsuchi, H.; Karasaki, K.; Onishi, T.; Ueta, K.; Tanaka, S.; Kiho, K.

    2010-12-01

    The Nuclear Waste Management Organization of Japan (NUMO), in collaboration with Lawrence Berkeley National Laboratory (LBNL), has since 2007 carried out a project to develop an efficient and practical methodology for characterizing the hydrologic properties of faults, exclusively for the early stage of siting a deep underground repository. A preliminary flowchart of the characterization program and a classification scheme of fault hydrology based on geological features have been proposed. These have been tested through the field characterization program on the Wildcat Fault in Berkeley, California. The Wildcat Fault is a relatively large non-active strike-slip fault which is believed to be a subsidiary of the active Hayward Fault. Our classification scheme assumes contrasting hydrologic features between the linear northern part and the split/spread southern part of the Wildcat Fault. The field characterization program to date has been concentrated in and around the LBNL site on the southern part of the fault. Several lines of electrical and reflection seismic surveys, and subsequent trench investigations, have revealed the approximate distribution and near-surface features of the Wildcat Fault (see also Onishi, et al. and Ueta, et al.). Three 150-m-deep boreholes, WF-1 to WF-3, have been drilled on a line normal to the trace of the fault in the LBNL site. Two vertical holes were placed to characterize the undisturbed Miocene sedimentary formations at the eastern and western sides of the fault (WF-1 and WF-2 respectively). WF-2 on the western side intersected the rock formation that was expected only in WF-1, as well as several faults of various intensities. Therefore, WF-3, originally planned as an inclined hole to penetrate the fault, was replaced by a vertical hole further to the west. It again encountered unexpected rocks and faults. Preliminary results of in-situ hydraulic tests suggested that the transmissivity of WF-1 is ten to one hundred times higher than that of WF-2. The monitoring of hydraulic pressure displayed different head distribution patterns between WF-1 and WF-2 (see also Karasaki, et al.). Based on these results, three hypotheses on the distribution of the Wildcat Fault were proposed: (a) a vertical fault in between WF-1 and WF-2, (b) a more gently dipping fault intersected in WF-2 and WF-3, and (c) a wide zone of faults extending between WF-1 and WF-3. At present, drilling of WF-4, an inclined hole intended to penetrate the possible (eastern?) master fault, is ongoing to test these hypotheses. After the WF-4 investigation, hydrologic and geochemical analyses and modeling of the southern part of the fault will be carried out. A simpler field characterization program will also be carried out in the northern part of the fault. Finally, all the results will be synthesized to improve the comprehensive methodology.

  7. Incorporating ToxCast and Tox21 Datasets to Rank Biological Activity of Chemicals at Superfund Sites in North Carolina

    PubMed Central

    Tilley, Sloane K.; Reif, David M.; Fry, Rebecca C.

    2017-01-01

    Background The Superfund program of the Environmental Protection Agency (EPA) was established in 1980 to address public health concerns posed by toxic substances released into the environment in the United States. Forty-two of the 1328 hazardous waste sites that remain on the Superfund National Priority List are located in the state of North Carolina. Methods We set out to develop a database that contained information on both the prevalence and biological activity of chemicals present at Superfund sites in North Carolina. A chemical characterization tool, the Toxicological Priority Index (ToxPi), was used to rank the biological activity of these chemicals based on their predicted bioavailability, documented associations with biological pathways, and activity in in vitro assays of the ToxCast and Tox21 programs. Results The ten most prevalent chemicals found at North Carolina Superfund sites were chromium, trichloroethene, lead, tetrachloroethene, arsenic, benzene, manganese, 1,2-dichloroethane, nickel, and barium. For all chemicals found at North Carolina Superfund sites, ToxPi analysis was used to rank their biological activity. Through this data integration, residual pesticides and organic solvents were identified to be some of the most highly-ranking predicted bioactive chemicals. This study provides a novel methodology for creating state or regional databases of Superfund sites. Conclusions These data represent a novel integrated profile of the most prevalent chemicals at North Carolina Superfund sites. This information, and the associated methodology, is useful to toxicologists, risk assessors, and the communities living in close proximity to these sites. PMID:28153528

  8. Incorporating ToxCast and Tox21 datasets to rank biological activity of chemicals at Superfund sites in North Carolina.

    PubMed

    Tilley, Sloane K; Reif, David M; Fry, Rebecca C

    2017-04-01

    The Superfund program of the Environmental Protection Agency (EPA) was established in 1980 to address public health concerns posed by toxic substances released into the environment in the United States. Forty-two of the 1328 hazardous waste sites that remain on the Superfund National Priority List are located in the state of North Carolina. We set out to develop a database that contained information on both the prevalence and biological activity of chemicals present at Superfund sites in North Carolina. A chemical characterization tool, the Toxicological Priority Index (ToxPi), was used to rank the biological activity of these chemicals based on their predicted bioavailability, documented associations with biological pathways, and activity in in vitro assays of the ToxCast and Tox21 programs. The ten most prevalent chemicals found at North Carolina Superfund sites were chromium, trichloroethene, lead, tetrachloroethene, arsenic, benzene, manganese, 1,2-dichloroethane, nickel, and barium. For all chemicals found at North Carolina Superfund sites, ToxPi analysis was used to rank their biological activity. Through this data integration, residual pesticides and organic solvents were identified to be some of the most highly-ranking predicted bioactive chemicals. This study provides a novel methodology for creating state or regional databases of biological activity of contaminants at Superfund sites. These data represent a novel integrated profile of the most prevalent chemicals at North Carolina Superfund sites. This information, and the associated methodology, is useful to toxicologists, risk assessors, and the communities living in close proximity to these sites. Copyright © 2016. Published by Elsevier Ltd.
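
    The ToxPi-style prioritization step can be illustrated with a toy calculation. The sketch below is a generic weighted combination of normalized data slices, not the actual ToxPi software or its slice definitions; the chemicals, slice values, and weights are placeholders.

```python
import numpy as np

# Hypothetical data slices (rows: chemicals; columns: predicted
# bioavailability, pathway associations, in vitro assay activity),
# each already scaled to the 0-1 range.
chemicals = ["chromium", "trichloroethene", "benzene"]
slices = np.array([
    [0.40, 0.55, 0.30],
    [0.70, 0.60, 0.65],
    [0.50, 0.80, 0.75],
])
weights = np.array([1.0, 2.0, 2.0])        # assumed relative slice weights

scores = slices @ weights / weights.sum()  # weighted mean score per chemical
for name, s in sorted(zip(chemicals, scores), key=lambda t: -t[1]):
    print(f"{name}: {s:.2f}")
```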

  9. Comparative study of large scale simulation of underground explosions inalluvium and in fractured granite using stochastic characterization

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2014-12-01

    This work describes a methodology used for large scale modeling of wave propagation from underground explosions conducted at the Nevada Test Site (NTS) in two different geological settings: fractured granitic rock mass and alluvium deposits. We show that the discrete nature of rock masses as well as the spatial variability of the fabric of alluvium is very important to understand ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NTS as well as historical data from the characterization during the underground nuclear tests conducted at the NTS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key important geologic features specific to fractured media, mainly the joints, and those specific to alluvium porous media, mainly the spatial variability of geological alluvium facies characterized by their variances and their integral scales. We have also explored key features common to both geological environments, such as saturation and topography, and assessed which characteristics affect the ground motion the most in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented in the Geodyn and GeodynL hydrocodes. Both codes were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
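
    The spatial variability described here, alluvium facies characterized by variances and integral scales, is often realized numerically as a correlated Gaussian random field. The minimal 2D sketch below assumes an exponential covariance model and draws one realization by Cholesky factorization; grid size, variance, and scale are placeholders, and the Geodyn/GeodynL workflow is not reproduced.

```python
import numpy as np

# Correlated Gaussian field on a small 2D grid with exponential covariance
# C(h) = variance * exp(-h / integral_scale). All values are hypothetical.
nx, ny, dx = 20, 20, 5.0                    # 20 x 20 cells, 5 m spacing
variance, integral_scale = 0.1, 25.0

X, Y = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dx, indexing="ij")
pts = np.column_stack([X.ravel(), Y.ravel()])
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
cov = variance * np.exp(-dist / integral_scale)

# Small jitter keeps the Cholesky factorization numerically stable.
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(pts)))
rng = np.random.default_rng(42)
facies_property = (L @ rng.standard_normal(len(pts))).reshape(nx, ny)
print(facies_property.shape, round(float(facies_property.std()), 3))
```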

  10. Protein model discrimination using mutational sensitivity derived from deep sequencing.

    PubMed

    Adkar, Bharat V; Tripathi, Arti; Sahoo, Anusmita; Bajaj, Kanika; Goswami, Devrishi; Chakrabarti, Purbani; Swarnkar, Mohit K; Gokhale, Rajesh S; Varadarajan, Raghavan

    2012-02-08

    A major bottleneck in protein structure prediction is the selection of correct models from a pool of decoys. Relative activities of ∼1,200 individual single-site mutants in a saturation library of the bacterial toxin CcdB were estimated by determining their relative populations using deep sequencing. This phenotypic information was used to define an empirical score for each residue (RankScore), which correlated with the residue depth, and identify active-site residues. Using these correlations, ∼98% of correct models of CcdB (RMSD ≤ 4Å) were identified from a large set of decoys. The model-discrimination methodology was further validated on eleven different monomeric proteins using simulated RankScore values. The methodology is also a rapid, accurate way to obtain relative activities of each mutant in a large pool and derive sequence-structure-function relationships without protein isolation or characterization. It can be applied to any system in which mutational effects can be monitored by a phenotypic readout. Copyright © 2012 Elsevier Ltd. All rights reserved.
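
    A toy recasting of the scoring idea is sketched below: per-residue sensitivity is estimated from before/after selection read counts, and candidate models are scored by how well their residue depths track that sensitivity. The mutant counts, residue depths, and the use of a Spearman correlation are illustrative assumptions, not the authors' RankScore definition.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical deep-sequencing read counts for single-site mutants at three
# residues, before and after selection for activity.
counts_before = {"R10A": 900, "R10K": 850, "L25V": 800, "L25F": 700, "D40N": 950}
counts_after = {"R10A": 100, "R10K": 150, "L25V": 750, "L25F": 600, "D40N": 940}

def residue_scores(before, after):
    """Average relative activity (after/before read ratio) per residue."""
    per_residue = {}
    for mutant, n0 in before.items():
        res = mutant[:-1]                 # e.g. "R10A" -> "R10"
        per_residue.setdefault(res, []).append(after[mutant] / n0)
    return {res: float(np.mean(v)) for res, v in per_residue.items()}

scores = residue_scores(counts_before, counts_after)

# Hypothetical residue depths (angstroms) from two decoy models. Buried
# residues tolerate substitutions poorly, so in a correct model the depths
# should correlate negatively with the relative-activity scores.
depth_model_a = {"R10": 6.2, "L25": 3.1, "D40": 2.8}
depth_model_b = {"R10": 3.0, "L25": 6.0, "D40": 2.9}
for name, depths in [("model A", depth_model_a), ("model B", depth_model_b)]:
    residues = sorted(scores)
    rho, _ = spearmanr([depths[r] for r in residues], [scores[r] for r in residues])
    print(name, f"rho = {rho:.2f}")       # more negative = more consistent model
```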

  11. A phased approach to induced seismicity risk management

    DOE PAGES

    White, Joshua A.; Foxall, William

    2014-01-01

    This work describes strategies for assessing and managing induced seismicity risk during each phase of a carbon storage project. We consider both nuisance and damage potential from induced earthquakes, as well as the indirect risk of enhancing fault leakage pathways. A phased approach to seismicity management is proposed, in which operations are continuously adapted based on available information and an on-going estimate of risk. At each project stage, specific recommendations are made for (a) monitoring and characterization, (b) modeling and analysis, and (c) site operations. The resulting methodology can help lower seismic risk while ensuring site operations remain practical and cost-effective.

  12. Transient Inverse Calibration of Hanford Site-Wide Groundwater Model to Hanford Operational Impacts - 1943 to 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Charles R.; Bergeron, Marcel P.; Wurstner, Signe K.

    2001-05-31

    This report describes a new initiative to strengthen the technical defensibility of predictions made with the Hanford site-wide groundwater flow and transport model. The focus is on characterizing major uncertainties in the current model. PNNL will develop and implement a calibration approach and methodology that can be used to evaluate alternative conceptual models of the Hanford aquifer system. The calibration process will involve a three-dimensional transient inverse calibration of each numerical model to historical observations of hydraulic and water quality impacts to the unconfined aquifer system from Hanford operations since the mid-1940s.

  13. Multivalent peptidic linker enables identification of preferred sites of conjugation for a potent thialanstatin antibody drug conjugate.

    PubMed

    Puthenveetil, Sujiet; He, Haiyin; Loganzo, Frank; Musto, Sylvia; Teske, Jesse; Green, Michael; Tan, Xingzhi; Hosselet, Christine; Lucas, Judy; Tumey, L Nathan; Sapra, Puja; Subramanyam, Chakrapani; O'Donnell, Christopher J; Graziani, Edmund I

    2017-01-01

    Antibody drug conjugates (ADCs) are no longer an unknown entity in the field of cancer therapy with the success of marketed ADCs like ADCETRIS and KADCYLA and numerous others advancing through clinical trials. The pursuit of novel cytotoxic payloads beyond the microtubule inhibitors and DNA damaging agents has led us to the recent discovery of an mRNA splicing inhibitor, thailanstatin, as a potent ADC payload. In our previous work, we observed that the potency of this payload was uniquely tied to the method of conjugation, with lysine conjugates showing much superior potency as compared to cysteine conjugates. However, the ADC field is rapidly shifting towards site-specific ADCs due to their advantages in manufacturability, characterization and safety. In this work we report the identification of a highly efficacious site-specific thailanstatin ADC. The site of conjugation played a critical role in both the in vitro and in vivo potency of these ADCs. During the course of this study, we developed a novel methodology of loading a single site with multiple payloads using an in situ generated multi-drug carrying peptidic linker that allowed us to rapidly screen for optimal conjugation sites. Using this methodology, we were able to identify a double-cysteine mutant ADC delivering four-loaded thailanstatin that was very efficacious in a gastric cancer xenograft model at 3 mg/kg and was also shown to be efficacious against T-DM1 resistant and MDR1 overexpressing tumor cell lines.

  14. Using FEP's List and a PA Methodology for Evaluating Suitable Areas for the LLW Repository in Italy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Risoluti, P.; Ciabatti, P.; Mingrone, G.

    2002-02-26

    In Italy, following a referendum held in 1987, nuclear energy has been phased out. Since 1998, a general site selection process covering the whole Italian territory has been under way. A GIS (Geographic Information System) methodology was implemented in three steps using the ESRI Arc/Info and Arc/View platforms. The screening identified approximately 0.8% of the Italian territory as suitable for locating the LLW Repository. 200 areas have been identified as suitable for the location of the LLW Repository, using a multiple exclusion criteria procedure applied at 1:500,000, regional (1:100,000) and local (1:25,000-1:10,000) scales. A methodology for evaluating these areas has been developed allowing, along with the evaluation of the long term efficiency of the engineered barrier system (EBS), the characterization of the selected areas in terms of physical and safety factors and planning factors. The first step was to identify, on a referenced FEPs list, a group of geomorphological, geological, hydrogeological, climatic and human-behavior-caused processes and/or events which were considered of importance for the site evaluation, taking into account the Italian situation. A site evaluation system was established ascribing weighted scores to each of these processes and events, which were identified as parameters of the new evaluation system. The score of each parameter ranges from 1 (low suitability) to 3 (high suitability). The corresponding weight is calculated considering the effect of the parameter in terms of total dose to the critical group, using an upgraded AMBER model for PA calculation. At the end of the process an index obtained by a weighted sum of the scores gives the degree of suitability of the selected areas for the LLW Repository location. The application of the methodology to two selected sites is given in the paper.
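
    The score-weighted index can be illustrated with a minimal sketch. The parameters, their 1-3 scores, and the dose-derived weights below are invented placeholders, not values from the Italian siting study.

```python
# Minimal sketch of a score-weighted suitability index. The parameter names,
# scores (1 = low suitability, 3 = high suitability) and weights are invented.
parameters = {
    # name: (score 1..3, weight derived from a PA dose calculation)
    "flooding": (3, 0.30),
    "erosion": (2, 0.15),
    "seismicity": (2, 0.25),
    "human intrusion": (1, 0.30),
}

def suitability_index(params):
    """Weighted sum of parameter scores, normalized back to the 1-3 range."""
    total_weight = sum(w for _, w in params.values())
    return sum(score * w for score, w in params.values()) / total_weight

print(f"suitability index: {suitability_index(parameters):.2f}")
```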

  15. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation is real, or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
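
    Matched Field Processing itself coherently compares array data against replica fields built from calibration or modeling; a much simpler but related illustration is template matching of a known event waveform against a continuous record, sketched below with synthetic data. The waveform, noise level, and peak pick are illustrative assumptions, not the MFP implementation used in the study.

```python
import numpy as np

def normalized_cross_correlation(data, template):
    """Sliding normalized correlation of a template against continuous data."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        w = (w - w.mean()) / (w.std() + 1e-12)
        cc[i] = np.dot(w, t) / n
    return cc

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
record = rng.standard_normal(2000) * 0.5
record[1200:1300] += 0.4 * template          # weak hidden copy of the event
cc = normalized_cross_correlation(record, template)
print("best match at sample", int(np.argmax(cc)), "cc =", round(float(cc.max()), 2))
```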

  16. Optimal health and disease management using spatial uncertainty: a geographic characterization of emergent artemisinin-resistant Plasmodium falciparum distributions in Southeast Asia.

    PubMed

    Grist, Eric P M; Flegg, Jennifer A; Humphreys, Georgina; Mas, Ignacio Suay; Anderson, Tim J C; Ashley, Elizabeth A; Day, Nicholas P J; Dhorda, Mehul; Dondorp, Arjen M; Faiz, M Abul; Gething, Peter W; Hien, Tran T; Hlaing, Tin M; Imwong, Mallika; Kindermans, Jean-Marie; Maude, Richard J; Mayxay, Mayfong; McDew-White, Marina; Menard, Didier; Nair, Shalini; Nosten, Francois; Newton, Paul N; Price, Ric N; Pukrittayakamee, Sasithon; Takala-Harrison, Shannon; Smithuis, Frank; Nguyen, Nhien T; Tun, Kyaw M; White, Nicholas J; Witkowski, Benoit; Woodrow, Charles J; Fairhurst, Rick M; Sibley, Carol Hopkins; Guerin, Philippe J

    2016-10-24

    Artemisinin-resistant Plasmodium falciparum malaria parasites are now present across much of mainland Southeast Asia, where ongoing surveys are measuring and mapping their spatial distribution. These efforts require substantial resources. Here we propose a generic 'smart surveillance' methodology to identify optimal candidate sites for future sampling and thus map the distribution of artemisinin resistance most efficiently. The approach uses the 'uncertainty' map generated iteratively by a geostatistical model to determine optimal locations for subsequent sampling. The methodology is illustrated using recent data on the prevalence of the K13-propeller polymorphism (a genetic marker of artemisinin resistance) in the Greater Mekong Subregion. This methodology, which has broader application to geostatistical mapping in general, could improve the quality and efficiency of drug resistance mapping and thereby guide practical operations to eliminate malaria in affected areas.
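
    The "sample where the model is most uncertain" idea can be sketched with a generic Gaussian-process stand-in for the geostatistical model used in the study; the coordinates, prevalence values, and kernel choice below are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical surveyed sites (lon, lat) and observed marker prevalence.
surveyed = np.array([[102.1, 14.8], [103.5, 13.2], [104.9, 15.1], [101.0, 16.0]])
prevalence = np.array([0.35, 0.60, 0.20, 0.05])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0) + WhiteKernel(1e-3),
                              normalize_y=True)
gp.fit(surveyed, prevalence)

# Candidate sites for the next survey round: pick the one with the largest
# predictive standard deviation (the model's "uncertainty map").
candidates = np.array([[102.8, 14.0], [100.2, 17.0], [104.0, 14.5]])
_, std = gp.predict(candidates, return_std=True)
print("next site:", candidates[int(np.argmax(std))], "std =", round(float(std.max()), 3))
```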

  17. USE OF THE AERIAL MEASUREMENT SYSTEM HELICOPTER EMERGENCY RESPONSE ACQUISITION SYSTEMS WITH GEOGRAPHIC INFORMATION SYSTEM FOR RADIOACTIVE SOIL REMEDIATION - 11504

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BROCK CT

    2011-02-15

    The Aerial Measurement System (AMS) Helicopter Emergency Response Acquisition System provides a thorough and economical means to identify and characterize the contaminants for large area radiological surveys. The helicopter system can provide a 100-percent survey of an area that qualifies as a scoping survey under the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM) methodology. If the sensitivity is adequate when compared to the clean up values, it may also be used for the characterization survey. The data from the helicopter survey can be displayed and manipulated to provide invaluable data during remediation activities.

  18. Alternative Radiological Characterization of Sealed Source TRU Waste for WIPP Disposal (LAUR-05-8776)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whitworth, J.; Pearson, M.; Feldman, A.

    2006-07-01

    The Offsite Source Recovery (OSR) Project at Los Alamos National Laboratory is now shipping transuranic (TRU) waste containers to the Waste Isolation Pilot Plant (WIPP) in New Mexico for disposal. Sealed source waste disposal has become possible in part because OSR personnel were able to obtain Environmental Protection Agency (EPA) and DOE-CBFO approval for an alternative radiological characterization procedure relying on acceptable knowledge (AK) and modeling, rather than on non-destructive assay (NDA) of each container. This is the first successful qualification of an 'alternate methodology' under the radiological characterization requirements of the WIPP Waste Acceptance Criteria (WAC) by any TRU waste generator site. This paper describes the approach OSR uses to radiologically characterize its sealed source waste and the process by which it obtained certification of this approach. (authors)

  19. Region-to-area screening methodology for the Crystalline Repository Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1985-04-01

    The purpose of this document is to describe the Crystalline Repository Project's (CRP) process for region-to-area screening of exposed and near-surface crystalline rock bodies in the three regions of the conterminous United States where crystalline rock is being evaluated as a potential host for the second nuclear waste repository (i.e., in the North Central, Northeastern, and Southeastern Regions). This document indicates how the US Department of Energy's (DOE) General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories (10 CFR 960) were used to select and apply factors and variables for the region-to-area screening, explains how these factors and variables are to be applied in the region-to-area screening, and indicates how this methodology relates to the decision process leading to the selection of candidate areas. A brief general discussion of the screening process from the national survey through area screening and site recommendation is presented. This discussion sets the scene for detailed discussions which follow concerning the region-to-area screening process, the guidance provided by the DOE Siting Guidelines for establishing disqualifying factors and variables for screening, and application of the disqualifying factors and variables in the screening process. This document is complementary to the regional geologic and environmental characterization reports to be issued in the summer of 1985 as final documents. These reports will contain the geologic and environmental data base that will be used in conjunction with the methodology to conduct region-to-area screening.

  20. 3D geophysical imaging for site-specific characterization plan of an old landfill.

    PubMed

    Di Maio, R; Fais, S; Ligas, P; Piegari, E; Raga, R; Cossu, R

    2018-06-01

    As it is well-known, the characterization plan of an old landfill site is the first stage of the project for the treatment and reclamation of contaminated lands. It is a preliminary in-situ study, with collection of data related to pollution phenomena, and is aimed at defining the physical properties and the geometry of fill materials as well as the possible migration paths of pollutants to the surrounding environmental targets (subsoil and groundwater). To properly evaluate the extent and potential for subsoil contamination, waste volume and possible leachate emissions from the landfill have to be assessed. In such perspective, the integrated use of geophysical methods is an important tool as it allows a detailed 3D representation of the whole system, i.e. waste body and hosting environment (surrounding rocks). This paper presents a very accurate physical and structural characterization of an old landfill and encasing rocks obtained by an integrated analysis of data coming from a multi-methodological geophysical exploration. Moreover, drillings were carried out for waste sampling and characterization of the landfill body, as well as for calibration of the geophysical modeling. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Comprehensive profiling of retroviral integration sites using target enrichment methods from historical koala samples without an assembled reference genome

    PubMed Central

    Alquezar-Planas, David E.; Ishida, Yasuko; Courtiol, Alexandre; Timms, Peter; Johnson, Rebecca N.; Lenz, Dorina; Helgen, Kristofer M.; Roca, Alfred L.; Hartman, Stefanie

    2016-01-01

    Background. Retroviral integration into the host germline results in permanent viral colonization of vertebrate genomes. The koala retrovirus (KoRV) is currently invading the germline of the koala (Phascolarctos cinereus) and provides a unique opportunity for studying retroviral endogenization. Previous analysis of KoRV integration patterns in modern koalas demonstrates that they share integration sites primarily if they are related, indicating that the process is currently driven by vertical transmission rather than infection. However, due to methodological challenges, KoRV integrations have not been comprehensively characterized. Results. To overcome these challenges, we applied and compared three target enrichment techniques coupled with next generation sequencing (NGS) and a newly customized sequence-clustering based computational pipeline to determine the integration sites for 10 museum Queensland and New South Wales (NSW) koala samples collected between the 1870s and late 1980s. A secondary aim of this study sought to identify common integration sites across modern and historical specimens by comparing our dataset to previously published studies. Several million sequences were processed, and the KoRV integration sites in each koala were characterized. Conclusions. Although the three enrichment methods each exhibited bias in integration site retrieval, a combination of two methods, Primer Extension Capture and hybridization capture, is recommended for future studies on historical samples. Moreover, identification of integration sites shows that the proportion of integration sites shared between any two koalas is quite small. PMID:27069793

  2. Analysis of the Source Physics Experiment SPE4 Prime Using State-of-the-Art Parallel Numerical Tools.

    NASA Astrophysics Data System (ADS)

    Vorobiev, O.; Ezzedine, S. M.; Antoun, T.; Glenn, L.

    2015-12-01

    This work describes a methodology used for large scale modeling of wave propagation from underground chemical explosions conducted in fractured granitic rock at the Nevada National Security Site (NNSS). We show that the discrete nature of rock masses, as well as the spatial variability of the fabric of rock properties, is very important for understanding ground motions induced by underground explosions. In order to build a credible conceptual model of the subsurface we integrated the geological, geomechanical and geophysical characterizations conducted during recent tests at the NNSS as well as historical data from the characterization during the underground nuclear tests conducted at the NNSS. Because detailed site characterization is limited, expensive and, in some instances, impossible, we have numerically investigated the effects of the characterization gaps on the overall response of the system. We performed several computational studies to identify the key important geologic features specific to fractured media, mainly the joints characterized at the NNSS. We have also explored key features common to both geological environments, such as saturation and topography, and assessed which characteristics affect the ground motion the most in the near-field and in the far-field. Stochastic representations of these features based on the field characterizations have been implemented into LLNL's Geodyn-L hydrocode. Simulations were used to guide site characterization efforts in order to provide the essential data to the modeling community. We validate our computational results by comparing the measured and computed ground motion at various ranges for the recently executed SPE4 Prime experiment. We have also conducted a comparative study between SPE4 Prime and the previous experiments SPE1 and SPE3 to assess similarities and differences and draw conclusions on designing SPE5.

  3. Evaluation of stormwater harvesting sites using multi criteria decision methodology

    NASA Astrophysics Data System (ADS)

    Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.

    2018-07-01

    Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and various other related variables. This paper is aimed at developing a comprehensive methodology framework for evaluating stormwater harvesting sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential stormwater harvesting (SWH) sites using spatial characteristics in a GIS environment. In the second phase, MCDA methodology is used for evaluating and ranking the SWH sites in a multi-objective and multi-stakeholder environment. The paper briefly describes the first phase of the framework and focuses chiefly on the second. The application of the methodology is also demonstrated through a case study of the local government area of the City of Melbourne (CoM), Australia, for the benefit of the wider community of water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites used to demonstrate the developed methodology. To reflect the stakeholder interests in the current study, four stakeholder participant groups were identified, namely, water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision making scenarios. The major innovation of this work is the development and application of a comprehensive methodology framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective and multi-stakeholder environment. It is expected that the proposed methodology will provide water professionals and managers with better knowledge and will reduce the subjectivity in the selection and evaluation of SWH sites.
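
    For readers unfamiliar with PROMETHEE II, the sketch below computes net outranking flows for a handful of hypothetical SWH sites using the simple "usual" preference function; the scores and weights are invented, and the study's actual performance measures and preference functions are not reproduced.

```python
import numpy as np

def promethee_ii(scores, weights):
    """Net outranking flows for PROMETHEE II with the 'usual' preference
    function (preference = 1 whenever one alternative beats another).

    scores: (n_alternatives, n_criteria) matrix, larger is better.
    weights: criterion weights summing to 1.
    """
    n = scores.shape[0]
    pi = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a != b:
                pref = (scores[a] > scores[b]).astype(float)
                pi[a, b] = np.dot(weights, pref)
    phi_plus = pi.sum(axis=1) / (n - 1)     # how much a dominates the rest
    phi_minus = pi.sum(axis=0) / (n - 1)    # how much a is dominated
    return phi_plus - phi_minus

# Hypothetical performance measures for four SWH sites (3 criteria).
scores = np.array([[0.8, 0.4, 0.6],
                   [0.5, 0.9, 0.7],
                   [0.6, 0.5, 0.9],
                   [0.3, 0.7, 0.4]])
weights = np.array([0.5, 0.3, 0.2])
net_flow = promethee_ii(scores, weights)
print("ranking (best first):", np.argsort(-net_flow))
```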

  4. Myth and Reality in Hydrogeological Site Characterization at DD and R Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubin, Yoram

    2008-01-15

    The science of hydrogeological site characterization has made significant progress over the last twenty years. Progress has been made in modeling of flow and transport in the heterogeneous subsurface, in understanding of the complex patterns of geological heterogeneity and in measurement technologies. Modeling of uncertainty has also advanced significantly, in recognition of the inherent limitations of subsurface characterization. Much less progress has been made in transforming this progress into practice, where characterization is determined to a large extent by regulations. Environmental regulations have not progressed as much as the science, for example, in recognizing uncertainty. As such, practitioners are less inclined to adopt advanced, science-based solutions, thus opening the door for myths and conflicts. Myths develop where the science base is perceived to be weak, whereas conflicts arise in the face of a disconnect between the science and the regulations. Myths translate to ad-hoc solutions and misplaced empiricism, as well as to unjustified reliance on field experience, to the detriment of DD and R. This paper explores the roots of this situation and identifies ideas that may help in bridging the gap between research and applications. A rational approach for DD and R is needed that will encourage innovation in site characterization, reduce costs and accelerate completion. Such an approach needs to include several elements. DD and R regulations need to recognize the various aspects of uncertainty inherent to site characterization, and as such, should be formulated using probabilistic concepts. One of the immediate benefits will be in allowing a gradual approach for data acquisition at DD and R sites: decisions can be made even under the most severe data limitations, and can be modified as additional data become available. The definition of risk is another major element. There is no universal definition of risk or of a methodology to define risk. Different sites justify different definitions, depending on many environmental, economical and social factors. Despite the lack of consensus, it seems that a good place to start is in fact to recognize that there is room for all these factors, and a need to balance between them. As experience is gained, through research and discussions among DD and R stakeholders, this may become less of a challenge. Regulations need to recognize the possibility of developing alternative, site-specific characterization strategies based on the various length and time scales that define specific environmental problems, including length scales of heterogeneity, source dimensions and distance to environmental targets. For example, point and distributed sources justify different characterization strategies. Development of problem- or site-specific strategies will create the context for defining innovative, efficient DD and R strategies. Innovation in characterization will also follow from recognizing the specific physiological aspects of the toxins and the related uncertainty. This will open the door for improving risk characterization not only from the hydrologic perspective, but also from the physiologic one.

  5. Methodology Investigation Characterization of Test Environment.

    DTIC Science & Technology

    1979-08-01

    canopy trees may be briefly deciduous, especially when flowering. Number of tree species is very large. Canopy: Trees 145 to 180 feet (45 to 55 m) tall ... rooted palms are abundant. Shrub layer: Dwarf palms 5 to 8 feet (1.5 to 2.5 m) tall with undivided leaves usually abundant. Giant herbs with banana ... forest cover for agricultural purposes, corn and banana culture. These sites are now either abandoned or poorly maintained; in either case, tree ...

  6. Methodological challenges in conducting a multi-site randomized clinical trial of massage therapy in hospice.

    PubMed

    Kutner, Jean; Smith, Marlaine; Mellis, Karen; Felton, Sue; Yamashita, Traci; Corbin, Lisa

    2010-06-01

    Researchers conducting multi-site studies of interventions for end-of-life symptom management face significant challenges with respect to obtaining an adequate sample and training and retaining on-site study teams. The purpose of this paper is to describe the strategies and responses to these challenges in a multi-site randomized clinical trial (RCT) of the efficacy of massage therapy for decreasing pain among patients with advanced cancer in palliative care/hospice settings. Over a period of 36 months, we enrolled 380 participants across 15 sites; 27% of whom withdrew prior to study completion (less than the anticipated 30% rate). We saw an average of 68% turnover amongst study staff. Three key qualities characterized successful on-site study teams: (1) organizational commitment; (2) strong leadership from on-site study coordinators; and (3) effective lines of communication between the on-site study coordinators and both their teams and the university-based research team. Issues of recruitment, retention and training should be accounted for in hospice-based research study design and budgeting.

  7. A framework to spatially cluster air pollution monitoring sites in US based on the PM2.5 composition

    PubMed Central

    Austin, Elena; Coull, Brent A.; Zanobetti, Antonella; Koutrakis, Petros

    2013-01-01

    Background Heterogeneity in the response to PM2.5 is hypothesized to be related to differences in particle composition across monitoring sites which reflect differences in source types as well as climatic and topographic conditions impacting different geographic locations. Identifying spatial patterns in particle composition is a multivariate problem that requires novel methodologies. Objectives Use cluster analysis methods to identify spatial patterns in PM2.5 composition. Verify that the resulting clusters are distinct and informative. Methods 109 monitoring sites with 75% reported speciation data during the period 2003–2008 were selected. These sites were categorized based on their average PM2.5 composition over the study period using k-means cluster analysis. The obtained clusters were validated and characterized based on their physico-chemical characteristics, geographic locations, emissions profiles, population density and proximity to major emission sources. Results Overall 31 clusters were identified. These include 21 clusters with 2 or more sites which were further grouped into 4 main types using hierarchical clustering. The resulting groupings are chemically meaningful and represent broad differences in emissions. The remaining clusters, encompassing single sites, were characterized based on their particle composition and geographic location. Conclusions The framework presented here provides a novel tool which can be used to identify and further classify sites based on their PM2.5 composition. The solution presented is fairly robust and yielded groupings that were meaningful in the context of air-pollution research. PMID:23850585
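
    The clustering step can be sketched generically: standardize each site's average composition profile and apply k-means. The composition fractions, number of sites, and choice of three clusters below are placeholders, not the study's 109-site dataset or its 31-cluster solution.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical mean PM2.5 composition per site (fractions of sulfate,
# nitrate, organic carbon, elemental carbon, crustal material).
composition = np.array([
    [0.45, 0.10, 0.30, 0.05, 0.10],   # sulfate-rich site
    [0.15, 0.35, 0.35, 0.05, 0.10],   # nitrate-rich site
    [0.10, 0.05, 0.25, 0.05, 0.55],   # crustal/dust-dominated site
    [0.40, 0.12, 0.33, 0.05, 0.10],
    [0.12, 0.33, 0.38, 0.07, 0.10],
])
X = StandardScaler().fit_transform(composition)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster labels:", labels)
```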

  8. Remote-sensing application for facilitating land resource assessment and monitoring for utility-scale solar energy development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamada, Yuki; Grippo, Mark A.

    2015-01-01

    A monitoring plan that incorporates regional datasets and integrates cost-effective data collection methods is necessary to sustain the long-term environmental monitoring of utility-scale solar energy development in expansive, environmentally sensitive desert environments. Using very high spatial resolution (VHSR; 15 cm) multispectral imagery collected in November 2012 and January 2014, an image processing routine was developed to characterize ephemeral streams, vegetation, and land surface in the southwestern United States where increased utility-scale solar development is anticipated. In addition to knowledge about desert landscapes, the methodology integrates existing spectral indices and transformation (e.g., visible atmospherically resistant index and principal components); a newly developed index, the erosion resistance index (ERI); and digital terrain and surface models, all of which were derived from a common VHSR image. The methodology identified fine-scale ephemeral streams with greater detail than the National Hydrography Dataset and accurately estimated vegetation distribution and fractional cover of various surface types. The ERI classified surface types that have a range of erosive potentials. The remote-sensing methodology could ultimately reduce uncertainty and monitoring costs for all stakeholders by providing a cost-effective monitoring approach that accurately characterizes the land resources at potential development sites.
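
    Of the indices named above, the visible atmospherically resistant index (VARI) has a widely used form that can be computed directly from RGB reflectance, as sketched below; the tiny arrays stand in for the VHSR imagery, and the erosion resistance index (ERI) is not reproduced here since it is specific to that study.

```python
import numpy as np

def vari(red, green, blue, eps=1e-6):
    """Visible Atmospherically Resistant Index from RGB reflectance bands."""
    return (green - red) / (green + red - blue + eps)

# Tiny hypothetical 2x2 reflectance arrays (0-1) in place of the VHSR imagery.
red = np.array([[0.20, 0.25], [0.30, 0.10]])
green = np.array([[0.30, 0.26], [0.28, 0.35]])
blue = np.array([[0.15, 0.20], [0.22, 0.08]])
print(np.round(vari(red, green, blue), 2))   # higher values suggest green vegetation
```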

  9. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface structure of Vs30 obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process that includes defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) Igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and subsequently for some geographical regions.
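
    Vs30 itself is the time-averaged shear-wave velocity of the upper 30 m, computed from a layered profile as 30 m divided by the summed vertical travel time. A minimal sketch with a hypothetical three-layer profile is given below.

```python
def vs30(thicknesses_m, velocities_ms):
    """Time-averaged shear-wave velocity of the upper 30 m of a layered profile.

    Layers are taken top-down; the profile is truncated (or its last layer
    extended) so that exactly 30 m of travel time is accumulated.
    """
    depth, travel_time = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        use = min(h, 30.0 - depth)
        travel_time += use / v
        depth += use
        if depth >= 30.0:
            break
    if depth < 30.0:                      # extend the last layer to 30 m
        travel_time += (30.0 - depth) / velocities_ms[-1]
    return 30.0 / travel_time

# Hypothetical profile: 5 m at 180 m/s, 10 m at 350 m/s, the rest at 760 m/s.
print(round(vs30([5.0, 10.0, 40.0], [180.0, 350.0, 760.0]), 1))
```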

  10. On multi-site damage identification using single-site training data

    NASA Astrophysics Data System (ADS)

    Barthorpe, R. J.; Manson, G.; Worden, K.

    2017-11-01

    This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advancement on the state of the art in multi-site damage identification, which requires either: (1) full damaged state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
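
    A generic sketch of the "one binary classifier per damage location" idea is shown below using scikit-learn Support Vector Classifiers on synthetic two-feature data; the feature construction, the decision-value combination rule, and the fallback to an undamaged class are illustrative assumptions rather than the combination scheme evaluated in the paper.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical two-dimensional damage-sensitive features for the undamaged
# state and for single-site damage at three locations.
def make_class(center, n=40):
    return center + 0.3 * rng.standard_normal((n, 2))

X = np.vstack([make_class(c) for c in ([0, 0], [2, 0], [0, 2], [2, 2])])
y = np.repeat([0, 1, 2, 3], 40)          # 0 = undamaged, 1-3 = damage locations

# One binary SVC per damage location (one-vs-rest), trained on single-site
# data only; decision values are then combined into a single locator.
classifiers = {k: SVC(kernel="rbf").fit(X, (y == k).astype(int))
               for k in (1, 2, 3)}

def locate(sample):
    scores = {k: clf.decision_function([sample])[0] for k, clf in classifiers.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else 0   # fall back to "undamaged"

print(locate([1.9, 0.1]), locate([0.1, 0.1]))
```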

  11. Development of a laboratory niche Web site.

    PubMed

    Dimenstein, Izak B; Dimenstein, Simon I

    2013-10-01

    This technical note presents the development of a methodological laboratory niche Web site. The "Grossing Technology in Surgical Pathology" (www.grossing-technology.com) Web site is used as an example. Although common steps in creation of most Web sites are followed, there are particular requirements for structuring the template's menu on methodological laboratory Web sites. The "nested doll principle," in which one object is placed inside another, most adequately describes the methodological approach to laboratory Web site design. Fragmentation in presenting the Web site's material highlights the discrete parts of the laboratory procedure. An optimally minimal triad of components can be recommended for the creation of a laboratory niche Web site: a main set of media, a blog, and an ancillary component (host, contact, and links). The inclusion of a blog makes the Web site a dynamic forum for professional communication. By forming links and portals, cloud computing opens opportunities for connecting a niche Web site with other Web sites and professional organizations. As an additional source of information exchange, methodological laboratory niche Web sites are destined to parallel both traditional and new forms, such as books, journals, seminars, webinars, and internal educational materials. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Pilot-Scale Demonstration of In-Situ Chemical Oxidation ...

    EPA Pesticide Factsheets

    A pilot-scale in situ chemical oxidation (ISCO) demonstration, involving subsurface injections of sodium permanganate (NaMnO4), was performed at the US Marine Corps Recruit Depot (MCRD), site 45 (Parris Island (PI), SC). The ground water was originally contaminated with perchloroethylene (PCE) (also known as tetrachloroethylene), a chlorinated solvent used in dry cleaner operations. High resolution site characterization involved multiple iterations of soil core sampling and analysis. Nested micro-wells and conventional wells were also used to sample and analyze ground water for PCE and decomposition products (i.e., trichloroethylene (TCE), dichloroethylene (c-DCE, t-DCE), and vinyl chloride (VC)), collectively referred to as chlorinated volatile organic compounds (CVOC). This characterization methodology was used to develop and refine the conceptual site model and the ISCO design, not only by identifying CVOC contamination but also by eliminating uncontaminated portions of the aquifer from further ISCO consideration. Direct-push injection was selected as the main method of NaMnO4 delivery due to its flexibility and low initial capital cost. Site impediments to ISCO activities in the source area involved subsurface utilities, including a high pressure water main, a high voltage power line, a communication line, and sanitary and stormwater sewer lines. Utility markings were used in conjunction with careful planning and judicious selection of injection locations.

  13. Maillard Proteomics: Opening New Pages

    PubMed Central

    Soboleva, Alena; Schmidt, Rico; Vikhnina, Maria; Grishina, Tatiana; Frolov, Andrej

    2017-01-01

    Protein glycation is a ubiquitous non-enzymatic post-translational modification, formed by reaction of protein amino and guanidino groups with carbonyl compounds, presumably reducing sugars and α-dicarbonyls. Resulting advanced glycation end products (AGEs) represent a highly heterogeneous group of compounds, deleterious in mammals due to their pro-inflammatory effect, and impact in pathogenesis of diabetes mellitus, Alzheimer’s disease and ageing. The body of information on the mechanisms and pathways of AGE formation, acquired during the last decades, clearly indicates a certain site-specificity of glycation. It makes characterization of individual glycation sites a critical pre-requisite for understanding in vivo mechanisms of AGE formation and developing adequate nutritional and therapeutic approaches to reduce it in humans. In this context, proteomics is the methodology of choice to address site-specific molecular changes related to protein glycation. Therefore, here we summarize the methods of Maillard proteomics, specifically focusing on the techniques providing comprehensive structural and quantitative characterization of glycated proteome. Further, we address the novel break-through areas, recently established in the field of Maillard research, i.e., in vitro models based on synthetic peptides, site-based diagnostics of metabolism-related diseases (e.g., diabetes mellitus), proteomics of anti-glycative defense, and dynamics of plant glycated proteome during ageing and response to environmental stress. PMID:29231845

  14. Analysis of water flow paths: methodology and example calculations for a potential geological repository in Sweden.

    PubMed

    Werner, Kent; Bosson, Emma; Berglund, Sten

    2006-12-01

    Safety assessment related to the siting of a geological repository for spent nuclear fuel deep in the bedrock requires identification of potential flow paths and the associated travel times for radionuclides originating at repository depth. Using the Laxemar candidate site in Sweden as a case study, this paper describes modeling methodology, data integration, and the resulting water flow models, focusing on the Quaternary deposits and the upper 150 m of the bedrock. Example simulations identify flow paths to groundwater discharge areas and flow paths in the surface system. The majority of the simulated groundwater flow paths end up in the main surface waters and along the coastline, even though the particles used to trace the flow paths are introduced with a uniform spatial distribution at a relatively shallow depth. The calculated groundwater travel time, determining the time available for decay and retention of radionuclides, is on average longer to the coastal bays than to other biosphere objects at the site. Further, it is demonstrated how GIS-based modeling can be used to limit the number of surface flow paths that need to be characterized for safety assessment. Based on the results, the paper discusses an approach for coupling the present models to a model for groundwater flow in the deep bedrock.

  15. Analysis of Solar Census Remote Solar Access Value Calculation Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nangle, J.; Dean, J.; Van Geet, O.

    2015-03-01

    The costs of photovoltaic (PV) system hardware (PV panels, inverters, racking, etc.) have fallen dramatically over the past few years. Nonhardware (soft) costs, however, have failed to keep pace with the decrease in hardware costs, and soft costs have become a major driver of U.S. PV system prices. Upfront or 'sunken' customer acquisition costs make up a portion of an installation's soft costs and can be addressed through software solutions that aim to streamline sales and system design aspects of customer acquisition. One of the key soft costs associated with sales and system design is collecting information on solar access for a particular site. Solar access, reported in solar access values (SAVs), is a measurement of the available clear sky over a site and is used to characterize the impacts of local shading objects. Historically, onsite shading studies have been required to characterize the SAV of the proposed array and determine the potential energy production of a photovoltaic system.

  16. Quantitative interpretation of magnetic properties as a way to characterize biogeophysical signatures of biodegraded contaminated sites

    NASA Astrophysics Data System (ADS)

    Ustra, A.; Kessouri, P.; Leite, A.; Mendonça, C. A.; Bandeira, N.

    2017-12-01

    Magnetic minerals in soils and rocks are one way to study biogeochemical and paleoenvironmental processes. The ultrafine fraction of these minerals (superparamagnetic (SP) and stable single domain (SSD)) is usually investigated in environmental magnetism studies, since changes in mineralogy, concentration, size and morphology of the magnetic grains can be related to biogeochemical processes. In this study, we use low-field frequency dependent susceptibility (FDS) and isothermal remanent magnetization (IRM) to characterize the magnetic properties of materials in environmental magnetism. Magnetic susceptibility (MS) measurements are frequently used as a proxy of magnetic minerals present in soils and rocks. MS is a complex function of magnetic mineralogy and grain size, as well as magnitude and frequency of the applied field. This work presents a method for inverting low-field FDS data. The inverted parameters can be interpreted in terms of grain size variations of magnetic particles on the SP-SSD transition. This work also presents a method for inverting IRM demagnetization curves, to obtain the saturation magnetization and the individual magnetic moment for an assemblage of ultrafine SP minerals and to estimate the concentration of magnetic carriers. IRM magnetization curves can be interpreted as resulting from distinct contributions of different mineral phases, which can be described by Cumulative Log-Gaussian (CLG) distributions. Each acquisition curve provides fundamental parameters that are characteristic of the respective mineral phase. The CLG decomposition is widely used in an interpretation procedure named mineral unmixing. In this work we present an inversion method for mineral unmixing, implementing the genetic algorithm to find the parameters of distinct components. These methodologies have been tested by synthetic models and applied to data from environmental magnetism studies. In this work we apply the proposed methodologies to characterize the magnetic properties of samples from the former Brandywine MD Defense Reutilization and Marketing Office (DRMO). The results from the magnetic properties characterization will provide additional information that may assist the interpretation of the biogeophysical signatures observed at the site.
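
    The cumulative log-Gaussian (CLG) decomposition mentioned here can be sketched with a standard least-squares fit (used in place of the genetic algorithm adopted in the study); the synthetic two-component acquisition curve and the starting values below are invented.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def clg(logB, m, logB12, dp):
    """One cumulative log-Gaussian component of an IRM acquisition curve."""
    return 0.5 * m * (1.0 + erf((logB - logB12) / (dp * np.sqrt(2.0))))

def two_components(logB, m1, b1, dp1, m2, b2, dp2):
    return clg(logB, m1, b1, dp1) + clg(logB, m2, b2, dp2)

# Synthetic acquisition curve: a soft (magnetite-like) and a harder phase.
logB = np.linspace(0.5, 3.0, 60)             # log10 of applied field (mT)
truth = two_components(logB, 1.0, 1.5, 0.25, 0.4, 2.4, 0.20)
rng = np.random.default_rng(3)
data = truth + 0.01 * rng.standard_normal(logB.size)

p0 = [0.8, 1.4, 0.3, 0.5, 2.5, 0.3]          # rough initial guess
popt, _ = curve_fit(two_components, logB, data, p0=p0)
print(np.round(popt, 2))                     # recovered component parameters
```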

  17. Mortar radiocarbon dating: preliminary accuracy evaluation of a novel methodology.

    PubMed

    Marzaioli, Fabio; Lubritto, Carmine; Nonni, Sara; Passariello, Isabella; Capano, Manuela; Terrasi, Filippo

    2011-03-15

    Mortars represent a class of building and art materials that are widespread at archeological sites from the Neolithic period onward. After about 50 years of experimentation, the possibility of evaluating their absolute chronology by means of radiocarbon ((14)C) remains uncertain. Using a simplified mortar production process in the laboratory environment, this study shows the overall feasibility of a novel physical pretreatment for isolating the atmospheric (14)CO(2) (i.e., binder) signal absorbed by mortars during their setting. The methodology is based on the assumption that an ultrasonic attack in the liquid phase isolates a suspension of binder carbonates from bulk mortars. Isotopic ((13)C and (14)C), %C, X-ray diffractometry (XRD), and scanning electron microscopy (SEM) analyses were performed to characterize the proposed methodology. The applied protocol suppresses the fossil carbon (C) contamination originating from incomplete burning of the limestone during quicklime production, providing unbiased dating for "laboratory" mortars produced at historically adopted burning temperatures.

  18. Characterization of the interactions of PARP-1 with UV-damaged DNA in vivo and in vitro

    PubMed Central

    Purohit, Nupur K.; Robu, Mihaela; Shah, Rashmi G.; Geacintov, Nicholas E.; Shah, Girish M.

    2016-01-01

    The existing methodologies for studying robust responses of poly (ADP-ribose) polymerase-1 (PARP-1) to DNA damage with strand breaks are often not suitable for examining its subtle responses to altered DNA without strand breaks, such as UV-damaged DNA. Here we describe two novel assays with which we characterized the interaction of PARP-1 with UV-damaged DNA in vivo and in vitro. Using an in situ fractionation technique to selectively remove free PARP-1 while retaining the DNA-bound PARP-1, we demonstrate direct recruitment of endogenous or exogenous PARP-1 to the UV-lesion site in vivo after local irradiation. In addition, using model oligonucleotides with a single UV lesion surrounded by multiple restriction enzyme sites, we demonstrate in vitro that DDB2 and PARP-1 can simultaneously bind to UV-damaged DNA and that PARP-1 casts a bilateral asymmetric footprint from −12 to +9 nucleotides on either side of the UV lesion. These techniques will permit characterization of the different roles of PARP-1 in the repair of UV-damaged DNA and also allow the study of the normal housekeeping roles of PARP-1 with undamaged DNA. PMID:26753915

  19. Clearing Unexploded Ordnance: Bayesian Methodology for Assessing Success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, K K.

    2005-10-30

    The Department of Defense has many Formerly Used Defense Sites (FUDS) that are slated for transfer to public use. Some sites have unexploded ordnance (UXO) that must be cleared prior to any land transfers. Sites are characterized using geophysical sensing devices, and locations are identified where UXO may possibly be present. In practice, based on the analysis of the geophysical surveys, a dig list of N suspect locations is created for a site that is possibly contaminated with UXO. The suspect locations on the dig list are often assigned to K bins ranging from "most likely to contain UXO" to "least likely to be UXO" based on signal discrimination techniques and expert judgment. Usually all dig-list locations are sampled to determine whether UXO is present before the site is declared free of UXO. While this method is 100% certain to ensure that no UXO remains in the locations identified by signal discrimination and expert judgment, it is very costly. This paper proposes a statistical Bayesian methodology that may allow digging fewer than 100% of the suspect locations while reaching a pre-defined tolerable risk, where risk is defined in terms of a low probability that any UXO remains in the unsampled dig-list locations. Two important features of a Bayesian approach are that it can account for uncertainties in model parameters and that it can handle data that become available in stages. The results from each stage can be used to direct the subsequent digs.
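
    As a minimal illustration of the Bayesian idea described above (not the author's actual model), the sketch below treats each dig-list location as containing UXO with an unknown common probability p, places a Beta prior on p, updates it with the dig results so far, and computes the posterior probability that all remaining unsampled locations are clear. The prior parameters, the independence assumption, and the pooling of all K bins into one group are simplifications for illustration.

        import numpy as np
        from scipy.special import betaln

        def prob_no_uxo_remaining(n_dug, n_uxo_found, n_remaining, alpha0=1.0, beta0=1.0):
            """Posterior probability that none of the unsampled dig-list locations contain UXO.

            With p ~ Beta(alpha0, beta0) a priori and x UXO found in n digs, the posterior is
            Beta(alpha0 + x, beta0 + n - x), and
            P(all m remaining clear) = E[(1 - p)^m] = B(alpha, beta + m) / B(alpha, beta).
            """
            alpha = alpha0 + n_uxo_found
            beta = beta0 + (n_dug - n_uxo_found)
            return np.exp(betaln(alpha, beta + n_remaining) - betaln(alpha, beta))

        # Example: 150 of 200 suspect locations dug, 3 UXO found; risk from the 50 left undug.
        p_clear = prob_no_uxo_remaining(n_dug=150, n_uxo_found=3, n_remaining=50)
        print(f"P(no UXO in unsampled locations) = {p_clear:.3f}")

    Under these assumptions, digging could stop once this probability exceeds the pre-defined tolerable-risk threshold; a staged analysis would simply recompute it after each batch of digs.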

  20. Renewable Energy Assessment Methodology for Japanese OCONUS Army Installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solana, Amy E.; Horner, Jacob A.; Russo, Bryan J.

    2010-08-30

    Since 2005, Pacific Northwest National Laboratory (PNNL) has been asked by Installation Management Command (IMCOM) to conduct strategic assessments at selected US Army installations of the potential use of renewable energy resources, including solar, wind, geothermal, biomass, waste, and ground source heat pumps (GSHPs). IMCOM has the same economic, security, and legal drivers to develop alternative, renewable energy resources overseas as it has for installations located in the US. The approach for continental US (CONUS) studies has been to use known, US-based renewable resource characterizations and information sources coupled with local, site-specific sources and interviews. However, the extent to which this sort of data might be available for sites outside the continental US (OCONUS) was unknown. An assessment at Camp Zama, Japan, was completed as a trial to test the applicability of the CONUS methodology at OCONUS installations. It was found that, with some help from Camp Zama personnel in translating and locating a few Japanese sources, there was relatively little difficulty in finding sources that should provide a solid basis for conducting an assessment of comparable depth to those conducted for US installations. Project implementation will likely be more of a challenge, but the feasibility analysis will be able to use the same basic steps, with some adjusted inputs, as PNNL’s established renewable resource assessment methodology.

  1. A Measure of the Broad Substrate Specificity of Enzymes Based on ‘Duplicate’ Catalytic Residues

    PubMed Central

    Chakraborty, Sandeep; Ásgeirsson, Bjarni; Rao, Basuthkar J.

    2012-01-01

    The ability of an enzyme to select and act upon a specific class of compounds with unerring precision and efficiency is an essential feature of life. Simultaneously, these enzymes often catalyze the reaction of a range of similar substrates of the same class, and also have promiscuous activities on unrelated substrates. Previously, we established a methodology to quantify promiscuous activities in a wide range of proteins. In the current work, we quantitatively characterize the ability of the active site to catalyze distinct, yet related, substrates (BRASS). A protein with known structure and active site residues provides the framework for computing ‘duplicate’ residues, each of which results in slightly modified replicas of the active site scaffold. Such spatial congruence is supplemented by finite-difference Poisson-Boltzmann analysis, which filters out electrostatically unfavorable configurations. The congruent configurations are used to compute an index (BrassIndex), which reflects the broad substrate profile of the active site. We identify an acetylhydrolase and a methyltransferase as having the lowest and highest BrassIndex, respectively, from a set of non-homologous proteins extracted from the Catalytic Site Atlas. The acetylhydrolase, a regulatory enzyme, is known to be highly specific for platelet-activating factor. In the methyltransferase (PDB: 1QAM), various combinations of glycine (Gly38/40/42), asparagine (Asn101/11) and glutamic acid (Glu59/36) residues having similar spatial and electrostatic profiles to the specified scaffold (Gly38, Asn101 and Glu59) exemplify the broad substrate profile such an active site may provide. ‘Duplicate’ residues identified by relaxing the spatial and/or electrostatic constraints can be the target of directed evolution methodologies, like saturation mutagenesis, for modulating the substrate specificity of proteins. PMID:23166637

  2. Human health exposure assessment for Rocky Mountain Arsenal study area evaluations. Volume 6-E. Central study area exposure assessment. Version 4. 1(volume 6-E). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  3. Study area evaluations. Volume 6-H. North plants study area exposure assessment version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  4. Human health exposure assessment for Rocky Mountain Arsenal. Volume 7. Summary exposure assessment version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  5. Human health exposure assessment for Rocky Mountain Arsenal study area evaluations. Volume 6-F. Eastern study area exposure assessment. Version 4. 1(volume 6-F). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  6. Human health exposure assessment for Rocky Mountain Arsenal. Volume 8. Response to comments on the draft exposure assessment version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  7. Human health exposure assessment for Rocky Mountain Arsenal. Volume 2-A. Toxicity assessment. Version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  8. Human health exposure assessment for Rocky Mountain Arsenal. Volume 3. Toxicity assessment version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  9. Human health exposure assessment for Rocky Mountain Arsenal. Volume 1. Land use and exposed population evaluations version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  10. Human health exposure assessment for Rocky Mountain Arsenal study area evaluations. Volume 6-C. Southern study area exposure assessment version 4. 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The objectives of the Human Health Exposure Assessment are to: (1) estimate the type and magnitude of exposures to contaminants; (2) identify contaminants of concern; (3) identify sites for remedial action; (4) recommend sites for the no-action remedial alternative; and (5) provide a basis for detailed characterization of the risk associated with all sites. This document consists of the following: an executive summary; Vol. I - Land use and exposed population evaluations; Vols. II-III - Toxicity assessment (includes Army and Shell toxicity profiles); Vol. IV - PPLV Methodology; Vol. V - PPLV Calculations; Vol. VI - Study area exposure analysis (A Introduction, B Western study area, C Southern study area, D Northern study area, E Central study area, F Eastern study area, G South Plants study area, and H North Plants study area); and Vol. VII - Summary exposure assessment.

  11. The National Visitor Use Monitoring methodology and final results for round 1

    Treesearch

    S.J. Zarnoch; E.M. White; D.B.K. English; Susan M. Kocis; Ross Arnold

    2011-01-01

    A nationwide, systematic monitoring process has been developed to provide improved estimates of recreation visitation on National Forest System lands. Methodology is presented to provide estimates of site visits and national forest visits based on an onsite sampling design of site-days and last-exiting recreationists. Stratification of the site days, based on site type...

  12. Selection and Characterization of Landing Sites for Chandrayaan-2 Lander

    NASA Astrophysics Data System (ADS)

    Gopala Krishna, Barla; Amitabh, Amitabh; Srinivasan, T. P.; Karidhal, Ritu; Nagesh, G.; Manjusha, N.

    2016-07-01

    The Indian Space Research Organisation has planned its second mission to the Moon, known as Chandrayaan-2, which consists of an Orbiter, a Lander and a Rover. This will be India's first soft-landing mission on the lunar surface. The Orbiter, Lander and Rover will each carry scientific payloads that address the scientific objectives of Chandrayaan-2. The Lander will soft-land on the lunar surface, after which the Lander and Rover will carry out payload activities on the surface. Identification of a landing site based on the scientific and engineering constraints of the Lander plays an important role in the success of the mission. The Lander's engineering design imposes constraints on the selection of the landing site, while the landing site/region in turn imposes constraints on the Lander. The constraints that have to be considered in studying a landing site include local slope, Sun illumination during the mission life, radio communication with Earth, global slope towards the equator, boulder size, and crater density and boulder distribution. This paper describes the characterization activities for the different landing locations that have been studied for the Chandrayaan-2 Lander. The sites studied lie in both the South Polar and North Polar regions on the near side of the Moon. The engineering constraints at the sites due to the Lander, factors that affect mission life (i.e., illumination at the location), factors influencing communication with Earth (i.e., RF visibility), and shadow movements have been studied at these locations, and zones that are favourable for landing have been shortlisted. This paper gives the methodology of these studies along with the characteristics of all the sites and recommendations for further action in finalizing the landing area.

  13. Method development and strategy for the characterization of complexly faulted and fractured rhyolitic tuffs, Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karasaki, K.; Galloway, D.

    1991-06-01

    The planned high-level nuclear waste repository at Yucca Mountain, Nevada, would exist in unsaturated, fractured welded tuff. One possible contaminant pathway to the accessible environment is transport by groundwater infiltrating to the water table and flowing through the saturated zone. Therefore, an effort to characterize the hydrology of the saturated zone is being undertaken in parallel with that of the unsaturated zone. As a part of the saturated zone investigation, three wells (UE-25c#1, UE-25c#2, and UE-25c#3, hereafter called the c-holes) were drilled to study the hydraulic and transport properties of the rock formations underlying the planned waste repository. The location of the c-holes is such that the formations penetrated in the unsaturated zone occur at similar depths and with similar thicknesses as at the planned repository site. In characterizing a highly heterogeneous flow system, several issues emerge. (1) The characterization strategy should allow for the virtual impossibility of enumerating and characterizing all heterogeneities. (2) The methodology to characterize the heterogeneous flow system at the scale of the well tests needs to be established. (3) Tools need to be developed for scaling up the information obtained at the well-test scale to the larger scale of the site. In the present paper, the characterization strategy and the methods under development are discussed, with a focus on the design and analysis of the field experiments at the c-holes.

  14. Evaluation of Statistical Methodologies Used in U. S. Army Ordnance and Explosive Work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, G

    2000-02-14

    Oak Ridge National Laboratory was tasked by the U.S. Army Engineering and Support Center (Huntsville, AL) to evaluate the mathematical basis of existing software tools used to assist the Army with the characterization of sites potentially contaminated with unexploded ordnance (UXO). These software tools are collectively known as SiteStats/GridStats. The first purpose of the software is to guide sampling of underground anomalies to estimate a site's UXO density. The second purpose is to delineate areas of homogeneous UXO density that can be used in the formulation of response actions. It was found that SiteStats/GridStats does adequately guide the sampling so that the UXO density estimator for a sector is unbiased. However, the software's techniques for delineation of homogeneous areas perform less well than visual inspection, which is frequently used to override the software in the overall sectorization methodology. The main problems with the software lie in the criteria used to detect nonhomogeneity and those used to recommend the number of homogeneous subareas. SiteStats/GridStats is not a decision-making tool in the classical sense. Although it does provide information to decision makers, it does not require a decision based on that information. SiteStats/GridStats provides information that is supplemented by visual inspections, land-use plans, and risk estimates prior to making any decisions. Although the sector UXO density estimator is unbiased regardless of UXO density variation within a sector, its variability increases with increased sector density variation. For this reason, the current practice of visual inspection of individual sampled grid densities (as provided by SiteStats/GridStats) is necessary to ensure approximate homogeneity, particularly at sites with medium to high UXO density. Together with SiteStats/GridStats override capabilities, this provides a sufficient mechanism for homogeneous sectorization and thus yields representative UXO density estimates. Objections raised by various parties to the use of a numerical "discriminator" in SiteStats/GridStats were likely due to the fact that the statistical technique concerned is customarily applied for a different purpose, and to poor documentation. The "discriminator" in SiteStats/GridStats is a "tuning parameter" for the sampling process, and it affects the precision of the grid density estimates through changes in the required sample size. Sector characterization in terms of a map showing contour lines of constant UXO density, with an expressed uncertainty or confidence level, is recommended as a better basis for remediation decisions than a sector UXO density point estimate. A number of spatial density estimation techniques could be adapted to the UXO density estimation problem.

  15. Design of Highly Selective Platinum Nanoparticle Catalysts for the Aerobic Oxidation of KA-Oil using Continuous-Flow Chemistry.

    PubMed

    Gill, Arran M; Hinde, Christopher S; Leary, Rowan K; Potter, Matthew E; Jouve, Andrea; Wells, Peter P; Midgley, Paul A; Thomas, John M; Raja, Robert

    2016-03-08

    Highly active and selective aerobic oxidation of KA-oil to cyclohexanone (precursor for adipic acid and ɛ-caprolactam) has been achieved in high yields using continuous-flow chemistry by utilizing uncapped noble-metal (Au, Pt & Pd) nanoparticle catalysts. These are prepared using a one-step in situ methodology, within three-dimensional porous molecular architectures, to afford robust heterogeneous catalysts. Detailed spectroscopic characterization of the nature of the active sites at the molecular level, coupled with aberration-corrected scanning transmission electron microscopy, reveals that the synthetic methodology and associated activation procedures play a vital role in regulating the morphology, shape and size of the metal nanoparticles. These active centers have a profound influence on the activation of molecular oxygen for selective catalytic oxidations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.

    2000-01-01

    The NASA Aircraft Structural Integrity (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a materially and geometrically nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 to 0.09 inches. The critical CTOA and the corresponding plane-strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane-stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle-crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch-diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity. Current systems are also not designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized in the applications of seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, and statistical characterization of geological heterogeneity. Such capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site. A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the necessary data for a full 3D attempt to apply the methodology. The experiment also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment was conducted that tested the full modeling-based approach to geological characterization using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and the need for geological heterogeneity to be accounted for in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32-million-node 3D model grid for E3D. Model linking issues were resolved, and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer. An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently, because the shots, though buried, were within the surface layer and hence attenuated. The ability to model a complex 3D geological structure and calculate synthetic seismograms that are in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located, but simultaneous detonations would require strategic placement of the arrays.

  18. Use of USLE/GIS methodology for predicting soil loss in a semiarid agricultural watershed.

    PubMed

    Erdogan, Emrah H; Erpul, Günay; Bayramin, Ilhami

    2007-08-01

    The Universal Soil Loss Equation (USLE) is an erosion model that estimates the average soil loss that would generally result from splash, sheet, and rill erosion on agricultural plots. Recently, use of the USLE has been extended to predicting soil losses and planning control practices in agricultural watersheds through the effective integration of GIS-based procedures that estimate the factor values on a grid-cell basis. This study was performed in the Kazan Watershed, located in central Anatolia, Turkey, to predict soil erosion risk by the USLE/GIS methodology for planning conservation measures at the site. Rain erosivity (R), soil erodibility (K), and cover management factor (C) values of the model were calculated from the erosivity map, soil map, and land use map of Turkey, respectively. R values were site-specifically corrected using the DEM and climatic data. The topographical and hydrological effects on soil loss were characterized by the LS factor, evaluated with the flow accumulation tool using the DEM and watershed delineation techniques. From the resulting soil loss map of the watershed, the magnitude of soil erosion was estimated for the different soil units and land uses, and the most erosion-prone areas, where irreversible soil losses occur, were located within the Kazan watershed. This could be very useful for selecting restoration practices to control soil erosion at the most severely affected sites.
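
    Because the USLE combines its factors multiplicatively (A = R x K x LS x C x P), the GIS implementation reduces to a cell-by-cell product of co-registered factor rasters. The sketch below shows that step only; the array values, the units, and the default support-practice factor P = 1 are illustrative assumptions, not the Kazan Watershed data.

        import numpy as np

        def usle_soil_loss(R, K, LS, C, P=None):
            """Average annual soil loss per grid cell: A = R * K * LS * C * P.

            Inputs are co-registered 2-D arrays of factor values; P defaults to 1
            (no support practice). Units follow the factor maps used (e.g., t/ha/yr).
            """
            if P is None:
                P = np.ones_like(R)
            return R * K * LS * C * P

        # Stand-in 3x3 factor grids for illustration only.
        R = np.full((3, 3), 650.0)                    # rainfall erosivity
        K = np.array([[0.20, 0.25, 0.30]] * 3)        # soil erodibility
        LS = np.array([[0.5, 1.2, 2.4]] * 3).T        # slope length-steepness
        C = np.full((3, 3), 0.15)                     # cover management
        print(usle_soil_loss(R, K, LS, C).round(1))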

  19. A graph-based approach to detect spatiotemporal dynamics in satellite image time series

    NASA Astrophysics Data System (ADS)

    Guttler, Fabio; Ienco, Dino; Nin, Jordi; Teisseire, Maguelonne; Poncelet, Pascal

    2017-08-01

    Enhancing the frequency of satellite acquisitions represents a key issue for the Earth Observation community nowadays. Repeated observations are crucial for monitoring purposes, particularly when intra-annual processes should be taken into account. Time series of images constitute a valuable source of information in these cases. The goal of this paper is to propose a new methodological framework to automatically detect and extract spatiotemporal information from satellite image time series (SITS). Existing methods dealing with this kind of data are usually classification-oriented and cannot provide information about evolutions and temporal behaviors. In this paper we propose a graph-based strategy that combines object-based image analysis (OBIA) with data mining techniques. Image objects computed at each individual timestamp are connected across the time series to generate a set of evolution graphs. Each evolution graph is associated with a particular area within the study site and stores information about its temporal evolution. Such information can be explored in depth at the evolution-graph scale or used to compare the graphs and supply a general picture at the study-site scale. We validated our framework on two study sites located in the south of France and involving different types of natural, semi-natural and agricultural areas. The results obtained from a Landsat SITS support the quality of the methodological approach and illustrate how the framework can be employed to extract and characterize spatiotemporal dynamics.
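
    A minimal sketch of the evolution-graph construction described above: image objects from consecutive dates are linked when they overlap spatially, and the connected components of the resulting graph play the role of evolution graphs. The object representation (sets of pixel coordinates), the overlap measure, and the threshold are assumptions for illustration rather than the authors' OBIA pipeline.

        import networkx as nx

        def build_evolution_graphs(objects_per_date, overlap_threshold=0.3):
            """Link objects across consecutive dates when their pixel sets overlap.

            objects_per_date: list (one entry per timestamp) of dicts mapping an
            object id to its set of pixel coordinates. Returns a directed graph whose
            weakly connected components correspond to individual evolution graphs.
            """
            g = nx.DiGraph()
            for t, objects in enumerate(objects_per_date):
                for oid, pixels in objects.items():
                    g.add_node((t, oid), size=len(pixels))
            for t in range(len(objects_per_date) - 1):
                for oid_a, pix_a in objects_per_date[t].items():
                    for oid_b, pix_b in objects_per_date[t + 1].items():
                        overlap = len(pix_a & pix_b) / min(len(pix_a), len(pix_b))
                        if overlap >= overlap_threshold:
                            g.add_edge((t, oid_a), (t + 1, oid_b), overlap=overlap)
            return g

        # Toy example: one object persists between two dates, another appears at the second date.
        date0 = {"a": {(0, 0), (0, 1), (1, 0)}}
        date1 = {"a": {(0, 1), (1, 0), (1, 1)}, "b": {(5, 5)}}
        graph = build_evolution_graphs([date0, date1])
        print(list(nx.weakly_connected_components(graph)))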

  20. Assessment of Effectiveness of Geologic Isolation Systems: REFERENCE SITE INITIAL ASSESSMENT FOR A SALT DOME REPOSITORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harwell, M. A.; Brandstetter, A.; Benson, G. L.

    1982-06-01

    As a methodology demonstration for the Office of Nuclear Waste Isolation (ONWI), the Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) Program conducted an initial reference site analysis of the long-term effectiveness of a salt dome repository. The Hainesville Salt Dome in Texas was chosen to be representative of the Gulf Coast interior salt domes; however, the Hainesville Site has been eliminated as a possible nuclear waste repository site. The data used for this exercise are not adequate for an actual assessment, nor have all the parametric analyses been made that would adequately characterize the response of the geosystem surrounding the repository. Additionally, because this was the first exercise of the complete AEGIS and Waste Rock Interaction Technology (WRIT) methodology, this report provides the initial opportunity for the methodology, specifically applied to a site, to be reviewed by the community outside the AEGIS. The scenario evaluation, as a part of the methodology demonstration, involved consideration of a large variety of potentially disruptive phenomena, which alone or in concert could lead to a breach in a salt dome repository and to a subsequent transport of the radionuclides to the environment. Without waste- and repository-induced effects, no plausible natural geologic events or processes which would compromise the repository integrity could be envisioned over the one-million-year time frame after closure. Near-field (waste- and repository-induced) effects were excluded from consideration in this analysis, but they can be added in future analyses when that methodology development is more complete. The potential for consequential human intrusion into salt domes within a million-year time frame led to the consideration of a solution mining intrusion scenario. The AEGIS staff developed a specific human intrusion scenario at 100 years and 1000 years post-closure, which is one of a whole suite of possible scenarios. This scenario resulted in the delivery of radionuclide-contaminated brine to the surface, where a portion was diverted to culinary salt for direct ingestion by the existing population. Consequence analyses indicated calculated human doses that would be highly deleterious. Additional analyses indicated that doses well above background would occur from such a scenario even if it occurred a million years into the future. The way to preclude such an intrusion is continued control over the repository site, either through direct institutional control or through the effective passive transfer of information. A secondary aspect of the specific human intrusion scenario involved a breach through the side of the salt dome, through which radionuclides migrated via the groundwater system to the accessible environment. This provided a demonstration of the geotransport methodology that AEGIS can use in actual site evaluations, as well as the WRIT program's capabilities with respect to defining the source term and retardation rates of the radionuclides in the repository. This reference site analysis was initially published as a Working Document in December 1979. That version was distributed for a formal peer review by individuals and organizations not involved in its development. The present report represents a revision, based in part on the responses received from the external reviewers. Summaries of the comments from the reviewers and responses to these comments by the AEGIS staff are presented. The exercise of the AEGIS methodology was successful in demonstrating the methodology and thus in providing a basis for substantive peer review, in terms of further development of the AEGIS site-applications capability and in terms of providing insight into the potential for consequential human intrusion into a salt dome repository.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burton, J. C.; Environmental Research

    The Commodity Credit Corporation (CCC) of the U.S. Department of Agriculture (USDA) has entered into an interagency agreement with the U.S. Department of Energy (DOE) under which Argonne National Laboratory provides technical assistance for hazardous waste site characterization and remediation for the CCC/USDA. Carbon tetrachloride is the contaminant of primary concern at sites in Kansas where former CCC/USDA grain storage facilities were located. Argonne applies its QuickSite® Expedited Site Characterization (ESC) approach to these former facilities. The QuickSite environmental site characterization methodology is Argonne's proprietary implementation of the ESC process (ASTM 1998). Argonne has used this approach at several former CCC/USDA facilities in Kansas, including Agenda, Agra, Everest, and Frankfort. The Argonne ESC approach revolves around a multidisciplinary, team-oriented approach to problem solving. The basic features and steps of the QuickSite methodology are as follows: (1) A team of scientists with diverse expertise and strong field experience is required to make the process work. The Argonne team is composed of geologists, geochemists, geophysicists, hydrogeologists, chemists, biologists, engineers, computer scientists, health and safety personnel, and regulatory staff, as well as technical support staff. Most of the staff scientists are at the Ph.D. level; each has, on average, more than 15 years of experience. The technical team works together throughout the process. In other words, the team that plans the program also implements the program in the field and writes the reports. More experienced scientists do not remain in the office while individuals with lesser degrees or experience carry out the field work. (2) The technical team reviews, evaluates, and interprets existing data for the site and the contaminants there to determine which data sets are technically valid and can be used in initially designing the field program. A basic mistake sometimes made in the site characterization process is failure to use technically sound available data to form working hypotheses on hydrogeology, contaminant distribution, etc., for initial testing. (3) After assembling and interpreting existing data for the site, the entire technical team visits the site to identify as a group the site characteristics that might prohibit or enhance any particular technological approach. Logistic and community constraints are also identified at this point. (4) After the field visit, the team selects a suite of technologies appropriate to the problem and completes the design of the field program. No one technique works well at all sites, and a suite of techniques is necessary to delineate site features fully. In addition, multiple technologies are employed to increase confidence in conclusions about site features. Noninvasive and minimally invasive technologies are emphasized to minimize risk to the environment, the community, and the staff. In no case is the traditional approach of installing a massive number of monitoring wells followed. A dynamic work plan that outlines the program is produced for the sponsoring and regulatory agencies. The word "dynamic" is emphasized because the work plan is viewed as a guide, subject to modification, for the site characterization activity, rather than a document that is absolute and unchangeable. Therefore, the health and safety plan and the quality assurance/quality control plan must be broad and encompass all possible alterations to the plan. The cooperation of the regulating agency is essential to successful implementation of this process. The sponsoring and regulatory agencies are notified if significant changes to the site-specific work plan are necessary. (5) The entire team participates in the technical field program. Several technical activities are undertaken simultaneously. These may range from different surface geophysics investigations to vegetation sampling. Data from the various activities are reduced and interpreted each day by the technical staff. Various computer programs are used to visualize and integrate the data. However, people do the data interpretation and integration, not the computers, which are just one more tool at the site. At the end of the day, the staff members meet, review results, and modify the next day's program as necessary to optimize activities that are generating overlapping or confirming site details. Data are not arbitrarily discarded; each finding must be explained and understood. Anomalous readings may be due to equipment malfunctions, laboratory error, or the inability of a technique to work in a given setting. The suite of selected technologies is adjusted in the field if necessary. (6) The end result of this process is the optimization of the field activity to produce a high-quality technical product that is cost and time effective.

  3. [Use of a methodology to assess a risk at the stage of substantiating the choice of a ground area for placing industrial enterprises].

    PubMed

    Perminova, L A; Karpenko, I L; Barkhatova, L A; Chekryzhov, I V

    2009-01-01

    A human health risk screening was performed at the stage of substantiating the choice of a building site for a petroleum refinery. Based on the calculated exposure levels, the risk to the Buguruslan population was characterized under different conditions: without consideration of the impact of the projected enterprise; with consideration of the pollutants contained in the emissions of the projected enterprise; and at the boundary of a residential block and at that of the estimated control area, at five receptor points. The risk assessment revealed that the industrial enterprise cannot be sited on this territory and that the estimated control area is inadequate, owing to the high carcinogenic and toxic risk from the emissions of the projected enterprise.

  4. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzel, Eric M.

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  5. Radiometric ratio characterization for low-to-mid CPV modules operating in variable irradiance conditions

    NASA Astrophysics Data System (ADS)

    Vorndran, Shelby; Russo, Juan; Zhang, Deming; Gordon, Michael; Kostuk, Raymond

    2012-10-01

    In this work, a concentrating photovoltaic (CPV) design methodology is proposed that aims to maximize system efficiency for a given irradiance condition. In this technique, the acceptance angle of the system is radiometrically matched to the angular spread of the site's average irradiance conditions using a simple geometric ratio. The optical efficiency of CPV systems, from flat-plate to high-concentration, is plotted across all irradiance conditions. Concentrator systems are measured outdoors in various irradiance conditions to test the methodology. This modeling technique is valuable at the design stage to determine the ideal level of concentration for a CPV module. It requires only two inputs: the acceptance angle profile of the system and the site's average direct and diffuse irradiance fractions. The acceptance angle can be determined by ray tracing or by testing a fabricated prototype in the lab with a solar simulator. The average irradiance conditions can be found in the Typical Meteorological Year (TMY3) database. Additionally, the information gained from this technique can be used to determine tracking tolerance, quantify power loss during an isolated weather event, and perform more sophisticated analyses such as I-V curve simulation.
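
    The record does not spell out the geometric ratio, so the sketch below only illustrates the kind of trade-off it captures, under stated assumptions: ideal tracking keeps the direct beam inside the acceptance cone, the diffuse sky is isotropic, and a flat receiver collects roughly sin^2 of the acceptance half-angle of that diffuse component. The function name and the example site fractions are hypothetical.

        import math

        def estimated_collection_fraction(direct_fraction, diffuse_fraction, acceptance_half_angle_deg):
            """Rough fraction of available irradiance a tracked concentrator can use.

            Assumes the direct beam always falls within the acceptance cone and that
            the isotropic diffuse component is collected in proportion to
            sin^2(acceptance half-angle) for a flat receiver.
            """
            theta = math.radians(acceptance_half_angle_deg)
            return direct_fraction + diffuse_fraction * math.sin(theta) ** 2

        # Example: the same +/- 20 degree low-concentration module at a sunny and a cloudy site.
        for name, f_direct in [("sunny site", 0.80), ("cloudy site", 0.55)]:
            frac = estimated_collection_fraction(f_direct, 1.0 - f_direct, 20.0)
            print(f"{name}: ~{frac:.2f} of available irradiance collected")

    Comparing such estimates across concentration levels (each with its own acceptance angle) is one way to pick the concentration level best matched to a site's direct/diffuse split.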

  6. Quantitative phosphoproteomic analysis of neuronal intermediate filament proteins (NF-M/H) in Alzheimer's disease by iTRAQ.

    PubMed

    Rudrabhatla, Parvathi; Grant, Philip; Jaffe, Howard; Strong, Michael J; Pant, Harish C

    2010-11-01

    Aberrant hyperphosphorylation of neuronal cytoskeletal proteins is one of the major pathological hallmarks of neurodegenerative disorders such as Alzheimer disease (AD), amyotrophic lateral sclerosis (ALS), and Parkinson's disease (PD). Human NF-M/H display a large number of KSP repeats in the carboxy-terminal tail domain, which are phosphorylation sites of proline-directed serine/threonine (pSer/Thr-Pro, KS/T-P) kinases. The phosphorylation sites of NF-M/H have not been characterized in AD brain. Here, we use a quantitative phosphoproteomic methodology, isobaric tags for relative and absolute quantitation (iTRAQ), to characterize the NF-M/H phosphorylation sites in AD brain. We identified 13 hyperphosphorylated sites of NF-M: 9 Lys-Ser-Pro (KSP) sites; 2 variant motifs, Glu-Ser-Pro (ESP) Ser-736 and Leu-Ser-Pro (LSP) Ser-837; and 2 non-S/T-P motifs, Ser-783 and Ser-788. All the Ser/Thr residues are phosphorylated at significantly greater abundance in AD brain than in control brain. Ten hyperphosphorylated KSP sites were identified on the C-terminal tail domain of NF-H, with greater abundance of phosphorylation in AD brain compared with control brain. Our data provide direct evidence that NF-M/H are hyperphosphorylated in AD compared with control brain and suggest roles for both proline-directed and non-proline-directed protein kinases in AD. This study represents the first comprehensive iTRAQ analysis and quantification of phosphorylation sites of human NF-M and NF-H from AD brain and suggests that aberrant hyperphosphorylation of neuronal intermediate filament proteins is involved in AD.

  7. A neural network based methodology to predict site-specific spectral acceleration values

    NASA Astrophysics Data System (ADS)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural-network-based methodology that has the potential to replace computationally intensive, site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed-forward back-propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt, using the necessary geological and geotechnical data. Surface-level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as the target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and can also be updated by including more parameters, depending on the state of the art in the subject.
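
    The sketch below shows the general shape of such a model: a feed-forward network with a single hidden layer trained by back-propagation to map earthquake and site parameters to spectral acceleration. It uses scikit-learn's MLPRegressor as a stand-in; the choice of input features, the hidden-layer size, and the synthetic training data are assumptions for illustration, not the Delhi-specific model of this record.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)

        # Stand-in training set: features = (magnitude, source distance in km, spectral period in s);
        # target = log spectral acceleration that site-specific analyses would supply (synthetic here).
        X = np.column_stack([
            rng.uniform(6.0, 8.5, 500),
            rng.uniform(200.0, 350.0, 500),
            rng.uniform(0.05, 2.0, 500),
        ])
        y = 2.0 + 0.8 * X[:, 0] - 0.01 * X[:, 1] - 0.5 * np.log(X[:, 2]) + rng.normal(0.0, 0.1, 500)

        # One hidden layer of 20 sigmoid units, trained by back-propagation (adam optimizer).
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(20,), activation="logistic",
                         solver="adam", max_iter=5000, random_state=0),
        )
        model.fit(X, y)
        print(model.predict([[7.5, 280.0, 0.5]]))   # predicted log-Sa for a new scenario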

  8. Freeze-quench (57)Fe-Mössbauer spectroscopy: trapping reactive intermediates.

    PubMed

    Krebs, Carsten; Bollinger, J Martin

    2009-01-01

    (57)Fe-Mössbauer spectroscopy probes transitions between the nuclear ground state (I=1/2) and the first nuclear excited state (I=3/2) of the (57)Fe nucleus. This technique provides detailed information about the chemical environment and electronic structure of iron and has therefore played an important role in studies of numerous iron-containing proteins and enzymes. In conjunction with the freeze-quench method, (57)Fe-Mössbauer spectroscopy allows for monitoring changes of the iron site(s) during a biochemical reaction. This approach is particularly powerful for the detection and characterization of reactive intermediates. Comparison of experimentally determined Mössbauer parameters to those predicted by density functional theory for hypothetical model structures can then provide detailed insight into the structures of reactive intermediates. We have recently used this methodology to study the reactions of various mononuclear non-heme-iron enzymes by trapping and characterizing several Fe(IV)-oxo reaction intermediates. In this article, we summarize these findings and demonstrate the potential of the method.

  9. Argonne National Laboratory Expedited Site Characterization: First International Symposium on Integrated Technical Approaches to Site Characterization - Proceedings Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-06-08

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently at low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  10. Natural fracture systems studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lorenz, J.C.; Warpinski, N.R.

    The objectives of this program are (1) to develop a basinal-analysis methodology for natural fracture exploration and exploitation, and (2) to determine the important characteristics of natural fracture systems for use in completion, stimulation, and production operations. Natural-fracture basinal analysis begins with studies of fractures in outcrop, core, and logs in order to determine the type of fracturing and the relationship of the fractures to the lithologic environment. Of particular interest are the regional fracture systems that are pervasive in western US tight sand basins. A methodology for applying this analysis is being developed, with the goal of providing a structure for rationally characterizing natural fracture systems basin-wide. Such basin-wide characterizations can then be expanded and supplemented locally, at sites where production may be favorable. Initial application of this analysis is to the Piceance basin, where there is a wealth of data from the Multiwell Experiment (MWX), DOE cooperative wells, and other basin studies conducted by Sandia, CER Corporation, and the USGS (Lorenz and Finley, 1989; Lorenz et al., 1989; Spencer and Keighin, 1984). Such a basinal approach has been capable of explaining the fracture characteristics found throughout the southern part of the Piceance basin and along the Grand Hogback.

  11. High concentrations of heavy metals in PM from ceramic factories of Southern Spain

    NASA Astrophysics Data System (ADS)

    Sánchez de la Campa, Ana M.; de la Rosa, Jesús D.; González-Castanedo, Yolanda; Fernández-Camacho, Rocío; Alastuey, Andrés; Querol, Xavier; Pio, Casimiro

    2010-06-01

    In this study, physicochemical characterization of atmospheric particulate matter (PM) was performed at an urban-industrial background site (Bailén, Southern Spain) that is strongly influenced by emission plumes from ceramic factories. This area is considered one of the towns with the highest PM10 levels and average SO2 concentration in Spain. A three-stage methodology was used: 1) real-time measurements of levels of PM10 and gaseous pollutants, and sampling of PM; 2) chemical characterization using ICP-MS, ICP-OES, CI and TOT, and source apportionment analysis (receptor modelling) of PM; and 3) chemical characterization of emission plumes derived from representative factories. High ambient air concentrations were found for most major components and trace elements compared with other industrialized towns in Spain. V and Ni are considered fingerprints of PM derived from the emissions of brick factories in this area and were of particular interest. The V and Ni concentrations in PM10 were high (122 ng V/m³ and 23.4 ng Ni/m³), with Ni exceeding the 2013 annual target value of the European Directive 2004/107/EC (20 ng/m³). The methodology of this work can be used by Government departments responsible for Environment and Epidemiology in planning control strategies for improving air quality.

  12. Objective sampling design in a highly heterogeneous landscape - characterizing environmental determinants of malaria vector distribution in French Guiana, in the Amazonian region.

    PubMed

    Roux, Emmanuel; Gaborit, Pascal; Romaña, Christine A; Girod, Romain; Dessay, Nadine; Dusfour, Isabelle

    2013-12-01

    Sampling design is a key issue when establishing species inventories and characterizing habitats within highly heterogeneous landscapes. Sampling efforts in such environments may be constrained and many field studies only rely on subjective and/or qualitative approaches to design collection strategy. The region of Cacao, in French Guiana, provides an excellent study site to understand the presence and abundance of Anopheles mosquitoes, their species dynamics and the transmission risk of malaria across various environments. We propose an objective methodology to define a stratified sampling design. Following thorough environmental characterization, a factorial analysis of mixed groups allows the data to be reduced and non-collinear principal components to be identified while balancing the influences of the different environmental factors. Such components defined new variables which could then be used in a robust k-means clustering procedure. Then, we identified five clusters that corresponded to our sampling strata and selected sampling sites in each stratum. We validated our method by comparing the species overlap of entomological collections from selected sites and the environmental similarities of the same sites. The Morisita index was significantly correlated (Pearson linear correlation) with environmental similarity based on i) the balanced environmental variable groups considered jointly (p = 0.001) and ii) land cover/use (p-value < 0.001). The Jaccard index was significantly correlated with land cover/use-based environmental similarity (p-value = 0.001). The results validate our sampling approach. Land cover/use maps (based on high spatial resolution satellite images) were shown to be particularly useful when studying the presence, density and diversity of Anopheles mosquitoes at local scales and in very heterogeneous landscapes.
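
    A simplified sketch of the stratification idea, assuming standardisation plus PCA as a stand-in for the factorial analysis of mixed groups (which additionally handles categorical variables), followed by k-means with five strata and selection of the site nearest each cluster centre; the environmental table is synthetic and the variable names are hypothetical.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin_min

# Hypothetical environmental table: rows = candidate sites, columns = quantitative
# descriptors (e.g. elevation, distance to forest edge, % built-up, % water).
rng = np.random.default_rng(1)
env = rng.normal(size=(200, 8))

# Stand-in for the factorial analysis of mixed groups: scale the variables, then
# keep a few leading, non-collinear components.
scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(env))

# Five strata, as in the study; pick the site closest to each cluster centre.
km = KMeans(n_clusters=5, n_init=20, random_state=1).fit(scores)
closest, _ = pairwise_distances_argmin_min(km.cluster_centers_, scores)
print("stratum sizes:", np.bincount(km.labels_))
print("one sampling site per stratum (row indices):", closest)
```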

  13. Uncertainty quantification and experimental design based on unsupervised machine learning identification of contaminant sources and groundwater types using hydrogeochemical data

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.

    2017-12-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observation mixtures based on Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have been actively performed on real-world data collected at the Los Alamos National Laboratory (LANL) groundwater sites related to Chromium and RDX contamination.
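
    A hedged sketch of the general NMFk idea (not the authors' code): for each candidate number of sources, NMF is run from several random starts, the resulting component vectors are clustered, and the silhouette of those clusters is used as a reproducibility score alongside the reconstruction error; the synthetic mixtures and all names are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def estimate_sources(mixtures, k_range=range(2, 6), n_restarts=10, seed=0):
    """NMFk-style search: for each candidate number of sources k, run NMF from
    several random starts, cluster the resulting component vectors, and score
    how reproducible (well-separated) the components are."""
    rng = np.random.default_rng(seed)
    results = {}
    for k in k_range:
        comps, errs = [], []
        for _ in range(n_restarts):
            model = NMF(n_components=k, init="random",
                        random_state=rng.integers(1_000_000), max_iter=2000)
            W = model.fit_transform(mixtures)   # W (unused here) holds the mixing ratios
            comps.append(model.components_)     # k x n_species source signatures
            errs.append(model.reconstruction_err_)
        stacked = np.vstack(comps)              # (n_restarts*k) x n_species
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(stacked)
        results[k] = (silhouette_score(stacked, labels), np.mean(errs))
    return results

# Hypothetical mixtures: 30 wells x 6 geochemical constituents (non-negative).
rng = np.random.default_rng(3)
true_sources = rng.uniform(0, 1, size=(3, 6))
ratios = rng.dirichlet(np.ones(3), size=30)
data = ratios @ true_sources
print(estimate_sources(data))   # high silhouette / low error expected near k = 3
```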

  14. Objective sampling design in a highly heterogeneous landscape - characterizing environmental determinants of malaria vector distribution in French Guiana, in the Amazonian region

    PubMed Central

    2013-01-01

    Background Sampling design is a key issue when establishing species inventories and characterizing habitats within highly heterogeneous landscapes. Sampling efforts in such environments may be constrained and many field studies only rely on subjective and/or qualitative approaches to design collection strategy. The region of Cacao, in French Guiana, provides an excellent study site to understand the presence and abundance of Anopheles mosquitoes, their species dynamics and the transmission risk of malaria across various environments. We propose an objective methodology to define a stratified sampling design. Following thorough environmental characterization, a factorial analysis of mixed groups allows the data to be reduced and non-collinear principal components to be identified while balancing the influences of the different environmental factors. Such components defined new variables which could then be used in a robust k-means clustering procedure. Then, we identified five clusters that corresponded to our sampling strata and selected sampling sites in each stratum. Results We validated our method by comparing the species overlap of entomological collections from selected sites and the environmental similarities of the same sites. The Morisita index was significantly correlated (Pearson linear correlation) with environmental similarity based on i) the balanced environmental variable groups considered jointly (p = 0.001) and ii) land cover/use (p-value < 0.001). The Jaccard index was significantly correlated with land cover/use-based environmental similarity (p-value = 0.001). Conclusions The results validate our sampling approach. Land cover/use maps (based on high spatial resolution satellite images) were shown to be particularly useful when studying the presence, density and diversity of Anopheles mosquitoes at local scales and in very heterogeneous landscapes. PMID:24289184

  15. Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning

    NASA Astrophysics Data System (ADS)

    Evenson, G. R.

    2012-12-01

    Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations - sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologic and non-hydrologic mediated process connectivity along with post-restoration habitat dynamics, for example, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in-parallel to promote hydrologic and non-hydrologic mediated connectivity amongst existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset given the complexity and stochastic nature of spatio-temporal process variability.
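
    A toy sketch of the algorithm-model coupling described above, with a simple genetic algorithm selecting a subset of candidate wetland sites to maximise nutrient reduction under an area budget; the placeholder scoring vector stands in for SWAT output, and all numbers and names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
N_SITES = 40
reduction = rng.uniform(0.1, 1.0, N_SITES)   # placeholder for SWAT-simulated nutrient removal
area = rng.uniform(1.0, 5.0, N_SITES)        # hectares required per candidate site
AREA_BUDGET = 40.0

def fitness(plan):
    """Total nutrient reduction of a binary restoration plan; over-budget plans score 0."""
    return float(plan @ reduction) if plan @ area <= AREA_BUDGET else 0.0

def genetic_search(pop_size=60, generations=300, p_mut=0.02):
    pop = rng.integers(0, 2, size=(pop_size, N_SITES))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            a, b = rng.integers(0, pop_size, 2)
            p1 = pop[a] if scores[a] >= scores[b] else pop[b]
            a, b = rng.integers(0, pop_size, 2)
            p2 = pop[a] if scores[a] >= scores[b] else pop[b]
            cut = rng.integers(1, N_SITES)            # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            flip = rng.random(N_SITES) < p_mut        # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            new_pop.append(child)
        pop = np.array(new_pop)
    best = max(pop, key=fitness)
    return best, fitness(best)

plan, score = genetic_search()
print("selected sites:", np.flatnonzero(plan), "reduction:", round(score, 2))
```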

  16. Methodology for the nuclear design validation of an Alternate Emergency Management Centre (CAGE)

    NASA Astrophysics Data System (ADS)

    Hueso, César; Fabbri, Marco; de la Fuente, Cristina; Janés, Albert; Massuet, Joan; Zamora, Imanol; Gasca, Cristina; Hernández, Héctor; Vega, J. Ángel

    2017-09-01

    The methodology is devised by coupling different codes. The study of weather conditions, as part of the site data, determines the relative concentrations of radionuclides in the air using ARCON96. The activity in the air is characterized, depending on the source and release sequence specified in NUREG-1465, by the RADTRAD code, which provides the results for the inner cloud source term contribution. Once the activities are known, energy spectra are inferred using ORIGEN-S and used as input for the models of the outer cloud, filters and containment generated with MCNP5. The sum of the different contributions must meet the habitability conditions specified by the CSN (Spanish Nuclear Regulatory Body) (TEDE < 50 mSv and equivalent dose to the thyroid < 500 mSv within 30 days following the accident), so the dose is optimized by varying parameters such as CAGE location, filtration flow, the need for recirculation, and the thicknesses and compositions of the walls. The results for the most penalizing area meet the established criteria, and therefore the CAGE building design based on the methodology presented is radiologically validated.
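
    The final bookkeeping step of the methodology (summing the contributions and checking them against the CSN habitability limits) can be illustrated as below; the dose values are hypothetical placeholders for the ARCON96/RADTRAD/ORIGEN-S/MCNP5 outputs, not results from the study.

```python
# Hypothetical 30-day dose contributions (mSv) for one candidate CAGE configuration,
# standing in for the outputs of the ARCON96/RADTRAD/ORIGEN-S/MCNP5 chain.
contributions_tede = {"inner cloud": 12.0, "outer cloud": 9.5, "filters": 4.1, "containment": 6.3}
contributions_thyroid = {"inner cloud": 110.0, "outer cloud": 95.0, "filters": 40.0, "containment": 55.0}

TEDE_LIMIT_MSV = 50.0       # CSN habitability criterion
THYROID_LIMIT_MSV = 500.0

def habitable(tede_parts, thyroid_parts):
    """Sum the per-pathway contributions and compare against both habitability limits."""
    tede = sum(tede_parts.values())
    thyroid = sum(thyroid_parts.values())
    print(f"TEDE = {tede:.1f} mSv, thyroid = {thyroid:.1f} mSv")
    return tede < TEDE_LIMIT_MSV and thyroid < THYROID_LIMIT_MSV

print("design acceptable:", habitable(contributions_tede, contributions_thyroid))
```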

  17. A methodology based on the "anterior chamber of rabbit eyes" model for noninvasively determining the biocompatibility of biomaterials in an immune privileged site.

    PubMed

    Lu, Pei-Lin; Lai, Jui-Yang; Tabata, Yasuhiko; Hsiue, Ging-Ho

    2008-07-01

    In this study, a novel methodology based on the anterior chamber of rabbit eyes model was developed to evaluate the in vivo biocompatibility of biomaterials in an immune privileged site. The 7-mm-diameter membrane implants, made from either a biological tissue material (amniotic membrane, AM group) or a biomedical polymeric material (gelatin, GM group), were inserted in the rabbit anterior chamber for 36 months and characterized by biomicroscopic examinations, intraocular pressure measurements, and corneal thickness measurements. The noninvasive ophthalmic parameters were scored to provide a quantitative grading system. In this animal model, both AM and GM implants were visible in an ocular immune privileged site during clinical observations. The implants of the AM group appeared as soft tissue patches and underwent a slow dissolution process resulting in a partial reduction of their size. Additionally, the AM implants did not induce any foreign body reaction or change in ocular tissue response for the studied period. By contrast, in the GM group, significant corneal edema, elevated intraocular pressure, and increased corneal thickness were noted in the early postoperative phase (within 3 days) but resolved rapidly with in vivo dissolution of the gelatin. The results from the ocular grading system showed that both implants had good long-term biocompatibility in an ocular immune privileged site for up to 3 years. It is concluded that the anterior chamber of rabbit eyes model is an efficient method for noninvasively determining immune privileged tissue/biomaterial interactions.

  18. Projected Standard on neutron skyshine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, R.M.; Williams, D.S.

    1987-07-01

    Current interest in neutron skyshine arises from the application of dry fuel handling and storage techniques at reactor sites, at the proposed monitored retrievable storage facility and at other facilities being considered as part of the civilian radioactive waste management programs. The chairman of Standards Subcommittee ANS-6, Radiation Protection and Shielding, has requested that a work group be formed to characterize the neutron skyshine problem and, if necessary, prepare a draft Standard. The work group is comprised of representatives of storage cask vendors, architect engineering firms, nuclear utilities, the academic community and staff members of national laboratories and government agencies. The purpose of this presentation summary is to describe the activities of the work group and the scope and contents of the projected Standard, ANS-6.6.2, ''Calculation and Measurement of Direct and Scattered Neutron Radiation from Nuclear Power Operations.'' The specific source under consideration by the work group is an array of dry fuel casks located at a reactor site. However, it is recognized that the scope of the standard should be broad enough to encompass other neutron sources. The Standard will define appropriate methodology for properly characterizing the neutron dose due to skyshine. This dose characterization is necessary, for example, in demonstrating compliance with pertinent regulatory criteria.

  19. Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rajiv N. Bhatt

    2003-02-01

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.
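
    One elementary building block of such an AK-based estimate is decay-correcting an isotope inventory from the end of irradiation to the assay date; the sketch below shows only that step (the full methodology rests on reactor-physics calculations), using well-established half-lives and a hypothetical container inventory.

```python
import math

HALF_LIFE_Y = {"Cs-137": 30.1, "Sr-90": 28.8}   # well-established half-lives (years)

def decay_correct(inventory_ci, elapsed_years):
    """Decay an activity inventory (Ci per isotope) forward by elapsed_years."""
    return {
        iso: a0 * math.exp(-math.log(2) * elapsed_years / HALF_LIFE_Y[iso])
        for iso, a0 in inventory_ci.items()
    }

# Hypothetical end-of-irradiation activities for one waste container.
eoi = {"Cs-137": 12.0, "Sr-90": 9.0}
print(decay_correct(eoi, elapsed_years=25.0))
```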

  20. Radiological Characterization Methodology for INEEL-Stored Remote-Handled Transuranic (RH TRU) Waste from Argonne National Laboratory-East

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuan, P.; Bhatt, R.N.

    2003-01-14

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  1. Proposal and application of a regional methodology of comparative risk assessment for potentially contaminated sites.

    PubMed

    Marzocchini, Manrico; Tatàno, Fabio; Moretti, Michela Simona; Antinori, Caterina; Orilisi, Stefano

    2018-06-05

    A possible approach for determining soil and groundwater quality criteria for contaminated sites is the comparative risk assessment. Originating from but not limited to Italian interest in a decentralised (regional) implementation of comparative risk assessment, this paper first addresses the proposal of an original methodology called CORIAN REG-M, which was created with initial attention to the context of potentially contaminated sites in the Marche Region (Central Italy). To deepen the technical-scientific knowledge and applicability of the comparative risk assessment, the following characteristics of the CORIAN REG-M methodology appear to be relevant: the simplified but logical assumption of three categories of factors (source and transfer/transport of potential contamination, and impacted receptors) within each exposure pathway; the adaptation to quality and quantity of data that are available or derivable at the given scale of concern; the attention to a reliable but unsophisticated modelling; the achievement of a conceptual linkage to the absolute risk assessment approach; and the potential for easy updating and/or refining of the methodology. Further, the application of the CORIAN REG-M methodology to some case-study sites located in the Marche Region indicated the following: a positive correlation can be expected between air and direct contact pathway scores, as well as between individual pathway scores and the overall site scores based on a root-mean-square algorithm; the exposure pathway, which presents the highest variability of scores, tends to be dominant at sites with the highest computed overall site scores; and the adoption of a root-mean-square algorithm can be expected to emphasise the overall site scoring.
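
    The root-mean-square aggregation mentioned for the overall site score can be written in a few lines; the pathway names and score values below are hypothetical and only illustrate the algorithm, not the CORIAN REG-M scoring scales.

```python
import math

def overall_site_score(pathway_scores):
    """Root-mean-square aggregation of individual exposure-pathway scores."""
    vals = list(pathway_scores.values())
    return math.sqrt(sum(v * v for v in vals) / len(vals))

# Hypothetical pathway scores (e.g. on a 0-100 scale) for one site.
scores = {"groundwater": 62.0, "surface water": 35.0, "air": 48.0, "direct contact": 51.0}
print(round(overall_site_score(scores), 1))
```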

  2. Methodological proceedings to evaluate the physical accessibility in urban historic sites.

    PubMed

    Ribeiro, Gabriela Sousa; Martins, Laura Bezerra; Monteiro, Circe Maria Gama

    2012-01-01

    Historic urban sites are set within cultural and social diversity, generating multiple activities and uses, and access to these sites should be available to all people, including those with disabilities. Considering that reusing a methodology that has produced positive results in studies of different historic sites facilitates replication, we aimed to develop methodological procedures that identify conditions of physical accessibility in open public spaces and access to public buildings in historic urban sites, in order to support proposals on design requirements for improvements to the problems diagnosed and to control inadequacies of the physical environment. The study combined methods and techniques from different areas of knowledge, culminating in a proposal built with an emphasis on user participation that can be applied at low cost and in a relatively short period of time.

  3. Stress estimation from borehole scans for prediction of excavation overbreak in brittle rock

    NASA Astrophysics Data System (ADS)

    LeRiche, Andrew Campbell

    In the field of geomechanics, one of the most important considerations during design is the state of stress that exists at the location of a project. Despite this, stress is often the most poorly understood site characteristic, given the current challenges in accurately measuring it. This stems from the fact that stress cannot be measured directly, but must be inferred by disturbing the rockmass and recording its response. Although some methods do exist for the prediction of in situ stress, these provide only point estimates and are often plagued by uncertain results and practical limitations in the field. This research proposes a methodology for continuously predicting stress along a borehole through back analysis of borehole breakout, and shows how the same approach could be employed to predict excavation overbreak. Data were collected at KGHM's Victoria Project in Sudbury, Canada, beginning with site characterization through common geotechnical core logging procedures and laboratory-scale intact core testing. Testing comprised Brazilian tensile strength and unconfined compressive strength tests, with crack accumulation characterized in both cases. From two pilot holes, acoustic televiewer surveys were completed to characterize the occurrence and geometry of breakout. This was done to predict the orientation of the major principal stresses in the horizontal plane, with the results further validated by the geometry of stress-induced core disking. From the laboratory material properties and breakout geometries, a continuum-based back analysis of breakout was performed through the creation of a generic database of stress-dependent numerical models. When compared with the in situ breakout profiles, this provided an estimate of stress as a function of depth along each hole. The effect of borehole fluid on the stress estimate was also considered, providing the upper-bound estimate of stress from this methodology. Given the generic nature of the numerical models, potential shaft overbreak was also assessed using this technique and the previously described estimate of stress.
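
    For context, a widely used closed-form alternative to the thesis's numerical back analysis is the Kirsch-solution breakout relation of Barton, Zoback and Moos (1988), which links breakout width, rock strength and the horizontal stresses; the sketch below applies that relation with hypothetical inputs and ignores thermal stress, so it is only an approximate cross-check, not the methodology described above.

```python
import math

def shmax_from_breakout(width_deg, ucs, shmin, pore_pressure, delta_p=0.0):
    """Closed-form estimate of the maximum horizontal stress from breakout width,
    using the Kirsch-solution relation of Barton, Zoback & Moos (1988):
        SHmax = (UCS + 2*Pp + dP - Shmin*(1 + 2*cos(2*theta_b))) / (1 - 2*cos(2*theta_b))
    with 2*theta_b = 180 deg - breakout width; all stresses in consistent units (MPa).
    Thermal stress is neglected here for simplicity.
    """
    two_theta_b = math.radians(180.0 - width_deg)
    c = math.cos(two_theta_b)
    return (ucs + 2.0 * pore_pressure + delta_p - shmin * (1.0 + 2.0 * c)) / (1.0 - 2.0 * c)

# Hypothetical inputs: 60 deg wide breakouts, UCS = 150 MPa, Shmin = 45 MPa, Pp = 20 MPa.
print(round(shmax_from_breakout(60.0, 150.0, 45.0, 20.0), 1))
```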

  4. The Hollin Hill Landslide Observatory - a decade of geophysical characterization and monitoring

    NASA Astrophysics Data System (ADS)

    Uhlemann, S.; Wilkinson, P. B.; Meldrum, P.; Smith, A.; Dixon, N.; Merritt, A.; Swift, R. T.; Whiteley, J.; Gunn, D.; Chambers, J. E.

    2017-12-01

    Landslides are major and frequent natural hazards. They shape the Earth's surface, and endanger communities and infrastructure worldwide. Within the last decade, landslides caused more than 28,000 fatalities and direct damage exceeding $1.8 billion. Climate change, causing more frequent weather extremes, is likely to increase occurrences of shallow slope failures worldwide. Thus, there is a need to improve our understanding of these shallow, rainfall-induced landslides. In this context, integrated geophysical characterization and monitoring can play a crucial role by providing volumetric data that can be linked to the hydrological and geotechnical conditions of a slope. This enables understanding of the complex hydrological processes most-often being associated with landslides. Here we present a review of a decade of characterizing and monitoring a complex, inland, clayey landslide - forming the "Hollin Hill Landslide Observatory". Within the last decade, this landslide has experienced different activity characteristics, including creep, flow, and rotational failures - thereby providing an excellent testbed for the development of geophysical and geotechnical monitoring instrumentation and methodologies. These include developments of 4D geoelectrical monitoring techniques to estimate electrode positions from the resistivity data, incorporating these into a time-lapse inversion, and imaging moisture dynamics that control the landslide behaviour. Other developments include acoustic emission monitoring, and active and passive seismic monitoring. This work is underpinned by detailed characterization of the landslide, using geomorphological and geological mapping, geotechnical investigations, and a thorough geoelectrical and seismic characterization of the landslide mass. Hence, the data gained from the Hollin Hill landslide observatory has improved our understanding of the shallow landslide dynamics in response to climate change, their mechanics and evolution. The methodological and technical developments achieved at this site are suitable and applicable for implementation on other landslides worldwide.

  5. Initial source and site characterization studies for the U.C. Santa Barbara campus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archuleta, R.; Nicholson, C.; Steidl, J.

    1997-12-01

    The University of California Campus-Laboratory Collaboration (CLC) project is an integrated 3 year effort involving Lawrence Livermore National Laboratory (LLNL) and four UC campuses - Los Angeles (UCLA), Riverside (UCR), Santa Barbara (UCSB), and San Diego (UCSD) - plus additional collaborators at San Diego State University (SDSU), at Los Alamos National Laboratory and in industry. The primary purpose of the project is to estimate potential ground motions from large earthquakes and to predict site-specific ground motions for one critical structure on each campus. This project thus combines the disciplines of geology, seismology, geodesy, soil dynamics, and earthquake engineering into a fully integrated approach. Once completed, the CLC project will provide a template to evaluate other buildings at each of the four UC campuses, as well as provide a methodology for evaluating seismic hazards at other critical sites in California, including other UC locations at risk from large earthquakes. Another important objective of the CLC project is the education of students and other professionals in the application of this integrated, multidisciplinary, state-of-the-art approach to the assessment of earthquake hazard. For each campus targeted by the CLC project, the seismic hazard study will consist of four phases: Phase I - Initial source and site characterization, Phase II - Drilling, logging, seismic monitoring, and laboratory dynamic soil testing, Phase III - Modeling of predicted site-specific earthquake ground motions, and Phase IV - Calculations of 3D building response. This report covers Phase I for the UCSB campus and includes results up through March 1997.

  6. Statistical EMC: A new dimension in electromagnetic compatibility of digital electronic systems

    NASA Astrophysics Data System (ADS)

    Tsaliovich, Anatoly

    Electromagnetic compatibility compliance test results are used as a database for addressing three classes of electromagnetic-compatibility (EMC) related problems: statistical EMC profiles of digital electronic systems, the effect of equipment-under-test (EUT) parameters on the electromagnetic emission characteristics, and EMC measurement specifics. Open area test site (OATS) and absorber line shielded room (AR) results are compared for equipment-under-test highest radiated emissions. The suggested statistical evaluation methodology can be utilized to correlate the results of different EMC test techniques, characterize the EMC performance of electronic systems and components, and develop recommendations for electronic product optimal EMC design.

  7. Humans and ecosystems over the coming millennia: overview of a biosphere assessment of radioactive waste disposal in Sweden.

    PubMed

    Kautsky, Ulrik; Lindborg, Tobias; Valentin, Jack

    2013-05-01

    This is an overview of the strategy used to describe the effects of a potential release from a radioactive waste repository on human exposure and future environments. It introduces a special issue of AMBIO, in which 13 articles show ways of understanding and characterizing the future. The study relies mainly on research performed in the context of a recent safety report concerning a repository for spent nuclear fuel in Sweden (the so-called SR-Site project). The development of a good understanding of on-site processes and acquisition of site-specific data facilitated the development of new approaches for assessment of surface ecosystems. A systematic and scientifically coherent methodology utilizes the understanding of the current spatial and temporal dynamics as an analog for future conditions. We conclude that future ecosystem can be inferred from a few variables and that this multidisciplinary approach is relevant in a much wider context than radioactive waste.

  8. BBD Optimization of K-ZnO Catalyst Modification Process for Heterogeneous Transesterification of Rice Bran Oil to Biodiesel

    NASA Astrophysics Data System (ADS)

    Kabo, K. S.; Yacob, A. R.; Bakar, W. A. W. A.; Buang, N. A.; Bello, A. M.; Ruskam, A.

    2016-07-01

    Environmentally benign zinc oxide (ZnO) was modified with 0-15% (wt.) potassium through wet impregnation and used in the transesterification of rice bran oil (RBO) to form biodiesel. The catalyst was characterized by X-ray powder diffraction (XRD), its basic sites were determined by back titration, and Response Surface Methodology (RSM) with a Box-Behnken Design (BBD) was used to optimize the effect of the modification process variables on the basic sites of the catalyst. The transesterification product, biodiesel, was analyzed by Nuclear Magnetic Resonance (NMR) spectroscopy. The results reveal K-modified ZnO with greatly increased basic sites. A quadratic model with a high regression coefficient (R² = 0.9995) was obtained from the ANOVA of the modification process; optimization under the maximum-basic-sites criterion gave optimum modification conditions of K loading = 8.5% (wt.), calcination temperature = 480 °C and time = 4 hours, with a predicted response of 8.14 mmol/g basic sites, in close agreement with the experimental value of 7.64 mmol/g. The catalyst was used and a biodiesel conversion of 95.53% was obtained, and the effect of potassium leaching in the process was not significant.
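
    A generic sketch of fitting a quadratic response surface to Box-Behnken-style data and locating its maximum, using scikit-learn and SciPy; the design points and responses below are placeholders, not the paper's measurements, so the printed optimum will not reproduce the reported values.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from scipy.optimize import minimize

# Hypothetical Box-Behnken style data: columns = (K loading wt%, calcination T degC,
# time h); response = basic sites (mmol/g). Values are placeholders only.
X = np.array([
    [5, 400, 2], [12, 400, 2], [5, 500, 2], [12, 500, 2],
    [5, 450, 1], [12, 450, 1], [5, 450, 5], [12, 450, 5],
    [8.5, 400, 1], [8.5, 500, 1], [8.5, 400, 5], [8.5, 500, 5],
    [8.5, 450, 3], [8.5, 450, 3], [8.5, 450, 3],
], dtype=float)
y = np.array([4.1, 5.0, 4.6, 5.2, 3.9, 4.8, 5.5, 6.2, 5.8, 6.4, 7.1, 7.6, 7.9, 8.0, 7.8])

# Full quadratic model (linear, interaction and squared terms).
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
print("R^2 on design points:", round(model.score(X, y), 3))

# Locate the factor settings that maximise the fitted response surface.
bounds = [(5, 12), (400, 500), (1, 5)]
res = minimize(lambda x: -model.predict(x.reshape(1, -1))[0],
               x0=np.array([8.5, 450.0, 3.0]), bounds=bounds)
print("optimum (K%, T, h):", np.round(res.x, 2), "predicted basic sites:", round(-res.fun, 2))
```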

  9. Sintering-resistant Single-Site Nickel Catalyst Supported by Metal-Organic Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhanyong; Schweitzer, Neil; League, Aaron

    2016-02-17

    Developing supported single-site catalysts is an important goal in heterogeneous catalysis, since the well-defined active sites afford opportunities for detailed mechanistic studies, thereby facilitating the design of improved catalysts. We present herein a method for installing Ni ions uniformly and precisely on the node of a Zr-based MOF, NU-1000, in high density and large quantity using atomic layer deposition (ALD) in a metal-organic framework (MOF), an approach denoted AIM; the resulting material is denoted Ni-AIM. Ni-AIM is demonstrated to be an efficient gas-phase hydrogenation catalyst upon activation. The structure of the active sites in Ni-AIM is proposed, revealing its single-site nature. More importantly, due to the organic linker used to construct the MOF support, the Ni ions stay isolated throughout the hydrogenation catalysis, in accord with its long-term stability. A quantum chemical characterization of the catalyst and the catalytic process complements the experimental results. With validation of the computational modeling protocols, we further targeted ethylene oligomerization catalysis by Ni-AIM guided by theoretical prediction. Given the generality of the AIM methodology, this emerging class of materials should prove ripe for the discovery of new catalysts for the transformation of volatile substrates.

  10. CMOS Active Pixel Sensor Technology and Reliability Characterization Methodology

    NASA Technical Reports Server (NTRS)

    Chen, Yuan; Guertin, Steven M.; Pain, Bedabrata; Kayaii, Sammy

    2006-01-01

    This paper describes the technology, design features and reliability characterization methodology of a CMOS Active Pixel Sensor. Both overall chip reliability and pixel reliability are projected for the imagers.

  11. Characterizing wood-plastic composites via data-driven methodologies

    Treesearch

    John G. Michopoulos; John C. Hermanson; Robert Badaliance

    2007-01-01

    The recent increase of wood-plastic composite materials in various application areas has underlined the need for an efficient and robust methodology to characterize their nonlinear anisotropic constitutive behavior. In addition, the multiplicity of various loading conditions in structures utilizing these materials further increases the need for a characterization...

  12. WIPP waste characterization program sampling and analysis guidance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  13. A methodology for the characterization and diagnosis of cognitive impairments-Application to specific language impairment.

    PubMed

    Oliva, Jesús; Serrano, J Ignacio; del Castillo, M Dolores; Iglesias, Angel

    2014-06-01

    The diagnosis of mental disorders is in most cases very difficult because of the high heterogeneity and overlap between associated cognitive impairments. Furthermore, early and individualized diagnosis is crucial. In this paper, we propose a methodology to support the individualized characterization and diagnosis of cognitive impairments. The methodology can also be used as a test platform for existing theories on the causes of the impairments. We use computational cognitive modeling to gather information on the cognitive mechanisms underlying normal and impaired behavior. We then use this information to feed machine-learning algorithms to individually characterize the impairment and to differentiate between normal and impaired behavior. We apply the methodology to the particular case of specific language impairment (SLI) in Spanish-speaking children. The proposed methodology begins by defining a task in which normal individuals and individuals with impairment present behavioral differences. Next we build a computational cognitive model of that task and individualize it: we build a cognitive model for each participant and optimize its parameter values to fit the behavior of each participant. Finally, we use the optimized parameter values to feed different machine learning algorithms. The methodology was applied to an existing database of 48 Spanish-speaking children (24 normal and 24 SLI children) using clustering techniques for the characterization, and different classifier techniques for the diagnosis. The characterization results show three well-differentiated groups that can be associated with the three main theories on SLI. Using a leave-one-subject-out testing methodology, all the classifiers except the decision tree (DT) produced sensitivity, specificity and area-under-curve values above 90%, reaching 100% in some cases. The results show that our methodology is able to find relevant information on the underlying cognitive mechanisms and to use it appropriately to provide better diagnosis than existing techniques. It is also worth noting that the individualized characterization obtained using our methodology could be extremely helpful in designing individualized therapies. Moreover, the proposed methodology could be easily extended to other languages and even to other cognitive impairments not necessarily related to language.
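
    A minimal sketch of the leave-one-subject-out evaluation described above, assuming each child is represented by one row of fitted cognitive-model parameters (synthetic stand-ins here) and using an SVM as one of the candidate classifiers; names and data are illustrative only.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, roc_auc_score

# Hypothetical stand-in for the optimised cognitive-model parameters of 48 children
# (24 controls, 24 SLI); each row is one child's fitted parameter vector.
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0.0, 1.0, size=(24, 4)), rng.normal(1.2, 1.0, size=(24, 4))])
y = np.array([0] * 24 + [1] * 24)

clf = SVC(kernel="rbf", probability=True, random_state=0)
# One row per subject, so leave-one-out is leave-one-subject-out here.
pred = cross_val_predict(clf, X, y, cv=LeaveOneOut())
prob = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp),
      "AUC:", round(roc_auc_score(y, prob), 3))
```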

  14. U.S. Department of Energy worker health risk evaluation methodology for assessing risks associated with environmental restoration and waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blaylock, B.P.; Legg, J.; Travis, C.C.

    1995-06-01

    This document describes a worker health risk evaluation methodology for assessing risks associated with Environmental Restoration (ER) and Waste Management (WM). The methodology is appropriate for estimating worker risks across the Department of Energy (DOE) Complex at both programmatic and site-specific levels. This document supports the worker health risk methodology used to perform the human health risk assessment portion of the DOE Programmatic Environmental Impact Statement (PEIS) although it has applications beyond the PEIS, such as installation-wide worker risk assessments, screening-level assessments, and site-specific assessments.

  15. Contaminant source identification using semi-supervised machine learning

    NASA Astrophysics Data System (ADS)

    Vesselinov, Velimir V.; Alexandrov, Boian S.; O'Malley, Daniel

    2018-05-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).

  16. Contaminant source identification using semi-supervised machine learning

    DOE PAGES

    Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan

    2017-11-08

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).

  17. Contaminant source identification using semi-supervised machine learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir Valentinov; Alexandrov, Boian S.; O’Malley, Dan

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical types. Numerous different geochemical constituents and processes may need to be simulated in these models which further complicates the analyses. In this paper, we propose a new contaminant source identification approach that performs decomposition of the observation mixtures based on Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the unknown number of groundwater types and (b) the original geochemical concentration of the contaminant sources from measured geochemical mixtures with unknown mixing ratios without any additional site information. NMFk is tested on synthetic and real-world site data. Finally, the NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios).

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaFreniere, L.

    The Commodity Credit Corporation (CCC), an agency of the U.S. Department of Agriculture (USDA), formerly operated a grain storage facility approximately 1,100 ft north of Centralia from 1949 until 1971. Subsequently, a concrete mixing plant operated on the site (FSA 1997). None of the CCC/USDA structures remain, though belowgrade foundations related to structures associated with the concrete mixing operations are evident. Two additional grain storage facilities currently exist in and near Centralia: the Nemaha County Co-op, approximately 4,000 ft south of the former CCC/USDA facility, and a private grain storage facility near the Don Morris residence, 3,500 ft north of the former CCC/USDA facility (Figure 1.1). The property on which the former facility was located is currently owned by Jeanne Burdett Lacky of Seneca, Kansas. In August-September 1998 the Kansas Department of Health and Environment (KDHE) conducted preliminary investigations at the former CCC/USDA facility, on the basis of the detection of carbon tetrachloride in the domestic well at the Don Morris residence (north of the former CCC/USDA facility). Prior to 1986, commercial grain fumigants containing carbon tetrachloride were commonly used by the CCC/USDA and the grain storage industry to preserve grain. The details of previous investigations in the area and a summary of the findings were reported previously (Argonne 2002a). Because the KDHE detected carbon tetrachloride in groundwater and soil at the former CCC/USDA facility at Centralia that might be related to historical use of carbon tetrachloride-based grain fumigants at the facility, the CCC/USDA is conducting an environmental site investigation to determine the source(s) and extent of the carbon tetrachloride contamination at the former facility near Centralia and to assess whether the contamination requires remedial action. The town of Centralia and all residents near the former CCC/USDA facility currently obtain their water from Rural Water District No.3. Therefore, local residents are not drinking or using the contaminated groundwater detected at the former facility. The Environmental Research Division of Argonne National Laboratory is performing the investigation at Centralia on behalf of the CCC/USDA. Argonne is a nonprofit, multidisciplinary research center operated by the University of Chicago for the U.S. Department of Energy (DOE). The CCC/USDA has entered into an interagency agreement with DOE, under which Argonne provides technical assistance to the CCC/USDA with environmental site characterization and remediation at its former grain storage facilities. At these former facilities, Argonne is applying its QuickSite® environmental site characterization methodology. QuickSite is Argonne's proprietary implementation system for the expedited site characterization process. This methodology has been applied successfully at a number of former CCC/USDA facilities in Nebraska and Kansas and has been adopted by the American Society for Testing and Materials (ASTM 1998) as standard practice for environmental site characterization. Argonne's investigations are conducted with a phased approach. Phase I focuses primarily on the investigation and evaluation of geology, hydrogeology, and hydrogeochemistry to identify potential contaminant pathways at a site. Phase II focuses on delineating the contamination present in both soil and aquifers along the potential migration pathways. Phase I of Argonne's investigation was conducted in March-April 2002.
The results and findings of the Phase I investigation at Centralia were reported previously (Argonne 2003). This report documents the findings of the Phase II activities at Centralia. Section 1 provides a brief history of the area, a review of the Phase I results and conclusions, technical objectives for the Phase II investigation, and a brief description of the sections contained in this report. Section 2 describes the investigative methods used during the Phase II investigation. Section 3 presents all of the data obtained during the investigation. Section 4 describes the interpretation of the pertinent data used to meet the technical objectives of the investigation. Section 5 presents the conclusions of the investigation relative to the technical objectives and outlines further recommendations.

  19. The combined use of heat-pulse flowmeter logging and packer testing for transmissive fracture recognition

    NASA Astrophysics Data System (ADS)

    Lo, Hung-Chieh; Chen, Po-Jui; Chou, Po-Yi; Hsu, Shih-Meng

    2014-06-01

    This paper presents an improved borehole prospecting methodology based on a combination of techniques in the hydrogeological characterization of fractured rock aquifers. The approach is demonstrated by on-site tests carried out in the Hoshe Experimental Forest site and the Tailuge National Park, Taiwan. Borehole televiewer logs are used to obtain fracture location and distribution along boreholes. The heat-pulse flow meter log is used to measure vertical velocity flow profiles which can be analyzed to estimate fracture transmissivity and to indicate hydraulic connectivity between fractures. Double-packer hydraulic tests are performed to determine the rock mass transmissivity. The computer program FLASH is used to analyze the data from the flowmeter logs. The FLASH program is confirmed as a useful tool which quantitatively predicts the fracture transmissivity in comparison to the hydraulic properties obtained from packer tests. The location of conductive fractures and their transmissivity is identified, after which the preferential flow paths through the fracture network are precisely delineated from a cross-borehole test. The results provide robust confirmation of the use of combined flowmeter and packer methods in the characterization of fractured-rock aquifers, particularly in reference to the investigation of groundwater resource and contaminant transport dynamics.
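
    A common simplification of what a flowmeter-plus-packer analysis yields (not the FLASH program itself) is to apportion the packer-test transmissivity among fractures in proportion to the inflow increment each contributes to the flow profile; the sketch below illustrates that apportionment with a hypothetical profile.

```python
def fracture_transmissivities(depth_flow_pairs, total_transmissivity):
    """Apportion a packer-test transmissivity among discrete fractures in proportion
    to the inflow increment each fracture contributes to the upward flow profile.

    depth_flow_pairs: list of (fracture depth [m], flow just above the fracture [L/min]),
    ordered from the bottom of the hole upward; flow below the deepest fracture is 0.
    """
    flows = [q for _, q in depth_flow_pairs]
    increments = [flows[0]] + [q2 - q1 for q1, q2 in zip(flows, flows[1:])]
    total_flow = flows[-1]
    return {
        depth: total_transmissivity * dq / total_flow
        for (depth, _), dq in zip(depth_flow_pairs, increments)
    }

# Hypothetical flowmeter profile: three transmissive fractures under pumping.
profile = [(85.0, 1.2), (62.0, 2.0), (40.0, 5.5)]      # (depth, cumulative flow L/min)
print(fracture_transmissivities(profile, total_transmissivity=3.0e-5))  # m^2/s
```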

  20. Use of Data Libraries for IAEA Nuclear Security Assessment Methodologies (NUSAM) [section 5.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shull, D.; Lane, M.

    2015-06-23

    Data libraries are essential for the characterization of the facility and provide the documented input which enables the facility assessment results and subsequent conclusions. Data Libraries are historical, verifiable, quantified, and applicable collections of testing data on different types of barriers, sensors, cameras, procedures, and/or personnel. Data libraries are developed and maintained as part of any assessment program or process. Data is collected during the initial stages of facility characterization to aid in the model and/or simulation development process. Data library values may also be developed through the use of state testing centers and/or site resources by testing different types of barriers, sensors, cameras, procedures, and/or personnel. If no data exists, subject matter expert opinion and manufacturer's specifications/testing values can be the basis for initially assigning values, but are generally less reliable and lack appropriate confidence measures. The use of existing data libraries that have been developed by a state testing organization reduces the assessment costs by establishing standard delay, detection and assessment values for use by multiple sites or facilities where common barriers and alarms systems exist.

  1. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer

    PubMed Central

    Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results. PMID:28467468
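
    As a rough illustration of what model-based SBT clustering can look like in practice, the sketch below fits a Gaussian mixture to two synthetic CPT-derived features. The feature choice, number of classes, and data are assumptions for illustration only, not the authors' actual workflow.

    ```python
    # Hedged sketch of model-based soil behaviour type (SBT) clustering of CPT
    # logs using a Gaussian mixture; features and class count are hypothetical.

    import numpy as np
    from sklearn.mixture import GaussianMixture
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Hypothetical CPT features: log cone resistance and log friction ratio
    X = np.column_stack([rng.normal(0.8, 0.4, 500), rng.normal(-0.2, 0.3, 500)])

    X_std = StandardScaler().fit_transform(X)
    gmm = GaussianMixture(n_components=4, covariance_type="full", random_state=0)
    sbt_class = gmm.fit_predict(X_std)   # one soil-behaviour-type class per reading
    print(np.bincount(sbt_class))        # number of readings assigned to each class
    ```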

  2. A novel integrated framework and improved methodology of computer-aided drug design.

    PubMed

    Chen, Calvin Yu-Chian

    2013-01-01

    Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all designing aspects remains to be elucidated. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian network, pharmacophore modeling, and structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprised of comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. The ligand pathway was also performed to predict the ligand "in" and "exit" from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.

  3. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    PubMed

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.

  4. Combining Field Monitoring with Remote Sensing to Reconstruct Historical Hydroperiod: a Case Study in a Degrading Tropical Wetland

    NASA Astrophysics Data System (ADS)

    Alonso, A.; Munoz-Carpena, R.; Kaplan, D. A.

    2017-12-01

    Wetland ecosystem structure and function are primarily governed by water regime. Characterizing past and current wetland hydrology is thus crucial for identifying the drivers of long-term wetland degradation. Critically, a lack of spatially distributed and long-term data has impeded such characterization in most wetland systems across the world. The publicly accessible Moderate Resolution Imaging Spectroradiometer (MODIS) satellite products encode spatial and temporal data for landscape monitoring, but it was unclear whether they could be used to reliably predict the hydric status of wetlands due to the mixture of spectral signatures existing within and between such systems. We proposed and tested a methodological framework for the identification of a site-specific wetness status spectral identification rule (WSSIR) using two recent technical innovations: affordable, easily deployable field water level sensors to train the WSSIR with supervised learning, and the powerful cloud-based Google Earth Engine (GEE) platform to rapidly access and process the MODIS imagery. This methodological framework was used in a case study of the globally important Palo Verde National Park tropical wetland in Costa Rica. Results showed that a site-specific WSSIR could reliably detect wetland wet or dry status (hydroperiod) and capture the temporal variability of the wetness status. We applied it to the 500 m 2000-2016 MODIS Land Surface Reflectance daily product to reconstruct hydroperiod history, hence reaching a temporal resolution rarely matched in remote sensing for environmental studies. The analysis of the resulting long-term, spatially distributed MODIS-derived data, coupled with shorter-term, 15-minute resolution field water level time series, provided new insights into the drivers controlling the spatiotemporal dynamics of hydrology within Palo Verde National Park's degrading wetlands. This new knowledge is critical for making informed restoration and management decisions. Specifically, we identified a significant influence of the surrounding rivers and irrigation district, which emphasised the importance of considering the wetland in the watershed context when elaborating management strategies. The methodological framework can be applied to any other mid- to large-scale wetland systems worldwide.
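
    A minimal sketch of the supervised-learning idea behind a wetness-status rule is given below: field water-level labels are used to train a simple decision rule on reflectance values. The band names, thresholds, and data are hypothetical; the study's actual WSSIR was derived from MODIS reflectance processed in Google Earth Engine.

    ```python
    # Illustrative sketch only: learning a wet/dry classification rule from
    # field water-level labels and surface-reflectance samples (synthetic data).

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(1)
    n = 400
    # Hypothetical predictors: near-infrared and shortwave-infrared reflectance
    nir = rng.uniform(0.05, 0.45, n)
    swir = rng.uniform(0.02, 0.40, n)
    # Hypothetical labels from water-level sensors: wet when both bands are dark
    wet = ((nir < 0.20) & (swir < 0.15)).astype(int)

    X = np.column_stack([nir, swir])
    rule = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, wet)
    print(export_text(rule, feature_names=["NIR", "SWIR"]))  # human-readable rule
    ```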

  5. A TRAJECTORY-CLUSTERING CORRELATION METHODOLOGY FOR EXAMINING THE LONG-RANGE TRANSPORT OF AIR POLLUTANTS. (R825260)

    EPA Science Inventory

    We present a robust methodology for examining the relationship between synoptic-scale atmospheric transport patterns and pollutant concentration levels observed at a site. Our approach entails calculating a large number of back-trajectories from the observational site over a long...

  6. Twitter-Based Detection of Illegal Online Sale of Prescription Opioid.

    PubMed

    Mackey, Tim K; Kalyanam, Janani; Katsuki, Takeo; Lanckriet, Gert

    2017-12-01

    To deploy a methodology accurately identifying tweets marketing the illegal online sale of controlled substances. We first collected tweets from the Twitter public application program interface stream filtered for prescription opioid keywords. We then used unsupervised machine learning (specifically, topic modeling) to identify topics associated with illegal online marketing and sales. Finally, we conducted Web forensic analyses to characterize different types of online vendors. We analyzed 619 937 tweets containing the keywords codeine, Percocet, fentanyl, Vicodin, Oxycontin, oxycodone, and hydrocodone over a 5-month period from June to November 2015. A total of 1778 tweets (< 1%) were identified as marketing the sale of controlled substances online; 90% had imbedded hyperlinks, but only 46 were "live" at the time of the evaluation. Seven distinct URLs linked to Web sites marketing or illegally selling controlled substances online. Our methodology can identify illegal online sale of prescription opioids from large volumes of tweets. Our results indicate that controlled substances are trafficked online via different strategies and vendors. Public Health Implications. Our methodology can be used to identify illegal online sellers in criminal violation of the Ryan Haight Online Pharmacy Consumer Protection Act.
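
    The unsupervised step described above can be sketched with off-the-shelf topic modelling, as below; the toy tweets, topic count, and vectorizer settings are placeholders rather than the study's actual pipeline.

    ```python
    # Minimal topic-modelling sketch: surface a "marketing/sale" topic from
    # opioid-keyword tweets using latent Dirichlet allocation (toy data).

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    tweets = [
        "buy oxycodone online no prescription fast shipping",
        "my doctor switched me from vicodin to percocet",
        "cheap hydrocodone for sale discreet delivery",
        "fentanyl patch side effects are rough",
    ]
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(tweets)

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    terms = vec.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top = [terms[i] for i in topic.argsort()[-5:][::-1]]
        print(f"topic {k}: {top}")   # inspect which topic collects the sale-related terms
    ```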

  7. Lunar Orbit Insertion Targeting and Associated Outbound Mission Design for Lunar Sortie Missions

    NASA Technical Reports Server (NTRS)

    Condon, Gerald L.

    2007-01-01

    This report details the Lunar Orbit Insertion (LOI) arrival targeting and associated mission design philosophy for Lunar sortie missions with up to a 7-day surface stay and with global Lunar landing site access. It also documents the assumptions, methodology, and requirements validated by TDS-04-013, Integrated Transit Nominal and Abort Characterization and Sensitivity Study. This report examines the generation of the Lunar arrival parking orbit inclination and Longitude of the Ascending Node (LAN) targets supporting surface missions with global Lunar landing site access. These targets support the Constellation Program requirement for anytime abort (early return) by providing for a minimized worst-case wedge angle [and an associated minimum plane change delta-velocity (ΔV) cost] between the Crew Exploration Vehicle (CEV) and the Lunar Surface Access Module (LSAM) for an LSAM launch anytime during the Lunar surface stay.
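
    The wedge-angle logic can be made concrete with two standard orbital-mechanics relations: the angle between two orbit planes follows from their inclinations and the difference in ascending-node longitude, and a pure plane change through that angle at orbital speed v costs 2 v sin(θ/2). The sketch below uses illustrative numbers, not Constellation Program values.

    ```python
    # Worked sketch of the geometry behind the worst-case wedge angle and its
    # plane-change delta-V cost (standard formulas, illustrative inputs).

    import math

    def wedge_angle(i1, i2, d_raan):
        """Angle between two orbit planes (all angles in radians)."""
        c = math.cos(i1) * math.cos(i2) + math.sin(i1) * math.sin(i2) * math.cos(d_raan)
        return math.acos(max(-1.0, min(1.0, c)))

    def plane_change_dv(v_circ, theta):
        """Impulsive delta-V for a pure plane change of angle theta at speed v_circ."""
        return 2.0 * v_circ * math.sin(theta / 2.0)

    # Hypothetical 100 km lunar parking orbit: circular speed ~1.63 km/s
    theta = wedge_angle(math.radians(90.0), math.radians(86.0), math.radians(3.0))
    print(math.degrees(theta), plane_change_dv(1.63, theta))  # deg, km/s
    ```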

  8. ATP3 Unified Field Study Data

    DOE Data Explorer

    Wolfrum, Ed (ORCID:0000000273618931); Knoshug, Eric (ORCID:000000025709914X); Laurens, Lieve (ORCID:0000000349303267); Harmon, Valerie; Dempster, Thomas (ORCID:000000029550488X); McGowan, John (ORCID:0000000266920518); Rosov, Theresa; Cardello, David; Arrowsmith, Sarah; Kempkes, Sarah; Bautista, Maria; Lundquist, Tryg; Crowe, Brandon; Murawsky, Garrett; Nicolai, Eric; Rowe, Egan; Knurek, Emily; Javar, Reyna; Saracco Alvarez, Marcela; Schlosser, Steve; Riddle, Mary; Withstandley, Chris; Chen, Yongsheng; Van Ginkel, Steven; Igou, Thomas; Xu, Chunyan; Hu, Zixuan

    2017-10-20

    The Algae Testbed Public-Private Partnership (ATP3) was established with the goal of investigating open pond algae cultivation across different geographic, climatic, seasonal, and operational conditions while setting the benchmark for quality data collection, analysis, and dissemination. Identical algae cultivation systems and data analysis methodologies were established at testbed sites across the continental United States and Hawaii. Within this framework, the Unified Field Studies (UFS) were designed to characterize the cultivation of different algal strains during all 4 seasons across this testbed network. The dataset presented here is the complete, curated, climatic, cultivation, harvest, and biomass composition data for each season at each site. These data enable others to do in-depth cultivation, harvest, techno-economic, life cycle, resource, and predictive growth modeling analysis, as well as develop crop protection strategies for the nascent algae industry. NREL Sub award Number: DE-AC36-08-GO28308

  9. Permanent Neonatal Diabetes Caused by Creation of an Ectopic Splice Site within the INS Gene

    PubMed Central

    Gastaldo, Elena; Harries, Lorna W.; Rubio-Cabezas, Oscar; Castaño, Luis

    2012-01-01

    Background The aim of this study was to characterize the genetic etiology in a patient who presented with permanent neonatal diabetes at 2 months of age. Methodology/Principal Findings Regulatory elements and coding exons 2 and 3 of the INS gene were amplified and sequenced from genomic and complementary DNA samples. A novel heterozygous INS mutation within the terminal intron of the gene was identified in the proband and her affected father. This mutation introduces an ectopic splice site leading to the insertion of 29 nucleotides from the intronic sequence into the mature mRNA, which results in a longer and abnormal transcript. Conclusions/Significance This study highlights the importance of routinely sequencing the exon-intron boundaries and the need to carry out additional studies to confirm the pathogenicity of any identified intronic genetic variants. PMID:22235272

  10. Prediction of protein-peptide interactions: application of the XPairIt API to anthrax lethal factor and substrates

    NASA Astrophysics Data System (ADS)

    Hurley, Margaret M.; Sellers, Michael S.

    2013-05-01

    As software and methodology develop, key aspects of molecular interactions such as detailed energetics and flexibility are continuously better represented in docking simulations. In the latest iteration of the XPairIt API and Docking Protocol, we perform a blind dock of a peptide into the cleavage site of the Anthrax lethal factor (LF) metalloprotein. Molecular structures are prepared from RCSB:1JKY and we demonstrate a reasonably accurate docked peptide through analysis of protein motion and, using NCI Plot, visualize and characterize the forces leading to binding. We compare our docked structure to the 1JKY crystal structure and the more recent 1PWV structure, and discuss both captured and overlooked interactions. Our results offer a more detailed look at secondary contact and show that both van der Waals and electrostatic interactions from peptide residues further from the enzyme's catalytic site are significant.

  11. Characterizing structures on borehole images and logging data of the Nankai trough accretionary prism: new insights

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose

    2016-04-01

    IODP has extensively used the D/V Chikyu to drill the Kumano portion of the Nankai Trough, including two well sites within the Kumano Basin. IODP Expeditions 338 and 348 drilled deep into the inner accretionary prism south of the Kii Peninsula, collecting a suite of LWD data, including natural gamma ray, electrical resistivity logs, and borehole images, suitable for characterizing structures (fractures and faults) inside the accretionary prism. Structural interpretation and analysis of logging-while-drilling data in the deep inner prism revealed intense deformation of a generally homogeneous lithology characterized by bedding that dips steeply (60-90°) to the NW, intersected by faults and fractures. Multiple phases of deformation are characterized. Borehole images and LWD data acquired over the last decade during previous NanTroSEIZE IODP Expeditions (314, 319) were also analyzed to investigate the internal geometries and structures of the Nankai Trough accretionary prism. This study focused mainly on the characterization of the different types of structures and their specific position within the accretionary prism. New structural constraints and methodologies, as well as a new approach to the study of active structures inside the prism, will be presented.

  12. Field Characterization of the Mineralogy and Organic Chemistry of Carbonates from the 2010 Arctic Mars Analog Svalbard Expedition by Evolved Gas Analysis

    NASA Technical Reports Server (NTRS)

    McAdam, A. C.; Ten Kate, I. L.; Stern, J. C.; Mahaffy, P. R.; Blake, D. F.; Morris, R. V.; Steele, A.; Amundson, H. E. F.

    2011-01-01

    The 2010 Arctic Mars Analog Svalbard Expedition (AMASE) investigated two geologic settings using methodologies and techniques being developed or considered for future Mars missions, such as the Mars Science Laboratory (MSL), ExoMars, and Mars Sample Return. The Sample Analysis at Mars (SAM) [1] instrument suite, which will be on MSL, consists of a quadrupole mass spectrometer (QMS), a gas chromatograph (GC), and a tunable laser mass spectrometer (TLS); all will be applied to analyze gases created by pyrolysis of samples. During AMASE, a Hiden Evolved Gas Analysis-Mass Spectrometer (EGA-MS) system represented the EGA-MS capability of SAM. Another MSL instrument, CheMin, will use x-ray diffraction (XRD) and x-ray fluorescence (XRF) to perform quantitative mineralogical characterization of samples [e.g., 2]. Field-portable versions of CheMin were used during AMASE. AMASE 2010 focused on two sites that represented biotic and abiotic analogs. The abiotic site was the basaltic Sigurdfjell vent complex, which contains Mars-analog carbonate cements including carbonate globules which are excellent analogs for the globules in the ALH84001 martian meteorite [e.g., 3, 4]. The biotic site was the Knorringfjell fossil methane seep, which featured carbonates precipitated in a methane-supported chemosynthetic community [5]. This contribution focuses on EGA-MS analyses of samples from each site, with mineralogy comparisons to CheMin team results. The results give insight into organic content and organic-mineral associations, as well as some constraints on the minerals present.

  13. The Hawaii Undersea Military Munitions Assessment

    NASA Astrophysics Data System (ADS)

    Edwards, Margo H.; Shjegstad, Sonia M.; Wilkens, Roy; King, James C.; Carton, Geoff; Bala, Deserie; Bingham, Brian; Bissonnette, Martine C.; Briggs, Christian; Bruso, Natalie S.; Camilli, Rich; Cremer, Max; Davis, Roger B.; DeCarlo, Eric H.; DuVal, Carter; Fornari, Daniel J.; Kaneakua-Pia, Iolana; Kelley, Christopher D.; Koide, Shelby; Mah, Christopher L.; Kerby, Terry; Kurras, Gregory J.; Rognstad, Mark R.; Sheild, Lukas; Silva, Jeff; Wellington, Basil; Woerkom, Michael Van

    2016-06-01

    The Hawaii Undersea Military Munitions Assessment (HUMMA) is the most comprehensive deep-water investigation undertaken by the United States to look at sea-disposed chemical and conventional munitions. HUMMA's primary scientific objective is to bound, characterize and assess a historic deep-water munitions sea-disposal site to determine the potential impact of the ocean environment on sea-disposed munitions and of sea-disposed munitions on the ocean environment and those that use it. Between 2007 and 2012 the HUMMA team conducted four field programs, collecting hundreds of square kilometers of acoustic data for high-resolution seafloor maps, tens of thousands of digital images, hundreds of hours of video of individual munitions, hundreds of physical samples acquired within two meters of munitions casings, and a suite of environmental data to characterize the ocean surrounding munitions in the study area. Using these data we examined six factors in the study area: (1) the spatial extent and distribution of munitions; (2) the integrity of munitions casings; (3) whether munitions constituents could be detected in sediment, seawater or animals near munitions; (4) whether constituent levels at munitions sites differed significantly from levels at reference control sites; (5) whether statistically significant differences in ecological population metrics could be detected between the two types of sites; and (6) whether munitions constituents or their derivatives potentially pose an unacceptable risk to human health. Herein we provide a general overview of HUMMA including overarching goals, methodologies, physical characteristics of the study area, data collected and general results. Detailed results, conclusions and recommendations for future research are discussed in the accompanying papers included in this volume.

  14. Uncertainty factors in screening ecological risk assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duke, L.D.; Taggart, M.

    2000-06-01

    The hazard quotient (HQ) method is commonly used in screening ecological risk assessments (ERAs) to estimate risk to wildlife at contaminated sites. Many ERAs use uncertainty factors (UFs) in the HQ calculation to incorporate uncertainty associated with predicting wildlife responses to contaminant exposure using laboratory toxicity data. The overall objective was to evaluate the current UF methodology as applied to screening ERAs in California, USA. Specific objectives included characterizing current UF methodology, evaluating the degree of conservatism in UFs as applied, and identifying limitations to the current approach. Twenty-four of 29 evaluated ERAs used the HQ approach: 23 of these used UFs in the HQ calculation. All 24 made interspecies extrapolations, and 21 compensated for its uncertainty, most using allometric adjustments and some using RFs. Most also incorporated uncertainty for same-species extrapolations. Twenty-one ERAs used UFs extrapolating from lowest observed adverse effect level (LOAEL) to no observed adverse effect level (NOAEL), and 18 used UFs extrapolating from subchronic to chronic exposure. Values and application of all UF types were inconsistent. Maximum cumulative UFs ranged from 10 to 3,000. Results suggest UF methodology is widely used but inconsistently applied and is not uniformly conservative relative to UFs recommended in regulatory guidelines and academic literature. The method is limited by lack of consensus among scientists, regulators, and practitioners about magnitudes, types, and conceptual underpinnings of the UF methodology.
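
    A worked numerical sketch of the HQ-with-UFs calculation discussed above is shown below; the exposure dose, toxicity reference value, and UF magnitudes are illustrative, since the reviewed assessments varied widely in how UFs were chosen and combined.

    ```python
    # Hedged sketch of a hazard-quotient calculation with uncertainty factors:
    # the toxicity reference value is divided by the cumulative UF before the
    # exposure dose is compared against it (all values hypothetical).

    def hazard_quotient(exposure_dose, toxicity_ref_value, uncertainty_factors):
        """HQ = exposure dose / (TRV reduced by the product of the applied UFs)."""
        total_uf = 1.0
        for uf in uncertainty_factors:
            total_uf *= uf
        adjusted_trv = toxicity_ref_value / total_uf
        return exposure_dose / adjusted_trv, total_uf

    # Hypothetical example: LOAEL-to-NOAEL UF of 10 and subchronic-to-chronic UF of 10
    hq, cumulative_uf = hazard_quotient(exposure_dose=0.5,       # mg/kg-day
                                        toxicity_ref_value=50.0, # mg/kg-day (LOAEL)
                                        uncertainty_factors=[10, 10])
    print(cumulative_uf, hq)  # cumulative UF = 100, HQ = 1.0 (at the screening threshold)
    ```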

  15. Regional risk assessment for contaminated sites part 2: ranking of potentially contaminated sites.

    PubMed

    Pizzol, Lisa; Critto, Andrea; Agostini, Paola; Marcomini, Antonio

    2011-11-01

    Environmental risks are traditionally assessed and presented in non-spatial ways, although the heterogeneity of contaminant spatial distributions, the spatial positions of and relations between receptors and stressors, as well as the spatial distribution of the variables involved in the risk assessment, strongly influence exposure estimations and hence risks. Taking spatial variability into account is increasingly being recognized as a further and essential step in sound exposure and risk assessment. To address this issue, an innovative methodology which integrates spatial analysis and a relative risk approach was developed. The purpose of this methodology is to prioritize sites at the regional scale where a preliminary site investigation may be required. The methodology, aimed at supporting the inventory of contaminated sites, was implemented within the spatial decision support sYstem for Regional rIsk Assessment of DEgraded land, SYRIADE, and was applied to the case study of the Upper Silesia region (Poland). The developed methodology and tool are both flexible and easy to adapt to different regional contexts, allowing the user to introduce the regionally relevant parameters identified on the basis of user expertise and regional data availability. Moreover, the GIS functionalities used, integrated with mathematical approaches, allow consideration, all at once, of the multiplicity of sources and impacted receptors within the region of concern, assessment of the risks posed by all contaminated sites in the region and, finally, provision of a risk-based ranking of the potentially contaminated sites. Copyright © 2011. Published by Elsevier Ltd.

  16. Applicability of a bioelectronic cardiac monitoring system for the detection of biological effects of pollution in bioindicator species in the Gulf of Finland

    NASA Astrophysics Data System (ADS)

    Kholodkevich, Sergey V.; Kuznetsova, Tatiana V.; Sharov, Andrey N.; Kurakin, Anton S.; Lips, Urmas; Kolesova, Natalia; Lehtonen, Kari K.

    2017-07-01

    Field testing of an innovative technology based on a bioelectronic cardiac monitoring system was carried out in the Gulf of Finland (Baltic Sea). The study shows that the bioelectronic system is suitable for the selected bivalve mollusks Mytilus trossulus, Macoma balthica and Anodonta anatina. Specimens taken from reference sites demonstrated a heart rate recovery time of < 60 min after testing with a changed salinity load, while those collected from sites characterized by high anthropogenic pressure demonstrated a prolonged recovery time of up to 110-360 min. These results make it possible to discriminate among the study sites based on the assessment of the physiological adaptive capacities of the inhabiting species. In addition, the approach of measuring heart rate characteristics in M. balthica transplanted in cages to specific target areas was successfully used to evaluate the decline in the adaptive potential of mollusks exposed at polluted sites. Application of the novel system is a useful tool for the biomonitoring of freshwater and brackish water areas. Development of a methodological basis for testing the adaptive capacities (health) of key aquatic organisms provides new knowledge of the biological effects of anthropogenic chemical stress in aquatic organisms.
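
    The recovery-time metric can be computed directly from a heart-rate time series, as in the sketch below; the series, baseline, and tolerance are invented for illustration and do not reproduce the authors' exact criterion.

    ```python
    # Simple sketch of a heart-rate recovery-time metric after a salinity test
    # load: the first time the rate returns to (and stays within) a tolerance
    # band around its baseline (synthetic data, hypothetical tolerance).

    import numpy as np

    def recovery_time_min(hr, baseline, t_min, tol=0.1):
        """hr: heart-rate series after the load; t_min: matching time stamps (min);
        returns the first time the rate stays within +/- tol of baseline, else None."""
        within = np.abs(hr - baseline) <= tol * baseline
        for i in range(len(hr)):
            if within[i:].all():
                return t_min[i]
        return None

    t = np.arange(0, 180, 10)                                   # minutes
    hr = 30 + 20 * np.exp(-t / 40.0)                            # beats/min, synthetic decay
    print(recovery_time_min(hr, baseline=30.0, t_min=t))        # minutes to recover
    ```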

  17. Emissions from cold heavy oil production with sands (CHOPS) facilities in Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Roscioli, J. R.; Herndon, S. C.; Yacovitch, T. I.; Knighton, W. B.; Zavala-Araiza, D.; Johnson, M. R.; Tyner, D. R.

    2017-12-01

    Cold heavy oil production with sands (CHOPS) is generally characterized as a pump-driven oil extraction method producing a mixture of sand, water, and heavy oil to heated liquid storage tanks. In addition to fluids, CHOPS sites also produce solution gas, primarily composed of methane, through the well annulus. Depending on formation and well production characteristics, large volumes of this solution gas are frequently vented to the atmosphere without flaring or conservation. To better understand these emissions, we present measurements of methane, ethane, propane and aromatic emission rates from CHOPS sites using the dual tracer flux ratio methodology. The use of two tracers allowed on-site emission sources to be accurately identified and in one instance indicated that the annular vent was responsible for >75% of emissions at the facility. Overall, a measurement survey of five CHOPS sites finds that methane emissions are in general significantly under-reported by operators. This under-reporting may arise from uncertainties associated with measured gas-to-oil ratios upon which the reported vent volume is based. Finally, measurements of ethane, propane and aromatics from these facilities indicate surprisingly low non-methane hydrocarbon content.
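
    A simplified sketch of the tracer flux ratio calculation is given below: the methane emission rate is inferred from the known tracer release rate and the ratio of plume-integrated enhancements measured downwind. The assumed tracer (acetylene), transect values, and release rate are illustrative, not values from this study.

    ```python
    # Simplified tracer flux ratio sketch: emission rate from the ratio of
    # plume-integrated mole-fraction enhancements, scaled by molar masses.

    import numpy as np

    def tracer_flux_ratio(ch4_enh_ppb, tracer_enh_ppb, tracer_release_kg_h,
                          m_ch4=16.04, m_tracer=26.04):
        """Enhancements from the same downwind transect with equal point spacing
        (the spacing cancels in the ratio); tracer release in kg/h -> CH4 in kg/h."""
        molar_ratio = np.sum(ch4_enh_ppb) / np.sum(tracer_enh_ppb)
        return tracer_release_kg_h * molar_ratio * (m_ch4 / m_tracer)

    # Hypothetical downwind transect values (ppb above background)
    ch4 = np.array([0, 40, 120, 90, 20, 0], dtype=float)
    c2h2 = np.array([0, 10, 30, 25, 5, 0], dtype=float)
    print(tracer_flux_ratio(ch4, c2h2, tracer_release_kg_h=2.0))  # kg CH4 per hour
    ```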

  18. Characterization of the 3-D fracture setting of an unstable rock mass: From surface and seismic investigations to numerical modeling

    NASA Astrophysics Data System (ADS)

    Colombero, C.; Baillet, L.; Comina, C.; Jongmans, D.; Vinciguerra, S.

    2017-08-01

    The characterization of the fracturing state of a potentially unstable rock cliff is a crucial requirement for stability assessments and mitigation purposes. Classical measurements of fracture location and orientation can however be limited by inaccessible rock exposures. The steep topography and high-rise morphology of these cliffs, together with the widespread presence of fractures, can additionally condition the success of geophysical prospecting on these sites. In order to mitigate these limitations, an innovative approach combining noncontact geomechanical measurements, active and passive seismic surveys, and 3-D numerical modeling is proposed in this work to characterize the 3-D fracture setting of an unstable rock mass, located in NW Italian Alps (Madonna del Sasso, VB). The 3-D fracture geometry was achieved through a combination of field observations and noncontact geomechanical measurements on oriented pictures of the cliff, resulting from a previous laser-scanning and photogrammetric survey. The estimation of fracture persistence within the rock mass was obtained from surface active seismic surveys. Ambient seismic noise and earthquakes recordings were used to assess the fracture control on the site response. Processing of both data sets highlighted the resonance properties of the unstable rock volume decoupling from the stable massif. A finite element 3-D model of the site, including all the retrieved fracture information, enabled both validation and interpretation of the field measurements. The integration of these different methodologies, applied for the first time to a complex 3-D prone-to-fall mass, provided consistent information on the internal fracturing conditions, supplying key parameters for future monitoring purposes and mitigation strategies.

  19. Methane emissions from the global oil and gas supply chain: recent advances and next steps

    NASA Astrophysics Data System (ADS)

    Zavala Araiza, D.; Herndon, S. C.; Roscioli, J. R.; Yacovitch, T. I.; Knighton, W. B.; Johnson, M.; Tyner, D. R.; Hamburg, S.

    2017-12-01

    A wide body of research has characterized methane emissions from the oil and gas system in the US. In contrast, empirical data is limited for other significant oil and gas producing regions across the world. As a consequence, measuring and characterizing methane emissions across global oil and gas operations will be crucial to the design of effective mitigation strategies. Several countries have announced pledges to reduce methane emissions from this system (e.g., North America, Climate and Clean Air Coalition [CCAC] ministers). In the case of Canada, the federal government recently announced regulations supporting a 40-45% reduction of methane emissions from the oil and gas production systems. For these regulations to be effective, it is critical to understand the current methane emission patterns. We present results from a coordinated multiscale (i.e., airborne-based, ground-based) measurement campaign in Alberta, Canada. We use empirically derived emission estimates to characterize site-level emissions and derive an emissions distribution. Our work shows that many major sources of emissions are unmeasured or underreported. Consistent with previous studies in the US, a small fraction of sites disproportionately account for the majority of emissions: roughly 20% of sites accounted for 75% of emissions. An independent airborne-based regional estimate was 40% lower than the ground-based regional estimate, but not statistically different. Finally, we summarize next steps as part of the CCAC Oil and Gas Methane Study: ongoing work that is targeting oil and gas sectors/production regions with limited empirical data on methane emissions. This work builds on the approach deployed in quantifying methane emissions from the oil and gas supply chain in the US, underscoring the commitment to transparency of the collected data, external review, deployment of multiple methodologies, and publication of results in peer-reviewed journals.
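
    The skewed-distribution statistic quoted above (a small fraction of sites dominating total emissions) reduces to a simple cumulative-share computation, sketched below with invented site-level rates.

    ```python
    # Small sketch: share of total emissions contributed by the highest-emitting
    # 20% of sites (site-level rates below are invented for illustration).

    import numpy as np

    rates = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.8, 1.0, 1.5, 6.0, 12.0])  # kg/h, hypothetical
    sorted_rates = np.sort(rates)[::-1]
    top20 = int(np.ceil(0.2 * rates.size))
    share = sorted_rates[:top20].sum() / rates.sum()
    print(f"top 20% of sites emit {share:.0%} of the total")
    ```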

  20. NANOSTRUCTURED METAL OXIDE CATALYSTS VIA BUILDING BLOCK SYNTHESES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig E. Barnes

    2013-03-05

    A broadly applicable methodology has been developed to prepare new single-site catalysts on silica supports. This methodology requires three critical components: a rigid building block that will be the main structural and compositional component of the support matrix; a family of linking reagents that will be used to insert active metals into the matrix as well as cross-link building blocks into a three-dimensional matrix; and a clean coupling reaction that will connect building blocks and linking agents together in a controlled fashion. The final piece of conceptual strategy at the center of this methodology involves dosing the building block with known amounts of linking agents so that the targeted connectivity of a linking center to surrounding building blocks is obtained. Achieving targeted connectivities around catalytically active metals in these building block matrices is a critical element of the strategy by which single-site catalysts are obtained. This methodology has been demonstrated with a model system involving only silicon and then with two metal-containing systems (titanium and vanadium). The effect that connectivity has on the reactivity of atomically dispersed titanium sites in silica building block matrices has been investigated in the selective oxidation of phenols to benzoquinones. 2-connected titanium sites are found to be five times as active (in terms of initial turnover frequencies) as 4-connected titanium sites (i.e., framework titanium sites).

  1. Identifying Aquifer Heterogeneities using the Level Set Method

    NASA Astrophysics Data System (ADS)

    Lu, Z.; Vesselinov, V. V.; Lei, H.

    2016-12-01

    Material interfaces between hydrostratigraphic units (HSUs) with contrasting aquifer parameters (e.g., strata and facies with different hydraulic conductivity) have a great impact on flow and contaminant transport in the subsurface. However, the identification of HSU shape in the subsurface is challenging and typically relies on tomographic approaches where a series of steady-state/transient head measurements at spatially distributed observation locations are analyzed using inverse models. In this study, we developed a mathematically rigorous approach for identifying material interfaces among any arbitrary number of HSUs using the level set method. The approach has been tested first with several synthetic cases, where the true spatial distribution of HSUs was assumed to be known and the head measurements were taken from the flow simulation with the true parameter fields. These synthetic inversion examples demonstrate that the level set method is capable of characterizing the spatial distribution of the heterogeneities. We then applied the methodology to a large-scale problem in which the spatial distribution of pumping wells and observation well screens is consistent with the actual aquifer contamination (chromium) site at the Los Alamos National Laboratory (LANL). In this way, we test the applicability of the methodology at an actual site. We also present preliminary results using the actual LANL site data. In addition, we investigated the impact of the number of pumping/observation wells and the drawdown observation frequencies/intervals on the quality of the inversion results, and examined the uncertainties associated with the estimated HSU shapes and the accuracy of the results under different hydraulic-conductivity contrasts between the HSUs.
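
    The core representational idea can be sketched compactly: a level-set function assigns each grid cell to one HSU or another according to its sign, and the resulting conductivity field is what the flow model sees. The sketch below shows only this forward representation (with hypothetical values); the iterative update of the level-set function against head data is omitted.

    ```python
    # Conceptual sketch of the level-set representation of material interfaces:
    # the sign of phi selects which HSU (and hydraulic conductivity) occupies
    # each grid cell. Values and geometry are illustrative only.

    import numpy as np

    nx, ny = 50, 50
    x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))

    # A hypothetical level-set function: its zero contour is the material interface
    phi = (x - 0.5) ** 2 + (y - 0.5) ** 2 - 0.3 ** 2

    K_inside, K_outside = 1e-6, 1e-4            # m/s, two contrasting HSUs
    K = np.where(phi < 0.0, K_inside, K_outside)

    print(K.shape, np.unique(K))                # conductivity field fed to the flow model
    ```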

  2. Assessment of the ecotoxicity of urban estuarine sediment using benthic and pelagic copepod bioassays.

    PubMed

    Charry, Maria P; Keesing, Vaughan; Costello, Mark; Tremblay, Louis A

    2018-01-01

    Urban estuarine sediments are sinks to a range of contaminants of anthropogenic origin, and a key challenge is to characterize the risk of these compounds to receiving environments. In this study, the toxicity of urban estuarine sediments was tested using acute and chronic bioassays in the benthic harpacticoid Quinquelaophonte sp., and in the planktonic calanoid Gladioferens pectinatus, two New Zealand copepod species. The sediment samples from the estuary tributary sites significantly impacted reproduction in Quinquelaophonte sp. However, results from one of the estuary sites were not significantly different to those from the tributary sites, suggesting that chemicals other than trace metals, polycyclic aromatic hydrocarbons and ammonia may be the causative stressors. Sediment elutriate samples had significant effects on reproductive endpoints in G. pectinatus, and on the induction of DNA damage in cells, as shown by the comet assay. The results indicate that sediment contamination at the Ahuriri Estuary has the potential to impact biological processes of benthic and pelagic organisms. The approach used provides a standardized methodology to assess the toxicity of estuarine sediments.

  3. Experimental and simulated ultrasonic characterization of complex damage in fused silica.

    PubMed

    Martin, L Peter; Chambers, David H; Thomas, Graham H

    2002-02-01

    The growth of a laser-induced, surface damage site in a fused silica window was monitored by the ultrasonic pulse-echo technique. The laser damage was grown using 12-ns pulses of 1.053-microm wavelength light at a fluence of approximately 27 J/cm2. The ultrasonic data were acquired after each pulse of the laser beam for 19 pulses. In addition, optical images of the surface and subsurface damage shape were recorded after each pulse of the laser. The ultrasonic signal amplitude exhibited variations with the damage size, which were attributed to the subsurface morphology of the damage site. A mechanism for the observed ultrasonic data based on the interaction of the ultrasound with cracks radiating from the damage site was tested using two-dimensional numerical simulations. The simulated results exhibit qualitatively similar characteristics to the experimental data and demonstrate the usefulness of numerical simulation as an aid for ultrasonic signal interpretation. The observed sensitivity to subsurface morphology makes the ultrasonic methodology a promising tool for monitoring laser damage in large aperture laser optics used in fusion energy research.

  4. Fuzzy logic controllers for electrotechnical devices - On-site tuning approach

    NASA Astrophysics Data System (ADS)

    Hissel, D.; Maussion, P.; Faucher, J.

    2001-12-01

    Fuzzy logic nowadays offers an interesting alternative for designers of nonlinear control laws for electrical or electromechanical systems. However, due to the huge number of tuning parameters, this kind of control is only used in a few industrial applications. This paper proposes a new, very simple, on-site tuning strategy for a PID-like fuzzy logic controller. Using the experimental designs methodology, we propose sets of optimized pre-established settings for this kind of fuzzy controller. The proposed parameters depend only on a single on-site open-loop identification test. In this respect, this on-site tuning methodology is comparable to the Ziegler-Nichols approach for conventional controllers. Experimental results (on a permanent magnet synchronous motor and on a DC/DC converter) underline the efficiency of this tuning methodology. Finally, the field of validity of the proposed pre-established settings will be given.

  5. Refinement of a methodology for siting maintenance area headquarters.

    DOT National Transportation Integrated Search

    1986-01-01

    Prior to this study, a methodology that generates travel time, or isochronal, contours around area headquarters or the housing bases of maintenance crews was developed. The methodology was then pilot tested for the Charlottesville Residency, and was ...

  6. Sequential analysis of hydrochemical data for watershed characterization.

    PubMed

    Thyne, Geoffrey; Güler, Cüneyt; Poeter, Eileen

    2004-01-01

    A methodology for characterizing the hydrogeology of watersheds using hydrochemical data that combine statistical, geochemical, and spatial techniques is presented. Surface water and ground water base flow and spring runoff samples (180 total) from a single watershed are first classified using hierarchical cluster analysis. The statistical clusters are analyzed for spatial coherence confirming that the clusters have a geological basis corresponding to topographic flowpaths and showing that the fractured rock aquifer behaves as an equivalent porous medium on the watershed scale. Then principal component analysis (PCA) is used to determine the sources of variation between parameters. PCA analysis shows that the variations within the dataset are related to variations in calcium, magnesium, SO4, and HCO3, which are derived from natural weathering reactions, and pH, NO3, and chlorine, which indicate anthropogenic impact. PHREEQC modeling is used to quantitatively describe the natural hydrochemical evolution for the watershed and aid in discrimination of samples that have an anthropogenic component. Finally, the seasonal changes in the water chemistry of individual sites were analyzed to better characterize the spatial variability of vertical hydraulic conductivity. The integrated result provides a method to characterize the hydrogeology of the watershed that fully utilizes traditional data.
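
    The statistical sequence described above (hierarchical cluster analysis followed by PCA on standardized parameters) can be sketched as below with a hypothetical water-chemistry table; the geochemical (PHREEQC) and spatial steps are not reproduced.

    ```python
    # Hedged sketch of the statistical part of the workflow: standardize the
    # hydrochemical parameters, cluster samples hierarchically (Ward linkage),
    # then use PCA to see which parameters drive the variance (synthetic data).

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    # Hypothetical samples x parameters (e.g., Ca, Mg, SO4, HCO3, NO3, Cl, pH)
    samples = rng.lognormal(mean=1.0, sigma=0.5, size=(60, 7))
    Z = StandardScaler().fit_transform(samples)

    # Hierarchical cluster analysis, cut into 3 water groups
    clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

    # PCA to identify the dominant sources of variation between parameters
    pca = PCA(n_components=2).fit(Z)
    print(np.bincount(clusters)[1:], pca.explained_variance_ratio_)
    ```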

  7. Mercury in Nelson's Sparrow Subspecies at Breeding Sites

    PubMed Central

    Winder, Virginia L.; Emslie, Steven D.

    2012-01-01

    Background Mercury is a persistent, biomagnifying contaminant that can cause negative effects on ecosystems. Marshes are often areas of relatively high mercury methylation and bioaccumulation. Nelson's Sparrows (Ammodramus nelsoni) use marsh habitats year-round and have been documented to exhibit tissue mercury concentrations that exceed negative effects thresholds. We sought to further characterize the potential risk of Nelson's Sparrows to mercury exposure by sampling individuals from sites within the range of each of its subspecies. Methodology/Principal Findings From 2009 to 2011, we captured adult Nelson's Sparrows at sites within the breeding range of each subspecies (A. n. nelsoni: Grand Forks and Upham, North Dakota; A. n. alterus: Moosonee, Ontario; and A. n. subvirgatus: Grand Manan Island, New Brunswick) and sampled breast feathers, the first primary feather (P1), and blood for total mercury analysis. Mean blood mercury in nelsoni individuals captured near Grand Forks ranged from 0.84±0.37 to 1.65±1.02 SD ppm among years, between 2.0 and 4.9 times as high as concentrations at the other sites (P<0.01). Breast feather mercury did not vary among sites within a given sampling year (site means ranged from 0.98±0.69 to 2.71±2.93 ppm). Mean P1 mercury in alterus (2.96±1.84 ppm fw) was significantly lower than in any other sampled population (5.25±2.24–6.77±3.51 ppm; P≤0.03). Conclusions/Significance Our study further characterized mercury in Nelson's Sparrows near Grand Forks; we documented localized and potentially harmful mercury concentrations, indicating that this area may represent a biological mercury hotspot. This finding warrants further research to determine if wildlife populations of conservation or recreational interest in this area may be experiencing negative effects due to mercury exposure. We present preliminary conclusions about the risk of each sampled population to mercury exposure. PMID:22384194

  8. Development and application of a methodology for a clean development mechanism to avoid methane emissions in closed landfills.

    PubMed

    Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M

    2013-01-01

    In Brazil, Solid Waste Disposal Sites have operated without consideration of environmental criteria, these areas being characterized by methane (CH4) emissions during the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, whereby projects that seek to reduce the emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards related to the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to the CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after landfill closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result obtained through the application of the methodology in the landfill case study was that it would be possible to achieve an ex-ante emission reduction of 74,013 tCO2 equivalent if the proposed CDM project activity were implemented.
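
    Baseline emissions in such methodologies are typically built on a first-order-decay model of methane generation. The sketch below is a heavily simplified single-waste-stream version with illustrative default parameters; it is not the approved CDM tool and omits oxidation, flaring, and global-warming-potential terms.

    ```python
    # Simplified first-order-decay (FOD) sketch of a landfill methane baseline:
    # each year's deposited waste generates CH4 that decays exponentially over
    # time. Parameter values are illustrative defaults only.

    import math

    def baseline_ch4_tonnes(waste_tonnes_by_year, target_year,
                            doc=0.15, doc_f=0.5, mcf=1.0, f_ch4=0.5, k=0.06):
        """waste_tonnes_by_year: {deposit_year: tonnes of MSW}; returns tonnes of
        CH4 generated in target_year by waste already in place."""
        total = 0.0
        for x, w in waste_tonnes_by_year.items():
            if x > target_year:
                continue
            decayed = math.exp(-k * (target_year - x)) * (1.0 - math.exp(-k))
            total += w * doc * doc_f * mcf * f_ch4 * (16.0 / 12.0) * decayed
        return total

    deposits = {2000: 50_000, 2001: 55_000, 2002: 60_000}   # hypothetical closed landfill
    print(baseline_ch4_tonnes(deposits, target_year=2012))  # t CH4 generated in 2012
    ```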

  9. Bidirectional Retroviral Integration Site PCR Methodology and Quantitative Data Analysis Workflow.

    PubMed

    Suryawanshi, Gajendra W; Xu, Song; Xie, Yiming; Chou, Tom; Kim, Namshin; Chen, Irvin S Y; Kim, Sanggu

    2017-06-14

    Integration Site (IS) assays are a critical component of the study of retroviral integration sites and their biological significance. In recent retroviral gene therapy studies, IS assays, in combination with next-generation sequencing, have been used as a cell-tracking tool to characterize clonal stem cell populations sharing the same IS. For the accurate comparison of repopulating stem cell clones within and across different samples, the detection sensitivity, data reproducibility, and high-throughput capacity of the assay are among the most important assay qualities. This work provides a detailed protocol and data analysis workflow for bidirectional IS analysis. The bidirectional assay can simultaneously sequence both upstream and downstream vector-host junctions. Compared to conventional unidirectional IS sequencing approaches, the bidirectional approach significantly improves IS detection rates and the characterization of integration events at both ends of the target DNA. The data analysis pipeline described here accurately identifies and enumerates identical IS sequences through multiple steps of comparison that map IS sequences onto the reference genome and determine sequencing errors. Using an optimized assay procedure, we have recently published the detailed repopulation patterns of thousands of Hematopoietic Stem Cell (HSC) clones following transplant in rhesus macaques, demonstrating for the first time the precise time point of HSC repopulation and the functional heterogeneity of HSCs in the primate system. The following protocol describes the step-by-step experimental procedure and data analysis workflow that accurately identifies and quantifies identical IS sequences.

  10. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-01-01

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.
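
    The Gauss-Newton/Bayesian machinery behind DFM can be illustrated on a toy problem: a data-misfit term and a prior term are minimized iteratively, and the final normal-equation matrix gives an approximate posterior covariance. The forward model, prior, and noise levels below are stand-ins, not a groundwater flow simulator.

    ```python
    # Toy Gauss-Newton sketch of Bayesian inverse modelling: minimize data misfit
    # plus a prior penalty, then read uncertainty off the normal-equation matrix.

    import numpy as np

    def forward(m, x):
        # Hypothetical nonlinear forward model: "head" = m0 * exp(-m1 * x)
        return m[0] * np.exp(-m[1] * x)

    def jacobian(m, x):
        return np.column_stack([np.exp(-m[1] * x), -m[0] * x * np.exp(-m[1] * x)])

    x = np.linspace(0.0, 5.0, 20)
    m_true = np.array([2.0, 0.6])
    rng = np.random.default_rng(3)
    d = forward(m_true, x) + rng.normal(0.0, 0.02, x.size)   # noisy observations

    Cd_inv = np.eye(x.size) / 0.02**2        # data precision
    Cm_inv = np.eye(2) / 1.0**2              # prior precision
    m_prior = np.array([1.0, 1.0])

    m = m_prior.copy()
    for _ in range(10):                      # Gauss-Newton iterations
        J = jacobian(m, x)
        r = d - forward(m, x)
        H = J.T @ Cd_inv @ J + Cm_inv
        g = J.T @ Cd_inv @ r - Cm_inv @ (m - m_prior)
        m = m + np.linalg.solve(H, g)

    post_cov = np.linalg.inv(H)              # approximate posterior covariance
    print(m, np.sqrt(np.diag(post_cov)))     # estimates and 1-sigma uncertainties
    ```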

  11. Application of data fusion modeling (DFM) to site characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, D.W.; Gibbs, B.P.; Jones, W.F.

    1996-12-31

    Subsurface characterization is faced with substantial uncertainties because the earth is very heterogeneous, and typical data sets are fragmented and disparate. DFM removes many of the data limitations of current methods to quantify and reduce uncertainty for a variety of data types and models. DFM is a methodology to compute hydrogeological state estimates and their uncertainties from three sources of information: measured data, physical laws, and statistical models for spatial heterogeneities. The benefits of DFM are savings in time and cost through the following: the ability to update models in real time to help guide site assessment, improved quantification of uncertainty for risk assessment, and improved remedial design by quantifying the uncertainty in safety margins. A Bayesian inverse modeling approach is implemented with a Gauss Newton method where spatial heterogeneities are viewed as Markov random fields. Information from data, physical laws, and Markov models is combined in a Square Root Information Smoother (SRIS). Estimates and uncertainties can be computed for heterogeneous hydraulic conductivity fields in multiple geological layers from the usually sparse hydraulic conductivity data and the often more plentiful head data. An application of DFM to the Old Burial Ground at the DOE Savannah River Site will be presented. DFM estimates and quantifies uncertainty in hydrogeological parameters using variably saturated flow numerical modeling to constrain the estimation. Then uncertainties are propagated through the transport modeling to quantify the uncertainty in tritium breakthrough curves at compliance points.

  12. A quality evaluation methodology of health web-pages for non-professionals.

    PubMed

    Currò, Vincenzo; Buonuomo, Paola Sabrina; Onesimo, Roberta; de Rose, Paola; Vituzzi, Andrea; di Tanna, Gian Luca; D'Atri, Alessandro

    2004-06-01

    The proposal of an evaluation methodology for determining the quality of healthcare web sites for the dissemination of medical information to non-professionals. Three (macro) factors are considered for the quality evaluation: medical contents, accountability of the authors, and usability of the web site. Starting from two results in the literature, the problem of whether or not to introduce a weighting function has been investigated. This methodology has been validated on a specialized information content, i.e., sore throats, due to the large interest such a topic enjoys among target users. The World Wide Web was accessed using a meta-search system merging several search engines. A statistical analysis was made to compare the proposed methodology with the obtained ranks of the sample web pages. The statistical analysis confirms that the variables examined (per item and sub-factor) show substantially similar ranks and are capable of contributing to the evaluation of the main quality macro factors. A comparison between the aggregation functions in the proposed methodology (non-weighted averages) and the weighting functions derived from the literature allowed us to verify the suitability of the method. The proposed methodology suggests a simple approach which can quickly award an overall quality score to medical web sites oriented to non-professionals.
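
    The aggregation question investigated above (weighted versus non-weighted averaging of macro-factor scores) is illustrated by the small sketch below; the scores and weights are placeholders, not values from the study.

    ```python
    # Minimal sketch contrasting unweighted averaging (as adopted) with a
    # weighted alternative from the literature (hypothetical scores and weights).

    scores = {"medical_content": 4.0, "accountability": 3.0, "usability": 4.5}  # 0-5 scale
    weights = {"medical_content": 0.5, "accountability": 0.3, "usability": 0.2}

    unweighted = sum(scores.values()) / len(scores)
    weighted = sum(scores[k] * weights[k] for k in scores)
    print(round(unweighted, 2), round(weighted, 2))
    ```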

  13. SITE CHARACTERIZATION LIBRARY VERSION 3.0

    EPA Science Inventory

    The Site Characterization Library is a CD that provides a centralized, field-portable source for site characterization information. Version 3 of the Site Characterization Library contains additional (from earlier versions) electronic documents and computer programs related to th...

  14. UAS-Borne Photogrammetry for Surface Topographic Characterization: A Ground-Truth Baseline for Future Change Detection and Refinement of Scaled Remotely-Sensed Datasets

    NASA Astrophysics Data System (ADS)

    Coppersmith, R.; Schultz-Fellenz, E. S.; Sussman, A. J.; Vigil, S.; Dzur, R.; Norskog, K.; Kelley, R.; Miller, L.

    2015-12-01

    While long-term objectives of monitoring and verification regimes include remote characterization and discrimination of surficial geologic and topographic features at sites of interest, ground truth data is required to advance development of remote sensing techniques. Increasingly, it is desirable for these ground-based or ground-proximal characterization methodologies to be as nimble, efficient, non-invasive, and non-destructive as their higher-altitude airborne counterparts while ideally providing superior resolution. For this study, the area of interest is an alluvial site at the Nevada National Security Site intended for use in the second phase of the Source Physics Experiment (Snelson et al., 2013). Ground-truth surface topographic characterization was performed using a DJI Inspire 1 unmanned aerial system (UAS), at very low altitude (< 5-30m AGL). 2D photographs captured by the standard UAS camera payload were imported into Agisoft Photoscan to create three-dimensional point clouds. Within the area of interest, careful installation of surveyed ground control fiducial markers supplied the targets necessary for field collection and the information needed for model georectification. The resulting model includes a Digital Elevation Model derived from 2D imagery. It is anticipated that this flexible and versatile characterization process will provide point cloud data resolution equivalent to a purely ground-based LiDAR scanning deployment (e.g., 1-2cm horizontal and vertical resolution; Sussman et al., 2012; Schultz-Fellenz et al., 2013). In addition to drastically increasing time efficiency in the field, the UAS method also allows for more complete coverage of the study area when compared to ground-based LiDAR. Comparison and integration of these data with conventionally-acquired airborne LiDAR data from a higher-altitude (~ 450m) platform will aid significantly in the refinement of technologies and detection capabilities of remote optical systems to identify and detect surface geologic and topographic signatures of interest. This work includes a preliminary comparison of surface signatures detected from varying standoff distances to assess current sensor performance and benefits.

  15. Large Eddy Simulation and Reynolds-Averaged Navier-Stokes modeling of flow in a realistic pharyngeal airway model: an investigation of obstructive sleep apnea.

    PubMed

    Mihaescu, Mihai; Murugappan, Shanmugam; Kalra, Maninder; Khosla, Sid; Gutmark, Ephraim

    2008-07-19

    Computational fluid dynamics techniques employing primarily steady Reynolds-Averaged Navier-Stokes (RANS) methodology have been recently used to characterize the transitional/turbulent flow field in human airways. The use of RANS implies that flow phenomena are averaged over time, the flow dynamics not being captured. Further, RANS uses two-equation turbulence models that are not adequate for predicting anisotropic flows, flows with high streamline curvature, or flows where separation occurs. A more accurate approach for such flow situations that occur in the human airway is Large Eddy Simulation (LES). The paper considers flow modeling in a pharyngeal airway model reconstructed from cross-sectional magnetic resonance scans of a patient with obstructive sleep apnea. The airway model is characterized by a maximum narrowing at the site of retropalatal pharynx. Two flow-modeling strategies are employed: steady RANS and the LES approach. In the RANS modeling framework both k-epsilon and k-omega turbulence models are used. The paper discusses the differences between the airflow characteristics obtained from the RANS and LES calculations. The largest discrepancies were found in the axial velocity distributions downstream of the minimum cross-sectional area. This region is characterized by flow separation and large radial velocity gradients across the developed shear layers. The largest difference in static pressure distributions on the airway walls was found between the LES and the k-epsilon data at the site of maximum narrowing in the retropalatal pharynx.

  16. Enhanced characterization of singly protonated phosphopeptide ions by femtosecond laser-induced ionization/dissociation tandem mass spectrometry (fs-LID-MS/MS).

    PubMed

    Smith, Scott A; Kalcic, Christine L; Safran, Kyle A; Stemmer, Paul M; Dantus, Marcos; Reid, Gavin E

    2010-12-01

    To develop an improved understanding of the regulatory role that post-translational modifications (PTMs) involving phosphorylation play in the maintenance of normal cellular function, tandem mass spectrometry (MS/MS) strategies coupled with ion activation techniques such as collision-induced dissociation (CID) and electron-transfer dissociation (ETD) are typically employed to identify the presence and site-specific locations of the phosphate moieties within a given phosphoprotein of interest. However, the ability of these techniques to obtain sufficient structural information for unambiguous phosphopeptide identification and characterization is highly dependent on the ion activation method employed and the properties of the precursor ion that is subjected to dissociation. Herein, we describe the application of a recently developed alternative ion activation technique for phosphopeptide analysis, termed femtosecond laser-induced ionization/dissociation (fs-LID). In contrast to CID and ETD, fs-LID is shown to be particularly suited to the analysis of singly protonated phosphopeptide ions, yielding a wide range of product ions including a, b, c, x, y, and z sequence ions, as well as ions that are potentially diagnostic of the positions of phosphorylation (e.g., 'a(n)+1-98'). Importantly, the lack of phosphate moiety losses or phosphate group 'scrambling' provides unambiguous information for sequence identification and phosphorylation site characterization. Therefore, fs-LID-MS/MS can serve as a complementary technique to established methodologies for phosphoproteomic analysis. Copyright © 2010. Published by Elsevier Inc.

  17. Retrieval of Aerosol Parameters from Continuous H24 Lidar-Ceilometer Measurements

    NASA Astrophysics Data System (ADS)

    Dionisi, D.; Barnaba, F.; Costabile, F.; Di Liberto, L.; Gobbi, G. P.; Wille, H.

    2016-06-01

    Ceilometer technology is increasingly applied to the monitoring and characterization of tropospheric aerosols. In this work, a method to estimate some key aerosol parameters (extinction coefficient, surface area concentration and volume concentration) from ceilometer measurements is presented. A numerical model has been set up to derive mean functional relationships between backscatter and the above-mentioned parameters, based on a large set of simulated aerosol optical properties. Good agreement was found between the modeled backscatter and extinction coefficients and the ones measured by the EARLINET Raman lidars. The developed methodology has then been applied to the measurements acquired by a prototype Polarization Lidar-Ceilometer (PLC). This PLC instrument was developed within the EC LIFE+ project "DIAPASON" as an upgrade of the commercial, single-channel Jenoptik CHM15k system. The PLC ran continuously (24 h per day) close to Rome (Italy) for a whole year (2013-2014). Retrievals of the aerosol backscatter coefficient at 1064 nm and of the relevant aerosol properties were performed using the proposed methodology. This information, coupled to the aerosol type identification made possible by the depolarization channel, allowed a year-round characterization of the aerosol field at this site. Examples are given to show how this technology, coupled to appropriate data inversion methods, is potentially useful in the operational monitoring of parameters of air quality and meteorological interest.
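
    The core of the retrieval described above is a pre-computed functional relationship mapping ceilometer backscatter to extinction (and to surface area and volume concentration). The sketch below only illustrates the act of applying such a relationship to a profile, assuming a power-law form with placeholder coefficients; the actual relationships derived from the authors' aerosol model are not reproduced here.

```python
import numpy as np

# Hypothetical power law linking aerosol backscatter beta (m^-1 sr^-1) at 1064 nm
# to extinction alpha (m^-1): alpha = A * beta**B. Placeholder coefficients only.
A_COEF, B_COEF = 40.0, 0.95

def extinction_from_backscatter(beta):
    return A_COEF * np.asarray(beta) ** B_COEF

# Toy profile: exponentially decaying backscatter over 0-3 km with 15 m gates.
z = np.arange(0.0, 3000.0, 15.0)
beta = 2e-6 * np.exp(-z / 1500.0)
alpha = extinction_from_backscatter(beta)
aod = np.sum(alpha) * (z[1] - z[0])   # crude column integral (optical depth)
print(f"column optical depth ~ {aod:.3f}")
```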

  18. Hierarchical SAPO‐34 Architectures with Tailored Acid Sites using Sustainable Sugar Templates

    PubMed Central

    Miletto, Ivana; Ivaldi, Chiara; Paul, Geo; Chapman, Stephanie; Marchese, Leonardo; Raja, Robert

    2018-01-01

    In a distinct, bottom‐up synthetic methodology, monosaccharides (fructose and glucose) and disaccharides (sucrose) have been used as mesoporogens to template hierarchical SAPO‐34 catalysts. Detailed materials characterization, which includes solid‐state magic angle spinning NMR and probe‐based FTIR, reveals that, although the mesopore dimensions are modified by the identity of the sugar template, the desirable acid characteristics of the microporous framework are retained. When the activity of the hierarchical SAPO‐34 catalysts was evaluated in the industrially relevant Beckmann rearrangement, under liquid‐phase conditions, the enhanced mass‐transport properties of sucrose‐templated hierarchical SAPO‐34 were found to deliver a superior yield of ϵ‐caprolactam. PMID:29686961

  19. Bacterial Production of Site Specific 13C Labeled Phenylalanine and Methodology for High Level Incorporation into Bacterially Expressed Recombinant Proteins

    PubMed Central

    Ramaraju, Bhargavi; McFeeters, Hana; Vogler, Bernhard; McFeeters, Robert L.

    2016-01-01

    Nuclear magnetic resonance spectroscopy studies of ever larger systems have benefited from many different forms of isotope labeling, in particular, site specific isotopic labeling. Site specific 13C labeling of methyl groups has become an established means of probing systems not amenable to traditional methodology. However useful, methyl reporter sites can be limited in number and/or location. Therefore, new complementary site specific isotope labeling strategies are valuable. Aromatic amino acids make excellent probes since they are often found at important interaction interfaces and play significant structural roles. Aromatic side chains have many of the same advantages as methyl containing amino acids including distinct 13C chemical shifts and multiple magnetically equivalent 1H positions. Herein we report economical bacterial production and one-step purification of phenylalanine with 13C incorporation at the Cα, Cγ and Cε positions, resulting in two isolated 1H-13C spin systems. We also present methodology to maximize incorporation of phenylalanine into recombinantly overexpressed proteins in bacteria and demonstrate compatibility with ILV-methyl labeling. Inexpensive, site specific isotope labeled phenylalanine adds another dimension to biomolecular NMR, opening new avenues of study. PMID:28028744

  20. Fish community-based measures of estuarine ecological quality and pressure-impact relationships

    NASA Astrophysics Data System (ADS)

    Fonseca, Vanessa F.; Vasconcelos, Rita P.; Gamito, Rita; Pasquaud, Stéphanie; Gonçalves, Catarina I.; Costa, José L.; Costa, Maria J.; Cabral, Henrique N.

    2013-12-01

    Community-based responses of fish fauna to anthropogenic pressures have been extensively used to assess the ecological quality of estuarine ecosystems. Several methodologies have been developed recently combining metrics reflecting community structure and function. A fish community facing significant environmental disturbances will be characterized by a simplified structure, with lower diversity and complexity. However, estuaries are naturally dynamic ecosystems exposed to numerous human pressures, making it difficult to distinguish between natural and anthropogenic-induced changes to the biological community. In the present work, the variability of several fish metrics was assessed in relation to different pressures in estuarine sites. The response of a multimetric index (Estuarine Fish Assessment Index, EFAI) was also analysed. Overall, fish metrics and the multimetric index signalled anthropogenic stress, particularly environmental chemical pollution. The fish assemblage associated with this type of pressure was characterized by lower species diversity, a lower number of functional guilds, lower abundance of marine migrants and of piscivorous individuals, and higher abundance of estuarine resident species. A decreased ecological quality status, based on the EFAI, was also determined for sites associated with this pressure group. Ultimately, the definition of each pressure group favoured a stressor-specific analysis, evidencing pressure patterns and accounting for multiple factors in a highly dynamic environment.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
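
    The key output described above is a probability distribution of total path delay rather than a single worst-case sum. The record does not give the underlying Bayesian model, so the following is only a minimal Monte Carlo sketch of the general idea: each task time along a hypothetical adversary path is drawn from an assumed lognormal distribution (a placeholder choice) and the draws are summed to build up the delay distribution.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sequential barrier/task delays: (name, median time in s, spread).
# The lognormal form and these numbers are assumptions, not the study's model.
TASKS = [("cut fence", 60, 0.4), ("breach door", 180, 0.5), ("open vault", 600, 0.3)]

def sample_path_delay(n=100_000):
    total = np.zeros(n)
    for _, median, sigma in TASKS:
        total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)
    return total

delay = sample_path_delay()
p10, p50, p90 = np.percentile(delay, [10, 50, 90])
print(f"total path delay: 10th {p10:.0f} s, median {p50:.0f} s, 90th {p90:.0f} s")
```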

  2. Iterative performance assessments as a regulatory tool for evaluating repository safety: How experiences from SKI Project-90 were used in formulating the new performance assessment project SITE-94

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersson, J.

    1993-12-31

    The Swedish Nuclear Power Inspectorate's (SKI) regulatory research program has to prepare for the process of licensing a repository for spent nuclear fuel by building up the necessary knowledge and review capacity. SKI's main strategy for meeting this demand is to develop an independent performance assessment capability. SKI's first performance assessment project, Project-90, was completed in 1991 and is now followed by a new project, SITE-94. SITE-94 is based on conclusions reached within Project-90. An independent review of Project-90, carried out by a NEA team of experts, has also contributed to the formation of the project. Another important reason for the project is that the implementing organization in Sweden, SKB, has proposed to submit an application to start detailed investigation of a repository candidate site around 1997. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site-specific data should be assimilated into the performance assessment process, and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. This will be addressed by exploring multiple interpretations, conceptual models, and parameters consistent with the site data. The site evaluation will strive for consistency between geological, hydrological, rock mechanical, and geochemical descriptions. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing and analyzing scenarios, the development of approaches for the treatment of uncertainties, evaluation of canister integrity, and the development and application of an appropriate quality assurance plan for performance assessments.

  3. Investigations in site response from ground motion observations in vertical arrays

    NASA Astrophysics Data System (ADS)

    Baise, Laurie Gaskins

    The aim of the research is to improve the understanding of earthquake site response and to improve the techniques available to investigate issues in this field. Vertical array ground motion data paired with the empirical transfer function (ETF) methodology is shown to accurately characterize site response. This manuscript draws on methods developed in the field of signal processing and statistical time series analysis to parameterize the ETF as an autoregressive moving-average (ARMA) system, which is justified theoretically, historically, and by example. Site response is evaluated at six sites in California, Japan, and Taiwan using ETF estimates, correlation analysis, and full waveform modeling. Correlation analysis is proposed as a required data quality evaluation imperative to any subsequent site response analysis. ETF estimates and waveform modeling are used to decipher the site response at sites with simple and complex geologic structure, which provide simple time-invariant and time-variant methods for evaluating both linear site transfer functions and nonlinear site response for sites experiencing liquefaction of the soils. The Treasure and Yerba Buena Island sites, however, require 2-D waveform modeling to accurately evaluate the effects of the shallow sedimentary basin. ETFs are used to characterize the Port Island site and corresponding shake table tests before, during, and after liquefaction. ETFs derived from the shake table tests were demonstrated to consistently predict the linear field ground response below 16 m depth and the liquefied behavior above 15 m depth. The liquefied interval response was demonstrated to gradually return to pre-liquefied conditions within several weeks of the 1995 Hyogo-ken Nanbu earthquake. Both the site's and the shake table test's responses were shown to be effectively linear up to 0.5 g in the native materials below 16 m depth. The effective linearity of the site response at GVDA, Chiba, and Lotung up to 0.1 g, 0.33 g, and 0.49 g, respectively, further confirms that site response in the field may be more linear than expected from laboratory tests. Strong motions were predicted at these sites with normalized mean square error less than 0.10 using ETFs generated from weak motions. The Treasure Island site response was shown to be dominated by surface waves propagating in the shallow sediments of the San Francisco Bay. Low correlation of the ground motions recorded on rock at Yerba Buena Island and in rock beneath the Treasure Island site intimates that the Yerba Buena site is an inappropriate reference site for Treasure Island site response studies. Accurate simulation of the Treasure Island site response was achieved using a 2-D velocity structure comprised of a 100 m uniform soil basin (Vs = 400 m/s) over a weathered rock veneer (Vs = 1.5 km/s) to 200 m depth.
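
    As an illustration of the empirical transfer function idea only, here is a minimal sketch that treats the downhole record as the input and the surface record as the output of a linear system and estimates the transfer function with a standard cross-spectral (H1) estimator on synthetic data; the ARMA parameterization, correlation analysis, and data quality checks described in the record are not reproduced.

```python
import numpy as np
from scipy import signal

def empirical_transfer_function(downhole, surface, fs, nperseg=1024):
    """H1 estimator: H(f) = S_xy(f) / S_xx(f), with the downhole record as input."""
    f, s_xx = signal.welch(downhole, fs=fs, nperseg=nperseg)
    _, s_xy = signal.csd(downhole, surface, fs=fs, nperseg=nperseg)
    return f, s_xy / s_xx

# Synthetic example: the "surface" record is a band-passed, amplified version of
# the "downhole" record plus noise, mimicking a resonant soil column.
fs = 100.0
rng = np.random.default_rng(1)
downhole = rng.standard_normal(60_000)
b, a = signal.butter(4, [1.0, 5.0], btype="bandpass", fs=fs)
surface = 3.0 * signal.lfilter(b, a, downhole) + 0.1 * rng.standard_normal(60_000)

f, H = empirical_transfer_function(downhole, surface, fs)
print(f"apparent amplification peaks near {f[np.argmax(np.abs(H))]:.1f} Hz")
```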

  4. Controlling allosteric networks in proteins

    NASA Astrophysics Data System (ADS)

    Dokholyan, Nikolay

    2013-03-01

    We present a novel methodology based on graph theory and discrete molecular dynamics simulations for delineating allosteric pathways in proteins. We use this methodology to uncover the structural mechanisms responsible for coupling of distal sites on proteins and utilize it for allosteric modulation of proteins. We will present examples where inference of allosteric networks and their rewiring allows us to "rescue" cystic fibrosis transmembrane conductance regulator (CFTR), a protein associated with the fatal genetic disease cystic fibrosis. We also use our methodology to control protein function allosterically. We design a novel protein domain that can be inserted into an identified allosteric site of a target protein. Using a drug that binds to our domain, we alter the function of the target protein. We successfully tested this methodology in vitro, in living cells and in zebrafish. We further demonstrate the transferability of our allosteric modulation methodology to other systems and extend it to become light-activatable.

  5. Towards a Methodology for the Characterization of Teachers' Didactic-Mathematical Knowledge

    ERIC Educational Resources Information Center

    Pino-Fan, Luis R.; Assis, Adriana; Castro, Walter F.

    2015-01-01

    This research study aims at exploring the use of some dimensions and theoretical-methodological tools suggested by the model of Didactic-Mathematical Knowledge (DMK) for the analysis, characterization and development of knowledge that teachers should have in order to efficiently develop within their practice. For this purpose, we analyzed the…

  6. Preliminary Validation of Composite Material Constitutive Characterization

    Treesearch

    John G. Michopoulos; Athanasios lliopoulos; John C. Hermanson; Adrian C. Orifici; Rodney S. Thomson

    2012-01-01

    This paper describes the preliminary results of an effort to validate a methodology developed for composite material constitutive characterization. This methodology involves using massive amounts of data produced from multiaxially tested coupons via a 6-DoF robotic system called NRL66.3 developed at the Naval Research Laboratory. The testing is followed by...

  7. Uncertainty and variability in laboratory derived sorption parameters of sediments from a uranium in situ recovery site

    NASA Astrophysics Data System (ADS)

    Dangelmayr, Martin A.; Reimus, Paul W.; Johnson, Raymond H.; Clay, James T.; Stone, James J.

    2018-06-01

    This research assesses the ability of a GC SCM to simulate uranium transport under variable geochemical conditions typically encountered at uranium in-situ recovery (ISR) sites. Sediment was taken from a monitoring well at the Smith-Ranch Highland (SRH) site at depths 192 and 193 m below ground and characterized by XRD, XRF, TOC, and BET. Duplicate column studies on the different sediment depths were flushed with synthesized restoration waters at two different alkalinities (160 mg/l CaCO3 and 360 mg/l CaCO3) to study the effect of alkalinity on uranium mobility. Uranium breakthrough occurred 25%-30% earlier in columns with 360 mg/l CaCO3 than in columns fed with 160 mg/l CaCO3 influent water. A parameter estimation program (PEST) was coupled to PHREEQC to derive site densities from experimental data. Significant parameter fittings were produced for all models, demonstrating that the GC SCM approach can model the impact of carbonate on uranium in flow systems. Derived site densities for the two sediment depths were between 141 and 178 μmol-sites/kg-soil, demonstrating similar sorption capacities despite heterogeneity in sediment mineralogy. Model sensitivity to alkalinity and pH was shown to be moderate compared to fitted site densities, when calcite saturation was allowed to equilibrate. Calcite kinetics emerged as a potential source of error when fitting parameters in flow conditions. Fitted results were compared to data from previous batch and column studies completed on sediments from the SRH site, to assess variability in derived parameters. Parameters from batch experiments were lower by a factor of 1.1 to 3.4 compared to column studies completed on the same sediments. The difference was attributed to errors in solid-solution ratios and the impact of calcite dissolution in batch experiments. Column studies conducted at two different laboratories showed almost an order of magnitude difference in fitted site densities, suggesting that experimental methodology may play a bigger role in column sorption behavior than actual sediment heterogeneity. Our results demonstrate the necessity for ISR sites to remove residual pCO2 and equilibrate restoration water with background geochemistry to reduce uranium mobility. In addition, the observed variability between fitted parameters on the same sediments highlights the need to provide standardized guidelines and methodology for regulators and industry when the GC SCM approach is used for ISR risk assessments.
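
    The parameter estimation step pairs an optimizer (PEST) with a geochemical forward model (PHREEQC). As a minimal illustration of that coupling pattern only, the sketch below swaps PHREEQC for a hypothetical one-parameter breakthrough model and fits a sorption-site density to a synthetic observed curve with scipy; the model form, scaling and numbers are placeholders rather than anything from the study.

```python
import numpy as np
from scipy.optimize import least_squares

def forward_breakthrough(site_density, pore_volumes):
    """Placeholder forward model standing in for a PHREEQC column simulation:
    a higher sorption-site density retards the breakthrough front."""
    retardation = 1.0 + site_density / 50.0          # hypothetical scaling
    return 1.0 / (1.0 + np.exp(-4.0 * (pore_volumes - retardation)))

# Synthetic "observed" breakthrough generated with a known density of 150 umol/kg.
pv = np.linspace(0.0, 10.0, 40)
rng = np.random.default_rng(3)
observed = forward_breakthrough(150.0, pv) + 0.01 * rng.standard_normal(pv.size)

fit = least_squares(lambda p: forward_breakthrough(p[0], pv) - observed,
                    x0=[50.0], bounds=([0.0], [1000.0]))
print(f"fitted site density ~ {fit.x[0]:.0f} umol-sites/kg-soil")
```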

  8. Uncertainty and variability in laboratory derived sorption parameters of sediments from a uranium in situ recovery site.

    PubMed

    Dangelmayr, Martin A; Reimus, Paul W; Johnson, Raymond H; Clay, James T; Stone, James J

    2018-06-01

    This research assesses the ability of a GC SCM to simulate uranium transport under variable geochemical conditions typically encountered at uranium in-situ recovery (ISR) sites. Sediment was taken from a monitoring well at the Smith-Ranch Highland (SRH) site at depths 192 and 193 m below ground and characterized by XRD, XRF, TOC, and BET. Duplicate column studies on the different sediment depths were flushed with synthesized restoration waters at two different alkalinities (160 mg/l CaCO3 and 360 mg/l CaCO3) to study the effect of alkalinity on uranium mobility. Uranium breakthrough occurred 25%-30% earlier in columns with 360 mg/l CaCO3 than in columns fed with 160 mg/l CaCO3 influent water. A parameter estimation program (PEST) was coupled to PHREEQC to derive site densities from experimental data. Significant parameter fittings were produced for all models, demonstrating that the GC SCM approach can model the impact of carbonate on uranium in flow systems. Derived site densities for the two sediment depths were between 141 and 178 μmol-sites/kg-soil, demonstrating similar sorption capacities despite heterogeneity in sediment mineralogy. Model sensitivity to alkalinity and pH was shown to be moderate compared to fitted site densities, when calcite saturation was allowed to equilibrate. Calcite kinetics emerged as a potential source of error when fitting parameters in flow conditions. Fitted results were compared to data from previous batch and column studies completed on sediments from the SRH site, to assess variability in derived parameters. Parameters from batch experiments were lower by a factor of 1.1 to 3.4 compared to column studies completed on the same sediments. The difference was attributed to errors in solid-solution ratios and the impact of calcite dissolution in batch experiments. Column studies conducted at two different laboratories showed almost an order of magnitude difference in fitted site densities, suggesting that experimental methodology may play a bigger role in column sorption behavior than actual sediment heterogeneity. Our results demonstrate the necessity for ISR sites to remove residual pCO2 and equilibrate restoration water with background geochemistry to reduce uranium mobility. In addition, the observed variability between fitted parameters on the same sediments highlights the need to provide standardized guidelines and methodology for regulators and industry when the GC SCM approach is used for ISR risk assessments. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Comparison of numerical weather prediction based deterministic and probabilistic wind resource assessment methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Draxl, Caroline; Hopson, Thomas

    Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on the MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden, while providing accurate deterministic estimates and reliable probabilistic assessments.
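
    An analog ensemble turns a single coarse-model prediction into a probabilistic estimate by collecting the observations that accompanied the most similar past predictions. The record does not specify the similarity metric or predictor variables used, so the following minimal sketch uses a plain Euclidean distance on synthetic features; it shows the pattern, not the study's configuration.

```python
import numpy as np

def analog_ensemble(train_pred, train_obs, target_pred, k=20):
    """For each target coarse-model prediction, return the observations paired
    with its k closest historical predictions (Euclidean distance)."""
    members = []
    for x in target_pred:
        dist = np.linalg.norm(train_pred - x, axis=1)
        members.append(train_obs[np.argsort(dist)[:k]])
    return np.array(members)

# Synthetic example: three coarse-model features per time step and an "observed"
# hub-height wind speed that depends mostly on the first feature.
rng = np.random.default_rng(7)
train_pred = rng.normal(size=(5000, 3))
train_obs = 1.2 * train_pred[:, 0] + 0.3 * rng.standard_normal(5000)
target_pred = rng.normal(size=(5, 3))

ens = analog_ensemble(train_pred, train_obs, target_pred)
print("deterministic estimate (ensemble mean):", ens.mean(axis=1).round(2))
print("10-90% spread for first target:", np.percentile(ens[0], [10, 90]).round(2))
```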

  10. A high-throughput virus-induced gene silencing protocol identifies genes involved in multi-stress tolerance

    PubMed Central

    2013-01-01

    Background Understanding the function of a particular gene under various stresses is important for engineering plants for broad-spectrum stress tolerance. Although virus-induced gene silencing (VIGS) has been used to characterize genes involved in abiotic stress tolerance, currently available gene silencing and stress imposition methodology at the whole plant level is not suitable for high-throughput functional analyses of genes. This demands a robust and reliable methodology for characterizing genes involved in abiotic and multi-stress tolerance. Results Our methodology employs VIGS-based gene silencing in leaf disks combined with simple stress imposition and effect quantification methodologies for easy and faster characterization of genes involved in abiotic and multi-stress tolerance. By subjecting leaf disks from gene-silenced plants to various abiotic stresses and inoculating silenced plants with various pathogens, we show the involvement of several genes for multi-stress tolerance. In addition, we demonstrate that VIGS can be used to characterize genes involved in thermotolerance. Our results also showed the functional relevance of NtEDS1 in abiotic stress, NbRBX1 and NbCTR1 in oxidative stress; NtRAR1 and NtNPR1 in salinity stress; NbSOS1 and NbHSP101 in biotic stress; and NtEDS1, NbETR1, NbWRKY2 and NbMYC2 in thermotolerance. Conclusions In addition to widening the application of VIGS, we developed a robust, easy and high-throughput methodology for functional characterization of genes involved in multi-stress tolerance. PMID:24289810

  11. A Comparative Approach for Ranking Contaminated Sites Based on the Risk Assessment Paradigm Using Fuzzy PROMETHEE

    NASA Astrophysics Data System (ADS)

    Zhang, Kejiang; Kluck, Cheryl; Achari, Gopal

    2009-11-01

    A ranking system for contaminated sites based on comparative risk methodology using fuzzy Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE) was developed in this article. It combines the concepts of fuzzy sets to represent uncertain site information with the PROMETHEE, a subgroup of Multi-Criteria Decision Making (MCDM) methods. Criteria are identified based on a combination of the attributes (toxicity, exposure, and receptors) associated with the potential human health and ecological risks posed by contaminated sites, chemical properties, site geology and hydrogeology and contaminant transport phenomena. Original site data are directly used avoiding the subjective assignment of scores to site attributes. When the input data are numeric and crisp the PROMETHEE method can be used. The Fuzzy PROMETHEE method is preferred when substantial uncertainties and subjectivities exist in site information. The PROMETHEE and fuzzy PROMETHEE methods are both used in this research to compare the sites. The case study shows that this methodology provides reasonable results.
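
    For readers unfamiliar with PROMETHEE, the sketch below is a minimal crisp PROMETHEE II ranking (linear preference function on range-normalized differences) applied to an invented matrix of site scores; the fuzzy extension described in the record replaces crisp scores and weights with fuzzy numbers and is not reproduced here.

```python
import numpy as np

def promethee_ii(scores, weights, maximize):
    """Crisp PROMETHEE II: returns net outranking flows (higher = ranked higher)."""
    n = scores.shape[0]
    crit_range = scores.max(axis=0) - scores.min(axis=0)
    crit_range[crit_range == 0] = 1.0
    sign = np.where(maximize, 1.0, -1.0)
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = sign * (scores[a] - scores[b]) / crit_range
            pref = np.clip(d, 0.0, 1.0)              # linear preference in [0, 1]
            pi_ab = float(np.dot(weights, pref))
            phi[a] += pi_ab / (n - 1)                # contributes to phi+ of a
            phi[b] -= pi_ab / (n - 1)                # and to phi- of b
    return phi                                       # net flow = phi+ - phi-

# Toy example: three contaminated sites scored on toxicity, exposure and
# receptor density (all treated as "higher means higher priority").
scores = np.array([[8.0, 5.0, 120.0],
                   [6.0, 7.0,  40.0],
                   [9.0, 2.0,  80.0]])
weights = np.array([0.5, 0.3, 0.2])
print("net flows:", promethee_ii(scores, weights, np.array([True, True, True])).round(3))
```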

  12. A comparative approach for ranking contaminated sites based on the risk assessment paradigm using fuzzy PROMETHEE.

    PubMed

    Zhang, Kejiang; Kluck, Cheryl; Achari, Gopal

    2009-11-01

    A ranking system for contaminated sites based on comparative risk methodology using fuzzy Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE) was developed in this article. It combines the concepts of fuzzy sets to represent uncertain site information with the PROMETHEE, a subgroup of Multi-Criteria Decision Making (MCDM) methods. Criteria are identified based on a combination of the attributes (toxicity, exposure, and receptors) associated with the potential human health and ecological risks posed by contaminated sites, chemical properties, site geology and hydrogeology and contaminant transport phenomena. Original site data are directly used avoiding the subjective assignment of scores to site attributes. When the input data are numeric and crisp the PROMETHEE method can be used. The Fuzzy PROMETHEE method is preferred when substantial uncertainties and subjectivities exist in site information. The PROMETHEE and fuzzy PROMETHEE methods are both used in this research to compare the sites. The case study shows that this methodology provides reasonable results.

  13. 1-Propanol probing methodology: two-dimensional characterization of the effect of solute on H2O.

    PubMed

    Koga, Yoshikata

    2013-09-21

    The wording "hydrophobicity/hydrophilicity" has been used in a loose manner based on human experiences. We have devised a more quantitative way to redefine "hydrophobes" and "hydrophiles" in terms of the mole fraction dependence pattern of one of the third derivative quantities, the enthalpic interaction between solute molecules. We then devised a thermodynamic methodology to characterize the effect of a solute on H2O in terms of its hydrophobicity and/or hydrophilicity. We use a thermodynamic signature, the enthalpic interaction of 1-propanol, H, to monitor how the test solute modifies H2O. By this method, characterization is facilitated by two indices; one pertaining to its hydrophobicity and the other its hydrophilicity. Hence differences among amphiphiles are quantified in a two-dimensional manner. Furthermore, an individual ion can be characterized independent of a counter ion. By using this methodology, we have studied the effects on H2O of a number of solutes, and gained some important new insights. For example, such commonly used examples of hydrophobes in the literature as tetramethyl urea, trimethylamine-N-oxide, and tetramethylammonium salts are in fact surprisingly hydrophilic. Hence the conclusions about "hydrophobes" using these samples ought to be interpreted with caution. The effects of anions on H2O found by this methodology are in the same sequence of the Hofmeister ranking, which will no doubt aid a further investigation into this enigma in biochemistry. Thus, it is likely that this methodology could play an important role in the characterization of the effects of solutes in H2O, and a perspective view may be useful. Here, we describe the basis on which the methodology is developed and the methodology itself in m.ore detail than given in individual papers. We then summarize the results in two dimensional hydrophobicity/hydrophilicity maps.

  14. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to describe such interfaces unambiguously.

  15. Satellite characterization of four interesting sites for astronomical instrumentation

    NASA Astrophysics Data System (ADS)

    Cavazzani, S.; Zitelli, V.

    2013-03-01

    In this paper we have evaluated the amount of available telescope time at four interesting sites for astronomical instrumentation. We use the GOES 12 data for years 2008 and 2009. We use a homogeneous methodology presented in several previous papers to classify the nights as clear (completely cloud-free), mixed (partially cloud-covered) or covered. Additionally, for the clear nights we have evaluated the number of satellite-stable nights, corresponding to the number of ground-based photometric nights, and the clear nights, corresponding to the spectroscopic nights. We have applied this model to two sites in the Northern Hemisphere (San Pedro Martir (SPM), Mexico and Izaña, Canary Islands) and to two sites in the Southern Hemisphere (El Leoncito, Argentina and San Antonio de Los Cobres (SAC), Argentina). We have obtained, for the two years considered, mean percentages of cloud-free nights of 68.6 per cent at Izaña, 76.0 per cent at SPM, 70.6 per cent at Leoncito and 70.0 per cent at SAC. We have evaluated, amongst the cloud-free nights, a proportion of stable nights of 62.6 per cent at Izaña, 69.6 per cent at SPM, 64.9 per cent at Leoncito and 59.7 per cent at SAC.
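
    The classification above reduces to counting, for each night, how many satellite samples are cloud-affected, and then applying a stability test to the cloud-free nights. The sketch below is a minimal illustration on synthetic data; the boolean cloud flags and the brightness-temperature variability threshold used to mark "satellite-stable" nights are assumptions standing in for the GOES-based criteria of the record.

```python
import numpy as np

def classify_nights(cloud_flags, bt_std, stable_thresh=0.5):
    """cloud_flags: (nights, samples) boolean array, True = cloudy sample.
    bt_std: per-night variability of a brightness-temperature proxy (K);
    thresholding it to flag 'satellite-stable' nights is an assumption."""
    cloudy_fraction = cloud_flags.mean(axis=1)
    clear = cloudy_fraction == 0.0
    covered = cloudy_fraction == 1.0
    mixed = ~clear & ~covered
    stable = clear & (bt_std < stable_thresh)
    return clear, mixed, covered, stable

rng = np.random.default_rng(5)
flags = rng.random((365, 48)) < rng.uniform(0.0, 1.0, (365, 1))   # synthetic year
bt_std = rng.gamma(2.0, 0.3, 365)
clear, mixed, covered, stable = classify_nights(flags, bt_std)
print(f"clear {clear.mean():.1%}, mixed {mixed.mean():.1%}, covered {covered.mean():.1%}, "
      f"stable among clear {stable.sum() / max(clear.sum(), 1):.1%}")
```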

  16. ALS Multicenter Cohort Study of Oxidative Stress (ALS COSMOS): study methodology, recruitment, and baseline demographic and disease characteristics.

    PubMed

    Mitsumoto, Hiroshi; Factor-Litvak, Pam; Andrews, Howard; Goetz, Raymond R; Andrews, Leslie; Rabkin, Judith G; McElhiney, Martin; Nieves, Jeri; Santella, Regina M; Murphy, Jennifer; Hupf, Jonathan; Singleton, Jess; Merle, David; Kilty, Mary; Heitzman, Daragh; Bedlack, Richard S; Miller, Robert G; Katz, Jonathan S; Forshew, Dallas; Barohn, Richard J; Sorenson, Eric J; Oskarsson, Bjorn; Fernandes Filho, J Americo M; Kasarskis, Edward J; Lomen-Hoerth, Catherine; Mozaffar, Tahseen; Rollins, Yvonne D; Nations, Sharon P; Swenson, Andrea J; Shefner, Jeremy M; Andrews, Jinsy A; Koczon-Jaremko, Boguslawa A

    2014-06-01

    In a multicenter study of newly diagnosed ALS patients without a reported family history of ALS, we are prospectively investigating whether markers of oxidative stress (OS) are associated with disease progression. Methods utilize an extensive structured telephone interview ascertaining environmental, lifestyle, dietary and psychological risk factors associated with OS. Detailed assessments were performed at baseline and at 3-6 month intervals during the ensuing 30 months. Our biorepository includes DNA, plasma, urine, and skin. Three hundred and fifty-five patients were recruited. Subjects were enrolled over a 36-month period at 16 sites. To meet the target number of subjects, the recruitment period was prolonged and additional sites were included. Results showed that demographic and disease characteristics were similar between 477 eligible/non-enrolled and enrolled patients, the only difference being type of health insurance among enrolled patients. Sites were divided into three groups by the number of enrolled subjects. Comparing these three groups, the Columbia site had fewer 'definite ALS' diagnoses. This is the first prospective, interdisciplinary, in-depth, multicenter epidemiological investigation of OS related to ALS progression and has been accomplished by an aggressive recruitment process. The baseline demographic and disease features of the study sample are now fully characterized.

  17. Defect Structure of Beta NiAl Using the BFS Method for Alloys

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Amador, Carlos; Ferrante, John; Noebe, Ronald D.

    1996-01-01

    The semiempirical BFS method for alloys is generalized by replacing experimental input with first-principles results thus allowing for the study of complex systems. In order to examine trends and behavior of a system in the vicinity of a given point of the phase diagram a search procedure based on a sampling of selected configurations is employed. This new approach is applied to the study of the beta phase of the Ni-Al system, which exists over a range of composition from 45-60 at.% Ni. This methodology results in a straightforward and economical way of reproducing and understanding the basic features of this system. At the stoichiometric composition, NiAl should exist in a perfectly ordered B2 structure. Ni-rich alloys are characterized by antisite point defects (with Ni atoms in the Al sites) with a decrease in lattice parameters. On the Al-rich side of stoichiometry there is a steep decrease in lattice parameter and density with increasing Al content. The presence of vacancies in Ni sites would explain such behavior. Recent X-ray diffraction experiments suggest a richer structure: the evidence, while strongly favoring the presence of vacancies in Ni sites, also suggests the possibility of some vacancies in Al sites in a 3:1 ratio. Moreover, local ordering of vacant sites may be preferred over a random distribution of individual point defects.

  18. Limestone percussion tools from the late Early Pleistocene sites of Barranco León and Fuente Nueva 3 (Orce, Spain).

    PubMed

    Barsky, Deborah; Vergès, Josep-María; Sala, Robert; Menéndez, Leticia; Toro-Moyano, Isidro

    2015-11-19

    In recent years, there has been growing interest in the study of percussion scars and breakage patterns on hammerstones, cores and tools from Oldowan African and Eurasian lithic assemblages. Oldowan stone toolkits generally contain abundant small-sized flakes and their corresponding cores, and are characterized by their structural dichotomy of heavy- and light-duty tools. This paper explores the significance of the lesser known heavy-duty tool component, providing data from the late Lower Pleistocene sites of Barranco León and Fuente Nueva 3 (Orce, Spain), dated 1.4-1.2 Myr. Using quantitative and qualitative data from the large-sized limestone industries from these two major sites, we present a new methodology highlighting their morpho-technological features. In the light of the results, we discuss the shortfalls of extant classificatory methods for interpreting the role of percussive technology in early toolkits. This work is rooted in an experimental program designed to reproduce the wide range of percussion marks observed on the limestone artefacts from these two sites. A visual and descriptive reference is provided as an interpretative aid for future comparative research. Further experiments using a variety of materials and gestures are still needed before the elusive traces yield the secrets of the kinds of percussive activities carried out by hominins at these, and other, Oldowan sites. © 2015 The Author(s).

  19. Limestone percussion tools from the late Early Pleistocene sites of Barranco León and Fuente Nueva 3 (Orce, Spain)

    PubMed Central

    Barsky, Deborah; Vergès, Josep-María; Sala, Robert; Menéndez, Leticia; Toro-Moyano, Isidro

    2015-01-01

    In recent years, there has been growing interest in the study of percussion scars and breakage patterns on hammerstones, cores and tools from Oldowan African and Eurasian lithic assemblages. Oldowan stone toolkits generally contain abundant small-sized flakes and their corresponding cores, and are characterized by their structural dichotomy of heavy- and light-duty tools. This paper explores the significance of the lesser known heavy-duty tool component, providing data from the late Lower Pleistocene sites of Barranco León and Fuente Nueva 3 (Orce, Spain), dated 1.4–1.2 Myr. Using quantitative and qualitative data from the large-sized limestone industries from these two major sites, we present a new methodology highlighting their morpho-technological features. In the light of the results, we discuss the shortfalls of extant classificatory methods for interpreting the role of percussive technology in early toolkits. This work is rooted in an experimental program designed to reproduce the wide range of percussion marks observed on the limestone artefacts from these two sites. A visual and descriptive reference is provided as an interpretative aid for future comparative research. Further experiments using a variety of materials and gestures are still needed before the elusive traces yield the secrets of the kinds of percussive activities carried out by hominins at these, and other, Oldowan sites. PMID:26483530

  20. SMARTe Site Characterization Tool. In: SMARTe2011, EPA/600/C-10/007

    EPA Science Inventory

    The purpose of the Site Characterization Tool is to: (1) develop a sample design for collecting site characterization data and (2) perform data analysis on uploaded data. The sample design part helps to determine how many samples should be collected to characterize a site with ...

  1. Assessing the Effects of Land-use Change on Plant Traits, Communities and Ecosystem Functioning in Grasslands: A Standardized Methodology and Lessons from an Application to 11 European Sites

    PubMed Central

    Garnier, Eric; Lavorel, Sandra; Ansquer, Pauline; Castro, Helena; Cruz, Pablo; Dolezal, Jiri; Eriksson, Ove; Fortunel, Claire; Freitas, Helena; Golodets, Carly; Grigulis, Karl; Jouany, Claire; Kazakou, Elena; Kigel, Jaime; Kleyer, Michael; Lehsten, Veiko; Lepš, Jan; Meier, Tonia; Pakeman, Robin; Papadimitriou, Maria; Papanastasis, Vasilios P.; Quested, Helen; Quétier, Fabien; Robson, Matt; Roumet, Catherine; Rusch, Graciela; Skarpe, Christina; Sternberg, Marcelo; Theau, Jean-Pierre; Thébault, Aurélie; Vile, Denis; Zarovali, Maria P.

    2007-01-01

    Background and Aims A standardized methodology to assess the impacts of land-use changes on vegetation and ecosystem functioning is presented. It assumes that species traits are central to these impacts, and is designed to be applicable in different historical, climatic contexts and local settings. Preliminary results are presented to show its applicability. Methods Eleven sites, representative of various types of land-use changes occurring in marginal agro-ecosystems across Europe and Israel, were selected. Climatic data were obtained at the site level; soil data, disturbance and nutrition indices were described at the plot level within sites. Sixteen traits describing plant stature, leaf characteristics and reproductive phase were recorded on the most abundant species of each treatment. These data were combined with species abundance to calculate trait values weighted by the abundance of species in the communities. The ecosystem properties selected were components of above-ground net primary productivity and decomposition of litter. Key Results The wide variety of land-use systems that characterize marginal landscapes across Europe was reflected by the different disturbance indices, and also in soil and/or nutrient availability gradients. The trait toolkit allowed us to describe adequately the functional response of vegetation to land-use changes, but we suggest that some traits (vegetative plant height, stem dry matter content) should be omitted in studies involving mainly herbaceous species. Using the example of the relationship between leaf dry matter content and above-ground dead material, we demonstrate how the data collected may be used to analyse direct effects of climate and land use on ecosystem properties vs. indirect effects via changes in plant traits. Conclusions This work shows the applicability of a set of protocols that can be widely applied to assess the impacts of global change drivers on species, communities and ecosystems. PMID:17085470
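
    The step that combines species abundances with trait values is an abundance-weighted (community-aggregated) mean. As a minimal sketch of that single calculation, with invented cover and leaf dry matter content values:

```python
import numpy as np

def community_weighted_mean(abundance, trait):
    """Abundance-weighted mean of a trait across the species in a plot;
    species with missing trait values are dropped from the weighting."""
    abundance = np.asarray(abundance, dtype=float)
    trait = np.asarray(trait, dtype=float)
    ok = ~np.isnan(trait)
    weights = abundance[ok] / abundance[ok].sum()
    return float(np.dot(weights, trait[ok]))

# Toy plot: relative cover of four species and their leaf dry matter content (mg/g).
cover = [45, 30, 15, 10]
ldmc = [210.0, 180.0, np.nan, 260.0]
print(f"community-weighted LDMC ~ {community_weighted_mean(cover, ldmc):.0f} mg/g")
```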

  2. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2018-07-01

    A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity and isotropy, among others. This classification and integration of several properties makes it possible to carry out a characterization process to systemically evaluate particle packings in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and the macroscales. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted to be used in the evaluation of particle systems. The methodology makes it easy to characterize media at the level of the microscale (continuous geometries, e.g. steels and rock microstructures, and discrete geometries) and the macroscale. A global, systemic and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system allows researchers to have a unique evaluation criterion based on the aim of their research. Examples of applications are shown.
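
    One of the space-occupation properties such a toolkit needs is the packing (solid volume) fraction. As a minimal, self-contained illustration, the sketch below computes it for spheres assumed to lie fully inside a rectangular box, ignoring overlaps and boundary clipping; this is an approximation for illustration, not one of the paper's properties as implemented.

```python
import numpy as np

def packing_fraction(radii, box_lengths):
    """Solid volume fraction of a set of spheres contained in a rectangular box.
    Overlaps and boundary clipping are ignored (approximation)."""
    radii = np.asarray(radii, dtype=float)
    solid_volume = (4.0 / 3.0) * np.pi * np.sum(radii ** 3)
    return solid_volume / np.prod(box_lengths)

rng = np.random.default_rng(11)
radii = rng.uniform(0.8, 1.2, 2000)          # polydisperse particle radii (mm)
box = (40.0, 40.0, 40.0)                     # box edge lengths (mm)
phi = packing_fraction(radii, box)
print(f"packing fraction ~ {phi:.2f}, porosity ~ {1 - phi:.2f}")
```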

  3. Systemic characterization and evaluation of particle packings as initial sets for discrete element simulations

    NASA Astrophysics Data System (ADS)

    Morfa, Carlos Recarey; Cortés, Lucía Argüelles; Farias, Márcio Muniz de; Morales, Irvin Pablo Pérez; Valera, Roberto Roselló; Oñate, Eugenio

    2017-10-01

    A methodology that comprises several characterization properties for particle packings is proposed in this paper. The methodology takes into account factors such as the dimension and shape of particles, space occupation, homogeneity, connectivity and isotropy, among others. This classification and integration of several properties makes it possible to carry out a characterization process to systemically evaluate particle packings in order to guarantee the quality of the initial meshes in discrete element simulations, at both the micro- and the macroscales. Several new properties were created, and improvements to existing ones are presented. Properties from other disciplines were adapted to be used in the evaluation of particle systems. The methodology makes it easy to characterize media at the level of the microscale (continuous geometries, e.g. steels and rock microstructures, and discrete geometries) and the macroscale. A global, systemic and integral system for characterizing and evaluating particle sets, based on fuzzy logic, is presented. Such a system allows researchers to have a unique evaluation criterion based on the aim of their research. Examples of applications are shown.

  4. Installation Restoration Program Records Search for Kingsley Field, Oregon.

    DTIC Science & Technology

    1982-06-01

    The Hazardous Assessment Rating Methodology (HARM) is now used for all Air Force IRP studies. To maintain consistency, AFESC had their on-call contractors review... Installation History; D. Industrial Facilities; E. POL Storage Tanks; F. Abandoned Tanks; G. Oil/Water Separators; H. Site Hazard Rating Methodology; I. Site... and implementing regulations. The purpose of DOD policy is to control the migration of hazardous material contaminants from DOD installations.

  5. Methodologies for Crawler Based Web Surveys.

    ERIC Educational Resources Information Center

    Thelwall, Mike

    2002-01-01

    Describes Web survey methodologies used to study the content of the Web, and discusses search engines and the concept of crawling the Web. Highlights include Web page selection methodologies; obstacles to reliable automatic indexing of Web sites; publicly indexable pages; crawling parameters; and tests for file duplication. (Contains 62…

  6. Specific surface to evaluate the efficiencies of milling and pretreatment of wood for enzymatic saccharification

    Treesearch

    Junyong Zhu; G.S. Wang; X.J. Pan; Roland Gleisner

    2009-01-01

    Sieving methods have been almost exclusively used for feedstock size-reduction characterization in the biomass refining literature. This study demonstrates a methodology to properly characterize the specific surface of biomass substrates through two-dimensional measurement of each fiber of the substrate using a wet imaging technique. The methodology provides more...

  7. Study site characterization. Chapter 2

    Treesearch

    Chris Potter; Richard Birdsey

    2008-01-01

    This chapter is an overview of the main site characterization requirements at landscape-scale sampling locations. The overview is organized according to multiple "Site Attribute" headings that require descriptions throughout a given study site area, leading ultimately to a sufficient overall site characterization. Guidance is provided to describe the major...

  8. Development and Validation of a Collocated Exposure Monitoring Methodology using Portable Air Monitors

    NASA Astrophysics Data System (ADS)

    Li, Z.; Che, W.; Frey, H. C.; Lau, A. K. H.

    2016-12-01

    Portable air monitors are currently being developed and used to enable a move towards exposure monitoring as opposed to fixed-site monitoring. Reliable methods are needed for capturing spatial and temporal variability in exposure concentration to obtain credible data from which to develop efficient exposure mitigation measures. However, there are few studies that quantify the validity and repeatability of the collected data. The objective of this study is to present and evaluate a collocated exposure monitoring (CEM) methodology including the calibration of portable air monitors against stationary reference equipment, side-by-side comparison of portable air monitors, personal or microenvironmental exposure monitoring, and the processing and interpretation of the collected data. The CEM methodology was evaluated based on application to the portable monitors TSI DustTrak II Aerosol Monitor 8530 for fine particulate matter (PM2.5) and TSI Q-Trak model 7575 with probe model 982 for CO, CO2, temperature and relative humidity. Using a school sampling campaign in Hong Kong in January and June 2015 as an example, the calibrated side-by-side 1 Hz PM2.5 measurements showed good consistency between the two sets of portable air monitors. Confidence in the side-by-side comparison, in which PM2.5 concentrations agreed within 2 percent most of the time, enabled robust inference regarding differences when the monitors measured classroom and pedestrian microenvironments during school hours. The proposed CEM methodology can be widely applied in sampling campaigns with the objective of simultaneously characterizing pollutant concentrations in two or more locations or microenvironments. The further application of the CEM methodology to transportation exposure will be presented and discussed.
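
    Two of the steps named above, calibration against a stationary reference and the side-by-side agreement check, are straightforward to express numerically. The sketch below is a minimal illustration on synthetic 1 Hz data: a least-squares linear calibration per unit, followed by the fraction of paired samples agreeing within a relative tolerance; the 2 percent tolerance mirrors the record, while the distributions and drift terms are invented.

```python
import numpy as np

def calibrate(portable, reference):
    """Least-squares linear calibration: corrected = slope * raw + intercept."""
    slope, intercept = np.polyfit(portable, reference, 1)
    return lambda raw: slope * np.asarray(raw) + intercept

def fraction_within(a, b, tol=0.02):
    """Fraction of paired samples whose relative difference is within tol."""
    rel = np.abs(a - b) / np.maximum((a + b) / 2.0, 1e-9)
    return float((rel <= tol).mean())

rng = np.random.default_rng(9)
ref = rng.gamma(4.0, 8.0, 3600)                        # reference PM2.5 (ug/m3), 1 Hz
unit_a = 1.15 * ref + 2.0 + rng.normal(0, 0.3, 3600)   # two biased portable units
unit_b = 1.18 * ref + 1.5 + rng.normal(0, 0.3, 3600)

cal_a, cal_b = calibrate(unit_a, ref), calibrate(unit_b, ref)
print(f"side-by-side agreement within 2%: {fraction_within(cal_a(unit_a), cal_b(unit_b)):.1%}")
```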

  9. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper shows a method to assess the vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping follows multi-parametric causative factors such as sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The result shows that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high vulnerability areas are situated in the east at Monika and Sablette beaches. This technical approach relies on the Geographic Information System tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.

  10. Harmonized clinical trial methodologies for localized cutaneous leishmaniasis and potential for extensive network with capacities for clinical evaluation

    PubMed Central

    Grogl, Max; Boni, Marina; Carvalho, Edgar M.; Chebli, Houda; Cisse, Mamoudou; Diro, Ermias; Fernandes Cota, Gláucia; Erber, Astrid C.; Gadisa, Endalamaw; Handjani, Farhad; Khamesipour, Ali; Llanos-Cuentas, Alejandro; López Carvajal, Liliana; Grout, Lise; Lmimouni, Badre Eddine; Mokni, Mourad; Nahzat, Mohammad Sami; Ben Salah, Afif; Ozbel, Yusuf; Pascale, Juan Miguel; Rizzo Molina, Nidia; Rode, Joelle; Romero, Gustavo; Ruiz-Postigo, José Antonio; Gore Saravia, Nancy; Soto, Jaime; Uzun, Soner; Mashayekhi, Vahid; Vélez, Ivan Dario; Vogt, Florian; Zerpa, Olga; Arana, Byron

    2018-01-01

    Introduction Progress with the treatment of cutaneous leishmaniasis (CL) has been hampered by inconsistent methodologies used to assess treatment effects. A sizable number of trials conducted over the years has generated only weak evidence backing current treatment recommendations, as shown by systematic reviews on old-world and new-world CL (OWCL and NWCL). Materials and methods Using a previously published guidance paper on CL treatment trial methodology as the reference, consensus was sought on key parameters including core eligibility and outcome measures, among OWCL (7 countries, 10 trial sites) and NWCL (7 countries, 11 trial sites) during two separate meetings. Results Findings and level of consensus within and between OWCL and NWCL sites are presented and discussed. In addition, CL trial site characteristics and capacities are summarized. Conclusions The consensus reached allows standardization of future clinical research across OWCL and NWCL sites. We encourage CL researchers to adopt and adapt as required the proposed parameters and outcomes in their future trials and provide feedback on their experience. The expertise afforded between the two sets of clinical sites provides the basis for a powerful consortium with potential for extensive, standardized assessment of interventions for CL and faster approval of candidate treatments. PMID:29329311

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hazen, Terry

    The US Department of Energy and the Institute for Ecology of Industrial Areas (IETU), Katowice, Poland have been cooperating in the development and implementation of innovative environmental remediation technologies since 1995. A major focus of this program has been the demonstration of bioremediation techniques to clean up the soil and sediment associated with a waste lagoon at the Czechowice Oil Refinery (CZOR) in southern Poland. After an expedited site characterization (ESC), treatability study, and risk assessment study, a remediation system was designed that took advantage of local materials to minimize cost and maximize treatment efficiency. U.S. experts worked in tandem with counterparts from the IETU and CZOR throughout this project to characterize, assess and, subsequently, design, implement and monitor a bioremediation system. The CZOR, our industrial partner for this project, was chosen because of their foresight and commitment to the use of new approaches for environmental restoration. This program sets a precedent for Poland in which a portion of the funds necessary to complete the project were provided by the company responsible for the problem. The CZOR was named by PIOS (State Environmental Protection Inspectorate of Poland) as one of the 80 biggest polluters in Poland. The history of the CZOR dates back more than 100 years to its establishment by the Vacuum Oil Company (a U.S. company and forerunner of Standard Oil). More than a century of continuous use of a sulfuric acid-based oil refining method by the CZOR has produced an estimated 120,000 tons of acidic, highly weathered, petroleum sludge. This waste has been deposited into three open, unlined process waste lagoons, 3 meters deep, now covering 3.8 hectares. Initial analysis indicated that the sludge was composed mainly of high molecular weight paraffinic and polynuclear aromatic hydrocarbons (PAHs). The overall objective of this full-scale demonstration project was to characterize, assess and remediate one of these lagoons. The remediation tested and evaluated a combination of U.S. and Polish-developed biological remediation technologies. Specifically, the goal of the demonstration was to reduce the environmental risk from PAH compounds in soil and to provide a green zone (grassy area) adjacent to the site boundary. The site was characterized using the DOE-developed Expedited Site Characterization (ESC) methodology. Based on the results of the ESC, a risk assessment was conducted using established U.S. procedures. Based on the results of the ESC and risk assessment, a 0.3-hectare site, the smallest of the waste lagoons, was selected for a modified aerobic biopile demonstration. This Executive Summary and the supporting report and appendices document the activities and results of this cooperative venture.

  12. Probing the reactivity of nucleophile residues in human 2,3-diphosphoglycerate/deoxy-hemoglobin complex by aspecific chemical modifications.

    PubMed

    Scaloni, A; Ferranti, P; De Simone, G; Mamone, G; Sannolo, N; Malorni, A

    1999-06-11

    An aspecific methylation reaction, in combination with MS procedures, has been employed for the characterization of the nucleophilic residues present on the molecular surface of the human 2,3-diphosphoglycerate/deoxy-hemoglobin complex. In particular, direct molecular weight determination by ESMS made it possible to control the reaction conditions, limiting the number of methyl groups introduced into the modified globin chains. A combined LC-ESMS/Edman degradation approach for the analysis of the tryptic peptide mixtures yielded the exact identification of the methylation sites together with a quantitative estimation of their degree of modification. The reactivities observed were directly correlated with the pKa and the relative surface accessibility of the nucleophilic residues, calculated from the X-ray crystallographic structure of the protein. The results described here indicate that this methodology can be efficiently used in aspecific modification experiments directed at the molecular characterization of the surface topology of proteins and protein complexes.

  13. Food-service establishment wastewater characterization.

    PubMed

    Lesikar, B J; Garza, O A; Persyn, R A; Kenimer, A L; Anderson, M T

    2006-08-01

    Food-service establishments that use on-site wastewater treatment systems are experiencing pretreatment system and/or drain field hydraulic and/or organic overloading. This study characterized four wastewater parameters (five-day biochemical oxygen demand [BOD5]; total suspended solids [TSS]; food, oil, and grease [FOG]; and flow) from 28 restaurants located in Texas during June, July, and August 2002. The field sampling methodology included taking a grab sample from each restaurant for 6 consecutive days at approximately the same time each day, followed by a 2-week break, and then sampling again for another 6 consecutive days, for a total of 12 samples per restaurant and 336 total observations. The analysis indicates higher organic (BOD5) and hydraulic loading values for restaurants than those typically found in the literature. The design values from this study for BOD5, TSS, FOG, and flow were 1523, 664, and 197 mg/L, and 96 L/day per seat, respectively, which captured over 80% of the data collected.
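    A design value that "captures over 80% of the data" can be read as an upper-percentile statistic of the pooled grab-sample observations. The sketch below illustrates that reading only; the 80th-percentile choice and the sample values are assumptions, not the study's actual derivation.

```python
# Minimal sketch: a "design value" taken as the 80th percentile of pooled
# grab-sample data, so the value covers the stated fraction of observations.
# The sample values and the percentile choice are illustrative assumptions.
import pandas as pd

def design_value(samples: pd.Series, coverage: float = 0.80) -> float:
    """Return the value that at least `coverage` of the samples fall at or below."""
    return float(samples.quantile(coverage))

# Hypothetical BOD5 grab samples (mg/L) from several restaurants
bod5 = pd.Series([850, 1200, 1480, 960, 1700, 1320, 1550, 900, 1100, 1610])
print(f"BOD5 design value (80th percentile): {design_value(bod5):.0f} mg/L")
```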

  14. Screening of Marine Actinomycetes from Segara Anakan for Natural Pigment and Hydrolytic Activities

    NASA Astrophysics Data System (ADS)

    Asnani, A.; Ryandini, D.; Suwandri

    2016-02-01

    Marine actinomycetes have become a source of great interest in natural product chemistry due to their new chemical entities and bioactive metabolites. Since April 2010, we have screened actinobacteria from five sites that represent different ecosystems of the Segara Anakan lagoon. In the present study we focus on a specific isolate, K-2C, covering 1) actinomycete identification based on morphological observation and the 16S rRNA gene; 2) fermentation and isolation of pigment; 3) structure determination of the pigment; and 4) characterization of hydrolytic enzymes. Methodologies relevant to these studies were implemented accordingly. The results indicated that K-2C was likely Streptomyces fradiae strain RSU15, and that the best fermentation medium should contain starch and casein with 21 days of incubation. The isolate produces extracellular as well as intracellular pigments. The isolated pigment gave a purple color with λmax of 529.00 nm and was structurally characterized. Interestingly, Streptomyces K-2C was able to produce potentially useful hydrolytic enzymes such as amylase, cellulase, protease, lipase, urease, and nitrate reductase.

  15. Good use of fruit wastes: eco-friendly synthesis of silver nanoparticles, characterization, BSA protein binding studies.

    PubMed

    Sreekanth, T V M; Ravikumar, Sambandam; Lee, Yong Rok

    2016-06-01

    A simple and eco-friendly methodology for the green synthesis of silver nanoparticles (AgNPs) using a mango seed extract was evaluated. The AgNPs were characterized by ultraviolet-visible spectrophotometry, Fourier transform infrared spectroscopy, transmission electron microscopy, energy dispersive X-ray spectroscopy, and X-ray diffraction. The interaction between the green-synthesized AgNPs and bovine serum albumin (BSA) in an aqueous solution at physiological pH was examined by fluorescence spectroscopy. The results confirmed that the AgNPs quenched the fluorophore of BSA by forming a ground-state complex in aqueous solution. These fluorescence quenching data were also used to determine the binding sites and binding constants at different temperatures. The calculated thermodynamic parameters (ΔG°, ΔH° and ΔS°) suggest that the binding process occurs spontaneously through the involvement of electrostatic interactions. The synchronous fluorescence spectra showed a blue shift, indicating increasing hydrophobicity. Copyright © 2015 John Wiley & Sons, Ltd.

  16. A methodology for ecosystem-scale modeling of selenium

    USGS Publications Warehouse

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure scenarios based on site-specific knowledge. The model can also be used to facilitate site-specific regulation and to present generic comparisons to illustrate limitations imposed by ecosystem setting and inhabitants. Used optimally, the model provides a tool for framing a site-specific ecological problem or occurrence of Se exposure, quantifying exposure within that ecosystem, and narrowing uncertainties about how to protect it by understanding the specifics of the underlying system ecology, biogeochemistry, and hydrology. © 2010 SETAC.

  17. A methodology for ecosystem-scale modeling of selenium.

    PubMed

    Presser, Theresa S; Luoma, Samuel N

    2010-10-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure scenarios based on site-specific knowledge. The model can also be used to facilitate site-specific regulation and to present generic comparisons to illustrate limitations imposed by ecosystem setting and inhabitants. Used optimally, the model provides a tool for framing a site-specific ecological problem or occurrence of Se exposure, quantifying exposure within that ecosystem, and narrowing uncertainties about how to protect it by understanding the specifics of the underlying system ecology, biogeochemistry, and hydrology. © 2010 SETAC.
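    The translation from a fish-tissue guideline back to a dissolved concentration described above works through a chain of trophic transfer factors (TTFs) and a particulate/dissolved partitioning coefficient (Kd). The sketch below shows that back-calculation under assumed, purely illustrative parameter values; it is not a site-specific or regulatory computation.

```python
# Sketch of the tissue-to-dissolved translation: a fish-tissue Se guideline is
# back-calculated to a dissolved Se concentration through TTFs and Kd.
# Forward model: C_particulate = Kd * C_water, C_invert = TTF_invert * C_particulate,
# C_fish = TTF_fish * C_invert. All parameter values below are illustrative.

def dissolved_from_tissue(c_fish_ug_g: float, ttf_fish: float,
                          ttf_invert: float, kd_l_kg: float) -> float:
    """Translate a fish-tissue Se guideline (ug/g dw) to dissolved Se (ug/L)."""
    c_invert = c_fish_ug_g / ttf_fish        # ug/g dw in invertebrate prey
    c_part = c_invert / ttf_invert           # ug/g dw on particulate material
    return c_part * 1e3 / kd_l_kg            # ug/L dissolved (Kd in L/kg)

# Hypothetical food web: 8 ug/g dw tissue guideline, assumed TTFs and Kd
print(dissolved_from_tissue(c_fish_ug_g=8.0, ttf_fish=1.1,
                            ttf_invert=2.8, kd_l_kg=1000.0))
```

    Because Kd and the invertebrate TTF differ widely among ecosystems, the same tissue guideline maps to different dissolved concentrations in different settings, which is the model's central point.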

  18. Transportation technology and methodology reports

    DOT National Transportation Integrated Search

    1999-12-22

    This Internet site sponsored by the Office of Highway Policy Information provides links to a compilation of PDF reports on transportation technology and methodology. Reports include "FHWA Statistical Programs;" "Nonresponse in Household Travel Survey...

  19. Skyscape Archaeology: an emerging interdiscipline for archaeoastronomers and archaeologists

    NASA Astrophysics Data System (ADS)

    Henty, Liz

    2016-02-01

    For historical reasons archaeoastronomy and archaeology differ in their approach to prehistoric monuments, and this has created a divide between the disciplines, which adopt seemingly incompatible methodologies. The reasons behind the impasse will be explored to show how these different approaches gave rise to their respective methods. Archaeological investigations tend to concentrate on single-site analysis, whereas archaeoastronomical surveys tend to be data-driven, drawing on the examination of large sets of similar sites. A comparison will be made between traditional archaeoastronomical data gathering and an emerging methodology which looks at sites on a small scale and combines archaeology and astronomy. Silva's recent research in Portugal and this author's survey in Scotland have explored this methodology and termed it skyscape archaeology. This paper argues that this type of phenomenological skyscape archaeology offers an alternative to large-scale statistical studies which analyse astronomical data obtained from a large number of superficially similar archaeological sites.

  20. Data compilation, synthesis, and calculations used for organic-carbon storage and inventory estimates for mineral soils of the Mississippi River Basin

    USGS Publications Warehouse

    Buell, Gary R.; Markewich, Helaine W.

    2004-01-01

    U.S. Geological Survey investigations of environmental controls on carbon cycling in soils and sediments of the Mississippi River Basin (MRB), an area of 3.3 × 10⁶ square kilometers (km²), have produced an assessment tool for estimating the storage and inventory of soil organic carbon (SOC) by using soil-characterization data from Federal, State, academic, and literature sources. The methodology is based on the linkage of site-specific SOC data (pedon data) to the soil-association map units of the U.S. Department of Agriculture State Soil Geographic (STATSGO) and Soil Survey Geographic (SSURGO) digital soil databases in a geographic information system. The collective pedon database assembled from individual sources presently contains 7,321 pedon records representing 2,581 soil series. SOC storage, in kilograms per square meter (kg/m²), is calculated for each pedon at standard depth intervals from 0 to 10, 10 to 20, 20 to 50, and 50 to 100 centimeters. The site-specific storage estimates are then regionalized to produce national-scale (STATSGO) and county-scale (SSURGO) maps of SOC to a specified depth. Based on this methodology, the mean SOC storage for the top meter of mineral soil in the MRB is approximately 10 kg/m², and the total inventory is approximately 32.3 Pg (1 petagram = 10⁹ metric tons).
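    The per-pedon storage calculation behind such estimates is conventionally bulk density times organic-carbon fraction times interval thickness, reduced by the coarse-fragment fraction. The sketch below shows that generic calculation for one pedon; the horizon records and the coarse-fragment correction are assumptions for illustration, not the actual pedon database schema or the STATSGO/SSURGO linkage.

```python
# Generic per-pedon SOC storage over a depth interval:
# storage (kg/m^2) = bulk density * OC fraction * thickness * (1 - coarse fraction),
# summed over horizons clipped to the interval. Field names are assumed.

def soc_storage_kg_m2(horizons, top_cm=0.0, bottom_cm=100.0):
    """Sum SOC storage over horizon records clipped to [top_cm, bottom_cm]."""
    total = 0.0
    for h in horizons:
        top = max(h["top_cm"], top_cm)
        bot = min(h["bottom_cm"], bottom_cm)
        thickness_m = max(bot - top, 0.0) / 100.0
        bd_kg_m3 = h["bulk_density_g_cm3"] * 1000.0
        oc_frac = h["oc_percent"] / 100.0
        fine_frac = 1.0 - h["coarse_frag_volpct"] / 100.0
        total += bd_kg_m3 * oc_frac * thickness_m * fine_frac
    return total

pedon = [  # hypothetical horizon data
    {"top_cm": 0, "bottom_cm": 20, "bulk_density_g_cm3": 1.3,
     "oc_percent": 2.0, "coarse_frag_volpct": 5},
    {"top_cm": 20, "bottom_cm": 100, "bulk_density_g_cm3": 1.5,
     "oc_percent": 0.6, "coarse_frag_volpct": 10},
]
print(f"SOC 0-100 cm: {soc_storage_kg_m2(pedon):.1f} kg/m^2")  # ~11 kg/m^2
```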

  1. 10 CFR 960.3-2-3 - Recommendation of sites for characterization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Recommendation of sites for characterization. 960.3-2-3... POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-3 Recommendation of sites... President not less than three candidate sites for such characterization. The recommendation decision shall...

  2. 10 CFR 960.3-1-4-3 - Site recommendation for characterization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Site recommendation for characterization. 960.3-1-4-3... POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-1-4-3 Site recommendation for characterization. The evidence required to support the recommendation of a site as a candidate...

  3. Top-down approach from satellite to terrestrial rover application for environmental monitoring of landfills.

    PubMed

    Manzo, C; Mei, A; Zampetti, E; Bassani, C; Paciucci, L; Manetti, P

    2017-04-15

    This paper describes a methodology to perform chemical analyses in landfill areas by integrating multisource geomatic data. We used a top-down approach to identify Environmental Point of Interest (EPI) based on very high-resolution satellite data (Pleiades and WorldView 2) and on in situ thermal and photogrammetric surveys. Change detection techniques and geostatistical analysis supported the chemical survey, undertaken using an accumulation chamber and an RIIA, an unmanned ground vehicle developed by CNR IIA, equipped with a multiparameter sensor platform for environmental monitoring. Such an approach improves site characterization, identifying the key environmental points of interest where it is necessary to perform detailed chemical analyses. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Development of policies for Natura 2000 sites: a multi-criteria approach to support decision makers.

    PubMed

    Cortina, Carla; Boggia, Antonio

    2014-08-01

    The aim of this study is to present a methodology to support decision makers in the choice of Natura 2000 sites needing an appropriate management plan to ensure sustainable socio-economic development. In order to promote sustainable development in the Natura 2000 sites compatible with nature preservation, conservation measures or management plans are necessary. The main issue is to decide when conservation measures alone can be applied and when a site needs an appropriate management plan. We present a case study for the Italian region of Umbria. The methodology is based on a multi-criteria approach to identify a biodiversity index (BI) and on the development of a human activities index (HAI). By crossing the two indexes for each site on a Cartesian plane, four groups of sites were identified. Each group corresponds to a specific need for an appropriate management plan. Sites in the first group, with a high level of both biodiversity and human activities, have the most urgent need of an appropriate management plan to ensure sustainable development. The proposed methodology and analysis are replicable in other regions or countries by using the data available for each site in the Natura 2000 standard data form. A multi-criteria analysis is especially suitable for supporting decision makers when they deal with a multidimensional decision process. We found the multi-criteria approach particularly sound in this case, due to the concept of biodiversity itself, which is complex and multidimensional, and to the high number of alternatives (Natura 2000 sites) to be assessed. Copyright © 2014 Elsevier Ltd. All rights reserved.
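    The crossing of the two indexes on a Cartesian plane amounts to a quadrant classification of sites. The sketch below illustrates that step only; the median-split thresholds, index values, and group labels are assumptions, not the study's calibrated cut-offs.

```python
# Quadrant classification of sites by biodiversity index (BI) and human
# activities index (HAI). Thresholds (median split) and example values are
# illustrative assumptions.
from statistics import median

def classify_sites(sites):
    """sites: dict of site_name -> (BI, HAI); returns site_name -> group label."""
    bi_cut = median(bi for bi, _ in sites.values())
    hai_cut = median(hai for _, hai in sites.values())
    groups = {}
    for name, (bi, hai) in sites.items():
        if bi >= bi_cut and hai >= hai_cut:
            groups[name] = "group 1: high BI / high HAI - management plan most urgent"
        elif bi >= bi_cut:
            groups[name] = "group 2: high BI / low HAI"
        elif hai >= hai_cut:
            groups[name] = "group 3: low BI / high HAI"
        else:
            groups[name] = "group 4: low BI / low HAI - conservation measures may suffice"
    return groups

example = {"site_A": (0.8, 0.7), "site_B": (0.9, 0.2),
           "site_C": (0.3, 0.9), "site_D": (0.2, 0.1)}
for site, group in classify_sites(example).items():
    print(site, "->", group)
```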

  5. Turbidity monitoring equipment and methodology evaluation at MDOT construction sites.

    DOT National Transportation Integrated Search

    2014-12-01

    State Study 261 is a continuation of State Study 225, "Turbidity Monitoring at Select MDOT Construction Sites", which was successful in establishing baseline stream data at several active construction sites. State Study 261 focused on the equipme...

  6. Site selection for MSFC operational tests of solar heating and cooling systems

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The criteria, methodology, and sequence aspects of the site selection process are presented. This report organizes the logical thought process that should be applied to the site selection process, but final decisions are highly selective.

  7. Exploring Educational and Cultural Adaptation through Social Networking Sites

    ERIC Educational Resources Information Center

    Ryan, Sherry D.; Magro, Michael J.; Sharp, Jason H.

    2011-01-01

    Social networking sites have seen tremendous growth and are widely used around the world. Nevertheless, the use of social networking sites in educational contexts is an under explored area. This paper uses a qualitative methodology, autoethnography, to investigate how social networking sites, specifically Facebook[TM], can help first semester…

  8. A Patient Focused Solution for Enrolling Clinical Trials in Rare and Selective Cancer Indications: A Landscape of Haystacks and Needles

    PubMed Central

    Lynam, Eric B.; Leaw, Jiin; Wiener, Matthew B.

    2013-01-01

    Participation of adult cancer patients in US-based clinical trials has remained near 3% for decades. Traditional research methodology reaches a small fraction of the target population with a fixed number of predetermined sites. Solutions are needed to ethically increase patient participation and accelerate cancer trial completion. We compared enrollment outcomes of traditional and patient-focused research methodologies. A patient-prioritized method (Just-In-Time, JIT) was implemented in parallel with traditionally managed sites in three cancer trials. JIT research sites were initiated after candidate patients presented, while traditional sites were initiated in advance. JIT sites enrolled at mean rates no less than, and up to 2.75-fold greater than, those of traditional sites. The mean number of patients enrolled per site was comparable (JIT: 1.82, traditional: 1.78). There were fewer non-enrolling JIT sites (2/28, 7%) compared to traditional sites (19/52, 37%). This retrospective analysis supports JIT as a prospective solution to increase cancer clinical trial enrollment and the efficiency of clinical trial administrative activities. PMID:23990689

  9. Effects of Following National Committee for Clinical Laboratory Standards and Deutsche Industrie Norm-Medizinische Mikrobiologie Guidelines, Country of Isolate Origin, and Site of Infection on Susceptibility of Escherichia coli to Amoxicillin-Clavulanate (Augmentin)

    PubMed Central

    Simpson, I.; Durodie, J.; Knott, S.; Shea, B.; Wilson, J.; Machka, K.

    1998-01-01

    Amoxicillin-clavulanate (Augmentin), as a combination of two active agents, poses extra challenges over single agents in establishing clinically relevant breakpoints for in vitro susceptibility tests. Hence, reported differences in amoxicillin-clavulanate percent susceptibilities among Escherichia coli isolates may reflect localized resistance problems and/or methodological differences in susceptibility testing and breakpoint criteria. The objectives of the present study were to determine the effects of (i) methodology, e.g., those of the National Committee for Clinical Laboratory Standards (NCCLS) and the Deutsche Industrie Norm-Medizinische Mikrobiologie (DIN), (ii) country of origin (Spain, France, and Germany), and (iii) site of infection (urinary tract, intra-abdominal sepsis, or other site[s]) upon the incidence of susceptibility to amoxicillin-clavulanate in 185 clinical isolates of E. coli. Cefuroxime and cefotaxime were included for comparison. The use of NCCLS methodology resulted in a different distribution of amoxicillin-clavulanate MICs than that obtained with the DIN methodology, a difference highlighted by the 10% more strains found to be within the 8- to 32-μg/ml MIC range. This difference reflects the differing amounts of clavulanic acid present. NCCLS and DIN methodologies also produce different MIC distributions for cefotaxime but not for cefuroxime. Implementation of NCCLS and DIN breakpoints produced markedly different incidences of strains found to be susceptible, intermediate, or resistant to amoxicillin-clavulanate. A total of 86.5% of strains were found to be susceptible to amoxicillin-clavulanate by the NCCLS methodology, whereas only 43.8% were found to be susceptible by the DIN methodology. Similarly, 4.3% of the strains were found to be resistant by NCCLS guidelines compared to 21.1% by the DIN guidelines. The use of DIN breakpoints resulted in a fivefold-higher incidence of strains categorized as resistant to cefuroxime. There were no marked differences due to country of origin in the MIC distributions for amoxicillin-clavulanate, cefuroxime, or cefotaxime, as determined with the NCCLS guidelines. Isolates from urinary tract and intra-abdominal infections were generally more resistant to amoxicillin-clavulanate than were isolates from other sites of infection. PMID:9574706
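    The guideline effect described above is essentially a categorization of the same MIC distribution against different susceptible/resistant breakpoints. The sketch below illustrates that mechanism; the breakpoint pairs and MIC values are illustrative placeholders only and are not the actual NCCLS or DIN breakpoints for amoxicillin-clavulanate.

```python
# Guideline-dependent susceptibility categorization: the same MICs are
# interpreted against different (susceptible<=, resistant>=) breakpoints.
# Breakpoint values and MICs below are assumed for illustration.

BREAKPOINTS = {            # (susceptible <=, resistant >=) in ug/mL -- assumed
    "guideline_A": (8, 32),
    "guideline_B": (4, 16),
}

def categorize(mic: float, guideline: str) -> str:
    s, r = BREAKPOINTS[guideline]
    if mic <= s:
        return "susceptible"
    if mic >= r:
        return "resistant"
    return "intermediate"

mics = [2, 4, 8, 8, 16, 16, 32, 64]   # hypothetical MICs for a set of isolates
for guideline in BREAKPOINTS:
    counts = {}
    for mic in mics:
        cat = categorize(mic, guideline)
        counts[cat] = counts.get(cat, 0) + 1
    print(guideline, counts)
```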

  10. The KULTURisk Regional Risk Assessment methodology for water-related natural hazards - Part 2: Application to the Zurich case study

    NASA Astrophysics Data System (ADS)

    Ronco, P.; Bullo, M.; Torresan, S.; Critto, A.; Olschewski, R.; Zappa, M.; Marcomini, A.

    2014-07-01

    The main objective of this paper is the application of the KULTURisk Regional Risk Assessment (KR-RRA) methodology, presented in the companion paper (Part 1, Ronco et al., 2014), to the Sihl River valley in Switzerland. Through a process of tuning the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors lying in the Sihl River valley, including the city of Zurich, which represents a typical case of river flooding in an urban area. After characterizing the peculiarities of the specific case study, risk maps have been developed under a 300-year return period scenario (selected as the baseline) for six identified relevant targets exposed to flood risk in the Sihl valley, namely: people, economic activities (including buildings, infrastructure and agriculture), natural and semi-natural systems, and cultural heritage. Finally, the total risk index map, which allows areas and hotspots at risk to be identified and ranked by means of Multi-Criteria Decision Analysis tools, has been produced to visualize the spatial pattern of flood risk within the study area. By means of a tailored participative approach, the total risk maps supplement the judgement of technical experts with the (essential) point of view of the relevant stakeholders for the appraisal of the specific scores and weights related to the receptor-specific risks. The total risk maps obtained for the Sihl River case study are associated with the lower classes of risk. In general, higher relative risks are concentrated in the deeply urbanized area within and around the Zurich city centre and in areas lying just behind the Sihl River course. Here, forecast injuries and potential fatalities are mainly due to high population density and the high presence of old (vulnerable) people; inundated buildings are mainly classified as continuous and discontinuous urban fabric; and flooded roads, pathways and railways, the majority of them around the Zurich main train station (Hauptbahnhof), are at high risk of inundation, causing huge indirect damages. The analysis of flood risk to agriculture, natural and semi-natural systems and cultural heritage has pointed out that these receptors could be relatively less impacted by the selected flood scenario, mainly because of their scattered presence. Finally, the application of the KR-RRA methodology to the Sihl River case study, as well as to several other sites across Europe (not presented here), has demonstrated its flexibility and possible adaptation to different geographical and socio-economic contexts, depending on data availability and the peculiarities of the sites, as well as to other hazard scenarios.
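    At its core, a total risk index map combines receptor-specific risk scores cell by cell using weights agreed with stakeholders. The sketch below shows a simple weighted-sum aggregation standing in for that MCDA step; the receptor names, weights, and tiny example rasters are assumptions, not the KR-RRA implementation.

```python
# Per-cell aggregation of receptor-specific risk rasters into a total risk
# index using stakeholder-derived weights (a weighted sum as a stand-in for
# the MCDA step). Weights, receptor names, and the 2x2 rasters are assumed.
import numpy as np

receptor_risk = {                      # normalized risk scores per grid cell
    "people":              np.array([[0.9, 0.4], [0.2, 0.1]]),
    "economic_activities": np.array([[0.7, 0.6], [0.3, 0.2]]),
    "natural_systems":     np.array([[0.1, 0.2], [0.4, 0.3]]),
    "cultural_heritage":   np.array([[0.0, 0.1], [0.0, 0.2]]),
}
weights = {"people": 0.4, "economic_activities": 0.3,
           "natural_systems": 0.2, "cultural_heritage": 0.1}

total_risk = sum(weights[r] * risk for r, risk in receptor_risk.items())
print(total_risk)   # per-cell total risk index, ready to be classed into risk bands
```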

  11. The story of protein arginine methylation: characterization, regulation, and function.

    PubMed

    Peng, Chao; Wong, Catherine Cl

    2017-02-01

    Arginine methylation is an important post-translational modification (PTM) in cells, which is catalyzed by a group of protein arginine methyltransferases (PRMTs). It plays significant roles in diverse cellular processes and various diseases. Misregulation and aberrant expression of PRMTs can provide potential biomarkers and therapeutic targets for drug discovery. Areas covered: Herein, we review the arginine methylation literature and summarize the methodologies for the characterization of this modification, as well as describe recent insights into arginine methyltransferases and their biological functions in diseases. Expert commentary: Benefiting from enzyme-based large-scale screening approaches and novel affinity enrichment strategies, the arginine-methylated protein family has become a focus of attention. Although a number of arginine methyltransferases and related substrates have been identified, the catalytic mechanism of different types of PRMTs remains unclear and few related demethylases have been characterized. Novel functional studies continuously reveal the importance of this modification in the cell cycle and diseases. A deeper understanding of arginine-methylated proteins, modification sites, and their mechanisms of regulation is needed to explore their role in life processes, especially their relationship with diseases, thus accelerating the generation of potent, selective, cell-penetrant drug candidates.

  12. Geological Investigation Program for the Site of a New Nuclear Power Plant in Hungary

    NASA Astrophysics Data System (ADS)

    Gerstenkorn, András; Trosits, Dalma; Chikán, Géza; János Katona, Tamás

    2015-04-01

    A comprehensive site evaluation program is being implemented for the new nuclear power plant to be constructed at the Paks site in Hungary, with the aim of confirming the acceptability of the site and defining site-related design basis data. The most extensive part of this program is the investigation of the geological-tectonic features of the site, with particular emphasis on the assessment of the capability of faults at and around the site, the characterization of the site seismic hazard, and the definition of the design basis earthquake. A brief description of the scope and methodology of the geological, seismological, geophysical, geotechnical and hydrogeological investigations is given on the poster. The main focus of the presentation is to show the graded structure and extent of the geological investigations, which follow the needs and scale of the geological modeling, starting with the site and its vicinity as well as the near-regional and regional scales. The geological investigations include several boreholes down to the base rock, numerous boreholes penetrating the Pannonian sequence, and a large number of shallow boreholes for investigating more recent development. The planning of the geological investigations is based on the 3D seismic survey performed around the site, which is complemented by a shallow-seismic survey at and in the vicinity of the site. The 3D geophysical imaging provides essential geodynamic information for assessing the capability of near-site faults and for the seismic hazard analysis, as well as for the hydrogeological modeling. The planned seismic survey gives a unique dataset for understanding the spatial relationship between individual fault segments. Planning of the research (trenching, etc.) for paleoseismic manifestations is also based on the 3D seismic survey. The seismic survey and other geophysical data (including data from space geodesy) allow refinement of the understanding and the model of the tectonic evolution of the area and of geological events. As is known from earlier studies, seismic sources in the near-regional area are the dominant contributors to the site seismic hazard. Therefore a 3D geological model will be developed for the 50 km region around the site in order to consider different geological scenarios. Site-scale investigations are aimed at the characterization of local geotechnical and hydrogeological conditions. The geotechnical investigations provide data for the evaluation of the site response, i.e. the free-field ground motion response spectra, the assessment of the liquefaction hazard, and foundation design. An important element of the hydrogeological survey is numerical groundwater modeling. The aim of hydrogeological modeling is the synthesis of hydrogeological data in a numerical system and the description and simulation of underground water flow and transport conditions.

  13. Fast and Efficient Drosophila melanogaster Gene Knock-Ins Using MiMIC Transposons

    PubMed Central

    Vilain, Sven; Vanhauwaert, Roeland; Maes, Ine; Schoovaerts, Nils; Zhou, Lujia; Soukup, Sandra; da Cunha, Raquel; Lauwers, Elsa; Fiers, Mark; Verstreken, Patrik

    2014-01-01

    Modern molecular genetics studies necessitate the manipulation of genes in their endogenous locus, but most of the current methodologies require an inefficient donor-dependent homologous recombination step to locally modify the genome. Here we describe a methodology to efficiently generate Drosophila knock-in alleles by capitalizing on the availability of numerous genomic MiMIC transposon insertions carrying recombinogenic attP sites. Our methodology entails the efficient PhiC31-mediated integration of a recombination cassette flanked by unique I-SceI and/or I-CreI restriction enzyme sites into an attP-site. These restriction enzyme sites allow for double-strand break-mediated removal of unwanted flanking transposon sequences, while leaving the desired genomic modifications or recombination cassettes. As a proof-of-principle, we mutated LRRK, tau, and sky by using different MiMIC elements. We replaced 6 kb of genomic DNA encompassing the tau locus and 35 kb encompassing the sky locus with a recombination cassette that permits easy integration of DNA at these loci, and we also generated a functional LRRK(HA) knock-in allele. Given that ~92% of the Drosophila genes are located within the vicinity (<35 kb) of a MiMIC element, our methodology enables the efficient manipulation of nearly every locus in the fruit fly genome without the need for inefficient donor-dependent homologous recombination events. PMID:25298537

  14. Fast and efficient Drosophila melanogaster gene knock-ins using MiMIC transposons.

    PubMed

    Vilain, Sven; Vanhauwaert, Roeland; Maes, Ine; Schoovaerts, Nils; Zhou, Lujia; Soukup, Sandra; da Cunha, Raquel; Lauwers, Elsa; Fiers, Mark; Verstreken, Patrik

    2014-10-08

    Modern molecular genetics studies necessitate the manipulation of genes in their endogenous locus, but most of the current methodologies require an inefficient donor-dependent homologous recombination step to locally modify the genome. Here we describe a methodology to efficiently generate Drosophila knock-in alleles by capitalizing on the availability of numerous genomic MiMIC transposon insertions carrying recombinogenic attP sites. Our methodology entails the efficient PhiC31-mediated integration of a recombination cassette flanked by unique I-SceI and/or I-CreI restriction enzyme sites into an attP-site. These restriction enzyme sites allow for double-strand break-mediated removal of unwanted flanking transposon sequences, while leaving the desired genomic modifications or recombination cassettes. As a proof-of-principle, we mutated LRRK, tau, and sky by using different MiMIC elements. We replaced 6 kb of genomic DNA encompassing the tau locus and 35 kb encompassing the sky locus with a recombination cassette that permits easy integration of DNA at these loci, and we also generated a functional LRRK(HA) knock-in allele. Given that ~92% of the Drosophila genes are located within the vicinity (<35 kb) of a MiMIC element, our methodology enables the efficient manipulation of nearly every locus in the fruit fly genome without the need for inefficient donor-dependent homologous recombination events. Copyright © 2014 Vilain et al.

  15. Regional risk assessment approaches to land planning for industrial polluted areas in China: the Hulunbeier region case study.

    PubMed

    Li, Daiqing; Zhang, Chen; Pizzol, Lisa; Critto, Andrea; Zhang, Haibo; Lv, Shihai; Marcomini, Antonio

    2014-04-01

    The rapid industrial development and urbanization processes that have occurred in China over the past 30 years have dramatically increased the consumption of natural resources and raw materials, thus exacerbating human pressure on environmental ecosystems. As a result, large-scale environmental pollution of soil, natural waters and urban air has been recorded. The development of effective industrial planning to support regional sustainable economic development has become an issue of serious concern for local authorities, which need to select safe sites for new industrial settlements (i.e. industrial plants) according to assessment approaches considering cumulative impacts, synergistic pollution effects and risks of accidental releases. In order to support decision makers in the development of efficient and effective regional land-use plans encompassing the identification of suitable areas for new industrial settlements and areas in need of intervention measures, this study provides a spatial regional risk assessment methodology which integrates relative risk assessment (RRA) and socio-economic assessment (SEA) and makes use of spatial analysis (GIS) methodologies and multicriteria decision analysis (MCDA) techniques. The proposed methodology was applied to the Chinese region of Hulunbeier, which is located in the eastern Inner Mongolia Autonomous Region, adjacent to the Republic of Mongolia. The application results demonstrated the effectiveness of the proposed methodology in identifying the most hazardous and risky industrial settlements, the most vulnerable regional receptors, and the regional districts that are most relevant for intervention measures, since the latter are characterized by both high regional risk and excellent socio-economic development conditions. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Magnetic Multi-Scale Mapping to Characterize Anthropogenic Targets

    NASA Astrophysics Data System (ADS)

    Le Maire, P.; Munschy, M.

    2017-12-01

    The discovery of buried anthropogenic objects on construction sites can cause delays and/or dangers for workers and for the public. Indeed, every year 500 tons of unexploded ordnance are discovered in France. Magnetic measurements are useful for localizing magnetized objects. Moreover, magnetics is the cheapest geophysical method, does not impact the environment, and is relatively fast to perform. Fluxgate (three-component) magnetometers are used to measure magnetic properties below the ground. These magnetic sensors are not absolute, so they need to be calibrated before the onset of the measurements. The advantage is that they allow magnetic compensation of the equipment attached to the sensor, so this kind of sensor gives the opportunity to install the equipment aboard different magnetized supports: boat, quad bike, unmanned aerial vehicle, aircraft, etc. This methodology therefore permits magnetic mapping at different scales and different elevations above ground level. A former French military aviation plant was chosen for this multi-scale approach. The advantage of the site is that it contains many different targets of variable size and depth, e.g. buildings, unexploded ordnance from the two world wars, trenches, and pipes. By comparing the magnetic anomaly maps obtained at different elevations, some of the geometric parameters of the magnetic sources can be characterized. The comparison between maps measured at different elevations and the upward-continued map highlights the maximum distance at which a target can be detected.
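    Comparing a map measured near the ground with one measured (or predicted) at a higher elevation typically relies on upward continuation of the gridded anomaly. The sketch below shows a standard FFT-based upward continuation under assumed grid size, spacing, and a synthetic anomaly; it ignores edge padding and is not the authors' processing chain.

```python
# FFT-based upward continuation of a gridded magnetic anomaly: the spectrum is
# attenuated by exp(-|k| * dz) and transformed back. Grid, spacing, and the
# synthetic source are illustrative assumptions; edge effects are ignored.
import numpy as np

def upward_continue(anomaly: np.ndarray, dx: float, dz: float) -> np.ndarray:
    """Upward-continue a gridded anomaly (nT) by dz metres; dx is grid spacing (m)."""
    ny, nx = anomaly.shape
    fx = np.fft.fftfreq(nx, d=dx)                       # cycles per metre
    fy = np.fft.fftfreq(ny, d=dx)
    k = 2.0 * np.pi * np.hypot(*np.meshgrid(fx, fy))    # radial wavenumber (rad/m)
    spectrum = np.fft.fft2(anomaly) * np.exp(-k * dz)   # attenuation filter
    return np.real(np.fft.ifft2(spectrum))

# Synthetic example: one compact anomaly on a 1 m grid, continued up by 5 m
x = np.arange(-50, 50, 1.0)
xx, yy = np.meshgrid(x, x)
ground_map = 100.0 * np.exp(-(xx**2 + yy**2) / 50.0)    # nT, arbitrary source
print(upward_continue(ground_map, dx=1.0, dz=5.0).max())  # smaller, broader peak
```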

  17. 10 CFR 60.18 - Review of site characterization activities. 2

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... developed, and on the progress of waste form and waste package research and development. The semiannual... of site characterization will be established. Other topics related to site characterization shall...

  18. 10 CFR 60.18 - Review of site characterization activities. 2

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... developed, and on the progress of waste form and waste package research and development. The semiannual... of site characterization will be established. Other topics related to site characterization shall...

  19. 10 CFR 60.18 - Review of site characterization activities. 2

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... developed, and on the progress of waste form and waste package research and development. The semiannual... of site characterization will be established. Other topics related to site characterization shall...

  20. 10 CFR 60.18 - Review of site characterization activities. 2

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... developed, and on the progress of waste form and waste package research and development. The semiannual... of site characterization will be established. Other topics related to site characterization shall...

  1. Exposure to fall hazards and safety climate in the aircraft maintenance industry.

    PubMed

    Neitzel, Richard L; Seixas, Noah S; Harris, Michael J; Camp, Janice

    2008-01-01

    Falls represent a significant occupational hazard, particularly in industries with dynamic work environments. This paper describes rates of noncompliance with fall hazard prevention requirements, perceived safety climate and worker knowledge and beliefs, and the association between fall exposure and safety climate measures in commercial aircraft maintenance activities. Walkthrough observations were conducted on aircraft mechanics at two participating facilities (Sites A and B) to ascertain the degree of noncompliance. Mechanics at each site completed questionnaires concerning fall hazard knowledge, personal safety beliefs, and safety climate. Questionnaire results were summarized into safety climate and belief scores by workgroup and site. Noncompliance rates observed during walkthroughs were compared to the climate-belief scores, and were expected to be inversely associated. Important differences were seen in fall safety performance between the sites. The study provided a characterization of aircraft maintenance fall hazards, and also demonstrated the effectiveness of an objective hazard assessment methodology. Noncompliance varied by height, equipment used, location of work on the aircraft, shift, and by safety system. Although the expected relationship between safety climate and noncompliance was seen for site-average climate scores, workgroups with higher safety climate scores had greater observed noncompliance within Site A. Overall, use of engineered safety systems had a significant impact on working safely, while safety beliefs and climate also contributed, though inconsistently. The results of this study indicate that safety systems are very important in reducing noncompliance with fall protection requirements in aircraft maintenance facilities. Site-level fall safety compliance was found to be related to safety climate, although an unexpected relationship between compliance and safety climate was seen at the workgroup level within site. Finally, observed fall safety compliance was found to differ from self-reported compliance.

  2. SMARTE'S SITE CHARACTERIZATION TOOL

    EPA Science Inventory

    Site Characterization involves collecting environmental data to evaluate the nature and extent of contamination. Environmental data could consist of chemical analyses of soil, sediment, water or air samples. Typically site characterization data are statistically evaluated for thr...

  3. Vadose zone studies at an industrial contaminated site: the vadose zone monitoring system and cross-hole geophysics

    NASA Astrophysics Data System (ADS)

    Fernandez de Vera, Natalia; Beaujean, Jean; Jamin, Pierre; Nguyen, Frédéric; Dahan, Ofer; Vanclooster, Marnik; Brouyère, Serge

    2014-05-01

    In order to improve risk characterization and remediation measures for soil and groundwater contamination, there is a need to improve in situ vadose zone characterization. However, most available technologies have been developed in the context of agricultural soils. Such methodologies are not applicable at industrial sites, where soils and contamination differ in origin and composition. In addition, most technologies are applicable only in the first few meters of soil, leaving deeper vadose zones poorly characterized, in particular with respect to field-scale heterogeneity. In order to overcome such difficulties, a vadose zone experiment has been set up at a former industrial site in Belgium. Industrial activities carried out on site left a legacy of soil and groundwater contamination by BTEX, PAHs, cyanide and heavy metals. The experiment combines two techniques: the Vadose Zone Monitoring System (VMS) and cross-hole geophysics. The VMS allows continuous measurement of water content and temperature at different depths of the vadose zone. In addition, it provides the possibility of pore water sampling at different depths. The system consists of a flexible sleeve, installed in a slanted borehole, containing monitoring units along its depth. The flexible sleeve contains three types of monitoring units in the vadose zone: Time Domain Transmissometry (TDT) probes, which allow water content measurements; Vadose Sampling Ports (VSP), used for collecting water samples from the matrix; and Fracture Samplers (FS), used for retrieving water samples from fractures. Cross-hole geophysics consists of injecting an electrical current using electrodes installed in vertical boreholes. From measured potential differences, detailed spatial patterns of the electrical properties of the subsurface can be inferred. Such spatial patterns are related to subsurface heterogeneity, water content and solute concentrations. Two VMS were installed in two slanted boreholes on site, together with four vertical boreholes containing electrodes for geophysical measurements. Currently the site is being monitored under natural recharge conditions. Initial results show the reaction of the vadose zone to rainfall events, as well as the chemical evolution of soil water with depth.

  4. Site characterization report for the basalt waste isolation project. Volume II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1982-11-01

    The reference location for a repository in basalt for the terminal storage of nuclear wastes on the Hanford Site and the candidate horizons within this reference repository location have been identified and the preliminary characterization work in support of the site screening process has been completed. Fifteen technical questions regarding the qualification of the site were identified to be addressed during the detailed site characterization phase of the US Department of Energy-National Waste Terminal Storage Program site selection process. Resolution of these questions will be provided in the final site characterization progress report, currently planned to be issued in 1987, and in the safety analysis report to be submitted with the License Application. The additional information needed to resolve these questions and the plans for obtaining the information have been identified. This Site Characterization Report documents the results of the site screening process, the preliminary site characterization data, the technical issues that need to be addressed, and the plans for resolving these issues. Volume 2 contains chapters 6 through 12: geochemistry; surface hydrology; climatology, meteorology, and air quality; environmental, land-use, and socioeconomic characteristics; repository design; waste package; and performance assessment.

  5. 40 CFR 280.63 - Initial site characterization.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 26 2010-07-01 2010-07-01 false Initial site characterization. 280.63... Hazardous Substances § 280.63 Initial site characterization. (a) Unless directed to do otherwise by the implementing agency, owners and operators must assemble information about the site and the nature of the...

  6. 75 FR 39093 - Proposed Confidentiality Determinations for Data Required Under the Mandatory Greenhouse Gas...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-07

    ... information that is sensitive or proprietary, such as detailed process designs or site plans. Because the... Inputs to Emission Equations X Calculation Methodology and Methodological Tier X Data Elements Reported...

  7. Library Web Sites in Pakistan: An Analysis of Content

    ERIC Educational Resources Information Center

    Qutab, Saima; Mahmood, Khalid

    2009-01-01

    Purpose: The purpose of this paper is to investigate library web sites in Pakistan, to analyse their content and navigational strengths and weaknesses and to give recommendations for developing better web sites and quality assessment studies. Design/methodology/approach: Survey of web sites of 52 academic, special, public and national libraries in…

  8. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues

    PubMed Central

    Lago, M. A.; Rúperez, M. J.; Martínez-Martínez, F.; Martínez-Sanchis, S.; Bakic, P. R.; Monserrat, C.

    2015-01-01

    This paper presents a novel methodology for estimating in vivo the elastic constants of a constitutive model proposed to characterize the mechanical behavior of the breast tissues. An iterative search algorithm based on genetic heuristics was constructed to estimate these parameters in vivo using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be readily extended to characterize the real biomechanical behavior of the breast tissues, which represents a significant novelty in the field of the simulation of breast behavior for applications such as surgical planning, surgical guidance or cancer diagnosis. This underlines the impact and relevance of the presented work. PMID:27103760

  9. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues.

    PubMed

    Lago, M A; Rúperez, M J; Martínez-Martínez, F; Martínez-Sanchis, S; Bakic, P R; Monserrat, C

    2015-11-30

    This paper presents a novel methodology for estimating in vivo the elastic constants of a constitutive model proposed to characterize the mechanical behavior of the breast tissues. An iterative search algorithm based on genetic heuristics was constructed to estimate these parameters in vivo using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be readily extended to characterize the real biomechanical behavior of the breast tissues, which represents a significant novelty in the field of the simulation of breast behavior for applications such as surgical planning, surgical guidance or cancer diagnosis. This underlines the impact and relevance of the presented work.
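    The search loop behind such a genetic-heuristic estimation scores candidate elastic constants by the similarity between a simulated deformation and the observed one. The sketch below keeps only that loop runnable: simulate() and similarity() are toy stand-ins (in the actual methodology they would be a finite-element deformation and a combined overlap/distance score), and the parameter bounds and GA settings are assumptions.

```python
# Genetic-heuristic parameter search with toy stand-ins for the simulation and
# the overlap/distance similarity score. Bounds, GA settings, and the hidden
# "true" constants are illustrative assumptions.
import random

TARGET = [3.0e3, 1.5e3, 0.8]            # hidden constants the toy problem recovers

def simulate(params):
    # Stand-in for the finite-element breast deformation.
    return params

def similarity(simulated, observed):
    # Stand-in for the Dice-overlap / surface-distance score: higher is better.
    return -sum((s - o) ** 2 for s, o in zip(simulated, observed))

def genetic_search(observed, bounds, pop_size=30, generations=60,
                   mutation=0.15, seed=0):
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]
    def fitness(ind):
        return similarity(simulate(ind), observed)

    population = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]              # selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]  # blend crossover
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation:                # mutation
                    child[i] = rng.uniform(lo, hi)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

bounds = [(1e2, 1e4), (1e2, 1e4), (0.1, 2.0)]   # assumed elastic-constant ranges
print(genetic_search(TARGET, bounds))            # approaches TARGET
```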

  10. Mission to Malapert

    NASA Astrophysics Data System (ADS)

    Otten, N. D.; Amoroso, E.; Jones, H. L.; Kitchell, F.; Wettergreen, D. S.; Whittaker, W. L.

    2016-11-01

    This work presents methodology for evaluating lunar landing site amenability and identifies promising sites for landing on Malapert Mountain, which features shallow slopes, uninterrupted Earth visibility, and ten-plus days of uninterrupted sunlight.

  11. Metrics for Systems Thinking in the Human Dimension

    DTIC Science & Technology

    2016-11-01

    corpora of documents. 2 Methodology Overview We present a human-in-the-loop methodology that assists researchers and analysts by characterizing...supervised learning methods. Building on this foundation, we present an unsupervised, human-in-the-loop methodology that utilizes topic models to...the definition of strong systems thinking and in the interpretation of topics, but this is what makes the human-in-the-loop methodology so effective

  12. Methodology to Assess No Touch Audit Software Using Field Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cai, Jie; Braun, James E.; Langner, M. Rois

    The research presented in this report builds upon these previous efforts and proposes a set of tests to assess no touch audit tools using real utility bill and on-site data. The proposed assessment methodology explicitly investigates the behaviors of the monthly energy end uses with respect to outdoor temperature, i.e., the building energy signature, to help understand the Tool's disaggregation accuracy. The project team collaborated with Field Diagnosis Services, Inc. (FDSI) to identify appropriate test sites for the evaluation.
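    A building energy signature, as mentioned above, relates monthly energy use to outdoor temperature; one common way to characterize it is a cooling change-point model (base load plus a slope above a balance temperature). The sketch below fits such a model by grid search; it is a generic illustration, not the assessed tool's algorithm, and the monthly values are made up.

```python
# Generic cooling change-point "energy signature" fit: grid-search the balance
# temperature, fit base load and cooling slope by least squares. Data are made up.
import numpy as np

def fit_change_point(t_out, energy, candidate_balances):
    """Return (sse, balance_temp, base_load, cooling_slope) for the best fit."""
    best = None
    for t_b in candidate_balances:
        x = np.maximum(t_out - t_b, 0.0)                 # cooling degree signal
        A = np.column_stack([np.ones_like(x), x])
        coef, *_ = np.linalg.lstsq(A, energy, rcond=None)
        sse = float(np.sum((A @ coef - energy) ** 2))
        if best is None or sse < best[0]:
            best = (sse, t_b, coef[0], coef[1])
    return best

t_out = np.array([2, 5, 9, 14, 19, 24, 27, 26, 21, 15, 8, 3], dtype=float)   # monthly mean, C
kwh = np.array([410, 400, 405, 420, 520, 690, 800, 770, 590, 430, 415, 405], dtype=float)

sse, t_bal, base, slope = fit_change_point(t_out, kwh, np.arange(10.0, 22.0, 0.5))
print(f"balance point ~{t_bal:.1f} C, base load ~{base:.0f} kWh/month, slope ~{slope:.0f} kWh/C")
```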

  13. Application of Spatial Data Modeling and Geographical Information Systems (GIS) for Identification of Potential Siting Options for Various Electrical Generation Sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, Gary T; Belles, Randy; Blevins, Brandon R

    2012-05-01

    Oak Ridge National Laboratory (ORNL) initiated an internal National Electric Generation Siting Study, which is an ongoing multiphase study addressing several key questions related to our national electrical energy supply. This effort has led to the development of a tool, OR-SAGE (Oak Ridge Siting Analysis for power Generation Expansion), to support siting evaluations. The objective in developing OR-SAGE was to use industry-accepted approaches and/or develop appropriate criteria for screening sites and employ an array of Geographic Information Systems (GIS) data sources at ORNL to identify candidate areas for a power generation technology application. The initial phase of the study examined nuclear power generation. These early nuclear phase results were shared with staff from the Electric Power Research Institute (EPRI), which formed the genesis and support for an expansion of the work to several other power generation forms, including advanced coal with carbon capture and storage (CCS), solar, and compressed air energy storage (CAES). Wind generation was not included in this scope of work for EPRI. The OR-SAGE tool is essentially a dynamic visualization database. The results shown in this report represent a single static set of results using a specific set of input parameters. In this case, the GIS input parameters were optimized to support an economic study conducted by EPRI. A single set of individual results should not be construed as an ultimate energy solution, since US energy policy is very complex. However, the strength of the OR-SAGE tool is that numerous alternative scenarios can be quickly generated to provide additional insight into electrical generation or other GIS-based applications. The screening process divides the contiguous United States into 100 x 100 m (1-hectare) squares (cells), applying successive power generation-appropriate site selection and evaluation criteria (SSEC) to each cell. There are just under 700 million cells representing the contiguous United States. If a cell meets the requirements of each criterion, the cell is deemed a candidate area for siting a specific power generation form relative to a reference plant for that power type. Some SSEC parameters preclude siting a power plant because of an environmental, regulatory, or land-use constraint. Other SSEC assist in identifying less favorable areas, such as proximity to hazardous operations. All of the selected SSEC tend to recommend against sites. The focus of the ORNL electrical generation source siting study is on identifying candidate areas from which potential sites might be selected, stopping short of performing any detailed site evaluations or comparisons. This approach is designed to quickly screen for and characterize candidate areas. Critical assumptions supporting this work include the supply of cooling water to thermoelectric power generation; a methodology to provide an adequate siting footprint for typical power plant applications; a methodology to estimate thermoelectric plant capacity while accounting for available cooling water; and a methodology to account for future (~2035) siting limitations as population increases and demands on freshwater sources change. OR-SAGE algorithms were built to account for these critical assumptions. Stream flow is the primary thermoelectric plant cooling source evaluated in this study. All cooling was assumed to be provided by a closed-cycle cooling (CCC) system requiring makeup water to account for evaporation and blowdown. Limited evaluations of shoreline cooling and the use of municipal processed water (gray) cooling were performed. Using a representative set of SSEC as input to the OR-SAGE tool and employing the accompanying critical assumptions, independent results for the various power generation sources studied were calculated.
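    Cell-by-cell screening of this kind can be expressed as successive boolean masks over a raster of 1-hectare cells, with candidate areas being the cells that survive every exclusion. The sketch below illustrates that pattern only; the layer names, thresholds, and random stand-in grids are assumptions, not the actual OR-SAGE criteria or data layers.

```python
# Raster exclusion screening: each criterion becomes a boolean mask, and
# candidate cells are those passing all of them. Layers, thresholds, and the
# small random grids are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
shape = (200, 200)                                  # small stand-in grid of cells

slope_pct      = rng.uniform(0, 30, shape)          # terrain slope
pop_density    = rng.uniform(0, 1000, shape)        # people per square mile
flow_cfs       = rng.uniform(0, 5000, shape)        # nearby stream flow
protected_land = rng.random(shape) < 0.10           # boolean exclusion layer

candidate = (
    (slope_pct < 12)          # buildable terrain
    & (pop_density < 500)     # away from dense population
    & (flow_cfs > 1500)       # adequate cooling-water supply
    & ~protected_land         # outside protected areas
)
print(f"candidate cells: {candidate.sum()} of {candidate.size}")
```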

  14. Effect of Methodological and Ecological Approaches on Heterogeneity of Nest-Site Selection of a Long-Lived Vulture

    PubMed Central

    Moreno-Opo, Rubén; Fernández-Olalla, Mariana; Margalida, Antoni; Arredondo, Ángel; Guil, Francisco

    2012-01-01

    The application of science-based conservation measures requires that sampling methodologies in studies modelling similar ecological aspects produce comparable results, making their interpretation easier. We aimed to show how the choice of different methodological and ecological approaches can affect conclusions in nest-site selection studies across different Palearctic meta-populations of an indicator species. First, a multivariate analysis of the variables affecting nest-site selection in a breeding colony of cinereous vulture (Aegypius monachus) in central Spain was performed. Then, a meta-analysis was applied to establish how methodological and habitat-type factors determine differences and similarities in the results obtained by previous studies that have modelled the forest breeding habitat of the species. Our results revealed patterns in nesting-habitat modelling by the cinereous vulture throughout its whole range: steep, south-facing slopes, high cover of large trees and distance to human activities were generally selected. The ratio and placement of the studied plots (nests/random), the use of plots vs. polygons as sampling units, and the number of years in the data set determined the variability explained by the model. Moreover, a larger breeding colony implied that ecological and geomorphological variables at the landscape level were more influential. Additionally, human activities affected colonies situated in Mediterranean forests to a greater degree. For the first time, a meta-analysis of the factors determining nest-site selection heterogeneity for a single species at a broad scale was achieved. It is essential to homogenize and coordinate experimental design in modelling the selection of species' ecological requirements, so that differences in results among studies are not due merely to methodological heterogeneity. This would help optimize conservation and management practices for habitats and species in a global context. PMID:22413023

  15. A Methodological Framework to Estimate the Site Fidelity of Tagged Animals Using Passive Acoustic Telemetry

    PubMed Central

    Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent

    2015-01-01

    The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
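
    A minimal sketch of the first step the abstract relies on, assuming a simple maximum-gap rule for splitting detections at one receiver into continuous residence times; the paper's survival analysis of these durations across timescales is not reproduced here, and the toy timestamps and gap threshold are hypothetical.

    import numpy as np

    def continuous_residence_times(times, max_gap):
        """Durations (s) of continuous residence events: a visit ends when no
        detection occurs within max_gap seconds of the previous one."""
        t = np.sort(np.asarray(times, dtype=float))
        if t.size == 0:
            return np.array([])
        breaks = np.where(np.diff(t) > max_gap)[0]   # last index of each visit but the final one
        starts = np.concatenate(([0], breaks + 1))
        ends = np.concatenate((breaks, [t.size - 1]))
        return t[ends] - t[starts]

    # Toy example: detections roughly every 60 s, with a 2 h absence in the middle.
    detections = np.concatenate((np.arange(0, 3600, 60),
                                 np.arange(10800, 12600, 60)))
    print(continuous_residence_times(detections, max_gap=600.0))  # two visits: 3540 s and 1740 s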

  16. A Methodological Framework to Estimate the Site Fidelity of Tagged Animals Using Passive Acoustic Telemetry.

    PubMed

    Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent

    2015-01-01

    The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity ("residence times") of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales.

  17. Demonstration of innovative monitoring technologies at the Savannah River Integrated Demonstration Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossabi, J.; Jenkins, R.A.; Wise, M.B.

    1993-12-31

    The Department of Energy's Office of Technology Development initiated an Integrated Demonstration Program at the Savannah River Site in 1989. The objective of this program is to develop, demonstrate, and evaluate innovative technologies that can improve present-day environmental restoration methods. The Integrated Demonstration Program at SRS is entitled "Cleanup of Organics in Soils and Groundwater at Non-Arid Sites." New technologies in the areas of drilling, characterization, monitoring, and remediation are being demonstrated and evaluated for their technical performance and cost effectiveness in comparison with baseline technologies. Present site characterization and monitoring methods are costly, time-consuming, overly invasive, and often imprecise. Better technologies are required to accurately describe the subsurface geophysical and geochemical features of a site and the nature and extent of contamination. More efficient, nonintrusive characterization and monitoring techniques are necessary for understanding and predicting subsurface transport. More reliable procedures are also needed for interpreting monitoring and characterization data. Site characterization and monitoring are key elements in preventing, identifying, and restoring contaminated sites. The remediation of a site cannot be determined without characterization data, and monitoring may be required for 30 years after site closure.

  18. Assessment of methodologies for analysis of the Dungeness B accidental aircraft crash risk.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaChance, Jeffrey L.; Hansen, Clifford W.

    2010-09-01

    The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that is being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield-related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
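
    None of the four methodologies compared in the report is reproduced here; as a hedged illustration of the kind of estimate involved, the sketch below uses a generic "crash rate x traffic x effective target area" form sometimes seen for crashes from an overlying airway, with entirely hypothetical numbers.

    # Illustrative only: a generic estimate of the annual frequency of an aircraft
    # crash onto a facility's effective area from traffic on an overlying airway.
    # This is NOT the UK, IAEA, NRC, or DOE method from the report; all inputs
    # below are hypothetical.
    def airway_crash_frequency(crash_rate_per_mile, flights_per_year,
                               effective_area_sq_miles, airway_width_miles):
        """Expected crashes per year onto the effective plant area."""
        crashes_per_airway_mile_per_year = crash_rate_per_mile * flights_per_year
        return crashes_per_airway_mile_per_year * effective_area_sq_miles / airway_width_miles

    f = airway_crash_frequency(crash_rate_per_mile=4.0e-10,   # crashes per aircraft-mile
                               flights_per_year=50_000,
                               effective_area_sq_miles=0.01,  # footprint + shadow + skid area
                               airway_width_miles=10.0)
    print(f"{f:.2e} crashes/year")   # ~2e-8 per year with these assumed inputs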

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  20. Afraid to Start Because the Outcome is Uncertain?: Social Site Characterization as a Tool for Informing Public Engagement Efforts

    USGS Publications Warehouse

    Wade, S.; Greenberg, S.

    2009-01-01

    This paper introduces the concept of social site characterization as a parallel effort to technical site characterization to be used in evaluating and planning carbon dioxide capture and storage (CCS) projects. Social site characterization, much like technical site characterization, relies on a series of iterative investigations into public attitudes towards a CCS project and the factors that will shape those views. This paper also suggests ways it can be used to design approaches for actively engaging stakeholders and communities in the deployment of CCS projects. This work is informed by observing the site selection process for FutureGen and the implementation of research projects under the Regional Carbon Sequestration Partnership Program. © 2009 Elsevier Ltd. All rights reserved.

  1. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2010-01-01

    Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.

  2. Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales

    NASA Technical Reports Server (NTRS)

    Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul

    2011-01-01

    Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.

  3. Conducting a Surgical Site Infection Prevention Tracer.

    PubMed

    Padgette, Polly; Wood, Brittain

    2018-05-01

    Surgical site infections (SSIs) are the most common health care-associated infections in patients. Approximately half of SSIs are preventable when using evidence-based strategies; however, deviations from evidence-based practice can occur over time. Infection preventionists and perioperative staff members can help prevent these deviations by observing staff member practices using tracer methodology. Tracer methodology uses clinical information to follow patient care, treatment, or services provided throughout the care delivery system. The goal of tracer methodology for SSI prevention is to validate that organizational processes are promoting safer patient care. Using tracers, perioperative and infection prevention staff members can develop strategies to eliminate deviations from evidence-based practice, thereby helping to prevent SSIs and improve patient outcomes. © AORN, Inc, 2018.

  4. Characteristics of sequential swallowing of liquids in young and elderly adults: an integrative review.

    PubMed

    Veiga, Helena Perrut; Bianchini, Esther Mandelbaum Gonçalves

    2012-01-01

    To perform an integrative review of studies on liquid sequential swallowing, characterizing the methodology of the studies and the most important findings in young and elderly adults. The literature in English and Portuguese was reviewed in the PubMed, LILACS, SciELO and MEDLINE databases, covering the past twenty years and restricted to fully available articles, using the following uniterms: sequential swallowing, swallowing, dysphagia, cup, straw, in various combinations. Research articles with a methodological approach to the characterization of liquid sequential swallowing by young and/or elderly adults, regardless of health condition, were included, excluding studies involving only the esophageal phase. The following research indicators were applied: objectives, number and gender of participants; age group; amount of liquid offered; intake instruction; utensil used; methods; and main findings. Eighteen studies met the established criteria. The articles were categorized according to the sample characterization and the methodology regarding volume intake, utensil used and types of exams. Most studies investigated only healthy individuals with no swallowing complaints. Subjects were given different instructions as to the intake of the full volume: in the usual manner, continually, or as rapidly as possible. The findings on the characterization of sequential swallowing were varied and were described in accordance with the objectives of each study. Great variability was found in the methodologies employed to characterize sequential swallowing. Some findings are not comparable, sequential swallowing is not addressed in most swallowing protocols, and there is no consensus on the influence of the utensil.

  5. SITE CHARACTERIZATION LIBRARY: VOLUME 1 (RELEASE 2.5)

    EPA Science Inventory

    This CD-ROM, Volume 1, Release 2.5, of EPA's National Exposure Research Laboratory (NERL - Las Vegas) Site Characterization Library, contains additional electronic documents and computer programs related to the characterization of hazardous waste sites. EPA has produced this libr...

  6. T-4G Methodology: Undergraduate Pilot Training T-37 Phase.

    ERIC Educational Resources Information Center

    Woodruff, Robert R.; And Others

    The report's brief introduction describes the application of T-4G methodology to the T-37 instrument phase of undergraduate pilot training. The methodology is characterized by instruction in trainers, proficiency advancement, a highly structured syllabus, the training manager concept, early exposure to instrument training, and hands-on training.…

  7. Visions of Terror: A Q-Methodological Analysis of American Perceptions of International Terrorism.

    ERIC Educational Resources Information Center

    Dowling, Ralph E.; Nitcavic, Richard G.

    A study examined the efficacy of Q-methodology as a tool to explain perceptions of the American public regarding international terrorism, seeking to identify through this methodology distinct views of terrorism and the significant variables characterizing those views. To develop their instrument, researchers interviewed 16 individuals and based…

  8. The Prone Protected Posture

    DTIC Science & Technology

    1980-08-01

    2. METHODOLOGY. The first step required in this study was to characterize the prone protected posture. Basically, a man in the prone posture differs ... reduction in the presented area of target personnel. Reference 6 contains a concise discussion of the methodology used to generate the shielding functions.

  9. An Overview of Prognosis Health Management Research at Glenn Research Center for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include (1) diagnostic/detection methodology, (2) prognosis/lifing methodology, (3) diagnostic/prognosis linkage, (4) experimental validation, and (5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multimechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  10. An Overview of Prognosis Health Management Research at GRC for Gas Turbine Engine Structures With Special Emphasis on Deformation and Damage Modeling

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Goldberg, Robert K.; Lerch, Bradley A.; Saleeb, Atef F.

    2009-01-01

    Herein a general, multimechanism, physics-based viscoelastoplastic model is presented in the context of an integrated diagnosis and prognosis methodology which is proposed for structural health monitoring, with particular applicability to gas turbine engine structures. In this methodology, diagnostics and prognostics will be linked through state awareness variable(s). Key technologies which comprise the proposed integrated approach include 1) diagnostic/detection methodology, 2) prognosis/lifing methodology, 3) diagnostic/prognosis linkage, 4) experimental validation and 5) material data information management system. A specific prognosis lifing methodology, experimental characterization and validation and data information management are the focal point of current activities being pursued within this integrated approach. The prognostic lifing methodology is based on an advanced multi-mechanism viscoelastoplastic model which accounts for both stiffness and/or strength reduction damage variables. Methods to characterize both the reversible and irreversible portions of the model are discussed. Once the multiscale model is validated the intent is to link it to appropriate diagnostic methods to provide a full-featured structural health monitoring system.

  11. Spatial terrain analysis for matching native tree species to sites: a methodology

    Treesearch

    Robin N. Thwaites

    2000-01-01

    Predicting the distribution of preferable biophysical sites for three of the favored plantation species - Araucaria cunninghamii, Eucalyptus cloeziana, and Flindersia brayleyana is necessary for private land rehabilitation in tropical north Queensland. The distribution of these predicted sites is expressed in a spatial and...

  12. Injector element characterization methodology

    NASA Technical Reports Server (NTRS)

    Cox, George B., Jr.

    1988-01-01

    Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.

  13. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a "site" perception to a more uniform or "national" perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.

  14. Risk-based economic decision analysis of remediation options at a PCE-contaminated site.

    PubMed

    Lemming, Gitte; Friis-Hansen, Peter; Bjerg, Poul L

    2010-05-01

    Remediation methods for contaminated sites cover a wide range of technical solutions with different remedial efficiencies and costs. Additionally, they may vary in their secondary impacts on the environment, i.e. the potential impacts generated by emissions and resource use caused by the remediation activities. Increasing attention is being given to these secondary environmental impacts when evaluating remediation options. This paper presents a methodology for an integrated economic decision analysis which combines assessments of remediation costs, health-risk costs and potential environmental costs. The health-risk costs are associated with the residual contamination left at the site and its migration to groundwater used for drinking water. A probabilistic exposure model using first- and second-order reliability methods (FORM/SORM) is used to estimate the contaminant concentrations at a downstream groundwater well. Potential environmental impacts on the local, regional and global scales due to the site remediation activities are evaluated using life cycle assessment (LCA). The potential impacts on health and environment are converted to monetary units using a simplified cost model. A case study based upon the developed methodology is presented in which the following remediation scenarios are analyzed and compared: (a) no action, (b) excavation and off-site treatment of soil, (c) soil vapor extraction and (d) thermally enhanced soil vapor extraction by electrical heating of the soil. Ultimately, the developed methodology facilitates societal cost estimations of remediation scenarios which can be used for internal ranking of the analyzed options. Despite the inherent uncertainties of placing a value on health and environmental impacts, the presented methodology is believed to be valuable in supporting decisions on remedial interventions. Copyright 2010 Elsevier Ltd. All rights reserved.
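
    A minimal sketch of the kind of probabilistic scenario comparison the abstract describes: total societal cost per scenario as remediation cost plus monetized health-risk and environmental costs. All distributions and figures below are hypothetical, and the FORM/SORM transport model and the LCA are replaced by placeholder draws; this is not the paper's cost model.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000  # Monte Carlo samples

    scenarios = {
        # (remediation cost EUR, monetized health cost EUR, monetized environmental cost EUR)
        "no action":             (0.0e6, rng.lognormal(14.0, 0.8, N), rng.lognormal(10.0, 0.5, N)),
        "excavation, off-site":  (2.0e6, rng.lognormal(11.0, 0.6, N), rng.lognormal(12.5, 0.4, N)),
        "soil vapor extraction": (1.2e6, rng.lognormal(12.0, 0.7, N), rng.lognormal(11.5, 0.4, N)),
        "thermal SVE":           (1.8e6, rng.lognormal(11.2, 0.6, N), rng.lognormal(12.8, 0.4, N)),
    }

    for name, (c_rem, c_health, c_env) in scenarios.items():
        total = c_rem + c_health + c_env
        print(f"{name:22s} mean {total.mean()/1e6:6.2f} MEUR   "
              f"95th pct {np.percentile(total, 95)/1e6:6.2f} MEUR")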

  15. Watershed Regressions for Pesticides (WARP) for Predicting Annual Maximum and Annual Maximum Moving-Average Concentrations of Atrazine in Streams

    USGS Publications Warehouse

    Stone, Wesley W.; Gilliom, Robert J.; Crawford, Charles G.

    2008-01-01

    Regression models were developed for predicting annual maximum and selected annual maximum moving-average concentrations of atrazine in streams using the Watershed Regressions for Pesticides (WARP) methodology developed by the National Water-Quality Assessment Program (NAWQA) of the U.S. Geological Survey (USGS). The current effort builds on the original WARP models, which were based on the annual mean and selected percentiles of the annual frequency distribution of atrazine concentrations. Estimates of annual maximum and annual maximum moving-average concentrations for selected durations are needed to characterize the levels of atrazine and other pesticides for comparison to specific water-quality benchmarks for evaluation of potential concerns regarding human health or aquatic life. Separate regression models were derived for the annual maximum and annual maximum 21-day, 60-day, and 90-day moving-average concentrations. Development of the regression models used the same explanatory variables, transformations, model development data, model validation data, and regression methods as those used in the original development of WARP. The models accounted for 72 to 75 percent of the variability in the concentration statistics among the 112 sampling sites used for model development. Predicted concentration statistics from the four models were within a factor of 10 of the observed concentration statistics for most of the model development and validation sites. Overall, performance of the models for the development and validation sites supports the application of the WARP models for predicting annual maximum and selected annual maximum moving-average atrazine concentration in streams and provides a framework to interpret the predictions in terms of uncertainty. For streams with inadequate direct measurements of atrazine concentrations, the WARP model predictions for the annual maximum and the annual maximum moving-average atrazine concentrations can be used to characterize the probable levels of atrazine for comparison to specific water-quality benchmarks. Sites with a high probability of exceeding a benchmark for human health or aquatic life can be prioritized for monitoring.

  16. Thermal power systems, small power systems application project. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    Current small power system technology as applied to power plants up to 10 MWe in size was assessed. Markets for small power systems were characterized and cost goals were established. Candidate power plant system design concepts were selected for evaluation and preliminary performance and cost assessments were made. Economic studies were conducted and breakeven capital costs were determined for leading contenders among the candidate systems. An application study was made of the potential use of small power systems in providing part of the demand for pumping power by the extensive aqueduct system of California, estimated to be 1000 MWe by 1985. Criteria and methodologies were developed for application to the ranking of candidate power plant system design concepts. Experimental power plants concepts of 1 MWe rating were studied leading toward the definition of a power plant configuration for subsequent detail design, construction, testing and evaluation as Engineering Experiment No. 1 (EE No. 1). Site selection criteria and ground rules for the solicitation of EE No. 1 site participation proposals by DOE were developed.

  17. Probabilistic assessment method of the non-monotonic dose-responses-Part I: Methodological approach.

    PubMed

    Chevillotte, Grégoire; Bernard, Audrey; Varret, Clémence; Ballet, Pascal; Bodin, Laurent; Roudot, Alain-Claude

    2017-08-01

    More and more studies aim to characterize non-monotonic dose-response curves (NMDRCs). The greatest difficulty is to assess the statistical plausibility of NMDRCs from previously conducted dose-response studies. This difficulty is linked to the fact that these studies present (i) few doses tested, (ii) a low sample size per dose, and (iii) the absence of any raw data. In this study, we propose a new methodological approach to probabilistically characterize NMDRCs. The methodology is composed of three main steps: (i) sampling from summary data to cover all the possibilities that may be presented by the responses measured per dose and to obtain a new raw database, (ii) statistical analysis of each sampled dose-response curve to characterize the slopes and their signs, and (iii) characterization of these dose-response curves according to the variation of the sign of the slope. This method can characterize all types of dose-response curves and can be applied both to continuous data and to discrete data. The aim of this study is to present the general principle of this probabilistic method for assessing non-monotonic dose-response curves, and to present some results. Copyright © 2017 Elsevier Ltd. All rights reserved.
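
    A minimal sketch of the three-step idea above, under simplifying assumptions: plausible group means are resampled from the reported mean/SD/n per dose, slopes between adjacent doses are computed for each resampled curve, and the share of curves whose slope changes sign is taken as the probability of a non-monotonic shape. The doses and summary statistics are hypothetical, and this is not the paper's exact procedure.

    import numpy as np

    rng = np.random.default_rng(2)

    doses   = np.array([0.0, 0.1, 1.0, 10.0, 100.0])   # hypothetical dose levels
    means   = np.array([1.0, 1.6, 2.1, 1.4, 0.9])      # reported group means
    sds     = np.array([0.3, 0.4, 0.5, 0.4, 0.3])      # reported group SDs
    n_per_d = 8                                          # subjects per dose group
    B       = 10_000                                     # resampled curves

    non_monotonic = 0
    for _ in range(B):
        # Step (i): draw plausible group means from their sampling distribution
        # (equivalent to simulating n raw values per dose and averaging).
        sim_means = rng.normal(means, sds / np.sqrt(n_per_d))
        # Step (ii): slopes between successive doses.
        slopes = np.diff(sim_means) / np.diff(doses)
        # Step (iii): a sign change in the slope marks a non-monotonic curve.
        signs = np.sign(slopes)
        if np.any(signs[:-1] * signs[1:] < 0):
            non_monotonic += 1

    print(f"P(non-monotonic shape) ~ {non_monotonic / B:.2f}")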

  18. Measurement Sets and Sites Commonly Used for Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Holekamp, Kara; Ryan, Robert; Sellers, Richard; Davis, Bruce; Zanoni, Vicki

    2002-01-01

    Scientists at NASA's Earth Science Applications Directorate are creating a well-characterized Verification & Validation (V&V) site at the Stennis Space Center. This site enables the in-flight characterization of remote sensing systems and the data they acquire. The data are predominantly acquired by commercial, high spatial resolution satellite systems, such as IKONOS and QuickBird 2, and airborne systems. The smaller scale of these newer high resolution remote sensing systems allows scientists to characterize the geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists are also using the SSC V&V site to characterize thermal infrared systems and active LIDAR systems. SSC employs geodetic targets, edge targets, radiometric tarps, and thermal calibration ponds to characterize remote sensing data products. This paper presents a proposed set of required measurements for visible through long-wave infrared remote sensing systems and a description of the Stennis characterization. Other topics discussed include: 1) The use of ancillary atmospheric and solar measurements taken at SSC that support various characterizations; 2) Additional sites used for radiometric, geometric, and spatial characterization in the continental United States; 3) The need for a standardized technique to be adopted by CEOS and other organizations.

  19. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for volatile carbonyl compounds characterization in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction for extraction of the samples' volatiles, followed HPLC-DAD-MS/MS analysis. The experimental conditions in terms of extraction temperature and extraction time were studied. A profile for carbonyl compounds was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of reported literature, with different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human's health implication, post-harvest coffee processing and roasting. The applied methodology showed to be a powerful analytical tool to be used for coffee characterization as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Protecting groundwater resources at biosolids recycling sites.

    PubMed

    McFarland, Michael J; Kumarasamy, Karthik; Brobst, Robert B; Hais, Alan; Schmitz, Mark D

    2013-01-01

    In developing the national biosolids recycling rule (Title 40 of the Code of Federal Regulations Part 503, or Part 503), the USEPA conducted deterministic risk assessments whose results indicated that the probability of groundwater impairment associated with biosolids recycling was insignificant. Unfortunately, the computational capabilities available for performing risk assessments of pollutant fate and transport at that time were limited. Using recent advances in USEPA risk assessment methodology, the present study evaluates whether the current national biosolids pollutant limits remain protective of groundwater quality. To take advantage of new risk assessment approaches, a computer-based groundwater risk characterization screening tool (RCST) was developed using USEPA's Multimedia, Multi-pathway, Multi-receptor Exposure and Risk Assessment program. The RCST, which generates a noncarcinogenic human health risk estimate (i.e., hazard quotient [HQ] value), has the ability to conduct screening-level risk characterizations. The regulated heavy metals modeled in this study were As, Cd, Ni, Se, and Zn. Results from RCST application to biosolids recycling sites located in Yakima County, Washington, indicated that biosolids could be recycled at rates as high as 90 Mg/ha with no negative human health effects associated with groundwater consumption. Only under unrealistically high biosolids land application rates were public health risks characterized as significant (HQ ≥ 1.0). For example, by increasing the biosolids application rate and pollutant concentrations to 900 Mg/ha and 10 times the regulatory limit, respectively, the HQ values varied from 1.4 (Zn) to 324.0 (Se). Since promulgation of Part 503, no verifiable cases of groundwater contamination by regulated biosolids pollutants have been reported. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
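
    A minimal sketch of the hazard-quotient screen the RCST reports, assuming the standard form HQ = chronic average daily dose via drinking water divided by the oral reference dose, with HQ >= 1 flagged as a potential noncarcinogenic concern. The predicted well concentrations and the illustrative reference doses below are assumptions, and the RCST's fate-and-transport modeling is not reproduced here.

    def hazard_quotient(conc_mg_per_L, reference_dose_mg_per_kg_day,
                        intake_L_per_day=2.0, body_weight_kg=70.0):
        """HQ = (concentration x water intake / body weight) / oral reference dose."""
        daily_dose = conc_mg_per_L * intake_L_per_day / body_weight_kg  # mg/kg-day
        return daily_dose / reference_dose_mg_per_kg_day

    # Hypothetical predicted well concentrations (mg/L) and illustrative oral RfDs (mg/kg-day).
    metals = {"As": (0.002, 3.0e-4), "Cd": (0.001, 5.0e-4),
              "Ni": (0.010, 2.0e-2), "Se": (0.020, 5.0e-3), "Zn": (0.50, 3.0e-1)}
    for metal, (conc, rfd) in metals.items():
        hq = hazard_quotient(conc, rfd)
        print(f"{metal}: HQ = {hq:.2f}{'  <-- potential concern' if hq >= 1 else ''}")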

  1. Reliability modelling and analysis of thermal MEMS

    NASA Astrophysics Data System (ADS)

    Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.

    2006-04-01

    This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on a U-shaped micro electrothermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MatLab and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified defects into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of cycles of operation and specific operational conditions.

  2. A Computational Tool for Evaluating THz Imaging Performance in Brownout Conditions at Land Sites Throughout the World

    DTIC Science & Technology

    2009-03-01

    III. Methodology ... applications relating to this research and the results they have obtained, as well as the background on LEEDR. Chapter 3 will detail the methodology ... different in that the snow dissipates faster and it is better to descend slower, at rates of 200-300 ft/min.

  3. Estimating daily forest carbon fluxes using a combination of ground and remotely sensed data

    NASA Astrophysics Data System (ADS)

    Chirici, Gherardo; Chiesi, Marta; Corona, Piermaria; Salvati, Riccardo; Papale, Dario; Fibbi, Luca; Sirca, Costantino; Spano, Donatella; Duce, Pierpaolo; Marras, Serena; Matteucci, Giorgio; Cescatti, Alessandro; Maselli, Fabio

    2016-02-01

    Several studies have demonstrated that Monteith's approach can efficiently predict forest gross primary production (GPP), while the modeling of net ecosystem production (NEP) is more critical, requiring the additional simulation of forest respiration. In the present work, the NEP of different forest ecosystems in Italy was simulated by the use of a remote sensing driven parametric model (modified C-Fix) and a biogeochemical model (BIOME-BGC). The outputs of the two models, which simulate forests in quasi-equilibrium conditions, are combined to estimate the carbon fluxes under actual conditions using information on the existing woody biomass. The estimates derived from the methodology were tested against daily reference GPP and NEP data collected through the eddy correlation technique at five study sites in Italy. The first test concerned the theoretical validity of the simulation approach at both annual and daily time scales and was performed using optimal model drivers (i.e., collected or calibrated over the site measurements). Next, the test was repeated to assess the operational applicability of the methodology, which was driven by spatially extended data sets (i.e., data derived from existing wall-to-wall digital maps). A good estimation accuracy was generally obtained for GPP and NEP when using optimal model drivers. The use of spatially extended data sets worsens the accuracy to a varying degree, which is properly characterized. The model drivers with the most influence on the flux modeling strategy are, in increasing order of importance, forest type, soil features, meteorology, and forest woody biomass (growing stock volume).
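
    A generic sketch of the Monteith-type light-use-efficiency logic the abstract builds on, with NEP as GPP minus ecosystem respiration. The coefficients, scalars, and toy daily inputs are hypothetical; this is not the modified C-Fix or BIOME-BGC formulation used in the paper.

    def daily_gpp(par_mj_m2, fapar, eps_max=1.2, t_scalar=0.9, w_scalar=0.8):
        """GPP (gC m-2 d-1) = eps_max * temperature/water scalars * fAPAR * PAR."""
        return eps_max * t_scalar * w_scalar * fapar * par_mj_m2

    def daily_nep(gpp, autotrophic_resp, heterotrophic_resp):
        """NEP (gC m-2 d-1) = GPP minus ecosystem respiration."""
        return gpp - autotrophic_resp - heterotrophic_resp

    gpp = daily_gpp(par_mj_m2=8.0, fapar=0.75)      # ~5.2 gC m-2 d-1 with these inputs
    nep = daily_nep(gpp, autotrophic_resp=0.45 * gpp, heterotrophic_resp=2.0)
    print(f"GPP = {gpp:.2f} gC m-2 d-1, NEP = {nep:.2f} gC m-2 d-1")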

  4. Preparation and characterization of Mg-Zr mixed oxide aerogels and their application as aldol condensation catalysts.

    PubMed

    Sádaba, Irantzu; Ojeda, Manuel; Mariscal, Rafael; Richards, Ryan; López Granados, Manuel

    2012-10-08

    A series of Mg-Zr mixed oxides with different nominal Mg/(Mg+Zr) atomic ratios, namely 0, 0.1, 0.2, 0.4, 0.85, and 1, is prepared by alcogel methodology and fundamental insights into the phases obtained and resulting active sites are studied. Characterization is performed by X-ray diffraction, transmission electron microscopy, X-ray photoelectron spectroscopy, N(2) adsorption-desorption isotherms, and thermal and chemical analysis. Cubic Mg(x)Zr(1-x)O(2-x) solid solution, which results from the dissolution of Mg(2+) cations within the cubic ZrO(2) structure, is the main phase detected for the solids with theoretical Mg/(Mg+Zr) atomic ratio ≤0.4. In contrast, the cubic periclase (c-MgO) phase derived from hydroxynitrates or hydroxy precursors predominates in the solid with Mg/(Mg+Zr)=0.85. c-MgO is also incipiently detected in samples with Mg/(Mg+Zr)=0.2 and 0.4, but in these solids the c-MgO phase mostly arises from the segregation of Mg atoms out of the alcogel-derived c-Mg(x)Zr(1-x)O(2-x) phase during the calcination process, and therefore the species c-MgO and c-Mg(x)Zr(1-x)O(2-x) are in close contact. Regarding the intrinsic activity in furfural-acetone aldol condensation in the aqueous phase, these Mg-O-Zr sites located at the interface between c-Mg(x)Zr(1-x)O(2-x) and segregated c-MgO display a much larger intrinsic activity than the other noninterface sites that are present in these catalysts: Mg-O-Mg sites on c-MgO and Mg-O-Zr sites on c-Mg(x)Zr(1-x)O(2-x). The very active Mg-O-Zr sites rapidly deactivate in the furfural-acetone condensation due to the leaching of active phases, deposition of heavy hydrocarbonaceous compounds, and hydration of the c-MgO phase. Nonetheless, these Mg-Zr materials with very high specific surface areas would be suitable solid catalysts for other relevant reactions catalyzed by strong basic sites in nonaqueous environments. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The ASCD Healthy School Communities Project: Formative Evaluation Results

    ERIC Educational Resources Information Center

    Valois, Robert F.; Lewallen, Theresa C.; Slade, Sean; Tasco, Adriane N.

    2015-01-01

    Purpose: The purpose of this paper is to report the formative evaluation results from the Association for Supervision and Curriculum Development Healthy School Communities (HSC) pilot project. Design/methodology/approach: This study utilized 11 HSC pilot sites in the USA (eight sites) and Canada (three sites). The evaluation question was…

  6. Validation of SMAP surface soil moisture products with core validation sites

    USDA-ARS?s Scientific Manuscript database

    The NASA Soil Moisture Active Passive (SMAP) mission has utilized a set of core validation sites as the primary methodology in assessing the soil moisture retrieval algorithm performance. Those sites provide well-calibrated in situ soil moisture measurements within SMAP product grid pixels for diver...

  7. Development of a New Methodology for Computing Surface Sensible Heat Fluxes using Thermal Imagery

    NASA Astrophysics Data System (ADS)

    Morrison, T. J.; Calaf, M.; Fernando, H. J.; Price, T. A.; Pardyjak, E.

    2017-12-01

    Current numerical weather prediction models utilize similarity theory to characterize momentum, moisture, and heat fluxes. Such formulations are only valid under the ideal assumptions of spatial homogeneity, statistical stationarity, and zero subsidence. However, recent surface temperature measurements from the Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program on the Salt Flats of Utah's West Desert show that, even under the most a priori ideal conditions, heterogeneity of the aforementioned variables exists. We present a new method to extract spatially distributed measurements of surface sensible heat flux from thermal imagery. The approach consists of using a surface energy budget, where the ground heat flux is computed from limited measurements using a force-restore-type methodology, the latent heat fluxes are neglected, and the energy storage is computed using a lumped capacitance model. Preliminary validation of the method is presented using experimental data acquired from a nearby sonic anemometer during the MATERHORN campaign. Additional evaluation is required to confirm the method's validity. Further decomposition analysis of on-site instrumentation (thermal camera, cold-hotwire probes, and sonic anemometers) using Proper Orthogonal Decomposition (POD) and wavelet analysis reveals time-scale similarity between the flow and surface fluctuations.
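
    A minimal sketch of the surface-energy-budget bookkeeping described above: per thermal-image pixel, H = Rn - G - dS, with latent heat neglected, storage from a lumped-capacitance term, and ground heat flux from a simple force-restore-type relation. The specific parameterizations and all numbers are illustrative assumptions, not the authors' formulation.

    import numpy as np

    OMEGA = 2.0 * np.pi / 86400.0          # diurnal angular frequency (1/s)

    def storage_flux(dTs_dt, rho=1600.0, c=800.0, depth=0.02):
        """Lumped-capacitance heat storage in a thin surface layer (W m-2)."""
        return rho * c * depth * dTs_dt

    def ground_heat_flux(Ts, T_deep, dTs_dt, thermal_inertia=1000.0):
        """A force-restore-type estimate of G (W m-2); form assumed, not the paper's."""
        return thermal_inertia * np.sqrt(OMEGA / 2.0) * ((Ts - T_deep) + dTs_dt / OMEGA)

    def sensible_heat_flux(Rn, Ts, T_deep, dTs_dt):
        """H = Rn - G - dS, neglecting latent heat (dry-playa assumption)."""
        return Rn - ground_heat_flux(Ts, T_deep, dTs_dt) - storage_flux(dTs_dt)

    # Toy per-pixel values: net radiation (W m-2), surface temperature (K), and
    # surface warming rate (K/s) estimated from successive thermal images.
    Rn = np.array([450.0, 430.0]); Ts = np.array([315.0, 313.0])
    H = sensible_heat_flux(Rn, Ts, T_deep=300.0, dTs_dt=np.array([2.0e-4, 1.5e-4]))
    print(H)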

  8. Adhesion, biofilm formation, cell surface hydrophobicity, and antifungal planktonic susceptibility: relationship among Candida spp.

    PubMed

    Silva-Dias, Ana; Miranda, Isabel M; Branco, Joana; Monteiro-Soares, Matilde; Pina-Vaz, Cidália; Rodrigues, Acácio G

    2015-01-01

    We have performed a characterization of the adhesion profile, biofilm formation, cell surface hydrophobicity (CSH) and antifungal susceptibility of 184 Candida clinical isolates obtained from different human reservoirs. Adhesion was quantified using a flow cytometric assay, and biofilm formation was evaluated using two methodologies: the XTT and crystal violet assays. CSH was quantified with the microbial adhesion to hydrocarbons test, while planktonic susceptibility was assessed according to the CLSI protocol for yeast M27-A3 S4. Yeast cells of non-albicans species exhibited an increased ability to adhere and form biofilm. However, the correlation between adhesion and biofilm formation varied according to species and also with the methodology used for biofilm assessment. No association was found between a strain's site of isolation or planktonic antifungal susceptibility and adhesion or biofilm formation. Finally, CSH seemed to be a good predictor of biofilm formation but not of adhesion. Despite the marked variability registered within and between species, C. tropicalis and C. parapsilosis were the species exhibiting the highest adhesion profiles. C. tropicalis, C. guilliermondii, and C. krusei revealed higher biofilm formation values in terms of biomass. C. parapsilosis was the species with the lowest biofilm metabolic activity.

  9. Adhesion, biofilm formation, cell surface hydrophobicity, and antifungal planktonic susceptibility: relationship among Candida spp.

    PubMed Central

    Silva-Dias, Ana; Miranda, Isabel M.; Branco, Joana; Monteiro-Soares, Matilde; Pina-Vaz, Cidália; Rodrigues, Acácio G.

    2015-01-01

    We have performed a characterization of the adhesion profile, biofilm formation, cell surface hydrophobicity (CSH) and antifungal susceptibility of 184 Candida clinical isolates obtained from different human reservoirs. Adhesion was quantified using a flow cytometric assay, and biofilm formation was evaluated using two methodologies: the XTT and crystal violet assays. CSH was quantified with the microbial adhesion to hydrocarbons test, while planktonic susceptibility was assessed according to the CLSI protocol for yeast M27-A3 S4. Yeast cells of non-albicans species exhibited an increased ability to adhere and form biofilm. However, the correlation between adhesion and biofilm formation varied according to species and also with the methodology used for biofilm assessment. No association was found between a strain's site of isolation or planktonic antifungal susceptibility and adhesion or biofilm formation. Finally, CSH seemed to be a good predictor of biofilm formation but not of adhesion. Despite the marked variability registered within and between species, C. tropicalis and C. parapsilosis were the species exhibiting the highest adhesion profiles. C. tropicalis, C. guilliermondii, and C. krusei revealed higher biofilm formation values in terms of biomass. C. parapsilosis was the species with the lowest biofilm metabolic activity. PMID:25814989

  10. SITE CHARACTERIZATION AND ANALYSIS PENETROMETER SYSTEM (SCAPS) LASER-INDUCED FLUORESCENCE (LIF) SENSOR AND SUPPORT SYSTEM

    EPA Science Inventory

    The Consortium for Site Characterization Technology (CSCT) has established a formal program to accelerate acceptance and application of innovative monitoring and site characterization technologies that improve the way the nation manages its environmental problems. In 1995 the CS...

  11. A methodology for optimization of wind farm allocation under land restrictions: the case of the Canary Islands

    NASA Astrophysics Data System (ADS)

    Castaño Moraga, C. A.; Suárez Santana, E.; Sabbagh Rodríguez, I.; Nebot Medina, R.; Suárez García, S.; Rodríguez Alvarado, J.; Piernavieja Izquierdo, G.; Ruiz Alzola, J.

    2010-09-01

    Authorizing wind farms and allocating power to private investors promoting wind energy projects requires planning strategies. This issue is even more important under land restrictions, as is the case in the Canary Islands, where numerous specially protected areas are present for environmental reasons and land is a scarce resource. Aware of this limitation, the Regional Government of the Canary Islands designed the requirements of a public tender to grant licences to install new wind farms, trying to maximize the energy produced per unit of occupied land. In this paper, we detail the methodology developed by the Canary Islands Institute of Technology (ITC, S.A.) to support the work of the technical staff of the Regional Ministry of Industry, responsible for the evaluation of a competitive tender process for awarding power licences to private investors. The maximization of wind energy production per unit of area requires an exhaustive wind profile characterization. To that end, wind speed was statistically characterized by means of a Weibull probability density function, which mainly depends on two parameters: the shape parameter K, which determines the slope of the curve, and the average wind speed v, which acts as a scale parameter. These two parameters have been evaluated at three different heights (40, 60 and 80 m) over the whole Canarian archipelago, as well as the main wind direction. These parameters are available from the public data source Wind Energy Map of the Canary Islands [1]. The proposed methodology is based on the calculation of an Energy Efficiency Basic Index (EEBI), a performance criterion that expresses the annual energy production of a wind farm per unit of occupied area. The calculation of this parameter considers wind conditions, wind turbine characteristics, and the geometry of the wind turbine layout in the wind farm (position within the row and column of machines), and involves four steps: (1) estimation of the energy produced by every wind turbine as if it were isolated from all the other machines of the wind farm, using its power curve and the statistical characterization of the wind profile at the site; (2) estimation of energy losses due to interference caused by other wind turbines in the same row and misalignment with respect to the main wind direction; (3) estimation of energy losses due to interference induced by wind turbines located upstream; and (4) EEBI calculation as the ratio between the annual energy production and the area occupied by the wind farm, as a function of the wind speed profile and wind turbine characteristics. The computations involved are modeled under a system-theory characterization.
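
    A minimal sketch of an EEBI-style figure of merit: expected annual energy production of one turbine from a Weibull wind-speed distribution and a power curve, divided by the land area the turbine occupies. The power curve, Weibull parameters, and spacing assumption are hypothetical, and the wake and misalignment losses from the tender methodology are not modeled here.

    import numpy as np

    def weibull_pdf(v, k, c):
        """Weibull probability density for wind speed v (m/s)."""
        return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

    def power_curve_kw(v):
        """Hypothetical 2 MW turbine: cut-in 3 m/s, rated 12 m/s, cut-out 25 m/s."""
        frac = np.clip((v - 3.0) / (12.0 - 3.0), 0.0, 1.0)
        return np.where((v >= 3.0) & (v <= 25.0), 2000.0 * frac ** 3, 0.0)

    def annual_energy_mwh(k, c, v):
        """Expected annual energy (MWh) = 8760 h x E[P(V)] under the Weibull pdf."""
        dv = v[1] - v[0]
        expected_kw = np.sum(power_curve_kw(v) * weibull_pdf(v, k, c)) * dv
        return 8760.0 * expected_kw / 1000.0

    v = np.linspace(0.0, 30.0, 601)
    k, v_mean = 2.0, 8.0                           # assumed Weibull shape and mean wind speed
    c = v_mean / 0.8862                            # scale from mean: c = v_mean / Gamma(1 + 1/k) for k = 2
    aep = annual_energy_mwh(k, c, v)
    footprint_km2 = (5 * 126) * (3 * 126) / 1e6    # assumed 5D x 3D spacing, 126 m rotor diameter
    print(f"AEP ~ {aep:,.0f} MWh/yr, EEBI-like index ~ {aep / footprint_km2:,.0f} MWh/yr/km^2")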

  12. Importance of geologic characterization of potential low-level radioactive waste disposal sites

    USGS Publications Warehouse

    Weibel, C.P.; Berg, R.C.

    1991-01-01

    Using the example of the Geff Alternative Site in Wayne County, Illinois, for the disposal of low-level radioactive waste, this paper demonstrates, from a policy and public opinion perspective, the importance of accurately determining site stratigraphy. Complete and accurate characterization of geologic materials and determination of site stratigraphy at potential low-level waste disposal sites provides the framework for subsequent hydrologic and geochemical investigations. Proper geologic characterization is critical to determine the long-term site stability and the extent of interactions of groundwater between the site and its surroundings. Failure to adequately characterize site stratigraphy can lead to the incorrect evaluation of the geology of a site, which in turn may result in a lack of public confidence. A potential problem of lack of public confidence was alleviated as a result of the resolution and proper definition of the Geff Alternative Site stratigraphy. The integrity of the investigation was not questioned and public perception was not compromised. ?? 1991 Springer-Verlag New York Inc.

  13. A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Chakraborty, A.; Goto, H.

    2017-12-01

    The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas further inside the mainland because of site amplification. The Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for the mismatch is that the maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and thus are not always reliable. Our research objective is to develop a methodology to incorporate data uncertainties into mapping and to propose a reliable map. The methodology is based on a hierarchical Bayesian model of normally distributed site responses in space, where the mean (μ), site-specific variance (σ2) and between-sites variance (s2) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially autocorrelated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are drawn from the posterior distribution using Markov chain Monte Carlo methods. The goal is to find reliable estimates of μ that are sensitive to uncertainties. During initial trials, we observed that the tau (=1/s2) parameter of the CAR prior controls the estimation of μ. Using the constraint s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum-likelihood model as highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum-likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model mean at higher uncertainties (Fig. 1). This result is highly significant, as it successfully incorporates the effect of data uncertainties into mapping. This novel approach can be applied to any research field using mapping techniques. The methodology is now being applied to real records from a very dense seismic network in the Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.

  14. An Enzyme-Mediated Methodology for the Site-Specific Radiolabeling of Antibodies Based on Catalyst-Free Click Chemistry

    PubMed Central

    Zeglis, Brian M.; Davis, Charles B.; Aggeler, Robert; Kang, Hee Chol; Chen, Aimei; Agnew, Brian J.; Lewis, Jason S.

    2013-01-01

    An enzyme- and click chemistry-mediated methodology for the site-selective radiolabeling of antibodies on the heavy chain glycans has been developed and validated. To this end, a model system based on the prostate-specific membrane antigen (PSMA)-targeting antibody J591, the positron-emitting radiometal 89Zr, and the chelator desferrioxamine has been employed. The methodology consists of four steps: (1) the removal of sugars on the heavy chain region of the antibody to expose terminal N-acetylglucosamine residues; (2) the incorporation of azide-modified N-acetylgalactosamine monosaccharides into the glycans of the antibody; (3) the catalyst-free click conjugation of desferrioxamine-modified dibenzocyclooctynes to the azide-bearing sugars; and (4) the radiolabeling of the chelator-modified antibody with 89Zr. The site-selective labeling methodology has proven facile, reproducible, and robust, producing 89Zr-labeled radioimmunoconjugates that display high stability and immunoreactivity in vitro (>95%) in addition to high selective tumor uptake (67.5 ± 5.0 %ID/g) and tumor-to-background contrast in athymic nude mice bearing PSMA-expressing subcutaneous LNCaP xenografts. Ultimately, this strategy could play a critical role in the development of novel, well-defined, and highly immunoreactive radioimmunoconjugates for both the laboratory and the clinic. PMID:23688208

  15. Assessing Perceived Credibility of Web Sites in a Terrorism Context: The PFLP, Tamil Tigers, Hamas, and Hezbollah

    ERIC Educational Resources Information Center

    Spinks, Brandon Todd

    2009-01-01

    The purpose of the study was to contribute to the overall understanding of terrorist organizations' use of the Internet and to increase researchers' knowledge of Web site effectiveness. The methodological approach was evaluation of the perceived credibility of Web sites based on existing criteria derived from information users. The Web sites of…

  16. GIS and Multi-criteria evaluation (MCE) for landform geodiversity assessment

    NASA Astrophysics Data System (ADS)

    Najwer, Alicja; Reynard, Emmanuel; Zwoliński, Zbigniew

    2014-05-01

    In geomorphology, at the current stage of methodological development, it is important to take up new research problems from both a theoretical and an applied point of view. The problem of landform geodiversity is one example of applying geoconservation results in landscape studies and environmental conservation. The concept of geodiversity was created relatively recently and, therefore, little progress has been made in its objective assessment and mapping. To ensure clarity and coherency, the evaluation process should be rigorous, and multi-criteria evaluation meets this requirement well. The main objective of this presentation is to demonstrate a new methodology for assessing selected natural environment components in response to the definition of geodiversity, as well as for visualizing landform geodiversity, using the opportunities offered by the geoinformation environment. The study area consists of two distinctive alpine valleys, Illgraben and Derborence, located in the Swiss Alps. Apart from glacial and fluvial landforms, the morphology of these two sites is largely due to extreme phenomena (rockslides and torrential processes). Both valleys are recognized as geosites of national importance. The basis of the assessment is the selection of geographical environment features. Firstly, six factor maps were prepared for each area: landform energy, landform fragmentation, contemporary landform preservation, geological settings, and hydrographic elements (lakes and streams). The input maps were then standardized and combined through map algebra operations in a multi-criteria evaluation (MCE) using a GIS-based weighted sum technique. Weights for particular classes were calculated using the pairwise-comparison matrix method. The final stage of deriving the landform geodiversity maps was a reclassification procedure using the natural breaks method. The final maps of landform geodiversity were generated with the same methodological algorithm, multiplying each factor map by its given weight, with a consistency ratio of 0.07. However, the results obtained were radically different. The geodiversity map for Derborence is characterized by much greater fragmentation, and areas of low geodiversity make a larger contribution. At the Illgraben site, there is a significant contribution of high and very high geodiversity classes. The maps were reviewed during field exploration with positive results, which supports the conclusion that the methodology is sound and can be applied to other similar areas. It is therefore important to develop an objective methodology that can be implemented at local and regional scales and that also gives satisfactory results for landscapes different from the alpine one. The maps of landform geodiversity may be used for environmental conservation management, preservation of specific features within a geosite perimeter, spatial planning, or tourism management.
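
    The weighted-sum overlay at the core of the MCE step can be sketched compactly. The snippet below is a hedged illustration with made-up factor rasters and weights, and it substitutes quantile breaks for the natural-breaks (Jenks) reclassification used by the authors.

      import numpy as np

      # Hedged sketch of a GIS-style weighted-sum multi-criteria evaluation (MCE).
      # The factor rasters are hypothetical stand-ins for the six maps in the study.
      rng = np.random.default_rng(1)
      shape = (100, 100)
      factors = {                       # raw factor maps (arbitrary units)
          "landform_energy": rng.gamma(2.0, 10.0, shape),
          "fragmentation": rng.random(shape),
          "preservation": rng.random(shape),
          "geology": rng.integers(1, 6, shape).astype(float),
          "lakes": rng.random(shape),
          "streams": rng.random(shape),
      }
      weights = {k: w for k, w in zip(factors, (0.25, 0.20, 0.15, 0.20, 0.10, 0.10))}

      def standardize(a):
          """Rescale a raster to the 0-1 range (simple min-max standardization)."""
          return (a - a.min()) / (a.max() - a.min())

      geodiversity = sum(weights[k] * standardize(v) for k, v in factors.items())

      # Reclassify into five classes; quantile breaks stand in for natural breaks.
      breaks = np.quantile(geodiversity, [0.2, 0.4, 0.6, 0.8])
      classes = np.digitize(geodiversity, breaks) + 1   # 1 = very low ... 5 = very high
      print(np.bincount(classes.ravel())[1:])           # pixel count per class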

  17. Clay Minerals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Karl T.; Sanders, Rebecca L.; Washton, Nancy M.

    2014-03-14

    Clay minerals are important components of the environment and are involved or implicated in processes such as the uptake of pollutants and the release of nutrients, and they serve as potential platforms for a number of chemical reactions. Owing to their small particle sizes (typically on the order of microns or smaller) and their mixing with a variety of other minerals and soil components, advanced characterization methods are needed to study their structures, dynamics, and reactivities. In this article, we describe the use of solid-state NMR methods to characterize the structures and chemistries of clay minerals. Early one-pulse magic-angle spinning (MAS) NMR studies of 27Al and 29Si have now been enhanced and extended with new studies utilizing advanced methodologies (such as multiple-quantum MAS) as well as studies of less-sensitive nuclei. In additional work, the issue of reactivity of clay minerals has been addressed, including studies of reactive surface area in the environment. The use of NMR-sensitive nuclides within the clay minerals themselves, and in molecules that react with specific sites on the clay mineral surfaces, has aided in understanding the reactivity of these complex aluminosilicate systems.

  18. Hydride bridge in [NiFe]-hydrogenase observed by nuclear resonance vibrational spectroscopy

    DOE PAGES

    Ogata, Hideaki; Krämer, Tobias; Wang, Hongxin; ...

    2015-08-10

    The metabolism of many anaerobes relies on [NiFe]-hydrogenases, whose characterization when bound to substrates has proven non-trivial. Presented here is direct evidence for a hydride bridge in the active site of the 57Fe-labelled fully reduced Ni-R form of Desulfovibrio vulgaris Miyazaki F [NiFe]-hydrogenase. A unique ‘wagging’ mode involving H− motion perpendicular to the Ni(μ-H)57Fe plane was studied using 57Fe-specific nuclear resonance vibrational spectroscopy and density functional theory (DFT) calculations. On Ni(μ-D)57Fe deuteride substitution, this wagging causes a characteristic perturbation of Fe–CO/CN bands. Spectra have been interpreted by comparison with Ni(μ-H/D)57Fe enzyme mimics [(dppe)Ni(μ-pdt)(μ-H/D)57Fe(CO)3]+ and DFT calculations, which collectively indicate a low-spin Ni(II)(μ-H)Fe(II) core for Ni-R, with H− binding Ni more tightly than Fe. Lastly, the present methodology is also relevant to characterizing Fe–H moieties in other important natural and synthetic catalysts.

  19. Hydride bridge in [NiFe]-hydrogenase observed by nuclear resonance vibrational spectroscopy

    PubMed Central

    Ogata, Hideaki; Krämer, Tobias; Wang, Hongxin; Schilter, David; Pelmenschikov, Vladimir; van Gastel, Maurice; Neese, Frank; Rauchfuss, Thomas B.; Gee, Leland B.; Scott, Aubrey D.; Yoda, Yoshitaka; Tanaka, Yoshihito; Lubitz, Wolfgang; Cramer, Stephen P.

    2015-01-01

    The metabolism of many anaerobes relies on [NiFe]-hydrogenases, whose characterization when bound to substrates has proven non-trivial. Presented here is direct evidence for a hydride bridge in the active site of the 57Fe-labelled fully reduced Ni-R form of Desulfovibrio vulgaris Miyazaki F [NiFe]-hydrogenase. A unique ‘wagging' mode involving H− motion perpendicular to the Ni(μ-H)57Fe plane was studied using 57Fe-specific nuclear resonance vibrational spectroscopy and density functional theory (DFT) calculations. On Ni(μ-D)57Fe deuteride substitution, this wagging causes a characteristic perturbation of Fe–CO/CN bands. Spectra have been interpreted by comparison with Ni(μ-H/D)57Fe enzyme mimics [(dppe)Ni(μ-pdt)(μ-H/D)57Fe(CO)3]+ and DFT calculations, which collectively indicate a low-spin Ni(II)(μ-H)Fe(II) core for Ni-R, with H− binding Ni more tightly than Fe. The present methodology is also relevant to characterizing Fe–H moieties in other important natural and synthetic catalysts. PMID:26259066

  20. Toward Paradigmatic Change in TESOL Methodologies: Building Plurilingual Pedagogies from the Ground Up

    ERIC Educational Resources Information Center

    Lin, Angel

    2013-01-01

    Contemporary TESOL methodologies have been characterized by compartmentalization of languages in the classroom. However, recent years have seen the beginning signs of paradigmatic change in TESOL methodologies that indicate a move toward plurilingualism. In this article, the author draws on the case of Hong Kong to illustrate how, in the past four…

  1. 10 CFR 60.16 - Site characterization plan required.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Site characterization plan required. 60.16 Section 60.16 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) DISPOSAL OF HIGH-LEVEL RADIOACTIVE WASTES IN GEOLOGIC REPOSITORIES Licenses Preapplication Review § 60.16 Site characterization plan required. Before proceeding to...

  2. Measurement Sets and Sites Commonly used for Characterizations

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Holekamp, Kara; Ryan, Robert; Blonski, Slawomir; Sellers, Richard; Davis, Bruce; Zanoni, Vicki

    2002-01-01

    Scientists with NASA's Earth Science Applications Directorate are creating a well-characterized Verification & Validation (V&V) site at the Stennis Space Center (SSC). This site enables the in-flight characterization of remote sensing systems and the data they acquire. The data are predominantly acquired by commercial, high-spatial-resolution satellite systems, such as IKONOS and QuickBird 2, and by airborne systems. The smaller scale of these newer high-resolution remote sensing systems allows scientists to characterize the geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from those used for earlier, coarser-spatial-resolution systems. Scientists are also using the SSC V&V site to characterize thermal infrared systems and active Light Detection and Ranging (LIDAR) systems. SSC employs geodetic targets, edge targets, radiometric tarps, and thermal calibration ponds to characterize remote sensing data products. This paper presents a proposed set of required measurements for visible-through-longwave infrared remote sensing systems, and a description of the Stennis characterization. Other topics discussed include: 1) use of ancillary atmospheric and solar measurements taken at SSC that support various characterizations, 2) other sites used for radiometric, geometric, and spatial characterization in the continental United States, and 3) the need for a standardized technique to be adopted by the Committee on Earth Observation Satellites (CEOS) and other organizations.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaFreniere, L. M.

    The Commodity Credit Corporation (CCC), an agency of the U.S. Department of Agriculture (USDA), formerly operated a grain storage facility approximately 1,100 ft north of Centralia (Figure 1.2). The CCC/USDA operated this facility from 1949 until 1971. None of the CCC/USDA structures remain. Two additional grain storage facilities currently exist in and near Centralia: the Nemaha County Co-op, approximately 4,000 ft south of the former CCC/USDA facility, and a private grain storage facility near the Don Morris residence, 3,500 ft north of the former CCC/USDA facility. Prior to 1986, commercial grain fumigants containing carbon tetrachloride were commonly used by the CCC/USDA and the grain storage industry to preserve grain. In April 1998, the Kansas Department of Health and Environment (KDHE) sampled the domestic well at the Don Morris residence near Centralia (Figure 1.2) as part of the CCC/USDA Private Well Sampling Program, which was initiated to determine whether carbon tetrachloride was present in domestic wells located near former CCC/USDA grain storage facilities in Kansas. Carbon tetrachloride was detected in the Morris well at 19.3 µg/L and confirmed at 25.4 µg/L, both concentrations above the maximum contaminant level (MCL) of 5 µg/L for carbon tetrachloride in drinking water. On the basis of the detection of carbon tetrachloride in the Morris well, the KDHE in August-September 1998 conducted preliminary investigations at the former CCC/USDA facility. For the details of previous investigations in the area and a summary of their findings, see the QuickSite® Phase I Work Plan for Centralia (Argonne 2002a). Because the KDHE found carbon tetrachloride at the former CCC/USDA facility at Centralia that might, in part, be linked to historical use of carbon tetrachloride-based grain fumigants at the facility, the CCC/USDA is conducting an environmental site investigation at Centralia. However, the KDHE established in 1998 that the probable groundwater flow direction at the former CCC/USDA facility is not toward the Morris well, and thus the former facility is not responsible for the carbon tetrachloride measured in that well. The town of Centralia and all residents near the former CCC/USDA facility currently obtain their water from Rural Water District No. 3 (RWD 3); therefore, these local residents are not drinking or otherwise using contaminated groundwater. The investigation at Centralia is being performed by the Environmental Research Division of Argonne National Laboratory. Argonne is a nonprofit, multidisciplinary research center operated by the University of Chicago for the U.S. Department of Energy (DOE). The CCC/USDA has entered into an interagency agreement with DOE, under which Argonne provides technical assistance to the CCC/USDA with environmental site characterization and remediation at its former grain storage facilities. At these facilities, Argonne is applying its QuickSite environmental site characterization methodology. QuickSite is Argonne's proprietary implementation system for the expedited site characterization (ESC) process. Argonne's Environmental Research Division developed the ESC process to optimize preremedial site characterization work at hazardous waste sites by obtaining and then applying a thorough understanding of a site's geology, hydrogeology, and hydrogeochemistry (e.g., Burton 1994).
This approach is fundamental to successful site characterization because the geology and hydrogeology of a site largely govern the mobility and fate of contaminants there. Argonne's ESC process has been used successfully at a number of former CCC/USDA facilities in Kansas and Nebraska and has been adopted by the American Society for Testing and Materials (ASTM 1998) as standard practice for environmental site characterization. This report documents the findings of the Phase I activities at Centralia. Section 1 provides a brief history of the area and the QuickSite process, a summary of the geologic/hydrogeologic model, the objectives of the Phase I investigation, and a brief description of the sections contained in this report. Section 2 describes the investigative methods used during the Phase I investigation. Section 3 presents all of the data obtained during the investigation. Section 4 describes the interpretation of the pertinent data used to meet the technical objectives of the investigation, including the contaminant migration pathways in soil and groundwater. A summary of the findings is also provided in Section 4. Section 5 presents the conclusions of the investigation relative to the technical objectives and outlines recommendations for Phase II. To streamline the reporting process, materials from the Work Plan (Argonne 2002a) and relevant sections of the Master Work Plan (Argonne 2002b) are not repeated in detail in this report. Consequently, these documents must also be consulted to obtain the complete details of the Phase I investigative program.

  4. Studying the Education of Educators: Methodology.

    ERIC Educational Resources Information Center

    Sirotnik, Kenneth A.

    1988-01-01

    Describes the methodology and research design of SEE, the study of the Education of Educators. The approach is multimethodological, exploratory, descriptive, and evaluative. The research design permits examination of working assumptions and concentration on the individual site--the college, the education departments, and specific programs within…

  5. Factors Contributing to Institutions Achieving Environmental Sustainability

    ERIC Educational Resources Information Center

    James, Matthew; Card, Karen

    2012-01-01

    Purpose: The purpose of this paper is to determine what factors contributed to three universities achieving environmental sustainability. Design/methodology/approach: A case study methodology was used to determine how each factor contributed to the institutions' sustainability. Site visits, fieldwork, document reviews, and interviews with…

  6. 10 CFR 63.16 - Review of site characterization activities. 2

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...

  7. 10 CFR 63.16 - Review of site characterization activities. 2

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...

  8. 10 CFR 63.16 - Review of site characterization activities. 2

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...

  9. 10 CFR 63.16 - Review of site characterization activities. 2

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...

  10. 10 CFR 63.16 - Review of site characterization activities. 2

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... which such activities are carried out and to observe excavations, borings, and in situ tests, as they... IN A GEOLOGIC REPOSITORY AT YUCCA MOUNTAIN, NEVADA Licenses Preapplication Review § 63.16 Review of site characterization activities. 2 2 In addition to the review of site characterization activities...

  11. SH-wave refraction/reflection and site characterization

    USGS Publications Warehouse

    Wang, Z.; Street, R.L.; Woolery, E.W.; Madin, I.P.

    2000-01-01

    Traditionally, nonintrusive techniques used to characterize soils have been based on P-wave refraction/reflection methods. However, near-surface unconsolidated soils are oftentimes water-saturated, and when groundwater is present at a site, the velocity of the P-waves is more related to the compressibility of the pore water than to the matrix of the unconsolidated soils. Conversely, SH-waves are directly relatable to the soil matrix. This makes SH-wave refraction/reflection methods effective in site characterizations where groundwater is present. SH-wave methods have been used extensively in site characterization and subsurface imaging for earthquake hazard assessments in the central United States and western Oregon. Comparison of SH-wave investigations with geotechnical investigations shows that SH-wave refraction/reflection techniques are viable and cost-effective for engineering site characterization.

  12. Collaborative Initiative on Fetal Alcohol Spectrum Disorders: Methodology of Clinical Projects

    PubMed Central

    Mattson, Sarah N.; Foroud, Tatiana; Sowell, Elizabeth R.; Jones, Kenneth Lyons; Coles, Claire D.; Fagerlund, Åse; Autti-Rämö, Ilona; May, Philip A.; Adnams, Colleen M.; Konovalova, Valentina; Wetherill, Leah; Arenson, Andrew D.; Barnett, William K.; Riley, Edward P.

    2009-01-01

    The Collaborative Initiative on Fetal Alcohol Spectrum Disorders (CIFASD) was created in 2003 to further understanding of fetal alcohol spectrum disorders. Clinical and basic science projects collect data across multiple sites using standardized methodology. This paper describes the methodology being used by the clinical projects that pertain to assessment of children and adolescents. Domains being addressed are dysmorphology, neurobehavior, 3D facial imaging, and brain imaging. PMID:20036488

  13. Innovations in Site Characterization Case Study: The Role of a Conceptual Site Model for Expedited Site Characterization Using the Triad Approach at the Poudre River Site, Fort Collins, Colorado

    EPA Pesticide Factsheets

    This case study examines how systematic planning, an evolving conceptual site model (CSM), dynamic work strategies, and real time measurement technologies can be used to unravel complex contaminant distribution patterns...

  14. Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.

    PubMed

    Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob

    2017-06-12

    A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing the ion-exchange equilibrium of commercial and experimental IEX resins across a range of applications in which the water environment differs from site to site. Because of its much higher throughput, design of experiments (DOE) methodology can easily be applied to study the effects of multiple factors on resin performance. Two case studies are presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion exchange resins were screened for selective removal of NO3- and NO2- in water environments consisting of multiple other anions, varied pH, and varied ionic strength. A response surface model (RSM) is developed to statistically correlate resin performance with water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method were applied to screen different cation exchange resins for selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time needed to address industry and application needs.
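
    As a hedged illustration of the DOE/response-surface step (not the authors' workflow or data), the sketch below fits a quadratic response surface to hypothetical removal-efficiency measurements as a function of pH and ionic strength using ordinary least squares.

      import numpy as np

      # Hypothetical two-factor DOE: removal efficiency vs. pH and ionic strength.
      rng = np.random.default_rng(2)
      pH = rng.uniform(4, 10, 30)
      ionic = rng.uniform(0.01, 0.5, 30)            # mol/L, made-up range
      removal = 80 - 2*(pH - 7)**2 - 60*ionic + rng.normal(0, 2, 30)  # synthetic response

      # Quadratic response-surface model:
      #   y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
      X = np.column_stack([np.ones_like(pH), pH, ionic, pH**2, ionic**2, pH*ionic])
      coef, *_ = np.linalg.lstsq(X, removal, rcond=None)

      def predict(p, i):
          """Evaluate the fitted response surface at pH p and ionic strength i."""
          return np.dot([1, p, i, p**2, i**2, p*i], coef)

      print("fitted coefficients:", np.round(coef, 2))
      print("predicted removal at pH 7, 0.05 M:", round(predict(7.0, 0.05), 1), "%")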

  15. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.
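
    The first-pass culling of equipment pairs can be sketched as a simple filter on frequency separation and distance, as a hedged stand-in for the methodology's analytical models; the emitters and thresholds below are illustrative assumptions.

      from dataclasses import dataclass
      from itertools import combinations

      # Hedged sketch of a first-pass EMI cull (not the actual methodology's models):
      # drop equipment pairs whose tuned frequencies are far apart or whose separation
      # distance makes coupling implausible. All thresholds are illustrative.

      @dataclass
      class Emitter:
          name: str
          freq_mhz: float
          power_w: float
          x_km: float
          y_km: float

      def distance_km(a, b):
          return ((a.x_km - b.x_km) ** 2 + (a.y_km - b.y_km) ** 2) ** 0.5

      def plausible_interferer(a, b, freq_tol_mhz=5.0, max_range_km=50.0):
          """Keep a pair for detailed analysis only if it is close in frequency and space."""
          return abs(a.freq_mhz - b.freq_mhz) <= freq_tol_mhz and distance_km(a, b) <= max_range_km

      fleet = [
          Emitter("radar_A", 3000.0, 1e6, 0.0, 0.0),
          Emitter("comms_B", 3002.5, 50.0, 12.0, 3.0),
          Emitter("nav_C", 1090.0, 500.0, 5.0, 5.0),
      ]
      survivors = [(a.name, b.name) for a, b in combinations(fleet, 2) if plausible_interferer(a, b)]
      print("pairs retained for detailed EMI analysis:", survivors)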

  16. Green Remediation Best Management Practices: Overview of EPA's Methodology to Address the Environmental Footprint of Site Cleanup

    EPA Pesticide Factsheets

    Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.

  17. Ecological sites: A useful tool for land management

    Treesearch

    Alicia N. Struckhoff; Douglas Wallace; Fred Young

    2017-01-01

    Developing ecological sites in Missouri is a multiagency, multidiscipline effort led by the Missouri Department of Conservation and the U.S. Department of Agriculture (USDA) Natural Resources Conservation Service. The methodology developed in Missouri has recently served as a model for ecological site development across the country and has aided in an initiative to...

  18. The Connectivity Between Site-Specific Life Cycle Impact Assessment and Site-Specific Weighting

    EPA Science Inventory

    The goal of many LCIAs is to come to a single score with all of the impacts from a wide variety of impact assessments weighted to form this single score. My past experiences with developing site-specific impact assessment methodologies and how this can change the valuation porti...

  19. Exploiting Multisite Gateway and pENFRUIT plasmid collection for fruit genetic engineering.

    PubMed

    Estornell, Leandro H; Granell, Antonio; Orzaez, Diego

    2012-01-01

    MultiSite Gateway cloning techniques based on homologous recombination facilitate the combinatorial assembly of basic genetic pieces (i.e., promoters, CDS, and terminators) into gene expression or gene silencing cassettes. pENFRUIT is a collection of MultiSite Triple Gateway Entry vectors dedicated to genetic engineering in fruits. It comprises a number of fruit-operating promoters as well as C-terminal tags adapted to the Gateway standard. In this way, flanking regulatory/labeling sequences can be easily Gateway-assembled with a given gene of interest for its ectopic expression or silencing in fruits. The resulting gene constructs can be analyzed in stable transgenic plants or in transient expression assays, the latter allowing fast testing of the increasing number of combinations arising from MultiSite methodology. A detailed description of the use of MultiSite cloning methodology for the assembly of pENFRUIT elements is presented.

  20. Munitions and Explosives of Concern Survey Methodology and In-field Testing for Wind Energy Areas on the Atlantic Outer Continental Shelf

    NASA Astrophysics Data System (ADS)

    DuVal, C.; Carton, G.; Trembanis, A. C.; Edwards, M.; Miller, J. K.

    2017-12-01

    Munitions and explosives of concern (MEC) are present in U.S. waters as a result of past and ongoing live-fire testing and training, combat operations, and sea disposal. To identify MEC that may pose a risk to human safety during development of offshore wind facilities on the Atlantic Outer Continental Shelf (OCS), the Bureau of Ocean Energy Management (BOEM) is preparing to develop guidance on risk analysis and selection processes for methods and technologies to identify MEC in Wind Energy Areas (WEA). This study developed a process for selecting appropriate technologies and methodologies for MEC detection using a synthesis of historical research, physical site characterization, remote sensing technology review, and in-field trials. Personnel were tasked with seeding a portion of the Delaware WEA with munitions surrogates, while a second group of researchers not privy to the surrogate locations tested and optimized the selected methodology to find and identify the placed targets. This in-field trial, conducted in July 2016, emphasized the use of multiple sensors for MEC detection and led to further guidance for future MEC detection efforts on the Atlantic OCS. An April 2017 follow-on study determined the fate of the munitions surrogates after the Atlantic storm season had passed. Using regional hydrodynamic models and incorporating the recommendations from the 2016 field trial, the follow-on study examined the fate of the MEC and compared the findings to existing research on munitions mobility, as well as to models developed as part of the Office of Naval Research Mine-Burial Program. Focus was given to characterizing the influence of sediment type on surrogate munitions behavior and the influence of morphodynamics and object burial on MEC detection. Consistent with the Mine-Burial models, ripple bedforms were observed to impede surrogate scour and burial in coarse sediments, while surrogate burial was both predicted and observed in finer sediments. Further, incorporation of recommendations from the previous trial into the 2017 study led to a fourfold improvement in MEC detection rates over the 2016 approach. The use of modeling to characterize local morphodynamics, MEC burial or mobility, and the impact of seasonal or episodic storm events is discussed in light of technology selection and timing for future MEC detection surveys.

  1. Unravelling Soil Fungal Communities from Different Mediterranean Land-Use Backgrounds

    PubMed Central

    Nilsson, R. Henrik; Girlanda, Mariangela; Vizzini, Alfredo; Bonfante, Paola; Bianciotto, Valeria

    2012-01-01

    Background Fungi strongly influence ecosystem structure and functioning, playing a key role in many ecological services as decomposers, plant mutualists, and pathogens. The Mediterranean area is a biodiversity hotspot that is increasingly threatened by intense land use. Therefore, to achieve a balance between conservation and human development, a better understanding of the impact of land use on the underlying fungal communities is needed. Methodology/Principal Findings We used parallel pyrosequencing of the nuclear ribosomal ITS regions to characterize the fungal communities in five soils subjected to different anthropogenic impacts in a typical Mediterranean landscape: a natural cork-oak forest, a pasture, a managed meadow, and two vineyards. Marked differences in the distribution of taxon assemblages among the different sites and communities were found. Data analyses consistently indicated a sharp distinction between the fungal community of the cork-oak forest soil and those described in the other soils. Each soil showed features of the retrieved fungal assemblages that can be readily related to the above-ground setting: ectomycorrhizal phylotypes were numerous in natural sites covered by trees but were nearly completely missing from the anthropogenic and grass-covered sites; similarly, coprophilous fungi were common in grazed sites. Conclusions/Significance The data suggest that investigation of the below-ground fungal community may provide useful information on above-ground features such as vegetation cover and agronomic practices, allowing the cost of anthropogenic land use to hidden diversity in soil to be assessed. The datasets provided in this study may contribute to future searches for fungal bio-indicators as biodiversity markers of a specific site or degree of land use. PMID:22536336

  2. Model Validation and Site Characterization for Early Deployment MHK Sites and Establishment of Wave Classification Scheme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kilcher, Levi F

    Model Validation and Site Characterization for Early Deployment Marine and Hydrokinetic Energy Sites and Establishment of Wave Classification Scheme presentation from the Water Power Technologies Office Peer Review, FY14-FY16.

  3. Recent Experience Using Active Love Wave Techniques to Characterize Seismographic Station Sites

    NASA Astrophysics Data System (ADS)

    Martin, A. J.; Yong, A.; Salomone, L.

    2014-12-01

    Active-source Love waves recorded by the multi-channel analysis of surface waves (MASLW) technique were recently analyzed in two site characterization projects. Between 2010 and 2011, the 2009 American Recovery and Reinvestment Act (ARRA) funded GEOVision to conduct geophysical investigations at 189 seismographic stations—185 in California and 4 in the Central and Eastern U.S. (CEUS). The original project plan was to utilize active and passive Rayleigh wave-based techniques to obtain shear-wave velocity (VS) profiles to a minimum depth of 30 m and the time-averaged VS of the upper 30 meters (VS30). Early in the investigation it became evident that Rayleigh wave techniques, such as multi-channel analysis of surface waves (MASRW), were not effective at characterizing all sites. Shear-wave seismic refraction and MASLW techniques were therefore applied. The MASLW technique was deployed at a total of 38 sites, in addition to other methods, and used as the primary technique to characterize 22 sites, 5 of which were also characterized using Rayleigh wave techniques. In 2012, the Electric Power Research Institute funded characterization of 33 CEUS station sites. Based on experience from the ARRA investigation, both MASRW and MASLW data were acquired by GEOVision at 24 CEUS sites—the remaining 9 sites and 2 overlapping sites were characterized by the University of Texas at Austin. Of the 24 sites characterized by GEOVision, 16 were characterized using MASLW data, 4 using both MASLW and MASRW data, and 4 using MASRW data. Love wave techniques were often found to perform better, or at least to yield phase velocity data that could be more readily modeled under the fundamental-mode assumption, at shallow rock sites, sites with steep velocity gradients, and sites with a thin, low-velocity surficial soil layer overlying stiffer sediments. These types of velocity structure often excite dominant higher modes in Rayleigh wave data, but not in Love wave data. At such sites, it may be possible to model Rayleigh wave data using multi-mode or effective-mode techniques; however, in many cases extraction of adequate Rayleigh wave dispersion data for modeling was difficult. These results imply that field procedures should include careful scrutiny of Rayleigh wave-based dispersion data so that Love wave data can be collected when warranted.
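
    The quantity reported for each station, VS30, is the time-averaged shear-wave velocity of the upper 30 m, VS30 = 30 / Σ(h_i / VS_i). The sketch below computes it for a hypothetical layered profile; the layer thicknesses and velocities are assumptions for illustration only.

      # Time-averaged shear-wave velocity of the upper 30 m:
      #   VS30 = 30 / sum(h_i / Vs_i), summed over the layers within the top 30 m.
      # The layer model below is hypothetical.

      layers = [  # (thickness in m, Vs in m/s)
          (3.0, 180.0),
          (7.0, 320.0),
          (12.0, 450.0),
          (20.0, 760.0),   # only the top 8 m of this layer count toward 30 m
      ]

      def vs30(layers):
          remaining, travel_time = 30.0, 0.0
          for h, vs in layers:
              use = min(h, remaining)     # clip the layer at the 30 m depth limit
              travel_time += use / vs
              remaining -= use
              if remaining <= 0:
                  break
          if remaining > 0:
              raise ValueError("profile shallower than 30 m")
          return 30.0 / travel_time

      print(f"VS30 = {vs30(layers):.0f} m/s")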

  4. Site-Specific Integration of Foreign DNA into Minimal Bacterial and Human Target Sequences Mediated by a Conjugative Relaxase

    PubMed Central

    Agúndez, Leticia; González-Prieto, Coral; Machón, Cristina; Llosa, Matxalen

    2012-01-01

    Background Bacterial conjugation is a mechanism for horizontal DNA transfer between bacteria which requires cell to cell contact, usually mediated by self-transmissible plasmids. A protein known as relaxase is responsible for the processing of DNA during bacterial conjugation. TrwC, the relaxase of conjugative plasmid R388, is also able to catalyze site-specific integration of the transferred DNA into a copy of its target, the origin of transfer (oriT), present in a recipient plasmid. This reaction confers TrwC a high biotechnological potential as a tool for genomic engineering. Methodology/Principal Findings We have characterized this reaction by conjugal mobilization of a suicide plasmid to a recipient cell with an oriT-containing plasmid, selecting for the cointegrates. Proteins TrwA and IHF enhanced integration frequency. TrwC could also catalyze integration when it is expressed from the recipient cell. Both Y18 and Y26 catalytic tyrosil residues were essential to perform the reaction, while TrwC DNA helicase activity was dispensable. The target DNA could be reduced to 17 bp encompassing TrwC nicking and binding sites. Two human genomic sequences resembling the 17 bp segment were accepted as targets for TrwC-mediated site-specific integration. TrwC could also integrate the incoming DNA molecule into an oriT copy present in the recipient chromosome. Conclusions/Significance The results support a model for TrwC-mediated site-specific integration. This reaction may allow R388 to integrate into the genome of non-permissive hosts upon conjugative transfer. Also, the ability to act on target sequences present in the human genome underscores the biotechnological potential of conjugative relaxase TrwC as a site-specific integrase for genomic modification of human cells. PMID:22292089

  5. OPUS: Optimal Projection for Uncertain Systems. Volume 1

    DTIC Science & Technology

    1991-09-01

    unified control-design methodology that directly addresses these technology issues. In particular, optimal projection theory addresses the need for ... effects, and limited identification accuracy in a 1-g environment. The principal contribution of OPUS is a unified design methodology that ... characterizing solutions to constrained control-design problems. Transforming OPUS into a practical design methodology requires the development of

  6. Quantitative PET/CT scanner performance characterization based upon the society of nuclear medicine and molecular imaging clinical trials network oncology clinical simulator phantom.

    PubMed

    Sunderland, John J; Christian, Paul E

    2015-01-01

    The Clinical Trials Network (CTN) of the Society of Nuclear Medicine and Molecular Imaging (SNMMI) operates a PET/CT phantom imaging program using the CTN's oncology clinical simulator phantom, designed to validate scanners at sites that wish to participate in oncology clinical trials. Since its inception in 2008, the CTN has collected 406 well-characterized phantom datasets from 237 scanners at 170 imaging sites covering the spectrum of commercially available PET/CT systems. The combined and collated phantom data describe a global profile of quantitative performance and variability of PET/CT data used in both clinical practice and clinical trials. Individual sites filled and imaged the CTN oncology PET phantom according to detailed instructions. Standard clinical reconstructions were requested and submitted. The phantom itself contains uniform regions suitable for scanner calibration assessment, lung fields, and 6 hot spheric lesions with diameters ranging from 7 to 20 mm at a 4:1 contrast ratio with primary background. The CTN Phantom Imaging Core evaluated the quality of the phantom fill and imaging and measured background standardized uptake values to assess scanner calibration and maximum standardized uptake values of all 6 lesions to review quantitative performance. Scanner make-and-model-specific measurements were pooled and then subdivided by reconstruction to create scanner-specific quantitative profiles. Different makes and models of scanners predictably demonstrated different quantitative performance profiles including, in some cases, small calibration bias. Differences in site-specific reconstruction parameters increased the quantitative variability among similar scanners, with postreconstruction smoothing filters being the most influential parameter. Quantitative assessment of this intrascanner variability over this large collection of phantom data gives, for the first time, estimates of reconstruction variance introduced into trials from allowing trial sites to use their preferred reconstruction methodologies. Predictably, time-of-flight-enabled scanners exhibited less size-based partial-volume bias than non-time-of-flight scanners. The CTN scanner validation experience over the past 5 y has generated a rich, well-curated phantom dataset from which PET/CT make-and-model and reconstruction-dependent quantitative behaviors were characterized for the purposes of understanding and estimating scanner-based variances in clinical trials. These results should make it possible to identify and recommend make-and-model-specific reconstruction strategies to minimize measurement variability in cancer clinical trials. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
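
    The standardized uptake values compared across scanners follow the usual body-weight normalization, SUV = tissue concentration / (decay-corrected dose / mass). The sketch below is a hedged illustration of that arithmetic with made-up numbers; it is not the CTN Phantom Imaging Core's analysis code.

      import math

      # Hedged sketch of body-weight SUV, the quantity compared across scanners:
      #   SUV_bw = C_img [kBq/mL] / (decay-corrected dose [kBq] / mass [g])
      # (1 mL of water-equivalent medium ~ 1 g). All numbers below are made up.

      def decay_corrected_dose_kbq(injected_mbq, minutes_since_injection, half_life_min=109.77):
          """F-18 activity remaining at scan time, in kBq."""
          return injected_mbq * 1000.0 * math.exp(-math.log(2) * minutes_since_injection / half_life_min)

      def suv_bw(concentration_kbq_ml, injected_mbq, minutes_since_injection, mass_kg):
          dose_kbq = decay_corrected_dose_kbq(injected_mbq, minutes_since_injection)
          return concentration_kbq_ml / (dose_kbq / (mass_kg * 1000.0))

      # Uniform phantom background: a well-calibrated scanner should read SUV ~ 1.0.
      print(round(suv_bw(concentration_kbq_ml=6.3, injected_mbq=370,
                         minutes_since_injection=60, mass_kg=40.0), 2))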

  7. Increasing Resiliency Through Renewable Energy Microgrids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Katherine H; DiOrio, Nicholas A; Cutler, Dylan S

    This paper describes a methodology to quantify the economic and resiliency benefit provided by renewable energy (RE) in a hybrid RE-storage-diesel microgrid. We present a case study to show how this methodology is applied to a multi-use/ multi-function telecommunications facility in southern California. In the case study, we first identify photovoltaic (PV) and battery energy storage system (BESS) technologies that minimize the lifecycle cost of energy at the site under normal, grid-connected operation. We then evaluate how those technologies could be incorporated alongside existing diesel generators in a microgrid to increase resiliency at the site, where resiliency is quantified in terms of the amount of time that the microgrid can sustain the critical load during a grid outage. We find that adding PV and BESS to the existing backup diesel generators with a fixed fuel supply extends the amount of time the site could survive an outage by 1.8 days, from 1.7 days for the existing diesel-only backup system to 3.5 days for the PV/diesel/BESS hybrid system. Furthermore, even after diesel fuel supplies are exhausted, the site can continue to operate critical loads during daytime hours using just the PV/BESS when there is sufficient solar resource. We find that the site can save approximately $100,000 in energy costs over the 25-year lifecycle while doubling the amount of time they can survive an outage. The methodology presented here provides a template for increasing resiliency at telecomm sites by implementing renewable energy solutions, which provide additional benefits of carbon emission reduction and energy cost savings.
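
    The resiliency metric, hours of critical load served during an outage, can be approximated with a simple hourly energy-balance simulation. The sketch below is a hedged illustration with hypothetical load, PV, battery, and fuel parameters; it is not the sizing or dispatch model used in the case study.

      import math

      # Hedged sketch of the resiliency metric: hours a PV/battery/diesel microgrid
      # can carry a critical load during a grid outage. All parameters are hypothetical.
      CRITICAL_LOAD_KW = 50.0
      PV_CAPACITY_KW = 120.0
      BATTERY_KWH = 200.0
      BATTERY_KW = 60.0                 # max charge/discharge rate
      DIESEL_KW = 80.0
      FUEL_GALLONS = 300.0
      GAL_PER_KWH = 0.07                # rough generator fuel burn

      def pv_output_kw(hour):
          """Crude clear-sky diurnal profile: zero at night, peak at solar noon."""
          h = hour % 24
          return max(0.0, PV_CAPACITY_KW * math.sin(math.pi * (h - 6) / 12)) if 6 <= h <= 18 else 0.0

      def survival_hours(max_hours=24 * 14):
          soc, fuel = BATTERY_KWH, FUEL_GALLONS
          for hour in range(max_hours):
              deficit = CRITICAL_LOAD_KW - pv_output_kw(hour)
              if deficit <= 0:                              # surplus PV charges the battery
                  soc = min(BATTERY_KWH, soc + min(-deficit, BATTERY_KW))
                  continue
              from_batt = min(deficit, BATTERY_KW, soc)     # battery covers what it can
              soc -= from_batt
              deficit -= from_batt
              if deficit > 0:                               # diesel covers the remainder
                  if fuel < deficit * GAL_PER_KWH or deficit > DIESEL_KW:
                      return hour                           # load can no longer be met
                  fuel -= deficit * GAL_PER_KWH
          return max_hours

      print(f"critical load sustained for ~{survival_hours() / 24:.1f} days")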

  8. Prioritization methodology for the decommissioning of nuclear facilities: a study case on the Iraq former nuclear complex.

    PubMed

    Jarjies, Adnan; Abbas, Mohammed; Monken Fernandes, Horst; Wong, Melanie; Coates, Roger

    2013-05-01

    There are a number of sites in Iraq that have been used for nuclear activities and that contain potentially significant amounts of radioactive waste; the principal nuclear site is Al-Tuwaitha. Many of these sites suffered substantial physical damage during the Gulf Wars and have been subjected to subsequent looting. All require decommissioning in order to ensure both radiological and non-radiological safety. However, it is not possible to undertake the decommissioning of all sites and facilities at the same time, so a prioritization methodology has been developed to aid the decision-making process. The methodology comprises three principal stages of assessment: i) a quantitative surrogate risk assessment, ii) a range of sensitivity analyses, and iii) the inclusion of qualitative modifying factors. A group of Tuwaitha facilities presented the highest risk among those evaluated, followed by a middle-ranking group of Tuwaitha facilities and some other sites, and a relatively large group of lower-risk facilities and sites. The initial order of priority changes when modifying factors are taken into account. Iraq's isolation from the international nuclear community over the last two decades and the resulting lack of experienced personnel must also be considered. It is therefore appropriate to initiate decommissioning operations on selected low-risk facilities at Tuwaitha in order to build capacity and prepare for work to be carried out in more complex and potentially high-hazard facilities. In addition, it is appropriate to initiate some prudent precautionary actions relating to some of the higher-risk facilities. Copyright © 2012 Elsevier Ltd. All rights reserved.
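
    The quantitative stage of such a prioritization can be sketched as a weighted surrogate-risk score followed by a qualitative adjustment. The example below is a hedged illustration; the facilities, scores, weights, and adjustment rule are assumptions, not the study's data.

      # Hedged sketch of a facility prioritization scheme in the spirit of the paper:
      # a surrogate risk score ranks facilities, then a qualitative modifying factor
      # (capacity building on a low-risk facility first) adjusts the work order.
      # All facilities, scores, and weights below are illustrative.

      facilities = {
          # name: (hazard inventory 1-5, exposure potential 1-5, facility condition 1-5)
          "Facility A (high hazard)": (5, 4, 5),
          "Facility B (waste store)": (4, 4, 3),
          "Facility C (outlying lab)": (2, 3, 2),
          "Facility D (low-activity shed)": (1, 2, 2),
      }
      weights = (0.5, 0.3, 0.2)

      def surrogate_risk(scores):
          return sum(w * s for w, s in zip(weights, scores))

      ranked = sorted(facilities, key=lambda f: surrogate_risk(facilities[f]), reverse=True)
      print("risk-ranked:", ranked)

      # Modifying factor: start with a low-risk facility to rebuild operational experience.
      first_job = ranked[-1]
      work_order = [first_job] + [f for f in ranked if f != first_job]
      print("adjusted work order:", work_order)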

  9. 10 CFR 960.3-2-2 - Nomination of sites as suitable for characterization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-2 Nomination of... of each repository site. For the second repository, at least three of the sites shall not have been nominated previously. Any site nominated as suitable for characterization for the first repository, but not...

  10. Ames expedited site characterization demonstration at the former manufactured gas plant site, Marshalltown, Iowa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bevolo, A.J.; Kjartanson, B.H.; Wonder, J.D.

    1996-03-01

    The goal of the Ames Expedited Site Characterization (ESC) project is to evaluate and promote both innovative technologies (IT) and state-of-the-practice technologies (SOPT) for site characterization and monitoring. In April and May 1994, the ESC project conducted site characterization, technology comparison, and stakeholder demonstration activities at a former manufactured gas plant (FMGP) owned by Iowa Electric Services (IES) Utilities, Inc., in Marshalltown, Iowa. Three areas of technology were fielded at the Marshalltown FMGP site: geophysical, analytical and data integration. The geophysical technologies are designed to assess the subsurface geological conditions so that the location, fate and transport of the target contaminants may be assessed and forecasted. The analytical technologies/methods are designed to detect and quantify the target contaminants. The data integration technology area consists of hardware and software systems designed to integrate all the site information compiled and collected into a conceptual site model on a daily basis at the site; this conceptual model then becomes the decision-support tool. Simultaneous fielding of different methods within each of the three areas of technology provided data for direct comparison of the technologies fielded, both SOPT and IT. This document reports the results of the site characterization, technology comparison, and ESC demonstration activities associated with the Marshalltown FMGP site. 124 figs., 27 tabs.

  11. Life-Cycle Cost/Benefit Assessment of Expedite Departure Path (EDP)

    NASA Technical Reports Server (NTRS)

    Wang, Jianzhong Jay; Chang, Paul; Datta, Koushik

    2005-01-01

    This report presents a life-cycle cost/benefit assessment (LCCBA) of Expedite Departure Path (EDP), an air traffic control Decision Support Tool (DST) currently under development at NASA. This assessment is an update of a previous study performed by bd Systems, Inc. (bd) during FY01, with the following revisions: The life-cycle cost assessment methodology developed by bd for the previous study was refined and calibrated using Free Flight Phase 1 (FFP1) cost information for Traffic Management Advisor (TMA, or TMA-SC in the FAA's terminology). Adjustments were also made to the site selection and deployment scheduling methodology to include airspace complexity as a factor. This technique was also applied to the benefit extrapolation methodology to better estimate potential benefits for other years, and at other sites. This study employed a new benefit-estimating methodology because bd's previous single-year potential benefit assessment of EDP used unrealistic assumptions that resulted in optimistic estimates. This methodology uses an air traffic simulation approach to reasonably predict the impacts of the implementation of EDP. The results of the cost and benefit analyses were then integrated into a life-cycle cost/benefit assessment.
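
    The discounting arithmetic underlying any life-cycle cost/benefit assessment reduces to present-value sums of annual cost and benefit streams. The sketch below is a hedged illustration with placeholder streams and discount rate, not the EDP assessment's actual inputs.

      # Hedged sketch of the discounting step in a life-cycle cost/benefit assessment.
      # The annual streams and discount rate are hypothetical placeholders.

      def present_value(stream, rate):
          """Discount an annual cash-flow stream (years 1..N) to present value."""
          return sum(v / (1 + rate) ** (t + 1) for t, v in enumerate(stream))

      years = 15
      rate = 0.07
      costs = [12.0e6] + [1.5e6] * (years - 1)         # acquisition then O&M, $/yr
      benefits = [0.0, 2.0e6] + [4.0e6] * (years - 2)  # delay/fuel savings ramp-up, $/yr

      pv_costs = present_value(costs, rate)
      pv_benefits = present_value(benefits, rate)
      print(f"NPV = ${pv_benefits - pv_costs:,.0f}")
      print(f"B/C = {pv_benefits / pv_costs:.2f}")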

  12. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Justin Leigh

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  13. M-TraCE: a new tool for high-resolution computation and statistical elaboration of backward trajectories on the Italian domain

    NASA Astrophysics Data System (ADS)

    Vitali, Lina; Righini, Gaia; Piersanti, Antonio; Cremona, Giuseppe; Pace, Giandomenico; Ciancarella, Luisella

    2017-12-01

    Air backward trajectory calculations are commonly used in a variety of atmospheric analyses, in particular for source attribution. The accuracy of backward trajectory analysis is mainly determined by the quality and the spatial and temporal resolution of the underlying meteorological data set, especially over complex terrain. This work describes a new tool for the calculation and statistical elaboration of backward trajectories. To take advantage of the high-resolution meteorological database of the Italian national air quality model MINNI, a dedicated set of procedures was implemented under the name M-TraCE (MINNI module for Trajectories Calculation and statistical Elaboration) to calculate and process the backward trajectories of air masses reaching a site of interest. Outcomes from the application of the methodology to the Italian Network of Special Purpose Monitoring Stations are shown to assess its strengths for the meteorological characterization of air quality monitoring stations. M-TraCE has demonstrated its capability to provide a detailed statistical assessment of the transport patterns and region of influence of the site under investigation, which is fundamental for correctly interpreting pollutant measurements and ascertaining the official classification of the monitoring site based on metadata information. Moreover, M-TraCE has shown its usefulness in supporting other assessments, i.e., the spatial representativeness of a monitoring site, focusing specifically on the analysis of the effects due to meteorological variables.
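
    The core of any backward-trajectory calculation is stepping an air parcel against the wind field back in time. The sketch below is a hedged illustration using an analytic toy wind field in place of the MINNI meteorological database; it is not the M-TraCE implementation.

      import numpy as np

      # Hedged sketch of backward-trajectory integration (not the M-TraCE code itself):
      # step an air parcel backwards in time through a wind field u(x, y, t), v(x, y, t).
      # The analytic wind field below is a stand-in for gridded model meteorology.

      def wind(x_km, y_km, t_h):
          """Toy rotating wind field (km/h); replace with interpolated model winds."""
          u = -0.05 * y_km + 5.0
          v = 0.05 * x_km + 2.0 * np.sin(2 * np.pi * t_h / 24.0)
          return u, v

      def backward_trajectory(x0_km, y0_km, t0_h, hours=72, dt_h=0.25):
          xs, ys = [x0_km], [y0_km]
          x, y, t = x0_km, y0_km, t0_h
          for _ in range(int(hours / dt_h)):
              u, v = wind(x, y, t)
              x -= u * dt_h          # move against the wind to go back in time
              y -= v * dt_h
              t -= dt_h
              xs.append(x)
              ys.append(y)
          return np.array(xs), np.array(ys)

      xs, ys = backward_trajectory(0.0, 0.0, t0_h=120.0)
      print(f"parcel 72 h earlier was ~{np.hypot(xs[-1], ys[-1]):.0f} km from the receptor")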

  14. Site Characterization Technologies for DNAPL Investigations

    EPA Pesticide Factsheets

    This document is intended to help managers at sites with potential or confirmed DNAPL contamination identify suitable characterization technologies, screen the technologies for potential application, learn about applications at similar sites, and...

  15. Improving surface-subsurface water budgeting for Brownfield study sites using high resolution satellite imagery

    NASA Astrophysics Data System (ADS)

    Dujardin, J.; Boel, S.; Anibas, C.; Batelaan, O.; Canters, F.

    2009-04-01

    Countries around the world have problems with contaminated brownfield sites resulting from relatively anarchic economic and industrial development during the 19th and 20th centuries. In recent decades, policy makers and stakeholders have become more aware of the risk posed by these sites because some of them present direct public hazards. Water is often the main vector of contaminant mobility. To propose remediation measures for contaminated sites, the surface and subsurface water fluxes at the polluted site must be described and quantified as accurately as possible. In this research a modelling approach with integrated remote sensing analysis has been developed for accurately calculating water and contaminant fluxes on polluted sites. Groundwater pollution in urban environments is linked to patterns of land use, so identifying the sources of contamination with high accuracy requires detailed characterization of the land cover. High-resolution spatial information is required because of the complexity of urban land use. An object-oriented classification approach applied to high-resolution satellite data was adopted. Cluster separability analysis and visual interpretation of the image objects belonging to each cluster resulted in the selection of 8 land-cover categories (water, bare soil, meadow, mixed forest, grey urban surfaces, red roofs, bright roofs, and shadow). To assign the image objects to one of the 8 selected classes, a multilayer perceptron (MLP) approach was adopted, using the NeuralWorks Predict software. After post-classification shadow removal and rule-based classification enhancement, a kappa value of 0.86 was obtained. Once the land cover was characterized, groundwater recharge was simulated using the spatially distributed WetSpass model, and subsurface water flow was simulated with GMS 6.0 in order to identify and budget the water fluxes on the brownfield. The obtained land-use map has a strong impact on the simulated groundwater recharge, resulting in high spatial variability. Simulated groundwater fluxes from the brownfield to a receiving river were independently verified by measurements and simulation of groundwater-surface water interaction based on thermal gradients in the river bed. It is concluded that, in order to better quantify total fluxes of contaminants from brownfields to the groundwater, remote sensing imagery can be operationally integrated in a modelling procedure. The developed methodology is applied to a case site in Vilvoorde, Brussels (Belgium).
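
    The classification accuracy quoted above is Cohen's kappa, computed from the confusion matrix of mapped versus reference land-cover labels. The sketch below shows that calculation on a hypothetical three-class matrix; it is not the study's accuracy assessment.

      import numpy as np

      # Hedged sketch: Cohen's kappa from a confusion matrix, the accuracy measure
      # quoted for the land-cover classification. The matrix below is illustrative.

      def cohens_kappa(cm):
          cm = np.asarray(cm, dtype=float)
          n = cm.sum()
          p_observed = np.trace(cm) / n
          p_expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
          return (p_observed - p_expected) / (1.0 - p_expected)

      # rows = reference class, columns = mapped class (e.g., water, bare soil, meadow)
      confusion = [
          [48, 1, 1],
          [2, 45, 3],
          [1, 4, 45],
      ]
      print(f"kappa = {cohens_kappa(confusion):.2f}")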

  16. 77 FR 33729 - Disability and Rehabilitation Research Projects and Centers Program-National Data and Statistical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-07

    ... statistical and other methodological consultation to this collaborative project. Discussion: Grantees under... and technical assistance must be designed to contribute to the following outcomes: (a) Maintenance of... methodological consultation available for research projects that use the BMS Database, as well as site- specific...

  17. DEVELOPMENT OF A SUB-SLAB AIR SAMPLING PROTOCOL TO SUPPORT ASSESSMENT OF VAPOR INTRUSION

    EPA Science Inventory

    The primary purpose of this research effort is to develop a methodology for sub-slab sampling to support the EPA guidance and vapor intrusion investigations after vapor intrusion has been established at a site. Methodologies for sub-slab air sampling are currently lacking in ref...

  18. The Impact of E-Skills on the Settlement of Iranian Refugees in Australia

    ERIC Educational Resources Information Center

    Shariati, Saeed; Armarego, Jocelyn; Sudweeks, Fay

    2017-01-01

    Aim/Purpose: The research investigates the impact of Information and Communication Technologies (ICT) on Iranian refugees' settlement in Australia. Background: The study identifies the issues of settlement, such as language, cultural and social differences. Methodology: The Multi-Sited Ethnography (MSE), which is a qualitative methodology, has…

  19. Employees' Perceptions of Barriers to Participation in Training and Development in Small Engineering Businesses

    ERIC Educational Resources Information Center

    Susomrith, Pattanee; Coetzer, Alan

    2015-01-01

    Purpose: This paper aims to investigate barriers to employee participation in voluntary formal training and development opportunities from the perspective of employees in small engineering businesses. Design/methodology/approach: An exploratory qualitative methodology involving data collection via site visits and in-depth semi-structured…

  20. Propellant Readiness Level: A Methodological Approach to Propellant Characterization

    NASA Technical Reports Server (NTRS)

    Bossard, John A.; Rhys, Noah O.

    2010-01-01

    A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.
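
    The abstract lists the characteristics evaluated but does not give a scoring rule; purely as an illustration of how such a readiness checklist could be encoded, a hypothetical sketch follows (the propellant name and completed items are invented).

```python
from dataclasses import dataclass, field

# Characteristics listed in the abstract; the checklist logic below is an
# invented illustration, not the authors' actual PRL definition.
CHARACTERISTICS = [
    "thermodynamic data", "toxicity", "applications", "combustion data",
    "heat transfer data", "material compatibility",
    "analytical prediction modeling", "injector/chamber geometry",
    "pressurization", "ignition", "combustion stability",
    "system storability", "qualification testing", "flight capability",
]

@dataclass
class PropellantAssessment:
    name: str
    completed: set = field(default_factory=set)

    def missing(self):
        """Characteristics for which no data has been gathered yet."""
        return [c for c in CHARACTERISTICS if c not in self.completed]

lox_rp1 = PropellantAssessment(
    "LOX/RP-1 (hypothetical example)",
    {"thermodynamic data", "toxicity", "combustion data", "ignition"},
)
print(lox_rp1.missing())
```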

  1. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    PubMed

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  2. Materials discovery guided by data-driven insights

    NASA Astrophysics Data System (ADS)

    Klintenberg, Mattias

    As computational power continues to grow, systematic computational exploration has become an important tool for materials discovery. In this presentation the Electronic Structure Project (ESP/ELSA) will be discussed and a number of examples presented that show some of the capabilities of a data-driven methodology for guiding materials discovery. These examples include topological insulators, detector materials and 2D materials. ESP/ELSA is an initiative that dates back to 2001 and today contains many tens of thousands of materials that have been investigated using a robust, high-accuracy electronic structure method (all-electron FP-LMTO), thus providing basic first-principles materials data for most inorganic compounds that have been structurally characterized. The web site containing the ESP/ELSA data has to date been accessed from more than 4,000 unique computers around the world.

  3. Protein-Protein Interface and Disease: Perspective from Biomolecular Networks.

    PubMed

    Hu, Guang; Xiao, Fei; Li, Yuqian; Li, Yuan; Vongsangnak, Wanwipa

    Protein-protein interactions are involved in many important biological processes and molecular mechanisms of disease association. Structural studies of interfacial residues in protein complexes provide information on protein-protein interactions. Characterizing protein-protein interfaces, including binding sites and allosteric changes, thus poses an imminent challenge. With special focus on protein complexes, approaches based on network theory are proposed to meet this challenge. In this review we pay attention to protein-protein interfaces from the perspective of biomolecular networks and their roles in disease. We first describe the different roles of protein complexes in disease through several structural aspects of interfaces. We then discuss some recent advances in predicting hot spots and communication pathway analysis in terms of amino acid networks. Finally, we highlight possible future aspects of this area with respect to both methodology development and applications for disease treatment.

  4. Thermal power systems small power systems applications project. Volume 2: Detailed report

    NASA Technical Reports Server (NTRS)

    Marriott, A. T.

    1979-01-01

    Small power system technology as applied to power plants up to 10 MW in size was considered. Markets for small power systems were characterized and cost goals were established for the project. Candidate power plant system design concepts were selected for evaluation, and preliminary performance and cost assessments were made. Breakeven capital costs were determined for leading contenders among the candidate systems. The potential use of small power systems in providing part of the pumping power demand of California's extensive aqueduct system was studied. Criteria and methodologies were developed for ranking candidate power plant system design concepts. Experimental power plant concepts of 1 MW rating were studied to define a power plant configuration for subsequent detailed design, construction, testing and evaluation. Site selection criteria and ground rules were developed.

  5. Evaluation of Site Effects Using Numerical and Experimental Analyses in Città di Castello (Italy)

    NASA Astrophysics Data System (ADS)

    Pergalani, F.; de Franco, R.; Compagnoni, M.; Caielli, G.

    In this paper the results of numerical and experimental analyses aimed at evaluating site effects at a site in the Umbria Region (Città di Castello, PG) are presented. The aim of the work was to compare the two types of analysis and to propose methodologies that may be used at the urban-planning level to account for these aspects. A series of geological, geomorphological (1:5,000 scale), geotechnical and seismic analyses was therefore carried out to identify the areas affected by local effects and to characterize the lithotechnical units. The expected seismic inputs were identified and 2D numerical analyses (Quad4M; Hudson et al., 1993) were performed. An experimental analysis, using recordings of small events, was also carried out. For both approaches the results were expressed in terms of elastic pseudo-acceleration spectra and amplification factors, computed as the ratio between the spectral intensities (Housner, 1952) of output and input, calculated from the pseudo-velocity spectra over the period ranges 0.1-0.5 s and 0.1-2.5 s. The results were analyzed and compared in order to provide a methodology that is both exhaustive and precise. The conclusions can be summarized as follows: the results of the two approaches are coherent; the approaches differ in that the numerical analysis is easy and quick but, in this case, the 2D analysis simplifies the real geometry, whereas the experimental analysis allows 3D conditions to be considered but, because the recorded events are of low energy, it cannot capture the non-linear behavior of the materials and it requires a recording period that depends on the seismicity of the region (one month to two years); integrating the two methodologies allows a complete analysis that exploits the advantages of both methods. Housner G.W., Spectrum Intensities of strong motion earthquakes. Proc. of the Symp. on Earth. and Blast Effects on Structures. Earth. Eng. Res. Inst., 1952. Hudson M.B., Idriss I.M., Beikae M. QUAD4M, A computer program for evaluating the seismic response of soil structure by variable damping finite element procedures. Report of Dip. of Civil & Env. Eng., University of California, Davis, 1993.
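
    The amplification factor described above is a ratio of Housner spectral intensities, i.e. integrals of the pseudo-velocity spectrum over a period range. A minimal sketch of that computation is given below; the spectra are synthetic placeholders, not data from the study.

```python
import numpy as np

def spectral_intensity(periods, psv, t_min, t_max):
    """Housner spectral intensity: integral of the pseudo-velocity
    spectrum PSV(T) over [t_min, t_max], via the trapezoid rule."""
    m = (periods >= t_min) & (periods <= t_max)
    t, v = periods[m], psv[m]
    return float(np.sum(0.5 * (v[1:] + v[:-1]) * np.diff(t)))

# Synthetic pseudo-velocity spectra (m/s) for input and output motions.
T = np.linspace(0.05, 3.0, 300)
psv_input = 0.10 * np.exp(-((T - 0.3) / 0.4) ** 2)
psv_output = 0.18 * np.exp(-((T - 0.5) / 0.5) ** 2)

for lo, hi in [(0.1, 0.5), (0.1, 2.5)]:
    af = spectral_intensity(T, psv_output, lo, hi) / spectral_intensity(T, psv_input, lo, hi)
    print(f"Amplification factor over {lo}-{hi} s: {af:.2f}")
```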

  6. Session: Pre-development project risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Richard; Linehan, Andy

    This second session at the Wind Energy and Birds/Bats workshop consisted of two presentations followed by a discussion/question and answer period. The focus of the presentations was on the practices and methodologies used in the wind energy industry for assessing risk to birds and bats at candidate project sites. Presenters offered examples of pre-development siting evaluation requirements set by certain states. Presentation one was titled "Practices and Methodologies and Initial Screening Tools" by Richard Curry of Curry and Kerlinger, LLC. Presentation two was titled "State of the Industry in the Pacific Northwest" by Andy Linehan, CH2MHILL.

  7. Review of Natural Phenomena Hazard (NPH) Assessments for the Hanford 200 Areas (Non-Seismic)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snow, Robert L.; Ross, Steven B.; Sullivan, Robin S.

    2010-09-24

    The purpose of this review is to assess the need for updating Natural Phenomena Hazard (NPH) assessments for the Hanford 200 Areas, as required by DOE Order 420.1B Chapter IV, Natural Phenomena Hazards Mitigation, based on significant changes in state-of-the-art NPH assessment methodology or site-specific information. The review includes all natural phenomena hazards with the exception of seismic/earthquake hazards, which are being addressed under a separate effort. It was determined that existing non-seismic NPH assessments are consistent with current design methodology and site-specific data.

  8. Why Do You Adopt Social Networking Sites? Investigating the Driving Factors through Structural Equation Modelling

    ERIC Educational Resources Information Center

    Jan, Muhammad Tahir

    2017-01-01

    Purpose: The purpose of this paper is to investigate those factors that are associated with the adoption of social networking sites from the perspective of Muslim users residing in Malaysia. Design/methodology/approach: A complete self-administered questionnaire was collected from 223 Muslim users of social networking sites in Malaysia. Both…

  9. Detailed Geophysical Fault Characterization in Yucca Flat, Nevada Test Site, Nevada

    USGS Publications Warehouse

    Asch, Theodore H.; Sweetkind, Donald S.; Burton, Bethany L.; Wallin, Erin L.

    2009-01-01

    Yucca Flat is a topographic and structural basin in the northeastern part of the Nevada Test Site (NTS) in Nye County, Nevada. Between the years 1951 and 1992, 659 underground nuclear tests took place in Yucca Flat; most were conducted in large, vertical excavations that penetrated alluvium and the underlying Cenozoic volcanic rocks. Radioactive and other potential chemical contaminants at the NTS are the subject of a long-term program of investigation and remediation by the U.S. Department of Energy (DOE), National Nuclear Security Administration, Nevada Site Office, under its Environmental Restoration Program. As part of the program, the DOE seeks to assess the extent of contamination and to evaluate the potential risks to humans and the environment from byproducts of weapons testing. To accomplish this objective, the DOE Environmental Restoration Program is constructing and calibrating a ground-water flow model to predict hydrologic flow in Yucca Flat as part of an effort to quantify the subsurface hydrology of the Nevada Test Site. A necessary part of calibrating and evaluating a model of the flow system is an understanding of the location and characteristics of faults that may influence ground-water flow. In addition, knowledge of fault-zone architecture and physical properties is a fundamental component of the containment of the contamination from underground nuclear tests, should such testing ever resume at the Nevada Test Site. The goal of the present investigation is to develop a detailed understanding of the geometry and physical properties of fault zones in Yucca Flat. This study was designed to investigate faults in greater detail and to characterize fault geometry, the presence of fault splays, and the fault-zone width. Integrated geological and geophysical studies have been designed and implemented to work toward this goal. This report describes the geophysical surveys conducted near two drill holes in Yucca Flat, the data analyses performed, and the integrated interpretations developed from the suite of geophysical methodologies utilized in this investigation. Data collection for this activity started in the spring of 2005 and continued into 2006. A suite of electrical geophysical surveys were run in combination with ground magnetic surveys; these surveys resulted in high-resolution subsurface data that portray subsurface fault geometry at the two sites and have identified structures not readily apparent from surface geologic mapping, potential field geophysical data, or surface effects fracture maps.

  10. ALTERNATIVES TO DUPLICATE DIET METHODOLOGY

    EPA Science Inventory

    Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...

  11. Evaluating IPv6 Adoption in the Internet

    NASA Astrophysics Data System (ADS)

    Colitti, Lorenzo; Gunderson, Steinar H.; Kline, Erik; Refice, Tiziana

    As IPv4 address space approaches exhaustion, large networks are deploying IPv6 or preparing for deployment. However, there is little data available about the quantity and quality of IPv6 connectivity. We describe a methodology to measure IPv6 adoption from the perspective of a Web site operator and to evaluate the impact that adding IPv6 to a Web site will have on its users. We apply our methodology to the Google Web site and present results collected over the last year. Our data show that IPv6 adoption, while growing significantly, is still low, varies considerably by country, and is heavily influenced by a small number of large deployments. We find that native IPv6 latency is comparable to IPv4 and provide statistics on IPv6 transition mechanisms used.
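
    The study's measurement infrastructure is not described in this record; as a rough, small-scale analogue, the sketch below times TCP connections to a dual-stack hostname over IPv4 and IPv6 using only the standard library (the hostname is a placeholder).

```python
import socket
import time

def connect_latency(host, port, family):
    """Resolve host for one address family and time a TCP connect.
    Returns seconds, or None if that family is unusable."""
    try:
        info = socket.getaddrinfo(host, port, family, socket.SOCK_STREAM)[0]
    except socket.gaierror:
        return None
    fam, socktype, proto, _canon, sockaddr = info
    start = time.monotonic()
    try:
        with socket.socket(fam, socktype, proto) as s:
            s.settimeout(5.0)
            s.connect(sockaddr)
    except OSError:
        return None
    return time.monotonic() - start

host = "www.example.com"  # placeholder dual-stack hostname
for label, fam in (("IPv4", socket.AF_INET), ("IPv6", socket.AF_INET6)):
    latency = connect_latency(host, 443, fam)
    print(label, "unreachable" if latency is None else f"{latency * 1000:.1f} ms")
```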

  12. Framework for Site Characterization for Monitored Natural Attenuation of Volatile Organic Compounds in Ground Water

    EPA Science Inventory

    Monitored Natural Attenuation (MNA) is unique among remedial technologies in relying entirely on natural processes to achieve site-specific objectives. Site characterization is essential to provide site-specific data and interpretations for the decision-making process (i.e., to ...

  13. A multicriteria-based methodology for site prioritisation in sediment management.

    PubMed

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selections of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas, according to their need for sediment management, provides a great opportunity for prioritisation, a first step in an integrated methodology that finally aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allows ranking of these management units, according to their need for remediation. This proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
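
    The ranking step rests on standard multicriteria aggregation. As a generic illustration (unit names, criteria, weights and scores are invented and do not come from the Bay of Santander case study), a simple weighted-sum ranking of management units could look like this:

```python
import numpy as np

# Hypothetical management units scored (0-1, higher = greater need for
# remediation) against three illustrative criteria.
units = ["Unit A", "Unit B", "Unit C", "Unit D"]
criteria = ["sediment quality", "social pressure", "economic cost of inaction"]
scores = np.array([
    [0.9, 0.4, 0.6],
    [0.5, 0.8, 0.3],
    [0.7, 0.6, 0.9],
    [0.2, 0.3, 0.4],
])
weights = np.array([0.5, 0.3, 0.2])  # criterion weights, must sum to 1

overall = scores @ weights
for name, value in sorted(zip(units, overall), key=lambda p: -p[1]):
    print(f"{name}: {value:.2f}")
```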

  14. Lawrence Livermore National Laboratory Site Seismic Safety Program: Summary of Findings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J B; Foxall, W

    The Lawrence Livermore National Laboratory (LLNL) Site Seismic Safety Program was conceived in 1979 during the preparation of the site Draft Environmental Impact Statement. The impetus for the program came from the development of new methodologies and geologic data that affect assessments of geologic hazards at the LLNL site; it was designed to develop a new assessment of the seismic hazard to the LLNL site and LLNL employees. Secondarily, the program was also intended to provide the technical information needed to make ongoing decisions about design criteria for future construction at LLNL and about the adequacy of existing facilities. This assessment was intended to be of the highest technical quality and to make use of the most recent and accepted hazard assessment methodologies. The basic purposes and objectives of the current revision are similar to those of the previous studies. Although all the data and experience assembled in the previous studies were utilized to their fullest, the large quantity of new information and new methodologies led to the formation of a new team that includes LLNL staff and outside consultants from academia and private consulting firms. A peer-review panel composed of individuals from academia (A. Cornell, Stanford University), the Department of Energy (DOE; Jeff Kimball), and consulting (Kevin Coppersmith) provided review and guidance. This panel was involved from the beginning of the project in a "participatory" type of review. The Senior Seismic Hazard Analysis Committee (SSHAC, a committee sponsored by the U.S. Nuclear Regulatory Commission, DOE, and the Electric Power Research Institute) strongly recommends the use of participatory reviews, in which the reviewers follow the progress of a project from the beginning, rather than waiting until the end to provide comments (Budnitz et al., 1997). Following the requirements for probabilistic seismic hazard analysis (PSHA) stipulated in the DOE standard DOE-STD-1023-95, a special effort was made to identify and quantify all types of uncertainties. The final seismic hazard estimates were de-aggregated to determine the contribution of all the seismic sources as well as the relative contributions of potential future earthquakes in terms of their magnitudes and distances from the site. It was found that, in agreement with previous studies, the Greenville Fault system contributes the most to the estimate of the seismic hazard expressed in terms of the probability of exceedance of the peak ground acceleration (PGA) at the center of the LLNL site (i.e., at high frequencies). It is followed closely by the Calaveras and Corral Hollow faults. The Mount Diablo thrust and the Springtown and Livermore faults were not considered in the hazard calculations in the 1991 study. In this study they contributed together approximately as much as the Greenville fault. At lower frequencies, more distant faults such as the Hayward and San Andreas faults begin to appear as substantial contributors to the total hazard. The results of this revision are presented in Figures 1 and 2. Figure 1 shows the estimated mean hazard curve in terms of the annual probability of exceedance of the peak ground acceleration (average of the two horizontal orthogonal components) at the LLNL site, assuming that the local site conditions are similar to those of a generic soil. Figure 2 shows the results in terms of the uniform hazard spectra (pseudo-spectral accelerations for 5% damping) for five return periods.
Although this latest revision is based on a completely independent and in many respects very different set of data and methodology from the previous one, it gives essentially the same results for the prediction of the peak ground acceleration (PGA), albeit with a reduced uncertainty. Because the Greenville fault is a dominant contributor to the hazard, a field investigation was performed to better characterize the probability distribution of the rate of slip on the fault. Samples were collected from a trench located on the northern segment of the Greenville fault and are in the process of being dated at the LLNL Center for Accelerator Mass Spectrometry (CAMS) using carbon-14. Preliminary results from the dating corroborate the range of values used in the hazard calculations. A final update after completion and qualification (quality assurance) of the dating measurements, in the near future, will finalize the distribution of this important parameter, probably using Bayesian updating.

  15. The Ohio River Basin energy facility siting model. Volume 1: Methodology

    NASA Astrophysics Data System (ADS)

    Fowler, G. L.; Bailey, R. E.; Gordon, S. I.; Jansen, S. D.; Randolph, J. C.; Jones, W. W.

    1981-04-01

    The siting model developed for ORBES is specifically designed for regional policy analysis. The region includes 423 counties in an area that consists of all of Kentucky and substantial portions of Illinois, Indiana, Ohio, Pennsylvania, and West Virginia.

  16. USDA-ARS Southeast Watershed Laboratory at Tifton, GA:Index Site Design for the Suwannee Basin

    NASA Astrophysics Data System (ADS)

    Bosch, D.; Strickland, T.; Sheridan, J.; Lowrance, R.; Truman, C.; Hubbard, R.; Potter, T.; Wauchope, D.; Vellidis, G.; Thomas, D.

    2001-12-01

    The Southeast Watershed Hydrology Research Center (SEWHRC) was established in 1966 by order of the U.S. Senate "to identify and characterize those elements that control the flow of water from watersheds in the southeast". A 129 sq.mi. area within the headwaters of Little River Watershed (LRW) in central south Georgia was instrumented to provide data for evaluating and characterizing Coastal Plain hydrologic processes and for development and testing of prediction methodologies for use in ungaged watersheds in regions of low topographic relief. Pesticide analytical capabilities were added in 1976, and inorganic chemistry and sediment transport research were expanded. In 1980, the Center was renamed as the Southeast Watershed Research Laboratory (SEWRL), and laboratories were constructed for nutrient analysis and soil physics. A pesticide analysis laboratory was constructed in 1987. In the early 1990s, a hydraulics laboratory was established for sediment and chemical transport studies, and research on riparian buffers was expanded. The SEWRL research program continues to focus on hydrologic and environmental concerns. Major components of the program are hydrology, pesticides behavior, buffer systems, animal waste management, erosion, remote sensing of watershed condition, and relationships between site-specific agricultural management (BMPs) and small-to-large watershed response. SEWRL's program will be expanded over the next five years to include two additional watersheds comparable in size and instrumentation to the LRW; nesting the LRW within the full Little River drainage and subsequently...all three watersheds within the full Suwannee Basin; and mapping and quantifying irrigation water removals within the Suwannee Basin. We will instrument the three intensive study watersheds and the full Suwannee Basin to provide real-time characterization of precipitation, soil moisture, hydrologic flow, and water quality at a range of spatial and temporal scales. We will couple this information with research on BMP improvement in order to evaluate the relationships between land use, weather and climate, water quantity, water quality, and the impacts of BMP implementation on agricultural profitability. The specific objectives of this expansion are to develop: (a) conceptual understanding of responses in natural resource and environmental systems based on physical, chemical, and biological processes; (b) methodologies to direct optimal use of soil and water resources in the production of quality food and fiber while maintaining short- and long-term productivity requirements, ecosystem stability, and environmental quality; and (c) models and information based systems to guide responsible management decisions for action and regulatory agencies at field, farm, and small and large watershed scales.

  17. Active Site Detection by Spatial Conformity and Electrostatic Analysis—Unravelling a Proteolytic Function in Shrimp Alkaline Phosphatase

    PubMed Central

    Chakraborty, Sandeep; Minda, Renu; Salaye, Lipika; Bhattacharjee, Swapan K.; Rao, Basuthkar J.

    2011-01-01

    Computational methods are increasingly gaining importance as an aid in identifying active sites. Mostly, these methods tend to use structural information that supplements sequence-conservation-based analyses. Development of tools that compute electrostatic potentials has further improved our ability to characterize the active site residues in proteins. We have described a computational methodology for detecting active sites based on structural and electrostatic conformity - CataLytic Active Site Prediction (CLASP). In our pipelined model, the physical 3D signature of any particular enzymatic function as defined by its active sites is used to obtain spatially congruent matches. While previous work has revealed that catalytic residues have large pKa deviations from standard values, we show that for a given enzymatic activity, the electrostatic potential difference (PD) between analogous residue pairs in an active site taken from different proteins of the same family is similar. False positives in spatially congruent matches are further pruned by PD analysis, where cognate pairs with large deviations are rejected. We first present the results of active site prediction by CLASP for two enzymatic activities - β-lactamases and serine proteases, two of the most extensively investigated enzymes. The results of CLASP analysis on motifs extracted from the Catalytic Site Atlas (CSA) are also presented in order to demonstrate its ability to accurately classify any protein, putative or otherwise, with known structure. The source code and database are made available at www.sanchak.com/clasp/. Subsequently, we probed alkaline phosphatases (AP), one of the well known promiscuous enzymes, for additional activities. Such a search has led us to predict a hitherto unknown function of shrimp alkaline phosphatase (SAP), where the protein acts as a protease. Finally, we present experimental evidence of the prediction by CLASP by showing that SAP indeed has protease activity in vitro. PMID:22174814

  18. 10 CFR 960.3-4 - Environmental impacts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REPOSITORY Implementation Guidelines § 960.3-4 Environmental impacts. Environmental impacts shall be considered by the DOE throughout the site characterization, site selection, and repository development..., during site characterization and repository construction, operation, closure, and decommissioning. ...

  19. SUMMARY OF TECHNIQUES AND UNIQUE USES FOR DIRECT PUSH METHODS IN SITE CHARACTERIZATION ON CONTAMINATED FIELD SITES

    EPA Science Inventory

    Site characterization of subsurface contaminant transport is often hampered by a lack of knowledge of site heterogeneity and temporal variations in hydrogeochemistry. Two case studies are reviewed to illustrate the utility of macro-scale mapping information along with spatially-...

  20. Background element content of the lichen Pseudevernia furfuracea: A supra-national state of art implemented by novel field data from Italy.

    PubMed

    Cecconi, Elva; Incerti, Guido; Capozzi, Fiore; Adamo, Paola; Bargagli, Roberto; Benesperi, Renato; Candotto Carniel, Fabio; Favero-Longo, Sergio Enrico; Giordano, Simonetta; Puntillo, Domenico; Ravera, Sonia; Spagnuolo, Valeria; Tretiach, Mauro

    2018-05-01

    In biomonitoring, knowledge of background element content (BEC) values is an essential prerequisite for the correct assessment of pollution levels. Here, we estimated the BEC values of a highly performing biomonitor, the epiphytic lichen Pseudevernia furfuracea, by means of a careful review of literature data integrated with an extensive field survey. Methodologically homogeneous element content datasets, reflecting different exposure conditions across European and extra-European countries, were compiled and comparatively analysed. Element content in samples collected in remote areas was compared to that of potentially enriched samples, testing differences between medians for 25 elements. This analysis confirmed that the former samples were substantially unaffected by anthropogenic contributions, and their metrics were therefore proposed as a first overview at supra-national background level. We also showed that bioaccumulation studies suffer from large methodological variability. Limited to the original field data, we investigated the background variability of 43 elements in 62 remote Italian sites, characterized in a GIS environment for anthropization, land use, climate and lithology at different scale resolutions. The relationships between selected environmental descriptors and BEC were tested using Principal Component Regression (PCR) modelling. Elemental composition was found to depend significantly on land use, climate and lithology. In the case of lithogenic elements, regression models correctly reproduced the lichen content throughout the country at randomly selected sites. Further descriptors should be identified only for As, Co, and V. Through a multivariate approach we also identified three geographically homogeneous macro-regions for which specific BECs were provided for use as reference in biomonitoring applications. Copyright © 2017 Elsevier B.V. All rights reserved.
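
    Principal Component Regression as mentioned above is a standard technique; a minimal sketch with scikit-learn is shown below, using synthetic data in place of the study's environmental descriptors and element contents.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-ins: 62 sites x 10 environmental descriptors
# (anthropization, land use, climate, lithology metrics) and one
# element content as the response variable.
X = rng.normal(size=(62, 10))
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.3, size=62)

# Principal Component Regression: standardize, project onto a few
# principal components, then fit an ordinary least-squares model.
pcr = make_pipeline(StandardScaler(), PCA(n_components=4), LinearRegression())
pcr.fit(X, y)
print("R^2 on training data:", round(pcr.score(X, y), 2))
```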

  1. Improving surface-subsurface water budgeting using high resolution satellite imagery applied on a brownfield.

    PubMed

    Dujardin, J; Batelaan, O; Canters, F; Boel, S; Anibas, C; Bronders, J

    2011-01-15

    The estimation of surface-subsurface water interactions is complex and highly variable in space and time. It is even more complex in urban areas because of their complex land-cover patterns. In this research a modeling approach with integrated remote sensing analysis has been developed for estimating water fluxes in urban environments. The methodology was developed with the aim of simulating fluxes of contaminants from polluted sites. Groundwater pollution in urban environments is linked to patterns of land use, and hence it is essential to characterize the land cover in detail. An object-oriented classification approach applied to high-resolution satellite data has been adopted. To assign the image objects to one of the land-cover classes a multilayer perceptron approach was adopted (Kappa of 0.86). Groundwater recharge has been simulated using the spatially distributed WetSpass model and the subsurface water flow using MODFLOW in order to identify and budget water fluxes. The developed methodology is applied to a brownfield case site in Vilvoorde, Brussels (Belgium). The obtained land use map has a strong impact on the groundwater recharge, resulting in a high spatial variability. Simulated groundwater fluxes from the brownfield to the receiving River Zenne were independently verified by measurements and simulation of groundwater-surface water interaction based on thermal gradients in the river bed. It is concluded that in order to better quantify total fluxes of contaminants from brownfields in the groundwater, remote sensing imagery can be operationally integrated in a modeling procedure. Copyright © 2010 Elsevier B.V. All rights reserved.

  2. Handbook of Reading Research.

    ERIC Educational Resources Information Center

    Pearson, P. David, Ed.; And Others

    Intended for reading educators and researchers, this handbook characterizes the current state of methodology and the cumulative research-based knowledge of reading. The book's three sections cover methodological issues, basic reading processes, and instructional practices. The 25 chapters discuss the following topics: (1) reading research history,…

  3. Fatigue Life Methodology for Bonded Composite Skin/Stringer Configurations

    NASA Technical Reports Server (NTRS)

    Krueger, Ronald; Paris, Isabelle L.; OBrien, T. Kevin

    2000-01-01

    A methodology is presented for determining the fatigue life of bonded composite skin/stringer structures based on delamination fatigue characterization data and geometric nonlinear finite element analyses. Results were compared to fatigue tests on stringer flange/skin specimens to verify the approach.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaFreniere, L. M.

    The Commodity Credit Corporation (CCC), an agency of the U.S. Department of Agriculture (USDA), operated grain storage facilities at two different locations at Everest, Kansas (Figure 1.1). One facility (referred to in this report as the Everest facility) was at the western edge of the city of Everest. The CCC/USDA operated this facility from 1950 until the early 1970s. The second facility (referred to in this report as Everest East) was about 0.5 mi northeast of the town. The CCC/USDA operated this facility from 1954 until the early 1970s. While these two former CCC/USDA grain storage facilities were in operation, commercial grain fumigants containing carbon tetrachloride were in common use by the CCC/USDA and the private grain storage industry to preserve grain. In 1997, the Kansas Department of Health and Environment (KDHE) sampled several domestic drinking water and nondrinking water wells in the Everest area. The KDHE sampling was part of the CCC/USDA Private Well Sampling Program, which was initiated to determine whether carbon tetrachloride was present in domestic wells near former CCC/USDA grain storage facilities in Kansas. All of the sampled domestic drinking water wells were located outside the Everest city boundaries. As a result of this sampling, carbon tetrachloride contamination was identified at a single domestic drinking water well (the Nigh well; DW06) approximately 3/8 mi northwest of the former Everest CCC/USDA grain storage facility. The CCC/USDA subsequently connected the Nigh residence to the Everest municipal water system. As a result of the detection of carbon tetrachloride in this well, the KDHE conducted preliminary investigations to further evaluate the existence of contamination and its potential effect on public health and the environment. The KDHE concluded that carbon tetrachloride in groundwater at Everest might, in part, be linked to historical use of carbon tetrachloride-based grain fumigants at the former CCC/USDA facilities. For this reason, the CCC/USDA is conducting an environmental site investigation to determine the source(s) and extent of the carbon tetrachloride contamination at Everest and to assess whether the contamination requires remedial action. The investigation at Everest is being performed by the Environmental Research Division of Argonne National Laboratory. Argonne is a nonprofit, multidisciplinary research center operated by the University of Chicago for the U.S. Department of Energy (DOE). The CCC/USDA has entered into an interagency agreement with DOE, under which Argonne provides technical assistance to the CCC/USDA with environmental site characterization and remediation at its former grain storage facilities. At these facilities, Argonne is applying its QuickSite® environmental site characterization methodology. This methodology has been applied successfully at a number of former CCC/USDA facilities in Kansas and Nebraska and has been adopted by the American Society for Testing and Materials (ASTM 1998) as standard practice for environmental site characterization. Phase I of the QuickSite® investigation examined the key geologic, hydrogeologic, and hydrogeochemical relationships that define potential contaminant migration pathways at Everest (Argonne 2001).
Phase II of the QuickSite® investigation at Everest was undertaken with the primary goal of delineating and improving understanding of the distribution of carbon tetrachloride contamination in groundwater at this site and the potential source area(s) that might have contributed to this contamination. To address this goal, four specific technical objectives were developed to guide the Phase II field studies. Sampling of near-surface soils at the former Everest CCC/USDA facility that was originally planned for Phase I had to be postponed until October 2000 because of access restrictions. Viable vegetation was not available for sampling then. This period is termed the first session of Phase II field work at Everest. The main session of field work for the Phase II QuickSite® investigation of the Everest site began on March 6, 2001. Work was suspended at the site on April 6, 2001, (1) because of access limitations to key properties, located north and west of the former CCC/USDA facility, imposed by the private owners at the onset of the spring planting season and (2) to permit further documentation by Argonne, at the request of the CCC/USDA, of the land use and ownership history of the Nigh property as a precursor to completion of the field work. This period is termed the second session of Phase II field work at Everest. Investigation of the Nigh property history was prompted by groundwater contamination evidence obtained during the second session of Phase II field activities (discussed in Section 3.7).

  5. Evaluation and implementation of an improved methodology for earthquake ground response analysis : uniform treatment source, path and site effects.

    DOT National Transportation Integrated Search

    2008-12-01

    Shortly after the 1994 Northridge Earthquake, Caltrans geotechnical engineers charged with developing site-specific : response spectra for high priority California bridges initiated a research project aimed at broadening their perspective : from simp...

  6. Weigh-in-motion (WIM) data for site-specific LRFR bridge load rating.

    DOT National Transportation Integrated Search

    2011-08-12

    The live load factors in the Load and Resistant Factor Rating (LRFR) Manual are based on load data from Ontario : thought to be representative of traffic volumes nationwide. However, in accordance with the methodology for : developing site-specific l...

  7. EPIC'S PRODUCTS AND SERVICES

    EPA Science Inventory

    EPIC completes approximately 150 site characterizations annually using current and historical aerial photographs. This work is done in support of EPA Regional and Program
    offices. Site characterization provides detailed information about a site and its history, often going ba...

  8. Remedial Action Plan and site design for stabilization of the inactive uranium mill tailings site at Durango, Colorado: Attachment 3, Groundwater hydrology report. Revised final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-12-01

    The US Environmental Protection Agency (EPA) has established health and environmental protection regulations to correct and prevent groundwater contamination resulting from processing activities at inactive uranium milling sites. According to the Uranium Mill Tailings Radiation Control Act of 1978 (UMTRCA), the US Department of Energy (DOE) is responsible for assessing the inactive uranium processing sites. The DOE has determined this assessment shall include information on hydrogeologic site characterization. The water resources protection strategy that describes how the proposed action will comply with the EPA groundwater protection standards is presented in Attachment 4. Site characterization activities discussed in this section include: characterization of the hydrogeologic environment; characterization of existing groundwater quality; definition of physical and chemical characteristics of the potential contaminant source; and description of local water resources.

  9. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy's Hanford Site

    NASA Astrophysics Data System (ADS)

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David

    2017-03-01

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site, defining different resource levels from 0 to 5. We also developed a risk rating scale from non-discernable to very high. Field assessment is the critical step to determine resource levels or to determine if current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy's sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy's sites for valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
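
    The abstract defines resource levels 0-5 and a qualitative risk scale but does not reproduce the mapping between them; the sketch below is an invented placeholder showing how such a lookup could be encoded, not the rule used at Hanford.

```python
# Invented placeholder mapping: the abstract defines resource levels 0-5 and a
# qualitative risk scale, but not the actual combination rule used at Hanford.
RISK_RATINGS = ["non-discernable", "low", "moderate", "high", "very high"]

def risk_rating(resource_level: int, disturbance_fraction: float) -> str:
    """Combine an ecological resource level (0-5) with the fraction of the
    site disturbed by remediation (0-1) into a qualitative risk rating."""
    if not 0 <= resource_level <= 5:
        raise ValueError("resource level must be between 0 and 5")
    score = resource_level / 5 * disturbance_fraction  # scaled to 0..1
    index = min(int(score * len(RISK_RATINGS)), len(RISK_RATINGS) - 1)
    return RISK_RATINGS[index]

print(risk_rating(4, 0.6))  # e.g. "moderate"
```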

  10. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy's Hanford Site.

    PubMed

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David

    2017-03-01

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site, defining different resource levels from 0 to 5. We also developed a risk rating scale from non-discernable to very high. Field assessment is the critical step to determine resource levels or to determine if current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy's sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy's sites for valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.

  11. 40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...

  12. 40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...

  13. 40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...

  14. 40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...

  15. 40 CFR 132.4 - State adoption and application of methodologies, policies and procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... (2) The chronic water quality criteria and values for the protection of aquatic life, or site... AGENCY (CONTINUED) WATER PROGRAMS WATER QUALITY GUIDANCE FOR THE GREAT LAKES SYSTEM § 132.4 State...) The Methodologies for Development of Aquatic Life Criteria and Values in appendix A of this part; (3...

  16. Assessing Hydrologic Impacts of Future Land Cover Change ...

    EPA Pesticide Factsheets

    Long‐term land‐use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologic impacts from future urban growth through time. This methodology was then expanded and utilized to characterize the changing hydrology on the South Platte River Basin. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land‐Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and describe a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin‐wide impacts of development on water‐quantity and ‐quality, 2) present initial results from the application of the methodology to
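
    The reclassification step (housing-density intervals to NLCD 2006 land-cover classes) can be sketched as a simple threshold lookup; the density breakpoints and class codes below are illustrative assumptions, not the actual ICLUS-to-NLCD crosswalk.

```python
import numpy as np

# Illustrative breakpoints (housing units per hectare) and NLCD 2006 class
# codes for the developed categories; the real ICLUS-to-NLCD crosswalk differs.
density_breaks = [0.1, 1.0, 4.0, 10.0]   # upper bounds of each density bin
nlcd_codes = [21, 22, 23, 24]            # developed: open space .. high intensity
undeveloped_code = 52                    # placeholder non-developed class (shrub/scrub)

def reclassify(housing_density):
    """Map a housing-density raster (2D array) to NLCD-style class codes."""
    bins = np.digitize(housing_density, density_breaks)  # bin index 0..4
    codes = np.array([undeveloped_code] + nlcd_codes)
    return codes[bins]

density = np.array([[0.0, 0.5], [3.0, 12.0]])
print(reclassify(density))  # -> [[52 21] [22 24]]
```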

  17. Health and safety impacts of nuclear, geothermal, and fossil-fuel electric generation in California. Volume 9. Methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nero, A.V.; Quinby-Hunt, M.S.

    1977-01-01

    This report sets forth methodologies for review of the health and safety aspects of proposed nuclear, geothermal, and fossil-fuel sites and facilities for electric power generation. The review is divided into a Notice of Intention process and an Application for Certification process, in accordance with the structure to be used by the California Energy Resources Conservation and Development Commission, the first emphasizing site-specific considerations, the second examining the detailed facility design as well. The Notice of Intention review is divided into three possible stages: an examination of emissions and site characteristics, a basic impact analysis, and an assessment of public impacts. The Application for Certification review is divided into five possible stages: a review of the Notice of Intention treatment, review of the emission control equipment, review of the safety design, review of the general facility design, and an overall assessment of site and facility acceptability.

  18. The Methodology of Interactive Parametric Modelling of Construction Site Facilities in BIM Environment

    NASA Astrophysics Data System (ADS)

    Kozlovská, Mária; Čabala, Jozef; Struková, Zuzana

    2014-11-01

    Information technology is becoming a strong tool in many industries, including construction. The recent trend in building design is toward the creation of a comprehensive virtual building model (Building Information Model) so that problems related to the project can be solved as early as the design phase. Building information modelling is a new way of approaching the documentation of building projects. Currently, the building site layout, as a part of the building design documents, has very little support in the BIM environment. Recent research into designing construction process conditions has centred on improving general planning practice and on new approaches to construction site layout planning. The state of the art in this field indicates an unexplored problem related to connecting a knowledge system with the construction site facilities (CSF) layout through interactive modelling. The goal of the paper is to present a methodology for building a 3D construction site facility allocation model (3D CSF-IAM), based on the principles of parametric and interactive modelling.

  19. Local crystal/chemical structures at iron sites in amorphous, magnetic, and nanocrystalline materials

    NASA Astrophysics Data System (ADS)

    Clark, Ted Michael

    Order-disorder phenomena have been examined by means of Mossbauer spectroscopy in a variety of materials, including (a) tektites and other silicate glasses, (b) magnetic materials such as natural and synthetic magnetoplumbite, M-type hexagonal ferrites and magnetite, and (c) nanocrystalline zinc ferrite. A methodology has been established for the analysis of the local crystal/chemical structures of iron in tektites, and its application has reconfirmed a low ferric/ferrous ratio of approximately 0.10 for tektites. Additionally, a greater degree of submicroscopic heterogeneity has been established for Muong Nong tektites in comparison with splash-form tektites. The dynamics of the 2b site in hexagonal ferrites has been studied above and below the Curie temperature for magnetoplumbite and its synthetic analogs, and also for polycrystalline and oriented single crystals of MeFe12O19 (Me = Ba, Sr, Pb). Cation ordering on this site is shown to depend on the thermal history of the material, while the dynamic disorder of the 2b site for the end-member hexagonal ferrites is shown to be influenced by the divalent heavy-metal species, Me. The influence of chemical composition on the morphology of magnetite has been shown to depend on the site preference of impurity cations: substitutional impurities with tetrahedral site preferences are postulated to result in the seldom-observed cubic habit. Based on the cation distributions of bulk and nanocrystalline material, the enhanced magnetic moments and susceptibilities of nanocrystalline zinc ferrite are shown to be consistent with surface phenomena, independent of synthesis methodology, and contrary to claims of special effects resulting from a particular synthesis methodology.

  20. An Educational System to Help Students Assess Website Features and Identify High-Risk Websites

    ERIC Educational Resources Information Center

    Kajiyama, Tomoko; Echizen, Isao

    2015-01-01

    Purpose: The purpose of this paper is to propose an effective educational system to help students assess Web site risk by providing an environment in which students can better understand a Web site's features and determine the risks of accessing the Web site for themselves. Design/methodology/approach: The authors have enhanced a prototype…

  1. Synthesis of bis-Phosphate Iminoaltritol Enantiomers and Structural Characterization with Adenine Phosphoribosyltransferase.

    PubMed

    Harris, Lawrence D; Harijan, Rajesh K; Ducati, Rodrigo G; Evans, Gary B; Hirsch, Brett M; Schramm, Vern L

    2018-01-19

    Phosphoribosyl transferases (PRTs) are essential in nucleotide synthesis and salvage, amino acid, and vitamin synthesis. Transition state analysis of several PRTs has demonstrated ribocation-like transition states with a partial positive charge residing on the pentose ring. Core chemistry for synthesis of transition state analogues related to the 5-phospho-α-d-ribosyl 1-pyrophosphate (PRPP) reactant of these enzymes could be developed by stereospecific placement of bis-phosphate groups on an iminoaltritol ring. Cationic character is provided by the imino group and the bis-phosphates anchor both the 1- and 5-phosphate binding sites. We provide a facile synthetic path to these molecules. Cyclic-nitrone redox methodology was applied to the stereocontrolled synthesis of three stereoisomers of a selectively monoprotected diol relevant to the synthesis of transition-state analogue inhibitors. These polyhydroxylated pyrrolidine natural product analogues were bis-phosphorylated to generate analogues of the ribocationic form of 5-phosphoribosyl 1-phosphate. A safe, high yielding synthesis of the key intermediate represents a new route to these transition state mimics. An enantiomeric pair of iminoaltritol bis-phosphates (L-DIAB and D-DIAB) was prepared and shown to display inhibition of Plasmodium falciparum orotate phosphoribosyltransferase and Saccharomyces cerevisiae adenine phosphoribosyltransferase (ScAPRT). Crystallographic inhibitor binding analysis of L- and D-DIAB bound to the catalytic sites of ScAPRT demonstrates accommodation of both enantiomers by altered ring geometry and bis-phosphate catalytic site contacts.

  2. Soil physical, chemical and gas-flux characterization from Picea mariana stands near Erickson Creek, Alaska

    USGS Publications Warehouse

    O'Donnell, Jonathan A.; Harden, Jennifer W.; Manies, Kristen L.

    2011-01-01

    Fire is a particularly important control on the carbon (C) balance of the boreal forest, and fire-return intervals and fire severity appear to have increased since the late 1900s in North America. In addition to the immediate release of stored C to the atmosphere through organic-matter combustion, fire also modifies soil conditions, possibly affecting C exchange between terrestrial and atmospheric pools for decades after the burn. The effects of fire on ecosystem C dynamics vary across the landscape, with topographic position and soil drainage functioning as important controls. The data reported here contributed to a larger U.S. Geological Survey (USGS) study, published in the journal Ecosystems by O'Donnell and others (2009). To evaluate the effects of fire and drainage on ecosystem C dynamics, we selected sample sites within the 2003 Erickson Creek fire scar to measure CO2 fluxes and soil C inventories in burned and unburned (control) sites in both upland and lowland black spruce (Picea mariana) forests. The results of this study suggested that although fire can create soil climate conditions which are more conducive to rapid decomposition, rates of C release from soils may be constrained after fire by changes in moisture and (or) substrate quality that impede rates of decomposition. Here, we report detailed site information, methodology, and data (in spreadsheet files) from that study.

  3. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    PubMed

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and in extrapolating soil concentrations to air concentrations. The Environmental Protection Agency's (EPA) Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations where it was not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than in discrete samples.
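
    As an aside on the variability comparison above: the relative standard deviation (RSD) is the statistic typically behind such statements. The sketch below is a generic illustration with hypothetical replicate values (not the study's data), computing RSDs for discrete and incremental replicates of a metal concentration:

        import statistics

        def relative_std_dev(values):
            """Relative standard deviation (coefficient of variation) in percent."""
            mean = statistics.mean(values)
            return 100.0 * statistics.stdev(values) / mean

        # Hypothetical replicate concentrations (mg/kg) for one decision unit.
        discrete_replicates = [12.0, 35.0, 8.5, 22.0, 54.0]   # grab samples
        ism_replicates = [21.0, 24.5, 19.8]                   # incremental samples

        print(f"Discrete RSD: {relative_std_dev(discrete_replicates):.0f}%")
        print(f"ISM RSD:      {relative_std_dev(ism_replicates):.0f}%")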

  4. pH-Dependent Surface Chemistry from First Principles: Application to the BiVO4(010)-Water Interface.

    PubMed

    Ambrosio, Francesco; Wiktor, Julia; Pasquarello, Alfredo

    2018-03-28

    We present a theoretical formulation for studying the pH-dependent interfacial coverage of semiconductor-water interfaces through ab initio electronic structure calculations, molecular dynamics simulations, and the thermodynamic integration method. This general methodology allows one to calculate the acidity of the individual adsorption sites on the surface and consequently the pH at the point of zero charge, pHPZC, and the preferential adsorption mode of water molecules, either molecular or dissociative, at the semiconductor-water interface. The proposed method is applied to study the BiVO4(010)-water interface and yields a pHPZC in excellent agreement with the experimental characterization. Furthermore, from the calculated pKa values of the individual adsorption sites, we construct an ab initio concentration diagram of all adsorbed species at the interface as a function of the pH of the aqueous solution. The diagram clearly illustrates the pH-dependent coverage of the surface and indicates that protons are found to be significantly adsorbed (∼1% of available sites) only in highly acidic conditions. The surface is found to be mostly covered by molecularly adsorbed water molecules in a wide interval of pH values ranging from 2 to 8. Hydroxyl ions are identified as the dominant adsorbed species at pH larger than 8.2.
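
    Given per-site acidity constants of the kind the paper computes, the pH-dependent site fractions follow from ordinary acid-base speciation. The sketch below is a generic amphoteric-site illustration with placeholder pKa values, not the authors' ab initio results:

        import numpy as np

        def site_fractions(pH, pKa1, pKa2):
            """Fractions of SOH2+, SOH, and SO- for a generic amphoteric surface site:
            SOH2+ <-> SOH + H+ (pKa1),  SOH <-> SO- + H+ (pKa2)."""
            k1 = 10.0 ** (pH - pKa1)
            k2 = 10.0 ** (pH - pKa2)
            denom = 1.0 + k1 + k1 * k2
            return 1.0 / denom, k1 / denom, k1 * k2 / denom

        # Illustrative pKa values (placeholders, not the paper's computed constants).
        for pH in np.arange(0.0, 14.1, 2.0):
            f_pos, f_neut, f_neg = site_fractions(pH, pKa1=1.0, pKa2=8.2)
            print(f"pH {pH:4.1f}:  SOH2+ {f_pos:.3f}   SOH {f_neut:.3f}   SO- {f_neg:.3f}")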

  5. Forensic source differentiation of petrogenic, pyrogenic, and biogenic hydrocarbons in Canadian oil sands environmental samples.

    PubMed

    Wang, Zhendi; Yang, C; Parrott, J L; Frank, R A; Yang, Z; Brown, C E; Hollebone, B P; Landriault, M; Fieldhouse, B; Liu, Y; Zhang, G; Hewitt, L M

    2014-04-30

    To facilitate monitoring efforts, a forensic chemical fingerprinting methodology has been applied to characterize and differentiate pyrogenic (combustion derived) and biogenic (organism derived) hydrocarbons from petrogenic (petroleum derived) hydrocarbons in environmental samples from the Canadian oil sands region. Between 2009 and 2012, hundreds of oil sands environmental samples including water (snowmelt water, river water, and tailings pond water) and sediments (from river beds and tailings ponds) have been analyzed. These samples were taken from sites where assessments of wild fish health, invertebrate communities, toxicology and detailed chemistry are being conducted as part of the Canada-Alberta Joint Oil Sands Monitoring Plan (JOSMP). This study describes the distribution patterns and potential sources of PAHs from these integrated JOSMP study sites, and findings will be linked to responses in laboratory bioassays and in wild organisms collected from these same sites. It was determined that hydrocarbons in Athabasca River sediments and waters were most likely from four sources: (1) petrogenic heavy oil sands bitumen; (2) biogenic compounds; (3) petrogenic hydrocarbons of other lighter fuel oils; and (4) pyrogenic PAHs. PAHs and biomarkers detected in snowmelt water samples collected near mining operations imply that these materials are derived from oil sands particulates (from open pit mines, stacks and coke piles).

  6. Characterizing Postural Sway during Quiet Stance Based on the Intermittent Control Hypothesis

    NASA Astrophysics Data System (ADS)

    Nomura, Taishin; Nakamura, Toru; Fukada, Kei; Sakoda, Saburo

    2007-07-01

    This article illustrates a signal processing methodology for the time series of postural sway and accompanying electromyographs from the lower limb muscles during quiet stance. It was shown that the proposed methodology was capable of identifying the underlying postural control mechanisms. A preliminary application of the methodology provided evidence that supports the intermittent control hypothesis as an alternative to the conventional stiffness control hypothesis during human quiet upright stance.

  7. Nanoparticle standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Havrilla, George Joseph

    2016-12-08

    We will purchase a COTS materials printer and adapt it for solution printing of known elemental concentration solutions. A methodology will be developed to create deposits of known mass in known locations on selected substrates. The deposits will be characterized for deposited mass, physical morphology, thickness and uniformity. Once an acceptable methodology has been developed and validated, we will create round robin samples to be characterized by LGSIMS instruments at LANL, PNNL and NIST. We will demonstrate the feasibility of depositing nanoparticles in known masses with the goal of creating separated nanoparticles in known locations.

  8. Track train dynamics analysis and test program: Methodology development for the derailment safety analysis of six-axle locomotives

    NASA Technical Reports Server (NTRS)

    Marcotte, P. P.; Mathewson, K. J. R.

    1982-01-01

    The operational safety of six-axle locomotives is analyzed. A locomotive model with corresponding data on suspension characteristics, a method of track defect characterization, and a method of characterizing operational safety are used. A user-oriented software package was developed as part of the methodology and was used to study the effect (on operational safety) of various locomotive parameters and operational conditions such as speed, tractive effort, and track curvature. The operational safety of three different locomotive designs was investigated.

  9. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  10. 10 CFR 960.3-2-2-4 - The environmental assessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the site-characterization activities at the site on public health and safety and the environment; a discussion of alternative activities related to site characterization that may be taken to avoid such impact; and an assessment of the regional and local impacts of locating a repository at the site. The draft...

  11. 10 CFR 960.3-2-3 - Recommendation of sites for characterization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Recommendation of sites for characterization. 960.3-2-3 Section 960.3-2-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-3 Recommendation of sites...

  12. 10 CFR 960.3-2-3 - Recommendation of sites for characterization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Recommendation of sites for characterization. 960.3-2-3 Section 960.3-2-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-3 Recommendation of sites...

  13. 10 CFR 960.3-2-3 - Recommendation of sites for characterization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Recommendation of sites for characterization. 960.3-2-3 Section 960.3-2-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-2-3 Recommendation of sites...

  14. SUMMARY OF TECHNIQUES AND UNIQUE USES FOR DIRECT PUSH METHODS IN SITE CHARACTERIZATION ON CONTAMINATED FIELD SITES

    EPA Science Inventory

    At many of the sites where we have been asked to assist in site characterization, we have discovered severe discrepancies that new technologies may be able to prevent. This presentation is designed to illustrate these new technologies or unique uses of existing technology and the...

  15. Proteomic Profiling of Rat Thyroarytenoid Muscle

    ERIC Educational Resources Information Center

    Welham, Nathan V.; Marriott, Gerard; Bless, Diane M.

    2006-01-01

    Purpose: Proteomic methodologies offer promise in elucidating the systemwide cellular and molecular processes that characterize normal and diseased thyroarytenoid (TA) muscle. This study examined methodological issues central to the application of 2-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis (2D SDS-PAGE) to the study of…

  16. Elastic plastic fracture mechanics methodology for surface cracks

    NASA Astrophysics Data System (ADS)

    Ernst, Hugo A.; Boatwright, D. W.; Curtin, W. J.; Lambert, D. M.

    1993-08-01

    The Elastic Plastic Fracture Mechanics (EPFM) Methodology has evolved significantly in the last several years. Nevertheless, some of these concepts need to be extended further before the whole methodology can be safely applied to structural parts. Specifically, there is a need to include the effect of constraint in the characterization of material resistance to crack growth and also to extend these methods to the case of 3D defects. As a consequence, this project was started as a 36 month research program with the general objective of developing an EPFM methodology to assess the structural reliability of pressure vessels and other parts of interest to NASA containing defects. This report covers a computer modelling algorithm used to simulate the growth of a semi-elliptical surface crack; the presentation of a finite element investigation that compared the theoretical (HRR) stress field to that produced by elastic and elastic-plastic models; and experimental efforts to characterize three dimensional aspects of fracture present in 'two dimensional', or planar configuration specimens.

  17. Elastic plastic fracture mechanics methodology for surface cracks

    NASA Technical Reports Server (NTRS)

    Ernst, Hugo A.; Boatwright, D. W.; Curtin, W. J.; Lambert, D. M.

    1993-01-01

    The Elastic Plastic Fracture Mechanics (EPFM) Methodology has evolved significantly in the last several years. Nevertheless, some of these concepts need to be extended further before the whole methodology can be safely applied to structural parts. Specifically, there is a need to include the effect of constraint in the characterization of material resistance to crack growth and also to extend these methods to the case of 3D defects. As a consequence, this project was started as a 36 month research program with the general objective of developing an EPFM methodology to assess the structural reliability of pressure vessels and other parts of interest to NASA containing defects. This report covers a computer modelling algorithm used to simulate the growth of a semi-elliptical surface crack; the presentation of a finite element investigation that compared the theoretical (HRR) stress field to that produced by elastic and elastic-plastic models; and experimental efforts to characterize three dimensional aspects of fracture present in 'two dimensional', or planar configuration specimens.

  18. DARPA ANTIBODY TECHNOLOGY PROGRAM STANDARDIZED TEST BED FOR ANTIBODY CHARACTERIZATION: CHARACTERIZATION OF TWO MS2 SCFV ANTIBODIES PRODUCED BY THE UNIVERSITY OF TEXAS

    DTIC Science & Technology

    2017-05-01

    Describes a quality program for the standardization of test methods to support comprehensive characterization and comparison of the physical and functional... (the report's contents include a Materials and Methods section with an SPR Methodology subsection).

  19. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology applied in the context of site-specific crop yield response contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.
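
    The economic side of site-specific nematicide application reduces to a per-zone comparison of expected yield loss value against treatment cost. A minimal sketch, assuming a linear damage function and invented prices, counts, and thresholds (none of these numbers come from the paper):

        # Hypothetical per-zone decision rule for site-specific nematicide use.
        crop_price = 0.55             # $/kg
        treatment_cost = 95.0         # $/ha (product + application)
        potential_yield = 3000.0      # kg/ha in the absence of nematode damage
        damage_per_nematode = 0.0004  # fractional yield loss per nematode/100 cm3 soil

        zones = {"A": 100, "B": 900, "C": 2400}  # nematode counts per 100 cm3 soil

        for zone, count in zones.items():
            loss_fraction = min(damage_per_nematode * count, 1.0)
            value_at_risk = crop_price * potential_yield * loss_fraction
            treat = value_at_risk > treatment_cost
            print(f"Zone {zone}: expected loss ${value_at_risk:,.0f}/ha -> "
                  f"{'treat' if treat else 'do not treat'}")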

  20. Incorporating hydrologic data and ecohydrologic relationships in ecological site descriptions

    USDA-ARS?s Scientific Manuscript database

    The purpose of this paper is to recommend a framework and methodology for inclusion of key ecohydrologic feedbacks and relationships in Ecological Site Descriptions (ESDs) and thereby enhance the utility of ESDs for assessing rangelands and guiding resilience-based management strategies. Resilience...

  1. An investigation of hydraulic conductivity estimation in a ground-water flow study of Northern Long Valley, New Jersey

    USGS Publications Warehouse

    Hill, Mary C.

    1985-01-01

    The purpose of this study was to develop a methodology to be used to investigate the aquifer characteristics and water supply potential of an aquifer system. In particular, the geohydrology of northern Long Valley, New Jersey, was investigated. Geohydrologic data were collected and analyzed to characterize the site. Analysis was accomplished by interpreting the available data and by using a numerical simulation of the water-table aquifer. Special attention was given to the estimation of hydraulic conductivity values and hydraulic conductivity structure, which together define the hydraulic conductivity of the modeled aquifer. Hydraulic conductivity and all other aspects of the system were first estimated using the trial-and-error method of calibration. The estimation of hydraulic conductivity was then improved by using a least-squares method to estimate hydraulic conductivity values and by improving the parameter structure. These efforts improved the calibration of the model far more than a preceding period of similar effort using the trial-and-error method of calibration. In addition, the proposed method provides statistical information on the reliability of estimated hydraulic conductivity values, calculated heads, and calculated flows. The methodology developed and applied in this work proved to be of substantial value in the evaluation of the aquifer considered.
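
    The least-squares calibration step described above can be sketched generically: adjust zonal hydraulic conductivities until simulated heads match observed heads. The toy model below uses a 1-D, two-zone confined aquifer as a stand-in forward model (the geometry, heads, and conductivities are illustrative, not the Long Valley model):

        import numpy as np
        from scipy.optimize import least_squares

        # --- Toy forward model: 1-D steady confined flow through two K zones ---
        L1, L2 = 400.0, 600.0          # zone lengths (m)
        h_left, h_right = 50.0, 42.0   # fixed boundary heads (m)
        x_obs = np.array([100.0, 300.0, 500.0, 700.0, 900.0])  # observation wells (m)

        def simulated_heads(log10_K):
            K1, K2 = 10.0 ** np.asarray(log10_K)
            q = (h_left - h_right) / (L1 / K1 + L2 / K2)   # Darcy flux per unit area
            return np.where(x_obs <= L1,
                            h_left - q * x_obs / K1,
                            h_left - q * L1 / K1 - q * (x_obs - L1) / K2)

        # Synthetic "observed" heads from a true model plus measurement noise.
        rng = np.random.default_rng(0)
        h_obs = simulated_heads(np.log10([30.0, 5.0])) + rng.normal(0, 0.05, x_obs.size)

        # Least-squares estimation of log10(K1), log10(K2) from head residuals.
        fit = least_squares(lambda p: simulated_heads(p) - h_obs, x0=[1.0, 1.0])
        print("Estimated K1, K2 (m/d):", 10.0 ** fit.x)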

  2. Surface Emissivity Maps for Satellite Retrieval of the Longwave Radiation Budget

    NASA Technical Reports Server (NTRS)

    Gupta, Shashi K.; Wilber, Anne C.; Kratz, David P.

    1999-01-01

    This paper presents a brief description of the procedure used to produce global surface emissivity maps for the broadband LW, the 8-12 micrometer window, and 12 narrow LW bands. For a detailed description of the methodology and the input data, the reader is referred to Wilber et al. (1999). These maps are based on a time-independent surface type map published by the IGBP, and laboratory measurements of spectral reflectances of surface materials. These maps represent a first attempt to characterize emissivity based on surface types, and many improvements to the methodology presented here are already underway. Effects of viewing zenith angle and sea state on the emissivity of ocean surface (Smith et al. 1996, Wu and Smith 1997, Masuda et al. 1988) will be taken into account. Measurements from ASTER and MODIS will be incorporated as they become available. Seasonal variation of emissivity based on changes in the characteristics of vegetation will be considered, and the variability of emissivity of barren land areas will be accounted for with the use of Zobler World Soil Maps (Zobler 1986). The current maps have been made available to the scientific community from the web site: http://tanalo.larc.nasa.gov:8080/surf_htmls/SARB_surf.html

  3. Environmental impact reduction through ecological planning at Bahia Magdalena, Mexico.

    PubMed

    Malagrino, Giovanni; Lagunas, Magdalena; Rubio, Alfredo Ortega

    2008-03-01

    To analyze basic marine and coastal characteristics, we selected the potential sites where shrimp culture could be developed in a large coastal zone, Bahia Magdalena, Baja California Sur, Mexico. Based on our analysis, 6 sites were preselected and field stages of work were then developed to assess the precise suitability of each site in order to develop the proposed aquaculture activities. In ranking the suitability we were able to recommend the most appropriate places to develop shrimp culture in this region. Also, knowing the exact biological, physico-chemical and social environment, we determined the best species to cultivate, the recommended total area and the methodology to be used to lessen the environmental impact and to obtain the maximum profitability. Our methodology could be used not only to select appropriate sites for shrimp culture in other coastal lagoons, but it could also be applied to assess, in a quick and accurate way, the suitability of any other production activity in coastal zones.

  4. Characterization of deformable materials in the THOR dummy

    DOT National Transportation Integrated Search

    2000-01-01

    Methodologies used to characterize the mechanical behavior of various materials used in the construction of the crash test dummy called THOR (Test device for Human Occupant Restraint) are described. These materials include polyurethane, neoprene, and...

  5. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; ONeal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2005-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial and moderate resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists are also using the SSC V&V site to characterize thermal infrared systems and active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  6. Hydrologic climate change impacts in the Columbia River Basin and their sensitivity to methodological choices

    NASA Astrophysics Data System (ADS)

    Chegwidden, O.; Nijssen, B.; Mao, Y.; Rupp, D. E.

    2016-12-01

    The Columbia River Basin (CRB) in the United States' Pacific Northwest (PNW) is highly regulated for hydropower generation, flood control, fish survival, irrigation and navigation. Historically it has had a hydrologic regime characterized by winter precipitation in the form of snow, followed by a spring peak in streamflow from snowmelt. Anthropogenic climate change is expected to significantly alter this regime, causing changes to streamflow timing and volume. While numerous hydrologic studies have been conducted across the CRB, the impact of methodological choices in hydrologic modeling has not been as heavily investigated. To better understand their impact on the spread in modeled projections of hydrological change, we ran simulations involving permutations of a variety of methodological choices. We used outputs from ten global climate models (GCMs) and two representative concentration pathways from the Intergovernmental Panel on Climate Change's Fifth Assessment Report. After downscaling the GCM output using three different techniques we forced the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS), both implemented at 1/16th degree (~5 km) for the period 1950-2099. For the VIC model, we used three independently-derived parameter sets. We will show results from the range of simulations, both in the form of basin-wide spatial analyses of hydrologic variables and through analyses of changes in streamflow at selected sites throughout the CRB. We will then discuss the differences in sensitivities to climate change seen among the projections, paying particular attention to differences in projections from the hydrologic models and different parameter sets.

  7. The 100-C-7 Remediation Project. An Overview of One of DOE's Largest Remediation Projects - 13260

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, Thomas C.; Strom, Dean; Beulow, Laura

    The U.S. Department of Energy Richland Operations Office (RL), U.S. Environmental Protection Agency (EPA) and Washington Closure Hanford LLC (WCH) completed remediation of one of the largest waste sites in the U.S. Department of Energy complex. The waste site, 100-C-7, covers approximately 15 football fields and was excavated to a depth of 85 feet (groundwater). The project team removed a total of 2.3 million tons of clean and contaminated soil, concrete debris, and scrap metal. 100-C-7 lies in Hanford's 100 B/C Area, home to historic B and C Reactors. The waste site was excavated in two parts as 100-C-7 and 100-C-7:1. The pair of excavations appear like pit mines. Mining engineers were hired to design their tiered sides, with safety benches every 17 feet and service ramps which allowed equipment access to the bottom of the excavations. The overall cleanup project was conducted over a span of almost 10 years. A variety of site characterization, excavation, load-out and sampling methodologies were employed at various stages of remediation. Alternative technologies were screened and evaluated during the project. A new method for cost effectively treating soils was implemented - resulting in significant cost savings. Additional opportunities for minimizing waste streams and recycling were identified and effectively implemented by the project team. During the final phase of cleanup the project team applied lessons learned throughout the entire project to address the final, remaining source of chromium contamination. The C-7 cleanup now serves as a model for remediating extensive deep zone contamination sites at Hanford. (authors)

  8. A Restoration Suitability Index Model for the Eastern Oyster (Crassostrea virginica) in the Mission-Aransas Estuary, TX, USA

    PubMed Central

    Beseres Pollack, Jennifer; Cleveland, Andrew; Palmer, Terence A.; Reisinger, Anthony S.; Montagna, Paul A.

    2012-01-01

    Oyster reefs are one of the most threatened marine habitats on earth, with habitat loss resulting from water quality degradation, coastal development, destructive fishing practices, overfishing, and storm impacts. For successful and sustainable oyster reef restoration efforts, it is necessary to choose sites that support long-term growth and survival of oysters. Selection of suitable sites is critically important as it can greatly influence mortality factors and may largely determine the ultimate success of the restoration project. The application of Geographic Information Systems (GIS) provides an effective methodology for identifying suitable sites for oyster reef restoration and removes much of the uncertainty involved in the sometimes trial and error selection process. This approach also provides an objective and quantitative tool for planning future oyster reef restoration efforts. The aim of this study was to develop a restoration suitability index model and reef quality index model to characterize locations based on their potential for successful reef restoration within the Mission-Aransas Estuary, Texas, USA. The restoration suitability index model focuses on salinity, temperature, turbidity, dissolved oxygen, and depth, while the reef quality index model focuses on abundance of live oysters, dead shell, and spat. Size-specific Perkinsus marinus infection levels were mapped to illustrate general disease trends. This application was effective in identifying suitable sites for oyster reef restoration, is flexible in its use, and provides a mechanism for considering alternative approaches. The end product is a practical decision-support tool that can be used by coastal resource managers to improve oyster restoration efforts. As oyster reef restoration activities continue at small and large-scales, site selection criteria are critical for assisting stakeholders and managers and for maximizing long-term sustainability of oyster resources. PMID:22792410
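
    A suitability index of this kind is typically a weighted combination of parameter scores rescaled to a common 0-1 range. The sketch below is a hypothetical illustration; the scoring ranges and equal weights are placeholders, not the published model:

        def score(value, optimal_range, tolerable_range):
            """Trapezoidal suitability score: 1 inside the optimal range,
            tapering to 0 at the edges of the tolerable range."""
            lo_t, hi_t = tolerable_range
            lo_o, hi_o = optimal_range
            if value <= lo_t or value >= hi_t:
                return 0.0
            if lo_o <= value <= hi_o:
                return 1.0
            if value < lo_o:
                return (value - lo_t) / (lo_o - lo_t)
            return (hi_t - value) / (hi_t - hi_o)

        # Illustrative criteria: (optimal range, tolerable range) per parameter.
        criteria = {
            "salinity_psu":  ((15, 25), (5, 35)),
            "temperature_C": ((20, 28), (10, 34)),
            "depth_m":       ((1, 3),   (0.5, 6)),
        }

        cells = [
            {"salinity_psu": 22, "temperature_C": 26, "depth_m": 2.0},
            {"salinity_psu": 8,  "temperature_C": 31, "depth_m": 5.0},
        ]

        for i, cell in enumerate(cells, start=1):
            scores = [score(cell[k], *criteria[k]) for k in criteria]
            suitability = sum(scores) / len(scores)   # equal weights for illustration
            print(f"Cell {i}: suitability index = {suitability:.2f}")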

  9. Local SPTHA through tsunami inundation simulations: a test case for two coastal critical infrastructures in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.

    2016-12-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to a seismic source. A large number of high-resolution inundation simulations is typically required for taking into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering for inundation calculations only a subset of 'important' scenarios. Here we use a method based on an event tree for the treatment of the seismic source aleatory variability; a cluster analysis on the offshore results to define the important sources; and epistemic uncertainty treatment through an ensemble modeling approach. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, obtaining a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e. uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and of the INGV-DPC Agreement.
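
    At its core, SPTHA aggregates scenario annual rates into a site-specific hazard curve: for each intensity threshold, sum the rates of scenarios whose modeled intensity exceeds it and convert the total to an exceedance probability over the exposure time (assuming Poissonian occurrence). The sketch below uses made-up scenario rates and flow depths, not the RITMARE/ASTARTE results:

        import numpy as np

        # Hypothetical scenarios: annual occurrence rate and simulated flow depth (m)
        # at the target site from the inundation model.
        rates = np.array([1e-3, 4e-4, 2e-4, 8e-5, 3e-5, 1e-5])
        flow_depth = np.array([0.2, 0.6, 1.1, 1.9, 3.2, 5.0])

        thresholds = np.array([0.5, 1.0, 2.0, 3.0, 4.0])   # hazard intensity thresholds (m)
        T = 50.0                                           # exposure time (years)

        for h in thresholds:
            total_rate = rates[flow_depth > h].sum()       # annual rate of exceedance
            p_exceed = 1.0 - np.exp(-total_rate * T)       # Poisson assumption
            print(f"Flow depth > {h:.1f} m: P(exceedance in {T:.0f} yr) = {p_exceed:.3f}")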

  10. Optimization of self-aligned double patterning (SADP)-compliant layout designs using pattern matching for sub-20nm metal routing

    NASA Astrophysics Data System (ADS)

    Wang, Lynn T.-N.; Schroeder, Uwe Paul; Madhavan, Sriram

    2017-03-01

    A pattern-based methodology for optimizing SADP-compliant layout designs is developed based on identifying cut mask patterns and replacing them with pre-characterized fixing solutions. A pattern-based library of difficult-to-manufacture cut patterns with pre-characterized fixing solutions is built. A pattern-based engine searches for matching patterns in the decomposed layouts. When a match is found, the engine opportunistically replaces the detected pattern with a pre-characterized fixing solution. The methodology was demonstrated on a 7nm routed metal2 block. A small library of 30 cut patterns increased the number of more manufacturable cuts by 38% and metal-via enclosure by 13% with a small parasitic capacitance impact of 0.3%.
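
    The workflow amounts to a library lookup: detect a known difficult cut pattern in the decomposed layout and substitute its pre-characterized fix. A toy search-and-replace sketch on a binary cut grid (the pattern and its fix are arbitrary placeholders, not real lithography rules):

        import numpy as np

        # Pattern library: each entry maps a difficult cut pattern to a
        # pre-characterized, more manufacturable replacement (placeholders).
        PATTERN_LIBRARY = [
            (np.array([[1, 1],
                       [1, 1]]),        # dense 2x2 cut cluster (hard to print)
             np.array([[1, 0],
                       [0, 1]])),       # staggered replacement
        ]

        def fix_cut_layout(grid):
            """Scan the cut grid and replace every library match in place."""
            grid = grid.copy()
            replacements = 0
            for pattern, fix in PATTERN_LIBRARY:
                ph, pw = pattern.shape
                for r in range(grid.shape[0] - ph + 1):
                    for c in range(grid.shape[1] - pw + 1):
                        if np.array_equal(grid[r:r + ph, c:c + pw], pattern):
                            grid[r:r + ph, c:c + pw] = fix
                            replacements += 1
            return grid, replacements

        layout = np.array([[0, 1, 1, 0],
                           [0, 1, 1, 0],
                           [0, 0, 0, 0]])
        fixed, n = fix_cut_layout(layout)
        print(f"{n} pattern(s) replaced:\n{fixed}")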

  11. Approaches for assessing risks to sensitive populations: Lessons learned from evaluating risks in the pediatric populations*

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a 'tool chest' of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of...

  12. Another Breakthrough, Another Baby Thrown out with the Bathwater

    ERIC Educational Resources Information Center

    Bell, David M.

    2009-01-01

    "Process-oriented pedagogy: facilitation, empowerment, or control?" claims that process-oriented pedagogy (POP) represents the methodological perspective of most practising teachers and that outcomes-based education (OBE) poses a real and present danger to stakeholder autonomy. Whereas POP may characterize methodological practices in the inner…

  13. USING A RISK-BASED METHODOLOGY FOR THE TRANSFER OF RADIOACTIVE MATERIAL WITHIN THE SAVANNAH RIVER SITE BOUNDARY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loftin, B; Watkins, R; Loibl, M

    2010-06-03

    Shipment of radioactive materials (RAM) is discussed in the Code of Federal Regulations in parts of both 49 CFR and 10 CFR. The regulations provide the requirements and rules necessary for the safe shipment of RAM across public highways, railways, waterways, and through the air. These shipments are sometimes referred to as in-commerce shipments. Shipments of RAM entirely within the boundaries of Department of Energy sites, such as the Savannah River Site (SRS), can be made using methodology allowing provisions to maintain equivalent safety while deviating from the regulations for in-commerce shipments. These onsite shipments are known as transfers at the SRS. These transfers must follow the requirements approved in a site-specific Transportation Safety Document (TSD). The TSD defines how the site will transfer materials so that they have equivalence to the regulations. These equivalences are documented in an Onsite Safety Assessment (OSA). The OSA can show how a particular packaging used onsite is equivalent to that which would be used for an in-commerce shipment. This is known as a deterministic approach. However, when a deterministic approach is not viable, the TSD allows for a risk-based OSA to be written. These risk-based assessments show that if a packaging does not provide the necessary safety to ensure that materials are not released (during normal or accident conditions) then the worst-case release of materials does not result in a dose consequence worse than that defined for the SRS. This paper will discuss recent challenges and successes using this methodology at the SRS.

  14. Integrated multi-parameters Probabilistic Seismic Landslide Hazard Analysis (PSLHA): the case study of Ischia island, Italy

    NASA Astrophysics Data System (ADS)

    Caccavale, Mauro; Matano, Fabio; Sacchi, Marco; Mazzola, Salvatore; Somma, Renato; Troise, Claudia; De Natale, Giuseppe

    2014-05-01

    The Ischia island is a large, complex, partly submerged, active volcanic field located about 20 km east of the Campi Flegrei, a major active volcano-tectonic area near Naples. The island is morphologically characterized in its central part by the resurgent block of Mt. Epomeo, controlled by NW-SE- and NE-SW-trending fault systems, by mountain stream basins with high relief energy, and by a heterogeneous coastline with alternating beaches and tuff/lava cliffs that is continuously reshaped by weathering and sea erosion. The volcano-tectonic process is a main factor for slope stability, as it produces seismic activity and has generated steep slopes in volcanic deposits (lava, tuff, pumice, and ash layers) of variable strength. In Campi Flegrei and the surrounding areas, the possible occurrence of a moderate-to-large seismic event represents a serious threat to the inhabitants, the infrastructure, and the environment. The most relevant seismic sources for Ischia are the Campi Flegrei caldera and a 5-km-long fault located below the island's north coast. Both sources are difficult to constrain: the first because its onshore and offshore extent is not yet completely defined, the second because it is characterized by only a few large historical events and is therefore hard to parameterize within a probabilistic hazard framework. The high population density and the presence of major infrastructure and relevant archaeological sites, together with the area's natural and artistic value, make this area a strategic natural laboratory for developing new methodologies. Moreover, Ischia is the only sector in the Campi Flegrei area with documented historical landslides triggered by earthquakes, which allows the adequacy and stability of the method to be tested. In the framework of the Italian project MON.I.C.A (infrastructural coastlines monitoring), an innovative and dedicated probabilistic methodology has been applied to identify the areas most susceptible to earthquake-induced landslides. The PSLHA combines probability-of-exceedance maps for different ground motion (GM) parameters with geological and geomorphological information, expressed in terms of critical acceleration and dynamic stability factor. Hazard maps are generally evaluated for peak ground acceleration, velocity, or intensity, which relate well to anthropic infrastructure (e.g., streets and buildings). Each ground motion parameter represents a different aspect of the hazard and correlates differently with potential damage. Many studies have pointed out that other GM parameters, such as Arias and Housner intensities and absolute displacement, may be a better choice for analyses such as cliff stability. The selection of the GM parameter is therefore of crucial importance for obtaining the most useful hazard maps. In recent decades, ground motion prediction equations for a new set of GM parameters have been published. Based on this information, a series of landslide hazard maps can be produced, leading to the identification of the areas with the highest probability of earthquake-induced landslides. At a strategic site like Ischia, these new methodologies represent an innovative and advanced tool for landslide hazard mitigation.
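
    The critical acceleration entering such susceptibility analyses is commonly taken from an infinite-slope, Newmark-type formulation: the pseudo-static acceleration at which the static factor of safety drops to one. The sketch below is a textbook simplification with hypothetical cell values, not the project's actual geotechnical model:

        import math

        g = 9.81  # m/s^2

        def critical_acceleration(factor_of_safety, slope_deg):
            """Newmark critical acceleration for an infinite slope:
            a_c = (FS - 1) * g * sin(slope)."""
            return (factor_of_safety - 1.0) * g * math.sin(math.radians(slope_deg))

        # Hypothetical map cells: static factor of safety, slope angle, and a
        # PGA value (m/s^2) taken from a probabilistic exceedance map.
        cells = [
            {"FS": 1.8, "slope_deg": 25, "pga": 2.0},
            {"FS": 1.2, "slope_deg": 35, "pga": 2.0},
        ]

        for i, cell in enumerate(cells, start=1):
            a_c = critical_acceleration(cell["FS"], cell["slope_deg"])
            unstable = cell["pga"] > a_c
            print(f"Cell {i}: a_c = {a_c:.2f} m/s^2 -> "
                  f"{'potentially unstable' if unstable else 'stable'} at PGA {cell['pga']} m/s^2")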

  15. 10 CFR 960.3-1-4-3 - Site recommendation for characterization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Site recommendation for characterization. 960.3-1-4-3 Section 960.3-1-4-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-1-4-3 Site recommendation...

  16. 10 CFR 960.3-1-4-3 - Site recommendation for characterization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Site recommendation for characterization. 960.3-1-4-3 Section 960.3-1-4-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-1-4-3 Site recommendation...

  17. 10 CFR 960.3-1-4-3 - Site recommendation for characterization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Site recommendation for characterization. 960.3-1-4-3 Section 960.3-1-4-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-1-4-3 Site recommendation...

  18. 10 CFR 960.3-1-4-3 - Site recommendation for characterization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Site recommendation for characterization. 960.3-1-4-3 Section 960.3-1-4-3 Energy DEPARTMENT OF ENERGY GENERAL GUIDELINES FOR THE PRELIMINARY SCREENING OF POTENTIAL SITES FOR A NUCLEAR WASTE REPOSITORY Implementation Guidelines § 960.3-1-4-3 Site recommendation...

  19. A methodology for using borehole temperature-depth profiles under ambient, single and cross-borehole pumping conditions to estimate fracture hydraulic properties

    NASA Astrophysics Data System (ADS)

    Klepikova, M.; Le Borgne, T.; Bour, O.; Lavenant, N.

    2011-12-01

    In fractured aquifers, flow generally takes place in a few fractured zones. The identification of these main flow paths is critical, as they control the transfer of fluids in the subsurface. Realistic modeling of the flow requires knowledge of the spatial variability of hydraulic properties. Inverse problems based on hydraulic head data are generally strongly underconstrained. A possible way of reducing the uncertainty is to combine different types of data, such as flow measurements, temperature profiles, or tracer test data. Here, we focus on the use of temperature, which can be seen as a natural tracer of groundwater flow. Previous studies used temperature anomalies to quantify vertical or horizontal regional groundwater flow velocities. Most of these studies assume that water in the borehole is stagnant and, thus, that the temperature profile in the well is representative of the temperature in the aquifer. In fractured media, differences in hydraulic head between flow paths connected to a borehole generally create ambient vertical flow within the borehole. These differences in hydraulic head are generally due to regional flow conditions. Estimation of borehole vertical flow is of interest because it can be used to derive large-scale hydraulic connections. In a single-borehole configuration, the estimated vertical flow can be used to estimate the local transmissivities and the hydraulic head differences driving the flow through the borehole. In a cross-borehole setup, it can be used to characterize hydraulic connections and estimate their hydraulic properties. Using a flow and heat transfer numerical model, we find that the slope of the temperature profile is related directly to the vertical borehole flow velocity. Thus, we propose a method to invert temperature measurements to derive borehole flow velocities and subsequently the fracture zone hydraulic and connectivity properties. The advantage of temperature measurements over flowmeter measurements is that temperature can be measured easily and very accurately, continuously in space and time. To test the methodology, we performed a field experiment at a crystalline-rock field site located in Ploemeur, Brittany (France). The site comprises three 100-m-deep boreholes located 6-10 m from each other. The experiment consisted of measuring the borehole temperature profiles under all possible pumping configurations; the pumping and monitoring wells were successively changed. The thermal response in the observation well induced by changes in pumping conditions is related to changes in vertical flow velocities and thus to the inter-borehole fracture connectivity. Based on this dataset, we propose a methodology to include temperature profiles in the inverse problem for characterizing the spatial distribution of fracture zone hydraulic properties.
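
    One classical way to turn a temperature-depth profile into a vertical flux estimate (not necessarily the inversion developed by the authors) is the Bredehoeft-Papadopulos steady advection-conduction solution: fit the curvature of T(z) over a depth interval to obtain a Peclet number, then convert it to a Darcy flux. A sketch with synthetic data and assumed thermal properties:

        import numpy as np
        from scipy.optimize import curve_fit

        L = 50.0                      # depth interval between boundary temperatures (m)
        T_top, T_bottom = 12.0, 14.5  # temperatures at the interval ends (deg C)
        lam = 2.0                     # bulk thermal conductivity (W/m/K), assumed
        rho_c_w = 4.18e6              # volumetric heat capacity of water (J/m^3/K)

        def bp_profile(z, peclet):
            """Bredehoeft-Papadopulos steady 1-D advection-conduction profile."""
            return T_top + (T_bottom - T_top) * np.expm1(peclet * z / L) / np.expm1(peclet)

        # Synthetic measurements generated with a "true" Peclet number plus noise.
        z = np.linspace(2.0, 48.0, 20)
        rng = np.random.default_rng(1)
        T_meas = bp_profile(z, 2.5) + rng.normal(0, 0.02, z.size)

        (peclet_fit,), _ = curve_fit(bp_profile, z, T_meas, p0=[1.0])
        q_z = peclet_fit * lam / (rho_c_w * L)      # Darcy flux (m/s), positive downward
        print(f"Fitted Peclet number: {peclet_fit:.2f}, vertical flux: {q_z:.2e} m/s")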

  20. Malaria vector populations across ecological zones in Guinea Conakry and Mali, West Africa.

    PubMed

    Coulibaly, Boubacar; Kone, Raymond; Barry, Mamadou S; Emerson, Becky; Coulibaly, Mamadou B; Niare, Oumou; Beavogui, Abdoul H; Traore, Sekou F; Vernick, Kenneth D; Riehle, Michelle M

    2016-04-08

    Malaria remains a pervasive public health problem in sub-Saharan West Africa. Here mosquito vector populations were explored across four sites in Mali and the Republic of Guinea (Guinea Conakry). The study samples the major ecological zones of malaria-endemic regions in West Africa within a relatively small distance. Mosquito vectors were sampled from larval pools, adult indoor resting sites, and indoor and outdoor human-host seeking adults. Mosquitoes were collected at sites spanning 350 km that represented arid savannah, humid savannah, semi-forest and deep forest ecological zones, in areas where little was previously known about malaria vector populations. 1425 mosquito samples were analysed by molecular assays to determine species, genetic attributes, blood meal sources and Plasmodium infection status. Anopheles gambiae and Anopheles coluzzii were the major anophelines represented in all collections across the ecological zones, with A. coluzzii predominant in the arid savannah and A. gambiae in the more humid sites. The use of multiple collection methodologies across the sampling sites allows assessment of potential collection bias of the different methods. The L1014F kdr insecticide resistance mutation (kdr-w) is found at high frequency across all study sites. This mutation appears to have swept almost to fixation, from low frequencies 6 years earlier, despite the absence of widespread insecticide use for vector control. Rates of human feeding are very high across ecological zones, with only small fractions of animal derived blood meals in the arid and humid savannah. About 30 % of freshly blood-fed mosquitoes were positive for Plasmodium falciparum presence, while the rate of mosquitoes with established infections was an order of magnitude lower. The study represents detailed vector characterization from an understudied area in West Africa with endemic malaria transmission. The deep forest study site includes the epicenter of the 2014 Ebola virus epidemic. With new malaria control interventions planned in Guinea, these data provide a baseline measure and an opportunity to assess the outcome of future interventions.

  1. Estimating forest species abundance through linear unmixing of CHRIS/PROBA imagery

    NASA Astrophysics Data System (ADS)

    Stagakis, Stavros; Vanikiotis, Theofilos; Sykioti, Olga

    2016-09-01

    The advancing technology of hyperspectral remote sensing offers the opportunity of accurate land cover characterization of complex natural environments. In this study, a linear spectral unmixing algorithm that incorporates a novel hierarchical Bayesian approach (BI-ICE) was applied on two spatially and temporally adjacent CHRIS/PROBA images over a forest in North Pindos National Park (Epirus, Greece). The scope is to investigate the potential of this algorithm to discriminate two different forest species (i.e. beech - Fagus sylvatica, pine - Pinus nigra) and produce accurate species-specific abundance maps. The unmixing results were evaluated in uniformly distributed plots across the test site using measured fractions of each species derived by very high resolution aerial orthophotos. Landsat-8 images were also used to produce a conventional discrete-type classification map of the test site. This map was used to define the exact borders of the test site and compare the thematic information of the two mapping approaches (discrete vs abundance mapping). The required ground truth information, regarding training and validation of the applied mapping methodologies, was collected during a field campaign across the study site. Abundance estimates reached very good overall accuracy (R2 = 0.98, RMSE = 0.06). The most significant source of error in our results was due to the shadowing effects that were very intense in some areas of the test site due to the low solar elevation during CHRIS acquisitions. It is also demonstrated that the two mapping approaches are in accordance across pure and dense forest areas, but the conventional classification map fails to describe the natural spatial gradients of each species and the actual species mixture across the test site. Overall, the BI-ICE algorithm presented increased potential to unmix challenging objects with high spectral similarity, such as different vegetation species, under real and not optimum acquisition conditions. Its full potential remains to be investigated in further and more complex study sites in view of the upcoming satellite hyperspectral missions.
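
    The linear mixing model behind such abundance maps treats each pixel spectrum as a non-negative, sum-to-one combination of endmember spectra. The sketch below is the generic fully constrained least-squares formulation with synthetic two-endmember spectra, not the hierarchical Bayesian BI-ICE algorithm itself:

        import numpy as np
        from scipy.optimize import nnls

        def fcls_abundances(pixel, endmembers, delta=1e3):
            """Fully constrained least-squares unmixing: non-negativity via NNLS,
            sum-to-one enforced softly by an appended, heavily weighted row."""
            E = np.vstack([endmembers.T, delta * np.ones(endmembers.shape[0])])
            y = np.append(pixel, delta)
            abund, _ = nnls(E, y)
            return abund

        rng = np.random.default_rng(0)
        endmembers = rng.uniform(0.05, 0.6, size=(2, 37))   # 2 endmembers x 37 CHRIS-like bands
        true_abund = np.array([0.7, 0.3])
        pixel = true_abund @ endmembers + rng.normal(0, 0.005, 37)

        print("Estimated abundances:", np.round(fcls_abundances(pixel, endmembers), 3))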

  2. Scaffold-Based Delivery of Autologous Mesenchymal Stem Cells for Mandibular Distraction Osteogenesis: Preliminary Studies in a Porcine Model

    PubMed Central

    Sun, Zongyang; Tee, Boon Ching; Kennedy, Kelly S.; Kennedy, Patrick M.; Kim, Do-Gyoon; Mallery, Susan R.; Fields, Henry W.

    2013-01-01

    Purpose Bone regeneration through distraction osteogenesis (DO) is promising but remarkably slow. To accelerate it, autologous mesenchymal stem cells have been directly injected to the distraction site in a few recent studies. Compared to direct injection, a scaffold-based method can provide earlier cell delivery with potentially better controlled cell distribution and retention. This pilot project investigated a scaffold-based cell-delivery approach in a porcine mandibular DO model. Materials and Methods Eleven adolescent domestic pigs were used for two major sets of studies. The in-vitro set established methodologies to: aspirate bone marrow from the tibia; isolate, characterize and expand bone marrow-derived mesenchymal stem cells (BM-MSCs); enhance BM-MSC osteogenic differentiation using FGF-2; and confirm cell integration with a gelatin-based Gelfoam scaffold. The in-vivo set transplanted autologous stem cells into the mandibular distraction sites using Gelfoam scaffolds; completed a standard DO-course and assessed bone regeneration by macroscopic, radiographic and histological methods. Repeated-measures ANOVAs and t-tests were used for statistical analyses. Results From aspirated bone marrow, multi-potent, heterogeneous BM-MSCs purified from hematopoietic stem cell contamination were obtained. FGF-2 significantly enhanced pig BM-MSC osteogenic differentiation and proliferation, with 5 ng/ml determined as the optimal dosage. Pig BM-MSCs integrated readily with Gelfoam and maintained viability and proliferative ability. After integration with Gelfoam scaffolds, 2.4–5.8 × 10^7 autologous BM-MSCs (undifferentiated or differentiated) were transplanted to each experimental DO site. Among 8 evaluable DO sites included in the final analyses, the experimental DO sites demonstrated less interfragmentary mobility, more advanced gap obliteration, higher mineral content and faster mineral apposition than the control sites, and all transplanted scaffolds were completely degraded. Conclusion It is technically feasible and biologically sound to deliver autologous BM-MSCs to the distraction site immediately after osteotomy using a Gelfoam scaffold to enhance mandibular DO. PMID:24040314

  3. Economic Analysis of a Multi-Site Prevention Program: Assessment of Program Costs and Characterizing Site-level Variability

    PubMed Central

    Corso, Phaedra S.; Ingels, Justin B.; Kogan, Steven M.; Foster, E. Michael; Chen, Yi-Fu; Brody, Gene H.

    2013-01-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95% confidence interval) incremental difference was $2149 ($397, $3901). With the probabilistic sensitivity analysis approach, the incremental difference was $2583 ($778, $4346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention. PMID:23299559

  4. Economic analysis of a multi-site prevention program: assessment of program costs and characterizing site-level variability.

    PubMed

    Corso, Phaedra S; Ingels, Justin B; Kogan, Steven M; Foster, E Michael; Chen, Yi-Fu; Brody, Gene H

    2013-10-01

    Programmatic cost analyses of preventive interventions commonly have a number of methodological difficulties. To determine the mean total costs and properly characterize variability, one often has to deal with small sample sizes, skewed distributions, and especially missing data. Standard approaches for dealing with missing data such as multiple imputation may suffer from a small sample size, a lack of appropriate covariates, or too few details around the method used to handle the missing data. In this study, we estimate total programmatic costs for a prevention trial evaluating the Strong African American Families-Teen program. This intervention focuses on the prevention of substance abuse and risky sexual behavior. To account for missing data in the assessment of programmatic costs we compare multiple imputation to probabilistic sensitivity analysis. The latter approach uses collected cost data to create a distribution around each input parameter. We found that with the multiple imputation approach, the mean (95 % confidence interval) incremental difference was $2,149 ($397, $3,901). With the probabilistic sensitivity analysis approach, the incremental difference was $2,583 ($778, $4,346). Although the true cost of the program is unknown, probabilistic sensitivity analysis may be a more viable alternative for capturing variability in estimates of programmatic costs when dealing with missing data, particularly with small sample sizes and the lack of strong predictor variables. Further, the larger standard errors produced by the probabilistic sensitivity analysis method may signal its ability to capture more of the variability in the data, thus better informing policymakers on the potentially true cost of the intervention.
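
    The probabilistic sensitivity analysis described in the two records above amounts to assigning a distribution to each cost input, sampling all of them jointly, and summarizing the resulting distribution of incremental cost. A hedged Monte Carlo sketch with invented cost components and distributions (not the trial's data):

        import numpy as np

        rng = np.random.default_rng(42)
        n_draws = 10_000

        def program_cost(draws):
            """Total per-site program cost as the elementwise sum of drawn inputs."""
            return sum(draws)

        # Hypothetical cost components ($ per site), each with its own distribution.
        intervention = {
            "staff":     rng.normal(9_000, 1_200, n_draws),
            "materials": rng.gamma(shape=4.0, scale=300.0, size=n_draws),
            "travel":    rng.triangular(500, 900, 2_000, n_draws),
        }
        control = {
            "staff":     rng.normal(7_200, 1_000, n_draws),
            "materials": rng.gamma(shape=3.0, scale=250.0, size=n_draws),
        }

        incremental = program_cost(intervention.values()) - program_cost(control.values())
        lo, hi = np.percentile(incremental, [2.5, 97.5])
        print(f"Incremental cost: mean ${incremental.mean():,.0f} "
              f"(95% interval ${lo:,.0f}-${hi:,.0f})")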

  5. Variation in teenage mothers' experiences of child care and other components of welfare reform: selection processes and developmental consequences.

    PubMed

    Yoshikawa, H; Rosman, E A; Hsueh, J

    2001-01-01

    Developmental evaluations of the current wave of welfare reform programs present challenges with regard to (1) assessing child outcomes; (2) accounting for heterogeneity among low-income families in both baseline characteristics and involvement in self-sufficiency activities and supports, and (3) development of alternatives to experimental approaches to causal inference. This study (N = 1,079) addresses these challenges by examining effects on 4- to 6-year-old children of different patterns of child care, self-sufficiency activities, and other service utilization indicators among experimental-group mothers in a 16-site welfare reform program. Outcomes in areas of cognitive ability and behavior problems were investigated. The study identified seven subgroups of participants engaging in different patterns of service utilization and activity involvement. A two-stage simultaneous equation methodology was used to account for selection, and effects on child cognitive ability of participation in specific patterns of services and activities were found. For example, children of mothers characterized by high levels of involvement in center-based child care, education, and job training showed higher levels of cognitive ability than children of mothers in groups characterized by high involvement in center-based care and education, or center-based care and job training. In addition, children of mothers in groups with high levels of involvement in any of these activities showed higher levels of cognitive ability than those with low levels of involvement. The bulk of selection effects occurred through site-level differences, rather than family-level socio-economic status or maternal depression indicators. Implications for welfare reform program and policy concerns are discussed.

  6. MX Siting Investigation. Prime Characterization Sites Central High Plains Candidate Siting Province.

    DTIC Science & Technology

    1979-02-15

    information obtained from these studies, in combination with data obtained in the Screening studies, has been used for geotechnical ranking (FN-TR-25). ...Plains Candidate Siting Province (CSP), one of six provinces included in the geotechnical Characterization studies. The location of the sites within... remaining after Intermediate Screening were divided into CSPs based on similar geotechnical characteristics. Intermediate Screening studies (FN-TR-17...

  7. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    ...depiction of the core ideas of our force design model (Figure 1: Description of Force Design Model). Figure 2 shows an overview of our methodology... the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for... designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach. We...

  8. An Evaluation of Public Preferences for Superfund Site Cleanup, Volume 1: A Preliminary Assessment (1995)

    EPA Pesticide Factsheets

    The purpose of the project is to develop a methodology for determining the cleanup options for National Priority List (NPL) sites governed under Superfund legislation that are acceptable to the public. A survey instrument was developed for the study.

  9. SENSITIVITY ANALYSIS OF THE APPLICATION OF CHEMICAL EXPOSURE CRITERIA FOR COMPARING SITES AND WATERSHEDS

    EPA Science Inventory

    A methodology was developed for deriving quantitative exposure criteria useful for comparing a site or watershed to a reference condition. The prototype method used indicators of exposures to oil contamination and combustion by-products, naphthalene and benzo(a)pyrene metabolites...

  10. Visual sensitivity of river recreation to power plants

    Treesearch

    David H. Blau; Michael C. Bowie

    1979-01-01

    The consultants were asked by the Power Plant Siting Staff of the Minnesota Environmental Quality Council to develop a methodology for evaluating the sensitivity of river-related recreational activities to visual intrusion by large coal-fired power plants. The methodology, which is applicable to any major stream in the state, was developed and tested on a case study...

  11. 78 FR 13742 - 60-Day Notice of Proposed Information Collection: Repatriation/Emergency Medical and Dietary...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-28

    ... You can search for the document by entering "Public Notice" in the Search bar. If necessary, use... the time and cost burden for this proposed collection, including the validity of the methodology and... Methodology: The Bureau of Consular Affairs will be posting this form on Department of State Web sites to give...

  12. Leveraging the Methodological Affordances of Facebook: Social Networking Strategies in Longitudinal Writing Research

    ERIC Educational Resources Information Center

    Sheffield, Jenna Pack; Kimme Hea, Amy C.

    2016-01-01

    While composition studies researchers have examined the ways social media are impacting our lives inside and outside of the classroom, less attention has been given to the ways in which social media--specifically Social Network Sites (SNSs)--may enhance our own research methods and methodologies by helping to combat research participant attrition…

  13. A Task-oriented Approach for Hydrogeological Site Characterization

    NASA Astrophysics Data System (ADS)

    Rubin, Y.; Nowak, W.; de Barros, F.

    2010-12-01

    Hydrogeological site characterization is a challenging task for several reasons: (1) the large spatial variability and the scarcity of prior information render the outcome of any planned sampling campaign uncertain; (2) there are no simple tools for comparing the many alternative measurement techniques and data acquisition strategies; and (3) data acquisition is subject to physical and budgetary constraints. This paper presents several ideas on how to plan sampling campaigns in a rational manner while addressing these challenges. The first idea is to recognize that different sites and different problems require different characterization strategies; hence data acquisition should be planned according to its capability for meeting site-specific goals. For example, the characterization needs of a “research” site (e.g., a site intended for investigating the transport of uranium in the subsurface, such as Hanford) differ from those of a “problem” site (e.g., a contaminated site associated with a health risk to humans, such as Camp Lejeune, or an aquifer whose safe yield must be determined). This distinction requires planners to define the characterization goal(s) in a quantitative manner. The second idea is to define metrics that link specific data types and data acquisition strategies to the site-specific goals in a way that allows planners to compare strongly different alternative strategies at the design stage (even prior to data acquisition) and to modify the strategies as more data become available. To meet this goal, we developed the concept of the (comparative) information yield curve. Finally, we propose to look at site characterization from the perspective of statistical hypothesis testing, whereby data acquisition strategies are evaluated in terms of their ability to support or refute various hypotheses made with regard to the characterization goals, and the strategies can be modified once the test is completed. Accept/reject regions for hypothesis testing can be determined based on goals set by regulation or by agreement among the stakeholders. Hypothesis-driven design can help minimize the chances of making a wrong decision (a false positive or false negative) with regard to the site-specific goals.

  14. Characterization of aggregates for sustainable freight transportation infrastructure.

    DOT National Transportation Integrated Search

    2011-01-01

    A novel method, X-ray computed tomography, has recently emerged as a powerful, nondestructive methodology for material characterization, including geomaterials. This method produces 3D images of the object that can be analyzed in various ways bas...

  15. [SciELO: method for electronic publishing].

    PubMed

    Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C

    2001-01-01

    It describes the SciELO (Scientific Electronic Library Online) Methodology for electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles that guided the methodology's development, its application in the building of the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals that should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible and comprehensive solution for scientific electronic publishing.

  16. Behavioral data of thin-film single junction amorphous silicon (a-Si) photovoltaic modules under outdoor long term exposure

    PubMed Central

    Kichou, Sofiane; Silvestre, Santiago; Nofuentes, Gustavo; Torres-Ramírez, Miguel; Chouder, Aissa; Guasch, Daniel

    2016-01-01

    Four years' behavioral data of thin-film single-junction amorphous silicon (a-Si) photovoltaic (PV) modules installed at a relatively dry and sunny inland site with a Continental-Mediterranean climate (in the city of Jaén, Spain) are presented in this article. The shared data help clarify how Light Induced Degradation (LID) impacts the output power generated by the PV array, especially in the first days of exposure under outdoor conditions. Furthermore, a valuable methodology is provided in this data article permitting the assessment of the degradation rate and the stabilization period of the PV modules. Further discussions and interpretations concerning the data shared in this article can be found in the research paper “Characterization of degradation and evaluation of model parameters of amorphous silicon photovoltaic modules under outdoor long term exposure” (Kichou et al., 2016) [1]. PMID:26977439
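    As a rough illustration of the kind of degradation-rate assessment such a dataset supports, the sketch below fits a linear trend to a synthetic monthly performance-ratio series after an assumed stabilization period. The series, the 12-month stabilization cutoff, and the fitting choices are assumptions for illustration only, not the procedure of the data article.

      import numpy as np

      # Hypothetical monthly performance-ratio series for an a-Si PV array:
      # strong light-induced degradation (LID) in the first months, then a slow
      # long-term decline plus a small seasonal oscillation.
      months = np.arange(48)
      perf_ratio = (0.74 + 0.06 * np.exp(-months / 3.0)          # LID transient
                    - 0.006 * (months / 12.0)                     # long-term decline
                    + 0.01 * np.sin(2 * np.pi * months / 12.0))   # seasonality

      STABLE_AFTER = 12  # assumed stabilization period in months (an assumption)

      # Linear fit on the post-stabilization data; the slope is the long-term
      # degradation in performance-ratio units per month.
      m, pr = months[STABLE_AFTER:], perf_ratio[STABLE_AFTER:]
      slope, intercept = np.polyfit(m, pr, 1)
      pr_start = intercept + slope * STABLE_AFTER
      print(f"degradation rate ~ {100 * 12 * slope / pr_start:+.2f} %/year after month {STABLE_AFTER}")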

  17. Model Test Bed for Evaluating Wave Models and Best Practices for Resource Assessment and Characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neary, Vincent Sinclair; Yang, Zhaoqing; Wang, Taiping

    A wave model test bed is established to benchmark, test and evaluate spectral wave models and modeling methodologies (i.e., best practices) for predicting the wave energy resource parameters recommended by the International Electrotechnical Commission, IEC TS 62600-101 Ed. 1.0 ©2015. Among other benefits, the model test bed can be used to investigate the suitability of different models, specifically what source terms should be included in spectral wave models under different wave climate conditions and for different classes of resource assessment. The overarching goal is to use these investigations to provide industry guidance for model selection and modeling best practices depending on the wave site conditions and desired class of resource assessment. Modeling best practices are reviewed, and limitations and knowledge gaps in predicting wave energy resource parameters are identified.

  18. LANDSAT applications to wetlands classification in the upper Mississippi River Valley. Ph.D. Thesis. Final Report

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Werth, L. F. (Principal Investigator)

    1980-01-01

    A 25% improvement in average classification accuracy was realized by processing double-date vs. single-date data. Under the spectrally and spatially complex site conditions characterizing the geographical area used, further improvement in wetland classification accuracy is apparently precluded by the spectral and spatial resolution restrictions of the LANDSAT MSS. Full scene analysis of scanning densitometer data extracted from scale infrared photography failed to permit discrimination of many wetland and nonwetland cover types. When classification of photographic data was limited to wetland areas only, much more detailed and accurate classification could be made. The integration of conventional image interpretation (to simply delineate wetland boundaries) and machine assisted classification (to discriminate among cover types present within the wetland areas) appears to warrant further research to study the feasibility and cost of extending this methodology over a large area using LANDSAT and/or small scale photography.

  19. Study for the selection of optimal site in northeastern, Mexico for wind power generation using genetic algorithms.

    NASA Astrophysics Data System (ADS)

    Gonzalez, T.; Ruvalcaba, A.; Oliver, L.

    2016-12-01

    The generation of electricity from renewable resources has acquired a leading role. Mexico in particular has great interest in renewable natural resources for power generation, especially wind energy, and the country is rapidly moving into the development of wind power generation sites. The development of a wind site as an energy project does not follow a standardized methodology; techniques vary according to the developer selecting the best place to install a wind turbine system. Generally, developers consider three key factors: 1) the characteristics of the wind, 2) the potential distribution of electricity, and 3) transport access to the site. This paper presents a study with a different methodology, carried out in two stages. The first, at regional scale, uses "space" and "natural" criteria to select a region: cartographic features such as the political and physiographic divisions, the location of natural conservation areas, water bodies, and urban criteria; and natural criteria such as the amount and direction of the wind, land type and use, vegetation, topography, and the biodiversity of the site. The application of these criteria yields a first optimal selection area. The second stage of the methodology includes criteria and variables at a detailed scale. The analysis of all the information collected will provide new parameters (decision variables) for the site. The overall analysis, based on these criteria, indicates that the best location for the wind field would be southern Coahuila and the central part of Nuevo Leon. The wind power site will contribute to the economic growth of important cities, including Monterrey. Finally, a computational genetic algorithm model will be used as a tool to determine the best site selection depending on the parameters considered.
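    Because the abstract only names the genetic algorithm as a planned tool, the following sketch shows one generic way such a site selection could be set up: candidate raster cells are scored with a weighted fitness built from hypothetical wind, grid-access and road-access layers, and a simple genetic algorithm searches for the best cell. The layers, weights and algorithm settings are illustrative assumptions, not those of the study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical 50 x 50 raster of normalized criteria (values in [0, 1]) for a
      # candidate region; real inputs would come from the wind, land-use, grid and
      # road layers compiled in the GIS stage described above.
      SIZE = 50
      wind = rng.random((SIZE, SIZE))            # normalized mean wind power density
      grid_access = rng.random((SIZE, SIZE))     # proximity to transmission lines
      road_access = rng.random((SIZE, SIZE))     # proximity to roads
      excluded = rng.random((SIZE, SIZE)) < 0.1  # conservation areas, water bodies, towns

      WEIGHTS = (0.6, 0.25, 0.15)  # illustrative decision weights, not the study's

      def fitness(cells):
          """Weighted suitability of an array of (row, col) cells; excluded cells score 0."""
          r, c = cells[:, 0], cells[:, 1]
          score = (WEIGHTS[0] * wind[r, c] + WEIGHTS[1] * grid_access[r, c]
                   + WEIGHTS[2] * road_access[r, c])
          return np.where(excluded[r, c], 0.0, score)

      # Minimal genetic algorithm: each individual is one candidate cell (row, col).
      pop = rng.integers(0, SIZE, size=(200, 2))
      for _ in range(100):
          keep = pop[np.argsort(fitness(pop))[-100:]]              # selection
          moms = keep[rng.integers(0, 100, 100)]
          dads = keep[rng.integers(0, 100, 100)]
          children = np.column_stack([moms[:, 0], dads[:, 1]])     # crossover of coordinates
          mutate = rng.random(children.shape) < 0.2                # random jumps explore the raster
          children[mutate] = rng.integers(0, SIZE, int(mutate.sum()))
          pop = np.vstack([keep, children])

      best = pop[np.argmax(fitness(pop))]
      print("best candidate cell (row, col):", tuple(best),
            "score:", round(float(fitness(best[None, :])[0]), 3))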

  20. Land-Energy Nexus: Life Cycle Land Use of Natural Gas-Fired Electricity

    NASA Astrophysics Data System (ADS)

    Heath, G.; Jordaan, S.; Macknick, J.; Mohammadi, E.; Ben-Horin, D.; Urrea, V.

    2014-12-01

    Comparisons of the land required for different types of energy are challenging because the upstream land use of fossil fuel technologies is not well characterized. This research focuses on improving estimates of the life cycle land use of natural gas-fired electricity through the novel combination of inventories of the locations of natural gas-related infrastructure, satellite imagery analysis, and gas production data. Land area per unit generation is calculated as the sum of the land areas of the natural gas life cycle stages divided by the throughput of natural gas, plus the land use of the power plant divided by the generation of the power plant. Five natural gas life cycle stages are evaluated for their area: production, gathering, processing, transmission and disposal. The power plant stage is characterized by a thermal efficiency ηth, which converts megajoules (MJ) of fuel to kilowatt-hours (kWh) of electricity. We focus on seven counties in the Barnett Shale region in Texas that represent over 90% of total Barnett Shale gas production. In addition to assessing the gathering and transmission pipeline network, approximately 500 sites are evaluated across the five life cycle stages plus power plants. For instance, assuming a 50-foot right-of-way for transmission pipelines, this part of the Barnett pipeline network occupies nearly 26,000 acres. Site, road and water components of total area are categorized. Methods are developed to scale up sampled results for each component type to the full population of sites within the Barnett. Uncertainty and variability are characterized. Well-level production data are examined by integrating commercial datasets with advanced methods for quantifying the estimated ultimate recovery (EUR) of wells, which is then summed to estimate the natural gas produced in the entire play. Wells that are spatially coincident are merged using ArcGIS. All other sites are normalized by an estimate of gas throughput. Prior land use estimates are used to validate the satellite imagery analysis. Results of this research provide a step towards better quantifying the land footprint of energy production activities and a methodologically consistent baseline from which more robust comparisons with alternative energy choices can be made.
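    The land-per-unit-generation bookkeeping described above can be written as a short calculation. The values below are placeholders chosen only to show the unit handling (upstream land per MJ of gas throughput, power-plant land per kWh generated, and a thermal efficiency linking MJ of fuel to kWh of electricity, with 1 kWh = 3.6 MJ); they are not results from the Barnett analysis.

      # Upstream natural gas stages: disturbed land area divided by the lifetime
      # gas throughput attributable to that land (hypothetical values, m^2 per MJ).
      upstream_m2_per_MJ = {
          "production": 2.0e-6,
          "gathering": 8.0e-7,
          "processing": 3.0e-7,
          "transmission": 1.2e-6,
          "disposal": 1.0e-7,
      }

      # Power plant stage: plant footprint divided by lifetime generation
      # (hypothetical), plus the thermal efficiency converting fuel energy to
      # electricity (1 kWh = 3.6 MJ).
      plant_m2_per_kWh = 5.0e-6
      eta_th = 0.50          # assumed combined-cycle thermal efficiency
      MJ_PER_KWH = 3.6

      # Fuel energy needed per kWh generated, then total land per kWh.
      fuel_MJ_per_kWh = MJ_PER_KWH / eta_th
      upstream_m2_per_kWh = sum(upstream_m2_per_MJ.values()) * fuel_MJ_per_kWh
      total_m2_per_kWh = upstream_m2_per_kWh + plant_m2_per_kWh
      print(f"upstream: {upstream_m2_per_kWh:.2e} m^2/kWh, total: {total_m2_per_kWh:.2e} m^2/kWh")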

  1. Additive Manufacturing in Production: A Study Case Applying Technical Requirements

    NASA Astrophysics Data System (ADS)

    Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni

    Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters, and the variability of these parameters affects manufacturing outcomes drastically. Standardized processes and harmonized methodologies therefore need to be developed to characterize the technology for end-use applications and enable it for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables and to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, laser-based Stereolithography, was able to fulfil the macro- and micro-level geometrical requirements simultaneously, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize the machine's technical capabilities and to stimulate pre-normative initiatives for end-use applications of the technology.
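    A small worked example of the Taguchi step of such a composite methodology is sketched below: larger-the-better signal-to-noise ratios are computed for a three-factor, three-level L9 design and the best level of each factor is identified. The factors, levels and response values are invented for illustration and do not come from the AM experiments described above.

      import numpy as np

      # Hypothetical L9 (3^3) Taguchi experiment: three process factors at three
      # levels each, with a measured response (e.g., a dimensional-accuracy score,
      # larger is better) recorded twice per run.
      l9_levels = np.array([
          [0, 0, 0], [0, 1, 1], [0, 2, 2],
          [1, 0, 1], [1, 1, 2], [1, 2, 0],
          [2, 0, 2], [2, 1, 0], [2, 2, 1],
      ])
      response = np.array([
          [72, 75], [80, 78], [69, 71],
          [85, 83], [77, 79], [74, 72],
          [88, 90], [81, 80], [76, 78],
      ], dtype=float)

      # Larger-the-better signal-to-noise ratio per run: -10*log10(mean(1/y^2)).
      sn = -10.0 * np.log10(np.mean(1.0 / response**2, axis=1))

      # Average S/N per factor level; the best level has the highest mean S/N.
      for factor in range(3):
          means = [sn[l9_levels[:, factor] == lvl].mean() for lvl in range(3)]
          print(f"factor {factor}: mean S/N by level = {np.round(means, 2)}, best level = {int(np.argmax(means))}")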

  2. Change Detection via Cross-Borehole and VSP Seismic Surveys for the Source Physics Experiments (SPE) at the Nevada National Security Site (NNSS)

    NASA Astrophysics Data System (ADS)

    Knox, H. A.; Abbott, R. E.; Bonal, N. D.; Aldridge, D. F.; Preston, L. A.; Ober, C.

    2012-12-01

    In support of the Source Physics Experiment (SPE) at the Nevada National Security Site (NNSS), we have conducted two cross-borehole seismic experiments in the Climax Stock. The first experiment was conducted prior to the third shot in this multi-detonation program using two available boreholes and the shot hole, while the second experiment was conducted after the shot using four of the available boreholes. The first study focused on developing a well-characterized 2D pre-explosion Vp model, including two VSPs and a seismic refraction survey, as well as quantifying baseline waveform similarity at reoccupied sites. This was accomplished by recording both "sparker" and accelerated weight drop sources on a hydrophone string and surface geophones. In total, more than 18,500 unique source-receiver pairs were acquired during this testing. In the second experiment, we reacquired approximately 8,800 source-receiver pairs and performed a cross-line survey allowing for a 3D post-explosion Vp model. The data acquired from the reoccupied sites were processed using cross-correlation methods and change detection methodologies, including comparison of the tomographic images. The survey design and subsequent processing provided an opportunity to investigate seismic wave propagation through damaged rock. We also performed full waveform forward modelling for a granitic body hosting a perched aquifer. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
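    The cross-correlation step used for change detection at reoccupied source-receiver pairs can be illustrated with synthetic traces. In the sketch below, the wavelet, noise level and arrival-time shift are assumptions; a high correlation peak with near-zero time shift indicates an unchanged pair, while a delayed arrival (e.g., through damaged rock) shows up as a time shift.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0.0, 0.2, 2001)   # synthetic 0.1 ms sampling
      dt = t[1] - t[0]

      def trace(arrival, noise=0.05):
          """Synthetic trace: a Ricker-like wavelet at the given arrival time plus noise."""
          arg = np.pi * 150.0 * (t - arrival)
          return (1.0 - 2.0 * arg**2) * np.exp(-(arg**2)) + noise * rng.standard_normal(t.size)

      def xcorr_peak(a, b):
          """Peak normalized cross-correlation between equal-length traces and the
          time shift of b relative to a (positive = b arrives later)."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          xc = np.correlate(a, b, mode="full")
          kmax = int(np.argmax(xc))
          return xc[kmax], ((len(t) - 1) - kmax) * dt

      baseline = trace(arrival=0.080)
      repeat_same = trace(arrival=0.080)    # reoccupied pair, unchanged medium
      repeat_slow = trace(arrival=0.084)    # delayed arrival, e.g. through damaged rock

      for name, rep in [("unchanged", repeat_same), ("delayed", repeat_slow)]:
          cc, shift = xcorr_peak(baseline, rep)
          print(f"{name}: peak CC = {cc:.3f}, time shift = {shift * 1e3:+.2f} ms")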

  3. Periodic Application of Stochastic Cost Optimization Methodology to Achieve Remediation Objectives with Minimized Life Cycle Cost

    NASA Astrophysics Data System (ADS)

    Kim, U.; Parker, J.

    2016-12-01

    Many dense non-aqueous phase liquid (DNAPL) contaminated sites in the U.S. are reported as "remediation in progress" (RIP). However, the cost to complete (CTC) remediation at these sites is highly uncertain and, in many cases, the current remediation plan may need to be modified or replaced to achieve remediation objectives. This study evaluates the effectiveness of iterative stochastic cost optimization that incorporates new field data through periodic parameter recalibration to incrementally reduce prediction uncertainty and to implement remediation design modifications as needed to minimize the life cycle cost (i.e., the CTC). This systematic approach, using the Stochastic Cost Optimization Toolkit (SCOToolkit), enables early identification and correction of problems to stay on track for completion while minimizing the expected (i.e., probability-weighted average) CTC. This study considers a hypothetical site involving multiple DNAPL sources in an unconfined aquifer, using thermal treatment for source reduction and electron donor injection for dissolved plume control. The initial design is based on stochastic optimization using model parameters and their joint uncertainty obtained by calibration to site characterization data. The model is periodically recalibrated using new monitoring data and performance data for the operating remediation systems. Projected future performance under the current remediation plan is assessed and, depending on the results, operational variables of the current system are re-optimized or alternative designs are considered. We compare remediation duration and cost for the stepwise re-optimization approach with single-stage optimization as well as with a non-optimized design based on typical engineering practice.
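    The core idea, choosing the design with the lowest expected cost to complete under parameter uncertainty and re-evaluating as recalibration tightens that uncertainty, can be sketched with a toy Monte Carlo model. The attenuation-rate distributions, cost figures and candidate designs below are hypothetical and are not taken from SCOToolkit.

      import numpy as np

      rng = np.random.default_rng(3)

      def simulate_ctc(design, k_samples):
          """Cost-to-complete for one candidate design under sampled first-order
          attenuation rates k (1/yr). Cost = capital + annual O&M until the plume
          attenuates below the cleanup target (a hypothetical cost model)."""
          years = np.clip(np.log(design["c0_over_target"]) / k_samples, 0.0, 30.0)
          return design["capital"] + design["annual_om"] * years

      # Two candidate designs: aggressive source treatment vs. plume control only.
      designs = {
          "thermal + EDI": {"capital": 2.5e6, "annual_om": 1.5e5, "c0_over_target": 20.0},
          "EDI only":      {"capital": 0.8e6, "annual_om": 3.0e5, "c0_over_target": 200.0},
      }

      # Posterior samples of the attenuation rate; after each recalibration with
      # new monitoring data this distribution would tighten around the true value.
      for label, (mu, sigma) in [("initial calibration", (0.25, 0.15)),
                                 ("after recalibration", (0.30, 0.05))]:
          k = np.maximum(rng.normal(mu, sigma, 20_000), 0.02)
          print(label)
          for name, d in designs.items():
              ctc = simulate_ctc(d, k)
              print(f"  {name}: expected CTC = ${ctc.mean() / 1e6:.2f}M, "
                    f"95th percentile = ${np.percentile(ctc, 95) / 1e6:.2f}M")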

  4. Analytical methods for characterization of explosives-contaminated sites on U.S. Army installations

    NASA Astrophysics Data System (ADS)

    Jenkins, Thomas F.; Walsh, Marianne E.; Thorne, Philip G.

    1995-10-01

    The U.S. Army manufactures munitions at facilities throughout the United States. Many of these facilities are contaminated with residues of explosives from production and from the disposal of off-specification and out-of-date munitions. The first step in remediating these sites is careful characterization. Currently, sites are characterized using a combination of on-site field screening and off-site laboratory analysis. Most of the contamination is associated with TNT (2,4,6-trinitrotoluene) and RDX (hexahydro-1,3,5-trinitro-1,3,5-triazine) and their manufacturing impurities and environmental transformation products. Both colorimetric and enzyme immunoassay-based field screening methods have been used successfully for on-site characterization. These methods have similar detection capabilities but differ in their selectivity. Although field screening is very cost-effective, laboratory analysis is still required to fully characterize a site. Laboratory analysis for explosives residues in the United States is generally conducted using high-performance liquid chromatography equipped with a UV detector. Air-dried soils are extracted with acetonitrile in an ultrasonic bath. Water is analyzed directly if detection limits in the range of 10-20 µg/L are acceptable, or preconcentrated using either salting-out solvent extraction with acetonitrile or solid-phase extraction.

  5. Approaches for Assessing Risks to Sensitive Populations: Lessons Learned from Evaluating Risks in the Pediatric Population

    EPA Science Inventory

    Assessing the risk profiles of potentially sensitive populations requires a "tool chest" of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of t...

  6. METHODOLOGIES FOR ESTIMATING AIR EMISSIONS FROM THREE NON-TRADITIONAL SOURCE CATEGORIES: OIL SPILLS, PETROLEUM VESSEL LOADING & UNLOADING, AND COOLING TOWERS

    EPA Science Inventory

    The report discusses part of EPA's program to identify and characterize emissions sources not currently accounted for by either the existing Aerometric Information Retrieval System (AIRS) or State Implementation Plan (SIP) area source methodologies and to develop appropriate emis...

  7. Epistemological-Methodological Issues Related to Applied Organizational Research.

    ERIC Educational Resources Information Center

    van Meel, R. M.

    Applied research is supposed to take the perspective with the highest degree of corroboration as a basis for action. The realm of organizational perspectives is characterized, however, by a multitude of competing research programs, seldom tested against each other. Epistemological and methodological issues overwhelm inquiry in applied research…

  8. Site Characterization and Monitoring Technical Support Center FY16 Report

    EPA Science Inventory

    SCMTSC’s primary goal is to provide technical assistance to regional programs on complex hazardous waste site characterization issues. This annual report illustrates the range and extent of projects that SCMTSC supported in FY 2016. Our principal audiences are site project manage...

  9. Baseline Ecological Risk Assessment for the Upland at the LCP Chemical Site, Brunswick, Georgia - Site Investigation/Analysis and Risk Characterization (Final)

    EPA Pesticide Factsheets

    Site Investigation/Analysis and Risk Characterization (Final) Prepared for Honeywell International Inc. Prepared by CDR Environmental Specialists, Inc. August 2010 Region ID: 04 DocID: 10746263, DocDate: 08-01-2010

  10. Early Market Site Identification Data

    DOE Data Explorer

    Levi Kilcher

    2016-04-01

    This data was compiled for the 'Early Market Opportunity Hot Spot Identification' project. The data and scripts included were used in the 'MHK Energy Site Identification and Ranking Methodology' Reports (Part I: Wave, NREL Report #66038; Part II: Tidal, NREL Report #66079). The Python scripts will generate a set of results--based on the Excel data files--some of which were described in the reports. The scripts depend on the 'score_site' package, and the score_site package depends on a number of standard Python libraries (see the score_site install instructions).

  11. Stennis Space Center Verification & Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary; Ryan, Robert E.; Holekamp, Kara; O'Neal, Duane; Knowlton, Kelly; Ross, Kenton; Blonski, Slawomir

    2007-01-01

    Scientists within NASA's Applied Research & Technology Project Office (formerly the Applied Sciences Directorate) have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site enables the in-flight characterization of satellite and airborne high spatial resolution remote sensing systems and their products. The smaller scale of the newer high resolution remote sensing systems allows scientists to characterize geometric, spatial, and radiometric data properties using a single V&V site. The targets and techniques used to characterize data from these newer systems can differ significantly from the techniques used to characterize data from the earlier, coarser spatial resolution systems. Scientists have used the SSC V&V site to characterize thermal infrared systems. Enhancements are being considered to characterize active lidar systems. SSC employs geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment, and thermal calibration ponds to characterize remote sensing data products. Similar techniques are used to characterize moderate spatial resolution sensing systems at selected nearby locations. The SSC Instrument Validation Lab is a key component of the V&V capability and is used to calibrate field instrumentation and to provide National Institute of Standards and Technology traceability. This poster presents a description of the SSC characterization capabilities and examples of calibration data.

  12. Characterizing team performance in network-centric operations: philosophical and methodological issues.

    PubMed

    Bolia, Robert S; Nelson, W Todd

    2007-05-01

    The recently promulgated doctrine of network-centric warfare suggests that increases in shared situation awareness and self-synchronization will be emergent properties of densely connected military networks. What it fails to say is how these enhancements are to be measured. The present article frames the discussion as a question of how to characterize team performance, and considers such performance in the context of its hypothetical components: situation awareness, workload, and error. This examination concludes that reliable measures of these constructs are lacking for teams, even when they exist for individual operators, and that this is due to philosophical and/or methodological flaws in their conceptual development. Additional research is recommended to overcome these deficiencies, as well as consideration of novel multidisciplinary approaches that draw on methodologies employed in the social, physical, and biological sciences.

  13. Observation of sediment resuspension in Old Tampa Bay, Florida

    USGS Publications Warehouse

    Schoellhamer, David H.; ,

    1990-01-01

    Equipment and methodology have been developed to monitor sediment resuspension at two sites in Old Tampa Bay. Velocities are measured with electromagnetic current meters and suspended solids and turbidity are monitored with optical backscatterance sensors. In late November 1989, a vertical array of instrument pairs was deployed from a permanent platform at a deep-water site, and a submersible instrument package with a single pair of instruments was deployed at a shallow-water site. Wind waves caused resuspension at the shallow-water site, but not at the deeper platform site, and spring tidal currents did not cause resuspension at either site.

  14. Potential Application of Environmental Noise Recordings in Geoarchaeological Site Characterization

    NASA Astrophysics Data System (ADS)

    Di Luzio, E.

    2015-12-01

    Environmental noise recordings are commonly applied in seismic microzonation studies. By calculating the H/V spectral ratio, the fundamental frequency of soft terrains overlying a rigid bedrock can be determined (Nakamura, 1989). In such a simple two-layer system, equation (1), f = n Vs/(4H), links the resonance frequency f to the thickness H and the shear-wave velocity Vs of the resonating layer. In recent years, this methodology has been applied to obtain information on the seismostratigraphy of investigated sites in different environmental contexts. In this work, its potential application to the characterization of archaeological features hosted in shallow geological levels is discussed. Field cases are identified in the Appia Antica archaeological site, located in central Italy. Here, the acknowledged targets correspond to: i) empty tanks carved by the Romans into Cretaceous limestone in the IV-III century BC, and ii) the basaltic stone paving of the ancient road track, which is locally buried beneath colluvial deposits. Narrowly spaced recordings of environmental noise were carried out using a portable digital seismograph equipped with three orthogonal electrodynamic sensors (velocimeters) responding in the band 0.1-1024 Hz and adopting a sampling frequency of 256 Hz. Results are discussed in terms of absolute H/V values and the related distribution maps in the very high-frequency interval of 10-40 Hz. In the area hosting the tanks, interpolation of the H/V maximum values around 13 Hz matches the location and alignment of the cavities, which is also evidenced by clear inversions (H/V < 1) at lower frequencies (10-1 Hz). The correlation between H/V peaks and the top surface of the buried stone paving along the continuation of the road track is even more straightforward. Finally, the depth variations of the tank roofs and of the basaltic paving were reconstructed by combining, through equation (1), the results of the noise recordings with borehole data and geophysical surveys (SASW analysis).
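    Equation (1) rearranged for the fundamental mode gives the thickness of the resonating layer directly from a measured H/V peak frequency and an assumed shear-wave velocity; the short sketch below shows the arithmetic with illustrative values rather than the site's actual parameters.

      # Equation (1), f = n*Vs/(4*H), rearranged for the fundamental mode (n = 1)
      # to estimate the thickness H of the soft layer from a measured H/V peak
      # frequency and an assumed shear-wave velocity. Values are illustrative.
      def layer_thickness(f_peak_hz, vs_m_s, n=1):
          """Thickness of the layer whose n-th resonance falls at f_peak_hz."""
          return n * vs_m_s / (4.0 * f_peak_hz)

      vs_cover = 250.0  # assumed shear-wave velocity of the cover deposits (m/s)
      for f_peak in (13.0, 20.0, 40.0):  # example H/V peak frequencies (Hz)
          print(f"f = {f_peak:5.1f} Hz  ->  H ~ {layer_thickness(f_peak, vs_cover):4.1f} m")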

  15. Ka-Band Atmospheric Phase Stability Measurements in Goldstone, CA; White Sands, NM; and Guam

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.

    2014-01-01

    As spacecraft communication links are driven to higher frequencies (e.g. Ka-band) both by spectrum congestion and the appeal of higher data rates, the propagation phenomena at these frequencies must be well characterized for effective system design. In particular, the phase stability of a site at a given frequency will govern whether or not the site is a practical location for an antenna array, particularly if uplink capabilities are desired. Propagation studies to characterize such phenomena must be done on a site-by-site basis due to the wide variety of climates and weather conditions at each ground terminal. Accordingly, in order to statistically characterize the atmospheric effects on Ka-Band links, site test interferometers (STIs) have been deployed at three of NASA's operational sites to directly measure each site's tropospheric phase stability. Using three years of results from these experiments, this paper will statistically characterize the simultaneous atmospheric phase noise measurements recorded by the STIs deployed at the following ground station sites: the Goldstone Deep Space Communications Complex near Barstow, CA; the White Sands Ground Terminal near Las Cruces, NM; and the Guam Remote Ground Terminal on the island of Guam.

  16. A Methodology to Evaluate Ecological Resources and Risk Using Two Case Studies at the Department of Energy’s Hanford Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burger, Joanna; Gochfeld, Michael; Bunn, Amoret

    An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impacts on ecological resources at the U.S. Department of Energy's Hanford Site in southcentral Washington State, and we describe its application in two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining resource levels from 0 to 5. We also developed a risk rating scale from nondiscernible to very high. Field assessment is the critical step to determine resource levels or to determine whether current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value at other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy sites in the valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.

  17. Environmental waste site characterization utilizing aerial photographs and satellite imagery: Three sites in New Mexico, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Eeckhout, E.; Pope, P.; Becker, N.

    1996-04-01

    The proper handling and characterization of past hazardous waste sites is becoming more and more important as world population extends into areas previously deemed undesirable. Historical photographs, past records, and current aerial and satellite imagery can play an important role in characterizing these sites. These data provide clear insight for defining problem areas, which can then be surface-sampled in further detail. Three such areas are discussed in this paper: (1) nuclear wastes buried in trenches at Los Alamos National Laboratory, (2) surface dumping at one site at Los Alamos National Laboratory, and (3) the historical development of a municipal landfill near Las Cruces, New Mexico.

  18. Participant Observation, Anthropology Methodology and Design Anthropology Research Inquiry

    ERIC Educational Resources Information Center

    Gunn, Wendy; Løgstrup, Louise B.

    2014-01-01

    Within the design studio, and across multiple field sites, the authors compare the involvement of research tools and materials during collaborative processes of designing. Their aim is to trace temporal dimensions (shifts/movements) of where and when learning takes place along different sites of practice. They do so by combining participant…

  19. Incorporating hydrologic data and ecohydrologic relationships into ecological site descriptions

    Treesearch

    C. Jason Williams; Frederick B. Pierson; Kenneth E. Spaeth; Joel R. Brown; Osama Z. Al-Hamdan; Mark A. Weltz; Mark A. Nearing; Jeffrey E. Herrick; Jan Boll; Pete Robichaud; David C. Goodrich; Phillip Heilman; D. Phillip Guertin; Mariano Hernandez; Haiyan Wei; Stuart P. Hardegree; Eva K. Strand; Jonathan D. Bates; Loretta J. Metz; Mary H. Nichols

    2016-01-01

    The purpose of this paper is to recommend a framework and methodology for incorporating hydrologic data and ecohydrologic relationships in Ecological Site Descriptions (ESDs) and thereby enhance the utility of ESDs for assessing rangelands and guiding resilience-based management strategies. Resilience-based strategies assess and manage ecological state...

  20. Perspectives on ... Multiculturalism and Library Exhibits: Sites of Contested Representation

    ERIC Educational Resources Information Center

    Reece, Gwendolyn J.

    2005-01-01

    This article analyzes a multicultural library exhibit presenting the Palestinian/Israeli conflict as a site of contested representation. Qualitative methodology is used to interrogate the exhibit and its audience reception. Drawing on insights from critical pedagogy, implications for libraries arising from this case study are given and suggestions…

  1. Integration of aerial imaging and variable-rate technology for site-specific aerial herbicide application

    USDA-ARS?s Scientific Manuscript database

    As remote sensing and variable rate technology are becoming more available for aerial applicators, practical methodologies on effective integration of these technologies are needed for site-specific aerial applications of crop production and protection materials. The objectives of this study were to...

  2. 40 CFR Appendix C to Part 58 - Ambient Air Quality Monitoring Methodology

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., National Exposure Research Laboratory (MD-D205-03), U.S. Environmental Protection Agency, Research Triangle... 2.3 For which a quantitative relationship to a reference or equivalent method for PM10 has been established at the use site. Procedures for establishing a quantitative site-specific relationship are...

  3. 40 CFR Appendix C to Part 58 - Ambient Air Quality Monitoring Methodology

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., National Exposure Research Laboratory (MD-D205-03), U.S. Environmental Protection Agency, Research Triangle... 2.3 For which a quantitative relationship to a reference or equivalent method for PM10 has been established at the use site. Procedures for establishing a quantitative site-specific relationship are...

  4. MULTI-SITE PERFORMANCE EVALUATIONS OF CANDIDATE METHODOLOGIES FOR DETERMINING COARSE PARTICULATE MATTER (PMC) CONCENTRATIONS

    EPA Science Inventory

    Comprehensive field studies were conducted to evaluate the performance of sampling methods for measuring the coarse fraction of PM10 in ambient air. Five separate sampling approaches were evaluated at each of three sampling sites. As the primary basis of comparison, a discret...

  5. MULTI-SITE EVALUATIONS OF CANDIDATE METHODOLOGIES FOR DETERMINING COARSE PARTICULATE MATTER (PMC) CONCENTRATIONS

    EPA Science Inventory

    Comprehensive field studies were conducted to evaluate the performance of sampling methods for measuring the coarse fraction of PM10 in ambient air. Five separate sampling approaches were evaluated at each of three sampling sites. As the primary basis of comparison, a discret...

  6. MULTI-SITE EVALUATIONS OF CANDIDATE METHODOLOGIES FOR DETERMINING COARSE PARTICULATE MATTER (PMC) CONCENTRATIONS

    EPA Science Inventory

    Comprehensive field studies were conducted to evaluate the performance of sampling methods for measuring the coarse fraction of PM10 in ambient air. Five separate sampling approaches were evaluated at each of three sampling sites. As the primary basis of comparison, a discrete ...

  7. CO{sub 2} Sequestration Capacity and Associated Aspects of the Most Promising Geologic Formations in the Rocky Mountain Region: Local-Scale Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laes, Denise; Eisinger, Chris; Morgan, Craig

    2013-07-30

    The purpose of this report is to provide a summary of individual local-scale CCS site characterization studies conducted in Colorado, New Mexico and Utah. These site-specific characterization analyses were performed as part of the “Characterization of Most Promising Sequestration Formations in the Rocky Mountain Region” (RMCCS) project. The primary objective of these local-scale analyses is to provide a basis for regional-scale characterization efforts within each state. Specifically, limits on time and funding will typically inhibit CCS projects from conducting high-resolution characterization of a state-sized region, but smaller (< 10,000 km²) site analyses are usually possible, and such can provide insight regarding limiting factors for the regional-scale geology. For the RMCCS project, the outcomes of these local-scale studies provide a starting point for future local-scale site characterization efforts in the Rocky Mountain region.

  8. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process and the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods needed to address the estimation problems are known, but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.
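    One simple instance of the point-process estimation mentioned above is a kernel-based estimate of anomaly intensity over a survey area. The sketch below uses synthetic anomaly locations and an assumed kernel bandwidth; it is not the representation developed in the report.

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic anomaly detections (easting, northing in m): a dense cluster
      # around a hypothetical target area plus scattered background anomalies.
      target = rng.normal(loc=(250.0, 400.0), scale=30.0, size=(150, 2))
      background = rng.uniform(0.0, 1000.0, size=(60, 2))
      points = np.vstack([target, background])

      BANDWIDTH = 40.0  # Gaussian kernel bandwidth in metres (an assumption)

      def intensity(grid_xy, pts, h):
          """Gaussian-kernel estimate of the point-process intensity (anomalies per m^2)."""
          d2 = ((grid_xy[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2)
          return np.exp(-d2 / (2.0 * h**2)).sum(axis=1) / (2.0 * np.pi * h**2)

      # Evaluate on a coarse grid and report the cell with the highest intensity.
      xs = ys = np.arange(0.0, 1000.0, 50.0)
      grid = np.array([(x, y) for x in xs for y in ys])
      lam = intensity(grid, points, BANDWIDTH)
      best = grid[np.argmax(lam)]
      print(f"highest estimated intensity {lam.max():.4f} anomalies/m^2 near {tuple(best)}")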

  9. In-Situ Characterization of Potential-Induced Degradation in Crystalline Silicon Photovoltaic Modules Through Dark I–V Measurements

    DOE PAGES

    Luo, Wei; Hacke, Peter; Singh, Jai Prakash; ...

    2016-11-14

    Here, a temperature correction methodology for in-situ dark I-V (DIV) characterization of conventional p-type crystalline silicon photovoltaic (PV) modules undergoing potential-induced degradation (PID) is proposed.

  10. INDOOR AIR ASSESSMENT - A REVIEW OF INDOOR AIR QUALITY RISK CHARACTERIZATION

    EPA Science Inventory

    Risk assessment methodologies provide a mechanism for incorporating scientific evidence and judgments into the risk management decision process. A risk characterization framework has been developed to provide a systematic approach for the analysis and presentation of risk characterizati...

  11. FATE AND TRANSPORT MODELING OF CONTAMINANTS OF CONCERN FROM A CAFO IN AN AGRICULTURAL WATERSHED

    EPA Science Inventory

    The groundwater flow and transport modeling effort will require hydrogeological site characterization and the development of a conceptual flow model for the site. Site characterization will involve an assessment of both the surface and subsurface and be accomplished through joint...

  12. Draft environmental assessment: Davis Canyon site, Utah. Nuclear Waste Policy Act (Section 112). [Contains glossary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-12-01

    In February 1983, the US Department of Energy (DOE) identified the Davis Canyon site in Utah, as one of nine potentially acceptable sites for a mined geologic repository for spent nuclear fuel and high-level radioactive waste. To determine their suitability, the Davis Canyon site and the eight other potentially acceptable sites have been evaluated in accordance with the DOE's General Guidelines for the Recommendation of Sites for Nuclear Waste Repositories. These evaluations are reported in this draft environmental assessment (EA), which is being issued for public review and comment. The DOE findings and determinations that are based on these evaluations are preliminary and subject to public review and comment. A final EA will be prepared after considering the comments received. On the basis of the evaluations reported in this draft EA, the DOE has found that the Davis Canyon site is not disqualified under the guidelines. The site is in the Paradox Basin, which is one of five distinct geohydrologic settings considered for the first repository. This setting contains one other potentially acceptable site - the Lavender Canyon site. Although the Lavender Canyon site appears to be suitable for site characterization, the DOE has concluded that the Davis Canyon site is the preferred site in the Paradox Basin. Furthermore, the DOE finds that the site is suitable for site characterization because the evidence does not support a conclusion that the site will not be able to meet each of the qualifying conditions specified in the guidelines. On the basis of these findings, the DOE is proposing to nominate the Davis Canyon site as one of five sites suitable for characterization. Having compared the Davis Canyon site with the other four sites proposed for nomination, the DOE has determined that the Davis Canyon site is not one of the three preferred sites for recommendation to the President as candidates for characterization.

  13. Airborne and Ground-Based Optical Characterization of Legacy Underground Nuclear Test Sites

    NASA Astrophysics Data System (ADS)

    Vigil, S.; Craven, J.; Anderson, D.; Dzur, R.; Schultz-Fellenz, E. S.; Sussman, A. J.

    2015-12-01

    Detecting, locating, and characterizing suspected underground nuclear test sites is a U.S. security priority. Currently, global underground nuclear explosion monitoring relies on seismic and infrasound sensor networks to provide rapid initial detection of potential underground nuclear tests. While seismic and infrasound methods might be able to generally locate potential underground nuclear tests, additional sensing methods might be required to further pinpoint test site locations. Optical remote sensing is a robust approach for site location and characterization due to its ability to search large areas relatively quickly, resolve surface features in fine detail, and perform these tasks non-intrusively. Optical remote sensing provides both cultural and surface geological information about a site, for example, operational infrastructure and surface fractures. Surface geological information, when combined with known or estimated subsurface geologic information, could provide clues concerning test parameters. We have characterized two legacy nuclear test sites on the Nevada National Security Site (NNSS), U20ak and U20az, using helicopter-, ground- and unmanned aerial system-based RGB imagery and light detection and ranging (lidar) systems. The multi-faceted information garnered from these different sensing modalities has allowed us to build a knowledge base of how a nuclear test site might look when sensed remotely, and of the standoff distances required to resolve important site characteristics.

  14. The relation between air pollution data and planetary boundary layer quantities in a complex coastal industrial site nearby populated areas.

    NASA Astrophysics Data System (ADS)

    Mammarella, M. C.; Grandoni, G.; Fernando, J.; Cacciani, M.; di Sabatino, S.; Favaron, M.; Fedele, P.

    2010-09-01

    The connection among boundary layer phenomena, atmospheric pollutant dynamics and human health is an established fact, taking many different forms depending on local characteristics, including the slope and position of relief and/or coastline, surface roughness, and emission patterns. The problem is especially interesting in complex and coastal terrain, where slope-induced and sea-induced local circulations interact, yielding a complex pattern whose interpretation may go beyond pure modeling and require specific measurements, among them the planetary boundary layer (PBL) height. An occasion for studying this important theme has been offered by Regione Molise and the Valle del Biferno Consortium (COSIB) for the specific case of the industrial complex of Valle del Biferno, 3 km inland of Termoli, in central Italy, on the Adriatic coast. The local government, sensitive to air quality and public health in the industrial area, together with COSIB has co-financed a research project aimed at gaining knowledge about local meteorology, PBL phenomena and atmospheric pollutant dispersion in the area. Expected results include new air quality monitoring and control methodologies in Valle del Biferno for sustainable, environmentally respectful development at a site already characterized by high environmental and landscape value. The research project, developed by ENEA, began in 2007 and will conclude in December 2010. Project activities involve research groups from Europe, the United States of America, and the Russian Federation. Scientific and practical results will be published and presented at the final workshop to be held at the project's conclusion. The scientific interest of the Valle del Biferno case stems from the specific local characteristics of the site. Given the valley's orientation with respect to the mean synoptic circulation, local effects such as sea and slope breezes are dominant, and a complex wind regime develops that affects the local transport and diffusion of pollutants emitted in the area of the industrial complex. All of the effects studied, although influenced by local conditions, characterize not only this industrial area but all areas located along the coastline. This situation is very common in Italy and worldwide, as most industrial complexes are located at coastal sites, where access to harbors and transport networks is easier. The Valle del Biferno case may therefore yield data relevant to many industrial sites.

  15. Copper-catalyst-controlled site-selective allenylation of ketones and aldehydes with propargyl boronates.

    PubMed

    Fandrick, Keith R; Ogikubo, Junichi; Fandrick, Daniel R; Patel, Nitinchandra D; Saha, Jaideep; Lee, Heewon; Ma, Shengli; Grinberg, Nelu; Busacca, Carl A; Senanayake, Chris H

    2013-03-15

    A practical and highly site-selective copper-PhBPE-catalyst-controlled allenylation with propargyl boronates has been developed. The methodology has been shown to be tolerant of diverse ketones and aldehydes, providing the allenyl adducts in high selectivity. The BPE ligand and boronate substituents were shown to direct the site selectivity, such that either the propargyl or the allenyl adducts can be obtained in high selectivity. A model is proposed that explains the origin of the site selectivity.

  16. Waste treatability guidance program. User's guide. Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toth, C.

    1995-12-21

    DOE sites across the country generate and manage radioactive, hazardous, mixed, and sanitary wastes. It is necessary for each site to find the technologies and associated capacities required to manage its waste. One role of the DOE HQ Office of Environmental Restoration and Waste Management is to facilitate the integration of the site-specific plans into coherent national plans. DOE has developed a standard methodology for defining and categorizing waste streams into treatability groups based on characteristic parameters that influence waste management technology needs. This Waste Treatability Guidance Program automates the Guidance Document for the categorization of waste information into treatability groups; this application provides a consistent implementation of the methodology across the National TRU Program. This User's Guide provides instructions on how to use the program, including installation instructions and program operation. This document satisfies the requirements of the Software Quality Assurance Plan.

  17. Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary

    2006-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of satellite high spatial resolution remote sensing system products from Space Imaging IKONOS, Digital Globe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and their Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near infrared spectrum and examples of calibration results.

  18. Stochastic inversion of time-lapse geophysical data to characterize the vadose zone at the Arrenaes field site (Denmark)

    NASA Astrophysics Data System (ADS)

    Marie, S.; Irving, J. D.; Looms, M. C.; Nielsen, L.; Holliger, K.

    2011-12-01

    Geophysical methods such as ground-penetrating radar (GPR) can provide valuable information on the hydrological properties of the vadose zone. In particular, there is evidence to suggest that the stochastic inversion of such data may allow for significant reductions in uncertainty regarding subsurface van-Genuchten-Mualem (VGM) parameters, which characterize unsaturated hydrodynamic behaviour as defined by the combination of the water retention and hydraulic conductivity functions. A significant challenge associated with the use of geophysical methods in a hydrological context is that they generally exhibit an indirect and/or weak sensitivity to the hydraulic parameters of interest. A novel and increasingly popular means of addressing this issue involves the acquisition of geophysical data in a time-lapse fashion while changes occur in the hydrological condition of the probed subsurface region. Another significant challenge when attempting to use geophysical data for the estimation of subsurface hydrological properties is the inherent non-linearity and non-uniqueness of the corresponding inverse problems. Stochastic inversion approaches have the advantage of providing a comprehensive exploration of the model space, which makes them ideally suited for addressing such issues. In this work, we present the stochastic inversion of time-lapse zero-offset-profile (ZOP) crosshole GPR traveltime data, collected during a forced infiltration experiment at the Arrenaes field site in Denmark, in order to estimate subsurface VGM parameters and their corresponding uncertainties. We do this using a Bayesian Markov-chain-Monte-Carlo (MCMC) inversion approach. We find that the Bayesian-MCMC methodology indeed allows for a substantial refinement in the inferred posterior parameter distributions of the VGM parameters as compared to the corresponding priors. To further understand the potential impact on capturing the underlying hydrological behaviour, we also explore how the posterior VGM parameter distributions affect the hydrodynamic characteristics. In doing so, we find clear evidence that the approach pursued in this study allows for effective characterization of the hydrological behaviour of the probed subsurface region.
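    The Bayesian-MCMC machinery can be shown in miniature with a single-parameter example: a Metropolis sampler inferring a layer's volumetric water content from noisy ZOP traveltimes through a simplified CRIM petrophysical relation. The forward model, prior bounds, noise level and data below are toy assumptions and stand in for the far richer coupled hydrological parameterization used in the study.

      import numpy as np

      rng = np.random.default_rng(5)
      C = 0.3  # speed of light in m/ns

      # Simplified CRIM relation: volumetric water content -> bulk dielectric
      # permittivity -> GPR velocity (m/ns). Porosity and the permittivities of
      # the grains, water and air are assumed values.
      PHI, K_S, K_W, K_A = 0.35, 5.0, 81.0, 1.0

      def velocity(theta):
          sqrt_kappa = (1 - PHI) * np.sqrt(K_S) + theta * np.sqrt(K_W) + (PHI - theta) * np.sqrt(K_A)
          return C / sqrt_kappa

      # Synthetic ZOP data: 40 traveltime picks across a 5 m borehole separation
      # for a "true" water content, with 0.5 ns Gaussian picking noise.
      SEP, SIGMA, THETA_TRUE = 5.0, 0.5, 0.18
      t_obs = SEP / velocity(THETA_TRUE) + SIGMA * rng.standard_normal(40)

      def log_posterior(theta):
          """Uniform prior on [0.02, porosity] plus a Gaussian traveltime likelihood."""
          if not 0.02 <= theta <= PHI:
              return -np.inf
          resid = t_obs - SEP / velocity(theta)
          return -0.5 * np.sum((resid / SIGMA) ** 2)

      # Metropolis sampler over the single parameter.
      samples, theta, logp = [], 0.10, log_posterior(0.10)
      for _ in range(20_000):
          prop = theta + 0.01 * rng.standard_normal()
          logp_prop = log_posterior(prop)
          if np.log(rng.random()) < logp_prop - logp:   # accept with probability min(1, ratio)
              theta, logp = prop, logp_prop
          samples.append(theta)

      post = np.array(samples[5_000:])   # discard burn-in
      print(f"posterior water content: {post.mean():.3f} +/- {post.std():.3f}  (true value {THETA_TRUE})")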

  19. Inhibiting Substances as Tracers for the Reactivity Assessment of Fe(0)-PRBs

    NASA Astrophysics Data System (ADS)

    Dahmke, A.

    2001-12-01

    Passivation processes of Fe(0) barriers are well known from lab studies (Phillips et al., 2000; Schlicker et al., 2000) and from field sites. Normally the passivation processes are correlated with the groundwater composition, but quantitative prediction and monitoring of the inhibition velocity under field conditions is a serious problem. Currently, only concentration profiles of contaminants, isotope studies, or the measurement of reactivity loss in column experiments with altered Fe(0) material from the field sites are used for the characterization of Fe(0) reactivity. All of these approaches have serious disadvantages and limitations. Thus, the sampling of unaltered Fe(0) material out of the reactive barrier is difficult, and the perturbed installation of the material in column experiments may lead to significant modification of the field behaviour of the Fe(0) barrier. In addition, the concentration profile of the contaminant is not always a good tool for reactivity estimation, owing to uncertainties in the hydrogeological boundary conditions. The same general restrictions apply to isotope studies, in which the shift of the δ13C signal is used as an indicator for degradation of the chlorinated aliphatics. Therefore, the use of Fe(0)-inhibiting substances as reactive tracers is presented here as a new approach for the characterization of Fe(0) reactivity. The methodology of reactive tracers to determine reactive surface areas of Fe(III) in porous media was developed last year by Veehmayer et al. (2000) by interpreting the breakthrough curves of species with known specific interactions with the solid phase. The concept is also applicable to the estimation of reactive sites in Fe(0) columns, so that the breakthrough curves of oxidants like NO3-, CrO42- or oxidizing organic substances may be interpreted as indicative of reactive reducing sites in the Fe(0) column. Such a correlation was already shown by Schlicker et al. (2000), who explained the movement of passivation fronts by the blocking of reactive sites at the Fe(0) surface. To investigate this approach, different column experiments with passivated Fe(0) are currently being carried out. Initial results from the lab indicate that different inorganic as well as organic substances can be used to characterize the passivation state of the Fe(0) surface. The application of combinations of reactive tracers also gives some clues about the surface properties of the inhibiting substances, which might be helpful with respect to reactivation approaches for passivated permeable Fe(0) barriers. Despite the first encouraging, but mostly phenomenological, lab results, some theoretical problems, such as the alteration of the specific surface area during the lab experiments or competition between organic and inorganic compounds at the altered surface of the Fe particles, have to be addressed in more detail.

  20. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    NASA Astrophysics Data System (ADS)

    Graves, Robert; Jordan, Thomas H.; Callaghan, Scott; Deelman, Ewa; Field, Edward; Juve, Gideon; Kesselman, Carl; Maechling, Philip; Mehta, Gaurang; Milner, Kevin; Okaya, David; Small, Patrick; Vahi, Karan

    2011-03-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process.
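
    The hazard-curve assembly step can be pictured with the short sketch below: per-rupture occurrence rates are combined with the distribution of simulated peak intensity measures over the rupture variations to give an annual exceedance rate at each intensity level. The rates, intensity values and their distributions are invented stand-ins, not CyberShake or UCERF2.0 numbers.

      import numpy as np

      # Schematic hazard-curve assembly: combine per-rupture occurrence rates
      # with simulated peak intensity measures (one value per rupture variation).
      rng = np.random.default_rng(1)
      ruptures = [
          {"annual_rate": 0.01,  "im": rng.lognormal(np.log(0.10), 0.6, 50)},
          {"annual_rate": 0.002, "im": rng.lognormal(np.log(0.30), 0.5, 50)},
          {"annual_rate": 0.02,  "im": rng.lognormal(np.log(0.05), 0.7, 50)},
      ]

      im_levels = np.logspace(-2, 0, 25)          # e.g., long-period SA in g
      annual_exceedance = np.zeros_like(im_levels)
      for rup in ruptures:
          # P(IM > x | rupture) estimated from the rupture variations
          p_exceed = np.array([(rup["im"] > x).mean() for x in im_levels])
          annual_exceedance += rup["annual_rate"] * p_exceed

      for x, lam in zip(im_levels[::6], annual_exceedance[::6]):
          print(f"IM > {x:.3f} g: annual exceedance rate {lam:.4f}")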

  1. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    USGS Publications Warehouse

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and magnitude uncertainty estimates used in the definition of the ruptures than is found in the traditional GMPE approach. This reinforces the need for continued development of a better understanding of earthquake source characterization and the constitutive relations that govern the earthquake rupture process. © 2010 Springer Basel AG.

  2. Site Characterization for a Deep Borehole Field Test

    NASA Astrophysics Data System (ADS)

    Kuhlman, K. L.; Hardin, E. L.; Freeze, G. A.; Sassani, D.; Brady, P. V.

    2015-12-01

    The US Department of Energy Office of Nuclear Energy is at the beginning of a 5-year Deep Borehole Field Test (DBFT) to investigate the feasibility of constructing and characterizing two boreholes in crystalline basement rock to a depth of 5 km (16,400 ft). The concept of deep borehole disposal for radioactive waste has some advantages over mined repositories, including incremental construction and loading, the enhanced natural barriers provided by deep continental crystalline basement, and reduced site characterization. Site characterization efforts need to determine an eligible site that does not have the following disqualifying characteristics: greater than 2 km to crystalline basement, upward vertical fluid potential gradients, presence of economically exploitable natural resources, presence of high permeability connection to the shallow subsurface, and significant probability of future seismic or volcanic activity. Site characterization activities for the DBFT will include geomechanical (i.e., rock in situ stress state, and fluid pressure), geological (i.e., rock and fracture infill lithology), hydrological (i.e., quantity of fluid, fluid convection properties, and solute transport mechanisms), and geochemical (i.e., rock-water interaction and natural tracers) aspects. Both direct (i.e., sampling and in situ testing) and indirect (i.e., borehole geophysical) methods are planned for efficient and effective characterization of these site aspects and physical processes. Borehole-based characterization will be used to determine the variability of system state (i.e., stress, pressure, temperature, and chemistry) with depth, and to interpret material and system parameters relevant to numerical site simulation. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
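
    The eligibility screening described above is essentially a checklist against the listed disqualifying characteristics; the sketch below encodes it as a simple predicate. The field names follow the wording of the abstract, but the data structure and example values are assumptions made for illustration, not part of the DBFT workflow.

      from dataclasses import dataclass

      @dataclass
      class CandidateSite:
          depth_to_basement_km: float
          upward_gradient: bool
          exploitable_resources: bool
          high_perm_connection_to_shallow: bool
          significant_seismic_or_volcanic: bool

      def is_eligible(site: CandidateSite) -> bool:
          # A site is eligible only if none of the disqualifiers apply
          disqualifiers = [
              site.depth_to_basement_km > 2.0,
              site.upward_gradient,
              site.exploitable_resources,
              site.high_perm_connection_to_shallow,
              site.significant_seismic_or_volcanic,
          ]
          return not any(disqualifiers)

      print(is_eligible(CandidateSite(1.2, False, False, False, False)))  # True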

  3. Developing a programmatic approach to investigating and remediating many unrelated Comprehensive Environmental Response, Compensation, and Liability Act sites at Kelly Air Force Base

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamp, G.; Regan, P.; Ninesteel, R.

    1988-01-01

    Kelly Air Force Base (AFB), which was founded in 1917, is involved in logistics and maintenance activities supporting the Air Logistics Command. In addition, Kelly AFB hosts over 50 tenant organizations representing the Air Force, Department of Defense, and other government agencies. Over the years waste disposal from this complex was conducted in a manner that led to the identification of over 30 sites to be included in the Installation Restoration Program (IRP) after the Phase 1 investigation. A methodology was needed to prioritize the Remedial Investigations and Feasibility Study (RI/FS) activities for the sites. A Strategy Plan was developed that involved reviewing and interpreting existing data, identifying data voids relative to site specific RI/FS activities, and developing methodology to prioritize activities. Sites were prioritized, and a comprehensive IRP planning document was developed. One data deficiency was revealed -- the lack of understanding of the Basewide hydrogeologic conditions necessary to establish an effective restoration program. A Hydrogeologic Investigation was initiated to provide this data. This data will allow better interpretation of the interaction of the sites, particularly those in close proximity, and improved planning of remediation activities.

  4. A Methodology to Assess the Accuracy with which Remote Data Characterize a Specific Surface, as a Function of Full Width at Half Maximum (FWHM): Application to Three Italian Coastal Waters

    PubMed Central

    Cavalli, Rosa Maria; Betti, Mattia; Campanelli, Alessandra; Di Cicco, Annalisa; Guglietta, Daniela; Penna, Pierluigi; Piermattei, Viviana

    2014-01-01

    This methodology assesses the accuracy with which remote data characterize a surface, as a function of Full Width at Half Maximum (FWHM). The purpose is to identify the remote data that best characterize a surface, evaluating the number of bands in the spectral range. The first step creates an accurate dataset of simulated remote data, using in situ hyperspectral reflectances. The second step evaluates the capability of the simulated remote data to characterize this surface; the spectral similarity measurements, which are obtained using classifiers, provide this capability. The third step examines the precision of this capability. The in situ hyperspectral reflectances are taken as the "real" reflectances. They are resized to the same spectral range as the remote data, and the spectral similarity measurements obtained from these "real" resized reflectances are considered "real" measurements. Therefore, the quantity and magnitude of the "errors" (i.e., differences between the spectral similarity measurements obtained from the "real" resized reflectances and those obtained from the remote data) provide the accuracy as a function of FWHM. This methodology was applied to evaluate the accuracy with which CHRIS-mode1, CHRIS-mode2, Landsat5-TM, MIVIS and PRISMA data characterize three coastal waters. Their mean values of uncertainty are 1.59%, 3.79%, 7.75%, 3.15% and 1.18%, respectively. PMID:24434875
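
    One step of this workflow can be sketched as follows, under assumed band centres, FWHM values and a synthetic reflectance: an in situ spectrum is resampled to a sensor's bands with Gaussian responses of the given FWHM, and the Spectral Angle Mapper is used as one example of a spectral similarity measurement (the paper's actual classifiers and sensor band sets may differ).

      import numpy as np

      # Resample an in situ hyperspectral reflectance to a sensor's band set
      # (Gaussian response of given FWHM), then compare spectra with the
      # Spectral Angle Mapper (SAM).  Wavelengths and FWHM values are illustrative.
      wl = np.arange(400, 901, 1.0)                         # nm, in situ sampling
      refl = 0.02 + 0.03 * np.exp(-((wl - 560) / 40) ** 2)  # synthetic water-like spectrum

      def resample(wl, refl, centers, fwhm):
          sigma = fwhm / 2.3548
          bands = []
          for c in centers:
              w = np.exp(-0.5 * ((wl - c) / sigma) ** 2)    # Gaussian band response
              bands.append(np.sum(w * refl) / np.sum(w))
          return np.array(bands)

      def spectral_angle(a, b):
          cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
          return np.arccos(np.clip(cosang, -1.0, 1.0))      # radians

      centers = np.arange(420, 881, 20.0)
      real   = resample(wl, refl, centers, fwhm=10.0)       # "real" resized reflectance
      remote = resample(wl, refl, centers, fwhm=30.0)       # coarser simulated sensor
      print(f"SAM angle: {spectral_angle(real, remote):.4f} rad")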

  5. Building dismantlement and site remediation at the Apollo Fuel Plant: When is technology the answer?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walton, L.

    1995-01-01

    The Apollo fuel plant was located in Pennsylvania on a site known to have been used continuously for steel production from before the Civil War until after World War II. The site then became a nuclear fuel chemical processing plant. Finally, it was used to convert uranium hexafluoride to various oxide fuel forms. After the fuel manufacturing operations were terminated, the processing equipment was partially decontaminated, removed, packaged and shipped to a licensed low-level radioactive waste burial site. This work was completed in 1984. In 1990 a detailed site characterization was initiated to establish the extent of contamination and to plan the building dismantlement and soil remediation efforts. This article discusses the site characterization and remedial action at the site in the following subsections: characterization; criticality control; mobile containment; soil washing; in-process measurements; and the final outcome of the project.

  6. 77 FR 8288 - Applications and Amendments to Facility Operating Licenses Involving Proposed No Significant...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-14

    ... not listed on the Web site, but should note that the NRC's E-Filing system does not support unlisted... (COLR), to update the methodology reference list to support the core design with the new AREVA fuel... methodologies listed in Technical Specification 5.7.1.5 has no impact on any plant configuration or system...

  7. Alcohol- and Drug-Involved Driving in the United States: Methodology for the 2007 National Roadside Survey

    ERIC Educational Resources Information Center

    Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy

    2011-01-01

    This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling…

  8. Concepts of Connectivity and Human Epileptic Activity

    PubMed Central

    Lemieux, Louis; Daunizeau, Jean; Walker, Matthew C.

    2011-01-01

    This review attempts to place the concept of connectivity from increasingly sophisticated neuroimaging data analysis methodologies within the field of epilepsy research. We introduce the more principled connectivity terminology developed recently in neuroimaging and review some of the key concepts related to the characterization of propagation of epileptic activity using what may be called traditional correlation-based studies based on EEG. We then show how essentially similar methodologies, and more recently models addressing causality, have been used to characterize whole-brain and regional networks using functional MRI data. Following a discussion of our current understanding of the neuronal system aspects of the onset and propagation of epileptic discharges and seizures, we discuss the most advanced and ambitious framework to attempt to fully characterize epileptic networks based on neuroimaging data. PMID:21472027

  9. Spectral Induced Polarization monitoring of the groundwater physico-chemical parameters daily variations for stream-groundwater interactions

    NASA Astrophysics Data System (ADS)

    Jougnot, Damien; Camerlynck, Christian; Robain, Henri; Tallec, Gaëlle; Ribolzi, Olivier; Gaillardet, Jérôme

    2017-04-01

    During the last decades, geophysical methods have attracted increasing interest in hydrology and environmental sciences, given their sensitivity to the parameters of interest and their non-intrusive nature. Spectral Induced Polarization (SIP) is a low-frequency electromagnetic method that allows the characterization of the subsurface through its complex electrical conductivity. It reports the modulus of the conductivity and the phase between an injected current and a measured voltage over a rather large frequency range (from a few millihertz to a few tens of kilohertz). The real part of the conductivity is sensitive to lithological (porosity, specific surface area) and hydrological (water saturation, water salinity) parameters, while the imaginary part is linked to electrochemical polarization, which has been shown to be largely influenced by the chemistry of the pore water. In the present contribution, we aim to better characterize the exchanges between a stream and the surrounding groundwater using the SIP method and its sensitivity to pore-water changes over time. Two sites from the OZCAR Research Infrastructure (French Critical Zone observatories) have been chosen for this study: the Houay Pano catchment (Laos) and the Orgeval catchment (France). Both sites have a good existing infrastructure and have already been studied extensively in terms of hydrology, geophysics, and hydrochemistry; they constitute ideal experimental sites for developing novel methodologies for the assessment of stream-groundwater exchanges. We propose to obtain a vertical description of the changes in complex electrical conductivity with depth based on SIP soundings undertaken with the multi-channel system SIP Fuchs III. We conducted high-frequency monitoring close to the stream (one vertical profile every 30 min). In parallel, high-frequency monitoring of the physico-chemical parameters (temperature, conductivity, ionic concentrations) in the stream was performed. Relating the daily fluctuations of the groundwater complex conductivity to the river physico-chemical parameters could therefore establish a new proxy to characterize stream-groundwater interactions. In parallel to the field measurements, laboratory experiments have been conducted on soil samples from the two sites. These measurements provide a better understanding of the complex conductivity signature of the samples subjected to changes in saturation and pore-water physico-chemistry. This work is in progress, but the first results already show that the method is well suited to monitoring daily variations in the physico-chemical properties of the groundwater and their relation to those of the stream.
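
    For readers unfamiliar with SIP data, the sketch below evaluates a Cole-Cole parameterization of complex conductivity, a model commonly used to describe such spectra, over the frequency range quoted above; the parameter values are illustrative and are not taken from the Houay Pano or Orgeval sites.

      import numpy as np

      # Cole-Cole model for complex electrical conductivity (conductivity form):
      # sigma(w) = sigma0/(1-m) * [1 - m/(1 + (i*w*tau)^c)]
      def cole_cole_conductivity(freq_hz, sigma0, m, tau, c):
          """sigma0: DC conductivity (S/m); m: chargeability; tau: relaxation time (s);
          c: frequency exponent.  Returns complex conductivity."""
          iwt = (2j * np.pi * freq_hz * tau) ** c
          return sigma0 * (1.0 + m / (1.0 - m) * (1.0 - 1.0 / (1.0 + iwt)))

      freqs = np.logspace(-2, 4, 7)                    # 10 mHz to 10 kHz
      sigma = cole_cole_conductivity(freqs, sigma0=0.02, m=0.1, tau=0.01, c=0.5)
      for f, s in zip(freqs, sigma):
          phase_mrad = 1e3 * np.arctan2(s.imag, s.real)
          print(f"{f:10.3f} Hz  |sigma| = {abs(s):.5f} S/m  phase = {phase_mrad:6.2f} mrad")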

  10. Candidate locations for SPS rectifying antennas

    NASA Technical Reports Server (NTRS)

    Eberhardt, A. W.

    1977-01-01

    The feasibility of placing 120 Satellite Power System (SPS) rectifying antenna (rectenna) sites across the U.S. was studied. An initial attempt is made to place two land sites in each state using several land-site selection criteria. When only 69 suitable land sites are located, the remaining sites are placed at sea, and sea-site selection criteria are identified. An estimated projection of electrical demand distribution for the year 2000 is then used to determine the distribution of these sites along the Pacific, Atlantic, and Gulf Coasts. A methodology for distributing rectenna sites across the country and for fine-tuning exact locations is developed, and recommendations on rectenna design and operations are made.

  11. Characterization of Single-Event Burnout in Power MOSFET Using Backside Laser Testing

    NASA Astrophysics Data System (ADS)

    Miller, F.; Luu, A.; Prud'homme, F.; Poirot, P.; Gaillard, R.; Buard, N.; Carrire, T.

    2006-12-01

    This paper presents a new methodology based upon backside laser irradiations to characterize the sensitivity of power devices towards Single-Event Burnout. It is shown that this technique can be used to define the safe operating area

  12. Asphalt materials characterization in support of implementation of the proposed mechanistic-empirical pavement design guide.

    DOT National Transportation Integrated Search

    2007-01-01

    The proposed Mechanistic-Empirical Pavement Design Guide (MEPDG) procedure is an improved methodology for pavement design and evaluation of paving materials. Since this new procedure depends heavily on the characterization of the fundamental engineer...

  13. Use of Electrical Conductivity Logging to Characterize the Geological Context of Releases at UST Sites

    EPA Science Inventory

    Risk is the combination of hazard and exposure. Risk characterization at UST release sites has traditionally emphasized hazard (presence of residual fuel) with little attention to exposure. Exposure characterization is often limited to a one-dimensional model such as the RBCA equa...

  14. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

    Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
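
    The object-oriented linking of unit processes into remedial alternatives can be pictured with the hedged sketch below; the class names, attributes and scoring rules are assumptions made for illustration and do not reproduce RAAS internals.

      from dataclasses import dataclass

      @dataclass
      class UnitProcess:
          name: str
          treats: set            # contaminant classes the process handles
          effectiveness: float   # 0-1 scores, illustrative only
          implementability: float
          cost_musd: float

      @dataclass
      class Alternative:
          processes: list

          def covers(self, contaminants: set) -> bool:
              # An alternative is viable if its processes jointly treat all contaminants
              covered = set().union(*(p.treats for p in self.processes))
              return contaminants <= covered

          def score(self):
              # Simple aggregation: weakest-link effectiveness/implementability, summed cost
              eff = min(p.effectiveness for p in self.processes)
              impl = min(p.implementability for p in self.processes)
              cost = sum(p.cost_musd for p in self.processes)
              return eff, impl, cost

      soil_wash  = UnitProcess("soil washing", {"metals"}, 0.7, 0.8, 4.0)
      pump_treat = UnitProcess("pump and treat", {"VOCs"}, 0.6, 0.9, 6.5)
      alt = Alternative([soil_wash, pump_treat])
      print(alt.covers({"metals", "VOCs"}), alt.score())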

  15. Characterizing crown fuel distribution for conifers in the interior western United States

    Treesearch

    Seth Ex; Frederick W. Smith; Tara Keyser

    2015-01-01

    Canopy fire hazard evaluation is essential for prioritizing fuel treatments and for assessing potential risk to firefighters during suppression activities. Fire hazard is usually expressed as predicted potential fire behavior, which is sensitive to the methodology used to quantitatively describe fuel profiles: methodologies that assume that fuel is distributed...

  16. A new methodology capable of characterizing most volatile and less volatile minor edible oils components in a single chromatographic run without solvents or reagents. Detection of new components.

    PubMed

    Alberdi-Cedeño, Jon; Ibargoitia, María L; Cristillo, Giovanna; Sopelana, Patricia; Guillén, María D

    2017-04-15

    The possibilities offered by a new methodology to determine minor components in edible oils are described. This is based on immersion of a solid-phase microextraction fiber of PDMS/DVB into the oil matrix, followed by Gas Chromatography/Mass Spectrometry. It enables characterization and differentiation of edible oils in a simple way, without either solvents or sample modification. This methodology allows simultaneous identification and quantification of sterols, tocols, hydrocarbons of different natures, fatty acids, esters, monoglycerides, fatty amides, aldehydes, ketones, alcohols, epoxides, furans, pyrans and terpenic oxygenated derivatives. The broad information provided by this methodology is useful for different areas of interest such as nutritional value, oxidative stability, technological performance, quality, processing, safety and even the prevention of fraudulent practices. Furthermore, for the first time, certain fatty amides, gamma- and delta-lactones of high molecular weight, and other aromatic compounds such as some esters derived from cinnamic acid have been detected in edible oils. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Lunar site characterization and mining

    NASA Technical Reports Server (NTRS)

    Glass, Charles E.

    1992-01-01

    Lunar mining requirements do not appear to be excessively demanding in terms of volume of material processed. It seems clear, however, that the labor-intensive practices that characterize terrestrial mining will not suffice at the low-gravity, hard-vacuum, and inaccessible sites on the Moon. New research efforts are needed in three important areas: (1) to develop high-speed, high-resolution through-rock vision systems that will permit more detailed and efficient mine site investigation and characterization; (2) to investigate the impact of lunar conditions on our ability to convert conventional mining and exploration equipment to lunar prototypes; and (3) to develop telerobotic or fully robotic mining systems for operations on the Moon and other bodies in the inner solar system. Other aspects of lunar site characterization and mining are discussed.

  18. 3D Integrated Methodologies for the Documentation and the Virtual Reconstruction of an Archaeological Site

    NASA Astrophysics Data System (ADS)

    Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.

    2015-02-01

    Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments concerning on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to test different acquisition systems and their effectiveness, considering each methodology individually or in combination. This research builds on the recognition that integrating different survey methodologies can in fact increase the representative efficacy of the final representations, which are then based on a wider and verified set of georeferenced metric data. In particular, the integration of methods allows reducing or neutralizing the issues involved in surveying composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (i.e. urban structures, architectural monuments, small findings). This paper describes work at several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of the city and the structures of its monuments, some of them hard to reach, was realized using active and passive techniques (range-based and image-based methods). The acquisition was planned in order to obtain not only the basic support for interpretative analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with Structure from Motion point clouds in the same reference system, provided by a topographic and GPS survey. These 3D models are not only the final results of the metric survey, but also, from a research point of view, the starting point for the reconstruction of the whole city and its urban context. This reconstruction process will also cover areas that have not yet been excavated, where procedural modelling can offer important support to the reconstructive hypotheses.

  19. Model coupling methodology for thermo-hydro-mechanical-chemical numerical simulations in integrated assessment of long-term site behaviour

    NASA Astrophysics Data System (ADS)

    Kempka, Thomas; De Lucia, Marco; Kühn, Michael

    2015-04-01

    The integrated assessment of long-term site behaviour taking into account a high spatial resolution at reservoir scale requires a sophisticated methodology to represent the coupled thermal, hydraulic, mechanical and chemical processes of relevance. Our coupling methodology considers the time-dependent occurrence and significance of multi-phase flow processes, mechanical effects and geochemical reactions (Kempka et al., 2014). To this end, a simplified hydro-chemical coupling procedure was developed (Klein et al., 2013) and validated against fully coupled hydro-chemical simulations (De Lucia et al., 2015). The numerical simulation results elaborated for the Ketzin pilot site demonstrate that mechanical reservoir, caprock and fault integrity are maintained during the time of operation, and that after 10,000 years CO2 dissolution is the dominating trapping mechanism, with mineralization on the order of 10% to 25% and negligible changes to porosity and permeability. References: De Lucia, M., Kempka, T., Kühn, M. (2014) A coupling alternative to reactive transport simulations for long-term prediction of chemical reactions in heterogeneous CO2 storage systems. Geosci Model Dev Discuss 7:6217-6261. doi:10.5194/gmdd-7-6217-2014. Kempka, T., De Lucia, M., Kühn, M. (2014) Geomechanical integrity verification and mineral trapping quantification for the Ketzin CO2 storage pilot site by coupled numerical simulations. Energy Procedia 63:3330-3338. doi:10.1016/j.egypro.2014.11.361. Klein, E., De Lucia, M., Kempka, T., Kühn, M. (2013) Evaluation of long-term mineral trapping at the Ketzin pilot site for CO2 storage: an integrative approach using geochemical modelling and reservoir simulation. Int J Greenh Gas Con 19:720-730. doi:10.1016/j.ijggc.2013.05.014.
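
    The coupling strategy can be illustrated schematically as a sequential loop in which hydraulic, mechanical and chemical updates are applied in turn at each time step; the update functions below are crude placeholders, not the simulators or the simplified hydro-chemical procedure actually coupled for the Ketzin site.

      # Generic sequential-coupling loop: each physics module updates the shared
      # state in turn, using the results of the previous module in the same step.
      def flow_step(state, dt):
          state["pressure"] += 0.01 * dt              # placeholder multiphase-flow update
          return state

      def mechanics_step(state):
          state["stress"] = 2.5 * state["pressure"]   # placeholder poroelastic response
          return state

      def chemistry_step(state, dt):
          state["co2_dissolved"] += 1e-4 * dt         # placeholder dissolution/mineralization
          return state

      state = {"pressure": 15.0, "stress": 0.0, "co2_dissolved": 0.0}
      t, dt, t_end = 0.0, 1.0, 10.0                   # arbitrary time units
      while t < t_end:
          state = flow_step(state, dt)                # hydraulic update
          state = mechanics_step(state)               # mechanical update from new pressures
          state = chemistry_step(state, dt)           # chemical update on the flow field
          t += dt
      print(state)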

  20. Characterization factors for land use impacts on biodiversity in life cycle assessment based on direct measures of plant species richness in European farmland in the 'Temperate Broadleaf and Mixed Forest' biome.

    PubMed

    Knudsen, Marie Trydeman; Hermansen, John E; Cederberg, Christel; Herzog, Felix; Vale, Jim; Jeanneret, Philippe; Sarthou, Jean-Pierre; Friedel, Jürgen K; Balázs, Katalin; Fjellstad, Wendy; Kainz, Max; Wolfrum, Sebastian; Dennis, Peter

    2017-02-15

    Life Cycle Assessment (LCA) is a widely used tool to assess the environmental sustainability of products. An LCA should optimally cover the most important environmental impact categories, such as climate change, eutrophication and biodiversity. However, impacts on biodiversity are seldom included in LCAs due to methodological limitations and a lack of appropriate characterization factors. When assessing organic agricultural products, the omission of biodiversity in LCA is problematic, because organic systems are characterized by higher species richness at field level compared to conventional systems. Thus, there is a need for characterization factors for estimating land use impacts on biodiversity in life cycle assessment that are able to distinguish between organic and conventional agricultural land use, and that can be used to supplement and validate the few currently suggested characterization factors. Based on a unique dataset derived from field recording of plant species diversity in farmland across six European countries, the present study provides new midpoint occupation Characterization Factors (CF) expressing the Potentially Disappeared Fraction (PDF) to estimate land use impacts on biodiversity in the 'Temperate Broadleaf and Mixed Forest' biome in Europe. The method is based on counts of plant species on randomly selected test sites in the biome and enables the calculation of characterization factors that are sensitive to particular types of management. While species richness differs between countries, the calculated CFs are able to distinguish between different land use types (pastures (monocotyledons or mixed), arable land and hedges) and management practices (organic or conventional production systems) across countries. The new occupation CFs can be used to supplement or validate the few current CFs and can be applied in LCAs of agricultural products to assess land use impacts on species richness in the 'Temperate Broadleaf and Mixed Forest' biome. Copyright © 2016 Elsevier B.V. All rights reserved.
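
    A common way to express such a midpoint occupation characterization factor is as the potentially disappeared fraction of species relative to a reference situation, as in the hedged sketch below; the formula choice, the reference richness and the species counts are illustrative assumptions, not the paper's calculated CFs.

      # Illustrative midpoint occupation characterization factor expressed as a
      # Potentially Disappeared Fraction (PDF), computed from plant species
      # richness on a land-use type relative to a reference situation.
      def occupation_cf_pdf(species_land_use: float, species_reference: float) -> float:
          """PDF in [0, 1]; 0 means no species loss relative to the reference."""
          return max(0.0, 1.0 - species_land_use / species_reference)

      # e.g., arable land under two management practices vs. a hypothetical reference richness
      reference_richness = 30.0
      print("conventional arable:", occupation_cf_pdf(12.0, reference_richness))
      print("organic arable:     ", occupation_cf_pdf(18.0, reference_richness))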
