Sample records for methods testing site

  1. 40 CFR 53.30 - General provisions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... more information about the site. Any such pre-test approval of a test site by the EPA shall indicate... Methods and Reference Methods § 53.30 General provisions. (a) Determination of comparability. The test... discretion of the Administrator. (b) Selection of test sites. (1) Each test site shall be in an area which...

  2. 40 CFR 53.30 - General provisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... more information about the site. Any such pre-test approval of a test site by the EPA shall indicate... Methods and Reference Methods § 53.30 General provisions. (a) Determination of comparability. The test... discretion of the Administrator. (b) Selection of test sites. (1) Each test site shall be in an area which...

  3. 40 CFR 53.30 - General provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... more information about the site. Any such pre-test approval of a test site by the EPA shall indicate... Methods and Reference Methods § 53.30 General provisions. (a) Determination of comparability. The test... discretion of the Administrator. (b) Selection of test sites. (1) Each test site shall be in an area which...

  4. 40 CFR 53.30 - General provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... more information about the site. Any such pre-test approval of a test site by the EPA shall indicate... Methods and Reference Methods § 53.30 General provisions. (a) Determination of comparability. The test... discretion of the Administrator. (b) Selection of test sites. (1) Each test site shall be in an area which...

  5. 40 CFR 53.30 - General provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... more information about the site. Any such pre-test approval of a test site by the EPA shall indicate... Methods and Reference Methods § 53.30 General provisions. (a) Determination of comparability. The test... discretion of the Administrator. (b) Selection of test sites. (1) Each test site shall be in an area which...

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, M.D.; Pierce, B.L.

    This report presents results of tests of different final site selection methods used for siting large-scale facilities such as nuclear power plants. Test data are adapted from a nuclear power plant siting study conducted on Long Island, New York. The purpose of the tests is to determine whether or not different final site selection methods produce different results, and to obtain some understanding of the nature of any differences found. Decision rules and weighting methods are included. Decision rules tested are Weighting Summation, Power Law, Decision Analysis, Goal Programming, and Goal Attainment; weighting methods tested are Categorization, Ranking, Rating Ratio Estimation, Metfessel Allocation, Indifferent Tradeoff, Decision Analysis lottery, and Global Evaluation. Results show that different methods can, indeed, produce different results, but that the probability that they will do so is controlled by the structure of differences among the sites being evaluated. Differences in weights and suitability scores attributable to methods have reduced significance if the alternatives include one or two sites that are superior to all others in many attributes. The more tradeoffs there are among good and bad levels of different attributes at different sites, the more important are the specifics of methods to the final decision. 5 refs., 14 figs., 19 tabs.
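
    Of the decision rules named above, Weighting Summation is the most direct to illustrate: each candidate site receives a composite suitability score equal to the weighted sum of its normalized attribute scores, and sites are ranked by that score. The sketch below is a minimal illustration only; the attributes, weights, and scores are invented and are not taken from the Long Island study.

      # Minimal sketch of the Weighting Summation decision rule.
      # Attribute weights and site scores are invented placeholders.
      weights = {"land_use": 0.40, "cooling_water": 0.35, "population": 0.25}

      sites = {
          "Site A": {"land_use": 0.9, "cooling_water": 0.6, "population": 0.7},
          "Site B": {"land_use": 0.5, "cooling_water": 0.8, "population": 0.9},
          "Site C": {"land_use": 0.7, "cooling_water": 0.7, "population": 0.6},
      }

      def weighted_summation(scores, weights):
          # Composite suitability = sum of weight * normalized attribute score.
          return sum(weights[a] * scores[a] for a in weights)

      ranking = sorted(sites, key=lambda s: weighted_summation(sites[s], weights), reverse=True)
      print(ranking)  # best-scoring site first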

  7. Slope stability and bearing capacity of landfills and simple on-site test methods.

    PubMed

    Yamawaki, Atsushi; Doi, Yoichi; Omine, Kiyoshi

    2017-07-01

    This study discusses strength characteristics (slope stability, bearing capacity, etc.) of waste landfills through on-site tests that were carried out at 29 locations at 19 sites in Japan and three other countries, and proposes simple methods to test and assess the mechanical strength of landfills on site. Also, the possibility of using a landfill site was investigated by a full-scale eccentric loading test. As a result, landfills containing plastics or other fibrous materials longer than about 10 cm were found to be resilient and hard to yield. An on-site full-scale test showed that no differential settlement occurred. The repose angle test proposed as a simple on-site test method has been confirmed to be a good indicator for slope stability assessment. The repose angle test suggested that landfills which have high, near-saturation water content have considerably poorer slope stability. The results of our repose angle test and the impact acceleration test were related to the internal friction angle and the cohesion, respectively. In addition, it was found that the air pore volume ratio measured by an on-site air pore volume ratio test is likely to be related to various strength parameters.

  8. Field demonstration of on-site analytical methods for TNT and RDX in ground water

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, H.; Ferguson, G.; Markos, A.

    1996-12-31

    A field demonstration was conducted to assess the performance of eight commercially-available and emerging colorimetric, immunoassay, and biosensor on-site analytical methods for explosives 2,4,6-trinitrotoluene (TNT) and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in ground water and leachate at the Umatilla Army Depot Activity, Hermiston, Oregon and US Naval Submarine Base, Bangor, Washington, Superfund sites. Ground water samples were analyzed by each of the on-site methods and results compared to laboratory analysis using high performance liquid chromatography (HPLC) with EPA SW-846 Method 8330. The commercial methods evaluated include the EnSys, Inc., TNT and RDX colorimetric test kits (EPA SW-846 Methods 8515 and 8510) with a solid phase extraction (SPE) step, the DTECH/EM Science TNT and RDX immunoassay test kits (EPA SW-846 Methods 4050 and 4051), and the Ohmicron TNT immunoassay test kit. The emerging methods tested include the antibody-based Naval Research Laboratory (NRL) Continuous Flow Immunosensor (CFI) for TNT and RDX, and the Fiber Optic Biosensor (FOB) for TNT. Accuracy of the on-site methods was evaluated using linear regression analysis and relative percent difference (RPD) comparison criteria. Over the range of conditions tested, the colorimetric methods for TNT and RDX showed the highest accuracy of the emerging methods for TNT and RDX. The colorimetric method was selected for routine ground water monitoring at the Umatilla site, and further field testing on the NRL CFI and FOB biosensors will continue at both Superfund sites.
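
    The relative percent difference (RPD) criterion mentioned above is, in its usual definition, the absolute difference between the field and laboratory results expressed as a percentage of their mean; the abstract does not give the acceptance limits that were applied.

      \mathrm{RPD} = \frac{\left|x_{\text{on-site}} - x_{\text{lab}}\right|}{\left(x_{\text{on-site}} + x_{\text{lab}}\right)/2} \times 100\%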

  9. Open-field test site

    NASA Astrophysics Data System (ADS)

    Gyoda, Koichi; Shinozuka, Takashi

    1995-06-01

    An open-field test site with measurement equipment, a turntable, antenna positioners, and measurement auxiliary equipment was remodelled at the CRL north-site. This paper introduces the configuration, specifications and characteristics of this new open-field test site. Measured 3-m and 10-m site attenuations are in good agreement with theoretical values, which means that the site is suitable for 3-m and 10-m method EMI/EMC measurements. The site is expected to be effective for antenna measurement, antenna calibration, and studies on EMI/EMC measurement methods.

  10. Determination of antenna factors using a three-antenna method at open-field test site

    NASA Astrophysics Data System (ADS)

    Masuzawa, Hiroshi; Tejima, Teruo; Harima, Katsushige; Morikawa, Takao

    1992-09-01

    Recently, NIST has used the three-antenna method for calibration of the antenna factor of an antenna used for EMI measurements. This method does not require the specially designed standard antennas which are necessary in the standard field method or the standard antenna method, and can be used at an open-field test site. This paper theoretically and experimentally examines the measurement errors of this method and evaluates the precision of the antenna-factor calibration. It is found that the main source of error is the non-ideal propagation characteristics of the test site, which should therefore be measured before the calibration. The precision of the antenna-factor calibration at the test site used in these experiments is estimated to be 0.5 dB.
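
    The core of the three-antenna method is that each pairwise measurement between antennas i and j yields, after the frequency- and site-dependent terms prescribed by the applicable standard are removed, the sum of the two antenna factors. Writing the corrected pairwise quantities (in dB) as M_ij, the three factors follow from the linear system below; the exact correction terms are not given in the abstract and depend on the standard used.

      AF_1 = \tfrac{1}{2}\,(M_{12} + M_{13} - M_{23}), \qquad AF_2 = \tfrac{1}{2}\,(M_{12} + M_{23} - M_{13}), \qquad AF_3 = \tfrac{1}{2}\,(M_{13} + M_{23} - M_{12})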

  11. Testing the efficiency of rover science protocols for robotic sample selection: A GeoHeuristic Operational Strategies Test

    NASA Astrophysics Data System (ADS)

    Yingst, R. A.; Bartley, J. K.; Chidsey, T. C.; Cohen, B. A.; Gilleaudeau, G. J.; Hynek, B. M.; Kah, L. C.; Minitti, M. E.; Williams, R. M. E.; Black, S.; Gemperline, J.; Schaufler, R.; Thomas, R. J.

    2018-05-01

    The GHOST field tests are designed to isolate and test science-driven rover operations protocols, to determine best practices. During a recent field test at a potential Mars 2020 landing site analog, we tested two Mars Science Laboratory data-acquisition and decision-making methods to assess resulting science return and sample quality: a linear method, where sites of interest are studied in the order encountered, and a "walkabout-first" method, where sites of interest are examined remotely before down-selecting to a subset of sites that are interrogated with more resource-intensive instruments. The walkabout method cost less time and fewer resources, while increasing confidence in interpretations. Contextual data critical to evaluating site geology was acquired earlier than for the linear method, and given a higher priority, which resulted in development of more mature hypotheses earlier in the analysis process. Combined, this saved time and energy in the collection of data with more limited spatial coverage. Based on these results, we suggest that the walkabout method be used where doing so would provide early context and time for the science team to develop hypotheses-critical tests; and that in gathering context, coverage may be more important than higher resolution.

  12. Non-Invasive Seismic Methods for Earthquake Site Classification Applied to Ontario Bridge Sites

    NASA Astrophysics Data System (ADS)

    Bilson Darko, A.; Molnar, S.; Sadrekarimi, A.

    2017-12-01

    How a site responds to earthquake shaking, and the damage that results, is largely influenced by the underlying ground conditions through which the seismic waves propagate. The effects of site conditions on propagating seismic waves can be predicted from measurements of the shear wave velocity (Vs) of the soil layer(s) and the impedance ratio between bedrock and soil. Currently, the seismic design of new buildings and bridges (2015 Canadian building and bridge codes) requires determination of the time-averaged shear-wave velocity of the upper 30 metres (Vs30) of a given site. In this study, two in situ Vs profiling methods, Multichannel Analysis of Surface Waves (MASW) and Ambient Vibration Array (AVA), are used to determine Vs30 at chosen bridge sites in Ontario, Canada. Both active-source (MASW) and passive-source (AVA) surface wave methods are used at each bridge site to obtain Rayleigh-wave phase velocities over a wide frequency bandwidth. The dispersion curve is jointly inverted with each site's amplification function (microtremor horizontal-to-vertical spectral ratio) to obtain shear-wave velocity profile(s). We apply our non-invasive testing at three major infrastructure projects, e.g., five bridge sites along the Rt. Hon. Herb Gray Parkway in Windsor, Ontario. Our non-invasive testing is co-located with previous invasive testing, including Standard Penetration Test (SPT), Cone Penetration Test, and downhole Vs data. Correlations between SPT blowcount and Vs are developed for the different soil types sampled at our Ontario bridge sites. A robust earthquake site classification procedure (reliable Vs30 estimates) for bridge sites across Ontario is evaluated from available combinations of invasive and non-invasive site characterization methods.
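
    The Vs30 quantity referred to above has a standard definition: the time-averaged shear-wave velocity over the top 30 m, computed from the layered velocity profile as

      V_{S30} = \frac{30\ \mathrm{m}}{\displaystyle\sum_{i=1}^{N} h_i / V_{S,i}}

    where h_i and V_{S,i} are the thickness and shear-wave velocity of the i-th layer and the N layer thicknesses sum to 30 m.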

  13. Standardization of Laboratory Methods for the PERCH Study

    PubMed Central

    Karron, Ruth A.; Morpeth, Susan C.; Bhat, Niranjan; Levine, Orin S.; Baggett, Henry C.; Brooks, W. Abdullah; Feikin, Daniel R.; Hammitt, Laura L.; Howie, Stephen R. C.; Knoll, Maria Deloria; Kotloff, Karen L.; Madhi, Shabir A.; Scott, J. Anthony G.; Thea, Donald M.; Adrian, Peter V.; Ahmed, Dilruba; Alam, Muntasir; Anderson, Trevor P.; Antonio, Martin; Baillie, Vicky L.; Dione, Michel; Endtz, Hubert P.; Gitahi, Caroline; Karani, Angela; Kwenda, Geoffrey; Maiga, Abdoul Aziz; McClellan, Jessica; Mitchell, Joanne L.; Morailane, Palesa; Mugo, Daisy; Mwaba, John; Mwansa, James; Mwarumba, Salim; Nyongesa, Sammy; Panchalingam, Sandra; Rahman, Mustafizur; Sawatwong, Pongpun; Tamboura, Boubou; Toure, Aliou; Whistler, Toni; O’Brien, Katherine L.; Murdoch, David R.

    2017-01-01

    The Pneumonia Etiology Research for Child Health study was conducted across 7 diverse research sites and relied on standardized clinical and laboratory methods for the accurate and meaningful interpretation of pneumonia etiology data. Blood, respiratory specimens, and urine were collected from children aged 1–59 months hospitalized with severe or very severe pneumonia and community controls of the same age without severe pneumonia and were tested with an extensive array of laboratory diagnostic tests. A standardized testing algorithm and standard operating procedures were applied across all study sites. Site laboratories received uniform training, equipment, and reagents for core testing methods. Standardization was further assured by routine teleconferences, in-person meetings, site monitoring visits, and internal and external quality assurance testing. Targeted confirmatory testing and testing by specialized assays were done at a central reference laboratory. PMID:28575358

  14. Laboratory and clinical evaluation of on-site urine drug testing.

    PubMed

    Beck, Olof; Carlsson, Sten; Tusic, Marinela; Olsson, Robert; Franzen, Lisa; Hulten, Peter

    2014-11-01

    Products for on-site urine drug testing offer the possibility of screening for drugs of abuse directly at the point of care. This is a well-established routine in emergency and dependency clinics, but further evaluation of performance is needed due to inherent limitations of the available products. Urine drug testing by an on-site product was compared with routine laboratory methods. First, on-site testing was performed at the laboratory in addition to the routine method. Second, the on-site testing was performed at a dependency clinic and urine samples were subsequently sent to the laboratory for additional analytical investigation. The on-site testing products did not perform at the assigned cut-off levels. The subjective reading, distinguishing between the presence of a spot (i.e. a negative test result) and no spot (a positive result), was difficult in 3.2% of the cases, and this occurred for all parameters. The tests performed more accurately in drug-negative samples (specificity 96%) but less accurately for detecting positives (sensitivity 79%). Of all incorrect results from the on-site test, the proportion of false negatives was 42%. The overall agreement between on-site and laboratory testing was 95% in the laboratory study and 98% in the clinical study. Although a high degree of agreement was observed between on-site and routine laboratory urine drug testing, the performance of on-site testing was not acceptable due to the significant number of false-negative results. The limited sensitivity of on-site testing compared to laboratory testing reduces the applicability of these tests.
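
    A minimal sketch of how the performance figures quoted above relate to a two-by-two comparison of on-site results against the laboratory reference; the counts below are placeholders, not the study's data.

      # Placeholder confusion-table counts (on-site result vs. laboratory reference).
      tp, fn = 79, 21    # lab-positive samples called positive / negative on-site
      tn, fp = 96, 4     # lab-negative samples called negative / positive on-site

      sensitivity = tp / (tp + fn)                   # ability to detect positives
      specificity = tn / (tn + fp)                   # ability to confirm negatives
      agreement = (tp + tn) / (tp + tn + fp + fn)    # overall agreement with the laboratory
      false_negative_share = fn / (fn + fp)          # share of all errors that are false negatives
      print(sensitivity, specificity, agreement, false_negative_share)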

  15. Benefits of Multiple Methods for Evaluating HIV Counseling and Testing Sites in Pennsylvania.

    ERIC Educational Resources Information Center

    Encandela, John A.; Gehl, Mary Beth; Silvestre, Anthony; Schelzel, George

    1999-01-01

    Examines results from two methods used to evaluate publicly funded human immunodeficiency virus (HIV) counseling and testing in Pennsylvania. Results of written mail surveys of all sites and interviews from a random sample of 30 sites were similar in terms of questions posed and complementary in other ways. (SLD)

  16. A comparison of methods of estimating potential evapotranspiration from climatological data in arid and subhumid environments

    USGS Publications Warehouse

    Cruff, R.W.; Thompson, T.H.

    1967-01-01

    This study compared potential evapotranspiration, computed from climatological data by each of six empirical methods, with pan evaporation adjusted to equivalent lake evaporation by regional coefficients. The six methods tested were the Thornthwaite, U.S. Weather Bureau (a modification of the Penman method), Lowry-Johnson, Blaney-Criddle, Lane, and Hamon methods. The test was limited to 25 sites in the arid and subhumid parts of Arizona, California, and Nevada, where pan evaporation and concurrent climatological data were available. However, some of the sites lacked complete climatological data for the application of all six methods. Average values of adjusted pan evaporation and computed potential evapotranspiration were compared for two periods: the calendar year and the 6-month period from May 1 through October 31. The 25 sites sampled a wide range of climatic conditions. Ten sites (group 1) were in a highly arid environment and four (group 2) were in an arid environment that was modified by extensive irrigation. The remaining 11 sites (group 3) were in a subhumid environment. Only the Weather Bureau method gave estimates of potential evapotranspiration that closely agreed with the adjusted pan evaporation at all sites where the method was used. However, lack of climatological data restricted the use of the Weather Bureau method to seven sites. Results obtained by use of the Thornthwaite, Lowry-Johnson, and Hamon methods were consistently low. Results obtained by use of the Lane method agreed with adjusted pan evaporation at the group 1 sites but were consistently high at the group 2 and 3 sites. During the analysis it became apparent that adjusted pan evaporation in an arid environment (group 1 sites) was a spurious standard for evaluating the reliability of the methods that were tested. Group 1 data were accordingly not considered when making conclusions as to which of the six methods tested was best. The results of this study for group 2 and 3 data indicated that the Blaney-Criddle method, which uses climatological data that can be readily obtained or deduced, was the most practical of the six methods for estimating potential evapotranspiration. At all 15 sites in the two environments, potential evapotranspiration computed by the Blaney-Criddle method checked the adjusted pan evaporation within ±22 percent. This percentage range is generally considered to be the range of reliability for estimating lake evaporation from evaporation pans.
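
    As an illustration of the kind of calculation involved, the sketch below implements one common (SCS) form of the Blaney-Criddle consumptive-use estimate, u = k·f with f = t·p/100 and temperature in degrees Fahrenheit; the coefficients and any adjustments actually used in the 1967 study are not reproduced here.

      # Hedged sketch of a common (SCS) form of the Blaney-Criddle method.
      def blaney_criddle_monthly(t_mean_f, p_daylight_pct, k):
          """Monthly consumptive use, in inches.

          t_mean_f       -- mean monthly air temperature, degrees Fahrenheit
          p_daylight_pct -- monthly percentage of annual daytime hours
          k              -- empirical consumptive-use (crop) coefficient
          """
          f = t_mean_f * p_daylight_pct / 100.0   # monthly consumptive-use factor
          return k * f

      # Example with placeholder inputs for a warm summer month.
      print(blaney_criddle_monthly(t_mean_f=75.0, p_daylight_pct=9.8, k=0.8))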

  17. Use of an analog site near Raymond, California, to develop equipment and methods for characterizing a potential high-level, nuclear waste repository site at Yucca Mountain, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Umari, A.M.J.; Geldon, A.; Patterson, G.

    1994-12-31

    Yucca Mountain, Nevada, currently is being investigated by the U.S. Geological Survey as a potential site for a high-level nuclear waste repository. Planned hydraulic-stress and tracer tests in fractured, tuffaceous rocks below the water table at Yucca Mountain will require work at depths in excess of 1,300 feet. To facilitate prototype testing of equipment and methods to be used in aquifer tests at Yucca Mountain, an analog site was selected in the foothills of the Sierra Nevada near Raymond, California. Two of nine 250- to 300-feet deep wells drilled into fractured, granitic rocks at the Raymond site have been instrumented with packers, pressure transducers, and other equipment that will be used at Yucca Mountain. Aquifer tests conducted at the Raymond site to date have demonstrated a need to modify some of the equipment and methods conceived for use at Yucca Mountain.

  18. Wellskins and slug tests: where's the bias?

    NASA Astrophysics Data System (ADS)

    Rovey, C. W.; Niemann, W. L.

    2001-03-01

    Pumping tests in an outwash sand at the Camp Dodge Site give hydraulic conductivities (K) approximately seven times greater than conventional slug tests in the same wells. To determine if this difference is caused by skin bias, we slug tested three sets of wells, each in a progressively greater stage of development. Results were analyzed with both the conventional Bouwer-Rice method and the deconvolution method, which quantifies the skin and eliminates its effects. In 12 undeveloped wells the average skin is +4.0, causing underestimation of conventional slug-test K (Bouwer-Rice method) by approximately a factor of 2 relative to the deconvolution method. In seven nominally developed wells the skin averages just +0.34, and the Bouwer-Rice method gives K within 10% of that calculated with the deconvolution method. The Bouwer-Rice K in this group is also within 5% of that measured by natural-gradient tracer tests at the same site. In 12 intensely developed wells the average skin is <-0.82, consistent with an average skin of -1.7 measured during single-well pumping tests. At this site the maximum possible skin bias is much smaller than the difference between slug and pumping-test Ks. Moreover, the difference in K persists even in intensely developed wells with negative skins. Therefore, positive wellskins do not cause the difference in K between pumping and slug tests at this site.
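
    For reference, the conventional Bouwer-Rice analysis mentioned above estimates hydraulic conductivity from the head-recovery record as

      K = \frac{r_c^{2}\,\ln(R_e/r_w)}{2\,L_e}\;\frac{1}{t}\,\ln\!\left(\frac{y_0}{y_t}\right)

    where r_c is the casing radius, r_w the well radius, R_e the effective radius of influence, L_e the screen length, and y_0, y_t the head displacements at time zero and time t. The skin term quantified by the deconvolution method is not represented in this expression.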

  19. 77 FR 14319 - Unmanned Aircraft System Test Sites

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-09

    ... DoD and NASA. A project at a test range (a defined geographic area where research and development are..., the FAA believes that the new test sites need to include focal points to ensure that research is... and experience in conducting UAS operations and research. Methods that test site operators can use for...

  20. A relation between landsat digital numbers, surface reflectance, and the cosine of the solar zenith angle

    USGS Publications Warehouse

    Kowalik, William S.; Marsh, Stuart E.; Lyon, Ronald J. P.

    1982-01-01

    A method for estimating the reflectance of ground sites from satellite radiance data is proposed and tested. The method uses the known ground reflectance from several sites and satellite data gathered over a wide range of solar zenith angles. The method was tested on each of 10 different Landsat images using 10 small sites in the Walker Lake, Nevada area. Plots of raw Landsat digital numbers (DNs) versus the cosine of the solar zenith angle (cos Z) for the the test areas are linear, and the average correlation coefficients of the data for Landsat bands 4, 5, 6, and 7 are 0.94, 0.93, 0.94, and 0.94, respectively. Ground reflectance values for the 10 sites are proportional to the slope of the DN versus cos Z relation at each site. The slope of the DN versus cos Z relation for seven additional sites in Nevada and California were used to estimate the ground reflectances of those sites. The estimates for nearby sites are in error by an average of 1.2% and more distant sites are in error by 5.1%. The method can successfully estimate the reflectance of sites outside the original scene, but extrapolation of the reflectance estimation equations to other areas may violate assumptions of atmospheric homogeneity.
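
    A minimal sketch of the regression described above: for each site, the digital numbers from the ten acquisitions are regressed against cos Z, and the slope is taken as proportional to ground reflectance, so a site of known reflectance calibrates the proportionality constant. All numbers below are placeholders.

      import numpy as np

      cos_z = np.array([0.55, 0.62, 0.70, 0.78, 0.85])        # cosine of solar zenith angle
      dn_known = np.array([41.0, 46.0, 52.0, 58.0, 63.0])     # DNs of a site with known reflectance
      rho_known = 0.30                                        # its measured ground reflectance

      k = rho_known / np.polyfit(cos_z, dn_known, 1)[0]       # reflectance per unit slope

      dn_unknown = np.array([28.0, 32.0, 36.0, 40.0, 43.0])   # DNs of an uncalibrated site
      rho_estimated = k * np.polyfit(cos_z, dn_unknown, 1)[0]
      print(rho_estimated)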

  1. Easier operation and similar power of 10 g monofilament test for screening diabetic peripheral neuropathy.

    PubMed

    Zhang, Qi; Yi, Na; Liu, Siying; Zheng, Hangping; Qiao, Xiaona; Xiong, Qian; Liu, Xiaoxia; Zhang, Shuo; Wen, Jie; Ye, Hongying; Zhou, Linuo; Li, Yiming; Hu, Renming; Lu, Bin

    2018-01-01

    Objective: The 10 g Semmes-Weinstein monofilament evaluation (SWME) of 4 sites on each foot is recommended for distal symmetric polyneuropathy screening and diagnosis. A similar method has been proposed to diagnose 'high-risk' (for ulceration) feet, using 3 sites per foot. This study compared the effectiveness of SWME for testing 3, 4 and 10 sites per foot to identify patients with diabetic neuropathy. Methods: We included 3497 subjects in a SWME of 10 sites; records from the 10-site SWME were used for a SWME of 3 and 4 sites. Neuropathy symptom scores and neuropathy deficit scores were evaluated to identify patients with diabetic peripheral neuropathy. Results: The sensitivities of the 10 g SWME for 3, 4 and 10 sites were 17.8%, 19.0% and 22.4%, respectively. The Kappa coefficients for the SWME tests of 3, 4 and 10 sites were high (range: 0.78-0.93). Conclusions: There were no significant differences in the effectiveness of 3-, 4- and 10-site SWME testing for diabetic peripheral neuropathy screening. SWME testing of 3 sites on each foot may be sufficient to screen for diabetic neuropathy.
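
    A small sketch of the two statistics reported above: sensitivity of a monofilament protocol against the clinical reference, and Cohen's kappa for agreement between protocols. The binary vectors are invented placeholders, not the study's records.

      def sensitivity(reference, screen):
          tp = sum(1 for r, s in zip(reference, screen) if r == 1 and s == 1)
          fn = sum(1 for r, s in zip(reference, screen) if r == 1 and s == 0)
          return tp / (tp + fn)

      def cohens_kappa(a, b):
          n = len(a)
          observed = sum(1 for x, y in zip(a, b) if x == y) / n
          pa, pb = sum(a) / n, sum(b) / n
          expected = pa * pb + (1 - pa) * (1 - pb)   # chance agreement for binary outcomes
          return (observed - expected) / (1 - expected)

      reference = [1, 1, 1, 0, 0, 0, 1, 0]   # neuropathy by symptom/deficit scores
      swme_10   = [1, 0, 0, 0, 0, 0, 1, 0]   # 10-site SWME outcome
      swme_3    = [1, 0, 0, 0, 0, 0, 0, 0]   # 3-site SWME outcome
      print(sensitivity(reference, swme_10), cohens_kappa(swme_10, swme_3))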

  2. ENVIRONMENTAL METHODS TESTING SITE PROJECT: DATA MANAGEMENT PROCEDURES PLAN

    EPA Science Inventory

    The Environmental Methods Testing Site (EMTS) Data Management Procedures Plan identifies the computer hardware and software resources used in the EMTS project. It identifies the major software packages that are available for use by principal investigators for the analysis of data...

  3. A Profilometry-Based Dentifrice Abrasion Method for V8 Brushing Machines Part III: Multi-Laboratory Validation Testing of RDA-PE.

    PubMed

    Schneiderman, Eva; Colón, Ellen L; White, Donald J; Schemehorn, Bruce; Ganovsky, Tara; Haider, Amir; Garcia-Godoy, Franklin; Morrow, Brian R; Srimaneepong, Viritpon; Chumprasert, Sujin

    2017-09-01

    We have previously reported on progress toward the refinement of profilometry-based abrasivity testing of dentifrices using a V8 brushing machine and tactile or optical measurement of dentin wear. The general application of this technique may be advanced by demonstration of successful inter-laboratory confirmation of the method. The objective of this study was to explore the capability of different laboratories in the assessment of dentifrice abrasivity using a profilometry-based evaluation technique developed in our Mason laboratories. In addition, we wanted to assess the interchangeability of human and bovine specimens. Participating laboratories were instructed in methods associated with Radioactive Dentin Abrasivity-Profilometry Equivalent (RDA-PE) evaluation, including site visits to discuss critical elements of specimen preparation, masking, profilometry scanning, and procedures. Laboratories were likewise instructed on the requirement for demonstration of proportional linearity as a key condition for validation of the technique. Laboratories were provided with four test dentifrices, blinded for testing, with a broad range of abrasivity. In each laboratory, a calibration curve was developed for varying V8 brushing strokes (0, 4,000, and 10,000 strokes) with the ISO abrasive standard. Proportional linearity was determined as the ratio of standard abrasion mean depths created with 4,000 and 10,000 strokes (a 2.5-fold difference). The criterion for successful calibration within the method (established in our Mason laboratory) was set at proportional linearity = 2.5 ± 0.3. RDA-PE was compared to Radiotracer RDA for the four test dentifrices, with the latter obtained by averages from three independent Radiotracer RDA sites. Individual laboratories and their results were compared by 1) proportional linearity and 2) acquired RDA-PE values for test pastes. Five sites participated in the study. One site did not pass proportional linearity objectives. Data for this site are not reported at the request of the researchers. Three of the remaining four sites reported herein tested human dentin and all three met proportional linearity objectives for human dentin. Three of four sites participated in testing bovine dentin and all three met the proportional linearity objectives for bovine dentin. RDA-PE values for test dentifrices were similar between sites. All four sites that met the proportional linearity requirement successfully identified the dentifrice formulated above the industry standard 250 RDA (as RDA-PE). The profilometry method showed at least as good reproducibility and differentiation as Radiotracer assessments. It was demonstrated that human and bovine specimens could be used interchangeably. The standardized RDA-PE method was reproduced in multiple laboratories in this inter-laboratory study. Evidence supports that this method is a suitable technique for ISO method 11609 Annex B.
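
    A minimal sketch of the proportional-linearity check described above: the ratio of the mean abrasion depths produced by 10,000 and 4,000 strokes of the ISO reference abrasive should fall within 2.5 ± 0.3. The depth values below are placeholders.

      depths_4000 = [1.9, 2.1, 2.0]     # mean abrasion depths after 4,000 strokes (placeholder values)
      depths_10000 = [4.9, 5.2, 5.1]    # mean abrasion depths after 10,000 strokes

      ratio = (sum(depths_10000) / len(depths_10000)) / (sum(depths_4000) / len(depths_4000))
      print(ratio, abs(ratio - 2.5) <= 0.3)   # True -> proportional linearity criterion met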

  4. Evaluation of Two Statistical Methods Provides Insights into the Complex Patterns of Alternative Polyadenylation Site Switching

    PubMed Central

    Li, Jie; Li, Rui; You, Leiming; Xu, Anlong; Fu, Yonggui; Huang, Shengfeng

    2015-01-01

    Switching between different alternative polyadenylation (APA) sites plays an important role in the fine tuning of gene expression. New technologies for the execution of 3’-end enriched RNA-seq allow genome-wide detection of the genes that exhibit significant APA site switching between different samples. Here, we show that the independence test gives better results than the linear trend test in detecting APA site-switching events. Further examination suggests that the discrepancy between these two statistical methods arises from complex APA site-switching events that cannot be represented by a simple change of average 3’-UTR length. In theory, the linear trend test is only effective in detecting these simple changes. We classify the switching events into four switching patterns: two simple patterns (3’-UTR shortening and lengthening) and two complex patterns. By comparing the results of the two statistical methods, we show that complex patterns account for 1/4 of all observed switching events that happen between normal and cancerous human breast cell lines. Because simple and complex switching patterns may convey different biological meanings, they merit separate study. We therefore propose to combine both the independence test and the linear trend test in practice. First, the independence test should be used to detect APA site switching; second, the linear trend test should be invoked to identify simple switching events; and third, those complex switching events that pass independence testing but fail linear trend testing can be identified. PMID:25875641
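
    A simplified sketch of the two tests being compared, applied to read counts over the APA sites of one gene in two samples: a chi-square independence test, which responds to any change in site usage, and a score-based linear trend statistic, which responds only to a directional shift toward proximal or distal sites (i.e., a change in average 3'-UTR length). The counts are invented, and the trend statistic is a simplified stand-in for the test used in the paper.

      import numpy as np
      from scipy.stats import chi2_contingency, norm

      counts = np.array([[120, 80, 40],    # sample 1: reads at proximal -> distal APA sites
                         [60, 90, 110]])   # sample 2

      # Independence test: sensitive to any switching pattern.
      _, p_independence, _, _ = chi2_contingency(counts)

      # Simplified linear trend statistic: difference in mean site score between samples.
      scores = np.array([0.0, 1.0, 2.0])              # ordered positions of the APA sites
      n1, n2 = counts.sum(axis=1)
      mean_diff = scores @ (counts[0] / n1 - counts[1] / n2)
      pooled = counts.sum(axis=0) / counts.sum()
      var = scores**2 @ pooled - (scores @ pooled) ** 2
      z = mean_diff / np.sqrt(var * (1 / n1 + 1 / n2))
      p_trend = 2 * norm.sf(abs(z))

      print(p_independence, p_trend)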

  5. FUELS IN SOIL TEST KIT: FIELD USE OF DIESEL DOG SOIL TEST KITS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Susan S. Sorini; John F. Schabron; Joseph F. Rovani, Jr.

    Western Research Institute (WRI) has developed a new commercial product ready for technology transfer, the Diesel Dog® Portable Soil Test Kit, for performing analysis of fuel-contaminated soils in the field. The technology consists of a method developed by WRI (U.S. Patents 5,561,065 and 5,976,883) and hardware developed by WRI that allows the method to be performed in the field (patent pending). The method is very simple and does not require the use of highly toxic reagents. The aromatic components in a soil extract are measured by absorption at 254 nm with a field-portable photometer. WRI added significant value to the technology by taking the method through the American Society for Testing and Materials (ASTM) approval and validation processes. The method is designated as ASTM Method D 5831-96, Standard Test Method for Screening Fuels in Soils. This ASTM designation allows the method to be used for federal compliance activities. In June 2001, the Diesel Dog technology won an American Chemical Society Regional Industrial Innovations Award. To gain field experience with the new technology, Diesel Dog kits have been used for a variety of site evaluation and cleanup activities. Information gained from these activities has led to improvements in hardware configurations and additional insight into correlating Diesel Dog results with results from laboratory methods. The Wyoming Department of Environmental Quality (DEQ) used Diesel Dog Soil Test Kits to guide cleanups at a variety of sites throughout the state. ENSR, of Acton, Massachusetts, used a Diesel Dog Portable Soil Test Kit to evaluate sites in the Virgin Islands and Georgia. ChemTrack and the U.S. Army Corps of Engineers successfully used a test kit to guide excavation at an abandoned FAA fuel-contaminated site near Fairbanks, Alaska. Barenco, Inc. is using a Diesel Dog Portable Soil Test Kit for site evaluations in Canada. A small spill of diesel fuel was cleaned up in Laramie, Wyoming using a Diesel Dog Soil Test Kit.

  6. Comparison of on-site field measured inorganic arsenic in rice with laboratory measurements using a field deployable method: Method validation.

    PubMed

    Mlangeni, Angstone Thembachako; Vecchi, Valeria; Norton, Gareth J; Raab, Andrea; Krupp, Eva M; Feldmann, Joerg

    2018-10-15

    A commercial arsenic field kit designed to measure inorganic arsenic (iAs) in water was modified into a field deployable method (FDM) to measure iAs in rice. While the method has been validated to give precise and accurate results in the laboratory, its on-site field performance has not been evaluated. This study was designed to test the method on-site in Malawi in order to evaluate its accuracy and precision in determination of iAs on-site by comparing with a validated reference method and giving original data on inorganic arsenic in Malawian rice and rice-based products. The method was validated by using the established laboratory-based HPLC-ICPMS. Statistical tests indicated there were no significant differences between on-site and laboratory iAs measurements determined using the FDM (p = 0.263, α = 0.05) and between on-site measurements and measurements determined using HPLC-ICP-MS (p = 0.299, α = 0.05). This method allows quick (within 1 h) and efficient screening of rice containing iAs concentrations on-site. Copyright © 2018 Elsevier Ltd. All rights reserved.
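
    The abstract does not name the statistical test used; a paired comparison on the same rice samples is one plausible reading, sketched below with placeholder concentrations (mg/kg).

      from scipy.stats import ttest_rel

      ias_onsite_fdm = [0.041, 0.055, 0.062, 0.038, 0.049]   # on-site FDM results
      ias_hplc_icpms = [0.043, 0.052, 0.065, 0.036, 0.050]   # laboratory HPLC-ICP-MS results

      t_stat, p_value = ttest_rel(ias_onsite_fdm, ias_hplc_icpms)
      print(p_value, p_value > 0.05)   # True -> no significant difference at alpha = 0.05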

  7. 75 FR 7593 - Recent Postings of Broadly Applicable Alternative Test Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-22

    ... Applicable Alternative Test Methods AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability. SUMMARY: This notice announces the broadly applicable alternative test method approval decisions... electronic copy of each alternative test method approval document is available on EPA's Web site at http...

  8. A critical issue in model-based inference for studying trait-based community assembly and a solution.

    PubMed

    Ter Braak, Cajo J F; Peres-Neto, Pedro; Dray, Stéphane

    2017-01-01

    Statistical testing of trait-environment association from data is a challenge as there is no common unit of observation: the trait is observed on species, the environment on sites and the mediating abundance on species-site combinations. A number of correlation-based methods, such as the community weighted trait means method (CWM), the fourth-corner correlation method and the multivariate method RLQ, have been proposed to estimate such trait-environment associations. In these methods, valid statistical testing proceeds by performing two separate resampling tests, one site-based and the other species-based, and by assessing significance by the larger of the two p-values (the pmax test). Recently, regression-based methods using generalized linear models (GLM) have been proposed as a promising alternative with statistical inference via site-based resampling. We investigated the performance of this new approach along with approaches that mimicked the pmax test using GLM instead of fourth-corner. By simulation using models with additional random variation in the species response to the environment, the site-based resampling tests using GLM are shown to have severely inflated type I error, of up to 90%, when the nominal level is set at 5%. In addition, predictive modelling of such data using site-based cross-validation very often identified trait-environment interactions that had no predictive value. The problem that we identify is not an "omitted variable bias" problem as it occurs even when the additional random variation is independent of the observed trait and environment data. Instead, it is a problem of ignoring a random effect. In the same simulations, the GLM-based pmax test controlled the type I error in all models proposed so far in this context, but still gave slightly inflated error in more complex models that included both missing (but important) traits and missing (but important) environmental variables. For screening the importance of single trait-environment combinations, the fourth-corner test is shown to give almost the same results as the GLM-based tests in far less computing time.
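
    A simplified sketch of the pmax idea discussed above: a trait-environment association statistic is tested twice, once by permuting the site-environment link and once by permuting the species-trait link, and significance is judged by the larger of the two p-values. The statistic here is a fourth-corner-style weighted product rather than a fitted GLM coefficient, and all data are random placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      n_sites, n_species = 30, 20
      L = rng.poisson(2.0, size=(n_sites, n_species))   # abundance table (sites x species)
      env = rng.normal(size=n_sites)                    # one environmental variable
      trait = rng.normal(size=n_species)                # one species trait

      def assoc_stat(L, env, trait):
          # Fourth-corner-style statistic: abundance-weighted product of centred
          # environment and trait values.
          w = L / L.sum()
          e = env - np.average(env, weights=w.sum(axis=1))
          t = trait - np.average(trait, weights=w.sum(axis=0))
          return float(e @ w @ t)

      observed = assoc_stat(L, env, trait)

      def permutation_p(permute_sites, n_perm=999):
          null = np.empty(n_perm)
          for i in range(n_perm):
              if permute_sites:      # break the site-environment link
                  null[i] = assoc_stat(L, env[rng.permutation(n_sites)], trait)
              else:                  # break the species-trait link
                  null[i] = assoc_stat(L, env, trait[rng.permutation(n_species)])
          return (np.sum(np.abs(null) >= abs(observed)) + 1) / (n_perm + 1)

      p_sites = permutation_p(permute_sites=True)
      p_species = permutation_p(permute_sites=False)
      print(p_sites, p_species, max(p_sites, p_species))   # the pmax decision rule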

  9. EPA flow reference method testing and analysis: Data report -- Pennsylvania Electric Company, G.P.U. Genco Homer City Station. Volume 1: Test description and appendix A (data distribution package)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-09-01

    This report describes the test site, equipment, and procedures and presents the data obtained during field testing at G.P.U. Genco Homer City Station, August 19-24, 1997. This was the third of three field tests that the US Environmental Protection Agency (EPA) conducted in 1997 as part of a major study to evaluate potential improvements to Method 3, EPA's test method for measuring flue gas volumetric flow in stacks. The report also includes a Data Distribution Package, the official, complete repository of the results obtained at the test site.

  10. Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A

    NASA Technical Reports Server (NTRS)

    Perez, Reinaldo J.

    1990-01-01

    A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the suggested testing methods used for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Suggestions on present analysis and test methods are introduced to harmonize and bring the analysis and testing tools in MIL-STD-1541A into closer agreement. It is suggested that test procedures in MIL-STD-1541A must be improved by providing alternatives to the present use of shielded enclosures as the primary site for such tests. In addition, the alternate use of anechoic chambers and open field test sites must be considered.

  11. Integration of Biomass Harvesting and Site Preparation

    Treesearch

    Bryce J. Stokes; William F. Watson

    1986-01-01

    This study was conducted to assess the costs of various site preparation methods with various levels of harvesting. Site impacts, soil compaction, and disturbance were examined. Three harvesting methods were evaluated in pine pulpwood plantation and pine sawtimber stands. The harvesting methods tested were (1) conventional - harvesting all roundwood, (2) two-pass - first...

  12. Influence of mycorrhizal source and seeding methods on native grass species grown in soils from a disturbed site

    Treesearch

    Todd R. Caplan; Heather A. Pratt; Samuel R. Loftin

    1999-01-01

    Mycorrhizal fungi are crucial elements in native plant communities and restoring these fungi to disturbed sites is known to improve revegetation success. We tested the seedball method of plant dispersal for restoration of plants and mycorrhizal fungi to disturbed ecosystems. We tested the seedball method with a native mycorrhizal fungi inoculum, and a commercial...

  13. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  14. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  15. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  16. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  17. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  18. Protein-protein interaction site predictions with minimum covariance determinant and Mahalanobis distance.

    PubMed

    Qiu, Zhijun; Zhou, Bo; Yuan, Jiangfeng

    2017-11-21

    Protein-protein interaction site (PPIS) prediction must deal with the diversity of interaction sites, which limits prediction accuracy. Use of proteins with unknown or unidentified interactions can also lead to missing interfaces. Such data errors are often brought into the training dataset. In response to these two problems, we used the minimum covariance determinant (MCD) method to refine the training data to build a predictor with better performance, utilizing its ability to remove outliers. In order to predict test data in practice, a method based on Mahalanobis distance was devised to select proper test data as input for the predictor. With leave-one-out validation and an independent test, after the Mahalanobis distance screening, our method achieved higher performance according to the Matthews correlation coefficient (MCC), although only a part of the test data could be predicted. These results indicate that data refinement is an efficient approach to improve protein-protein interaction site prediction. With further optimization, it should be possible to develop predictors with better performance and a wider range of application. Copyright © 2017 Elsevier Ltd. All rights reserved.
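
    A minimal sketch, under stated assumptions, of the two steps described above: the training set is refined with the Minimum Covariance Determinant estimator by dropping feature vectors with large robust Mahalanobis distances, and only test samples that fall within the same distance cutoff are passed to the predictor. The feature data are random placeholders and the chi-square cutoff is an arbitrary assumption.

      import numpy as np
      from scipy.stats import chi2
      from sklearn.covariance import MinCovDet

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(200, 5))   # training feature vectors
      X_test = rng.normal(size=(50, 5))     # candidate test feature vectors

      mcd = MinCovDet(random_state=0).fit(X_train)

      # 1) Refine training data: drop points with large robust (squared) Mahalanobis distance.
      d2_train = mcd.mahalanobis(X_train)
      cutoff = chi2.ppf(0.975, df=X_train.shape[1])
      X_train_refined = X_train[d2_train <= cutoff]

      # 2) Select only test points that lie inside the same distance cutoff.
      d2_test = mcd.mahalanobis(X_test)
      X_test_selected = X_test[d2_test <= cutoff]
      print(X_train_refined.shape, X_test_selected.shape)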

  19. Prediction of lysine ubiquitylation with ensemble classifier and feature selection.

    PubMed

    Zhao, Xiaowei; Li, Xiangtao; Ma, Zhiqiang; Yin, Minghao

    2011-01-01

    Ubiquitylation is an important process of post-translational modification. Correct identification of protein lysine ubiquitylation sites is of fundamental importance to understand the molecular mechanism of lysine ubiquitylation in biological systems. This paper develops a novel computational method to effectively identify the lysine ubiquitylation sites based on the ensemble approach. In the proposed method, 468 ubiquitylation sites from 323 proteins retrieved from the Swiss-Prot database were encoded into feature vectors by using four kinds of protein sequences information. An effective feature selection method was then applied to extract informative feature subsets. After different feature subsets were obtained by setting different starting points in the search procedure, they were used to train multiple random forests classifiers and then aggregated into a consensus classifier by majority voting. Evaluated by jackknife tests and independent tests respectively, the accuracy of the proposed predictor reached 76.82% for the training dataset and 79.16% for the test dataset, indicating that this predictor is a useful tool to predict lysine ubiquitylation sites. Furthermore, site-specific feature analysis was performed and it was shown that ubiquitylation is intimately correlated with the features of its surrounding sites in addition to features derived from the lysine site itself. The feature selection method is available upon request.
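
    A small sketch of the ensemble scheme described above: several random forests, each trained on a different selected feature subset, are combined by majority voting. The data, feature encoding, and subsets are random placeholders rather than the paper's Swiss-Prot-derived features.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(468, 60))          # encoded sequence windows (placeholder features)
      y = rng.integers(0, 2, size=468)        # 1 = ubiquitylation site, 0 = not

      feature_subsets = [rng.choice(60, size=25, replace=False) for _ in range(5)]
      forests = [RandomForestClassifier(n_estimators=200, random_state=i).fit(X[:, cols], y)
                 for i, cols in enumerate(feature_subsets)]

      def predict_consensus(X_new):
          # Each forest votes with its own feature subset; majority wins.
          votes = np.stack([f.predict(X_new[:, cols]) for f, cols in zip(forests, feature_subsets)])
          return (votes.sum(axis=0) > len(forests) / 2).astype(int)

      print(predict_consensus(X[:10]))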

  20. On-orbit characterization of hyperspectral imagers

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne- and satellite-based sensors. Ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This dissertation presents a method for determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on a multispectral sensor, Moderate-resolution Imaging Spectroradiometer (MODIS), as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. A method to predict hyperspectral surface reflectance using a combination of MODIS data and spectral shape information is developed and applied for the characterization of Hyperion. Spectral shape information is based on RSG's historical in situ data for the Railroad Valley test site and spectral library data for the Libyan test site. Average atmospheric parameters, also based on historical measurements, are used in reflectance prediction and transfer to space. Results of several cross-calibration scenarios that differ in image acquisition coincidence, test site, and reference sensor are found for the characterization of Hyperion. These are compared with results from the reflectance-based approach of vicarious calibration, a well-documented method developed by the RSG that serves as a baseline for calibration performance for the cross-calibration method developed here. Cross-calibration provides results that are within 2% of the reflectance-based results in most spectral regions. Larger disagreements exist for shorter wavelengths studied in this work as well as in spectral areas that experience absorption by the atmosphere.
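
    A hedged sketch of the reflectance-prediction step described above: a historical hyperspectral reflectance shape for the test site is scaled so that its average over a MODIS band reproduces the MODIS-derived reflectance, yielding a predicted hyperspectral reflectance that can be compared band-by-band with Hyperion. The spectral shape, band limits, and reflectance value are placeholders.

      import numpy as np

      wavelengths = np.arange(400, 2501, 10)                  # nm
      shape = 0.20 + 0.0001 * (wavelengths - 400)             # placeholder historical spectral shape

      in_band = (wavelengths >= 620) & (wavelengths <= 670)   # e.g. a red MODIS band
      rho_modis = 0.31                                        # MODIS-derived surface reflectance

      scale = rho_modis / shape[in_band].mean()               # tie the shape to the MODIS value
      rho_hyperspectral = scale * shape                       # predicted reflectance at all wavelengths
      print(rho_hyperspectral[in_band].mean())                # reproduces rho_modis by construction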

  1. Characterization studies and indicated remediation methods for plutonium contaminated soils at the Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murarik, T.M.; Wenstrand, T.K.; Rogers, L.A.

    An initial soil characterization study was conducted to help identify possible remediation methods to remove plutonium from the Nevada Test Site and Tonapah Test Range surface soils. Results from soil samples collected across various isopleths at five sites indicate that the size-fraction distribution patterns of plutonium remain similar to findings from the Nevada Applied Ecology Group (NAEG) (1970s). The plutonium remains in the upper 10--15 cm of soils, as indicated in previous studies. Distribution of fine particles "downwind" of ground zero at each site is suggested. Whether this pattern was established immediately after each explosion or this resulted from post-shot wind movement of deposited material is unclear. Several possible soil treatment scenarios are presented. Removal of plutonium from certain size fractions of the soils would alleviate the sites of much of the plutonium burden. However, the nature of association of plutonium with soil components will determine which remediation methods will most likely succeed.

  3. Rock-Magnetic Method for Post Nuclear Detonation Diagnostics

    NASA Astrophysics Data System (ADS)

    Englert, J.; Petrosky, J.; Bailey, W.; Watts, D. R.; Tauxe, L.; Heger, A. S.

    2011-12-01

    A magnetic signature characteristic of a Nuclear Electromagnetic Pulse (NEMP) may still be detectable near the sites of atmospheric nuclear tests conducted at what is now the Nevada National Security Site. This signature is due to a secondary magnetization component of the natural remanent magnetization of material containing traces of ferromagnetic particles that have been exposed to a strong pulse of magnetic field. We apply a rock-magnetic method introduced by Verrier et al. (2002), and tested on samples exposed to artificial lightning, to samples of rock and building materials (e.g. bricks, concrete) retrieved from several above ground nuclear test sites. The results of magnetization measurements are compared to NEMP simulations and historic test measurements.

  4. Multiple well-shutdown tests and site-scale flow simulation in fractured rocks

    USGS Publications Warehouse

    Tiedeman, Claire; Lacombe, Pierre J.; Goode, Daniel J.

    2010-01-01

    A new method was developed for conducting aquifer tests in fractured-rock flow systems that have a pump-and-treat (P&T) operation for containing and removing groundwater contaminants. The method involves temporary shutdown of individual pumps in wells of the P&T system. Conducting aquifer tests in this manner has several advantages, including (1) no additional contaminated water is withdrawn, and (2) hydraulic containment of contaminants remains largely intact because pumping continues at most wells. The well-shutdown test method was applied at the former Naval Air Warfare Center (NAWC), West Trenton, New Jersey, where a P&T operation is designed to contain and remove trichloroethene and its daughter products in the dipping fractured sedimentary rocks underlying the site. The detailed site-scale subsurface geologic stratigraphy, a three-dimensional MODFLOW model, and inverse methods in UCODE_2005 were used to analyze the shutdown tests. In the model, a deterministic method was used for representing the highly heterogeneous hydraulic conductivity distribution and simulations were conducted using an equivalent porous media method. This approach was very successful for simulating the shutdown tests, contrary to a common perception that flow in fractured rocks must be simulated using a stochastic or discrete fracture representation of heterogeneity. Use of inverse methods to simultaneously calibrate the model to the multiple shutdown tests was integral to the effectiveness of the approach.

  5. A likelihood ratio test for evolutionary rate shifts and functional divergence among proteins

    PubMed Central

    Knudsen, Bjarne; Miyamoto, Michael M.

    2001-01-01

    Changes in protein function can lead to changes in the selection acting on specific residues. This can often be detected as evolutionary rate changes at the sites in question. A maximum-likelihood method for detecting evolutionary rate shifts at specific protein positions is presented. The method determines significance values of the rate differences to give a sound statistical foundation for the conclusions drawn from the analyses. A statistical test for detecting slowly evolving sites is also described. The methods are applied to a set of Myc proteins for the identification of both conserved sites and those with changing evolutionary rates. Those positions with conserved and changing rates are related to the structures and functions of their proteins. The results are compared with an earlier Bayesian method, thereby highlighting the advantages of the new likelihood ratio tests. PMID:11734650
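
    The likelihood ratio test underlying the method has the standard asymptotic form shown below; the paper's exact rate parameterization is not given in the abstract.

      \Lambda = 2\,\bigl(\ln L_{1} - \ln L_{0}\bigr) \;\sim\; \chi^{2}_{k}

    where L_0 is the maximized likelihood under the null model (no rate shift at the site), L_1 is the maximized likelihood under the alternative (a rate shift), and k is the difference in the number of free parameters.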

  6. 77 FR 73286 - Codification of Animal Testing Policy

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-10

    ... to post the test method on the animal testing Web site. In the final statement of policy, we refer to... case-by-case basis and, upon review, determine whether to post the test method on the animal testing... on a case-by- case basis and, upon review, determine whether to post the test method on the animal...

  7. Usability Evaluation of Public Web Mapping Sites

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2014-04-01

    Web mapping sites are interactive maps that are accessed via Webpages. With the rapid development of the Internet and the Geographic Information System (GIS) field, public web mapping sites have become familiar to most people. Nowadays, people use these web mapping sites for various reasons, as an increasing number of maps and related map services are freely available to end users. Thus, the growing number of users has led to more usability studies. Usability Engineering (UE), for instance, is an approach for analyzing and improving the usability of websites through examining and evaluating an interface. In this research, the UE method was employed to explore usability problems of four public web mapping sites, analyze the problems quantitatively, and provide guidelines for future design based on the test results. First, the development of usability studies was described, and several usability evaluation methods such as Usability Engineering (UE), User-Centered Design (UCD) and Human-Computer Interaction (HCI) were generally introduced. Then the method and procedure of the usability test experiments were presented in detail. In this usability evaluation experiment, four public web mapping sites (Google Maps, Bing Maps, Mapquest, Yahoo Maps) were chosen as the testing websites. In total, 42 people with different GIS skills (test users or experts), genders, ages, and nationalities participated in this test, completing several test tasks in different teams. The test comprised three parts: a pretest background information questionnaire, several test tasks for quantitative statistics and progress analysis, and a posttest questionnaire. The pretest and posttest questionnaires focused on gaining qualitative, verbal explanations of the participants' actions. The test tasks were designed to gather quantitative data on the errors and problems of the websites. The results, mainly from the test tasks, were then analyzed. The success rates of the different public web mapping sites were calculated, compared, and displayed in diagrams. The answers from the questionnaires were also classified and organized. Moreover, based on the analysis, the paper discusses layout, map visualization, map tools, search logic, and related issues. Finally, the paper closes with guidelines and suggestions for the design of public web mapping sites, and the limitations of the research are stated at the end.

  8. Test site suitability assessment for radiation measurements

    NASA Astrophysics Data System (ADS)

    Borsero, M.; Nano, E.

    1980-04-01

Field and attenuation methods for assessing site suitability for radiation measurements are presented. Attention is given to the IEC procedure for checking the suitability of a radiation measurement site.

  9. Method and apparatus for electrical cable testing by pulse-arrested spark discharge

    DOEpatents

    Barnum, John R.; Warne, Larry K.; Jorgenson, Roy E.; Schneider, Larry X.

    2005-02-08

    A method for electrical cable testing by Pulse-Arrested Spark Discharge (PASD) uses the cable response to a short-duration high-voltage incident pulse to determine the location of an electrical breakdown that occurs at a defect site in the cable. The apparatus for cable testing by PASD includes a pulser for generating the short-duration high-voltage incident pulse, at least one diagnostic sensor to detect the incident pulse and the breakdown-induced reflected and/or transmitted pulses propagating from the electrical breakdown at the defect site, and a transient recorder to record the cable response. The method and apparatus are particularly useful to determine the location of defect sites in critical but inaccessible electrical cabling systems in aging aircraft, ships, nuclear power plants, and industrial complexes.
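
    As a rough illustration of the reflectometry-style arithmetic implied by the abstract (not the patented apparatus itself), the sketch below converts the delay between the incident pulse and the breakdown-induced reflection into a distance along the cable; the velocity factor and delay are hypothetical values.

        # Locating a breakdown site from a reflected pulse: the pulse travels to the
        # defect and back, so distance = v * dt / 2, where v is the propagation
        # velocity in the cable (velocity factor * speed of light).
        C = 299_792_458.0  # speed of light, m/s

        def defect_distance(delay_s: float, velocity_factor: float = 0.66) -> float:
            """Distance from the sensor to the breakdown site, in meters."""
            v = velocity_factor * C
            return v * delay_s / 2.0

        # Example: reflected pulse arrives 150 ns after the incident pulse.
        print(f"Defect at ~{defect_distance(150e-9):.1f} m")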

  10. Test Methods for Telemetry Systems and Subsystems. Volume 1. Test Methods for Vehicle Telemetry Systems

    DTIC Science & Technology

    2006-06-01

    IEC Web Site - http://www.iec.org/ National Instruments Web Site - http://www.ni.com/ ASA ( Acoustical Society of America) - http://asa.aip.org/ Flow...1994 (R2004), Acoustical Terminology. ANSI S1.10-1966 (R2001), USA Standard Method for Calibration of Microphones. ANSI S1.15-1997, USA Standard...R2001), American National Standard Specification for Acoustical Calibrators. ANSI S1.9-1996 (R2001), American National Standard Instruments for

  11. 40 CFR 63.1161 - Performance testing and test methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Facilities and Hydrochloric Acid Regeneration Plants § 63.1161 Performance testing and test methods. (a...) Establishment of hydrochloric acid regeneration plant operating parameters. (1) During the performance test for hydrochloric acid regeneration plants, the owner or operator shall establish site-specific operating parameter...

  12. 40 CFR 63.1161 - Performance testing and test methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Facilities and Hydrochloric Acid Regeneration Plants § 63.1161 Performance testing and test methods. (a...) Establishment of hydrochloric acid regeneration plant operating parameters. (1) During the performance test for hydrochloric acid regeneration plants, the owner or operator shall establish site-specific operating parameter...

  13. Systematic Approach to Identifying Deeply Buried Archeological Deposits

    DOT National Transportation Integrated Search

    2018-02-01

    Traditional methods used to discover archeological sites include pedestrian surface surveys and relatively shallow hand-dug shovel or soil core testing. While these methods are appropriate for locating surface and near-surface sites on ridges, hillto...

  14. Evaluation of New Repair Methods for Seal Surface Defects on Reusable Solid Rocket Motor (RSRM) Hardware

    NASA Technical Reports Server (NTRS)

    Stanley, Stephanie; Selvidge, Shawn

    2003-01-01

The focus of the evaluation was to develop a backup method to cell plating for the improvement or repair of seal surface defects within the D6-AC steel and 7075-T73 aluminum used in the RSRM program. Several techniques were investigated, including both thermal and non-thermal approaches. Ideally, the repair would maintain the inherent properties of the substrate without losing integrity at the repair site. The repaired sites were tested for adhesion, corrosion, hardness, microhardness, surface toughness, thermal stability, the ability to withstand bending of the repair site, and the ability to endure a high-pressure water blast without compromising the repaired site. To remain a viable technique for repairing the RSRM substrate materials, the repair material could not change the inherent properties of the substrate in any of these tests. One repair method, Electro-Spark Alloying, passed all the testing and is considered a candidate for further evaluation.

  15. Evaluation of New Repair Methods for Seal Surface Defects on Reusable Solid Rocket Motor (RSRM) Hardware

    NASA Technical Reports Server (NTRS)

    Stanley, Stephanie D.; Selvidge, Shawn A.; Cash, Steve (Technical Monitor)

    2002-01-01

The focus of the evaluation was to develop a backup method to cell plating for the improvement or repair of seal surface defects within the D6-AC steel and 7075-T73 aluminum used in the RSRM program. Several techniques were investigated, including both thermal and non-thermal approaches. Ideally, the repair would maintain the inherent properties of the substrate without losing integrity at the repair site. The repaired sites were tested for adhesion, corrosion, hardness, microhardness, surface toughness, thermal stability, the ability to withstand bending of the repair site, and the ability to endure a high-pressure water blast without compromising the repaired site. To remain a viable technique for repairing the RSRM substrate materials, the repair material could not change the inherent properties of the substrate in any of these tests. One repair method, Electro-Spark Alloying, passed all the testing and is considered a candidate for further evaluation.

  16. A cross-site comparison of methods used for hydrogeologic characterization of the Galena-Platteville aquifer in Illinois and Wisconsin, with examples from selected Superfund sites

    USGS Publications Warehouse

    Kay, Robert T.; Mills, Patrick C.; Dunning, Charles P.; Yeskis, Douglas J.; Ursic, James R.; Vendl, Mark

    2004-01-01

The effectiveness of 28 methods used to characterize the fractured Galena-Platteville aquifer at eight sites in northern Illinois and Wisconsin is evaluated. Analysis of government databases, previous investigations, topographic maps, aerial photographs, and outcrops was essential to understanding the hydrogeology in the area to be investigated. The effectiveness of surface-geophysical methods depended on site geology. Lithologic logging provided essential information for site characterization. Cores were used for stratigraphy and geotechnical analysis. Natural-gamma logging helped identify the effect of lithology on the location of secondary-permeability features. Caliper logging identified large secondary-permeability features. Neutron logs identified trends in matrix porosity. Acoustic-televiewer logs identified numerous secondary-permeability features and their orientation. Borehole-camera logs also identified a number of secondary-permeability features. Borehole ground-penetrating radar identified lithologic and secondary-permeability features; however, the accuracy and completeness of this method are uncertain. Single-point-resistance, density, and normal resistivity logs were of limited use. Water-level and water-quality data identified flow directions and indicated the horizontal and vertical distribution of aquifer permeability and the depth of the permeable features. Temperature, spontaneous potential, and fluid-resistivity logging identified few secondary-permeability features at some sites and several features at others. Flowmeter logging was the most effective geophysical method for characterizing secondary-permeability features. Aquifer tests provided insight into the permeability distribution, identified hydraulically interconnected features and the presence of heterogeneity and anisotropy, and determined effective porosity. Aquifer heterogeneity prevented calculation of accurate hydraulic properties from some tests. Different methods, such as flowmeter logging and slug testing, occasionally produced different interpretations. Aquifer characterization improved with an increase in the number of data points, the period of data collection, and the number of methods used.

  17. Airborne and Ground-Based Optical Characterization of Legacy Underground Nuclear Test Sites

    NASA Astrophysics Data System (ADS)

    Vigil, S.; Craven, J.; Anderson, D.; Dzur, R.; Schultz-Fellenz, E. S.; Sussman, A. J.

    2015-12-01

    Detecting, locating, and characterizing suspected underground nuclear test sites is a U.S. security priority. Currently, global underground nuclear explosion monitoring relies on seismic and infrasound sensor networks to provide rapid initial detection of potential underground nuclear tests. While seismic and infrasound might be able to generally locate potential underground nuclear tests, additional sensing methods might be required to further pinpoint test site locations. Optical remote sensing is a robust approach for site location and characterization due to the ability it provides to search large areas relatively quickly, resolve surface features in fine detail, and perform these tasks non-intrusively. Optical remote sensing provides both cultural and surface geological information about a site, for example, operational infrastructure, surface fractures. Surface geological information, when combined with known or estimated subsurface geologic information, could provide clues concerning test parameters. We have characterized two legacy nuclear test sites on the Nevada National Security Site (NNSS), U20ak and U20az using helicopter-, ground- and unmanned aerial system-based RGB imagery and light detection and ranging (lidar) systems. The multi-faceted information garnered from these different sensing modalities has allowed us to build a knowledge base of how a nuclear test site might look when sensed remotely, and the standoff distances required to resolve important site characteristics.

  18. Development of monitoring and diagnostic methods for robots used in remediation of waste sites. 1997 annual progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tecza, J.

    1998-06-01

Safe and efficient clean up of hazardous and radioactive waste sites throughout the DOE complex will require extensive use of robots. This research effort focuses on developing Monitoring and Diagnostic (M and D) methods for robots that will provide early detection, isolation, and tracking of impending faults before they result in serious failure. The utility and effectiveness of applying M and D methods to hydraulic robots has never been proven. The present research program is utilizing seeded faults in a laboratory test rig that is representative of an existing hydraulically-powered remediation robot. This report summarizes activity conducted in the first 9 months of the project. The research team has analyzed the Rosie Mobile Worksystem as a representative hydraulic robot, developed a test rig for implanted fault testing, developed a test plan and agenda, and established methods for acquiring and analyzing the test data.

  19. Implementing Rapid HIV Testing With or Without Risk-Reduction Counseling in Drug Treatment Centers: Results of a Randomized Trial

    PubMed Central

    Feaster, Daniel J.; Gooden, Lauren; Matheson, Tim; Mandler, Raul N.; Haynes, Louise; Tross, Susan; Kyle, Tiffany; Gallup, Dianne; Kosinski, Andrzej S.; Douaihy, Antoine; Schackman, Bruce R.; Das, Moupali; Lindblad, Robert; Erickson, Sarah; Korthuis, P. Todd; Martino, Steve; Sorensen, James L.; Szapocznik, José; Walensky, Rochelle; Branson, Bernard; Colfax, Grant N.

    2012-01-01

Objectives. We examined the effectiveness of risk reduction counseling and the role of on-site HIV testing in drug treatment. Methods. Between January and May 2009, we randomized 1281 HIV-negative (or status unknown) adults who reported no past-year HIV testing to (1) referral for off-site HIV testing, (2) HIV risk-reduction counseling with on-site rapid HIV testing, or (3) verbal information about testing only with on-site rapid HIV testing. Results. We defined 2 primary self-reported outcomes a priori: receipt of HIV test results and unprotected anal or vaginal intercourse episodes at 6-month follow-up. Participants in the combined on-site rapid testing arms were more likely to receive their HIV test results than participants referred for off-site testing (P < .001; Mantel-Haenszel risk ratio = 4.52; 97.5% confidence interval [CI] = 3.57, 5.72). At 6 months, there were no significant differences in unprotected intercourse episodes between the combined on-site testing arms and the referral arm (P = .39; incidence rate ratio [IRR] = 1.04; 97.5% CI = 0.95, 1.14) or the 2 on-site testing arms (P = .81; IRR = 1.03; 97.5% CI = 0.84, 1.26). Conclusions. This study demonstrated on-site rapid HIV testing’s value in drug treatment centers and found no additional benefit from HIV sexual risk-reduction counseling. PMID:22515871

  20. Card Sorting in an Online Environment: Key to Involving Online-Only Student Population in Usability Testing of an Academic Library Web Site?

    ERIC Educational Resources Information Center

    Paladino, Emily B.; Klentzin, Jacqueline C.; Mills, Chloe P.

    2017-01-01

    Based on in-person, task-based usability testing and interviews, the authors' library Web site was recently overhauled in order to improve user experience. This led to the authors' interest in additional usability testing methods and test environments that would most closely fit their library's goals and situation. The appeal of card sorting…

  1. EGFR Mutation Testing Practices within the Asia Pacific Region

    PubMed Central

    Kerr, Keith M.; Utomo, Ahmad; Rajadurai, Pathmanathan; Tran, Van Khanh; Du, Xiang; Chou, Teh-Ying; Enriquez, Ma. Luisa D.; Lee, Geon Kook; Iqbal, Jabed; Shuangshoti, Shanop; Chung, Jin-Haeng; Hagiwara, Koichi; Liang, Zhiyong; Normanno, Nicola; Park, Keunchil; Toyooka, Shinichi; Tsai, Chun-Ming; Waring, Paul; Zhang, Li; McCormack, Rose; Ratcliffe, Marianne; Itoh, Yohji; Sugeno, Masatoshi; Mok, Tony

    2015-01-01

    Introduction: The efficacy of epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors in EGFR mutation-positive non–small-cell lung cancer (NSCLC) patients necessitates accurate, timely testing. Although EGFR mutation testing has been adopted by many laboratories in Asia, data are lacking on the proportion of NSCLC patients tested in each country, and the most commonly used testing methods. Methods: A retrospective survey of records from NSCLC patients tested for EGFR mutations during 2011 was conducted in 11 Asian Pacific countries at 40 sites that routinely performed EGFR mutation testing during that period. Patient records were used to complete an online questionnaire at each site. Results: Of the 22,193 NSCLC patient records surveyed, 31.8% (95% confidence interval: 31.2%–32.5%) were tested for EGFR mutations. The rate of EGFR mutation positivity was 39.6% among the 10,687 cases tested. The majority of samples were biopsy and/or cytology samples (71.4%). DNA sequencing was the most commonly used testing method accounting for 40% and 32.5% of tissue and cytology samples, respectively. A pathology report was available only to 60.0% of the sites, and 47.5% were not members of a Quality Assurance Scheme. Conclusions: In 2011, EGFR mutation testing practices varied widely across Asia. These data provide a reference platform from which to improve the molecular diagnosis of NSCLC, and EGFR mutation testing in particular, in Asia. PMID:25376513

  2. Modeling and Evaluation of Geophysical Methods for Monitoring and Tracking CO2 Migration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniels, Jeff

    2012-11-30

Geological sequestration has been proposed as a viable option for mitigating the vast amount of CO2 being released into the atmosphere daily. Test sites for CO2 injection have been appearing across the world to ascertain the feasibility of capturing and sequestering carbon dioxide. A major concern with full-scale implementation is monitoring and verifying the permanence of injected CO2. Geophysical methods, an exploration industry standard, are non-invasive imaging techniques that can be implemented to address that concern. Geophysical methods, seismic and electromagnetic, play a crucial role in monitoring the subsurface pre- and post-injection. Seismic techniques have been the most popular, but electromagnetic methods are gaining interest. The primary goal of this project was to develop a new geophysical tool, a software program called GphyzCO2, to investigate the implementation of geophysical monitoring for detecting injected CO2 at test sites. The GphyzCO2 software consists of interconnected programs that encompass well logging, seismic, and electromagnetic methods. The software enables users to design and execute 3D surface-to-surface (conventional surface seismic) and borehole-to-borehole (cross-hole seismic and electromagnetic methods) numerical modeling surveys. The generalized flow of the program begins with building a complex 3D subsurface geological model, assigning properties to the model that mimic a potential CO2 injection site, numerically forward modeling a geophysical survey, and analyzing the results. A site located in Warren County, Ohio, was selected as the test site for the full implementation of GphyzCO2. Specific interest was placed on a potential reservoir target, the Mount Simon Sandstone, and cap rock, the Eau Claire Formation. Analysis of the test site included well log data, physical property measurements (porosity), core sample resistivity measurements, calculated electrical permittivity values, seismic data collection, and seismic interpretation. The data were input into GphyzCO2 to demonstrate a full implementation of the software's capabilities. Part of the implementation investigated the limits of using geophysical methods to monitor CO2 injection sites. The results show that cross-hole EM numerical surveys are limited to borehole separations of under 100 meters. Those results were used in executing numerical EM surveys that contain hypothetical CO2 injections. The outcome of the forward modeling shows that EM methods can detect the presence of CO2.

  3. Remediation of petroleum hydrocarbon-contaminated sites by DNA diagnosis-based bioslurping technology.

    PubMed

    Kim, Seungjin; Krajmalnik-Brown, Rosa; Kim, Jong-Oh; Chung, Jinwook

    2014-11-01

The application of effective remediation technologies can benefit from adequate preliminary testing, such as lab-scale and pilot-scale systems. Bioremediation technologies have demonstrated tremendous potential with regard to cost, but they cannot be used for all contaminated sites due to limitations in biological activity. The purpose of this study was to develop a DNA diagnostic method that reduces the time to select contaminated sites that are good candidates for bioremediation. We applied an oligonucleotide microarray method to detect and monitor genes that lead to aliphatic and aromatic degradation. Further, the bioremediation of a contaminated site, selected based on the results of the genetic diagnostic method, was achieved successfully by applying bioslurping in field tests. This gene-based diagnostic technique is a powerful tool to evaluate the potential for bioremediation in petroleum hydrocarbon-contaminated soil. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. An evaluation of applicability of seismic refraction method in identifying shallow archaeological features A case study at archaeological site

    NASA Astrophysics Data System (ADS)

    Jahangardi, Morteza; Hafezi Moghaddas, Naser; Keivan Hosseini, Sayyed; Garazhian, Omran

    2015-04-01

We applied the seismic refraction method at the archaeological site of Tepe Damghani, located in Sabzevar, NE Iran, in order to delineate structures of archaeological interest. This pre-historical site has special conditions with respect to geographical location and geomorphological setting: it is an urban archaeological site, and in recent years it has been used as an agricultural field. In spring and summer of 2012, the third season of archaeological excavation was carried out. Test trenches excavated at the site revealed that the cultural layers were often badly disturbed by recent human activities such as farming and road construction. The archaeological cultural layers in the southern and eastern parts of the Tepe are in slightly better condition; for instance, in test trench 1S03 (3×3 m²), the third test trench excavated in the southern part of the Tepe, an in situ adobe architectural structure was discovered that likely belongs to a complex of cultural features with 5 graves. After the conclusion of the third season of excavation, all of the test trenches were backfilled with their own excavated soil. The seismic refraction method was applied with 12 channels of P-wave geophones along three lines over test trench 1S03, with a geophone interval of 0.5 m and a spacing of 1.5 m between profiles. The goal of this survey was to evaluate the applicability of the seismic method for identifying archaeological features, especially adobe wall structures. Processing of the seismic data was done with the seismic software SeisImager, and the results were presented as a seismic section for each profile. The adobe wall structures could hardly be identified, likely because the adobe walls were built from the same material as the surrounding natural earth; the resulting low contrast adversely affects seismic processing and the identification of archaeological features. We therefore conclude that applying the seismic refraction method to detect archaeological features under such conditions is neither cost-effective nor efficient compared with GPR or magnetic methods, which yield more desirable results.
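
    The abstract does not give the interpretation formulas; for orientation, the sketch below shows the standard two-layer refraction relation between crossover distance and refractor depth that underlies this kind of survey. The velocities and crossover distance are hypothetical.

        # Generic two-layer seismic refraction relation (not specific to the
        # Tepe Damghani survey): from the crossover distance x_c where direct and
        # head-wave arrivals coincide, the depth to the refractor is
        #   h = (x_c / 2) * sqrt((v2 - v1) / (v2 + v1))
        import math

        def refractor_depth(x_cross: float, v1: float, v2: float) -> float:
            """Depth (m) to a faster layer from crossover distance and velocities."""
            if v2 <= v1:
                raise ValueError("head waves require v2 > v1")
            return 0.5 * x_cross * math.sqrt((v2 - v1) / (v2 + v1))

        # Hypothetical values: crossover at 4 m, v1 = 400 m/s soil, v2 = 1200 m/s.
        print(f"Refractor depth ~ {refractor_depth(4.0, 400.0, 1200.0):.2f} m")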

  5. Evaluation of the OnSite malaria rapid test performance in Miandrivazo, Madagascar.

    PubMed

    Ravaoarisoa, E; Andriamiandranoro, T; Raherinjafy, R; Jahevitra, M; Razanatsiorimalala, S; Andrianaranjaka, V; Randrianarivelojosia, M

    2017-10-01

The performance of the OnSite malaria rapid diagnostic test, which detects pan-specific pLDH and Plasmodium falciparum-specific HRP2, was assessed during the malaria transmission peak in Miandrivazo, in the southwestern part of Madagascar, from April 20 to May 6, 2010. In the laboratory, quality control of the OnSite malaria rapid test according to the WHO/TDR/FIND method demonstrated that the test had good sensitivity. Of the 218 OnSite tests performed at the Miandrivazo Primary Health Center on patients with fever or a recent history of fever, four (1.8%, 95% CI: 0.6-4.9%) were invalid. Ninety-four (43.1%) cases of malaria were confirmed by microscopy, of which 90 were P. falciparum malaria and 4 were Plasmodium vivax malaria. With a Cohen's kappa coefficient of 0.94, the agreement between microscopy and OnSite is excellent. Compared with the CareStart™ rapid test commonly used within the public health structures in Madagascar, the sensitivity and specificity of the OnSite test were 97.9% and 96.8%, respectively.
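
    For readers unfamiliar with the reported statistics, the sketch below computes sensitivity, specificity, and Cohen's kappa from a generic 2×2 table of rapid-test results versus microscopy; the counts are placeholders, not the study data.

        # Diagnostic agreement statistics from a 2x2 table (test vs. reference).
        def diagnostic_stats(tp: int, fp: int, fn: int, tn: int):
            n = tp + fp + fn + tn
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            observed = (tp + tn) / n                     # observed agreement
            p_pos = ((tp + fp) / n) * ((tp + fn) / n)    # chance agreement, positives
            p_neg = ((fn + tn) / n) * ((fp + tn) / n)    # chance agreement, negatives
            kappa = (observed - (p_pos + p_neg)) / (1 - (p_pos + p_neg))
            return sensitivity, specificity, kappa

        # Hypothetical counts: tp/fp/fn/tn against microscopy.
        se, sp, k = diagnostic_stats(tp=90, fp=3, fn=2, tn=115)
        print(f"sensitivity={se:.3f}, specificity={sp:.3f}, kappa={k:.3f}")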

  6. PTSITE--a new method of site evaluation for loblolly pine: model development and user's guide

    Treesearch

    Constance A. Harrington

    1991-01-01

A model, named PTSITE, was developed to predict site index for loblolly pine based on soil characteristics, site location on the landscape, and land history. The model was tested with data from several sources and judged to predict site index within ±4 feet (P

  7. Multichannel Analysis of Surface Waves and Down-Hole Tests in the Archeological "Palatine Hill" Area (Rome, Italy): Evaluation and Influence of 2D Effects on the Shear Wave Velocity

    NASA Astrophysics Data System (ADS)

    Di Fiore, V.; Cavuoto, G.; Tarallo, D.; Punzo, M.; Evangelista, L.

    2016-05-01

    A joint analysis of down-hole (DH) and multichannel analysis of surface waves (MASW) measurements offers a complete evaluation of shear wave velocity profiles, especially for sites where a strong lateral variability is expected, such as archeological sites. In this complex stratigraphic setting, the high "subsoil anisotropy" (i.e., sharp lithological changes due to the presence of anthropogenic backfill deposits and/or buried man-made structures) implies a different role for DH and MASW tests. This paper discusses some results of a broad experimental program conducted on the Palatine Hill, one of the most ancient areas of the city of Rome (Italy). The experiments were part of a project on seismic microzoning and consisted of 20 MASW and 11 DH tests. The main objective of this study was to examine the difficulties related to the interpretation of the DH and MASW tests and the reliability limits inherent in the application of the noninvasive method in complex stratigraphic settings. As is well known, DH tests provide good determinations of shear wave velocities (Vs) for different lithologies and man-made materials, whereas MASW tests provide average values for the subsoil volume investigated. The data obtained from each method with blind tests were compared and were correlated to site-specific subsurface conditions, including lateral variability. Differences between punctual (DH) and global (MASW) Vs measurements are discussed, quantifying the errors by synthetic comparison and by site response analyses. This study demonstrates that, for archeological sites, VS profiles obtained from the DH and MASW methods differ by more than 15 %. However, the local site effect showed comparable results in terms of natural frequencies, whereas the resolution of the inverted shear wave velocity was influenced by the fundamental mode of propagation.

  8. Influence of thinning on acoustic velocity of Douglas-fir trees in western Washington and western Oregon

    Treesearch

    David G. Briggs; Gonzalo Thienel; Eric C. Turnblom; Eini Lowell; Dennis Dykstra; Robert J. Ross; Xiping Wang; Peter Carter

    2008-01-01

    Acoustic velocity was measured with a time-of-flight method on approximately 50 trees in each of five plots from four test sites of a Douglas-fir (Pseudostuga menziesii (Mirb.) Franco) thinning trial. The test sites reflect two age classes, 33 to 35 and 48 to 50 years, with 50-year site index ranging from 35 to 50 m. The acoustic velocity...

  9. Final Report - Assessment of Testing Options for the NTR at the INL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howe, Steven D; McLing, Travis L; McCurry, Michael

One of the main technologies that can be developed to dramatically enhance the human exploration of space is the nuclear thermal rocket (NTR). Several studies over the past thirty years have shown that the NTR can reduce the cost of a lunar outpost, reduce the risk of a human mission to Mars, enable fast transits for most missions throughout the solar system, and reduce the cost and time for robotic probes to deep space. Three separate committees of the National Research Council of the National Academy of Sciences have recommended that NASA develop the NTR. One of the primary issues in development of the NTR is the ability to verify a flight-ready unit. Three main methods can be used to validate safe operation of an NTR: 1) Full power, full duration test in an above ground facility that scrubs the rocket exhaust clean of any fission products; 2) Full power, full duration test using the Subsurface Active Filtering of Exhaust (SAFE) technique to capture the exhaust in subsurface strata; 3) Test of the reactor fuel at temperature and power density in a driver reactor with subsequent first test of the fully integrated NTR in space. The first method, the above ground facility, has been studied in the past. The second method, SAFE, has been examined for application at the Nevada Test Site. The third method relies on the fact that the Nuclear Furnace series of tests in 1971 showed that the radioactive exhaust coming from graphite-based fuel for the NTR could be completely scrubbed of fission products and the clean hydrogen flared into the atmosphere. Under funding from the MSFC, the Center for Space Nuclear Research (CSNR) at the Idaho National Laboratory (INL) has completed a reexamination of Methods 2 and 3 for implementation at the INL site. In short, the effort performed the following: 1) Assess the geology of the INL site and determine a location suitable for SAFE testing; 2) Perform calculations of gas transport throughout the geology; 3) Produce a cost estimate of a non-nuclear, sub-scale test using gas injection to validate the computational models; 4) Produce a preliminary cost estimate to build a nuclear furnace equivalent facility to test NTR fuel on a greenfield location on the INL site. The results show that the INL geology is substantially better suited to the SAFE testing method than the NTS site. The existence of impermeable interbeds just above the sub-surface aquifer ensures that no material from the test, radioactive or not, can enter the water table. Similar beds located just below the surface will prevent any gaseous products from reaching the surface for dispersion. The extremely high permeability of the strata between the interbeds allows rapid dispersion of the rocket exhaust. In addition, the high permeability suggests that a lower back-pressure may develop in the hole against the rocket thrust, which increases safety of operations. Finally, the cost of performing a sub-scale, non-nuclear verification experiment was determined to be $3M. The third method was assessed through discussions with INL staff resident at the site. In essence, any new Category I facility on any DOE site will cost in excess of $250M. Based on the results of this study, a cost estimate for testing a nuclear rocket at the INL site appears to be warranted. Given the fact that a new nuclear fuel may be possible that does not release any fission products, the SAFE testing option appears to be the most affordable.

  10. Evaluation of the White Test for the Intraoperative Detection of Bile Leakage

    PubMed Central

    Leelawat, Kawin; Chaiyabutr, Kittipong; Subwongcharoen, Somboon; Treepongkaruna, Sa-ad

    2012-01-01

    We assess whether the White test is better than the conventional bile leakage test for the intraoperative detection of bile leakage in hepatectomized patients. This study included 30 patients who received elective liver resection. Both the conventional bile leakage test (injecting an isotonic sodium chloride solution through the cystic duct) and the White test (injecting a fat emulsion solution through the cystic duct) were carried out in the same patients. The detection of bile leakage was compared between the conventional method and the White test. A bile leak was demonstrated in 8 patients (26.7%) by the conventional method and in 19 patients (63.3%) by the White test. In addition, the White test detected a significantly higher number of bile leakage sites compared with the conventional method (Wilcoxon signed-rank test; P < 0.001). The White test is better than the conventional test for the intraoperative detection of bile leakage. Based on our study, we recommend that surgeons investigating bile leakage sites during liver resections should use the White test instead of the conventional bile leakage test. PMID:22547901

  11. Evaluation of the white test for the intraoperative detection of bile leakage.

    PubMed

    Leelawat, Kawin; Chaiyabutr, Kittipong; Subwongcharoen, Somboon; Treepongkaruna, Sa-Ad

    2012-01-01

    We assess whether the White test is better than the conventional bile leakage test for the intraoperative detection of bile leakage in hepatectomized patients. This study included 30 patients who received elective liver resection. Both the conventional bile leakage test (injecting an isotonic sodium chloride solution through the cystic duct) and the White test (injecting a fat emulsion solution through the cystic duct) were carried out in the same patients. The detection of bile leakage was compared between the conventional method and the White test. A bile leak was demonstrated in 8 patients (26.7%) by the conventional method and in 19 patients (63.3%) by the White test. In addition, the White test detected a significantly higher number of bile leakage sites compared with the conventional method (Wilcoxon signed-rank test; P < 0.001). The White test is better than the conventional test for the intraoperative detection of bile leakage. Based on our study, we recommend that surgeons investigating bile leakage sites during liver resections should use the White test instead of the conventional bile leakage test.
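
    Both of the preceding records compare paired per-patient counts with a Wilcoxon signed-rank test; a minimal sketch of that comparison is shown below, using hypothetical leak-site counts rather than the study's data.

        # Paired comparison of leak-site counts per patient: conventional test vs.
        # White test, analyzed with a Wilcoxon signed-rank test.
        from scipy.stats import wilcoxon

        conventional = [0, 1, 0, 0, 2, 0, 1, 0, 0, 1]   # hypothetical counts
        white_test   = [1, 2, 1, 0, 3, 1, 1, 0, 2, 2]

        # zero_method="wilcox" drops patients with identical counts in both tests.
        stat, p = wilcoxon(conventional, white_test, zero_method="wilcox")
        print(f"W = {stat}, p = {p:.4f}")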

  12. The Cost-effectiveness of Rapid HIV Testing in Substance Abuse Treatment: Results of a Randomized Trial*

    PubMed Central

    Schackman, Bruce R.; Metsch, Lisa R.; Colfax, Grant N.; Leff, Jared A.; Wong, Angela; Scott, Callie A.; Feaster, Daniel J.; Gooden, Lauren; Matheson, Tim; Haynes, Louise F.; Paltiel, A. David; Walensky, Rochelle P.

    2012-01-01

    BACKGROUND The President’s National HIV/AIDS Strategy calls for coupling HIV screening and prevention services with substance abuse treatment programs. Fewer than half of US community-based substance abuse treatment programs make HIV testing available on-site or through referral. METHODS We measured the cost-effectiveness of three HIV testing strategies evaluated in a randomized trial conducted in 12 community-based substance abuse treatment programs in 2009: off-site testing referral, on-site rapid testing with information only, on-site rapid testing with risk reduction counseling. Data from the trial included patient demographics, prior testing history, test acceptance and receipt of results, undiagnosed HIV prevalence (0.4%) and program costs. The Cost Effectiveness of Preventing AIDS Complications (CEPAC) computer simulation model was used to project life expectancy, lifetime costs, and quality-adjusted life years (QALYs) for HIV-infected individuals. Incremental cost-effectiveness ratios (2009 US $/QALY) were calculated after adding costs of testing HIV-uninfected individuals; costs and QALYs were discounted at 3% annually. RESULTS Referral for off-site testing is less efficient (dominated) compared to offering on-site testing with information only. The cost-effectiveness ratio for on-site testing with information is $60,300/QALY in the base case, or $76,300/QALY with 0.1% undiagnosed HIV prevalence. HIV risk-reduction counseling costs $36 per person more without additional benefit. CONCLUSIONS A strategy of on-site rapid HIV testing offer with information only in substance abuse treatment programs increases life expectancy at a cost-effectiveness ratio <$100,000/QALY. Policymakers and substance abuse treatment leaders should seek funding to implement on-site rapid HIV testing in substance abuse treatment programs for those not recently tested. PMID:22971593
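
    As a sketch of the incremental cost-effectiveness arithmetic described above (not the CEPAC model itself), the code below sorts strategies by cost, drops strongly dominated ones, and reports ICERs between adjacent strategies on the frontier; the per-person costs and QALYs are hypothetical, and extended dominance is not handled.

        # Incremental cost-effectiveness ratios (ICERs) from (name, cost, QALYs) tuples.
        def icer_table(strategies):
            """Returns (name, ICER) pairs for non-dominated strategies."""
            ranked = sorted(strategies, key=lambda s: s[1])          # ascending cost
            frontier = []
            for name, cost, qalys in ranked:
                if frontier and qalys <= frontier[-1][2]:
                    continue                                         # costs more, gains nothing
                frontier.append((name, cost, qalys))
            return [(b[0], (b[1] - a[1]) / (b[2] - a[2]))
                    for a, b in zip(frontier, frontier[1:])]

        example = [
            ("no on-site testing",   1000.00, 10.0000),
            ("on-site, info only",   1030.00, 10.0005),
            ("on-site + counseling", 1066.00, 10.0005),   # more costly, no QALY gain
        ]
        for name, icer in icer_table(example):
            print(f"{name}: ${icer:,.0f}/QALY")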

  13. OSI Passive Seismic Experiment at the Former Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweeney, J J; Harben, P

On-site inspection (OSI) is one of the four verification provisions of the Comprehensive Nuclear Test Ban Treaty (CTBT). Under the provisions of the CTBT, once the Treaty has entered into force, any signatory party can request an on-site inspection, which can then be carried out after approval (by majority voting) of the Executive Council. Once an OSI is approved, a team of 40 inspectors will be assembled to carry out an inspection to "clarify whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of Article I". One challenging aspect of carrying out an on-site inspection (OSI) in the case of a purported underground nuclear explosion is to detect and locate the underground effects of an explosion, which may include an explosion cavity, a zone of damaged rock, and/or a rubble zone associated with an underground collapsed cavity. The CTBT (Protocol, Section II part D, paragraph 69) prescribes several types of geophysical investigations that can be carried out for this purpose. One of the methods allowed by the CTBT for geophysical investigation is referred to in the Treaty Protocol as "resonance seismometry". This method, which was proposed and strongly promoted by Russia during the Treaty negotiations, is not described in the Treaty. Some clarification about the nature of the resonance method can be gained from OSI workshop presentations by Russian experts in the late 1990s. Our understanding is that resonance seismometry is a passive method that relies on seismic reverberations set up in an underground cavity by the passage of waves from regional and teleseismic sources. Only a few examples of the use of this method for detection of underground cavities have been presented, and those were done in cases where the existence and precise location of an underground cavity was known. As is the case with many of the geophysical methods allowed during an OSI under the Treaty, how resonance seismometry really works and its effectiveness for OSI purposes has yet to be determined. For this experiment, we took a broad approach to the definition of "resonance seismometry", stretching it to include any means that employs passive seismic methods to infer the character of underground materials. In recent years there have been a number of advances in the use of correlation and noise analysis methods in seismology to obtain information about the subsurface. Our objective in this experiment was to use noise analysis and correlation analysis to evaluate these techniques for detecting and characterizing the underground damage zone from a nuclear explosion. The site that was chosen for the experiment was the Mackerel test in Area 4 of the former Nevada Test Site (now named the Nevada National Security Site, or NNSS). Mackerel was an underground nuclear test of less than 20 kT conducted in February of 1964 (DOENV-209-REV 15). We chose this site because there was a known apical cavity at about 50 m depth above a rubble zone, and because the site had been investigated by the US Geological Survey with active seismic methods in 1965 (Watkins et al., 1967). Note that the time delay between detonation of the explosion (1964) and the time of the present survey (2010) is nearly 46 years - this would not be typical of an expected OSI under the CTBT.

  14. Analysis of terrestrial conditions and dynamics

    NASA Technical Reports Server (NTRS)

    Goward, S. N. (Principal Investigator)

    1984-01-01

    Land spectral reflectance properties for selected locations, including the Goddard Space Flight Center, the Wallops Flight Facility, a MLA test site in Cambridge, Maryland, and an acid test site in Burlington, Vermont, were measured. Methods to simulate the bidirectional reflectance properties of vegetated landscapes and a data base for spatial resolution were developed. North American vegetation patterns observed with the Advanced Very High Resolution Radiometer were assessed. Data and methods needed to model large-scale vegetation activity with remotely sensed observations and climate data were compiled.

  15. Subsurface profiling using integrated geophysical methods for 2D site response analysis in Bangalore city, India: a new approach

    NASA Astrophysics Data System (ADS)

    Chandran, Deepu; Anbazhagan, P.

    2017-10-01

Recently, site response analysis has become a mandatory step in the design of important structures. Subsurface investigation is an essential step from which the input parameters for the site response study, such as density, shear wave velocity (Vs), layer thickness, and damping characteristics, are obtained. Most site response studies at shallow bedrock sites are one-dimensional (1D) and are usually carried out using Vs from multichannel analysis of surface waves (MASW) or N values from a standard penetration test (SPT), under the assumption that soil layers are horizontal, uniform, and homogeneous. These assumptions are not completely true in shallow bedrock regions because soil deposits are heterogeneous. The objective of this study is to generate realistic two-dimensional subsurface profiles for shallow bedrock regions using integrated subsurface investigation testing. The study area selected for this work is Bangalore, India. Three survey lines were selected in Bangalore at two different locations: one at the Indian Institute of Science (IISc) Campus and the other at Whitefield. Geophysical surveys, namely ground penetrating radar (GPR) and 2D MASW, were carried out along these survey lines, and the geophysical test results were compared with and validated against a conventional geotechnical SPT. At the IISc site, a soil profile obtained from a trench excavated for a proposed pipeline was used to check the geophysical test results. The results show that GPR is very useful for delineating subsurface layers at shallow depths at both sites (IISc Campus and Whitefield), while MASW resolves the variation of Vs and layer thickness to comparatively greater depths at both sites. MASW also shows denser soil strata with high Vs at the IISc Campus site, whereas weaker soil with low shear wave velocity is observed at the Whitefield site. Combining these two geophysical methods helped to generate representative 2D subsurface profiles, which can be further used to understand the difference between 1D and 2D site response.

  16. Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water

    USGS Publications Warehouse

    Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.

    2007-01-01

The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR-based quantitative methods by modifying existing PCR assays to detect Bd DNA in water and sediments, without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR-based assay using Boyle primers (SYBR/Boyle assay) and the TaqMan-based assay using Wood primers performed similarly with samples generated in the laboratory (Bd-spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 per liter (nominal detection limit is 10 DNA copies, or about 0.06 zoospore). We did not detect DNA of Bd from sediments collected at any sites. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
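
    The record does not include the assay's calibration details; the sketch below shows a generic qPCR standard-curve quantification of the kind such zoospore-equivalent estimates rely on. The standard-curve values and conversion factors are assumptions for illustration only.

        # Generic qPCR standard-curve quantification: fit Cq = slope*log10(copies) + b
        # from a dilution series, back-calculate unknowns, then scale to copies per
        # liter of filtered water and to zoospore equivalents.
        import numpy as np

        std_copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])        # standards, copies/reaction
        std_cq     = np.array([34.1, 30.8, 27.4, 24.0, 20.7])   # hypothetical Cq values

        slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)

        def copies_from_cq(cq: float) -> float:
            return 10 ** ((cq - intercept) / slope)

        def zoospore_equiv_per_liter(cq, copies_per_zoospore=160.0,
                                     fraction_of_extract_per_rxn=0.05,
                                     liters_filtered=0.6):
            # All conversion factors here are assumptions, not the assay's values.
            copies_rxn = copies_from_cq(cq)
            copies_sample = copies_rxn / fraction_of_extract_per_rxn
            return copies_sample / copies_per_zoospore / liters_filtered

        print(f"{zoospore_equiv_per_liter(31.5):.1f} zoospore equivalents per liter")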

  17. Canadian Health Measures Survey pre-test: design, methods, results.

    PubMed

    Tremblay, Mark; Langlois, Renée; Bryan, Shirley; Esliger, Dale; Patterson, Julienne

    2007-01-01

    The Canadian Health Measures Survey (CHMS) pre-test was conducted to provide information about the challenges and costs associated with administering a physical health measures survey in Canada. To achieve the specific objectives of the pre-test, protocols were developed and tested, and methods for household interviewing and clinic testing were designed and revised. The cost, logistics and suitability of using fixed sites for the CHMS were assessed. Although data collection, transfer and storage procedures are complex, the pre-test experience confirmed Statistics Canada's ability to conduct a direct health measures survey and the willingness of Canadians to participate in such a health survey. Many operational and logistical procedures worked well and, with minor modifications, are being employed in the main survey. Fixed sites were problematic, and survey costs were higher than expected.

  18. Microbiologic tests in epidemiologic studies: are they reproducible?

    PubMed

    Aass, A M; Preus, H R; Zambon, J J; Gjermo, P

    1994-12-01

    Microbiologic assessments are often included in longitudinal studies to elucidate the significance of the association of certain Gram-negative bacteria and the development of periodontal diseases. In such studies, the reliability of methods is crucial. There are several methods to identify putative pathogens, and some of them are commercially available. The purpose of the present study was to compare the reproducibility of four different methods for detecting Actinobacillus actinomycetemcomitans, Porphyromonas gingivalis, and Prevotella intermedia in order to evaluate their usefulness in epidemiologic studies. The test panel consisted of 10 young subjects and 10 adult periodontitis patients. Subgingival plaque was sampled from sites showing bone loss and "healthy" control sites. The four different methods for detecting the target bacteria were 1) cultivation, 2) Evalusite (a chair-side kit based on ELISA), 3) OmniGene, Inc, based on DNA probes, and 4) indirect immunofluorescence (IIF). The test procedure was repeated after a 1-wk interval and was performed by one examiner. Sites reported to be positive for a microorganism by any of the four methods at one or both examinations were considered to be positive for that organism and included in the analysis. The reproducibility of the four methods was low. The IIF and the cultivation methods showed somewhat higher reproducibility than did the commercial systems. A second test was done for Evalusite, three paper points for sampling being used instead of one as described in the manual. The reproducibility of the second test was improved, indicating that the detection level of the system may influence the reliability.

  19. [Research on the application of in-situ biological stabilization solidification technology in chromium contaminated site management].

    PubMed

    Zhang, Jian-rong; Li, Juan; Xu, Wei

    2013-09-01

In-situ biological stabilization/solidification (SS) technology is an effective groundwater risk control method for chromium-contaminated sites. Through an on-site engineering test, this paper preliminarily validates the remediation effect of the in-situ SS method on a southern chromium-contaminated site. The engineering test site has an area of approximately 600 m² and is located upstream of the contaminated area. Contamination was severe: the total chromium concentration reached 11,850 mg·kg⁻¹, the hexavalent chromium concentration reached 349 mg·kg⁻¹, and the most severely contaminated soil lay at depths of 0.5-2 m below the surface. Reducing agents and microbial regulators were injected into wells at the test site, and variations in hexavalent and total chromium concentrations in groundwater were monitored at different times and depths under the action of the injected agents. The results of the engineering test showed that the on-site SS technology significantly changed the chromium speciation in the soil and thereby reduced chromium migration, so the groundwater risk was reduced. Within the effective range of the injection wells, the injected agents remediated hexavalent chromium in groundwater effectively: the SS rate of hexavalent chromium to trivalent chromium reached 94%-99.9%, and the SS rate of total chromium fixation reached 83.9%-99.8%. The test results provide a valuable reference for the remediation of contaminated sites with shallow groundwater and soil consisting mainly of silty clay and sandy clay.

  20. HIV-1 protease cleavage site prediction based on two-stage feature selection method.

    PubMed

    Niu, Bing; Yuan, Xiao-Cheng; Roeper, Preston; Su, Qiang; Peng, Chun-Rong; Yin, Jing-Yuan; Ding, Juan; Li, HaiPeng; Lu, Wen-Cong

    2013-03-01

Knowledge of the mechanism of HIV protease cleavage specificity is critical to the design of specific and effective HIV inhibitors. Searching for an accurate, robust, and rapid method to correctly predict the cleavage sites in proteins is crucial when searching for possible HIV inhibitors. In this article, HIV-1 protease specificity was studied using the correlation-based feature subset (CfsSubset) selection method combined with a genetic algorithm. Thirty important biochemical features were identified, based on a jackknife test, from the original data set of 4,248 features. Using the AdaBoost method with the thirty selected features, the prediction model yields an accuracy of 96.7% in the jackknife test and 92.1% on an independent test set, improving on the accuracy obtained with the original data set by 6.7% and 77.4%, respectively. Our feature selection scheme could be a useful technique for finding effective competitive inhibitors of HIV protease.
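
    As a rough sketch of the overall pipeline (feature selection followed by AdaBoost, evaluated by leave-one-out as a stand-in for the jackknife test), the code below substitutes a simple univariate selector for the CfsSubset/genetic-algorithm search used in the paper and runs on random placeholder data.

        # Feature selection + AdaBoost with leave-one-out evaluation (illustrative only).
        import numpy as np
        from sklearn.pipeline import Pipeline
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.model_selection import LeaveOneOut, cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(80, 200))          # placeholder feature matrix
        y = rng.integers(0, 2, size=80)         # placeholder cleaved / not-cleaved labels

        model = Pipeline([
            ("select", SelectKBest(score_func=f_classif, k=30)),   # keep 30 features
            ("boost", AdaBoostClassifier(n_estimators=50, random_state=0)),
        ])

        # Leave-one-out cross-validation plays the role of the jackknife test;
        # selection happens inside the pipeline, so each fold avoids leakage.
        scores = cross_val_score(model, X, y, cv=LeaveOneOut())
        print(f"jackknife-style accuracy: {scores.mean():.3f}")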

  1. SUPERFUND TREATABILITY CLEARINGHOUSE: FINAL REPORT, PHASE I - IMMEDIATE ASSESSMENT, ACME SOLVENTS SITE

    EPA Science Inventory

    This is a site assessment and feasibility study of incineration alternatives at the ACME Solvents Site at Rockford, Illinois. The document contains laboratory results that are reported to simulate incineration conditions but no details on test methods were provided. The d...

  2. SITE project. Phase 1: Continuous data bit-error-rate testing

    NASA Technical Reports Server (NTRS)

    Fujikawa, Gene; Kerczewski, Robert J.

    1992-01-01

    The Systems Integration, Test, and Evaluation (SITE) Project at NASA LeRC encompasses a number of research and technology areas of satellite communications systems. Phase 1 of this project established a complete satellite link simulator system. The evaluation of proof-of-concept microwave devices, radiofrequency (RF) and bit-error-rate (BER) testing of hardware, testing of remote airlinks, and other tests were performed as part of this first testing phase. This final report covers the test results produced in phase 1 of the SITE Project. The data presented include 20-GHz high-power-amplifier testing, 30-GHz low-noise-receiver testing, amplitude equalization, transponder baseline testing, switch matrix tests, and continuous-wave and modulated interference tests. The report also presents the methods used to measure the RF and BER performance of the complete system. Correlations of the RF and BER data are summarized to note the effects of the RF responses on the BER.

  3. FIELD OPERATIONS AND METHODS FOR MEASURING THE ECOLOGICAL CONDITION OF NON-WADEABLE RIVERS AND STREAMS

    EPA Science Inventory

    The methods and instructions for field operations presented in this manual for surveys of non-wadeable streams and rivers were developed and tested based on 55 sample sites in the Mid-Atlantic region and 53 sites in an Oregon study during two years of pilot and demonstration proj...

  4. 40 CFR 60.63 - Monitoring of operations.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... § 60.62, you must demonstrate compliance through an initial performance test. You will conduct your performance test using Method 5 or Method 5I at appendix A-3 to part 60 of this chapter. You must also monitor... CPMS, you will establish a site-specific operating limit. If your PM performance test demonstrates your...

  5. 40 CFR 60.63 - Monitoring of operations.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... § 60.62, you must demonstrate compliance through an initial performance test. You will conduct your performance test using Method 5 or Method 5I at appendix A-3 to part 60 of this chapter. You must also monitor... CPMS, you will establish a site-specific operating limit. If your PM performance test demonstrates your...

  6. Quantification of aquifer properties with surface nuclear magnetic resonance in the Platte River valley, central Nebraska, using a novel inversion method

    USGS Publications Warehouse

    Irons, Trevor P.; Hobza, Christopher M.; Steele, Gregory V.; Abraham, Jared D.; Cannia, James C.; Woodward, Duane D.

    2012-01-01

    Surface nuclear magnetic resonance, a noninvasive geophysical method, measures a signal directly related to the amount of water in the subsurface. This allows for low-cost quantitative estimates of hydraulic parameters. In practice, however, additional factors influence the signal, complicating interpretation. The U.S. Geological Survey, in cooperation with the Central Platte Natural Resources District, evaluated whether hydraulic parameters derived from surface nuclear magnetic resonance data could provide valuable input into groundwater models used for evaluating water-management practices. Two calibration sites in Dawson County, Nebraska, were chosen based on previous detailed hydrogeologic and geophysical investigations. At both sites, surface nuclear magnetic resonance data were collected, and derived parameters were compared with results from four constant-discharge aquifer tests previously conducted at those same sites. Additionally, borehole electromagnetic-induction flowmeter data were analyzed as a less-expensive surrogate for traditional aquifer tests. Building on recent work, a novel surface nuclear magnetic resonance modeling and inversion method was developed that incorporates electrical conductivity and effects due to magnetic-field inhomogeneities, both of which can have a substantial impact on the data. After comparing surface nuclear magnetic resonance inversions at the two calibration sites, the nuclear magnetic-resonance-derived parameters were compared with previously performed aquifer tests in the Central Platte Natural Resources District. This comparison served as a blind test for the developed method. The nuclear magnetic-resonance-derived aquifer parameters were in agreement with results of aquifer tests where the environmental noise allowed data collection and the aquifer test zones overlapped with the surface nuclear magnetic resonance testing. In some cases, the previously performed aquifer tests were not designed fully to characterize the aquifer, and the surface nuclear magnetic resonance was able to provide missing data. In favorable locations, surface nuclear magnetic resonance is able to provide valuable noninvasive information about aquifer parameters and should be a useful tool for groundwater managers in Nebraska.

  7. Site Classification using Multichannel Channel Analysis of Surface Wave (MASW) method on Soft and Hard Ground

    NASA Astrophysics Data System (ADS)

    Ashraf, M. A. M.; Kumar, N. S.; Yusoh, R.; Hazreek, Z. A. M.; Aziman, M.

    2018-04-01

The average shear wave velocity over the top 30 meters of the subsurface, Vs(30), is a typical parameter for site classification. Numerous geophysical methods have been proposed for estimating shear wave velocity, using an assortment of testing configurations, processing methods, and inversion algorithms. The Multichannel Analysis of Surface Waves (MASW) method is widely practiced by specialists and professionals in geotechnical engineering for local site characterization and classification. This study aims to determine the site classification of soft and hard ground using the MASW method. The subsurface classification was made using the National Earthquake Hazards Reduction Program (NEHRP) and International Building Code (IBC) classifications. Two sites were chosen for acquiring shear wave velocities: one in the state of Pulau Pinang for soft soil and one in Perlis for hard rock. The results indicate that the MASW technique can be used to map the spatial distribution of shear wave velocity (Vs(30)) in soil and rock in order to characterize and classify sites.
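
    A minimal sketch of the Vs(30) calculation and NEHRP-style classification referred to in the abstract is shown below; the layer model is hypothetical and the class boundaries are the commonly used values, not site-specific ones.

        # Vs30 is the time-averaged shear wave velocity over the top 30 m:
        #   Vs30 = 30 / sum(h_i / Vs_i)
        # The layer profile is assumed to extend to at least 30 m depth.
        def vs30(thicknesses_m, velocities_mps):
            depth, travel_time = 0.0, 0.0
            for h, v in zip(thicknesses_m, velocities_mps):
                use = min(h, 30.0 - depth)          # only count the top 30 m
                travel_time += use / v
                depth += use
                if depth >= 30.0:
                    break
            return 30.0 / travel_time

        def nehrp_class(v):
            if v > 1500: return "A (hard rock)"
            if v > 760:  return "B (rock)"
            if v > 360:  return "C (very dense soil / soft rock)"
            if v > 180:  return "D (stiff soil)"
            return "E (soft soil)"

        v = vs30([5, 10, 20], [180, 300, 600])      # hypothetical layered profile
        print(f"Vs30 = {v:.0f} m/s -> site class {nehrp_class(v)}")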

  8. Evaluation of modifications of the traditional patch test in assessing the chemical irritation potential of feminine hygiene products.

    PubMed

    Farage, Miranda A; Meyer, Sandy; Walter, Dave

    2004-05-01

The first main objective of the work presented in this paper was to investigate ways of optimizing the current arm patch test protocol by (1) increasing the sensitivity of the test in order to evaluate more effectively the products that are inherently non-irritating, and/or (2) reducing the costs of these types of studies by shortening the protocol. The second main objective was to use the results of these studies and the results of the parallel studies conducted using the behind-the-knee method to better understand the contribution of mechanical irritation to the skin effects produced by these types of products. In addition, we were interested in continuing the evaluation of sensory effects and their relationship to objective measures of irritation. Test materials were prepared from three currently marketed feminine protection pads. Wet and dry samples were applied to the upper arm using the standard 24-h patch test. Applications were repeated daily for 4 consecutive days. The test sites were scored for irritation prior to the first patch application, and 30-60 min after removal of each patch. Some test sites were treated by tape stripping the skin prior to the initial patch application. In addition, in one experiment, panelists were asked to keep a daily diary describing any sensory skin effects they noticed at each test site. All protocol variations ([intact skin/dry samples], [compromised skin/dry samples], [intact skin/wet samples], and [compromised skin/wet samples]) gave similar results for the products tested. When compared to the behind-the-knee test method, the standard upper arm patch test gave consistently lower levels of irritation when the test sites were scored shortly after patch removal, even though the sample application was longer (24 vs. 6 h) in the standard patch test. The higher level of irritation in the behind-the-knee method was likely due to mechanical irritation. The sensory skin effects did not appear to be related to a particular test product or a particular protocol variation. However, the mean irritation scores at those sites where a sensory effect was reported were higher than the mean irritation scores at those sites where no sensory effects were reported. All four protocol variations of the standard upper arm patch test can be used to assess the inherent chemical irritant properties of feminine protection products. For these products, which are inherently non-irritating, tape stripping and/or applying wet samples does not increase the sensitivity of the patch test method. Differences in irritation potential were apparent after one to three 24-h applications. Therefore, the standard patch test protocol can be shortened to three applications without compromising our ability to detect differences in the chemical irritation produced by the test materials. The patch test can be used to evaluate effectively the inherent chemical irritation potential of these types of products. However, this method is not suitable for testing the mechanical irritation due to friction that occurs during product use. There is no relationship between specific test conditions, i.e., compromised skin and/or testing wet samples, and reports of perceived sensory reactions. However, there seems to be a clear relationship between sensory reactions and objective irritation scores.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    K.E. Rasmuson

    The U.S. Department of Energy has implemented a program to reclaim lands disturbed by site characterization at Yucca Mountain. Long term goals of the program are to re-establish processes on disturbed sites that will lead to self-sustaining plant communities. The Biological Opinion for Yucca Mountain Site Characterization Studies required that the U.S. Department of Energy develop a Reclamation Standards and Monitoring Plan to evaluate the success of reclamation efforts. According to the Reclamation Standards and Monitoring Plan, reclaimed sites will be monitored periodically, remediated if necessary, and eventually compared to an appropriate reference area to determine whether reclamation goals have been achieved and the site can be released from further monitoring. Plant cover, density, and species richness (success parameters) on reclaimed sites are compared to 60 percent of the values (success criteria) for the same parameters on the reference area. Small sites (less than 0.1 ha) are evaluated for release using qualitative methods while large sites (greater than 0.1 ha) are evaluated using quantitative methods. In the summer of 2000, 31 small sites reclaimed in 1993 and 1994 were evaluated for reclamation success and potential release from further monitoring. Plant density, cover, and species richness were estimated on the C-Well Pipeline, UE-25 Large Rocks test site, and 29 ground surface facility test pits. Evidence of erosion, reproduction and natural recruitment, exotic species abundance, and animal use (key attributes) also were recorded for each site and used in success evaluations. The C-Well Pipeline and ground surface facility test pits were located in a ''Larrea tridentata - Ephedra nevadensis'' vegetation association while the UE-25 Large Rocks test site was located in an area dominated by ''Coleogyne ramosissima and Ephedra nevadensis''. Reference areas in the same vegetation associations with similar slope and aspect were chosen for comparison to the reclaimed sites. Sixty percent of the reference area means for density, cover, and species richness were compared to the estimated means for the reclaimed sites. Plant density, cover, and species richness at the C-Well Pipeline and UE-25 Large Rocks test site were greater than the success criteria and all key attributes indicated the sites were in acceptable condition. Therefore, these two sites were recommended for release from further monitoring. Of the 29 ground surface facility test pits, 26 met the criterion for density, 21 for cover, and 23 for species richness. When key attributes and conditions of the plant community near each pit were taken into account, 27 of these pits were recommended for release. Success parameters and key attributes at ground surface facility test pits 19 and 20 were inadequate for site release. Transplants of native species were added to these two sites in 2001 to improve density, cover, and species richness.
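
    The release evaluation described above boils down to a threshold comparison: each success parameter on the reclaimed site must reach at least 60 percent of the corresponding reference-area mean, with key attributes reviewed separately. A minimal sketch of that comparison is given below; the parameter values and the meets_success_criteria helper are illustrative assumptions, not data or code from the monitoring program.

```python
# Minimal sketch of the release-evaluation logic described above.
# Values are illustrative placeholders, not data from the Yucca Mountain study.

SUCCESS_FRACTION = 0.60  # success criteria = 60% of reference-area means

def meets_success_criteria(site_means, reference_means, fraction=SUCCESS_FRACTION):
    """Return a dict of pass/fail flags for each success parameter."""
    return {
        param: site_means[param] >= fraction * reference_means[param]
        for param in reference_means
    }

reference = {"density": 10.0, "cover": 18.0, "richness": 12.0}  # hypothetical reference-area means
reclaimed = {"density": 7.5, "cover": 11.5, "richness": 9.0}    # hypothetical reclaimed-site means

flags = meets_success_criteria(reclaimed, reference)
recommend_release = all(flags.values())  # key attributes would also be reviewed before release
print(flags, recommend_release)
```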

  10. LEAK AND GAS PERMEABILITY TESTING DURING SOIL-GAS SAMPLING AT HAL'S CHEVRON LUST SITE IN GREEN RIVER, UTAH

    EPA Science Inventory

    The results of gas permeability and leak testing during active soil-gas sampling at Hal’s Chevron LUST Site in Green River, Utah are presented. This study was conducted to support development of a passive soil-gas sampling method. Gas mixtures containing helium and methane were...

  11. A new detection method for the K variant of butyrylcholinesterase based on PCR primer introduced restriction analysis (PCR-PIRA).

    PubMed Central

    Shibuta, K; Abe, M; Suzuki, T

    1994-01-01

    The K variant of human butyrylcholinesterase is caused by a G/A transition in the butyrylcholinesterase gene, which neither creates nor destroys any restriction site. In an attempt to detect the K variant both simply and rapidly, we developed a two-step method of "PCR primer introduced restriction analysis" (PCR-PIRA). The first step was used to introduce a new Fnu4HI site into the normal allele for a screening test, while the second step was performed to create a new MaeIII site on the variant allele for a specific test. This method thus enabled us to distinguish clearly the K variant from the normal allele, and also showed that the frequency of the K variant allele is 0.164 in the Japanese population. PMID:7966197

  12. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM10-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  13. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM10-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  14. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM10−2.5.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  15. A simple linear model for estimating ozone AOT40 at forest sites from raw passive sampling data.

    PubMed

    Ferretti, Marco; Cristofolini, Fabiana; Cristofori, Antonella; Gerosa, Giacomo; Gottardini, Elena

    2012-08-01

    A rapid, empirical method is described for estimating weekly AOT40 from ozone concentrations measured with passive samplers at forest sites. The method is based on linear regression and was developed after three years of measurements in Trentino (northern Italy). It was tested against an independent set of data from passive sampler sites across Italy. It provides good weekly estimates compared with those measured by conventional monitors (0.85 ≤ R² ≤ 0.970; 97 ≤ RMSE ≤ 302). Estimates obtained using passive sampling at forest sites are comparable to those obtained by another estimation method based on modelling hourly concentrations (R² = 0.94; 131 ≤ RMSE ≤ 351). Regression coefficients of passive sampling are similar to those obtained with conventional monitors at forest sites. Testing against an independent dataset generated by passive sampling provided similar results (0.86 ≤ R² ≤ 0.99; 65 ≤ RMSE ≤ 478). Errors tend to accumulate when weekly AOT40 estimates are summed to obtain the total AOT40 over the May-July period, and the median deviation between the two estimation methods based on passive sampling is 11%. The method proposed does not require any assumptions, complex calculation or modelling technique, and can be useful when other estimation methods are not feasible, either in principle or in practice. However, the method is not useful when estimates of hourly concentrations are of interest.
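
    As a rough illustration of the kind of weekly linear model the abstract describes, the sketch below fits weekly AOT40 against the weekly mean ozone concentration from passive samplers and then sums weekly estimates over a season. The calibration pairs, coefficients and the estimate_weekly_aot40 helper are invented for illustration; the paper's actual regression is not reproduced here.

```python
# Hedged sketch of a weekly linear AOT40 estimator calibrated against a
# conventional monitor; all numbers are invented for illustration.
import numpy as np

# Paired calibration data: weekly mean O3 from passive samplers (ppb) and
# weekly AOT40 from a co-located conventional monitor (ppb·h).
weekly_mean_o3 = np.array([35.0, 42.0, 50.0, 58.0, 65.0])
weekly_aot40 = np.array([400.0, 900.0, 1800.0, 2600.0, 3400.0])

slope, intercept = np.polyfit(weekly_mean_o3, weekly_aot40, 1)

def estimate_weekly_aot40(mean_o3_ppb):
    """Estimate weekly AOT40 (ppb·h) from a passive-sampler weekly mean."""
    return max(0.0, slope * mean_o3_ppb + intercept)

# The seasonal AOT40 (e.g. May-July) would then be the sum of the weekly estimates.
season_estimate = sum(estimate_weekly_aot40(x) for x in [48.0, 55.0, 61.0])
print(round(season_estimate), "ppb·h (illustrative)")
```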

  16. Screening tests for the rapid detection of diarrhetic shellfish toxins in Washington State.

    PubMed

    Eberhart, Bich-Thuy L; Moore, Leslie K; Harrington, Neil; Adams, Nicolaus G; Borchert, Jerry; Trainer, Vera L

    2013-09-30

    The illness of three people due to diarrhetic shellfish poisoning (DSP) following their ingestion of recreationally harvested mussels from Sequim Bay State Park in the summer of 2011 resulted in intensified monitoring for diarrhetic shellfish toxins (DSTs) in Washington State. Rapid testing at remote sites was proposed as a means to provide early warning of DST events in order to protect human health and allow growers to test "pre-harvest" shellfish samples, thereby preventing harvest of toxic product that would later be destroyed or recalled. Tissue homogenates from several shellfish species collected from two sites in Sequim Bay, WA in the summer of 2012, as well as other sites throughout Puget Sound, were analyzed using three rapid screening methods: a lateral flow antibody-based test strip (Jellett Rapid Test), an enzyme-linked immunosorbent assay (ELISA) and a protein phosphatase 2A inhibition assay (PP2A). The results were compared to the standard regulatory method of liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The Jellett Rapid Test for DSP gave an unacceptable number of false negatives due to incomplete extraction of DSTs using the manufacturer's recommended method, while the ELISA antibody had low cross-reactivity with dinophysistoxin-1, the major toxin isomer in shellfish from the region. The PP2A test showed the greatest promise as a screening tool for Washington State shellfish harvesters.

  17. Screening Tests for the Rapid Detection of Diarrhetic Shellfish Toxins in Washington State

    PubMed Central

    Eberhart, Bich-Thuy L.; Moore, Leslie K.; Harrington, Neil; Adams, Nicolaus G.; Borchert, Jerry; Trainer, Vera L.

    2013-01-01

    The illness of three people due to diarrhetic shellfish poisoning (DSP) following their ingestion of recreationally harvested mussels from Sequim Bay State Park in the summer of 2011 resulted in intensified monitoring for diarrhetic shellfish toxins (DSTs) in Washington State. Rapid testing at remote sites was proposed as a means to provide early warning of DST events in order to protect human health and allow growers to test “pre-harvest” shellfish samples, thereby preventing harvest of toxic product that would later be destroyed or recalled. Tissue homogenates from several shellfish species collected from two sites in Sequim Bay, WA in the summer of 2012, as well as other sites throughout Puget Sound, were analyzed using three rapid screening methods: a lateral flow antibody-based test strip (Jellett Rapid Test), an enzyme-linked immunosorbent assay (ELISA) and a protein phosphatase 2A inhibition assay (PP2A). The results were compared to the standard regulatory method of liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The Jellett Rapid Test for DSP gave an unacceptable number of false negatives due to incomplete extraction of DSTs using the manufacturer’s recommended method, while the ELISA antibody had low cross-reactivity with dinophysistoxin-1, the major toxin isomer in shellfish from the region. The PP2A test showed the greatest promise as a screening tool for Washington State shellfish harvesters. PMID:24084788

  18. Evaluation of moxifloxacin-hydroxyapatite composite graft in the regeneration of intrabony defects: A clinical, radiographic, and microbiological study

    PubMed Central

    Nagarjuna Reddy, Y. V.; Deepika, P. C.; Venkatesh, M. P.; Rajeshwari, K. G.

    2016-01-01

    Background: The formation of new connective periodontal attachment is contingent upon the elimination or marked reduction of pathogens at the treated periodontal site. An antimicrobial agent, moxifloxacin, has been incorporated into the bone graft to control infection and facilitate healing during and after periodontal therapy. Materials and Methods: By purposive sampling, 15 patients with at least two contralateral vertical defect sites were selected. The selected sites in each individual were divided randomly into test and control sites according to a split-mouth design. Test sites received moxifloxacin-hydroxyapatite composite graft and control sites received hydroxyapatite-placebo gel composite graft. Probing depth (PD) and clinical attachment level (CAL) were assessed at baseline, 3, 6, 9, and 12 months. Bone probing depth (BPD) and hard tissue parameters such as amount of defect fill, percentage of defect fill, and changes in alveolar crest were assessed at baseline, 6, and 12 months. Changes in subgingival microflora were also assessed by culturing the subgingival plaque samples at baseline and at 3-month follow-up. The clinical, radiographic, and microbiological data obtained were subjected to statistical analysis using descriptive statistics, paired sample t-test, independent t-test, and contingency test. Results: On intragroup comparison at test and control sites, there was a significant improvement in all clinical and radiographic parameters. However, on intergroup comparison of the same, there was no statistically significant difference between test and control sites at any interval. Although test sites showed a slightly higher amount of bone fill, it was not statistically significant. There was a significant reduction in the counts of Aggregatibacter actinomycetemcomitans and Porphyromonas gingivalis at both sites from baseline to 3 months. In addition, there was a significant reduction at test sites as compared to control sites at 3-month follow-up (P = 0.003 and P = 0.013). Conclusion: The reduction in microbial counts found at test sites at 3-month follow-up did not bring similar significant improvements in the clinical and radiographic parameters, though the test sites showed slightly higher bone fill. PMID:27630501

  19. Applying site-index curves to northern hardwoods in New Hampshire

    Treesearch

    Dale S. Solomon

    1968-01-01

    Describes a new method for testing site-index curves. Study results indicate that Vermont site-index curves for yellow birch, paper birch, white ash, and sugar maple, and New York-Connecticut curves for red maple, can be applied satisfactorily in New Hampshire when used with certain precautions and corrections.

  20. Testing the applicability of rapid on-site enzymatic activity detection for surface water monitoring

    NASA Astrophysics Data System (ADS)

    Stadler, Philipp; Vogl, Wolfgang; Juri, Koschelnik; Markus, Epp; Maximilian, Lackner; Markus, Oismüller; Monika, Kumpan; Peter, Strauss; Regina, Sommer; Gabriela, Ryzinska-Paier; Farnleitner Andreas, H.; Matthias, Zessner

    2015-04-01

    On-site detection of enzymatic activities has been suggested as a rapid surrogate for microbiological pollution monitoring of water resources (e.g. using glucuronidases, galactosidases, esterases). Because of the short possible measuring intervals, enzymatic methods have high potential as near-real-time water quality monitoring tools. This presentation describes results from a long-term field test. For twelve months, two ColiMinder devices (Vienna Water Monitoring, Austria) for on-site determination of enzymatic activity were tested for stream water monitoring at the experimental catchment HOAL (Hydrological Open Air Laboratory, Center for Water Resource Systems, Vienna University of Technology). Overall, the devices were able to follow and reflect the diverse hydrological and microbiological conditions of the monitored stream during the test period. Continuous data at high temporal resolution captured the course of enzymatic activity in stream water during diverse rainfall events. The method also proved sensitive enough to determine diurnal fluctuations of enzymatic activity in stream water during dry periods. The method was able to capture a seasonal trend of enzymatic activity in stream water that matches the results gained from Colilert-18 analysis for E. coli and coliform bacteria in monthly grab samples. Furthermore, the comparison of ColiMinder data with measurements gained at the same test site with devices using the same method but having a different construction design (BACTcontrol, microLAN) showed consistent measuring results. Comparative analysis showed significant differences between measured enzymatic activity (modified Fishman units and pmol/min/100 ml) and cultivation-based analyses (most probable number, colony forming units). Enzymatic activity measurements can, in principle, detect the activity of all active target bacteria, including VBNC (viable but nonculturable) cells, while cultivation-based methods cannot detect VBNC bacteria. Therefore, the applicability of on-site enzymatic activity determination as a direct surrogate or proxy parameter for standard microbiological assays and for quantification of fecal indicator bacteria (FIB) concentrations could not be confirmed, and further research in this field is necessary. At present we conclude that rapid on-site detection of enzymatic activity is applicable for surface water monitoring and that it constitutes a complementary on-site monitoring parameter with high potential. Selection of the type of measured enzymatic activity has to be done on a catchment-specific basis, and further work is needed to learn more about its detailed information characteristics in different habitats. With continuous, high-temporal-resolution data on the enzymatic activity of target bacteria, this method is on its way to becoming a powerful tool for water quality monitoring, health-related water quality assessment and early-warning requirements.

  1. Regeneration alternatives for upland white spruce after burning and logging in interior Alaska

    Treesearch

    R. V. Densmore; G. P. Juday; John C. Zasada

    1999-01-01

    Site-preparation and regeneration methods for white spruce (Picea glauca (Moench) Voss) were tested near Fairbanks, Alaska, on two upland sites which had been burned in a wildfire and salvage logged. After 5 and 10 years, white spruce regeneration did not differ among the four scarification methods but tended to be lower without scarification....

  2. 40 CFR 63.5719 - How do I conduct a performance test?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... appendix A to 40 CFR part 60, as appropriate, to select the sampling sites. (2) Use Method 2, 2A, 2C, 2D... emissions. (4) You may use American Society for Testing and Materials (ASTM) D6420-99 (available for... parts being made and material application methods. The production conditions during the test must also...

  3. Study on evaluation methods for Rayleigh wave dispersion characteristic

    USGS Publications Warehouse

    Shi, L.; Tao, X.; Kayen, R.; Shi, H.; Yan, S.

    2005-01-01

    The evaluation of Rayleigh wave dispersion characteristics is the key step in detecting S-wave velocity structure. By comparing the dispersion curves directly with those from the spectral analysis of surface waves (SASW) method, rather than comparing the S-wave velocity structures, the validity and precision of the microtremor-array method (MAM) can be evaluated more objectively. The results from the China-US joint surface wave investigation at 26 sites in Tangshan, China, show that the MAM has the same precision as the SASW method at 83% of the 26 sites. The MAM is valid for Rayleigh wave dispersion testing and has great application potential for site S-wave velocity structure detection.

  4. Multimodality language mapping in patients with left-hemispheric language dominance on Wada test

    PubMed Central

    Kojima, Katsuaki; Brown, Erik C.; Rothermel, Robert; Carlson, Alanna; Matsuzaki, Naoyuki; Shah, Aashit; Atkinson, Marie; Mittal, Sandeep; Fuerst, Darren; Sood, Sandeep; Asano, Eishi

    2012-01-01

    Objective We determined the utility of electrocorticography (ECoG) and stimulation for detecting language-related sites in patients with left-hemispheric language-dominance on Wada test. Methods We studied 13 epileptic patients who underwent language mapping using event-related gamma-oscillations on ECoG and stimulation via subdural electrodes. Sites showing significant gamma-augmentation during an auditory-naming task were defined as language-related ECoG sites. Sites at which stimulation resulted in auditory perceptual changes, failure to verbalize a correct answer, or sensorimotor symptoms involving the mouth were defined as language-related stimulation sites. We determined how frequently these methods revealed language-related sites in the superior-temporal, inferior-frontal, dorsolateral-premotor, and inferior-Rolandic regions. Results Language-related sites in the superior-temporal and inferior-frontal gyri were detected by ECoG more frequently than stimulation (p < 0.05), while those in the dorsolateral-premotor and inferior-Rolandic regions were detected by both methods equally. Stimulation of language-related ECoG sites, compared to the others, more frequently elicited language symptoms (p < 0.00001). One patient developed dysphasia requiring in-patient speech therapy following resection of the dorsolateral-premotor and inferior-Rolandic regions containing language-related ECoG sites not otherwise detected by stimulation. Conclusions Language-related gamma-oscillations may serve as an alternative biomarker of underlying language function in patients with left-hemispheric language-dominance. Significance Measurement of language-related gamma-oscillations is warranted in presurgical evaluation of epileptic patients. PMID:22503906

  5. New methods for engineering site characterization using reflection and surface wave seismic survey

    NASA Astrophysics Data System (ADS)

    Chaiprakaikeow, Susit

    This study presents two new seismic testing methods for engineering applications: a new shallow seismic reflection method and Time Filtered Analysis of Surface Waves (TFASW). Both methods are described in this dissertation. The new shallow seismic reflection method was developed to measure reflection at a single point using two to four receivers, assuming homogeneous, horizontal layering. It uses one or more shakers driven by a swept sine function as a source, and the cross-correlation technique to identify wave arrivals. The phase difference between the source forcing function and the ground motion due to the dynamic response of the shaker-ground interface was corrected by using a reference geophone. Attenuated high-frequency energy was also recovered using whitening in the frequency domain. The new shallow seismic reflection testing was performed at the crest of Porcupine Dam in Paradise, Utah. The testing used two horizontal Vibroseis sources and four receivers for spacings between 6 and 300 ft. Unfortunately, the results showed no clear evidence of the reflectors despite correction of the magnitude and phase of the signals. However, an improvement in the shape of the cross-correlations was noticed after the corrections. The results showed distinct primary lobes in the corrected cross-correlated signals up to 150 ft offset. More consistent maximum peaks were observed in the corrected waveforms. TFASW is a new surface (Rayleigh) wave method to determine the shear wave velocity profile at a site. It is a time domain method, as opposed to the Spectral Analysis of Surface Waves (SASW) method, which is a frequency domain method. This method uses digital filtering to optimize the bandwidth used to determine the dispersion curve. Results from testing at three different sites in Utah indicated good agreement between the dispersion curves measured using the TFASW and SASW methods. The advantage of the TFASW method is that the dispersion curves had less scatter at long wavelengths as a result of the wider bandwidth used in those tests.
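
    Two of the processing steps described above, cross-correlating a receiver trace with the swept-sine source function to identify wave arrivals and whitening the spectrum to recover attenuated high-frequency energy, can be sketched as follows. This is an illustrative reconstruction with synthetic signals, not the dissertation's processing code; the sweep parameters, travel time and noise level are arbitrary assumptions.

```python
# Illustrative sketch: sweep cross-correlation for arrival picking and
# spectral whitening. All signal parameters are arbitrary.
import numpy as np

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
sweep = np.sin(2 * np.pi * (10 + (90 - 10) * t / t[-1] / 2) * t)  # 10-90 Hz linear sweep

delay = 0.25                                  # hypothetical travel time, s
trace = np.zeros_like(sweep)
n = int(delay * fs)
trace[n:] = 0.5 * sweep[: len(sweep) - n]     # delayed, attenuated arrival
trace += 0.05 * np.random.randn(len(trace))   # additive noise

# Cross-correlation compresses the sweep; the lag of the peak estimates the arrival time.
xcorr = np.correlate(trace, sweep, mode="full")
lags = np.arange(-len(sweep) + 1, len(sweep))
arrival_s = lags[np.argmax(xcorr)] / fs
print("picked arrival ~", round(arrival_s, 3), "s")

# Spectral whitening: flatten the amplitude spectrum while keeping the phase.
spec = np.fft.rfft(trace)
whitened = np.fft.irfft(spec / (np.abs(spec) + 1e-12), n=len(trace))
```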

  6. Testing the regionalization of a SVAT model for a region with high observation density

    NASA Astrophysics Data System (ADS)

    Eiermann, Sven; Thies, Boris; Bendix, Jörg

    2014-05-01

    Soil moisture is an important quantity in weather and climate investigations, because it has an essential influence on the energy exchange between the land surface and the atmosphere. However, recording soil moisture at high spatio-temporal resolution is problematic. The planned Tandem-L mission of the German Aerospace Center (DLR), with an innovative L-band radar on board, provides the opportunity to obtain daily soil moisture data at a spatial resolution of 50 meters. Within the Helmholtz Alliance Remote Sensing and Earth System Dynamics, these data are planned to be used to regionalize a Soil Vegetation Atmosphere Transfer (SVAT) model, in order to analyze the energy flux and gas exchange and to improve the prediction of the water exchange between soil, vegetation and atmosphere. Selected regions of the TERENO (TERrestrial ENvironmental Observatories) test sites and, later on, a region in southern Ecuador will be used as investigation areas, for which data for model initialization and validation are available. The reason for testing the method at the TERENO test sites first is the good data basis resulting from the high observation density already established there. The poster will present the methods being used for adapting the model to the TERENO test sites and discuss the improvements achieved by these methods.

  7. SU-F-R-33: Can CT and CBCT Be Used Simultaneously for Radiomics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, R; Wang, J; Zhong, H

    2016-06-15

    Purpose: To investigate whether CBCT and CT can be used in radiomics analysis simultaneously, and to establish a batch correction method for radiomics in two similar image modalities. Methods: Four sites, including rectum, bladder, femoral head and lung, were considered as regions of interest (ROIs) in this study. For each site, 10 treatment planning CT images were collected, and 10 CBCT images of the same site in the same patients were acquired at the first radiotherapy fraction. A total of 253 radiomics features, selected in our test-retest study of rectal cancer CT (ICC > 0.8), were calculated for both CBCT and CT images in MATLAB. Simple scaling (z-score) and nonlinear correction methods were applied to the CBCT radiomics features. The Pearson correlation coefficient was calculated to analyze the correlation between radiomics features of CT and CBCT images before and after correction. Cluster analysis of mixed data (for each site, 5 CT and 5 CBCT datasets randomly selected) was implemented to validate the feasibility of merging radiomics data from CBCT and CT. The consistency between the clustering results and the site grouping was verified by a chi-square test for each dataset. Results: For simple scaling, 234 of the 253 features had a correlation coefficient ρ > 0.8, of which 154 features had ρ > 0.9. For radiomics data after nonlinear correction, 240 of the 253 features had ρ > 0.8, of which 220 features had ρ > 0.9. Cluster analysis of the mixed data showed that the data from the four sites were almost completely separated for simple scaling (p = 1.29 × 10⁻⁷, χ² test) and nonlinear correction (p = 5.98 × 10⁻⁷, χ² test), similar to the clustering result for CT data alone (p = 4.52 × 10⁻⁸, χ² test). Conclusion: Radiomics data from CBCT can be merged with those from CT by simple scaling or nonlinear correction for radiomics analysis.
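
    A minimal sketch of the simple-scaling step and the per-feature correlation check is given below. The feature matrices are random stand-ins with an assumed linear CT-CBCT relationship; the nonlinear correction, clustering and chi-square steps of the study are not reproduced.

```python
# Minimal sketch: z-score ("simple scaling") of feature tables and per-feature
# Pearson correlation between CT and CBCT. Data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_patients, n_features = 10, 253

ct = rng.normal(size=(n_patients, n_features))
cbct = 1.8 * ct + 0.5 + rng.normal(scale=0.2, size=ct.shape)  # assumed linear relation + noise

def zscore(x):
    """Simple scaling: standardize each feature column to zero mean, unit variance."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

ct_z, cbct_z = zscore(ct), zscore(cbct)

# Per-feature Pearson correlation between (scaled) CT and CBCT values.
rho = np.array([np.corrcoef(ct_z[:, j], cbct_z[:, j])[0, 1] for j in range(n_features)])
print("features with rho > 0.8:", int((rho > 0.8).sum()),
      "; with rho > 0.9:", int((rho > 0.9).sum()), "of", n_features)
```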

  8. Uptake of Workplace HIV Counselling and Testing: A Cluster-Randomised Trial in Zimbabwe

    PubMed Central

    Corbett, Elizabeth L; Dauya, Ethel; Matambo, Ronnie; Cheung, Yin Bun; Makamure, Beauty; Bassett, Mary T; Chandiwana, Steven; Munyati, Shungu; Mason, Peter R; Butterworth, Anthony E; Godfrey-Faussett, Peter; Hayes, Richard J

    2006-01-01

    Background HIV counselling and testing is a key component of both HIV care and HIV prevention, but uptake is currently low. We investigated the impact of rapid HIV testing at the workplace on uptake of voluntary counselling and testing (VCT). Methods and Findings The study was a cluster-randomised trial of two VCT strategies, with business occupational health clinics as the unit of randomisation. VCT was directly offered to all employees, followed by 2 y of open access to VCT and basic HIV care. Businesses were randomised to either on-site rapid HIV testing at their occupational clinic (11 businesses) or to vouchers for off-site VCT at a chain of free-standing centres also using rapid tests (11 businesses). Baseline anonymised HIV serology was requested from all employees. HIV prevalence was 19.8% and 18.4%, respectively, at businesses randomised to on-site and off-site VCT. In total, 1,957 of 3,950 employees at clinics randomised to on-site testing had VCT (mean uptake by site 51.1%) compared to 586 of 3,532 employees taking vouchers at clinics randomised to off-site testing (mean uptake by site 19.2%). The risk ratio for on-site VCT compared to voucher uptake was 2.8 (95% confidence interval 1.8 to 3.8) after adjustment for potential confounders. Only 125 employees (mean uptake by site 4.3%) reported using their voucher, so that the true adjusted risk ratio for on-site compared to off-site VCT may have been as high as 12.5 (95% confidence interval 8.2 to 16.8). Conclusions High-impact VCT strategies are urgently needed to maximise HIV prevention and access to care in Africa. VCT at the workplace offers the potential for high uptake when offered on-site and linked to basic HIV care. Convenience and accessibility appear to have critical roles in the acceptability of community-based VCT. PMID:16796402
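
    Using only the counts quoted in the abstract, a crude (unadjusted) risk ratio and a conventional log-scale confidence interval can be computed as below. This sketch ignores the clustering by business and the confounder adjustment used in the trial, which is why it differs slightly from the published adjusted estimate of 2.8.

```python
# Crude risk ratio for on-site vs. off-site (voucher) VCT uptake, using the
# counts given in the abstract; clustering and confounder adjustment are ignored.
import math

a, n1 = 1957, 3950   # uptake and total, on-site arm
c, n2 = 586, 3532    # uptake and total, off-site (voucher) arm

rr = (a / n1) / (c / n2)
se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
ci_low, ci_high = (rr * math.exp(s * 1.96 * se_log_rr) for s in (-1, 1))
print(f"crude RR = {rr:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```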

  9. An overview of the refinements and improvements to the USEPA’s sediment toxicity methods for freshwater sediment

    EPA Science Inventory

    Sediment toxicity tests are used for contaminated sediments, chemical registration, and water quality criteria evaluations and can be a core component of ecological risk assessments at contaminated sediment sites. Standard methods for conducting sediment toxicity tests have been...

  10. Screening methods for assessment of biodegradability of chemicals in seawater--results from a ring test

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nyholm, N.; Kristensen, P.

    1992-04-01

    An international ring test involving 14 laboratories was organized on behalf of the Commission of the European Economic Community (EEC) with the purpose of evaluating two proposed screening methods for assessment of biodegradability in seawater: (a) a shake flask die-away test based primarily on analysis of dissolved organic carbon and (b) a closed bottle test based on determination of dissolved oxygen. Both tests are performed with nutrient-enriched natural seawater as the test medium and with no inoculum added other than the natural seawater microflora. The test methods are seawater versions of the modified OECD screening test and the closed bottle test, respectively, adopted by the Organization for Economic Cooperation and Development (OECD) and by the EEC as tests for ready biodegradability. The following five chemicals were examined: sodium benzoate, aniline, diethylene glycol, pentaerythritol, and 4-nitrophenol. Sodium benzoate and aniline, which are known to be generally readily biodegradable, consistently degraded in practically all tests, thus demonstrating the technical feasibility of the methods. As in previous ring tests with freshwater screening methods, variable results were obtained with the other three compounds, which is believed to be due primarily to site-specific differences between the microflora of the different seawater samples used and, to some extent, to differences in the applied concentrations of test material. A positive result with the screening methods indicates that the test substance will most likely degrade relatively rapidly in seawater from the site of collection, while a negative test result does not preclude biodegradability under environmental conditions where the concentrations of chemicals are much lower than the concentrations applied for analytical reasons in screening tests.

  11. Multi-level slug tests in highly permeable formations: 2. Hydraulic conductivity identification, method verification, and field applications

    USGS Publications Warehouse

    Zlotnik, V.A.; McGuire, V.L.

    1998-01-01

    Using the developed theory and modified Springer-Gelhar (SG) model, an identification method is proposed for estimating hydraulic conductivity from multi-level slug tests. The computerized algorithm calculates hydraulic conductivity from both monotonic and oscillatory well responses obtained using a double-packer system. Field verification of the method was performed at a specially designed fully penetrating well of 0.1-m diameter with a 10-m screen in a sand and gravel alluvial aquifer (MSEA site, Shelton, Nebraska). During well installation, disturbed core samples were collected every 0.6 m using a split-spoon sampler. Vertical profiles of hydraulic conductivity were produced on the basis of grain-size analysis of the disturbed core samples. These results closely correlate with the vertical profile of horizontal hydraulic conductivity obtained by interpreting multi-level slug test responses using the modified SG model. The identification method was applied to interpret the response from 474 slug tests in 156 locations at the MSEA site. More than 60% of responses were oscillatory. The method produced a good match to experimental data for both oscillatory and monotonic responses using an automated curve matching procedure. The proposed method allowed us to drastically increase the efficiency of each well used for aquifer characterization and to process massive arrays of field data. Recommendations generalizing this experience to massive application of the proposed method are developed.
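
    A hedged sketch of the curve-matching idea only: the snippet below fits a damped-oscillation model to a synthetic, normalized slug-test head record, as one would for the oscillatory responses mentioned above. The mapping from the fitted damping and frequency to hydraulic conductivity via the modified Springer-Gelhar model is not implemented here, and all parameter values are illustrative.

```python
# Fit a damped-oscillation model to a normalized slug-test head record.
# Synthetic data and parameter values are illustrative only; relating the
# fitted parameters to hydraulic conductivity requires the SG-type model.
import numpy as np
from scipy.optimize import curve_fit

def damped_response(t, damping, omega):
    """Normalized head for an underdamped response: exp(-damping*t) * cos(omega*t)."""
    return np.exp(-damping * t) * np.cos(omega * t)

t = np.linspace(0, 20.0, 200)                    # seconds
true_damping, true_omega = 0.30, 1.5
h_obs = damped_response(t, true_damping, true_omega) + 0.02 * np.random.randn(t.size)

(damping_fit, omega_fit), _ = curve_fit(damped_response, t, h_obs, p0=[0.1, 1.0])
print(f"fitted damping = {damping_fit:.3f} 1/s, angular frequency = {omega_fit:.3f} rad/s")
# A Springer-Gelhar-type model would then relate (damping_fit, omega_fit) and the
# well/screen geometry to the horizontal hydraulic conductivity.
```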

  12. EGFR mutation testing practices within the Asia Pacific region: results of a multicenter diagnostic survey.

    PubMed

    Yatabe, Yasushi; Kerr, Keith M; Utomo, Ahmad; Rajadurai, Pathmanathan; Tran, Van Khanh; Du, Xiang; Chou, Teh-Ying; Enriquez, Ma Luisa D; Lee, Geon Kook; Iqbal, Jabed; Shuangshoti, Shanop; Chung, Jin-Haeng; Hagiwara, Koichi; Liang, Zhiyong; Normanno, Nicola; Park, Keunchil; Toyooka, Shinichi; Tsai, Chun-Ming; Waring, Paul; Zhang, Li; McCormack, Rose; Ratcliffe, Marianne; Itoh, Yohji; Sugeno, Masatoshi; Mok, Tony

    2015-03-01

    The efficacy of epidermal growth factor receptor (EGFR) tyrosine kinase inhibitors in EGFR mutation-positive non-small-cell lung cancer (NSCLC) patients necessitates accurate, timely testing. Although EGFR mutation testing has been adopted by many laboratories in Asia, data are lacking on the proportion of NSCLC patients tested in each country, and the most commonly used testing methods. A retrospective survey of records from NSCLC patients tested for EGFR mutations during 2011 was conducted in 11 Asian Pacific countries at 40 sites that routinely performed EGFR mutation testing during that period. Patient records were used to complete an online questionnaire at each site. Of the 22,193 NSCLC patient records surveyed, 31.8% (95% confidence interval: 31.2%-32.5%) were tested for EGFR mutations. The rate of EGFR mutation positivity was 39.6% among the 10,687 cases tested. The majority of samples were biopsy and/or cytology samples (71.4%). DNA sequencing was the most commonly used testing method accounting for 40% and 32.5% of tissue and cytology samples, respectively. A pathology report was available only to 60.0% of the sites, and 47.5% were not members of a Quality Assurance Scheme. In 2011, EGFR mutation testing practices varied widely across Asia. These data provide a reference platform from which to improve the molecular diagnosis of NSCLC, and EGFR mutation testing in particular, in Asia.

  13. An evaluation of the utility of four in situ test methods for transmission line foundation design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mullen, W.G. Jr.

    1991-01-01

    This research examines four existing in-situ soil strength testing methods: the standard penetration test (SPT), the cone penetrometer test (CPT), the flat plate dilatometer (DMT), and the pressuremeter (PMT). Soils data were collected at eight separate sites using each of the devices. The test sites were chosen to mirror soil conditions encountered within the service territory of Virginia Power, the project sponsor. A total of 19 standard soil borings, 30 cone penetrometer soundings, 26 dilatometer soundings, and 33 pressuremeter tests were undertaken in residual, alluvial and marine clay soil conditions. The testing program was conducted with five areas of concern: (1) comparison of the penetration/stiffness data from the four tests, (2) comparison of values of undrained shear strength and angle of internal friction developed from each of the test methods, (3) determination of whether pressuremeter data can be correlated to and thereby developed from one of the more rapid tests, (4) comparison of indirect soil type identifications from the standard borings, and (5) development of information on the relative effort required for each test. Comparison of the penetration resistance/stiffness data produced useful correlations between the CPT and DMT, with the SPT data yielding more erratic results. Shear strength data were most consistent for the marine clay sites, while the CPT and DMT returned useful friction angle data in the alluvial sands. PMT data correlated well to both the CPT and DMT test results. Correlation of PMT results to the SPT was more erratic. Indirect soil identification from the CPT and DMT was fully adequate for transmission line foundation design purposes, and finally, useful comparative data on the relative testing time required for the four in-situ tests were developed.

  14. Retrospective dose assessment for the population living in areas of local fallout from the Semipalatinsk nuclear test site Part I: External exposure.

    PubMed

    Gordeev, Konstantin; Shinkarev, Sergey; Ilyin, Leonid; Bouville, André; Hoshi, Masaharu; Luckyanov, Nickolas; Simon, Steven L

    2006-02-01

    A short analysis of all 111 atmospheric events conducted at the Semipalatinsk Test Site (STS) in 1949-1962 with regard to significant off-site exposure (more than 5 mSv of the effective dose during the first year after the explosion) has been made. The analytical method used to assess external exposure to the residents living in settlements near the STS is described. This method makes use of the archival data on the radiological conditions, including the measurements of exposure rate. Special attention was given to the residents of Dolon and Kanonerka villages exposed mainly as a result of the first test, detonated on August 29, 1949. For the residents of those settlements born in 1935, the dose estimates calculated according to the analytical method, are compared to those derived from the thermoluminescence measurements in bricks and electron paramagnetic resonance measurements in teeth. The methods described in this paper were used for external dose assessment for the cohort members at an initial stage of an ongoing epidemiological study conducted by the U.S. National Cancer Institute in the Republic of Kazakhstan. Recently revised methods and estimates of external exposure for that cohort are given in another paper (Simon et al.) in this conference.

  15. Assessment of multiple geophysical techniques for the characterization of municipal waste deposit sites

    NASA Astrophysics Data System (ADS)

    Gaël, Dumont; Tanguy, Robert; Nicolas, Marck; Frédéric, Nguyen

    2017-10-01

    In this study, we tested the ability of geophysical methods to characterize a large technical landfill installed in a former sand quarry. The geophysical surveys specifically aimed at delimiting the horizontal extent of the deposit site, estimating its thickness, and characterizing the waste material composition (the moisture content in the present case). The site delimitation was conducted with electromagnetic (in-phase and out-of-phase) and magnetic (vertical gradient and total field) methods that clearly showed the transition between the waste deposit and the host formation. Regarding evaluation of the waste deposit thickness, electrical resistivity tomography (ERT) appeared ineffective on this particularly thick deposit. Thus, we propose a combination of horizontal-to-vertical noise spectral ratio (HVNSR) and multichannel analysis of surface waves (MASW), which successfully determined the approximate waste deposit thickness in our test landfill. However, ERT appeared to be an appropriate tool to characterize the moisture content of the waste, which is key prior information for the organic waste biodegradation process. The global multi-scale and multi-method geophysical survey offers valuable information for site rehabilitation studies, water content mitigation processes for enhanced biodegradation, or landfill mining operation planning.

  16. 40 CFR 60.704 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sites. The control device inlet sampling site for determination of vent stream molar composition or TOC... compliance with the 20 ppmv limit. The sampling site shall be the same as that of the TOC samples, and the samples shall be taken during the same time that the TOC samples are taken. The TOC concentration...

  17. 40 CFR 60.664 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sites. The control device inlet sampling site for determination of vent stream molar composition or TOC... compliance with the 20 ppmv limit. The sampling site shall be the same as that of the TOC samples, and the samples shall be taken during the same time that the TOC samples are taken. The TOC concentration...

  18. Web vulnerability study of online pharmacy sites.

    PubMed

    Kuzma, Joanne

    2011-01-01

    Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers' personal data. There is a gap in the research addressing security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, thus expanding on prior studies. Technical, managerial and legal recommendations on how to mitigate security issues are presented. The proposed four-step method first consists of choosing an online testing tool. The next steps involve choosing a list of 60 online pharmacy sites to test, and then running the software analysis to compile a list of flaws. Finally, an in-depth analysis is performed on the types of web application vulnerabilities. The majority of sites had serious vulnerabilities, with the most common flaws being cross-site scripting and old versions of software that had not been updated. A method is proposed for securing web pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.
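
    For illustration only, a passive check for the kinds of issues the abstract highlights (missing anti-XSS security headers, outdated software disclosed in server banners) might look like the sketch below. The URLs are placeholders, this is not the testing tool used in the study, and any real assessment requires a proper web-application scanner and the site owner's permission.

```python
# Illustrative passive check only: missing security headers and version
# disclosure in server banners. URLs are placeholders.
import requests

SITES = ["https://example-pharmacy-1.test", "https://example-pharmacy-2.test"]
EXPECTED_HEADERS = ["Content-Security-Policy", "X-Content-Type-Options", "Strict-Transport-Security"]

def passive_check(url):
    findings = []
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        return [f"request failed: {exc}"]
    for header in EXPECTED_HEADERS:
        if header not in resp.headers:
            findings.append(f"missing header: {header}")
    server = resp.headers.get("Server", "")
    if any(ch.isdigit() for ch in server):   # crude heuristic: version number in banner
        findings.append(f"server banner discloses version: {server}")
    return findings

for site in SITES:
    print(site, passive_check(site))
```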

  19. Comparing regeneration techniques for afforesting previously farmed bottomland hardwood sites in the Lower Mississippi Alluvial Valley, USA

    Treesearch

    Brian Roy Lockhart; Bob Keeland; John McCoy; Thomas J. Dean

    2003-01-01

    A study was implemented to test site preparation methods and artificial regeneration of three oak (Quercus spp.) species on four agricultural fields in the Lower Mississippi Alluvial Valley in Louisiana, USA. Six years after establishment, few consistent differences were found in oak density between sowing acorn methods (seed drill versus broadcast...

  20. Impacts of heterogeneous organic matter on phenanthrene sorption--Different soil and sediment samples

    USGS Publications Warehouse

    Karapanagioti, Hrissi K.; Childs, Jeffrey; Sabatini, David A.

    2001-01-01

    Organic petrography has been proposed as a tool for characterizing the heterogeneous organic matter present in soil and sediment samples. A new simplified method is proposed as a quantitative means of interpreting observed sorption behavior for phenanthrene in different soils and sediments based on their organic petrographic characterization. This method is tested under single-solute conditions at a phenanthrene concentration of 1 μg/L. Since the opaque organic matter fraction dominates the sorption process, we propose that by quantifying this fraction one can interpret organic-content-normalized sorption distribution coefficient (Koc) values for a sample. While this method was developed and tested for various samples within the same aquifer, in the current study the method is validated for soil and sediment samples from different sites that cover a wide range of organic matter origin, age, and organic content. All 10 soil and sediment samples studied had log Koc values for the opaque particles between 5.6 and 6.8. This range of Koc values illustrates the heterogeneity of opaque particles between sites and geological formations and thus the need to characterize the opaque fraction of materials on a site-by-site basis.
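
    For readers unfamiliar with the notation, the log Koc values quoted above are organic-carbon-normalized sorption coefficients, Koc = Kd / foc. A tiny worked example with invented numbers:

```python
# Worked illustration of the organic-carbon normalization behind log Koc:
# Koc = Kd / foc, with Kd the measured sorption distribution coefficient (L/kg)
# and foc the fraction of organic carbon. Numbers are invented, not study data.
import math

kd_l_per_kg = 1500.0   # hypothetical measured phenanthrene Kd
f_oc = 0.003           # hypothetical organic carbon fraction (0.3%)

koc = kd_l_per_kg / f_oc
print(f"Koc = {koc:.0f} L/kg, log Koc = {math.log10(koc):.2f}")
```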

  1. Detecting disease-predisposing variants: the haplotype method.

    PubMed Central

    Valdes, A M; Thomson, G

    1997-01-01

    For many HLA-associated diseases, multiple alleles-- and, in some cases, multiple loci--have been suggested as the causative agents. The haplotype method for identifying disease-predisposing amino acids in a genetic region is a stratification analysis. We show that, for each haplotype combination containing all the amino acid sites involved in the disease process, the relative frequencies of amino acid variants at sites not involved in disease but in linkage disequilibrium with the disease-predisposing sites are expected to be the same in patients and controls. The haplotype method is robust to mode of inheritance and penetrance of the disease and can be used to determine unequivocally whether all amino acid sites involved in the disease have not been identified. Using a resampling technique, we developed a statistical test that takes account of the nonindependence of the sites sampled. Further, when multiple sites in the genetic region are involved in disease, the test statistic gives a closer fit to the null expectation when some--compared with none--of the true predisposing factors are included in the haplotype analysis. Although the haplotype method cannot distinguish between very highly correlated sites in one population, ethnic comparisons may help identify the true predisposing factors. The haplotype method was applied to insulin-dependent diabetes mellitus (IDDM) HLA class II DQA1-DQB1 data from Caucasian, African, and Japanese populations. Our results indicate that the combination DQA1#52 (Arg predisposing) DQB1#57 (Asp protective), which has been proposed as an important IDDM agent, does not include all the predisposing elements. With rheumatoid arthritis HLA class II DRB1 data, the results were consistent with the shared-epitope hypothesis. PMID:9042931
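
    The stratification idea behind the haplotype method can be illustrated with a small permutation sketch: within each haplotype class defined by the candidate predisposing sites, allele frequencies at a linked test site are compared between patients and controls, and labels are reshuffled within strata to gauge how large a difference chance alone would produce. The haplotypes, allele codes, counts and test statistic below are generic illustrative assumptions, not the authors' resampling procedure.

```python
# Generic illustration of a stratified (haplotype-conditional) permutation test;
# haplotype labels, allele codes and counts are invented.
import random

random.seed(1)

# Each record: (haplotype at candidate predisposing sites, allele at linked test site)
patients = [("A", "x")] * 42 + [("A", "y")] * 8 + [("B", "x")] * 15 + [("B", "y")] * 35
controls = [("A", "x")] * 32 + [("A", "y")] * 8 + [("B", "x")] * 18 + [("B", "y")] * 42

def max_stratum_difference(pat, con):
    """Largest absolute difference in test-site allele frequency across haplotype strata."""
    diffs = []
    for stratum in {h for h, _ in pat + con}:
        p = [a for h, a in pat if h == stratum]
        c = [a for h, a in con if h == stratum]
        if p and c:
            diffs.append(abs(p.count("x") / len(p) - c.count("x") / len(c)))
    return max(diffs)

def shuffle_within_strata(pat, con):
    """Permute patient/control labels separately inside each haplotype stratum."""
    new_pat, new_con = [], []
    for stratum in {h for h, _ in pat + con}:
        p = [a for h, a in pat if h == stratum]
        c = [a for h, a in con if h == stratum]
        pooled = p + c
        random.shuffle(pooled)
        new_pat += [(stratum, a) for a in pooled[: len(p)]]
        new_con += [(stratum, a) for a in pooled[len(p):]]
    return new_pat, new_con

observed = max_stratum_difference(patients, controls)
exceed = sum(
    max_stratum_difference(*shuffle_within_strata(patients, controls)) >= observed
    for _ in range(2000)
)
print(f"observed = {observed:.3f}, permutation p ~ {exceed / 2000:.2f}")
```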

  2. Restriction-Site-Specific PCR as a Rapid Test To Detect Enterohemorrhagic Escherichia coli O157:H7 Strains in Environmental Samples

    PubMed Central

    Kimura, Richard; Mandrell, Robert E.; Galland, John C.; Hyatt, Doreene; Riley, Lee W.

    2000-01-01

    Enterohemorrhagic Escherichia coli (EHEC) O157:H7 is an important food-borne pathogen in industrialized countries. We developed a rapid and simple test for detecting E. coli O157:H7 using a method based on restriction site polymorphisms. Restriction-site-specific PCR (RSS-PCR) involves the amplification of DNA fragments using primers based on specific restriction enzyme recognition sequences, without the use of endonucleases, to generate a set of amplicons that yield “fingerprint” patterns when resolved electrophoretically on an agarose gel. The method was evaluated in a blinded study of E. coli isolates obtained from environmental samples collected at beef cattle feedyards. The 54 isolates were all initially identified by a commonly used polyclonal antibody test as belonging to O157:H7 serotype. They were retested by anti-O157 and anti-H7 monoclonal antibody enzyme-linked immunosorbent assay (ELISA). The RSS-PCR method identified all 28 isolates that were shown to be E. coli O157:H7 by the monoclonal antibody ELISA as belonging to the O157:H7 serotype. Of the remaining 26 ELISA-confirmed non-O157:H7 strains, the method classified 25 strains as non-O157:H7. The specificity of the RSS-PCR results correlated better with the monoclonal antibody ELISA than with the polyclonal antibody latex agglutination tests. The RSS-PCR method may be a useful test to distinguish E. coli O157:H7 from a large number of E. coli isolates from environmental samples. PMID:10831431

  3. Methods to Use Surface Infiltration Tests in Permeable Pavement Systems to Determine Maintenance Frequency

    EPA Science Inventory

    Currently, there is limited guidance on selecting test sites to measure surface infiltration rates in permeable pavement systems to determine maintenance frequency. The ASTM method (ASTM C1701) for measuring infiltration rate of in-place pervious concrete suggest to either (1) p...

  4. Cavity detection and delineation research. Part 1: Microgravimetric and magnetic surveys: Medford Cave Site, Florida

    NASA Astrophysics Data System (ADS)

    Butler, D. K.

    1982-03-01

    This report reviews the scope of a research effort initiated in 1974 at the U.S. Army Engineer Waterways Experiment Station with the objectives of (a) assessing the state of the art in geophysical cavity detection and delineation methodology and (b) developing new methods and improving or adapting old methods for application to cavity detection and delineation. Two field test sites were selected: (a) the Medford Cave site with a relatively shallow (10- to 50-ft-deep) air-filled cavity system and (b) the Manatee Springs site with a deeper (approximately 100-ft-deep) water-filled cavity system. Results of field studies at the Medford Cave site are presented in this report: (a) the site geology, (b) the site topographic survey, (c) the site drilling program (boreholes for geophysical tests, for determination of a detailed geological cross section, and for verification of geophysical anomalies), (d) details of magnetic and microgravimetric surveys, and (e) correlation of geophysical results with known site geology. Qualitative interpretation guidelines using complementary geophysical techniques for site investigations in karst regions are presented. Including the results of electrical resistivity surveys conducted at the Medford Cave site, the qualitative guidelines are applied to four profile lines, and drilling locations are indicated on the profile plots of gravity, magnetic, and electrical resistivity data. Borehole logs are then presented for comparison with the predictions of the qualitative interpretation guidelines.

  5. Interference of avian guano in analyses of fuel-contaminated soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, D.E.; Johnson, T.E.; Kreamer, D.K.

    1996-01-01

    Site characterization on Johnston Island, Johnston Atoll, Pacific Ocean, has yielded preliminary data that seabird guano can be an interference in three common petroleum hydrocarbon quantification methods. Volatiles from seabird guano were measured on a hydrocarbon-specific handheld vapor meter (catalytic detector) in concentrations as high as 256 ppm by volume total hydrocarbon. Analysis of guano solids produced measurable concentrations of total petroleum hydrocarbon (TPH) as diesel using both an immunoassay test and the EPA 8015 Modified Method. The testing was conducted on one surface sample of guano collected from a seabird roosting and nesting area. Source species were not identified. Positive hydrocarbon test results for guano raise concerns regarding the effectiveness of standard methods of petroleum-contaminated site characterization for Johnston Island, other Pacific islands, and coastal areas with historic or contemporary seabird populations.

  6. Test-well drilling in the upper Satus Creek basin, Yakima Indian Reservation, Washington

    USGS Publications Warehouse

    Pearson, H.E.

    1977-01-01

    Two test wells were drilled in the upper Satus Creek basin of the Yakima Indian Reservation, Washington, using the air-rotary method. At site 1 the well penetrated a young basalt and 175 feet of the Yakima Basalt, and at site 2 the well penetrated the young basalt. The well at site 1 was drilled to a depth of 350 feet. Tests for drawdown and yield indicated a specific capacity of about 11 gallons per minute per foot of drawdown. The potential yield of this well may be about 1,000 gallons per minute. The well at site 2 was drilled to a depth of 500 feet. Only a small quantity of water was encountered and no test for yield was made. Data from these wells, including chemical analysis of the water from the well at site 1, will provide information useful in the development and management of the ground-water resources in this part of the Yakima Indian Reservation. (Woodard-USGS)

  7. Difference Between Dormant Conduction Sites Revealed by Adenosine Triphosphate Provocation and Unipolar Pace-Capture Sites Along the Ablation Line After Pulmonary Vein Isolation.

    PubMed

    Kogawa, Rikitake; Okumura, Yasuo; Watanabe, Ichiro; Sonoda, Kazumasa; Sasaki, Naoko; Takahashi, Keiko; Iso, Kazuki; Nagashima, Koichi; Ohkubo, Kimie; Nakai, Toshiko; Kunimoto, Satoshi; Hirayama, Atsushi

    2016-01-01

    Dormant pulmonary vein (PV) conduction revealed by adenosine/adenosine triphosphate (ATP) provocation testing and exit block to the left atrium by pacing from the PV side of the ablation line (the "pace and ablate" method) are used to ensure durable pulmonary vein isolation (PVI). However, the mechanistic relation between ATP-provoked PV reconnection and the unexcitable gap along the ablation line is unclear. Forty-five patients with atrial fibrillation (AF) (paroxysmal: 31 patients, persistent: 14 patients; age: 61.1 ± 9.7 years) underwent extensive encircling PVI (EEPVI, 179 PVs). After completion of EEPVI, an ATP provocation test (30 mg, bolus injection) and unipolar pacing (output, 10 mA; pulse width, 2 ms) were performed along the previous EEPVI ablation line to identify excitable gaps. Dormant conduction was revealed in 29 (34 sites) of 179 PVs (16.2%) after EEPVI (22/45 patients). Pace capture was revealed in 59 (89 sites) of 179 PVs (33.0%) after EEPVI (39/45 patients), and overlapping sites, i.e., sites showing both dormant conduction and pace capture, were observed in 22 of 179 (12.3%) PVs (17/45 patients). Some of the ATP-provoked dormant PV reconnection sites were identical to the sites with excitable gaps revealed by pace capture, but most of the PV sites were distributed differently, suggesting that the main underlying mechanism differs between these two forms of reconnection. These findings also suggest that performance of the ATP provocation test followed by the "pace and ablate" method can reduce the occurrence of chronic PV reconnections.

  8. Control of pest species: Tree shelters help protect seedlings from nutria

    USGS Publications Warehouse

    Allen, J.A.; Boykin, R.

    1991-01-01

    Various nutria-deterrence techniques were tested in attempts to curb the loss of seedlings to nutria. The results suggest that tree shelters have real potential for use in forest restoration projects on sites with moderate nutria populations. Tree shelters may even prove effective on sites with high nutria populations, as long as alternative food supplies are available.

  9. Probing numerical Laplace inversion methods for two and three-site molecular exchange between interconnected pore structures.

    PubMed

    Silletta, Emilia V; Franzoni, María B; Monti, Gustavo A; Acosta, Rodolfo H

    2018-01-01

    Two-dimensional (2D) Nuclear Magnetic Resonance relaxometry experiments are a powerful tool extensively used to probe the interaction among different pore structures, mostly in inorganic systems. The analysis of the collected experimental data generally consists of a 2D numerical inversion of time-domain data from which T2-T2 maps are generated. Through the years, different algorithms for the numerical inversion have been proposed. In this paper, two algorithms for numerical inversion are tested and compared under different conditions of exchange dynamics: a method based on the Butler-Reeds-Dawson (BRD) algorithm and the fast iterative shrinkage-thresholding algorithm (FISTA). By constructing a theoretical model, the algorithms were tested for two- and three-site porous media, varying the exchange rate parameters, the pore sizes, and the signal-to-noise ratio (SNR). In order to test the methods under realistic experimental conditions, a challenging organic system was chosen: the molecular exchange rates of water confined in hierarchical porous polymeric networks were obtained for two- and three-site porous media. Data processed with the BRD method were found to be accurate only under certain conditions of the exchange parameters, while data processed with the FISTA method were precise for all the studied parameters except under extreme SNR conditions. Copyright © 2017 Elsevier Inc. All rights reserved.
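
    The FISTA approach named above can be illustrated with a minimal nonnegative, Tikhonov-regularized inversion on a synthetic two-site T2-T2 kernel. The kernel sizes, regularization weight, and peak positions are illustrative assumptions, not the authors' implementation or data.

        import numpy as np

        def fista_nonneg(K, b, alpha=1e-3, n_iter=400):
            """Minimise ||K x - b||^2 + alpha ||x||^2 subject to x >= 0 with FISTA."""
            L = 2.0 * (np.linalg.norm(K, 2) ** 2 + alpha)      # Lipschitz constant of the gradient
            x = np.zeros(K.shape[1]); y = x.copy(); t = 1.0
            for _ in range(n_iter):
                grad = 2.0 * (K.T @ (K @ y - b) + alpha * y)
                x_new = np.maximum(y - grad / L, 0.0)          # gradient step + projection onto x >= 0
                t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
                y = x_new + ((t - 1.0) / t_new) * (x_new - x)
                x, t = x_new, t_new
            return x

        # Synthetic two-site T2-T2 problem (toy sizes, not experimental data).
        t_enc = np.linspace(1e-3, 1.0, 48)                     # encoding times, s
        T2 = np.logspace(-3, 0, 24)                            # candidate relaxation times, s
        K1 = np.exp(-np.outer(t_enc, 1.0 / T2))                # 1D Laplace kernel
        K = np.kron(K1, K1)                                    # separable 2D kernel

        F_true = np.zeros((24, 24))
        F_true[5, 5], F_true[16, 16] = 1.0, 0.7                # diagonal (non-exchanging) peaks
        F_true[5, 16] = F_true[16, 5] = 0.15                   # off-diagonal exchange peaks
        b = K @ F_true.ravel() + 1e-3 * np.random.default_rng(0).normal(size=48 * 48)

        F_est = fista_nonneg(K, b).reshape(24, 24)             # recovered T2-T2 map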

  10. Rapid urease test (RUT) for evaluation of urease activity in oral bacteria in vitro and in supragingival dental plaque ex vivo.

    PubMed

    Dahlén, Gunnar; Hassan, Haidar; Blomqvist, Susanne; Carlén, Anette

    2018-05-18

    Urease is an enzyme produced by plaque bacteria that hydrolyses urea from saliva and gingival exudate into ammonia in order to regulate the pH of the dental biofilm. The aim of this study was to assess urease activity among oral bacterial species using the rapid urease test (RUT) in a micro-plate format and to examine whether this test could be used to measure urease activity in site-specific supragingival dental plaque samples ex vivo. The RUT is based on a 2% urea in peptone broth solution with phenol red at pH 6.0. Oral bacterial species were tested for urease activity by adding a 1 μl amount of cells, collected after growth on blood agar plates or in broth, to 100 μl of RUT solution in a micro-plate well. The color change was determined after 15 and 30 min and after 1 and 2 h, and the reaction was graded on a 4-point scale (none, weak, medium, strong). Ex vivo urease activity of dental plaque was evaluated in 1 μl supragingival plaque samples collected from 4 interproximal sites of front teeth and molars in 18 adult volunteers. The color reaction was read after 1 h at room temperature and scored as in the in vitro test. The strongest activity was registered for Staphylococcus epidermidis, Helicobacter pylori, Campylobacter ureolyticus and some strains of Haemophilus parainfluenzae, while known ureolytic species such as Streptococcus salivarius and Actinomyces naeslundii showed weaker, variable, and strain-dependent activity. Temperature had a minor influence on the RUT reaction. The interproximal supragingival plaque between the lower central incisors (site 31/41) showed significantly higher scores than plaque between the upper central incisors (site 11/21), between the upper left first molar and second premolar (site 26/25), and between the lower right second premolar and molar (site 45/46). The rapid urease test (RUT) in a micro-plate format can thus be used as a simple and rapid method to test urease activity in bacterial strains in vitro and as a chair-side method for testing urease activity in site-specific supragingival plaque samples ex vivo.

  11. Determination of in situ state of stress at the Spent Fuel Test-Climax site, Climax Stock, Nevada Test Site

    USGS Publications Warehouse

    Ellis, W.L.; Magner, J.E.

    1982-01-01

    Determination of the in situ state of stress at the site of the Spent Fuel Test--Climax, using the U.S. Bureau of Mines overcore method, indicates principal stress magnitudes of 11.6 MPa, 7.1 MPa, and 2.8 MPa. The bearing and plunge of the maximum and minimum principal stress components are, respectively: N. 56° E., 29° NE; and N. 42° W., 14° NW. The vertical stress magnitude of 7.9 MPa calculated from the overcore data is significantly less than expected from overburden pressure, suggesting the stress field is influenced by local or areal geologic factors. Results from this investigation indicate (1) the stress state at the Spent Fuel Test--Climax site deviates significantly from a gravitational stress field, both in relative stress magnitudes and in orientation; (2) numerical modeling will not realistically simulate the near-field response of the Spent Fuel Test--Climax site if gravitational and (or) horizontal and vertical applied stress boundary conditions are assumed; and (3) substantial stress variations may occur spatially within the stock.
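
    The mismatch noted between the measured vertical stress and the overburden expectation follows from the lithostatic relation sigma_v = rho * g * z. The density and depth used below are illustrative assumptions, not values taken from the report.

        # Quick check of the lithostatic expectation; rho and z are assumed values.
        rho = 2670.0      # assumed granitic rock density, kg/m^3
        g = 9.81          # gravitational acceleration, m/s^2
        z = 420.0         # assumed depth of the test level, m

        sigma_v_expected = rho * g * z / 1e6    # MPa
        sigma_v_measured = 7.9                  # MPa, from the overcore data quoted above
        print(f"expected ~{sigma_v_expected:.1f} MPa vs. measured {sigma_v_measured} MPa")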

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McLellan, G.W.

    This test plan describes the field demonstration of the sonic drilling system being conducted as a coordinated effort between the VOC-Arid ID (Integrated Demonstration) and the 200 West Area Carbon Tetrachloride ERA (Expedited Response Action) programs at Hanford. The purpose of this test is to evaluate the Water Development Corporation's drilling system, modify components as necessary, and determine compatible drilling applications for the sonic drilling method for use at facilities in the DOE complex. The sonic demonstration is being conducted as the first field test under the Cooperative Research and Development Agreement (CRADA), which involves the US Department of Energy, Pacific Northwest Laboratory, Westinghouse Hanford Company, and Water Development Corporation. The sonic drilling system will be used to drill a 45-degree vadose zone well, two vertical wells at the VOC-Arid ID site, and several test holes at the Drilling Technology Test Site north of the 200 Area fire station. Testing at other locations will depend on the performance of the drilling method. Performance of this technology will be compared to the baseline drilling method (cable tool).

  13. Considerations regarding the implementation of EPR dosimetry for the population in the vicinity of Semipalatinsk nuclear test site based on experience from other radiation accidents.

    PubMed

    Skvortsov, Valeriy; Ivannikov, Alexander; Tikunov, Dimitri; Stepanenko, Valeriy; Borysheva, Natalie; Orlenko, Sergey; Nalapko, Mikhail; Hoshi, Masaharu

    2006-02-01

    General aspects of applying the method of retrospective dose estimation by electron paramagnetic resonance spectroscopy of human tooth enamel (EPR dosimetry) to the population residing in the vicinity of the Semipalatinsk nuclear test site are analyzed and summarized. The analysis is based on the results of 20 years of investigations conducted at the Medical Radiological Research Center on the development and practical application of this method for wide-scale dosimetric investigation of populations exposed to radiation after the Chernobyl accident and other radiation accidents.

  14. Methods of measurement of exploratory well impacts, offshore Florida

    USGS Publications Warehouse

    Dustan, Phillip A.; Kindinger, Jack L.; Lidz, B.H.; Hudson, J.H.

    1990-01-01

    Six exploratory oil test wells were drilled off Key West in the late 1950s and early 1960s. Two wells were drilled on coral bottom, two on carbonate sand, and two on mixed turtle grass and gorgonian/sponge hardbottom. After locating the sites with a proton magnetometer, several underwater assessment methods were used to measure the ecological impacts of drilling. Because of differing environments and bottom types, no single method was applicable at every site.

  15. A biomonitoring plan for assessing potential radionuclide exposure using Amchitka Island in the Aleutian chain of Alaska as a case study.

    PubMed

    Burger, Joanna; Gochfeld, Michael; Kosson, D S; Powers, Charles W

    2007-01-01

    With the ending of the Cold War, the US and other nations were faced with a legacy of nuclear wastes. For some sites where hazardous nuclear wastes will remain in place, methods must be developed to protect human health and the environment. Biomonitoring is one method of assessing the status and trends of potential radionuclide exposure from nuclear waste sites, and of providing the public with early warning of any potential harmful exposure. Amchitka Island (51 degrees N lat, 179 degrees E long) was the site of three underground nuclear tests from 1965 to 1971. Following a substantive study of radionuclide levels in biota from the marine environment around Amchitka and a reference site, we developed a suite of bioindicators (with suggested isotopes) that can serve as a model for other sites contaminated with radionuclides. Although the species selection was site-specific, the methods can provide a framework for other sites. We selected bioindicators using five criteria: (1) occurrence at all three test shots (and reference site), (2) receptor groups (subsistence foods, commercial species, and food chain nodes), (3) species groups (plants, invertebrates, fish, and birds), (4) trophic levels, and (5) an accumulator of one or several radionuclides. Our major objective was to identify bioindicators that could serve for both human health and the ecosystem, and were abundant enough to collect adjacent to the three test sites and at the reference site. Site-specific information on both biota availability and isotope levels was essential in the final selection of bioindicators. Actinides bioaccumulated in algae and invertebrates, while radiocesium accumulated in higher trophic level birds and fish. Thus, unlike biomonitoring schemes developed for heavy metals or other contaminants, top-level predators are not sufficient to evaluate potential radionuclide exposure at Amchitka. The process described in this paper resulted in the selection of Fucus, Alaria fistulosa, blue mussel (Mytilus trossulus), dolly varden (Salvelinus malma), black rockfish (Sebastes melanops), Pacific cod (Gadus macrocephalus), Pacific halibut (Hippoglossus stenolepis), and glaucous-winged gull (Larus glaucescens) as bioindicators. This combination of species included mainly subsistence foods, commercial fish, and nodes on different food chains.

  16. The "Good, Bad and Ugly" pin site grading system: A reliable and memorable method for documenting and monitoring ring fixator pin sites.

    PubMed

    Clint, S A; Eastwood, D M; Chasseaud, M; Calder, P R; Marsh, D R

    2010-02-01

    Although there is much in the literature regarding pin site infections, there is no accepted, validated method for documenting their state. We present a system for reliably labelling pin sites on any ring fixator construct and an easy-to-remember grading system to document the state of each pin site. Each site is graded in terms of erythema, pain and discharge to give a 3-point scale, named "Good", "Bad" and "Ugly" for ease of recall. This system was tested for intra- and inter-observer reproducibility. Fifteen patients undergoing elective limb reconstruction were recruited, and a total of 218 pin sites were independently scored by two examiners; 82 sites were later re-examined by the same examiners. Of the assessments recorded, 514 were graded "Good", 80 "Bad", and 6 "Ugly". The reproducibility of the system was found to be excellent. We feel our system gives a quick, reliable and reproducible method to monitor individual pin sites and their response to treatment. Crown Copyright 2009. Published by Elsevier Ltd. All rights reserved.

  17. [Analysis of tools, methods and results of toxicological screening for detection of drug abuse in Italian professional drivers].

    PubMed

    Rosso, G L

    2013-01-01

    Three years after the 2008 protocol agreement between the State and the Regions on workplace drug testing came into force, a large number of studies have been conducted to analyse and test the efficacy of the on-site screening tests for detecting drug consumption (opiates, cocaine, cannabinoids, amphetamine and methamphetamine, MDMA, and methadone) that are frequently used by the occupational health physician, and to present workplace drug-testing data obtained during health surveillance programmes. The aim of the present study was to verify whether the sensitivity and specificity of the most common on-site tests ensure correct application of the provisions of current Italian legislation, and to analyse published studies reporting the frequency of positive drug tests. A review of the Italian and international literature was carried out to identify studies on (1) the performance of on-site screening tests frequently used by the occupational health physician and (2) the prevalence of drug use/abuse among Italian public and commercial transport drivers. The studies were then compared. Several rapid on-site screening tests are commercially available (Italian law does not set standards for their technical specifications), and their sensitivity and specificity vary with the model and the substance tested. The sensitivity of these tools is poor for detecting low concentrations of drugs and/or their metabolites in urine (close to the cut-off). Studies comparing on-site tests performed by the occupational health physician with confirmatory tests in specialized laboratories are lacking, particularly with regard to false positives found by the occupational health physician. The major studies, in terms of methods and/or size, reported a positive rate (confirmed at the first level) between 1.6% and 1.9%. The drugs most frequently used/abused were cannabis and cocaine. The performance of on-site screening tests (for detecting psychotropic substances in urine) and the methodology required by Italian law indicate that the aims of Italian workplace drug-testing legislation have not been achieved. The low positive rate observed in Italian studies could be due to error in the first phase of screening performed by the occupational health physician.

  18. Comparison of Piezosurgery and Conventional Rotary Instruments for Removal of Impacted Mandibular Third Molars: A Randomized Controlled Clinical and Radiographic Trial

    PubMed Central

    Shokry, Mohamed; Aboelsaad, Nayer

    2016-01-01

    The purpose of this study was to test the effect of surgical removal of impacted mandibular third molars using piezosurgery versus the conventional surgical technique on postoperative sequelae and bone healing. Material and Methods. This study was carried out as a randomized controlled clinical trial with a split-mouth design. Twenty patients with bilateral mandibular third molar mesioangular impaction (class II, position B) indicated for surgical extraction were treated randomly using either the piezosurgery or the conventional bur technique on each site. Duration of the procedure, postoperative edema, trismus, pain, healing, and bone density and quantity were evaluated up to 6 months postoperatively. Results. Test and control sites were compared using the paired t-test. Pain and swelling were significantly reduced at test sites, whereas procedure time was significantly longer at test sites. For bone quantity and quality, a statistically significant difference was found, with test sites showing better results. Conclusion. The piezosurgery technique improves the patient's quality of life in the form of decreased postoperative pain, trismus, and swelling. Furthermore, it enhances bone quality within the extraction socket and bone quantity along the distal aspect of the mandibular second molar. PMID:27597866

  19. Effects of site preparation on timber and non-timber values of loblolly pine plantations

    Treesearch

    Jianbang Gan; Stephen H. Kolison; James H. Miller; Tasha M. Hargrove

    1998-01-01

    This study evaluated the timber and non-timber values of the forest stands generated by four site preparation methods tested in the Tuskegee National Forest 15 yr earlier. The timber values of the forest stands were assessed with the timber yields predicted by the SE TWIGS model. Non-timber benefits were evaluated through the Contingent Valuation Method. Two hundred...

  20. Effects of opening size and site preparation method on vegetation development after implementing group selection in a pine-hardwood stand

    Treesearch

    M.D. Cain; M.G. Shelton

    2001-01-01

    Three opening sizes (0.25, 0.625, and 1.0 ac) and three site preparation methods (herbicides, mechanical, and an untreated control) were tested in a pine-hardwood stand dominated by loblolly and shortleaf pines (Pinus taeda L. and P. echinata Mill.) and mixed oaks (Quercus spp.) that was being converted to uneven...

  1. Molecular detection of airborne Coccidioides in Tucson, Arizona

    USGS Publications Warehouse

    Chow, Nancy A.; Griffin, Dale W.; Barker, Bridget M.; Loparev, Vladimir N.; Litvintseva, Anastasia P.

    2016-01-01

    Environmental surveillance of the soil-dwelling fungus Coccidioides is essential for the prevention of Valley fever, a disease primarily caused by inhalation of the arthroconidia. Methods for collecting and detecting Coccidioides in soil samples are currently in use by several laboratories; however, a method utilizing current air sampling technologies has not been formally demonstrated for the capture of airborne arthroconidia. In this study, we collected air/dust samples at two sites (Site A and Site B) in the endemic region of Tucson, Arizona, and tested a variety of air samplers and membrane matrices. We then employed a single-tube nested qPCR assay for molecular detection. At both sites, numerous soil samples (n = 10 at Site A and n = 24 at Site B) were collected and Coccidioides was detected in two samples (20%) at Site A and in eight samples (33%) at Site B. Of the 25 air/dust samples collected at both sites using five different air sampling methods, we detected Coccidioides in three samples from Site B. All three samples were collected using a high-volume sampler with glass-fiber filters. In this report, we describe these methods and propose the use of these air sampling and molecular detection strategies for environmental surveillance of Coccidioides.

  2. Boise Hydrogeophysical Research Site: Control Volume/Test Cell and Community Research Asset

    NASA Astrophysics Data System (ADS)

    Barrash, W.; Bradford, J.; Malama, B.

    2008-12-01

    The Boise Hydrogeophysical Research Site (BHRS) is a research wellfield or field-scale test facility developed in a shallow, coarse, fluvial aquifer with the objectives of supporting: (a) development of cost-effective, non- or minimally-invasive quantitative characterization and imaging methods in heterogeneous aquifers using hydrologic and geophysical techniques; (b) examination of fundamental relationships and processes at multiple scales; (c) testing theories and models for groundwater flow and solute transport; and (d) educating and training students in multidisciplinary subsurface science and engineering. The design of the wells and the wellfield supports modular use and reoccupation of wells for a wide range of single-well, cross-hole, multiwell and multilevel hydrologic, geophysical, and combined hydrologic-geophysical experiments. Efforts to date by Boise State researchers and collaborators have largely focused on: (a) establishing the 3D distributions of geologic, hydrologic, and geophysical parameters, which can then be used as the basis for jointly inverting hard and soft data to return the 3D K distribution, and (b) developing subsurface measurement and imaging methods, including tomographic characterization and imaging. At this point, the hydrostratigraphic framework of the BHRS is known to be a hierarchical multi-scale system that includes layers and lenses recognized with geologic, hydrologic, radar, seismic, and EM methods; details are now emerging that may allow 3D deterministic characterization of zones and/or material variations at the meter scale in the central wellfield. The site design and subsurface framework have also supported a variety of testing configurations for joint hydrologic and geophysical experiments. Going forward, we recognize the opportunity to increase the R&D returns from use of the BHRS with additional infrastructure (especially for monitoring the vadose zone and surface water-groundwater interactions), more collaborative activity, and greater access to site data. Our broader goal of becoming more available as a research asset for the scientific community also supports the long-term business plan of increasing funding opportunities to maintain and operate the site.

  3. The "3 in 1" Study: Pooling Self-Taken Pharyngeal, Urethral, and Rectal Samples into a Single Sample for Analysis for Detection of Neisseria gonorrhoeae and Chlamydia trachomatis in Men Who Have Sex with Men.

    PubMed

    Sultan, B; White, J A; Fish, R; Carrick, G; Brima, N; Copas, A; Robinson, A; Gilson, R; Mercey, D; Benn, P

    2016-03-01

    Triple-site testing (using pharyngeal, rectal, and urethral/first-void urine samples) for Neisseria gonorrhoeae and Chlamydia trachomatis using nucleic acid amplification tests detects greater numbers of infections among men who have sex with men (MSM). However, triple-site testing represents a cost pressure for services. MSM over 18 years of age were eligible if they requested testing for sexually transmitted infections (STIs), reported recent sexual contact with either C. trachomatis or N. gonorrhoeae, or had symptoms of an STI. Each patient underwent standard-of-care (SOC) triple-site testing, and swabs were taken to form a pooled sample (PS) (pharyngeal, rectal, and urine specimens). The PS was created using two methods during different periods at one clinic, but we analyzed the data in combination because the sensitivity of the two methods did not differ significantly for C. trachomatis (P = 0.774) or N. gonorrhoeae (P = 0.163). The sensitivity of PS testing (92%) was slightly lower than that of SOC testing (96%) for detecting C. trachomatis (P = 0.167). For N. gonorrhoeae, the sensitivity of PS testing (90%) was significantly lower than that of SOC testing (99%) (P < 0.001). When pharynx-only infections were excluded, the sensitivity of PS testing to detect N. gonorrhoeae infections increased to 94%. Our findings show that pooling of self-taken samples could be an effective and cost-saving method, with high negative predictive values. (Interim results of this study were presented at the BASHH 2013 summer meeting.) Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  4. Final summary of the laboratory culture and toxicity testing of juvenile western pearlshell (Margaritifera falcata) native to the western United States: Expansion of freshwater mussel water and sediment toxicity testing methods

    EPA Science Inventory

    A Regional Applied Research Effort project with EPA Region 10, ORD and USGS was initiated as a result of a baseline ecological risk assessment (BERA) problem formulation for the Upper Columbia River (UCR) site in northwest Washington. The UCR site is a 165-mile stretch of the Col...

  5. Test of tree core sampling for screening of toxic elements in soils from a Norwegian site.

    PubMed

    Algreen, Mette; Rein, Arno; Legind, Charlotte N; Amundsen, Carl Einar; Karlson, Ulrich Gosewinkel; Trapp, Stefan

    2012-04-01

    Tree core samples have been used to delineate organic subsurface plumes. In 2009 and 2010, samples were taken from trees growing on a former dump site in Norway and analyzed for arsenic (As), cadmium (Cd), chromium (Cr), copper (Cu), nickel (Ni), and zinc (Zn). Average concentrations in wood (dry weight) were 30 mg/kg for Zn, 2 mg/kg for Cu, and < 1 mg/kg for Cd, Cr, As, and Ni. The concentrations in wood samples from the polluted test site were compared to those derived from a reference site. In all but one case, mean concentrations from the test site were higher than those from the reference site, but the difference was small and not always significant. Differences between tree species were usually larger than differences between the reference and test sites. Furthermore, all these elements occur naturally, and Cu, Ni, and Zn are essential minerals. Thus, all trees will have a natural background of these elements, and their occurrence alone does not indicate soil pollution. For the interpretation of the results, a comparison with wood samples from an unpolluted reference site with the same species and similar soil conditions is required. This makes the tree core screening method less reliable for heavy metals than, e.g., for chlorinated solvents.

  6. Innovative Field Methods for Characterizing the Hydraulic Properties of a Complex Fractured Rock Aquifer (Ploemeur, Brittany)

    NASA Astrophysics Data System (ADS)

    Bour, O.; Le Borgne, T.; Longuevergne, L.; Lavenant, N.; Jimenez-Martinez, J.; De Dreuzy, J. R.; Schuite, J.; Boudin, F.; Labasque, T.; Aquilina, L.

    2014-12-01

    Characterizing the hydraulic properties of heterogeneous and complex aquifers often requires field scale investigations at multiple space and time scales to better constrain hydraulic property estimates. Here, we present and discuss results from the site of Ploemeur (Brittany, France) where complementary hydrological and geophysical approaches have been combined to characterize the hydrogeological functioning of this highly fractured crystalline rock aquifer. In particular, we show how cross-borehole flowmeter tests, pumping tests and frequency domain analysis of groundwater levels allow quantifying the hydraulic properties of the aquifer at different scales. In complement, we used groundwater temperature as an excellent tracer for characterizing groundwater flow. At the site scale, measurements of ground surface deformation through long-base tiltmeters provide robust estimates of aquifer storage and allow identifying the active structures where groundwater pressure changes occur, including those acting during recharge process. Finally, a numerical model of the site that combines hydraulic data and groundwater ages confirms the geometry of this complex aquifer and the consistency of the different datasets. The Ploemeur site, which has been used for water supply at a rate of about 10⁶ m³ per year since 1991, belongs to the French network of hydrogeological sites H+ and is currently used for monitoring groundwater changes and testing innovative field methods.

  7. DuPont Qualicon BAX System polymerase chain reaction assay. Performance Tested Method 100201.

    PubMed

    Tice, George; Andaloro, Bridget; Fallon, Dawn; Wallace, F Morgan

    2009-01-01

    A recent outbreak of Salmonella in peanut butter has highlighted the need for validation of rapid detection methods. A multilaboratory study for detecting Salmonella in peanut butter was conducted as part of the AOAC Research Institute Emergency Response Validation program for methods that detect outbreak threats to food safety. Three sites tested spiked samples from the same master mix according to the U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA-BAM) method and the BAX System method. Salmonella Typhimurium (ATCC 14028) was grown in brain heart infusion for 24 h at 37 degrees C, then diluted to appropriate levels for sample inoculation. Master samples of peanut butter were spiked at high and low target levels, mixed, and allowed to equilibrate at room temperature for 2 weeks. Spike levels were low [1.08 most probable number (MPN)/25 g], high (11.5 MPN/25 g), and unspiked to serve as negative controls. Each master sample was divided into 25 g portions and coded to blind the samples. Twenty portions of each spiked master sample and five portions of the unspiked sample were tested at each site. At each testing site, samples were blended in 25 g portions with 225 mL prewarmed lactose broth until thoroughly homogenized, then allowed to remain at room temperature for 55-65 min. Samples were adjusted to a pH of 6.8 +/- 0.2, if necessary, and incubated for 22-26 h at 35 degrees C. Across the three reporting laboratories, the BAX System detected Salmonella in 10/60 low-spike samples and 58/60 high-spike samples. The reference FDA-BAM method yielded positive results for 11/60 low-spike and 58/60 high-spike samples. Neither method demonstrated positive results for any of the 15 unspiked samples.
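
    The detection fractions quoted above, and the reason low-spike detection is not expected to approach 100%, can be reproduced with simple arithmetic; the numbers are taken from the abstract, and the Poisson argument is an added illustration rather than part of the study.

        import math

        results = {
            "BAX System": {"low": (10, 60), "high": (58, 60)},
            "FDA-BAM":    {"low": (11, 60), "high": (58, 60)},
        }
        for method, levels in results.items():
            for level, (pos, n) in levels.items():
                print(f"{method:10s} {level:4s} spike: {pos}/{n} = {pos / n:.1%} detected")

        # At 1.08 MPN per 25 g portion, a Poisson model says a sizeable share of
        # portions contain no viable cells at all, so detection cannot reach 100%.
        p_empty = math.exp(-1.08)
        print(f"probability a low-spike portion holds zero cells: {p_empty:.0%}")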

  8. Guidelines for model calibration and application to flow simulation in the Death Valley regional groundwater system

    USGS Publications Warehouse

    Hill, M.C.; D'Agnese, F. A.; Faunt, C.C.

    2000-01-01

    Fourteen guidelines are described which are intended to produce calibrated groundwater models likely to represent the associated real systems more accurately than typically used methods. The 14 guidelines are discussed in the context of the calibration of a regional groundwater flow model of the Death Valley region in the southwestern United States. This groundwater flow system contains two sites of national significance from which the subsurface transport of contaminants could be or is of concern: Yucca Mountain, which is the potential site of the United States high-level nuclear-waste disposal; and the Nevada Test Site, which contains a number of underground nuclear-testing locations. This application of the guidelines demonstrates how they may be used for model calibration and evaluation, and also to direct further model development and data collection.

  9. Assessing soil quality and potential productivity - a basic approach to define and assess the marginality of land

    NASA Astrophysics Data System (ADS)

    Repmann, Frank; Gerwin, Werner; Freese, Dirk

    2017-04-01

    An ever-growing demand for energy and the widely proposed switch from fossil fuels to more sustainable energy sources put the cultivation and use of bioenergy plants into focus. However, bioenergy production on regular, fertile agricultural soils may conflict with the worldwide growing demand for food. To mitigate or avoid this potential conflict, the use of low-quality or marginal land for cultivating bioenergy plants becomes attractive. Against this background, defining and assessing land marginality, and evaluating whether and to what extent specific areas are marginal and thus suitable for sustainable bioenergy production, become highly relevant. Within the framework of the EU-funded Horizon 2020 project SEEMLA, we assessed the marginality of designated test sites in Ukraine, Greece, and Germany by direct field survey. For that purpose, soil and site properties were investigated and evaluated by applying the Muencheberg Soil Quality Rating (SQR) method, developed at the Leibniz Centre for Agricultural Landscape Research (ZALF). The method deploys a comprehensive set of biogeophysical and chemical indicators to describe and evaluate the quality of the soil and site with a score ranging from 1 to 100 points. Field survey data were supported by additional laboratory tests on a representative set of soil samples. Practical field work and analysis of field and laboratory data from the investigated sites demonstrated the applicability of the SQR method within the SEEMLA context. The SQR indices calculated from the field and laboratory data ranged from 2 to < 40 and clearly demonstrated the marginality of the investigated sites in Ukraine, Greece, and Germany, which differed considerably in their characteristics. Correlating the site quality index with yield estimates for common bioenergy plants such as willow (Salix sp.), black locust (Robinia pseudoacacia), and poplar (Populus sp.) cultivated at the respective test sites suggested that the SQR may additionally reflect the potential yield of the investigated sites.

  10. A STANDARDIZED ASSESSMENT METHOD (SAM) FOR RIVERINE MACROINVERTEBRATES

    EPA Science Inventory

    A macroinvertebrate sampling method for large rivers based on desirable characteristics of existing nonwadeable methods was developed and tested. Six sites each were sampled on the Great Miami and Kentucky Rivers, reflecting a human disturbance gradient. Samples were collected ...

  11. Evaluation of multiple tracer methods to estimate low groundwater flow velocities.

    PubMed

    Reimus, Paul W; Arnold, Bill W

    2017-04-01

    Four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or "shut-in" periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a "ground truth" velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. The advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them are discussed. Published by Elsevier B.V.
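
    The velocity arithmetic behind a natural-gradient test is simply travel distance divided by the travel time of the tracer peak. The separation distance and arrival time below are hypothetical illustration values, not results from the study.

        separation_m = 30.0         # assumed distance from injection to monitoring well
        peak_arrival_days = 300.0   # assumed travel time of the tracer concentration peak

        velocity_m_per_yr = separation_m / peak_arrival_days * 365.25
        print(f"apparent groundwater flow velocity ~ {velocity_m_per_yr:.0f} m/yr")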

  12. The integrated contaminant elution and tracer test toolkit, ICET3, for improved characterization of mass transfer, attenuation, and mass removal

    NASA Astrophysics Data System (ADS)

    Brusseau, Mark L.; Guo, Zhilin

    2018-01-01

    It is evident from historical data that groundwater contaminant plumes persist at many sites, requiring costly long-term management. High-resolution site-characterization methods are needed to support accurate risk assessments and to select, design, and operate effective remediation operations. Most subsurface characterization methods are limited in their ability to provide unambiguous, real-time delineation of the specific processes affecting mass transfer, transformation, and mass removal, and accurate estimation of the associated rates. An integrated contaminant elution and tracer test toolkit, comprising a set of local-scale groundwater extraction and injection tests, was developed to ameliorate the primary limitations associated with standard characterization methods. The test employs extended groundwater extraction to stress the system and induce hydraulic and concentration gradients. Clean water can be injected, which removes the resident aqueous contaminant mass present in the higher-permeability zones and isolates the test zone from the surrounding plume. This ensures that the concentrations and fluxes measured within the isolated area are directly and predominantly influenced by the local mass-transfer and transformation processes controlling mass removal. A suite of standard and novel tracers can be used to delineate specific mass-transfer and attenuation processes that are active at a given site, and to quantify the associated mass-transfer and transformation rates. The conceptual basis for the test is first presented, followed by an illustrative application based on simulations produced with a 3-D mathematical model and a brief case study application.

  13. Effects of Barometric Fluctuations on Well Water-Level Measurements and Aquifer Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spane, Frank A.

    1999-12-16

    This report examines the effects of barometric fluctuations on well water-level measurements and evaluates adjustment and removal methods for determining areal aquifer head conditions and for aquifer test analysis. Two examples of Hanford Site unconfined aquifer tests are examined that demonstrate barometric response analysis and illustrate the predictive/removal capabilities of various methods for well water-level and aquifer total head values. Good predictive/removal characteristics were demonstrated, with the best corrective results provided by multiple-regression deconvolution methods.
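
    A minimal sketch of the lagged-regression deconvolution idea on synthetic data (not the report's code or data): regress water-level changes on current and lagged barometric changes, then subtract the fitted barometric contribution from the record. The response function and noise levels are assumed values.

        import numpy as np

        rng = np.random.default_rng(0)
        n, n_lags = 500, 12
        baro = np.cumsum(rng.normal(0.0, 0.05, n))              # synthetic barometric record, m of water
        dB = np.diff(baro, prepend=baro[0])                     # barometric changes

        resp = -0.6 * np.exp(-np.arange(n_lags) / 4.0)          # assumed well response to pressure changes
        wl = np.cumsum(np.convolve(dB, resp)[:n])               # barometric part of the water level
        wl += 0.001 * np.arange(n) + rng.normal(0.0, 0.002, n)  # plus a slow aquifer trend and noise

        # Regress water-level changes on current and lagged barometric changes ...
        X = np.column_stack([np.roll(dB, k) for k in range(n_lags)])
        X[:n_lags, :] = 0.0                                     # drop samples that wrapped around
        dW = np.diff(wl, prepend=wl[0])
        coef, *_ = np.linalg.lstsq(X, dW, rcond=None)

        # ... and remove the fitted barometric contribution from the record.
        wl_corrected = wl - np.cumsum(X @ coef)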

  14. 78 FR 77646 - Proposed Information Collection; Comment Request; 2014 Census Site Test

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-24

    ... pre-notification containing instructions about how to respond to the test online. Some households will... Adaptive Design Strategies portion will test a method of managing data collection by dynamically adapting... methodology. The objectives of this component of the test are to: Design and develop software solutions...

  15. A power analysis for multivariate tests of temporal trend in species composition.

    PubMed

    Irvine, Kathryn M; Dinger, Eric C; Sarr, Daniel

    2011-10-01

    Long-term monitoring programs emphasize power analysis as a tool to determine the sampling effort necessary to effectively document ecologically significant changes in ecosystems. Programs that monitor entire multispecies assemblages require a method for determining the power of multivariate statistical models to detect trends. We provide a method to simulate presence-absence species assemblage data that are consistent with increasing or decreasing directional change in species composition within multiple sites. This step is the foundation for using Monte Carlo methods to approximate the power of any multivariate method for detecting temporal trends. We focus on comparing the power of the Mantel test, permutational multivariate analysis of variance, and constrained analysis of principal coordinates. We find that the power of the various methods we investigate is sensitive to the number of species in the community, univariate species patterns, and the number of sites sampled over time. For increasing directional change scenarios, constrained analysis of principal coordinates was as or more powerful than permutational multivariate analysis of variance, while the Mantel test was the least powerful. However, in our investigation of decreasing directional change, the Mantel test was typically as or more powerful than the other models.
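
    The simulation-based power calculation described above can be sketched as follows, using a simple Mantel permutation test as the example statistic; assemblage size, trend strength, and replicate counts are illustrative choices, not the paper's settings.

        import numpy as np
        from scipy.spatial.distance import pdist, squareform

        rng = np.random.default_rng(1)

        def simulate(n_sites=10, n_years=8, n_species=40, trend=0.04):
            """Presence-absence matrix (site-year rows) whose occupancy drifts with year."""
            base = rng.uniform(0.2, 0.6, n_species)
            rows, years = [], []
            for yr in range(n_years):
                p = np.clip(base + trend * yr * rng.choice([-1, 1], n_species), 0.01, 0.99)
                for _ in range(n_sites):
                    rows.append(rng.random(n_species) < p)
                    years.append(yr)
            return np.array(rows), np.array(years, dtype=float)

        def mantel_p(comm, years, n_perm=199):
            """One-sided Mantel permutation test: community distance vs. time distance."""
            d_comm = squareform(pdist(comm, metric="jaccard"))
            d_time = squareform(pdist(years[:, None], metric="euclidean"))
            iu = np.triu_indices_from(d_comm, k=1)
            r_obs = np.corrcoef(d_comm[iu], d_time[iu])[0, 1]
            count = 0
            for _ in range(n_perm):
                idx = rng.permutation(len(years))
                count += np.corrcoef(d_comm[np.ix_(idx, idx)][iu], d_time[iu])[0, 1] >= r_obs
            return (count + 1) / (n_perm + 1)

        n_rep, alpha = 100, 0.05
        power = np.mean([mantel_p(*simulate()) < alpha for _ in range(n_rep)])
        print(f"estimated power of the Mantel test: {power:.2f}")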

  16. Estimating hydraulic properties using a moving-model approach and multiple aquifer tests

    USGS Publications Warehouse

    Halford, K.J.; Yobbi, D.

    2006-01-01

    A new method was developed for characterizing geohydrologic columns that extended >600 m deep at sites with as many as six discrete aquifers. This method was applied at 12 sites within the Southwest Florida Water Management District. Sites typically were equipped with multiple production wells, one for each aquifer and one or more observation wells per aquifer. The average hydraulic properties of the aquifers and confining units within radii of 30 to >300 m were characterized at each site. Aquifers were pumped individually and water levels were monitored in stressed and adjacent aquifers during each pumping event. Drawdowns at a site were interpreted using a radial numerical model that extended from land surface to the base of the geohydrologic column and simulated all pumping events. Conceptually, the radial model moves between stress periods and recenters on the production well during each test. Hydraulic conductivity was assumed homogeneous and isotropic within each aquifer and confining unit. Hydraulic property estimates for all of the aquifers and confining units were consistent and reasonable because results from multiple aquifers and pumping events were analyzed simultaneously. Copyright ?? 2005 National Ground Water Association.

  17. Estimating hydraulic properties using a moving-model approach and multiple aquifer tests.

    PubMed

    Halford, Keith J; Yobbi, Dann

    2006-01-01

    A new method was developed for characterizing geohydrologic columns that extended >600 m deep at sites with as many as six discrete aquifers. This method was applied at 12 sites within the Southwest Florida Water Management District. Sites typically were equipped with multiple production wells, one for each aquifer and one or more observation wells per aquifer. The average hydraulic properties of the aquifers and confining units within radii of 30 to >300 m were characterized at each site. Aquifers were pumped individually and water levels were monitored in stressed and adjacent aquifers during each pumping event. Drawdowns at a site were interpreted using a radial numerical model that extended from land surface to the base of the geohydrologic column and simulated all pumping events. Conceptually, the radial model moves between stress periods and recenters on the production well during each test. Hydraulic conductivity was assumed homogeneous and isotropic within each aquifer and confining unit. Hydraulic property estimates for all of the aquifers and confining units were consistent and reasonable because results from multiple aquifers and pumping events were analyzed simultaneously.

  18. Assessing and Testing Hydrokinetic Turbine Performance and Effects on Open Channel Hydrodynamics: An Irrigation Canal Case Study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunawan, Budi; Neary, Vincent Sinclair; Mortensen, Josh

    Hydrokinetic energy from flowing water in open channels has the potential to support local electricity needs with lower regulatory or capital investment than impounding water by more conventional means. MOU agencies involved in federal hydropower development have identified the need to better understand the opportunities for hydrokinetic (HK) energy development within existing canal systems that may already have integrated hydropower plants. This document provides an overview of the main considerations, tools, and assessment methods for implementing field tests in an open-channel water system to characterize current energy converter (CEC) device performance and hydrodynamic effects. It describes the open-channel processes relevant to an HK site and the pertinent analyses that guide siting and CEC layout design, with the goal of streamlining the evaluation process and reducing the risk of interfering with existing uses of the site. This document outlines key site parameters of interest and effective tools and methods for measurement and analysis, with examples drawn from the Roza Main Canal in Yakima, WA, to illustrate a site application.

  19. Using site-selection model to identify suitable sites for seagrass transplantation in the west coast of South Sulawesi

    NASA Astrophysics Data System (ADS)

    Lanuru, Mahatma; Mashoreng, S.; Amri, K.

    2018-03-01

    The success of seagrass transplantation depends largely on site selection and suitable transplantation methods. The main objective of this study was to develop and use a site-selection model to identify sites suitable for transplanting the seagrass Enhalus acoroides. Model development was based on the physical and biological characteristics of the transplantation site. The site-selection process is divided into three phases: Phase I identifies potential seagrass habitat using available knowledge and screens out unsuitable sites before the transplantation test is performed; Phase II involves field assessment and transplantation tests of the best-scoring areas identified in Phase I; Phase III is the final calculation of the Transplant Suitability Index (TSI), based on results from Phases I and II. The model was used to assess the suitability of nine sites for seagrass transplantation on the west coast of South Sulawesi (three sites at Labakkang Coast, three at Awerange Bay, and three at Lale-Lae Island). Of the nine sites, two were predicted by the site-selection model to be the most suitable for seagrass transplantation: Site II at Labakkang Coast and Site III at Lale-Lae Island.

  20. Prediction of active sites of enzymes by maximum relevance minimum redundancy (mRMR) feature selection.

    PubMed

    Gao, Yu-Fei; Li, Bi-Qing; Cai, Yu-Dong; Feng, Kai-Yan; Li, Zhan-Dong; Jiang, Yang

    2013-01-27

    Identification of catalytic residues plays a key role in understanding how enzymes work. Although numerous computational methods have been developed to predict catalytic residues and active sites, the prediction accuracy remains relatively low with high false positives. In this work, we developed a novel predictor based on the Random Forest algorithm (RF) aided by the maximum relevance minimum redundancy (mRMR) method and incremental feature selection (IFS). We incorporated features of physicochemical/biochemical properties, sequence conservation, residual disorder, secondary structure and solvent accessibility to predict active sites of enzymes and achieved an overall accuracy of 0.885687 and MCC of 0.689226 on an independent test dataset. Feature analysis showed that every category of the features except disorder contributed to the identification of active sites. It was also shown via the site-specific feature analysis that the features derived from the active site itself contributed most to the active site determination. Our prediction method may become a useful tool for identifying the active sites and the key features identified by the paper may provide valuable insights into the mechanism of catalysis.
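
    A small sketch of the "rank features, then add them incrementally" idea on synthetic, imbalanced data. Plain mutual-information ranking is used here as a simplified stand-in for the mRMR criterion, and all data and model settings are illustrative rather than those of the published predictor.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.metrics import matthews_corrcoef, make_scorer
        from sklearn.model_selection import cross_val_score

        # Synthetic, imbalanced stand-in for catalytic-residue data.
        X, y = make_classification(n_samples=600, n_features=60, n_informative=12,
                                   weights=[0.9, 0.1], random_state=0)

        ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]
        mcc = make_scorer(matthews_corrcoef)

        best_k, best_score = 0, -np.inf
        for k in range(5, 61, 5):                              # incremental feature selection (IFS)
            clf = RandomForestClassifier(n_estimators=200, random_state=0)
            score = cross_val_score(clf, X[:, ranking[:k]], y, cv=5, scoring=mcc).mean()
            if score > best_score:
                best_k, best_score = k, score

        print(f"best subset size: {best_k} features, cross-validated MCC: {best_score:.3f}")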

  1. Public preferences for nontimber benefits of loblolly pine (Pinus taeda) stands regenerated by different site preparation methods

    Treesearch

    Jianbang Gan; Stephen H. Kolison; James Miller

    2000-01-01

    This study assesses public preferences for nontimber benefits of loblolly pine (Pinus taeda L.)stands regenerated 1.5 yr earlier using different site preparation treatments at national forest and industrial forestry sites. Treatments tested on the Tuskegee National Forest were none, chainsaw felling, tree injection, and soil-active herbicide. At the...

  2. Comparison of CyTOF assays across sites: Results of a six-center pilot study.

    PubMed

    Leipold, Michael D; Obermoser, Gerlinde; Fenwick, Craig; Kleinstuber, Katja; Rashidi, Narges; McNevin, John P; Nau, Allison N; Wagar, Lisa E; Rozot, Virginie; Davis, Mark M; DeRosa, Stephen; Pantaleo, Giuseppe; Scriba, Thomas J; Walker, Bruce D; Olsen, Lars R; Maecker, Holden T

    2018-02-01

    For more than five years, high-dimensional mass cytometry has been employed to study immunology. However, these studies have typically been performed in one laboratory on one or a few instruments. We present the results of a six-center study using healthy control human peripheral blood mononuclear cells (PBMCs) and commercially available reagents to test the intra-site and inter-site variation of mass cytometers and operators. We used prestained controls generated by the primary center as a reference to compare against samples stained at each individual center. Data were analyzed at the primary center, including an investigation of the effects of two normalization methods. All six sites performed similarly, with CVs for both Frequency of Parent and median signal intensity (MSI) values below 30%. Increased background was seen when using the premixed antibody cocktail aliquots at each site, suggesting that cocktails are best made fresh. Both normalization methods tested performed adequately for normalizing MSI values between centers. Clustering algorithms revealed slight differences between the prestained and the site-stained samples, due mostly to the increased background of a few antibodies. Therefore, we believe that multicenter mass cytometry assays are feasible. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
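
    The between-site comparability metric quoted above is a simple coefficient of variation; the MSI values below are made-up numbers used only to illustrate the calculation.

        import numpy as np

        msi_by_site = np.array([152.0, 141.0, 160.0, 148.0, 155.0, 137.0])   # one marker, six sites
        cv_percent = 100.0 * msi_by_site.std(ddof=1) / msi_by_site.mean()
        print(f"inter-site CV = {cv_percent:.1f}%  (acceptance threshold cited above: < 30%)")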

  3. Shear Wave Velocity and Site Amplification Factors for 25 Strong-Motion Instrument Stations Affected by the M5.8 Mineral, Virginia, Earthquake of August 23, 2011

    USGS Publications Warehouse

    Kayen, Robert E.; Carkin, Brad A.; Corbett, Skye C.; Zangwill, Aliza; Estevez, Ivan; Lai, Lena

    2015-01-01

    Vertical one-dimensional shear wave velocity (Vs) profiles are presented for 25 strong-motion instrument sites along the Mid-Atlantic eastern seaboard, Piedmont region, and Appalachian region, which surround the epicenter of the M5.8 Mineral, Virginia, Earthquake of August 23, 2011. Testing was performed at sites in Pennsylvania, Maryland, West Virginia, Virginia, the District of Columbia, North Carolina, and Tennessee. The purpose of the study is to determine the detailed site velocity profile, the average velocity in the upper 30 meters of the profile (VS,30), the average velocity for the entire profile (VS,Z), and the National Earthquake Hazards Reduction Program (NEHRP) site classification. The Vs profiles are estimated using a non-invasive continuous-sine-wave method for gathering the dispersion characteristics of surface waves. A large trailer-mounted active source was used to shake the ground during the testing and produce the surface waves. Shear wave velocity profiles were inverted from the averaged dispersion curves using three independent methods for comparison, and the root-mean-square combined coefficient of variation (COV) of the dispersion and inversion calculations is estimated for each site.
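
    The VS,30 value referred to above is commonly computed as a travel-time average: 30 m divided by the summed vertical travel time through the layers in the upper 30 m. The layer thicknesses and velocities below are illustrative, not values from the study.

        # (thickness m, Vs m/s) for the layers in the top 30 m; total thickness is 30 m
        layers = [(5.0, 180.0), (10.0, 320.0), (15.0, 550.0)]

        vs30 = 30.0 / sum(h / v for h, v in layers)
        print(f"VS,30 = {vs30:.0f} m/s")   # the NEHRP site class follows from this value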

  4. Statistical Models for Incorporating Data from Routine HIV Testing of Pregnant Women at Antenatal Clinics into HIV/AIDS Epidemic Estimates

    PubMed Central

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B.; Gregson, Simon; Eaton, Jeffrey W.; Bao, Le

    2017-01-01

    Objective HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women, and can be used to improve estimates of national and sub-national HIV prevalence trends. We develop methods to incorporate this new data source into the UNAIDS Estimation and Projection Package (EPP) in Spectrum 2017. Methods We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (‘site-level’) or regionally (‘census-level’), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. Results We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites and continuation with existing ANC-UAT sites may improve the estimate of calibration between ANC-UAT and ANC-RT sites. Conclusion We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends, and should be tested as more data become available from national ANC-RT programs. PMID:28296804

  5. Proof test methodology for composites

    NASA Technical Reports Server (NTRS)

    Wu, Edward M.; Bell, David K.

    1992-01-01

    The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.

  6. Reactive transport studies at the Raymond Field Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freifeld, B.; Karasaki, K.; Solbau, R.

    1995-12-01

    To ensure the safety of a nuclear waste repository, an understanding of the transport of radionuclides from the repository nearfield to the biosphere is necessary. At the Raymond Field Site, in Raymond, California, tracer tests are being conducted to test characterization methods for fractured media and to evaluate the equipment and tracers that will be used for Yucca Mountain's fracture characterization. Recent tracer tests at Raymond have used reactive cations to demonstrate transport with sorption. A convective-dispersive model was used to simulate a two-well recirculating test with reasonable results. However, when the same model was used to simulate a radially convergent tracer test, the model poorly predicted the actual test data.

  7. SUPERFUND TREATABILITY CLEARINGHOUSE: BDAT FOR SOLIDIFICATION/STABILIZATION TECHNOLOGY FOR SUPERFUND SOILS (DRAFT FINAL REPORT)

    EPA Science Inventory

    This report evaluates the performance of solidification as a method for treating solids from Superfund sites. Tests were conducted on four different artificially contaminated soils which are representative of soils found at the sites. Contaminated soils were solidified us...

  8. Functional integrity of the interrenal tissue of yellow perch from contaminated sites tested in vivo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Girard, C.; Brodeur, J.C.; Hontela, A.

    1995-12-31

    The normal activation of the hypothalamo-pituitary-interrenal axis (HPI axis) in response to capture is disrupted in fish subjected to life-long exposure to heavy metals, PCBs and PAHs. The ability to increase plasma cortisol in yellow perch (Perca flavescens) from sites contaminated by heavy metals and organic compounds, and from a reference site was assessed by the Capture stress test and by the ACTH Challenge test, a new standardized in vivo method designed for field studies. The effects of seasonal factors, such as temperature and gonadal maturity on these tests were investigated. Measures of liver and muscle glycogen and histopathology were made to further characterize the biochemical and structural changes that may occur along with hormonal changes. The Capture stress test showed that an acute source of stress induced a lower cortisol response in fish from the highly contaminated site compared to the reference site, revealing a functional impairment of the HPI axis. The ACTH Challenge test showed that the hormonal responsiveness of the cortisol-secreting interrenal tissue, stimulated by a standard dose of ACTH injected i.p., was lower in fish from the highly contaminated site than the reference site. Spring is the season during which the impairment was the most evident. The possibility of using the reduced capacity of feral fish to respond to a standardized ACTH Challenge as an early bioindicator of toxic stress is discussed.

  9. Development and test fuel cell powered on-site integrated total energy systems. Phase 3: Full-scale power plant development

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1982-01-01

    The on-site system application analysis is summarized. Preparations were completed for the first test of a full-sized single cell. Emphasis of the methanol fuel processor development program shifted toward the use of commercial shell-and-tube heat exchangers. An improved method for predicting the carbon-monoxide tolerance of anode catalysts is described. Other stack support areas reported include improved ABA bipolar plate bonding technology, improved electrical measurement techniques for specification-testing of stack components, and anodic corrosion behavior of carbon materials.

  10. Classical and Bayesian Seismic Yield Estimation: The 1998 Indian and Pakistani Tests

    NASA Astrophysics Data System (ADS)

    Shumway, R. H.

    2001-10-01

    - The nuclear tests in May, 1998, in India and Pakistan have stimulated a renewed interest in yield estimation, based on limited data from uncalibrated test sites. We study here the problem of estimating yields using classical and Bayesian methods developed by Shumway (1992), utilizing calibration data from the Semipalatinsk test site and measured magnitudes for the 1998 Indian and Pakistani tests given by Murphy (1998). Calibration is done using multivariate classical or Bayesian linear regression, depending on the availability of measured magnitude-yield data and prior information. Confidence intervals for the classical approach are derived applying an extension of Fieller's method suggested by Brown (1982). In the case where prior information is available, the posterior predictive magnitude densities are inverted to give posterior intervals for yield. Intervals obtained using the joint distribution of magnitudes are comparable to the single-magnitude estimates produced by Murphy (1998) and reinforce the conclusion that the announced yields of the Indian and Pakistani tests were too high.
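
    The calibration-and-inversion step can be illustrated with a short sketch: fit mb against log10 yield for calibration events, then invert the fitted line for a new magnitude. The calibration numbers are made up, and the interval shown is a crude delta-method approximation rather than the Fieller-type interval used in the paper.

    ```python
    import numpy as np

    # Hypothetical calibration data: announced yields (kt) and measured body-wave magnitudes
    yields_kt = np.array([2.0, 10.0, 20.0, 60.0, 120.0, 150.0])
    mb = np.array([4.1, 4.8, 5.1, 5.6, 5.9, 6.0])

    x = np.log10(yields_kt)
    A = np.vstack([np.ones_like(x), x]).T          # fit mb = a + b * log10(W)
    coef, *_ = np.linalg.lstsq(A, mb, rcond=None)
    a, b = coef
    n = len(mb)
    resid = mb - A @ coef
    sigma2 = (resid @ resid) / (n - 2)             # residual variance

    mb_new = 5.2
    x0 = (mb_new - a) / b                          # estimated log10 yield
    # Crude calibration standard error; Fieller's method treats the ratio more carefully.
    se_x0 = np.sqrt(sigma2) / abs(b) * np.sqrt(1 + 1/n + (x0 - x.mean())**2 / ((x - x.mean())**2).sum())
    lo, hi = 10 ** (x0 - 2 * se_x0), 10 ** (x0 + 2 * se_x0)
    print(f"mb = {mb_new}: estimated yield ~ {10**x0:.0f} kt (approx. 95% range {lo:.0f}-{hi:.0f} kt)")
    ```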

  12. Predicting Displaceable Water Sites Using Mixed-Solvent Molecular Dynamics.

    PubMed

    Graham, Sarah E; Smith, Richard D; Carlson, Heather A

    2018-02-26

    Water molecules are an important factor in protein-ligand binding. Upon binding of a ligand with a protein's surface, waters can either be displaced by the ligand or may be conserved and possibly bridge interactions between the protein and ligand. Depending on the specific interactions made by the ligand, displacing waters can yield a gain in binding affinity. The extent to which binding affinity may increase is difficult to predict, as the favorable displacement of a water molecule is dependent on the site-specific interactions made by the water and the potential ligand. Several methods have been developed to predict the location of water sites on a protein's surface, but the majority of methods are not able to take into account both protein dynamics and the interactions made by specific functional groups. Mixed-solvent molecular dynamics (MixMD) is a cosolvent simulation technique that explicitly accounts for the interaction of both water and small molecule probes with a protein's surface, allowing for their direct competition. This method has previously been shown to identify both active and allosteric sites on a protein's surface. Using a test set of eight systems, we have developed a method using MixMD to identify conserved and displaceable water sites. Conserved sites can be determined by an occupancy-based metric to identify sites which are consistently occupied by water even in the presence of probe molecules. Conversely, displaceable water sites can be found by considering the sites which preferentially bind probe molecules. Furthermore, the inclusion of six probe types allows the MixMD method to predict which functional groups are capable of displacing which water sites. The MixMD method consistently identifies sites which are likely to be nondisplaceable and predicts the favorable displacement of water sites that are known to be displaced upon ligand binding.
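
    A toy sketch of an occupancy-based classification in the spirit of the approach described above: sites that remain water-occupied even with probes present are labelled conserved, and sites where probes dominate are labelled displaceable. The occupancy values and thresholds are invented; the actual MixMD metrics are defined in the paper.

    ```python
    # Hypothetical per-site occupancy fractions accumulated over MixMD frames.
    # Each entry: (site_id, water_occupancy, probe_occupancy), values in [0, 1].
    site_occupancy = [
        ("W1", 0.92, 0.03),
        ("W2", 0.40, 0.55),
        ("W3", 0.75, 0.20),
        ("W4", 0.15, 0.80),
    ]

    WATER_CONSERVED_CUTOFF = 0.80   # assumed threshold, for illustration only
    PROBE_DISPLACED_CUTOFF = 0.50   # assumed threshold, for illustration only

    def classify(water_occ, probe_occ):
        """Crude occupancy-based label for a hydration site."""
        if water_occ >= WATER_CONSERVED_CUTOFF and probe_occ < PROBE_DISPLACED_CUTOFF:
            return "conserved"
        if probe_occ >= PROBE_DISPLACED_CUTOFF:
            return "displaceable"
        return "ambiguous"

    for site, w, p in site_occupancy:
        print(f"{site}: water={w:.2f} probe={p:.2f} -> {classify(w, p)}")
    ```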

  13. Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration

    NASA Astrophysics Data System (ADS)

    Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart

    2015-09-01

    The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley Nevada in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground viewing radiometers (GVRs) beginning in 2011 and currently four GVRs are deployed providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of GVR calibration validation on site. Prior to deployment, RSG uses high accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar radiation based calibration has typically been used. The method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time consuming in post processing, and is dependent on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for ground viewing radiometers of a RadCalNet site.

  14. Service life of fence posts treated by double-diffusion methods

    Treesearch

    Donald C. Markstrom; Lee R. Gjovik

    1999-01-01

    Service-life tests indicate that Engelmann spruce, lodgepole pine, and Rocky Mountain Douglas-fir fence posts treated by double-diffusion methods performed excellently after field exposure of 30 years with no failures. The test site was located in the semiarid Central Plains near Nunn, Colorado. Although Engelmann spruce posts generally defy treatment by other treating...

  15. A Comparison of Methods to Test for Mediation in Multisite Experiments

    ERIC Educational Resources Information Center

    Pituch, Keenan A.; Whittaker, Tiffany A.; Stapleton, Laura M.

    2005-01-01

    A Monte Carlo study extended the research of MacKinnon, Lockwood, Hoffman, West, and Sheets (2002) for single-level designs by examining the statistical performance of four methods to test for mediation in a multilevel experimental design. The design studied was a two-group experiment that was replicated across several sites, included a single…

  16. A multi‐centre evaluation of nine rapid, point‐of‐care syphilis tests using archived sera

    PubMed Central

    Herring, A J; Ballard, R C; Pope, V; Adegbola, R A; Changalucha, J; Fitzgerald, D W; Hook, E W; Kubanova, A; Mananwatte, S; Pape, J W; Sturm, A W; West, B; Yin, Y P; Peeling, R W

    2006-01-01

    Objectives To evaluate nine rapid syphilis tests at eight geographically diverse laboratory sites for their performance and operational characteristics. Methods Tests were compared “head to head” using locally assembled panels of 100 archived (50 positive and 50 negative) sera at each site using as reference standards the Treponema pallidum haemagglutination or the T pallidum particle agglutination test. In addition inter‐site variation, result stability, test reproducibility and test operational characteristics were assessed. Results All nine tests gave good performance relative to the reference standard with sensitivities ranging from 84.5–97.7% and specificities from 84.5–98%. Result stability was variable if result reading was delayed past the recommended period. All the tests were found to be easy to use, especially the lateral flow tests. Conclusions All the tests evaluated have acceptable performance characteristics and could make an impact on the control of syphilis. Tests that can use whole blood and do not require refrigeration were selected for further evaluation in field settings. PMID:17118953
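
    With 50 positive and 50 negative archived sera per site, the per-test performance figures reduce to simple 2x2 calculations; a minimal sketch with invented counts and exact (Clopper-Pearson) intervals follows.

    ```python
    from scipy.stats import beta

    def sens_spec(tp, fn, tn, fp):
        """Sensitivity and specificity from 2x2 counts against the reference standard."""
        return tp / (tp + fn), tn / (tn + fp)

    def clopper_pearson(k, n, alpha=0.05):
        """Exact binomial confidence interval for k successes out of n trials."""
        lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
        return lo, hi

    # Hypothetical result for one rapid test at one site:
    # 47/50 reference-positive sera reactive, 48/50 reference-negative sera non-reactive.
    se, sp = sens_spec(tp=47, fn=3, tn=48, fp=2)
    se_ci, sp_ci = clopper_pearson(47, 50), clopper_pearson(48, 50)
    print(f"sensitivity = {se:.1%} (95% CI {se_ci[0]:.1%}-{se_ci[1]:.1%})")
    print(f"specificity = {sp:.1%} (95% CI {sp_ci[0]:.1%}-{sp_ci[1]:.1%})")
    ```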

  17. Hybrid Residual Flexibility/Mass-Additive Method for Structural Dynamic Testing

    NASA Technical Reports Server (NTRS)

    Tinker, M. L.

    2003-01-01

    A large fixture was designed and constructed for modal vibration testing of International Space Station elements. This fixed-base test fixture, which weighs thousands of pounds and is anchored to a massive concrete floor, initially utilized spherical bearings and pendulum mechanisms to simulate Shuttle orbiter boundary constraints for launch of the hardware. Many difficulties were encountered during a checkout test of the common module prototype structure, mainly due to undesirable friction and excessive clearances in the test-article-to-fixture interface bearings. Measured mode shapes and frequencies were not representative of orbiter-constrained modes due to the friction and clearance effects in the bearings. As a result, a major redesign effort for the interface mechanisms was undertaken. The total cost of the fixture design, construction and checkout, and redesign was over $2 million. Because of the problems experienced with fixed-base testing, alternative free-suspension methods were studied, including the residual flexibility and mass-additive approaches. Free-suspension structural dynamics test methods utilize soft elastic bungee cords and overhead frame suspension systems that are less complex and much less expensive than fixed-base systems. The cost of free-suspension fixturing is on the order of tens of thousands of dollars as opposed to millions, for large fixed-base fixturing. In addition, free-suspension test configurations are portable, allowing modal tests to be done at sites without modal test facilities. For example, a mass-additive modal test of the ASTRO-1 Shuttle payload was done at the Kennedy Space Center launch site. In this Technical Memorandum, the mass-additive and residual flexibility test methods are described in detail. A discussion of a hybrid approach that combines the best characteristics of each method follows and is the focus of the study.

  18. Summary and evaluation of hydraulic property data available for the Hanford Site upper basalt confined aquifer system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spane, F.A. Jr.; Vermeul, V.R.

    Pacific Northwest Laboratory, as part of the Hanford Site Ground-Water Surveillance Project, examines the potential for offsite migration of contamination within the upper basalt confined aquifer system. For the past 40 years, hydrologic testing of the upper basalt confined aquifer has been conducted by a number of Hanford Site programs. Hydraulic property estimates are important for evaluating aquifer flow characteristics (i.e., ground-water flow patterns, flow velocity, transport travel time). Presented is the first comprehensive Hanford Site-wide summary of hydraulic properties for the upper basalt confined aquifer system (i.e., the upper Saddle Mountains Basalt). Available hydrologic test data were reevaluated using recently developed diagnostic test analysis methods. A comparison of calculated transmissivity estimates indicates that, for most test results, a general correspondence within a factor of two between reanalysis and previously reported test values was obtained. For a majority of the tests, previously reported values are greater than reanalysis estimates. This overestimation is attributed to a number of factors, including, in many cases, a misapplication of nonleaky confined aquifer analysis methods in previous analysis reports to tests that exhibit leaky confined aquifer response behavior. Results of the test analyses indicate a similar range for transmissivity values for the various hydrogeologic units making up the upper basalt confined aquifer. Approximately 90% of the calculated transmissivity values for upper basalt confined aquifer hydrogeologic units occur within the range of 10^0 to 10^2 m^2/d, with 65% of the calculated estimate values occurring between 10^1 and 10^2 m^2/d. These summary findings are consistent with the general range of values previously reported for basalt interflow contact zones and sedimentary interbeds within the Saddle Mountains Basalt.
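
    For context, the widely used Cooper-Jacob straight-line analysis shows how a transmissivity of order 10^0-10^2 m^2/d emerges from constant-rate test data; the drawdown record below is invented, and the diagnostic leaky-aquifer analyses referred to in the report are more involved.

    ```python
    import numpy as np

    # Hypothetical constant-rate test: drawdown s (m) observed at times t (min)
    t = np.array([10, 20, 40, 80, 160, 320, 640], dtype=float)
    s = np.array([0.50, 0.71, 0.92, 1.13, 1.34, 1.55, 1.76])
    Q = 300.0   # pumping rate, m^3/d

    # Cooper-Jacob: s = (2.3*Q / (4*pi*T)) * log10(2.25*T*t / (r^2*S)),
    # so the slope of s versus log10(t) gives T directly.
    slope, intercept = np.polyfit(np.log10(t), s, 1)   # metres of drawdown per log cycle
    T = 2.3 * Q / (4.0 * np.pi * slope)
    print(f"drawdown per log cycle = {slope:.2f} m  ->  T ~ {T:.0f} m^2/d")
    ```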

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Findlay, Rick

    The Gnome-Coach, New Mexico, Site was the location of a 3-kiloton-yield underground nuclear test in 1961 and a groundwater tracer test in 1963. The U.S. Geological Survey conducted the groundwater tracer test using four dissolved radionuclides--tritium, iodine-131, strontium-90, and cesium-137--as tracers. Site reclamation and remediation began after the underground testing, and was conducted in several phases at the site. The New Mexico Environment Department (NMED) issued a Conditional Certificate of Completion in September 2014, which documents that surface remediation activities have been successfully completed in accordance with the Voluntary Remediation Program. Subsurface activities have included annual sampling and monitoring of wells at and near the site since 1972. These annual monitoring activities were enhanced in 2008 to include monitoring hydraulic head and collecting samples from the onsite wells USGS-4, USGS-8, and LRL-7 using the low-flow sampling method. In 2010, the annual monitoring was focused on the monitoring wells within the site boundary. A site inspection and annual sampling were conducted on January 27-28, 2015. A second site visit was conducted on April 21, 2015, to install warning/notification signs to fulfill a requirement of the Conditional Certificate of Completion that was issued by the NMED for the surface.

  20. Measurement environments and testing

    NASA Astrophysics Data System (ADS)

    Marvin, A. C.

    1991-06-01

    The various methods used to assess both the emission (interference generation) performance of electronic equipment and the immunity of electronic equipment to external electromagnetic interference are described. The measurement methods attempt to simulate realistic operating conditions for the equipment being tested, yet at the same time they must be repeatable and practical to operate. This has led to the development of a variety of test methods, each of which has its limitations. The discussion concentrates on the most common measurement methods such as open-field test sites, screened enclosures and transverse electromagnetic (TEM) cells. The physical justification for the methods, their limitations, and measurement precision are described. Ways of relating similar measurements made by different methods are discussed, and some thoughts on future measurement improvements are presented.

  1. Characterization and forensic analysis of soil samples using laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Jantzi, Sarah C; Almirall, José R

    2011-07-01

    A method for the quantitative elemental analysis of surface soil samples using laser-induced breakdown spectroscopy (LIBS) was developed and applied to the analysis of bulk soil samples for discrimination between specimens. The use of a 266 nm laser for LIBS analysis is reported for the first time in forensic soil analysis. Optimization of the LIBS method is discussed, and the results compared favorably to a laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) method previously developed. Precision for both methods was <10% for most elements. LIBS limits of detection were <33 ppm and bias <40% for most elements. In a proof of principle study, the LIBS method successfully discriminated samples from two different sites in Dade County, FL. Analysis of variance, Tukey's post hoc test and Student's t test resulted in 100% discrimination with no type I or type II errors. Principal components analysis (PCA) resulted in clear groupings of the two sites. A correct classification rate of 99.4% was obtained with linear discriminant analysis using leave-one-out validation. Similar results were obtained when the same samples were analyzed by LA-ICP-MS, showing that LIBS can provide similar information to LA-ICP-MS. In a forensic sampling/spatial heterogeneity study, the variation between sites, between sub-plots, between samples and within samples was examined on three similar Dade sites. The closer the sampling locations, the closer the grouping on a PCA plot and the higher the misclassification rate. These results underscore the importance of careful sampling for geographic site characterization.
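
    A sketch of the classification step described above, linear discriminant analysis with leave-one-out validation, using scikit-learn on made-up element-intensity features rather than the actual LIBS measurements:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical element-intensity features (rows = samples, columns = elements)
    # for two soil collection sites with slightly different mean compositions.
    site_a = rng.normal(loc=[1.0, 2.0, 0.5, 3.0], scale=0.15, size=(30, 4))
    site_b = rng.normal(loc=[1.3, 1.7, 0.6, 2.6], scale=0.15, size=(30, 4))
    X = np.vstack([site_a, site_b])
    y = np.array([0] * 30 + [1] * 30)

    lda = LinearDiscriminantAnalysis()
    scores = cross_val_score(lda, X, y, cv=LeaveOneOut())
    print(f"leave-one-out correct classification rate: {scores.mean():.1%}")
    ```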

  2. TargetM6A: Identifying N6-Methyladenosine Sites From RNA Sequences via Position-Specific Nucleotide Propensities and a Support Vector Machine.

    PubMed

    Li, Guang-Qing; Liu, Zi; Shen, Hong-Bin; Yu, Dong-Jun

    2016-10-01

    As one of the most ubiquitous post-transcriptional modifications of RNA, N6-methyladenosine (m6A) plays an essential role in many vital biological processes. The identification of m6A sites in RNAs is significantly important for both basic biomedical research and practical drug development. In this study, we designed a computational-based method, called TargetM6A, to rapidly and accurately target m6A sites solely from the primary RNA sequences. Two new features, i.e., position-specific nucleotide/dinucleotide propensities (PSNP/PSDP), are introduced and combined with the traditional nucleotide composition (NC) feature to formulate RNA sequences. The extracted features are further optimized to obtain a much more compact and discriminative feature subset by applying an incremental feature selection (IFS) procedure. Based on the optimized feature subset, we trained TargetM6A on the training dataset with a support vector machine (SVM) as the prediction engine. We compared the proposed TargetM6A method with existing methods for predicting m6A sites by performing stringent jackknife tests and independent validation tests on benchmark datasets. The experimental results show that the proposed TargetM6A method outperformed the existing methods for predicting m6A sites and remarkably improved the prediction performances, with MCC = 0.526 and AUC = 0.818. We also provided a user-friendly web server for TargetM6A, which is publicly accessible for academic use at http://csbio.njust.edu.cn/bioinf/TargetM6A.
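
    A highly simplified sketch of the prediction setup: encode fixed-length RNA windows with per-position nucleotide indicators (a crude stand-in for the PSNP/PSDP features of the paper) and train a support vector machine. Sequences and labels are synthetic.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    NUCS = "ACGU"

    def one_hot(seq):
        """Per-position nucleotide indicators for a fixed-length RNA window."""
        vec = np.zeros(len(seq) * 4)
        for i, nuc in enumerate(seq):
            vec[i * 4 + NUCS.index(nuc)] = 1.0
        return vec

    # Synthetic 21-nt windows centred on an adenosine; positives mimic a consensus-like context.
    rng = np.random.default_rng(1)
    def random_window(positive):
        s = list(rng.choice(list(NUCS), size=21))
        s[10] = "A"
        if positive:
            s[8:13] = list("GGACU")   # crude stand-in for a favourable sequence context
        return "".join(s)

    seqs = [random_window(True) for _ in range(100)] + [random_window(False) for _ in range(100)]
    X = np.array([one_hot(s) for s in seqs])
    y = np.array([1] * 100 + [0] * 100)

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```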

  3. The Method of Manufacturing Nonmetallic Test-Blocks on Different Sensitivity Classes

    NASA Astrophysics Data System (ADS)

    Kalinichenko, N. P.; Kalinichenko, A. N.; Lobanova, I. S.; Zaitseva, A. A.; Loboda, E. L.

    2016-01-01

    The widespread use of components made from nonmetallic materials makes their quality control an increasingly important problem. Nondestructive penetrant testing is effective and, in some cases, the only feasible method of accident prevention at high-risk sites. A brief review is given of the reference test blocks (check samples) needed to evaluate the quality of penetrant materials, and a method is proposed for manufacturing such test blocks for the different sensitivity classes of liquid penetrant testing.

  4. TECHNOLOGY EVALUATION REPORT: CHEMFIX TECHNOLOGIES, INC. - SOLIDIFICATION/STABILIZATION PROCESS - CLACKAMAS, OREGON - VOLUME I

    EPA Science Inventory

    The CHEMFIX solidification/stabilization process was evaluated in the U.S. Environmental Protection Agency's SITE program. Waste from an uncontrolled hazardous waste site was treated by the CHEMFIX process and subjected to a variety of physical and chemical test methods. Physical t...

  5. TECHNOLOGY EVALUATION REPORT: CHEMFIX TECHNOLOGIES, INC. - SOLIDIFICATION/STABILIZATION PROCESS - CLACKAMAS, OREGON - VOLUME II

    EPA Science Inventory

    The CHEMFIX solidification/stabilization process was evaluated in the U.S. Environmental Protection Agency's SITE program. Waste from an uncontrolled hazardous waste site was treated by the CHEMFIX process and subjected to a variety of physical and chemical test methods. Physical...

  6. Plant-based FRET biosensor discriminates environmental zinc levels

    USDA-ARS?s Scientific Manuscript database

    Heavy metal accumulation in the environment poses great risks to flora and fauna. However, monitoring sites prone to accumulation poses scale and economic challenges. In this study, we present and test a method for monitoring these sites using fluorescent resonance energy transfer (FRET) change in r...

  7. Evaluation of Surface Infiltration Testing Procedures in Permeable Pavement Systems

    EPA Science Inventory

    The ASTM method (ASTM C1701) for measuring infiltration rate of in-place pervious concrete provides limited guidance on how to select testing locations, so research is needed to evaluate how testing sites should be selected and how results should be interpreted to assess surface ...

  8. What food and feeding rates are optimum for the Chironomus dilutus sediment toxicity test method?

    EPA Science Inventory

    Laboratory tests with benthic macroinvertebrates conducted using standard toxicity test procedures are used to assess the potential toxicity of contaminated sediments. Results are compared across sites or for batches of samples, and the performance of organisms in control treatme...

  9. Direct-to-consumer sales of genetic services on the Internet.

    PubMed

    Gollust, Sarah E; Wilfond, Benjamin S; Hull, Sara Chandros

    2003-01-01

    PURPOSE The increasing use of the Internet to obtain genetics information and to order medical services without a prescription, combined with a rise in direct-to-consumer marketing for genetic testing, suggests the potential for the Internet to be used to sell genetic services. METHODS A systematic World Wide Web search was conducted in May 2002 to assess the availability of genetic services sold directly to consumers on the Internet. RESULTS Out of 105 sites that offered genetic services directly, most offered non-health-related services, including parentage confirmation testing (83%), identity testing (56%), and DNA banking (24%); however, health-related genetic tests were offered through 14 sites (13%). The health-related genetic tests available ranged from standard tests, such as hemochromatosis and cystic fibrosis, to more unconventional tests related to nutrition, behavior, and aging. Of these 14 sites, 5 described risks associated with the genetic services and 6 described the availability of counseling. CONCLUSIONS The availability of direct sales of health-related genetic tests creates the potential for inadequate pretest decision making, misunderstanding test results, and access to tests of questionable clinical value.

  10. Direct-to-consumer sales of genetic services on the Internet

    PubMed Central

    Gollust, Sarah E.; Wilfond, Benjamin S.; Hull, Sara Chandros

    2016-01-01

    Purpose The increasing use of the Internet to obtain genetics information and to order medical services without a prescription, combined with a rise in direct-to-consumer marketing for genetic testing, suggests the potential for the Internet to be used to sell genetic services. Methods A systematic World Wide Web search was conducted in May 2002 to assess the availability of genetic services sold directly to consumers on the Internet. Results Out of 105 sites that offered genetic services directly, most offered non–health-related services, including parentage confirmation testing (83%), identity testing (56%), and DNA banking (24%); however, health-related genetic tests were offered through 14 sites (13%). The health-related genetic tests available ranged from standard tests, such as hemochromatosis and cystic fibrosis, to more unconventional tests related to nutrition, behavior, and aging. Of these 14 sites, 5 described risks associated with the genetic services and 6 described the availability of counseling. Conclusions The availability of direct sales of health-related genetic tests creates the potential for inadequate pretest decision making, misunderstanding test results, and access to tests of questionable clinical value. PMID:12865763

  11. An Unconditional Test for Change Point Detection in Binary Sequences with Applications to Clinical Registries.

    PubMed

    Ellenberger, David; Friede, Tim

    2016-08-05

    Methods for change point (also sometimes referred to as threshold or breakpoint) detection in binary sequences are not new and were introduced as early as 1955. Much of the research in this area has focussed on asymptotic and exact conditional methods. Here we develop an exact unconditional test. An unconditional exact test is developed which assumes the total number of events as random instead of conditioning on the number of observed events. The new test is shown to be uniformly more powerful than Worsley's exact conditional test and means for its efficient numerical calculations are given. Adaptions of methods by Berger and Boos are made to deal with the issue that the unknown event probability imposes a nuisance parameter. The methods are compared in a Monte Carlo simulation study and applied to a cohort of patients undergoing traumatic orthopaedic surgery involving external fixators where a change in pin site infections is investigated. The unconditional test controls the type I error rate at the nominal level and is uniformly more powerful than (or to be more precise uniformly at least as powerful as) Worsley's exact conditional test which is very conservative for small sample sizes. In the application a beneficial effect associated with the introduction of a new treatment procedure for pin site care could be revealed. We consider the new test an effective and easy to use exact test which is recommended in small sample size change point problems in binary sequences.
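
    The flavour of such an analysis can be sketched by scanning all split points of a binary sequence with a likelihood-ratio statistic and calibrating it by Monte Carlo, taking the worst case over a grid of nuisance event probabilities; this is only a rough stand-in for the exact unconditional test and the Berger-Boos confidence-set device described in the paper.

    ```python
    import numpy as np

    def lr_scan(x):
        """Maximum log-likelihood-ratio statistic over all split points of a 0/1 sequence."""
        x = np.asarray(x, dtype=int)
        n = x.size

        def ll(k, m):  # binomial log-likelihood evaluated at its MLE
            if k == 0 or k == m:
                return 0.0
            p = k / m
            return k * np.log(p) + (m - k) * np.log(1 - p)

        total = ll(x.sum(), n)
        return max(ll(x[:c].sum(), c) + ll(x[c:].sum(), n - c) - total for c in range(1, n))

    def mc_pvalue(x, n_sim=500, p_grid=np.linspace(0.05, 0.5, 10), seed=0):
        """Monte Carlo p-value maximized over a grid of nuisance event probabilities."""
        rng = np.random.default_rng(seed)
        obs = lr_scan(x)
        worst = 0.0
        for p in p_grid:
            sims = np.array([lr_scan(rng.random(len(x)) < p) for _ in range(n_sim)])
            worst = max(worst, float(np.mean(sims >= obs)))
        return obs, worst

    # Hypothetical pin-site infection indicators: the rate appears to drop after patient 30.
    rng = np.random.default_rng(2)
    x = np.r_[rng.random(30) < 0.4, rng.random(30) < 0.1].astype(int)
    stat, p = mc_pvalue(x)
    print(f"scan statistic = {stat:.2f}, worst-case Monte Carlo p-value ~ {p:.3f}")
    ```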

  12. Genotype-environment interaction and stability in ten-year height growth of Norway spruce Clones (Picea abies Karst.).

    Treesearch

    J.B. St. Clair; J. Kleinschmit

    1986-01-01

    Norway spruce cuttings of 40 clones were tested on seven contrasting sites in northern Germany. Analysis of variance for ten-year height growth indicate a highly significant clone x site interaction. This interaction may be reduced by selection of stable clones. Several measures of stability were calculated and discussed. Characterization of sites by the method of...

  13. Investigations in site response from ground motion observations in vertical arrays

    NASA Astrophysics Data System (ADS)

    Baise, Laurie Gaskins

    The aim of the research is to improve the understanding of earthquake site response and to improve the techniques available to investigate issues in this field. Vertical array ground motion data paired with the empirical transfer function (ETF) methodology is shown to accurately characterize site response. This manuscript draws on methods developed in the field of signal processing and statistical time series analysis to parameterize the ETF as an autoregressive moving-average (ARMA) system which is justified theoretically, historically, and by example. Site response is evaluated at six sites in California, Japan, and Taiwan using ETF estimates, correlation analysis, and full waveform modeling. Correlation analysis is proposed as a required data quality evaluation imperative to any subsequent site response analysis. ETF estimates and waveform modeling are used to decipher the site response at sites with simple and complex geologic structure, which provide simple time-invariant and time-variant methods for evaluating both linear site transfer functions and nonlinear site response for sites experiencing liquefaction of the soils. The Treasure and Yerba Buena Island sites, however, require 2-D waveform modeling to accurately evaluate the effects of the shallow sedimentary basin. ETFs are used to characterize the Port Island site and corresponding shake table tests before, during, and after liquefaction. ETFs derived from the shake table tests were demonstrated to consistently predict the linear field ground response below 16 m depth and the liquefied behavior above 15 m depth. The liquefied interval response was demonstrated to gradually return to pre-liquefied conditions within several weeks of the 1995 Hyogo-ken Nanbu earthquake. Both the site's and the shake table test's response were shown to be effectively linear up to 0.5 g in the native materials below 16 m depth. The effective linearity of the site response at GVDA, Chiba, and Lotung up to 0.1 g, 0.33 g, and 0.49 g, respectively, further confirms that site response in the field may be more linear than expected from laboratory tests. Strong motions were predicted at these sites with normalized mean square error less than 0.10 using ETFs generated from weak motions. The Treasure Island site response was shown to be dominated by surface waves propagating in the shallow sediments of the San Francisco Bay. Low correlation of the ground motions recorded on rock at Yerba Buena Island and in rock beneath the Treasure Island site intimates that the Yerba Buena site is an inappropriate reference site for Treasure Island site response studies. Accurate simulation of the Treasure Island site response was achieved using a 2-D velocity structure comprised of a 100 m uniform soil basin (Vs = 400 m/s) over a weathered rock veneer (Vs = 1.5 km/s) to 200 m depth.
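
    An empirical transfer function between a downhole (input) and surface (output) record can be sketched with Welch cross-spectral estimates, as below on synthetic signals; the ARMA parameterization and the array data analysed in the dissertation are not reproduced here.

    ```python
    import numpy as np
    from scipy import signal

    fs = 100.0                                  # sampling rate, Hz
    rng = np.random.default_rng(0)
    n = int(60 * fs)

    # Synthetic "downhole" (input) motion: band-limited noise.
    b_lp, a_lp = signal.butter(4, 20, fs=fs)
    downhole = signal.lfilter(b_lp, a_lp, rng.standard_normal(n))

    # Synthetic "surface" (output) motion: input amplified around a 3 Hz site resonance.
    b_pk, a_pk = signal.iirpeak(3.0, Q=5.0, fs=fs)
    surface = 2.0 * signal.lfilter(b_pk, a_pk, downhole) + 0.05 * rng.standard_normal(n)

    # Empirical transfer function |H(f)| = |Sxy| / Sxx from Welch/CSD estimates.
    f, Sxx = signal.welch(downhole, fs=fs, nperseg=1024)
    _, Sxy = signal.csd(downhole, surface, fs=fs, nperseg=1024)
    H = np.abs(Sxy) / Sxx

    mask = (f > 0.5) & (f < 10.0)
    print(f"apparent site resonance near {f[mask][np.argmax(H[mask])]:.1f} Hz")
    ```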

  14. Improving the quality of parameter estimates obtained from slug tests

    USGS Publications Warehouse

    Butler, J.J.; McElwee, C.D.; Liu, W.

    1996-01-01

    The slug test is one of the most commonly used field methods for obtaining in situ estimates of hydraulic conductivity. Despite its prevalence, this method has received criticism from many quarters in the ground-water community. This criticism emphasizes the poor quality of the estimated parameters, a condition that is primarily a product of the somewhat casual approach that is often employed in slug tests. Recently, the Kansas Geological Survey (KGS) has pursued research directed at improving methods for the performance and analysis of slug tests. Based on extensive theoretical and field research, a series of guidelines have been proposed that should enable the quality of parameter estimates to be improved. The most significant of these guidelines are: (1) three or more slug tests should be performed at each well during a given test period; (2) two or more different initial displacements (Ho) should be used at each well during a test period; (3) the method used to initiate a test should enable the slug to be introduced in a near-instantaneous manner and should allow a good estimate of Ho to be obtained; (4) data-acquisition equipment that enables a large quantity of high quality data to be collected should be employed; (5) if an estimate of the storage parameter is needed, an observation well other than the test well should be employed; (6) the method chosen for analysis of the slug-test data should be appropriate for site conditions; (7) use of pre- and post-analysis plots should be an integral component of the analysis procedure, and (8) appropriate well construction parameters should be employed. Data from slug tests performed at a number of KGS field sites demonstrate the importance of these guidelines.
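
    For illustration, the widely used Hvorslev analysis shows how a single slug-test record yields a hydraulic conductivity estimate; the geometry and head data below are invented, and guideline (6) above is precisely about matching the analysis method to site conditions rather than defaulting to one formula.

    ```python
    import numpy as np

    # Hypothetical slug-test record: normalized head H(t)/H0 versus time (s)
    t = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
    h_ratio = np.array([0.78, 0.61, 0.37, 0.135, 0.018])

    # Basic time lag T0: time at which H/H0 = 0.37 (ln(H/H0) = -1),
    # taken from a log-linear fit to the recovery data.
    slope = np.polyfit(t, np.log(h_ratio), 1)[0]   # d ln(H/H0) / dt
    T0 = -1.0 / slope

    # Hvorslev formula for a screened interval with Le/R > 8:
    #   K = rc^2 * ln(Le/R) / (2 * Le * T0)
    rc, R, Le = 0.05, 0.05, 1.5   # casing radius, screen radius, screen length (m)
    K = rc**2 * np.log(Le / R) / (2 * Le * T0)
    print(f"basic time lag T0 ~ {T0:.0f} s, Hvorslev K ~ {K:.1e} m/s")
    ```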

  15. Detection and Identification of Small Seismic Events Following the 3 September 2017 UNT Around North Korean Nuclear Test Site

    NASA Astrophysics Data System (ADS)

    Kim, W. Y.; Richards, P. G.

    2017-12-01

    At least four small seismic events were detected around the North Korean nuclear test site following the 3 September 2017 underground nuclear test. The magnitudes of these shocks range from 2.6 to 3.5. Based on their proximity to the September 3 UNT, these shocks may be considered aftershocks of the UNT. We assess the best method to classify these small events based on spectral amplitude ratios of regional P and S waves from the shocks. None of these shocks are classified as explosion-like based on P/S spectral amplitude ratios. We examine additional possible small seismic events around the North Korean test site by using seismic data from stations in southern Korea and northeastern China including IMS seismic arrays, GSN stations, and regional network stations in the region.
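
    The P/S discriminant mentioned above can be sketched in a few lines: window the P and S arrivals, compute high-frequency amplitude spectra, and compare their ratio. The waveform, window times, frequency band and interpretation below are all placeholders.

    ```python
    import numpy as np

    def band_spectral_amplitude(trace, fs, fmin, fmax):
        """Mean Fourier amplitude of a tapered window in the [fmin, fmax] Hz band."""
        spec = np.abs(np.fft.rfft(trace * np.hanning(trace.size)))
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
        band = (freqs >= fmin) & (freqs <= fmax)
        return spec[band].mean()

    def p_over_s_ratio(waveform, fs, p_window, s_window, fmin=6.0, fmax=12.0):
        """log10 P/S spectral amplitude ratio for one regional record."""
        p0, p1 = (int(x * fs) for x in p_window)
        s0, s1 = (int(x * fs) for x in s_window)
        p_amp = band_spectral_amplitude(waveform[p0:p1], fs, fmin, fmax)
        s_amp = band_spectral_amplitude(waveform[s0:s1], fs, fmin, fmax)
        return np.log10(p_amp / s_amp)

    # Synthetic record: weak "P" burst followed by a stronger "S" burst in noise.
    fs = 40.0
    rng = np.random.default_rng(4)
    x = 0.1 * rng.standard_normal(int(120 * fs))
    x[int(20 * fs):int(25 * fs)] += 0.5 * rng.standard_normal(int(5 * fs))    # P window
    x[int(50 * fs):int(60 * fs)] += 1.5 * rng.standard_normal(int(10 * fs))   # S window

    ratio = p_over_s_ratio(x, fs, p_window=(20, 25), s_window=(50, 60))
    print(f"log10(P/S) = {ratio:.2f}  (earthquakes tend to have lower values than explosions)")
    ```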

  16. 40 CFR Table C-5 to Subpart C of... - Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Campaign Site and Seasonal Requirements for Class II and III FEMs for PM 10-2.5 and PM 2.5 C Table C-5 to Subpart C of Part 53 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Between Candidate Methods and Reference Methods Pt. 53, Subpt. C, Table C-5 Table C-5 to Subpart C of Part...

  17. 40 CFR Table C-5 to Subpart C of... - Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Campaign Site and Seasonal Requirements for Class II and III FEMs for PM 10-2.5 and PM 2.5 C Table C-5 to Subpart C of Part 53 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Between Candidate Methods and Reference Methods Pt. 53, Subpt. C, Table C-5 Table C-5 to Subpart C of Part...

  18. Radiometric characterization of hyperspectral imagers using multispectral sensors

    NASA Astrophysics Data System (ADS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-08-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Often, ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.
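
    The band-averaging step amounts to a weighted mean of the hyperspectral radiances with the multispectral band's relative spectral response; the sketch below uses a made-up Gaussian response and a synthetic spectrum rather than real Hyperion data or MODIS RSR tables.

    ```python
    import numpy as np

    # Hypothetical hyperspectral samples: wavelength (nm) and at-sensor radiance (arbitrary units)
    wl = np.arange(400.0, 1001.0, 10.0)                             # ~10 nm sampling
    radiance = 100.0 * np.exp(-0.5 * ((wl - 650.0) / 180.0) ** 2)   # synthetic smooth spectrum

    # Hypothetical multispectral band: Gaussian relative spectral response, 645 nm centre, 50 nm FWHM
    center, fwhm = 645.0, 50.0
    rsr = np.exp(-0.5 * ((wl - center) / (fwhm / 2.3548)) ** 2)

    # Band-averaged radiance: RSR-weighted mean over the hyperspectral samples
    band_avg = np.sum(rsr * radiance) / np.sum(rsr)
    print(f"band-averaged radiance for the {center:.0f} nm band: {band_avg:.2f}")
    ```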

  19. Radiometric Characterization of Hyperspectral Imagers using Multispectral Sensors

    NASA Technical Reports Server (NTRS)

    McCorkel, Joel; Thome, Kurt; Leisso, Nathan; Anderson, Nikolaus; Czapla-Myers, Jeff

    2009-01-01

    The Remote Sensing Group (RSG) at the University of Arizona has a long history of using ground-based test sites for the calibration of airborne and satellite based sensors. Often, ground-truth measurements at these test sites are not always successful due to weather and funding availability. Therefore, RSG has also employed automated ground instrument approaches and cross-calibration methods to verify the radiometric calibration of a sensor. The goal in the cross-calibration method is to transfer the calibration of a well-known sensor to that of a different sensor. This work studies the feasibility of determining the radiometric calibration of a hyperspectral imager using multispectral imagery. The work relies on the Moderate Resolution Imaging Spectroradiometer (MODIS) as a reference for the hyperspectral sensor Hyperion. Test sites used for comparisons are Railroad Valley in Nevada and a portion of the Libyan Desert in North Africa. Hyperion bands are compared to MODIS by band averaging Hyperion's high spectral resolution data with the relative spectral response of MODIS. The results compare cross-calibration scenarios that differ in image acquisition coincidence, test site used for the calibration, and reference sensor. Cross-calibration results are presented that show agreement between the use of coincident and non-coincident image pairs within 2% in most bands as well as similar agreement between results that employ the different MODIS sensors as a reference.

  20. 40 CFR 63.694 - Testing methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... material stream shall be collected from the container, pipeline, or other device used to deliver the off... off-site material streams at the point-of-delivery for compliance with standards specified § 63.683 of... determine the average VOHAP concentration for treated off-site material streams at the point-of-treatment...

  1. 40 CFR 63.5719 - How do I conduct a performance test?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... sampling sites. (2) Use Method 2, 2A, 2C, 2D, 2F or 2G of appendix A to 40 CFR part 60, as appropriate, to... organic HAP emissions. (4) You may use American Society for Testing and Materials (ASTM) D6420-99... respect to the types of parts being made and material application methods. The production conditions...

  2. 40 CFR 63.5719 - How do I conduct a performance test?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... sampling sites. (2) Use Method 2, 2A, 2C, 2D, 2F or 2G of appendix A to 40 CFR part 60, as appropriate, to... organic HAP emissions. (4) You may use American Society for Testing and Materials (ASTM) D6420-99... respect to the types of parts being made and material application methods. The production conditions...

  3. 40 CFR 63.5719 - How do I conduct a performance test?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... sampling sites. (2) Use Method 2, 2A, 2C, 2D, 2F or 2G of appendix A to 40 CFR part 60, as appropriate, to... organic HAP emissions. (4) You may use American Society for Testing and Materials (ASTM) D6420-99... respect to the types of parts being made and material application methods. The production conditions...

  4. 40 CFR 63.7824 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this.... (3) Establish revised operating limits according to the applicable procedures in paragraphs (a) and... 40 Protection of Environment 14 2014-07-01 2014-07-01 false What test methods and other procedures...

  5. 40 CFR 63.7824 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this.... (3) Establish revised operating limits according to the applicable procedures in paragraphs (a) and... 40 Protection of Environment 14 2012-07-01 2011-07-01 true What test methods and other procedures...

  6. Security concept in 'MyAngelWeb' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli, F; Nahaissi, D; Boschini, M; Ferrari, R; Meloni, G; Camnasio, M; Spaggiari, P; Carnerone, G

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.

  7. Security concept in 'MyAngelWeb®' a website for the individual patient at risk of emergency.

    PubMed

    Pinciroli; Nahaissi; Boschini; Ferrari; Meloni; Camnasio; Spaggiari; Carnerone

    2000-11-01

    We describe the Security Plan for the 'MyAngelWeb' service. The different actors involved in the service are subject to different security procedures. The core of the security system is implemented at the host site by means of a DBMS and standard Information Technology tools. Hardware requirements for sustainable security are needed at the web-site construction sites. They are not needed at the emergency physician's site. At the emergency physician's site, a two-way authentication system (password and test phrase method) is implemented.

  8. Limitations of on-site dairy farm regulatory debits as milk quality predictors.

    PubMed

    Borneman, Darand L; Stiegert, Kyle; Ingham, Steve

    2015-03-01

    In the United States, compliance with grade A raw fluid milk regulatory standards is assessed via laboratory milk quality testing and by on-site inspection of producers (farms). This study evaluated the correlation between on-site survey debits being marked and somatic cell count (SCC) or standard plate count (SPC) laboratory results for 1,301 Wisconsin grade A dairy farms in 2012. Debits recorded on the survey form were tested as predictors of laboratory results utilizing ordinary least squares regression to determine if results of the current method for on-site evaluation of grade A dairy farms accurately predict SCC and SPC test results. Such a correlation may indicate that current methods of on-site inspection serve the primary intended purpose of assuring availability of high-quality milk. A model for predicting SCC was estimated using ordinary least squares regression methods. Step-wise selected regressors of grouped debit items were able to predict SCC levels with some degree of accuracy (adjusted R2=0.1432). Specific debit items, seasonality, and farm size were the best predictors of SCC levels. The SPC data presented an analytical challenge because over 75% of the SPC observations were at or below a 25,000 cfu/mL threshold but were recorded by testing laboratories as at the threshold value. This classic censoring problem necessitated the use of a Tobit regression approach. Even with this approach, prediction of SPC values based on on-site survey criteria was much less successful (adjusted R2=0.034) and provided little support for the on-site survey system as a way to inform farmers about making improvements that would improve SPC. The lower level of correlation with SPC may indicate that factors affecting SPC are more varied and differ from those affecting SCC. Further, unobserved deficiencies in postmilking handling and storage sanitation could enhance bacterial growth and increase SPC, whereas postmilking sanitation will have no effect on SCC because somatic cells do not reproduce in stored milk. Results suggest that close examination, and perhaps redefinition, of survey debits, along with making the survey coincident with SCC and SPC sampling, could make the on-site survey a better tool for ensuring availability of high-quality milk. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
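
    The censoring issue can be illustrated with a Tobit-style fit: when SPC values at or below the 25,000 cfu/mL threshold are recorded at the threshold, a maximum-likelihood model that treats those observations as left-censored is one way to proceed. The data and single predictor below are synthetic; the study's model used many survey-debit regressors.

    ```python
    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(5)
    n = 400
    x = rng.normal(size=n)                                        # synthetic survey-debit score
    y_latent = 10.2 + 0.4 * x + rng.normal(scale=0.6, size=n)     # latent log(SPC)
    c = np.log(25000.0)                                           # censoring threshold, log scale
    y = np.maximum(y_latent, c)                                   # censored values recorded at threshold
    censored = y_latent <= c

    def negloglik(theta):
        """Negative log-likelihood of a left-censored (Tobit) normal regression."""
        b0, b1, log_sigma = theta
        sigma = np.exp(log_sigma)
        mu = b0 + b1 * x
        ll_obs = stats.norm.logpdf(y[~censored], mu[~censored], sigma)
        ll_cen = stats.norm.logcdf((c - mu[censored]) / sigma)
        return -(ll_obs.sum() + ll_cen.sum())

    res = optimize.minimize(negloglik, x0=[y.mean(), 0.0, 0.0], method="Nelder-Mead")
    b0, b1, log_sigma = res.x
    print(f"Tobit estimates: intercept={b0:.2f}, slope={b1:.2f}, sigma={np.exp(log_sigma):.2f}")
    ```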

  9. The Relationship between SW-846, PBMS, and Innovative Analytical Technologies

    EPA Pesticide Factsheets

    This paper explains EPA's position regarding testing methods used within waste programs, documentation of EPA's position, the reasoning behind EPA's position, and the relationship between analytical method regulatory flexibility and the use of on-site...

  10. True versus Apparent Malaria Infection Prevalence: The Contribution of a Bayesian Approach

    PubMed Central

    Claes, Filip; Van Hong, Nguyen; Torres, Kathy; Mao, Sokny; Van den Eede, Peter; Thi Thinh, Ta; Gamboa, Dioni; Sochantha, Tho; Thang, Ngo Duc; Coosemans, Marc; Büscher, Philippe; D'Alessandro, Umberto; Berkvens, Dirk; Erhart, Annette

    2011-01-01

    Aims To present a new approach for estimating the “true prevalence” of malaria and apply it to datasets from Peru, Vietnam, and Cambodia. Methods Bayesian models were developed for estimating both the malaria prevalence using different diagnostic tests (microscopy, PCR & ELISA), without the need of a gold standard, and the tests' characteristics. Several sources of information, i.e. data, expert opinions and other sources of knowledge can be integrated into the model. This approach resulting in an optimal and harmonized estimate of malaria infection prevalence, with no conflict between the different sources of information, was tested on data from Peru, Vietnam and Cambodia. Results Malaria sero-prevalence was relatively low in all sites, with ELISA showing the highest estimates. The sensitivity of microscopy and ELISA were statistically lower in Vietnam than in the other sites. Similarly, the specificities of microscopy, ELISA and PCR were significantly lower in Vietnam than in the other sites. In Vietnam and Peru, microscopy was closer to the “true” estimate than the other 2 tests while as expected ELISA, with its lower specificity, usually overestimated the prevalence. Conclusions Bayesian methods are useful for analyzing prevalence results when no gold standard diagnostic test is available. Though some results are expected, e.g. PCR more sensitive than microscopy, a standardized and context-independent quantification of the diagnostic tests' characteristics (sensitivity and specificity) and the underlying malaria prevalence may be useful for comparing different sites. Indeed, the use of a single diagnostic technique could strongly bias the prevalence estimation. This limitation can be circumvented by using a Bayesian framework taking into account the imperfect characteristics of the currently available diagnostic tests. As discussed in the paper, this approach may further support global malaria burden estimation initiatives. PMID:21364745
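
    Stripped of the multi-test hierarchical structure used in the paper, the core idea can be sketched for a single test: with priors on sensitivity and specificity, relate the true prevalence p to the apparent prevalence via p_app = p*Se + (1-p)*(1-Sp) and average the resulting posterior over prior draws. All counts and prior parameters below are invented.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)

    # Hypothetical survey: 60 test-positives out of 500 sampled individuals
    y, n = 60, 500

    # Priors encoding expert opinion on the diagnostic test (illustrative only)
    se_draws = stats.beta(40, 10).rvs(4000, random_state=rng)   # sensitivity around 0.8
    sp_draws = stats.beta(90, 10).rvs(4000, random_state=rng)   # specificity around 0.9

    p_grid = np.linspace(0.0, 0.5, 501)            # candidate true prevalences (flat prior)
    posterior = np.zeros_like(p_grid)
    for se, sp in zip(se_draws, sp_draws):
        p_app = p_grid * se + (1 - p_grid) * (1 - sp)   # apparent (test-positive) prevalence
        posterior += stats.binom.pmf(y, n, p_app)
    posterior /= posterior.sum()

    mean_p = (p_grid * posterior).sum()
    cdf = posterior.cumsum()
    lo, hi = p_grid[np.searchsorted(cdf, 0.025)], p_grid[np.searchsorted(cdf, 0.975)]
    print(f"posterior mean true prevalence ~ {mean_p:.3f} (95% CrI {lo:.3f}-{hi:.3f})")
    ```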

  11. Evaluation and Uncertainty of a New Method to Detect Suspected Nuclear and WMD Activity: Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurzeja, R.; Werth, D.; Buckley, R.

    The Atmospheric Technology Group at SRNL developed a new method to detect signals from Weapons of Mass Destruction (WMD) activities in a time series of chemical measurements at a downwind location. This method was tested with radioxenon measured in Russia and Japan after the 2013 underground test in North Korea. This LDRD calculated the uncertainty in the method with the measured data and also for a case with the signal reduced to 1/10 its measured value. The research showed that the uncertainty in the calculated probability of origin from the NK test site was small enough to confirm the test. The method was also well-behaved for small signal strengths.

  12. Testing contamination risk assessment methods for toxic elements from mine waste sites

    NASA Astrophysics Data System (ADS)

    Abdaal, A.; Jordan, G.; Szilassi, P.; Kiss, J.; Detzky, G.

    2012-04-01

    Major incidents involving mine waste facilities and poor environmental management practices have left a legacy of thousands of contaminated sites like in the historic mining areas in the Carpathian Basin. Associated environmental risks have triggered the development of new EU environmental legislation to prevent and minimize the effects of such incidents. The Mine Waste Directive requires the risk-based inventory of all mine waste sites in Europe by May 2012. In order to address the mining problems a standard risk-based Pre-selection protocol has been developed by the EU Commission. This paper discusses the heavy metal contamination in acid mine drainage (AMD) for risk assessment (RA) along the Source-Pathway-Receptor chain using decision support methods which are intended to aid national and regional organizations in the inventory and assessment of potentially contaminated mine waste sites. Several recognized methods such as the European Environmental Agency (EEA) standard PRAMS model for soil contamination, US EPA-based AIMSS and Irish HMS-IRC models for RA of abandoned sites are reviewed, compared and tested for the mining waste environment. In total 145 ore mine waste sites have been selected for scientific testing using the EU Pre-selection protocol as a case study from Hungary. The proportion of uncertain to certain responses for a site and for the total number of sites may give insight into specific and overall uncertainty in the data we use. The Pre-selection questions are efficiently linked to a GIS system as database inquiries using digital spatial data to directly generate answers. Key parameters such as distance to the nearest surface and ground water bodies, to settlements and protected areas are calculated and statistically evaluated using STATGRAPHICS® in order to calibrate the RA models. According to our scientific research results, of the 145 sites 11 sites are the most risky having foundation slope >20°, 57 sites are within distance <500m to the nearest surface water bodies, and 33 sites are within distance <680m to the nearest settlements. Moreover, 25 sites lie directly above the 'poor status' ground water bodies and 91 sites are located in the protected Natura2000 sites (distance =0). Analysis of the total score of all sites was performed, resulting in six risk classes, as follows: <21 (class I, 4 sites), 21-31 (class II, 16 sites), 31-42 (class III, 27 sites), 42-54 (class IV, 38 sites), 54-66 (class V, 40 sites) and >66 (class VI, 20 sites). The total risk scores and key parameters are provided in separate tables and GIS maps, in order to facilitate interpretation and comparison. Results of the Pre-selection protocol are consistent with those of the screening PRAMS model. KEY WORDS contamination risk assessment, Mine Waste Directive, Pre-selection Protocol, PRA.MS, AIMSS, abandoned mine sites, GIS
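
    The GIS-linked part of such a Pre-selection exercise reduces, in essence, to computing distances from each waste site to the nearest receptor or pathway feature and turning them into screening answers. A toy sketch with invented coordinates follows; the real work used national spatial databases and the protocol's own scoring rules.

    ```python
    import numpy as np

    # Hypothetical projected coordinates (metres) of mine-waste sites and receptors
    sites = np.array([[1000.0, 2000.0], [5200.0, 800.0], [9100.0, 4300.0]])
    surface_waters = np.array([[1350.0, 2250.0], [8000.0, 5000.0]])
    settlements = np.array([[3000.0, 2500.0], [9500.0, 4100.0]])

    def nearest_distance(points, targets):
        """Distance from each point to its nearest target feature."""
        d = np.linalg.norm(points[:, None, :] - targets[None, :, :], axis=2)
        return d.min(axis=1)

    d_water = nearest_distance(sites, surface_waters)
    d_town = nearest_distance(sites, settlements)

    # Illustrative screening answers, in the spirit of the protocol's distance questions
    for i, (dw, dt) in enumerate(zip(d_water, d_town), start=1):
        flags = []
        if dw < 500:
            flags.append("surface water within 500 m")
        if dt < 680:
            flags.append("settlement within 680 m")
        label = ", ".join(flags) if flags else "no distance flag"
        print(f"site {i}: d_water={dw:.0f} m, d_settlement={dt:.0f} m -> {label}")
    ```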

  13. Alternative Chemical Cleaning Methods for High Level Waste Tanks: Simulant Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudisill, T.; King, W.; Hay, M.

    Solubility testing with simulated High Level Waste tank heel solids has been conducted in order to evaluate two alternative chemical cleaning technologies for the dissolution of sludge residuals remaining in the tanks after the exhaustion of mechanical cleaning and sludge washing efforts. Tests were conducted with non-radioactive pure phase metal reagents, binary mixtures of reagents, and a Savannah River Site PUREX heel simulant to determine the effectiveness of an optimized, dilute oxalic/nitric acid cleaning reagent and pure, dilute nitric acid toward dissolving the bulk non-radioactive waste components. A focus of this testing was on minimization of oxalic acid additions during tank cleaning. For comparison purposes, separate samples were also contacted with pure, concentrated oxalic acid which is the current baseline chemical cleaning reagent. In a separate study, solubility tests were conducted with radioactive tank heel simulants using acidic and caustic permanganate-based methods focused on the “targeted” dissolution of actinide species known to be drivers for Savannah River Site tank closure Performance Assessments. Permanganate-based cleaning methods were evaluated prior to and after oxalic acid contact.

  14. Which came first, people or pollution? Assessing the disparate siting and post-siting demographic change hypotheses of environmental injustice

    NASA Astrophysics Data System (ADS)

    Mohai, Paul; Saha, Robin

    2015-11-01

    Although a large body of quantitative environmental justice research exists, only a handful of studies have examined the processes by which racial and socioeconomic disparities in the location of polluting industrial facilities can occur. These studies have had mixed results, we contend, principally because of methodological differences, that is, the use of the unit-hazard coincidence method as compared to distance-based methods. This study is the first national-level environmental justice study to conduct longitudinal analyses using distance-based methods. Our purposes are to: (1) determine whether disparate siting, post-siting demographic change, or a combination of the two created present-day disparities; (2) test related explanations; and (3) determine whether the application of distance-based methods helps resolve the inconsistent findings of previous research. We used a national database of commercial hazardous waste facilities sited from 1966 to 1995 and examined the demographic composition of host neighborhoods around the time of siting and demographic changes that occurred after siting. We found strong evidence of disparate siting for facilities sited in all time periods. Although we found some evidence of post-siting demographic changes, they were mostly a continuation of changes that occurred in the decade or two prior to siting, suggesting that neighborhood transition serves to attract noxious facilities rather than the facilities themselves attracting people of color and low income populations. Our findings help resolve inconsistencies among the longitudinal studies and build on the evidence from other subnational studies that used distance-based methods. We conclude that racial discrimination and sociopolitical explanations (i.e., the proposition that siting decisions follow the ‘path of least resistance’) best explain present-day inequities.

  15. Knowledge-transfer learning for prediction of matrix metalloprotease substrate-cleavage sites.

    PubMed

    Wang, Yanan; Song, Jiangning; Marquez-Lago, Tatiana T; Leier, André; Li, Chen; Lithgow, Trevor; Webb, Geoffrey I; Shen, Hong-Bin

    2017-07-18

    Matrix Metalloproteases (MMPs) are an important family of proteases that play crucial roles in key cellular and disease processes. Therefore, MMPs constitute important targets for drug design, development and delivery. Advanced proteomic technologies have identified type-specific target substrates; however, the complete repertoire of MMP substrates remains uncharacterized. Indeed, computational prediction of substrate-cleavage sites associated with MMPs is a challenging problem. This holds especially true when considering MMPs with few experimentally verified cleavage sites, such as for MMP-2, -3, -7, and -8. To fill this gap, we propose a new knowledge-transfer computational framework which effectively utilizes the hidden shared knowledge from some MMP types to enhance predictions of other, distinct target substrate-cleavage sites. Our computational framework uses support vector machines combined with transfer machine learning and feature selection. To demonstrate the value of the model, we extracted a variety of substrate sequence-derived features and compared the performance of our method using both 5-fold cross-validation and independent tests. The results show that our transfer-learning-based method provides a robust performance, which is at least comparable to traditional feature-selection methods for prediction of MMP-2, -3, -7, -8, -9 and -12 substrate-cleavage sites on independent tests. The results also demonstrate that our proposed computational framework provides a useful alternative for the characterization of sequence-level determinants of MMP-substrate specificity.
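
    One simple way to realize the knowledge-transfer idea, loosely in the spirit of the framework above, is an instance-weighted SVM: pool examples from a data-rich MMP type with the few examples of the target MMP and give the target examples larger sample weights. Features, data and weights here are synthetic placeholders, not the paper's descriptors or transfer algorithm.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(7)

    def fake_peptide_features(n, shift):
        """Stand-in for sequence-derived features of substrate-cleavage windows."""
        return rng.normal(loc=shift, scale=1.0, size=(n, 20))

    # Source MMP: many labelled examples; target MMP: few examples, slightly shifted distribution.
    Xs = np.vstack([fake_peptide_features(200, 0.0), fake_peptide_features(200, 1.0)])
    ys = np.r_[np.zeros(200), np.ones(200)]
    Xt = np.vstack([fake_peptide_features(15, 0.3), fake_peptide_features(15, 1.3)])
    yt = np.r_[np.zeros(15), np.ones(15)]
    Xt_test = np.vstack([fake_peptide_features(50, 0.3), fake_peptide_features(50, 1.3)])
    yt_test = np.r_[np.zeros(50), np.ones(50)]

    X = np.vstack([Xs, Xt])
    y = np.r_[ys, yt]
    weights = np.r_[np.ones(len(ys)), 5.0 * np.ones(len(yt))]   # up-weight scarce target data

    clf = SVC(kernel="rbf", gamma="scale").fit(X, y, sample_weight=weights)
    print("accuracy on held-out target-MMP examples:",
          round(accuracy_score(yt_test, clf.predict(Xt_test)), 3))
    ```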

  16. Selection of Atmospheric Environmental Monitoring Sites based on Geographic Parameters Extraction of GIS and Fuzzy Matter-Element Analysis.

    PubMed

    Wu, Jianfa; Peng, Dahao; Ma, Jianhao; Zhao, Li; Sun, Ce; Ling, Huanzhang

    2015-01-01

    To effectively monitor the atmospheric quality of small-scale areas, it is necessary to optimize the locations of the monitoring sites. This study combined geographic parameter extraction by GIS with fuzzy matter-element analysis. Geographic coordinates were extracted by GIS and transformed into rectangular coordinates. These coordinates were input into the Gaussian plume model to calculate the pollutant concentration at each site. Fuzzy matter-element analysis, which is used to solve incompatible problems, was used to select the locations of sites. The matter-element matrices were established according to the concentration parameters. The comprehensive correlation functions KA(xj) and KB(xj), which reflect the degree of correlation among monitoring indices, were solved for each site, and a scatter diagram of the sites was drawn to determine the final positions of the sites based on the functions. The sites could be classified and ultimately selected from the scatter diagram. An actual case was tested, and the results showed that 5 positions can be used for monitoring and that the locations conform to the technical standard. As a cross-check, the hierarchical clustering method was also applied: the sites were classified into 5 types, and 7 locations were selected, 5 of which were completely identical to the sites determined by fuzzy matter-element analysis. The selections according to these two methods are similar, and the methods can be used in combination. In contrast to traditional methods, this study monitors an isolated point pollutant source within a small range, which can reduce the cost of monitoring.
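
    The concentration scoring step above relies on the standard Gaussian plume model. The sketch below shows that formula directly; the dispersion-coefficient power laws and the example source parameters are illustrative placeholders, not values from the study.

```python
# Minimal sketch of the Gaussian plume concentration used to score candidate
# monitoring sites. Dispersion-coefficient power laws are illustrative only.
import numpy as np

def gaussian_plume(x, y, z, Q, u, H, a=0.22, b=0.20):
    """Continuous point source: Q [g/s], u [m/s] wind speed, H [m] release height.
    x is downwind, y crosswind, z vertical distance [m]."""
    sigma_y = a * x**0.89          # illustrative power-law dispersion widths
    sigma_z = b * x**0.89
    coeff = Q / (2.0 * np.pi * u * sigma_y * sigma_z)
    cross = np.exp(-y**2 / (2.0 * sigma_y**2))
    vert = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2)) +
            np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))   # ground reflection term
    return coeff * cross * vert    # concentration in g/m^3

# Example: concentration 500 m downwind at ground level for a 10 g/s source
print(gaussian_plume(x=500.0, y=0.0, z=0.0, Q=10.0, u=3.0, H=20.0))
```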

  17. Eliminating Plasmodium falciparum in Hainan, China: a study on the use of behavioural change communication intervention to promote malaria prevention in mountain worker populations.

    PubMed

    He, Chang-hua; Hu, Xi-min; Wang, Guang-ze; Zhao, Wei; Sun, Ding-wei; Li, Yu-chun; Chen, Chun-xiang; Du, Jian-wei; Wang, Shan-qing

    2014-07-13

    On the island of Hainan, the great majority of malaria cases occur in mountain worker populations. Using the behavioral change communication (BCC) strategy, an interventional study was conducted to promote malaria prevention among mountain workers at a test site. This study identified methods and measures suitable for malaria prevention among mountain worker populations. During the Plasmodium falciparum elimination stage in Hainan, a representative sampling method was used to establish testing and control sites in areas of Hainan that were both affected by malaria and had a relatively high density of mountain workers. Two different methods were used: a BCC strategy and a conventional strategy as a control. Before and after the intervention, house visits, core group discussions, and structured surveys were utilized to collect qualitative and quantitative data regarding mountain worker populations (including knowledge, attitudes, and practices [KAPs]; infection status; and serological data), and these data from the testing and control areas were compared to evaluate the effectiveness of BCC strategies in the prevention of malaria. In the BCC malaria prevention strategy testing areas, the accuracy rates of malaria-related KAP were significantly improved among mountain worker populations. The accuracy rates in the 3 aspects of malaria-related KAP increased from 37.73%, 37.00%, and 43.04% to 89.01%, 91.53%, and 92.25%, respectively. The changes in all 3 aspects of KAP were statistically significant (p < 0.01). In the control sites, the changes in the indices were not as marked as in the testing areas, and the changes were not statistically significant (p > 0.05). Furthermore, in the testing areas, both the percentage testing positive in the serum malaria indirect fluorescent antibody test (IFAT) and the number of people infected decreased more significantly than in the control sites (p < 0.01). The use of the BCC strategy significantly improved the ability of mountain workers in Hainan to avoid malarial infection. Educational and promotional materials and measures were developed and selected in the process, and hands-on experience was gained that will help achieve the goal of total malaria elimination in Hainan.
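
    The before/after KAP comparisons reported above are proportion tests; the sketch below shows the kind of chi-square test involved. The counts are purely hypothetical (chosen to roughly match the reported percentages), not the study's raw data.

```python
# Sketch of a before/after comparison of KAP accuracy rates with a 2x2
# chi-square test; the counts below are hypothetical.
from scipy.stats import chi2_contingency

before_correct, before_total = 377, 1000   # ~37.7% accuracy before BCC
after_correct, after_total = 890, 1000     # ~89.0% accuracy after BCC

table = [[before_correct, before_total - before_correct],
         [after_correct, after_total - after_correct]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")   # p < 0.01 indicates a significant change
```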

  18. Comparing the ISO-recommended and the cumulative data-reduction algorithms in S-on-1 laser damage test by a reverse approach method

    NASA Astrophysics Data System (ADS)

    Zorila, Alexandru; Stratan, Aurel; Nemes, George

    2018-01-01

    We compare the ISO-recommended (the standard) data-reduction algorithm used to determine the surface laser-induced damage threshold of optical materials by the S-on-1 test with two newly suggested algorithms, both named "cumulative" algorithms/methods, a regular one and a limit-case one, intended to perform in some respects better than the standard one. To avoid additional errors due to real experiments, a simulated test is performed, named the reverse approach. This approach simulates the real damage experiments, by generating artificial test-data of damaged and non-damaged sites, based on an assumed, known damage threshold fluence of the target and on a given probability distribution function to induce the damage. In this work, a database of 12 sets of test-data containing both damaged and non-damaged sites was generated by using four different reverse techniques and by assuming three specific damage probability distribution functions. The same value for the threshold fluence was assumed, and a Gaussian fluence distribution on each irradiated site was considered, as usual for the S-on-1 test. Each of the test-data was independently processed by the standard and by the two cumulative data-reduction algorithms, the resulting fitted probability distributions were compared with the initially assumed probability distribution functions, and the quantities used to compare these algorithms were determined. These quantities characterize the accuracy and the precision in determining the damage threshold and the goodness of fit of the damage probability curves. The results indicate that the accuracy in determining the absolute damage threshold is best for the ISO-recommended method, the precision is best for the limit-case of the cumulative method, and the goodness of fit estimator (adjusted R-squared) is almost the same for all three algorithms.
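
    As an illustration of the general idea behind such data reductions (not the ISO or the "cumulative" algorithms themselves), the sketch below bins simulated S-on-1 sites by fluence, computes the damaged fraction per bin, and extrapolates a linear fit of the damage probability to zero to estimate the threshold fluence. The binning, fit, and simulated distribution are all assumptions for illustration.

```python
# Simplified sketch of a damage-probability data reduction: bin test sites by
# fluence, compute the damaged fraction per bin, and extrapolate a linear fit
# to zero probability to estimate the threshold fluence.
import numpy as np

def damage_threshold(fluences, damaged):
    """fluences: per-site peak fluences [J/cm^2]; damaged: 0/1 outcomes."""
    rounded = np.round(fluences, 1)                 # 0.1 J/cm^2 bins
    bins = np.unique(rounded)
    probs = np.array([damaged[rounded == f].mean() for f in bins])
    mask = (probs > 0) & (probs < 1)                # rising part of the curve
    slope, intercept = np.polyfit(bins[mask], probs[mask], 1)
    return -intercept / slope                       # fluence at zero damage probability

rng = np.random.default_rng(1)
f = rng.uniform(1.0, 4.0, 600)                      # simulated site fluences
d = (rng.random(600) < np.clip((f - 2.0) / 1.5, 0, 1)).astype(int)
print(damage_threshold(f, d))                       # should be close to 2.0
```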

  19. Using Downhole Probes to Locate and Characterize Buried Transuranic and Mixed Low Level Waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steinman, Donald K; Bramblett, Richard L; Hertzog, Russel C

    2012-06-25

    Borehole logging probes were developed and tested to locate and quantify transuranic elements in subsurface disposal areas and in contaminated sites across the USDOE Weapons Complex. A new method of measuring very high levels of chlorine in the subsurface was developed using pulsed neutron technology from oilfield applications. The probes were demonstrated at the Hanford site in wells containing plutonium and other contaminants.

  20. Predicting protein amidation sites by orchestrating amino acid sequence features

    NASA Astrophysics Data System (ADS)

    Zhao, Shuqiu; Yu, Hua; Gong, Xiujun

    2017-08-01

    Amidation is the fourth major category of post-translational modifications and plays an important role in physiological and pathological processes. Identifying amidation sites can help us understand amidation and recognize the underlying causes of many kinds of diseases. However, the traditional experimental methods for identifying amidation sites are often time-consuming and expensive. In this study, we propose a computational method for predicting amidation sites by orchestrating amino acid sequence features. Three kinds of feature extraction methods are used to build a feature vector that captures not only the physicochemical properties but also position-related information of the amino acids. An extremely randomized trees algorithm is applied in a supervised fashion to choose the optimal features and remove redundancy and dependence among components of the feature vector. Finally, a support vector machine classifier is used to label the amidation sites. When tested on an independent data set, the proposed method performs better than all previous ones, with a prediction accuracy of 0.962, a Matthews correlation coefficient of 0.89 and an area under the curve of 0.964.
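
    The described pipeline (tree-based feature selection followed by an SVM) maps directly onto standard library components; the sketch below is a minimal assembly of that pipeline, assuming a precomputed feature matrix X and label vector y. Hyperparameters are illustrative, not the authors' settings.

```python
# Sketch of the described workflow: extremely randomized trees select the
# most informative sequence features, then an SVM labels amidation sites.
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import cross_val_score

pipeline = Pipeline([
    ("select", SelectFromModel(ExtraTreesClassifier(n_estimators=200,
                                                    random_state=0))),
    ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
])

# Example usage on a hypothetical feature matrix X and label vector y:
# scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
# print(scores.mean())
```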

  1. 40 CFR Table C-5 to Subpart C of... - Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Campaign Site and Seasonal Requirements for Class II and III FEMs for PM10-2.5 and PM2.5 C Table C-5 to Subpart C of Part 53 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Between Candidate Methods and Reference Methods Pt. 53, Subpt. C, Table C-5 Table C-5 to Subpart C of Part...

  2. 40 CFR Table C-5 to Subpart C of... - Summary of Comparability Field Testing Campaign Site and Seasonal Requirements for Class II and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Campaign Site and Seasonal Requirements for Class II and III FEMs for PM10-2.5 and PM2.5 C Table C-5 to Subpart C of Part 53 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Between Candidate Methods and Reference Methods Pt. 53, Subpt. C, Table C-5 Table C-5 to Subpart C of Part...

  3. Identifying 5-methylcytosine sites in RNA sequence using composite encoding feature into Chou's PseKNC.

    PubMed

    Sabooh, M Fazli; Iqbal, Nadeem; Khan, Mukhtaj; Khan, Muslim; Maqbool, H F

    2018-05-01

    This study presents an accurate and efficient computational method for identifying 5-methylcytosine (m5C) sites in RNA. The occurrence of m5C plays a vital role in a number of biological processes, and precise recognition of m5C sites in RNA is necessary for a better comprehension of these biological functions and mechanisms. Laboratory techniques are available to identify m5C sites in RNA, but these procedures require considerable time and resources. This study develops a new computational method for extracting features from RNA sequences. First, the RNA sequence is encoded as a composite feature vector; then the minimum-redundancy-maximum-relevance (mRMR) algorithm is used to select discriminative features. Second, classification is performed with a support vector machine evaluated by the jackknife cross-validation test. The suggested method efficiently distinguishes m5C sites from non-m5C sites, achieving an overall accuracy of 93.33% with a sensitivity of 90.00% and a specificity of 96.66% on benchmark datasets. These results show that the proposed algorithm delivers significantly better identification performance than existing computational techniques. This study extends knowledge about the occurrence sites of this RNA modification, which paves the way for a better comprehension of its biological roles and mechanisms. Copyright © 2018 Elsevier Ltd. All rights reserved.
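
    A minimal sketch of the overall workflow, under stated assumptions: a mutual-information filter stands in for the mRMR algorithm (mRMR itself is not in scikit-learn), and jackknife cross-validation is implemented as leave-one-out. X is an assumed matrix of composite sequence-encoding features and y the site labels.

```python
# Sketch: feature filtering (stand-in for mRMR) + SVM, scored by jackknife
# (leave-one-out) cross-validation.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def jackknife_accuracy(X, y, k_features=100):
    model = Pipeline([
        ("select", SelectKBest(mutual_info_classif,
                               k=min(k_features, X.shape[1]))),
        ("svm", SVC(kernel="rbf")),
    ])
    preds = cross_val_predict(model, X, y, cv=LeaveOneOut())
    return float(np.mean(preds == y))
```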

  4. Predicting protein-binding regions in RNA using nucleotide profiles and compositions.

    PubMed

    Choi, Daesik; Park, Byungkyu; Chae, Hanju; Lee, Wook; Han, Kyungsook

    2017-03-14

    Motivated by the increased amount of data on protein-RNA interactions and the availability of complete genome sequences of several organisms, many computational methods have been proposed to predict binding sites in protein-RNA interactions. However, most computational methods are limited to finding RNA-binding sites in proteins instead of protein-binding sites in RNAs. Predicting protein-binding sites in RNA is more challenging than predicting RNA-binding sites in proteins. Recent computational methods for finding protein-binding sites in RNAs have several drawbacks for practical use. We developed a new support vector machine (SVM) model for predicting protein-binding regions in mRNA sequences. The model uses sequence profiles constructed from log-odds scores of mono- and di-nucleotides and nucleotide compositions. The model was evaluated by standard 10-fold cross validation, leave-one-protein-out (LOPO) cross validation and independent testing. Since actual mRNA sequences have more non-binding regions than protein-binding regions, we tested the model on several datasets with different ratios of protein-binding regions to non-binding regions. The best performance of the model was obtained in a balanced dataset of positive and negative instances. Ten-fold cross validation with a balanced dataset achieved a sensitivity of 91.6%, a specificity of 92.4%, an accuracy of 92.0%, a positive predictive value (PPV) of 91.7%, a negative predictive value (NPV) of 92.3% and a Matthews correlation coefficient (MCC) of 0.840. LOPO cross validation showed a lower performance than the 10-fold cross validation, but the performance remains high (87.6% accuracy and 0.752 MCC). In testing the model on independent datasets, it achieved an accuracy of 82.2% and an MCC of 0.656. Testing of our model and other state-of-the-art methods on the same dataset showed that our model is better than the others. Sequence profiles of log-odds scores of mono- and di-nucleotides were much more powerful features than nucleotide compositions in finding protein-binding regions in RNA sequences. However, a slight performance gain was obtained when the sequence profiles were used along with nucleotide compositions. These are preliminary results of ongoing research, but demonstrate the potential of our approach as a powerful predictor of protein-binding regions in RNA. The program and supporting data are available at http://bclab.inha.ac.kr/RBPbinding .
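
    The reported metrics all follow from a confusion matrix of predicted versus actual binding labels; the sketch below shows how they are computed. The example labels are purely illustrative, not data from the study.

```python
# Sketch: sensitivity, specificity, PPV, NPV and MCC from a confusion matrix.
import numpy as np
from sklearn.metrics import confusion_matrix, matthews_corrcoef

y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])   # illustrative labels
y_pred = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 0])

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
mcc = matthews_corrcoef(y_true, y_pred)
print(sensitivity, specificity, ppv, npv, mcc)
```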

  5. Human Splice-Site Prediction with Deep Neural Networks.

    PubMed

    Naito, Tatsuhiko

    2018-04-18

    Accurate splice-site prediction is essential to delineate gene structures from sequence data. Several computational techniques have been applied to create systems that predict canonical splice sites. For classification tasks, deep neural networks (DNNs) have achieved record-breaking results and often outperformed other supervised learning techniques. In this study, a new method of splice-site prediction using DNNs was proposed. The proposed system receives input sequence data and returns an answer as to whether it is a splice site. The length of the input is 140 nucleotides, with the consensus sequence (i.e., "GT" and "AG" for the donor and acceptor sites, respectively) in the middle. Each input sequence is applied to the pretrained DNN model, which determines the probability that the input is a splice site. The model consists of convolutional layers and bidirectional long short-term memory network layers. The pretraining and validation were conducted using the data set tested in previously reported methods. The performance evaluation results showed that the proposed method can outperform previous methods. In addition, the patterns learned by the DNNs were visualized as position frequency matrices (PFMs). Some of the PFMs were very similar to the consensus sequence. The trained DNN model and the brief source code for the prediction system have been uploaded. Further improvement is expected with the further development of DNNs.
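
    The record specifies the input (a one-hot encoded 140-nt window) and the layer types (convolutional plus bidirectional LSTM) but not the sizes, so the sketch below uses illustrative layer widths. It is a minimal model of that general shape, not the author's published architecture.

```python
# Minimal sketch of a CNN + bidirectional-LSTM splice-site classifier:
# one-hot 140-nt window in, splice-site probability out. Layer sizes are
# illustrative only.
from tensorflow.keras import layers, models

def build_splice_model(seq_len=140, n_bases=4):
    model = models.Sequential([
        layers.Input(shape=(seq_len, n_bases)),       # one-hot A/C/G/T
        layers.Conv1D(64, kernel_size=9, activation="relu", padding="same"),
        layers.MaxPooling1D(2),
        layers.Bidirectional(layers.LSTM(32)),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),        # P(splice site)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_splice_model()
model.summary()
```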

  6. Locating bomb factories by detecting hydrogen peroxide.

    PubMed

    Romolo, Francesco Saverio; Connell, Samantha; Ferrari, Carlotta; Suarez, Guillaume; Sauvain, Jean-Jacques; Hopf, Nancy B

    2016-11-01

    The analytical capability to detect hydrogen peroxide vapour can play a key role in localizing a site where an H2O2-based Improvised Explosive (IE) is manufactured. In security activities it is very important to obtain information quickly; for this reason, an analytical method used in such activities requires portable devices. The authors have developed the first analytical method based on a portable luminometer, specifically designed and validated to locate IE manufacturing sites using quantitative on-site vapour analysis for H2O2. The method was tested both indoors and outdoors. The results demonstrate that the detection of H2O2 vapours could allow police forces to locate the site while terrorists are preparing an attack. The collected data are also very important in developing new sensors able to give an early alarm when located at a proper distance from a site where an H2O2-based IE is being prepared. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Kinase Identification with Supervised Laplacian Regularized Least Squares

    PubMed Central

    Zhang, He; Wang, Minghui

    2015-01-01

    Phosphorylation is catalyzed by protein kinases and is irreplaceable in regulating biological processes. Identification of phosphorylation sites with their corresponding kinases contributes to the understanding of molecular mechanisms. Mass spectrometry analysis of phospho-proteomes generates a large number of phosphorylated sites. However, experimental methods are costly and time-consuming, and most phosphorylation sites determined by experimental methods lack kinase information. Therefore, computational methods are urgently needed to address the kinase identification problem. To this end, we propose a new kernel-based machine learning method called Supervised Laplacian Regularized Least Squares (SLapRLS), which adopts a new method to construct kernels based on the similarity matrix and minimizes both structural risk and overall inconsistency between labels and similarities. The results predicted using both Phospho.ELM and an additional independent test dataset indicate that SLapRLS can more effectively identify kinases compared to other existing algorithms. PMID:26448296

  8. Kinase Identification with Supervised Laplacian Regularized Least Squares.

    PubMed

    Li, Ao; Xu, Xiaoyi; Zhang, He; Wang, Minghui

    2015-01-01

    Phosphorylation is catalyzed by protein kinases and is irreplaceable in regulating biological processes. Identification of phosphorylation sites with their corresponding kinases contributes to the understanding of molecular mechanisms. Mass spectrometry analysis of phospho-proteomes generates a large number of phosphorylated sites. However, experimental methods are costly and time-consuming, and most phosphorylation sites determined by experimental methods lack kinase information. Therefore, computational methods are urgently needed to address the kinase identification problem. To this end, we propose a new kernel-based machine learning method called Supervised Laplacian Regularized Least Squares (SLapRLS), which adopts a new method to construct kernels based on the similarity matrix and minimizes both structural risk and overall inconsistency between labels and similarities. The results predicted using both Phospho.ELM and an additional independent test dataset indicate that SLapRLS can more effectively identify kinases compared to other existing algorithms.
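
    The supervised Laplacian kernel construction of SLapRLS is not reproduced in this record; the sketch below only shows the plain kernel regularized least-squares core that such methods build on, where the dual weights have the closed form alpha = (K + lambda*I)^(-1) y. The toy RBF kernel and data are assumptions for illustration.

```python
# Sketch of the kernel regularized least-squares step underlying
# SLapRLS-type methods (the supervised Laplacian kernel itself is omitted).
import numpy as np

def rls_fit(K, y, lam=1.0):
    """Closed-form dual weights: alpha = (K + lam*I)^(-1) y."""
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

def rls_predict(K_test_train, alpha):
    # K_test_train[i, j] = kernel(test_i, train_j)
    return K_test_train @ alpha

# Toy usage with an RBF kernel over random points
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = np.sign(X[:, 0])                                   # simple labels
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)                                  # RBF kernel matrix
alpha = rls_fit(K, y, lam=0.1)
print(np.mean(np.sign(rls_predict(K, alpha)) == y))    # training accuracy
```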

  9. Evaluation of multiple tracer methods to estimate low groundwater flow velocities

    DOE PAGES

    Reimus, Paul W.; Arnold, Bill W.

    2017-02-20

    Here, four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or “shut-in” periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a “ground truth” velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. We discuss the advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them.

  10. Evaluation of multiple tracer methods to estimate low groundwater flow velocities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimus, Paul W.; Arnold, Bill W.

    Here, four different tracer methods were used to estimate groundwater flow velocity at a multiple-well site in the saturated alluvium south of Yucca Mountain, Nevada: (1) two single-well tracer tests with different rest or “shut-in” periods, (2) a cross-hole tracer test with an extended flow interruption, (3) a comparison of two tracer decay curves in an injection borehole with and without pumping of a downgradient well, and (4) a natural-gradient tracer test. Such tracer methods are potentially very useful for estimating groundwater velocities when hydraulic gradients are flat (and hence uncertain) and also when water level and hydraulic conductivity data are sparse, both of which were the case at this test location. The purpose of the study was to evaluate the first three methods for their ability to provide reasonable estimates of relatively low groundwater flow velocities in such low-hydraulic-gradient environments. The natural-gradient method is generally considered to be the most robust and direct method, so it was used to provide a “ground truth” velocity estimate. However, this method usually requires several wells, so it is often not practical in systems with large depths to groundwater and correspondingly high well installation costs. The fact that a successful natural gradient test was conducted at the test location offered a unique opportunity to compare the flow velocity estimates obtained by the more easily deployed and lower risk methods with the ground-truth natural-gradient method. The groundwater flow velocity estimates from the four methods agreed very well with each other, suggesting that the first three methods all provided reasonably good estimates of groundwater flow velocity at the site. We discuss the advantages and disadvantages of the different methods, as well as some of the uncertainties associated with them.
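
    In its simplest reading, a natural-gradient test yields a seepage-velocity estimate as well separation divided by the travel time of the tracer concentration peak. The sketch below shows only that elementary calculation; the spacing, sampling times and concentrations are hypothetical, not data from the Yucca Mountain site.

```python
# Sketch: seepage velocity from the peak arrival time of a natural-gradient
# tracer breakthrough curve. All numbers are hypothetical.
import numpy as np

distance_m = 30.0                                        # assumed well spacing
t_days = np.array([5, 10, 20, 40, 60, 90, 120])          # sampling times
conc = np.array([0.0, 0.2, 0.9, 1.8, 1.2, 0.5, 0.1])     # relative tracer concentration

t_peak = t_days[np.argmax(conc)]
velocity = distance_m / t_peak
print(f"peak arrival at {t_peak} d -> seepage velocity ~ {velocity:.2f} m/d")
```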

  11. Evaluation of surface renewal and flux-variance methods above agricultural and forest surfaces

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Katul, G. G.; Noormets, A.; Poznikova, G.; Domec, J. C.; Trnka, M.; King, J. S.

    2016-12-01

    Measurements of turbulent surface energy fluxes are of high interest in agricultural and forest research. During recent decades, eddy covariance (EC) has been adopted as the most commonly used micrometeorological method for measuring fluxes of greenhouse gases, energy and other scalars at the surface-atmosphere interface. Despite its robustness and accuracy, the cost of EC hinders its deployment in some research experiments and in practical applications such as irrigation scheduling. Therefore, testing and developing other cost-effective methods is of high interest. In our study, we tested the performance of the surface renewal (SR) and flux-variance (FV) methods for estimating sensible heat flux density. The surface renewal method is based on the concept of non-random transport of scalars via so-called coherent structures which, if accurately identified, can be used to compute the associated flux. The flux-variance method predicts the flux from the scalar variance following surface-layer similarity theory. We tested SR and FV against EC in three types of ecosystem with very distinct aerodynamic properties. The first site was an agricultural wheat field in the Czech Republic. The second site was a 20-m tall mixed deciduous wetland forest on the coast of North Carolina, USA. The third site was a pine-switchgrass intercropping agro-forestry system located in the coastal plain of North Carolina, USA. Apart from resolving the coherent structures in the SR framework from structure functions (the most common approach), we applied a ramp wavelet detection scheme to test the hypothesis that the durations and amplitudes of the coherent structures are normally distributed within particular 30-minute intervals, so that estimates of their averages suffice for accurate flux determination. Further, we tested whether orthonormal wavelet thresholding can be used to isolate the coherent-structure scales associated with flux transport. Finally, we tested whether low-pass filtering in the Fourier domain based on the integral length scale can improve both SR and FV estimates, as it should remove the low-frequency portion of the signal not related to the investigated fluxes.
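
    As background to the flux-variance idea, the sketch below evaluates its free-convective form, H = rho*cp*(sigma_T/C1)^(3/2)*sqrt(k*g*z/T), which follows from Monin-Obukhov similarity. The similarity constant C1 (taken here as about 0.95) and the sample inputs are assumptions for illustration, not the values or data used in the study.

```python
# Sketch: sensible heat flux from the free-convective flux-variance relation.
import numpy as np

def flux_variance_H(sigma_T, z, T_air, rho=1.2, cp=1005.0, k=0.4, g=9.81, C1=0.95):
    """sigma_T: 30-min standard deviation of air temperature [K];
    z: measurement height [m]; T_air: mean air temperature [K].
    Returns sensible heat flux H [W/m^2]."""
    return rho * cp * (sigma_T / C1) ** 1.5 * np.sqrt(k * g * z / T_air)

print(flux_variance_H(sigma_T=0.5, z=3.0, T_air=298.0))   # ~90 W/m^2 for these inputs
```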

  12. 40 CFR 65.158 - Performance test procedures for control devices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... regulated material or as TOC (minus methane and ethane) according to the procedures specified. (1) Method 1... material or TOC, sampling sites shall be located at the inlet of the control device as specified in the... sampling sites shall ensure the measurement of total regulated material or TOC (minus methane and ethane...

  13. A general way for quantitative magnetic measurement by transmitted electrons

    NASA Astrophysics Data System (ADS)

    Song, Dongsheng; Li, Gen; Cai, Jianwang; Zhu, Jing

    2016-01-01

    The EMCD (electron magnetic circular dichroism) technique opens a new door to exploring magnetic properties with transmitted electrons. The recently developed site-specific EMCD technique makes it possible to obtain rich magnetic information from Fe atoms sited on nonequivalent crystallographic planes in NiFe2O4; however, it places critical demands on the crystallographic structure of the test sample. Here, we have further improved and tested the method for quantitative site-specific magnetic measurement so that it is applicable to more complex crystallographic structures, by exploiting dynamical diffraction effects (a general routine for selecting proper diffraction conditions, use of the asymmetry of dynamical diffraction in designing the experimental geometry and in quantitative measurement, etc.), and we take yttrium iron garnet (Y3Fe5O12, YIG), which has a more complex crystallographic structure, as an example to demonstrate its applicability. As a result, the intrinsic magnetic circular dichroism signals and the site-specific spin and orbital magnetic moments of iron are quantitatively determined. The method will further promote the development of quantitative magnetic measurements with high spatial resolution by transmitted electrons.

  14. Laser assisted periodontal treatment: from bactericidal effect to local modification of the host response

    NASA Astrophysics Data System (ADS)

    Ciurescu, Codruţa.; Teslaru, Silvia; Zetu, Liviu; Ciurescu, Daniel

    2016-03-01

    The aim of the present short-term study is to investigate the efficiency of laser therapy as an adjunct to conventional periodontal therapy in patients with chronic periodontitis. Methods. The study protocol included 44 patients (20 males, 24 females; age 45-60) with moderate and advanced chronic periodontitis, recruited at the private clinic Krondent (Brasov, Romania). The patients were randomly assigned to two groups: one group (test-sites group, 22 patients) treated by ultrasonic scaling and root planing followed by laser therapy (940 nm diode laser and 2780 nm Er:Cr:YAG laser), and a second group (control-sites group, 22 patients) treated only by ultrasonic scaling and root planing. All patients underwent initial evaluation, recording of bleeding on probing (BOP) and probing pocket depth (PPD), and oral hygiene instruction and motivation. The BOP and PPD indices for the assessed periodontal sites were also recorded at 8, 16 and 24 weeks after treatment. Results. The periodontal inflammatory parameter PPD (PPD >= 4 mm) was significantly lower in the test-sites group than in the control-sites group at 2 months (82% vs. 90%), 4 months (42% vs. 62%), and 6 months (11% vs. 30%). BOP values at 2 months were 38% and 32% in the control-sites and test-sites groups, respectively, and BOP was significantly lower in the test-sites group at 4 months (42% vs. 26%) and 6 months (44% vs. 24%). Conclusions. The additional use of laser therapy significantly increases the efficiency of periodontal treatment compared with conventional periodontal therapy.

  15. Upscaling the pollutant emission from mixed recycled aggregates under compaction for civil applications.

    PubMed

    Galvín, Adela P; Ayuso, Jesús; Barbudo, Auxi; Cabrera, Manuel; López-Uceda, Antonio; Rosales, Julia

    2017-12-27

    In general terms, plant managers of sites producing construction wastes assess materials according to concise, legally recommended leaching tests that do not consider the compaction stage of the materials when they are applied on-site. Thus, the tests do not account for the real on-site physical conditions of the recycled aggregates used in civil works (e.g., roads or embankments). This leads to errors in estimating the pollutant potential of these materials. For that reason, in the present research, an experimental procedure is designed as a leaching test for construction materials under compaction. The aim of this laboratory test (designed specifically for the granular materials used in civil engineering infrastructures) is to evaluate the release of pollutant elements when the recycled aggregate is tested at its commercial grain-size distribution and when the material is compacted under on-site conditions. Two recycled aggregates with different gypsum contents (0.95 and 2.57%) were used in this study. In addition to the designed leaching laboratory test, the conventional compliance leaching test and the Dutch percolation test were performed. The results of the new leaching method were compared with the conventional leaching test results. After analysis, the chromium and sulphate levels obtained from the newly designed test were lower than those obtained from the conventional leaching test, and these are considered the more serious pollutant elements. This result confirms that when the leaching behaviour of construction aggregates is evaluated without accounting for field density, by crushing the aggregate and using only the finest fraction as is done in the conventional test (an unrealistic situation for aggregates applied under on-site conditions), the leaching behaviour is not accurately assessed.

  16. Protocol for a mixed-methods study on leader-based interventions in construction contractors' safety commitments.

    PubMed

    Pedersen, Betina Holbaek; Dyreborg, Johnny; Kines, Pete; Mikkelsen, Kim Lyngby; Hannerz, Harald; Andersen, Dorte Raaby; Spangenberg, Søren

    2010-06-01

    Owing to high injury rates, safety interventions are needed in the construction industry. Evidence-based interventions tailored to this industry are, however, scarce. Leader-based safety interventions have proven more effective than worker-based interventions in other industries. To test a leader-based safety intervention for construction sites. The intervention consists of encouraging safety coordinators to provide feedback on work safety to the client and line management. The intention is to increase communication and interactions regarding safety within the line management and between the client and the senior management. It is hypothesised that this, in turn, will lead to increased communication and interaction about safety between management and coworkers as well as an increased on-site safety level. A group-randomised double-blinded case study of six Danish construction sites (three intervention sites and three control sites). The recruitment of the construction sites is performed continuously from January 2010 to June 2010. The investigation of each site lasts 20 continuous weeks. Confirmatory statistical analysis is used to test if the safety level increased, and if the probability of safety communications between management and coworkers increases as a consequence of the intervention. The data collection will be blinded. Qualitative methods are used to evaluate if communication and interactions about safety at all managerial levels, including the client, increase. (1) The proportion of safety-related communications out of all studied communications between management and coworkers. (2) The safety level index of the construction sites.

  17. Estimates of monthly streamflow characteristics at selected sites in the upper Missouri River basin, Montana, base period water years 1937-86

    USGS Publications Warehouse

    Parrett, Charles; Johnson, D.R.; Hull, J.A.

    1989-01-01

    Estimates of streamflow characteristics (monthly mean flow that is exceeded 90, 80, 50, and 20 percent of the time for all years of record and mean monthly flow) were made and are presented in tabular form for 312 sites in the Missouri River basin in Montana. Short-term gaged records were extended to the base period of water years 1937-86, and were used to estimate monthly streamflow characteristics at 100 sites. Data from 47 gaged sites were used in regression analysis relating the streamflow characteristics to basin characteristics and to active-channel width. The basin-characteristics equations, with standard errors of 35% to 97%, were used to estimate streamflow characteristics at 179 ungaged sites. The channel-width equations, with standard errors of 36% to 103%, were used to estimate characteristics at 138 ungaged sites. Streamflow measurements were correlated with concurrent streamflows at nearby gaged sites to estimate streamflow characteristics at 139 ungaged sites. In a test using 20 pairs of gages, the standard errors ranged from 31% to 111%. At 139 ungaged sites, the estimates from two or more of the methods were weighted and combined in accordance with the variance of individual methods. When estimates from three methods were combined, the standard errors ranged from 24% to 63%. A drainage-area-ratio adjustment method was used to estimate monthly streamflow characteristics at seven ungaged sites. The reliability of the drainage-area-ratio adjustment method was estimated to be about equal to that of the basin-characteristics method. The estimates were checked for reliability. Estimates of monthly streamflow characteristics from gaged records were considered to be most reliable, and estimates at sites with an actual flow record from 1937-86 were considered to be completely reliable (zero error). Weighted-average estimates were considered to be the most reliable estimates made at ungaged sites. (USGS)
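
    The drainage-area-ratio adjustment mentioned above scales a gaged-site statistic by the ratio of drainage areas raised to an exponent; the sketch below shows that calculation with an exponent of 1 and illustrative values (the report's actual exponent and data are not reproduced here).

```python
# Sketch of a drainage-area-ratio transfer of a streamflow statistic from a
# gaged site to a nearby ungaged site. Values are illustrative only.
def area_ratio_estimate(q_gaged, area_gaged, area_ungaged, exponent=1.0):
    """Scale a monthly streamflow statistic by the drainage-area ratio."""
    return q_gaged * (area_ungaged / area_gaged) ** exponent

# e.g. a gaged monthly mean of 50 ft3/s transferred to a basin 60% as large
print(area_ratio_estimate(q_gaged=50.0, area_gaged=500.0, area_ungaged=300.0))
```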

  18. Prospects for Problems Associated with Integrative and Inter-comparative Analysis of Eddy Flux Data Sets

    NASA Astrophysics Data System (ADS)

    Kobayashi, Y.; Miyata, A.; Nagai, H.; Mano, M.; Yamamoto, S.

    2005-12-01

    In the last decade, numerous long-term eddy flux measurements have been conducted worldwide to assess annual and seasonal energy, water and carbon exchanges between terrestrial ecosystems and the atmosphere. FLUXNET communities now appear to be entering a next phase, with the objectives of integrating flux data observed in various ecosystems and conducting inter-site comparative studies. For example, a large research project, "S-1", is ongoing in Japan and other eastern Asian regions to establish terrestrial carbon management for Asia in the 21st century. One of the highlights of the S-1 project is to provide a carbon budget map covering all of Asia based on integrated and inter-compared eddy flux data collected at the 15 S-1 member sites. FLUXNET communities, including the S-1 project, have recognized that integration and inter-comparison of eddy flux data are the key issues for understanding energy, water and carbon budgets at the regional scale. However, these issues are difficult to settle because each flux site applies its own data processing methods and gap-filling methods with site-specific classification and threshold values. In order to conduct appropriate integrative and inter-comparative analyses of eddy flux data, we clarified how differences in the data processing methods affect the obtained flux values and searched for a suitable common gap-filling methodology. The effect of differences in the data processing methods on the obtained flux data was examined in a comparative experiment within the S-1 project. We prepared one-month common test data sets, consisting of 10 Hz eddy covariance raw data and related half-hourly meteorological data obtained at a larch forest site and a paddy site. The 15 S-1 member sites processed the test data using their own processing methods. The results indicated that the combined influences of coordinate rotation, detrending and frequency response correction produced up to 10% discrepancy in the fluxes, and that forest sites were more sensitive to differences in the data processing methods than non-forest sites. Multiple imputation (MI), a statistical procedure for analyzing incomplete multivariate data sets, is likely to be an easy-to-use and objective gap-filling method for accounting for missing eddy flux data. We also assessed the validity of applying MI to fill missing flux data by comparing a gap-filled complete eddy flux data set obtained by MI with those obtained by the nonlinear regression method and the look-up table method. It was revealed that, with suitable separation of the periods to be filled and proper selection of reference variables, MI has the potential to be applied as a common gap-filling method for missing flux data, and that MI can be a useful tool for FLUXNET communities in making inter-site comparisons of long-term flux data.

  19. Tritium as an indicator of venues for nuclear tests.

    PubMed

    Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V

    2013-10-01

    Currently, owing to the Treaty on the Non-Proliferation of Nuclear Weapons, the accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method that uses tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests - the "Balapan" and "Degelen" testing sites within the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air where tunnels and boreholes are located - explosion epicentres, wellheads and tunnel portals - as well as in estuarine areas of the venues for the underground nuclear explosions (UNE). Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Test-retest and between-site reliability in a multicenter fMRI study.

    PubMed

    Friedman, Lee; Stern, Hal; Brown, Gregory G; Mathalon, Daniel H; Turner, Jessica; Glover, Gary H; Gollub, Randy L; Lauriello, John; Lim, Kelvin O; Cannon, Tyrone; Greve, Douglas N; Bockholt, Henry Jeremy; Belger, Aysenil; Mueller, Bryon; Doty, Michael J; He, Jianchun; Wells, William; Smyth, Padhraic; Pieper, Steve; Kim, Seyoung; Kubicki, Marek; Vangel, Mark; Potkin, Steven G

    2008-08-01

    In the present report, estimates of test-retest and between-site reliability of fMRI assessments were produced in the context of a multicenter fMRI reliability study (FBIRN Phase 1, www.nbirn.net). Five subjects were scanned on 10 MRI scanners on two occasions. The fMRI task was a simple block design sensorimotor task. The impulse response functions to the stimulation block were derived using an FIR-deconvolution analysis with FMRISTAT. Six functionally-derived ROIs covering the visual, auditory and motor cortices, created from a prior analysis, were used. Two dependent variables were compared: percent signal change and contrast-to-noise-ratio. Reliability was assessed with intraclass correlation coefficients derived from a variance components analysis. Test-retest reliability was high, but initially, between-site reliability was low, indicating a strong contribution from site and site-by-subject variance. However, a number of factors that can markedly improve between-site reliability were uncovered, including increasing the size of the ROIs, adjusting for smoothness differences, and inclusion of additional runs. By employing multiple steps, between-site reliability for 3T scanners was increased by 123%. Dropping one site at a time and assessing reliability can be a useful method of assessing the sensitivity of the results to particular sites. These findings should provide guidance to others on the best practices for future multicenter studies.
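
    As a simplified stand-in for the study's full variance-components ICC (which partitions subject, site, and site-by-subject variance), the sketch below computes a one-way random-effects ICC(1,1) from a subjects-by-sites matrix: ICC = (MSB - MSW) / (MSB + (k-1)*MSW). The data matrix is hypothetical.

```python
# Sketch: one-way random-effects intraclass correlation for between-site
# reliability of a per-subject fMRI measure (rows = subjects, cols = sites).
import numpy as np

def icc_oneway(data):
    n, k = data.shape
    grand = data.mean()
    subj_means = data.mean(axis=1)
    msb = k * np.sum((subj_means - grand) ** 2) / (n - 1)           # between-subject mean square
    msw = np.sum((data - subj_means[:, None]) ** 2) / (n * (k - 1)) # within-subject mean square
    return (msb - msw) / (msb + (k - 1) * msw)

percent_signal_change = np.array([[0.8, 0.9, 0.7],
                                  [1.2, 1.1, 1.3],
                                  [0.5, 0.6, 0.4],
                                  [1.0, 0.9, 1.1],
                                  [0.7, 0.8, 0.6]])   # hypothetical values
print(icc_oneway(percent_signal_change))
```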

  1. MDD-carb: a combinatorial model for the identification of protein carbonylation sites with substrate motifs.

    PubMed

    Kao, Hui-Ju; Weng, Shun-Long; Huang, Kai-Yao; Kaunang, Fergie Joanda; Hsu, Justin Bo-Kai; Huang, Chien-Hsun; Lee, Tzong-Yi

    2017-12-21

    Carbonylation, which takes place through oxidation of reactive oxygen species (ROS) on specific residues, is an irreversibly oxidative modification of proteins. It has been reported that the carbonylation is related to a number of metabolic or aging diseases including diabetes, chronic lung disease, Parkinson's disease, and Alzheimer's disease. Due to the lack of computational methods dedicated to exploring motif signatures of protein carbonylation sites, we were motivated to exploit an iterative statistical method to characterize and identify carbonylated sites with motif signatures. By manually curating experimental data from research articles, we obtained 332, 144, 135, and 140 verified substrate sites for K (lysine), R (arginine), T (threonine), and P (proline) residues, respectively, from 241 carbonylated proteins. In order to examine the informative attributes for classifying between carbonylated and non-carbonylated sites, multifarious features including composition of twenty amino acids (AAC), composition of amino acid pairs (AAPC), position-specific scoring matrix (PSSM), and positional weighted matrix (PWM) were investigated in this study. Additionally, in an attempt to explore the motif signatures of carbonylation sites, an iterative statistical method was adopted to detect statistically significant dependencies of amino acid compositions between specific positions around substrate sites. Profile hidden Markov model (HMM) was then utilized to train a predictive model from each motif signature. Moreover, based on the method of support vector machine (SVM), we adopted it to construct an integrative model by combining the values of bit scores obtained from profile HMMs. The combinatorial model could provide an enhanced performance with evenly predictive sensitivity and specificity in the evaluation of cross-validation and independent testing. This study provides a new scheme for exploring potential motif signatures at substrate sites of protein carbonylation. The usefulness of the revealed motifs in the identification of carbonylated sites is demonstrated by their effective performance in cross-validation and independent testing. Finally, these substrate motifs were adopted to build an available online resource (MDD-Carb, http://csb.cse.yzu.edu.tw/MDDCarb/ ) and are also anticipated to facilitate the study of large-scale carbonylated proteomes.

  2. Evaluation of field methods for vertical high resolution aquifer characterization

    NASA Astrophysics Data System (ADS)

    Vienken, T.; Tinter, M.; Rogiers, B.; Leven, C.; Dietrich, P.

    2012-12-01

    The delineation and characterization of subsurface (hydro)-stratigraphic structures is one of the challenging tasks of hydrogeological site investigations. Knowledge of the spatial distribution of soil-specific properties and hydraulic conductivity (K) is a prerequisite for understanding flow and fluid transport processes. This is especially true for heterogeneous unconsolidated sedimentary deposits with a complex sedimentary architecture. One commonly used approach to investigate and characterize sediment heterogeneity is soil sampling and lab analyses, e.g. grain size distribution. Tests conducted on 108 samples show that calculation of K based on grain size distribution is not suitable for high resolution aquifer characterization of highly heterogeneous sediments due to sampling effects and large differences of calculated K values between applied formulas (Vienken & Dietrich 2011). Therefore, extensive tests were conducted at two test sites under different geological conditions to evaluate the performance of innovative Direct Push (DP) based approaches for the vertical high resolution determination of K. Different DP based sensor probes for in-situ subsurface characterization based on electrical, hydraulic, and textural soil properties were used to obtain high resolution vertical profiles. The applied DP based tools proved to be a suitable and efficient alternative to traditional approaches. Despite resolution differences, all of the applied methods captured the main aquifer structure. Correlation of the DP based K estimates and proxies with DP based slug tests shows that it is possible to describe the aquifer hydraulic structure on less than a meter scale by combining DP slug test data and continuous DP measurements. Even though correlations are site specific and appropriate DP tools must be chosen, DP is a reliable and efficient alternative for characterizing even strongly heterogeneous sites with complex structured sedimentary aquifers (Vienken et al. 2012). References: Vienken, T., Leven, C., and Dietrich, P. 2012. Use of CPT and other direct push methods for (hydro-) stratigraphic aquifer characterization — a field study. Canadian Geotechnical Journal, 49(2): 197-206. Vienken, T., and Dietrich, P. 2011. Field evaluation of methods for determining hydraulic conductivity from grain size data. Journal of Hydrology, 400(1-2): 58-71.

  3. Multimodality language mapping in patients with left-hemispheric language dominance on Wada test.

    PubMed

    Kojima, Katsuaki; Brown, Erik C; Rothermel, Robert; Carlson, Alanna; Matsuzaki, Naoyuki; Shah, Aashit; Atkinson, Marie; Mittal, Sandeep; Fuerst, Darren; Sood, Sandeep; Asano, Eishi

    2012-10-01

    We determined the utility of electrocorticography (ECoG) and stimulation for detecting language-related sites in patients with left-hemispheric language-dominance on Wada test. We studied 13 epileptic patients who underwent language mapping using event-related gamma-oscillations on ECoG and stimulation via subdural electrodes. Sites showing significant gamma-augmentation during an auditory-naming task were defined as language-related ECoG sites. Sites at which stimulation resulted in auditory perceptual changes, failure to verbalize a correct answer, or sensorimotor symptoms involving the mouth were defined as language-related stimulation sites. We determined how frequently these methods revealed language-related sites in the superior-temporal, inferior-frontal, dorsolateral-premotor, and inferior-Rolandic regions. Language-related sites in the superior-temporal and inferior-frontal gyri were detected by ECoG more frequently than stimulation (p < 0.05), while those in the dorsolateral-premotor and inferior-Rolandic regions were detected by both methods equally. Stimulation of language-related ECoG sites, compared to the others, more frequently elicited language symptoms (p < 0.00001). One patient developed dysphasia requiring in-patient speech therapy following resection of the dorsolateral-premotor and inferior-Rolandic regions containing language-related ECoG sites not otherwise detected by stimulation. Language-related gamma-oscillations may serve as an alternative biomarker of underlying language function in patients with left-hemispheric language-dominance. Measurement of language-related gamma-oscillations is warranted in presurgical evaluation of epileptic patients. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  4. Management of Chronic Periodontitis Using Subgingival Irrigation of Ozonized Water: A Clinical and Microbiological Study

    PubMed Central

    Mathew, Jayan Jacob; Ambooken, Majo; Kachappilly, Arun Jose; PK, Ajithkumar; Johny, Thomas; VK, Linith; Samuel, Anju

    2015-01-01

    Introduction Adjunctive use of professional subgingival irrigation with scaling and root planing (SRP) has been found to be beneficial in eradicating the residual microorganisms in the pocket. Objective To evaluate the effect of ozonized water subgingival irrigation on microbiologic parameters and on clinical parameters, namely the Gingival Index, probing pocket depth, and clinical attachment level. Materials and Methods Thirty chronic periodontitis patients with a probing pocket depth ≥6 mm on at least one tooth on contralateral sides of opposite arches were included in the study. The test sites were subjected to ozonized water subgingival irrigation with a subgingival irrigation device fitted with a modified subgingival tip. Control sites were subjected to scaling and root planing only. The following clinical parameters were recorded initially and after 4 weeks at the test and control sites: Plaque Index, Gingival Index, probing pocket depth, and clinical attachment level. Microbiologic sampling was done at the test sites at baseline, after scaling, immediately after ozonized water subgingival irrigation, and after 4 weeks. At control sites, microbiologic sampling was done at baseline, after scaling, and after 4 weeks. The results were statistically analysed using independent and paired t-tests. Result Test sites showed a greater reduction in pocket depth and gain in clinical attachment compared to control sites. The total anaerobic counts were significantly reduced by ozonized water subgingival irrigation along with SRP compared to SRP alone. Conclusion Ozonized water subgingival irrigation can improve the clinical and microbiological parameters in patients with chronic periodontitis when used as an adjunct to scaling and root planing. PMID:26436042

  5. Unravelling the quality of HIV counselling and testing services in the private and public sectors in Zambia

    PubMed Central

    Ron Levey, Ilana; Wang, Wenjuan

    2014-01-01

    Background Despite the substantial investment in providing HIV counselling and testing (VCT) services in Zambia, there has been little effort to systematically evaluate the quality of VCT services provided by various types of health providers. This study, conducted in 2009, examines VCT in the public and private sectors, including the private for-profit and NGO/faith-based sectors, in Copperbelt and Luapula. Methods The study used five primary data collection methods to gauge the quality of VCT services: closed-ended interviews with clients exiting VCT sites; open-ended client interviews; interviews with facility managers; review of service statistics; and an observation of the physical environment for VCT at each site. Over 400 clients and 87 facility managers were interviewed from almost 90 facilities. Sites were randomly selected and results are generalizable at the provincial level. Results The study shows concerning levels of underperformance in VCT services across the sectors. It reveals serious underperformance in counselling about key risk-reduction methods. Less than one-third of clients received counselling on reducing the number of sexual partners, and only approximately 5% of clients received counselling about disclosing test results to partners. In terms of client profiles, the NGO sector attracts the most educated clients, and less educated Zambians seek VCT services at very low rates (7%). The private for-profit sector performs as well as, or sometimes better than, the other sectors even though it is not adequately integrated into the Zambian national response to HIV. Conclusion The private for-profit sector provides VCT services on par in quality with the other sectors. Most clients did not receive counselling on partner reduction or disclosure of HIV test results to partners. In a generalized HIV epidemic where multiple concurrent sexual partners are a significant problem for transmitting the disease, risk-reduction methods and discussion should be a main focus of pre-test and post-test counselling. PMID:25012796

  6. Development of a design basis tornado and structural design criteria for the Nevada Test Site, Nevada. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, J.R.; Minor, J.E.; Mehta, K.C.

    1975-06-01

    In order to evaluate the ability of critical facilities at the Nevada Test Site to withstand the possible damaging effects of extreme winds and tornadoes, parameters for the effects of tornadoes and extreme winds and structural design criteria for the design and evaluation of structures were developed. The meteorological investigations conducted are summarized, and techniques used for developing the combined tornado and extreme wind risk model are discussed. The guidelines for structural design include methods for calculating pressure distributions on walls and roofs of structures and methods for accommodating impact loads from wind-driven missiles. Calculations for determining the design loads for an example structure are included. (LCL)

  7. Apparatus for detecting leaks

    DOEpatents

    Booth, Eugene T.

    1976-02-24

    A method and apparatus for determining the position of and estimating the size of leaks in an evacuating apparatus comprising the use of a testing gas such as helium or hydrogen flowing around said apparatus whereby the testing gas will be drawn in at the site of any leaks.

  8. 77 FR 41188 - Clinical Laboratory Improvement Advisory Committee (CLIAC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-12

    ... technological advances, such as new test methods and the electronic transmission of laboratory information... potential need for educational materials and resources for sites that test under a Provider-performed Microscopy Certificate; and the increased use of culture-independent microbiology diagnostics and the impact...

  9. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOCH, M.R.; JOHNS, B.R.

    1999-12-21

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping and Instrumentation Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  10. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOCH, M.R.

    2000-02-28

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping Instrumentation and Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  11. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOCH, M.R.

    1999-08-11

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping and Instrumentation Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  12. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JOHNS, B.R.

    1999-05-05

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping and Instrumentation Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  13. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOCH, M.R.

    1999-10-26

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping and Instrumentation Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  14. Commercial Grade Item (CGI) Dedication for Leak Detection Relays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JOHNS, B.R.; KOCH, M.R.

    2000-01-28

    This Test Plan provides a test method to dedicate the leak detection relays used on the new Pumping Instrumentation and Control (PIC) skids. The new skids are fabricated on-site. The leak detection system is a safety class system per the Authorization Basis.

  15. Application of Poisson random effect models for highway network screening.

    PubMed

    Jiang, Ximiao; Abdel-Aty, Mohamed; Alamili, Samer

    2014-02-01

    In recent years, Bayesian random effect models that account for the temporal and spatial correlations of crash data have become popular in traffic safety research. This study employs random effect Poisson log-normal models for crash risk hotspot identification. Both the temporal and spatial correlations of crash data were considered. Potential for Safety Improvement (PSI) was adopted as the measure of crash risk. Using the fatal and injury crashes that occurred on urban 4-lane divided arterials from 2006 to 2009 in the Central Florida area, the random effect approaches were compared to the traditional Empirical Bayesian (EB) method and the conventional Bayesian Poisson log-normal model. A series of method examination tests was conducted to evaluate the performance of the different approaches. These tests include the previously developed site consistency test, method consistency test, total rank difference test, and modified total score test, as well as the newly proposed total safety performance measure difference test. Results show that the Bayesian Poisson model accounting for both temporal and spatial random effects (PTSRE) outperforms the model with only temporal random effects, and both are superior to the conventional Poisson log-normal model (PLN) and the EB model in fitting the crash data. Additionally, the method evaluation tests indicate that the PTSRE model is significantly superior to the PLN model and the EB model in consistently identifying hotspots during successive time periods. The results suggest that the PTSRE model is a superior alternative for road site crash risk hotspot identification. Copyright © 2013 Elsevier Ltd. All rights reserved.
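
    As a rough illustration of the screening measure named above, the sketch below computes an Empirical Bayes expected crash frequency and the Potential for Safety Improvement (PSI). It assumes a negative-binomial safety performance function has already been fitted (giving a predicted frequency `mu` and overdispersion parameter `k` per site) and uses one common parameterization of the EB weight; it is not the paper's Bayesian random effect model.

    ```python
    # Minimal EB/PSI sketch under the assumptions stated in the lead-in.
    import numpy as np

    def eb_expected(y, mu, k):
        """EB-adjusted expected crashes: weighted mix of SPF prediction and observation."""
        w = 1.0 / (1.0 + k * mu)          # weight on the SPF prediction (one common form)
        return w * mu + (1.0 - w) * y

    def psi(y, mu, k):
        """Potential for Safety Improvement: EB expectation minus SPF prediction."""
        return eb_expected(y, mu, k) - mu

    # Example: rank three hypothetical sites by PSI (hotspots have the largest PSI).
    y  = np.array([12, 4, 9])       # observed fatal/injury crashes (made up)
    mu = np.array([6.0, 5.0, 8.5])  # SPF-predicted crashes (made up)
    k  = 0.3                        # assumed overdispersion parameter
    print(np.argsort(-psi(y, mu, k)))  # site indices, highest PSI first
    ```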

  16. Application of the modified chi-square ratio statistic in a stepwise procedure for cascade impactor equivalence testing.

    PubMed

    Weber, Benjamin; Lee, Sau L; Delvadia, Renishkumar; Lionberger, Robert; Li, Bing V; Tsong, Yi; Hochhaus, Guenther

    2015-03-01

    Equivalence testing of aerodynamic particle size distribution (APSD) through multi-stage cascade impactors (CIs) is important for establishing bioequivalence of orally inhaled drug products. Recent work demonstrated that the median of the modified chi-square ratio statistic (MmCSRS) is a promising metric for APSD equivalence testing of test (T) and reference (R) products as it can be applied to a reduced number of CI sites that are more relevant for lung deposition. This metric is also less sensitive to the increased variability often observed for low-deposition sites. A method to establish critical values for the MmCSRS is described here. This method considers the variability of the R product by employing a reference variance scaling approach that allows definition of critical values as a function of the observed variability of the R product. A stepwise CI equivalence test is proposed that integrates the MmCSRS as a method for comparing the relative shapes of CI profiles and incorporates statistical tests for assessing equivalence of single actuation content and impactor sized mass. This stepwise CI equivalence test was applied to 55 published CI profile scenarios, which were classified as equivalent or inequivalent by members of the Product Quality Research Institute working group (PQRI WG). The results of the stepwise CI equivalence test using a 25% difference in MmCSRS as an acceptance criterion provided the best matching with those of the PQRI WG as decisions of both methods agreed in 75% of the 55 CI profile scenarios.

  17. 78 FR 98 - Notice of Availability: Test Tools and Test Procedures Approved by the National Coordinator for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-02

    ...This notice announces the availability of test tools and test procedures approved by the National Coordinator for Health Information Technology (the National Coordinator) for the testing of EHR technology to the 2014 Edition EHR certification criteria under the ONC HIT Certification Program. The approved test tools and test procedures are identified on the ONC Web site at: http://www.healthit.gov/policy-researchers-implementers/2014-edition-final-test-method.

  18. Results from laboratory and field testing of nitrate measuring spectrophotometers

    USGS Publications Warehouse

    Snazelle, Teri T.

    2015-01-01

    In Phase II, the analyzers were deployed in field conditions at three different USGS sites. The measured nitrate concentrations were compared to discrete (reference) samples analyzed by the Direct UV method on a Shimadzu UV1800 bench-top spectrophotometer, and by National Environmental Methods Index (NEMI) method I-2548-11 at the USGS National Water Quality Laboratory. The first deployment, at USGS site 0249620 on the East Pearl River in Hancock County, Mississippi, tested the ability of the TriOS ProPs (10-mm path length), Hach NITRATAX (5 mm), Satlantic SUNA (10 mm), and S::CAN Spectro::lyser (5 mm) to accurately measure low-level (less than 2 mg-N/L) nitrate concentrations while observing the effect that turbidity and colored dissolved organic matter (CDOM) would have on the analyzers' measurements. The second deployment, at USGS site 01389005 Passaic River below Pompton River at Two Bridges, New Jersey, tested the analyzers' accuracy in mid-level (2-8 mg-N/L) nitrate concentrations. This site provided the means to test the analyzers' performance in two distinct matrices, the Passaic and the Pompton Rivers. In this deployment, three instruments tested in Phase I (TriOS, Hach, and SUNA) were deployed with the S::CAN Spectro::lyser (35 mm) already placed by the New Jersey Water Science Center (WSC). The third deployment, at USGS site 05579610 Kickapoo Creek at 2100E Road near Bloomington, Illinois, tested the ability of the analyzers to measure high nitrate concentrations (greater than 8 mg-N/L) in turbid waters. For Kickapoo Creek, the HIF provided the TriOS (10 mm) and S::CAN (5 mm) from Phase I, and a SUNA V2 (5 mm), to be deployed adjacent to the Illinois WSC-owned Hach (2 mm). A total of 40 discrete samples were collected from the three deployment sites and analyzed. The nitrate concentration of the samples ranged from 0.3 to 22.2 mg-N/L. The average absolute difference between the TriOS measurements and the discrete samples was 0.46 mg-N/L. For the combined data from the Hach 5-mm and 2-mm analyzers, the average absolute difference from the discrete samples was 0.13 mg-N/L. For the combined SUNA and SUNA V2 data, the average absolute difference from the discrete samples was 0.66 mg-N/L. The average absolute difference between the S::CAN samples and the discrete samples was 0.63 mg-N/L.
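
    The comparison metric reported above is simply the average absolute difference between paired analyzer readings and reference samples; a minimal sketch follows. The values are placeholders, not data from the report.

    ```python
    # Average absolute difference between sensor readings and reference samples.
    import numpy as np

    def mean_abs_diff(sensor, reference):
        sensor, reference = np.asarray(sensor, float), np.asarray(reference, float)
        return np.mean(np.abs(sensor - reference))

    sensor_mg_n_per_l    = [1.9, 4.8, 10.2]   # hypothetical analyzer readings
    reference_mg_n_per_l = [2.1, 4.5, 10.6]   # paired discrete-sample results
    print(round(mean_abs_diff(sensor_mg_n_per_l, reference_mg_n_per_l), 2))
    ```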

  19. Geoscientific Characterization of the Bruce Site, Tiverton, Ontario

    NASA Astrophysics Data System (ADS)

    Raven, K.; Jackson, R.; Avis, J.; Clark, I.; Jensen, M.

    2009-05-01

    Ontario Power Generation is proposing a Deep Geologic Repository (DGR) for the long-term management of its Low and Intermediate Level Radioactive Waste (L&ILW) within a Paleozoic-age sedimentary sequence beneath the Bruce site near Tiverton, Ontario, Canada. The concept envisions that the DGR would be excavated at a depth of approximately 680 m within the Ordovician Cobourg Formation, a massive, dense, low-permeability, argillaceous limestone. Characterization of the Bruce site for waste disposal is being conducted in accordance with a four-year, multi-phase Geoscientific Site Characterization Plan (GSCP). The GSCP, initially developed in 2006 and revised in 2008 to account for site knowledge acquired through the successful completion of Phase 1 investigations, describes the tools and methods selected for geological, hydrogeological and geomechanical site characterization. The GSCP was developed, in part, from an assessment of geoscience data needs and collection methods, a review of the results of detailed geoscientific studies completed in the same bedrock formations off the Bruce site, and recent international experience in geoscientific characterization of similar sedimentary rocks for long-term radioactive waste management. Field and laboratory work for Phase 1 and Phase 2A is nearing completion and has focused on the drilling, testing and monitoring of four continuously cored vertical boreholes through Devonian, Silurian, Ordovician and Cambrian bedrock to depths of about 860 m below ground surface. Work in 2009 will focus on drilling and testing of inclined boreholes to assess the presence of vertical structure. The available geological, hydrogeological and hydrogeochemical data indicate remarkably uniform and predictable geology and physical hydrogeologic and geochemical properties over well separation distances exceeding 1 km. The current data set, including 2-D seismic reflection surveys, field and laboratory hydraulic testing, laboratory petrophysical and diffusion testing, laboratory porewater and field groundwater characterization, and field head monitoring, confirms the anticipated favourable characteristics of the Bruce site for long-term waste management. These favourable characteristics include a tight, geomechanically stable host formation that is overlain and underlain by thick, massive, very low permeability shale and argillaceous limestone formations in which radionuclide transport appears to be very limited and dominated by diffusion.

  20. A Comparative Analysis of Three Monocular Passive Ranging Methods on Real Infrared Sequences

    NASA Astrophysics Data System (ADS)

    Bondžulić, Boban P.; Mitrović, Srđan T.; Barbarić, Žarko P.; Andrić, Milenko S.

    2013-09-01

    Three monocular passive ranging methods are analyzed and tested on real infrared sequences. The first method exploits scale changes of an object in successive frames, while the other two use the Beer-Lambert law. The ranging methods are evaluated by comparison with reference data obtained simultaneously at the test site. The research addresses scenarios where multiple sensor views or active measurements are not possible. The results show that these range-estimation methods can provide the fidelity required for object tracking. Maximum relative range-estimation errors in near-ideal conditions are less than 8%.
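
    As a simplified sketch of the Beer-Lambert idea used by two of the methods above: measured apparent intensity (or contrast) decays roughly as I = I0 * exp(-sigma * R), so range follows once the intrinsic intensity I0 and the atmospheric extinction coefficient sigma are known. Both are assumed given here; the paper's exact formulation may differ.

    ```python
    # Beer-Lambert-style passive range estimate under the assumptions above.
    import math

    def beer_lambert_range(apparent, intrinsic, sigma):
        """Estimate range (in units of 1/sigma) from intensity attenuation."""
        if apparent <= 0 or apparent >= intrinsic:
            raise ValueError("apparent intensity must lie in (0, intrinsic)")
        return math.log(intrinsic / apparent) / sigma

    # Example: sigma = 0.2 km^-1, intrinsic intensity 1.0, measured 0.55
    print(round(beer_lambert_range(0.55, 1.0, 0.2), 2), "km")
    ```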

  1. Multi-level assessment protocol (MAP) for adoption in multi-site clinical trials

    PubMed Central

    Guydish, J.; Manser, S.T.; Jessup, M.; Tajima, B.; Sears, C.; Montini, T.

    2010-01-01

    The National Institute on Drug Abuse (NIDA) Clinical Trials Network (CTN) is intended to test promising drug abuse treatment models in multi-site clinical trials, and to support adoption of new interventions into clinical practice. Using qualitative research methods we asked: How might the technology of multi-site clinical trials be modified to better support adoption of tested interventions? A total of 42 participants, representing 8 organizational levels ranging from clinic staff to clinical trial leaders, were interviewed about their role in the clinical trial, its interactions with clinics, and intervention adoption. Among eight clinics participating in the clinical trial, we found adoption of the tested intervention in one clinic only. In analysis of interview data we identified four conceptual themes which are likely to affect adoption and may be informative in future multi-site clinical trials. We offer the conclusion that planning for adoption in the early stages of protocol development will better serve the aim of integrating new interventions into practice. PMID:20890376

  2. The Las Vegas Valley Seismic Response Project: Ground Motions in Las Vegas Valley from Nuclear Explosions at the Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, A; Tkalcic, H; McCallen, D

    2005-03-18

    Between 2001 and 2004 the Las Vegas Seismic Response Project sought to understand the response of Las Vegas Valley (LVV) to seismic excitation. In this study, the authors report the findings of this project with an emphasis on ground motions in LVV from nuclear explosions at the Nevada Test Site (NTS). These ground motions are used to understand building structural response and damage as well as human perception. Historical nuclear explosion observations are augmented with earthquake recordings from a temporary deployment of seismometers to improve spatial coverage of LVV. The nuclear explosions were conducted between 1968 and 1989 and were recorded at various sites within Las Vegas. The data from past nuclear tests were used to constrain ground motions in LVV and to gain a predictive capability for ground motions from possible future nuclear tests at NTS. Analysis of the ground motion data includes peak ground motions (accelerations and velocities) and amplification of basin sites relative to hard-rock sites (site response). Site response was measured with the Standard Spectral Ratio (SSR) technique relative to hard-rock reference sites on the periphery of LVV. The site response curves indicate strong basin amplification, up to a factor of ten at frequencies between 0.5 and 2 Hz. Amplifications are strongest in the central and northern portions of LVV, where the basin is deeper than 1 km based on the basin depths reported by Langenheim et al. (2001a). The authors found a strong correlation between amplification and basin depth and shallow shear-wave velocities. Amplification below 1 Hz is strongly controlled by slowness-averaged shear velocities to depths of 30 and 100 meters, while depth-averaged shear velocities to 10 meters have modest control on amplification between 1 and 3 Hz. Modeling reveals that low-velocity material in the shallow layers (< 200 m) effectively controls amplification. The authors developed a method to scale nuclear explosion ground motion time series to sites around LVV that have no historical record of explosions; the method is also used to scale nuclear explosion ground motions to different yields. They also present a range of studies of basin structure and response performed on data from the temporary deployment.
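
    The Standard Spectral Ratio technique mentioned above divides the Fourier amplitude spectrum of a basin-site recording by that of a hard-rock reference recording of the same event. The sketch below is a minimal version with synthetic records; window selection, smoothing, and instrument corrections used in practice are omitted.

    ```python
    # Minimal Standard Spectral Ratio (SSR) sketch on synthetic records.
    import numpy as np

    def standard_spectral_ratio(basin, reference, dt):
        """Return frequencies and the amplitude-spectral ratio basin/reference."""
        n = min(len(basin), len(reference))
        freqs = np.fft.rfftfreq(n, dt)
        basin_amp = np.abs(np.fft.rfft(basin[:n]))
        ref_amp = np.abs(np.fft.rfft(reference[:n]))
        return freqs, basin_amp / np.maximum(ref_amp, 1e-12)

    # Example with synthetic records sampled at 100 Hz (dt = 0.01 s)
    rng = np.random.default_rng(0)
    basin_rec = rng.normal(size=4096) * 3.0   # hypothetical amplified basin record
    rock_rec  = rng.normal(size=4096)         # hypothetical reference record
    f, ssr = standard_spectral_ratio(basin_rec, rock_rec, 0.01)
    band = (f >= 0.5) & (f <= 2.0)
    print(round(float(np.median(ssr[band])), 2))  # median amplification, 0.5-2 Hz
    ```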

  3. Automated prediction of protein function and detection of functional sites from structure.

    PubMed

    Pazos, Florencio; Sternberg, Michael J E

    2004-10-12

    Current structural genomics projects are yielding structures for proteins whose functions are unknown. Accordingly, there is a pressing requirement for computational methods for function prediction. Here we present PHUNCTIONER, an automatic method for structure-based function prediction using automatically extracted functional sites (residues associated with functions). The method relates proteins with the same function through structural alignments and extracts 3D profiles of conserved residues. Functional features to train the method are extracted from the Gene Ontology (GO) database. The method extracts these features from the entire GO hierarchy and hence is applicable across the whole range of function specificity. 3D profiles associated with 121 GO annotations were extracted. We tested the power of the method both for the prediction of function and for the extraction of functional sites. The success of function prediction by our method was compared with that of the standard homology-based method. In the zone of low sequence similarity (approximately 15%), our method assigns the correct GO annotation in 90% of the protein structures considered, approximately 20% higher than inheritance of function from the closest homologue.

  4. Residential Two-Stage Gas Furnaces - Do They Save Energy?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lekov, Alex; Franco, Victor; Lutz, James

    2006-05-12

    Residential two-stage gas furnaces account for almost a quarter of the models listed in the March 2005 GAMA directory of equipment certified for sale in the United States. Two-stage furnaces are expanding their presence in the market mostly because they meet consumer expectations for improved comfort. Currently, the U.S. Department of Energy (DOE) test procedure serves as the method for reporting furnace total fuel and electricity consumption under laboratory conditions. In 2006, the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) proposed an update to its test procedure that corrects some of the discrepancies found in the DOE test procedure and provides an improved methodology for calculating the energy consumption of two-stage furnaces. The objectives of this paper are to explore the differences between the DOE test procedure and the 2006 ASHRAE test procedure in calculating two-stage residential gas furnace energy consumption and to compare test results with research results from field tests. Overall, the DOE test procedure shows a reduction in total site energy consumption of about 3 percent for two-stage compared to single-stage furnaces at the same efficiency level. In contrast, the 2006 ASHRAE test procedure shows almost no difference in total site energy consumption. The 2006 ASHRAE test procedure appears to provide a better methodology for calculating the energy consumption of two-stage furnaces. The results indicate that, although two-stage technology by itself does not save site energy, the combination of two-stage furnaces with BPM motors provides electricity savings, which are confirmed by field studies.

  5. RNA interference as a method for target-site screening in the Western Corn Rootworm, Diabrotica virgifera virgifera

    USDA-ARS?s Scientific Manuscript database

    RNA interference (RNAi) is one of the most powerful and extraordinarily-specific means by which to silence genes. The ability of RNAi to silence genes makes it possible to ascertain function from genomic data, thereby making it an excellent choice for target-site screening. To test the efficacy of...

  6. Three-Dimensional Bayesian Geostatistical Aquifer Characterization at the Hanford 300 Area using Tracer Test Data

    NASA Astrophysics Data System (ADS)

    Chen, X.; Murakami, H.; Hahn, M. S.; Hammond, G. E.; Rockhold, M. L.; Rubin, Y.

    2010-12-01

    Tracer testing under natural or forced-gradient flow provides useful information for characterizing subsurface properties by monitoring and modeling tracer plume migration in a heterogeneous aquifer. At the Hanford 300 Area, non-reactive tracer experiments, in addition to constant-rate injection tests and electromagnetic borehole flowmeter (EBF) profiling, were conducted to characterize the heterogeneous hydraulic conductivity field. A Bayesian data assimilation technique, the method of anchored distributions (MAD), is applied to assimilate the experimental tracer test data and to infer the three-dimensional heterogeneous structure of the hydraulic conductivity in the saturated zone of the Hanford formation. In this study, the prior information on the underlying random hydraulic conductivity field was obtained from previous field characterization efforts using the constant-rate injection tests and the EBF data. The posterior distribution of the random field is obtained by further conditioning the field on the temporal moments of the tracer breakthrough curves at various observation wells. The parallel three-dimensional flow and transport code PFLOTRAN is implemented to cope with the highly transient flow boundary conditions at the site and to meet the computational demand of the proposed method. The validation results show that the field conditioned on the tracer test data reproduces the tracer transport behavior better than the field characterized previously without the tracer test data. A synthetic study demonstrates that the proposed method can effectively assimilate tracer test data to capture the essential spatial heterogeneity of the three-dimensional hydraulic conductivity field. These characterization results will improve conceptual models developed for the site, including reactive transport models. The study demonstrates the capability of MAD to assimilate multi-scale, multi-type field data within a consistent Bayesian framework, and the MAD framework can potentially be applied to combine geophysical data with other types of data in site characterization.
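
    The conditioning data mentioned above are temporal moments of the tracer breakthrough curves. A minimal sketch of those summary statistics, the zeroth moment (area under the curve) and the normalized first moment (mean arrival time), is shown below using trapezoidal integration on a made-up concentration history.

    ```python
    # Temporal moments of a tracer breakthrough curve (zeroth moment and mean arrival time).
    import numpy as np

    def breakthrough_moments(t, c):
        t, c = np.asarray(t, float), np.asarray(c, float)
        m0 = np.trapz(c, t)            # zeroth temporal moment (area under the curve)
        m1 = np.trapz(t * c, t)        # first temporal moment
        return m0, m1 / m0             # (mass-like quantity, mean arrival time)

    t_hr = np.linspace(0.0, 48.0, 97)                  # hypothetical sample times, hours
    c = np.exp(-0.5 * ((t_hr - 12.0) / 3.0) ** 2)      # hypothetical breakthrough curve
    m0, mean_arrival = breakthrough_moments(t_hr, c)
    print(round(m0, 2), round(mean_arrival, 2))
    ```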

  7. Rapid-estimation method for assessing scour at highway bridges

    USGS Publications Warehouse

    Holnbeck, Stephen R.

    1998-01-01

    A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.

  8. Assessing technical performance at diverse ambulatory care sites.

    PubMed

    Osterweis, M; Bryant, E

    1978-01-01

    The purpose of the large study reported here was to develop and test methods for assessing the quality of health care that would be broadly applicable to diverse ambulatory care organizations for periodic comparative review. Methodological features included the use of an age-sex stratified random sampling scheme, dependence on medical records as the source of data, a fixed study period year, use of Kessner's tracer methodology (including not only acute and chronic diseases but also screening and immunization rates as indicators), and a fixed tracer matrix at all test sites. This combination of methods proved more efficacious in estimating certain parameters for the total patient populations at each site (including utilization patterns, screening, and immunization rates) and the process of care for acute conditions than it did in examining the process of care for the selected chronic condition. It was found that the actual process of care at all three sites for the three acute conditions (streptococcal pharyngitis, urinary tract infection, and iron deficiency anemia) often differed from the expected process in terms of both diagnostic procedures and treatment. For hypertension, the chronic disease tracer, medical records were frequently a deficient data source from which to draw conclusions about the adequacy of treatment. Several aspects of the study methodology were found to be detrimental to between-site comparisons of the process of care for chronic disease management. The use of an age-sex stratified random sampling scheme resulted in the identification of too few cases of hypertension at some sites for analytic purposes, thereby necessitating supplementary sampling by diagnosis. The use of a fixed study period year resulted in an arbitrary starting point in the course of the disease. Furthermore, in light of the diverse sociodemographic characteristics of the patient populations, the use of a fixed matrix of tracer conditions for all test sites is questionable. The discussion centers on these and other problems encountered in attempting to compare technical performance within diverse ambulatory care organizations and provides some guidelines as to the utility of alternative methods for assessing the quality of health care.

  9. Research on the calibration methods of the luminance parameter of radiation luminance meters

    NASA Astrophysics Data System (ADS)

    Cheng, Weihai; Huang, Biyong; Lin, Fangsheng; Li, Tiecheng; Yin, Dejin; Lai, Lei

    2017-10-01

    This paper introduces the standard diffuse-reflection white plate method and the integrating-sphere standard luminance source method for calibrating the luminance parameter, and compares the calibration results of the two methods through principle analysis and experimental verification. After both methods were used to calibrate the same radiation luminance meter, the data obtained confirm that the test results of the two methods are both reliable. The results show that the displayed value obtained with the standard white plate method has smaller errors and better reproducibility, whereas the standard luminance source method is more convenient and better suited for on-site calibration; it also has a wider range and can test the linearity of the instruments.

  10. Detection of human and animal sources of pollution by microbial and chemical methods

    USDA-ARS?s Scientific Manuscript database

    A multi-indicator approach comprising Enterococcus, bacterial source tracking (BST), and sterol analysis was tested for pollution source identification. Fecal contamination was detected in 100% of surface water sites tested. Enterococcus faecium was the dominant species in aged litter samples from p...

  11. 76 FR 27261 - Propiconazole; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-11

    ... identified and discussed later in this document. Interregional Research Project 4 (IR-4) requested these... e-CFR site at http://www.gpoaccess.gov/ecfr . To access the harmonized test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select ``Test Methods and...

  12. Performance and Specificity of the Covalently Linked Immunomagnetic Separation-ATP Method for Rapid Detection and Enumeration of Enterococci in Coastal Environments

    PubMed Central

    Zimmer-Faust, Amity G.; Thulsiraj, Vanessa; Ferguson, Donna

    2014-01-01

    The performance and specificity of the covalently linked immunomagnetic separation-ATP (Cov-IMS/ATP) method for the detection and enumeration of enterococci was evaluated in recreational waters. Cov-IMS/ATP performance was compared with standard methods: defined substrate technology (Enterolert; IDEXX Laboratories), membrane filtration (EPA Method 1600), and an Enterococcus-specific quantitative PCR (qPCR) assay (EPA Method A). We extend previous studies by (i) analyzing the stability of the relationship between the Cov-IMS/ATP method and culture-based methods at different field sites, (ii) evaluating specificity of the assay for seven ATCC Enterococcus species, (iii) identifying cross-reacting organisms binding the antibody-bead complexes with 16S rRNA gene sequencing and evaluating specificity of the assay to five nonenterococcus species, and (iv) conducting preliminary tests of preabsorption as a means of improving the assay. Cov-IMS/ATP was found to perform consistently and with strong agreement rates (based on exceedance/compliance with regulatory limits) of between 83% and 100% compared to the culture-based Enterolert method at a variety of sites with complex inputs. The Cov-IMS/ATP method is specific to five of seven different Enterococcus spp. tested. However, there is potential for nontarget bacteria to bind the antibody, which may be reduced by purification of the IgG serum with preabsorption at problematic sites. The findings of this study help to validate the Cov-IMS/ATP method, suggesting a predictable relationship between the Cov-IMS/ATP method and traditional culture-based methods, which will allow for more widespread application of this rapid and field-portable method for coastal water quality assessment. PMID:24561583

  13. Automatic sub-pixel coastline extraction based on spectral mixture analysis using EO-1 Hyperion data

    NASA Astrophysics Data System (ADS)

    Hong, Zhonghua; Li, Xuesu; Han, Yanling; Zhang, Yun; Wang, Jing; Zhou, Ruyan; Hu, Kening

    2018-06-01

    Many megacities (such as Shanghai) are located in coastal areas; therefore, coastline monitoring is critical for urban security and the sustainability of urban development. A shoreline is defined as the intersection between coastal land and the water surface, and its position moves as tides rise and fall. Remote sensing techniques have increasingly been used for coastline extraction; however, traditional hard classification methods work only at the pixel level, and achieving sub-pixel accuracy with soft classification methods is both challenging and time consuming because of the complex features of coastal regions. This paper presents an automatic sub-pixel coastline extraction method (ASPCE) for hyperspectral satellite imagery that performs coastline extraction based on spectral mixture analysis and thus achieves higher accuracy. The ASPCE method consists of three main components: 1) a Water-Vegetation-Impervious-Soil (W-V-I-S) model is first presented to detect mixed W-V-I-S pixels and determine the endmember spectra in coastal regions; 2) linear spectral unmixing based on Fully Constrained Least Squares (FCLS) is applied to the mixed W-V-I-S pixels to estimate seawater abundance; and 3) a spatial attraction model is used to extract the coastline. We tested this new method using EO-1 images from three coastal regions in China: the South China Sea, the East China Sea, and the Bohai Sea. The results showed that the method is accurate and robust. Root mean square error (RMSE), computed from the distances between the extracted coastline and a digitized reference coastline, was used to evaluate accuracy. The classifier's performance was compared with that of Multiple Endmember Spectral Mixture Analysis (MESMA), Mixture Tuned Matched Filtering (MTMF), Sequential Maximum Angle Convex Cone (SMACC), Constrained Energy Minimization (CEM), and the classical Normalized Difference Water Index (NDWI). The results from the three test sites indicated that the proposed ASPCE method extracted coastlines more accurately than the compared methods, with RMSEs of 0.39, 0.40, and 0.35 pixels in the three test regions, corresponding to an accuracy better than 12.0 m (0.40 pixels). In the quantitative accuracy assessment, the ASPCE method performed best at the Bohai Sea test site, reaching 0.35 pixels. The proposed ASPCE method can therefore extract coastlines more accurately than hard classification methods or other spectral unmixing methods.
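
    As a rough sketch of the FCLS unmixing step (component 2 above): abundances are constrained to be non-negative and encouraged to sum to one by appending a heavily weighted row of ones to the endmember matrix and solving with non-negative least squares. This is a common approximation to FCLS, not necessarily the implementation used in the paper; the endmember spectra below are made up.

    ```python
    # Approximate FCLS abundance estimation for a mixed W-V-I-S pixel.
    import numpy as np
    from scipy.optimize import nnls

    def fcls_abundances(endmembers, pixel, delta=1e3):
        """endmembers: (bands, n_endmembers); pixel: (bands,). Returns abundances."""
        bands, n_em = endmembers.shape
        a_aug = np.vstack([endmembers, delta * np.ones((1, n_em))])  # sum-to-one row
        b_aug = np.concatenate([pixel, [delta]])
        abundances, _ = nnls(a_aug, b_aug)                           # non-negativity
        return abundances

    # Hypothetical 4-band endmember spectra: water, vegetation, impervious, soil
    E = np.array([[0.02, 0.05, 0.10, 0.12],
                  [0.03, 0.30, 0.12, 0.18],
                  [0.01, 0.06, 0.14, 0.25],
                  [0.00, 0.40, 0.15, 0.30]])
    mixed_pixel = 0.6 * E[:, 0] + 0.4 * E[:, 3]   # 60% water, 40% soil
    print(np.round(fcls_abundances(E, mixed_pixel), 2))
    ```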

  14. BetaScint{trademark} fiber-optic sensor for detecting strontium-90 and uranium-238 in soil. Innovative technology summary report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-12-01

    Accurate measurements of radioactivity in soils contaminated with Strontium-90 (Sr-90) or Uranium-238 (U-238) are essential for many DOE site remediation programs. These measurements determine whether excavation and soil removal are necessary, where remediation efforts should be focused, and whether a site has reached closure. Measuring soil contamination by standard EPA laboratory methods typically takes a week (accelerated analytical turnaround) or a month (standard analytical turnaround). This delay idles the heavy excavation equipment and associated personnel that are the main costs of remediation. This report describes an application of the BetaScint{trademark} fiber-optic sensor that measures Sr-90 or U-238 contamination in soil samples on site in about 20 minutes, at a much lower cost than time-consuming laboratory methods, greatly facilitating remediation. The report describes the technology, its performance, its uses, cost, regulatory and policy issues, and lessons learned.

  15. Evaluation of permeable fractures in rock aquifers

    NASA Astrophysics Data System (ADS)

    Bok Lee, Hang

    2015-04-01

    In this study, the practical usefulness and fundamental applicability of the self-potential (SP) method for identifying permeable fractures were evaluated by comparing the SP method with other geophysical logging methods and with hydraulic tests. In a shallow (10 m) borehole at the study site, candidate permeable fractures crossing the borehole were first identified by conventional geophysical methods such as acoustic borehole televiewer, temperature, electrical conductivity and gamma-gamma loggings, and the results were compared to the analysis by the SP method. Constant-pressure injection and recovery tests were conducted to verify the hydraulic properties of the fractures identified by the various logging methods. The acoustic borehole televiewer and gamma-gamma loggings detected open space or weathered zones within the borehole, but they cannot establish whether groundwater actually flows through the detected fractures. The temperature and electrical conductivity loggings had limited ability to detect fractured zones where groundwater in the borehole flows out into the surrounding rock. Comparison of the results from the different methods showed that the best correlation is between the distribution of hydraulic conductivity and the variation of the SP signals, and that SP logging can accurately estimate the hydraulic activity as well as the location of permeable fractures. Based on these results, the SP method is recommended over other conventional geophysical loggings for determining hydraulically active fractures. The self-potential method can be applied effectively in the initial stage of a site investigation to select optimal locations and evaluate the hydrogeological properties of fractures at target sites for underground structures, including geothermal reservoirs and radioactive waste disposal facilities.

  16. The New WindForS Wind Energy Test Site in Southern Germany

    NASA Astrophysics Data System (ADS)

    Clifton, A. J.

    2017-12-01

    Wind turbines are increasingly being installed in complex terrain, where patchy land cover, forestry, steep slopes, and complex regional and local atmospheric conditions pose major challenges for traditional numerical weather prediction methods. In this presentation, the new WindForS complex-terrain test site will be introduced. WindForS is a southern-Germany-based research consortium of more than 20 groups at higher education and research institutes, with strong links to regional government and industry. The new test site will be located in the hilly, forested terrain of the Swabian Alb and will consist of two wind turbines and four meteorological towers. The test site will support accompanying ecological research and will also have mobile eddy-covariance measurement stations as well as bird and bat monitoring systems. Seismic and noise monitoring systems are also planned. The large number of auxiliary measurements at this facility are intended to allow the complete atmosphere-wind turbine-environment-people system to be characterized. This presentation will show some of the numerical weather prediction work and measurements done at the site so far, and inform the audience about WindForS' plans for the future. A major focus of the presentation will be on opportunities for collaboration through field campaigns or model validation.

  17. Hydrogeologic and hydraulic characterization of aquifer and nonaquifer layers in a lateritic terrain (West Bengal, India)

    NASA Astrophysics Data System (ADS)

    Biswal, Sabinaya; Jha, Madan K.; Sharma, Shashi P.

    2018-02-01

    The hydrogeologic and hydraulic characteristics of a lateritic terrain in West Bengal, India, were investigated. Test drilling was conducted at ten sites and grain-size distribution curves (GSDCs) were prepared for 275 geologic samples. The performance of eight grain-size-analysis (GSA) methods for estimating the hydraulic conductivity (K) of subsurface formations was evaluated, and the GSA results were validated against pumping-test data. The GSDCs indicated that the shallow aquifer layers are coarser than the deeper aquifer layers (uniformity coefficient 0.19-11.4). Stratigraphy analysis revealed that shallow and deep aquifers of varying thickness exist at depths of 9-40 m and 40-79 m, respectively. The mean K estimates from the GSA methods are 3.62-292.86 m/day for the shallow aquifer layers and 0.97-209.93 m/day for the deeper aquifer layers, suggesting significant aquifer heterogeneity. Pumping-test data indicated that the deeper aquifers are leaky confined, with transmissivity of 122.69-693.79 m²/day, storage coefficient of 1.01 × 10⁻⁷ to 2.13 × 10⁻⁴ and leakance of 2.01 × 10⁻⁷ to 34.56 × 10⁻² day⁻¹. Although the K values yielded by the GSA methods are generally larger than those obtained from the pumping tests, the Slichter, Harleman and US Bureau of Reclamation (USBR) GSA methods yielded reasonable values at most of the sites (1-3 times higher than the pumping-test K estimates). In conclusion, more reliable aquifers exist at greater depths and can be tapped for dependable water supply. GSA methods such as Slichter, Harleman and USBR can be used for the preliminary assessment of K in lateritic terrains in the absence of reliable field methods.
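
    As a rough illustration of the grain-size-analysis approach, the sketch below uses the classic Hazen relation, K [cm/s] ≈ C·d10² with d10 in mm and an empirical coefficient C near 1. The paper's methods (Slichter, Harleman, USBR and others) use different coefficients and porosity terms; the sample values here are made up.

    ```python
    # Illustrative GSA-style K estimate from the effective grain size d10 (Hazen relation).
    def hazen_k_m_per_day(d10_mm, c=1.0):
        """Hydraulic conductivity from d10 (mm); c is an empirical coefficient near 1."""
        k_cm_per_s = c * d10_mm ** 2
        return k_cm_per_s * 864.0          # 1 cm/s = 864 m/day

    for label, d10 in [("shallow sand", 0.35), ("deeper fine sand", 0.15)]:
        print(f"{label}: K ~ {hazen_k_m_per_day(d10):.1f} m/day")
    ```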

  18. Refinement of a Method for Identifying Probable Archaeological Sites from Remotely Sensed Data

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel; Chen, Li

    2012-01-01

    To facilitate locating archaeological sites before they are compromised or destroyed, we are developing approaches for generating maps of probable archaeological sites by detecting subtle anomalies in vegetative cover, soil chemistry, and soil moisture through analysis of remotely sensed data from multiple sources. We previously reported some success in this effort with a statistical analysis of slope, radar, and Ikonos data (including tasseled cap and NDVI transforms) using Student's t-test. We report here on new developments in our work: an analysis of 8-band multispectral WorldView-2 data. The WorldView-2 analysis begins by computing medians and median absolute deviations of the 28 band-difference ratios for the pixels in various annuli around each site of interest. We then use principal components analysis followed by linear discriminant analysis to train a classifier that assigns a posterior probability that a location is an archaeological site. We tested the procedure using leave-one-out cross-validation, with a second leave-one-out step to choose parameters, on a 9,859 x 23,000 pixel subset of the WorldView-2 data over the western portion of Ft. Irwin, CA, USA. We used 100 known non-sites and trained one classifier for lithic sites (n=33) and one classifier for habitation sites (n=16). We then analyzed convex combinations of scores from the Archaeological Predictive Model (APM) and our scores. We found that the combined scores had a higher area under the ROC curve than either individual method, indicating that including WorldView-2 data in the analysis improved the predictive power of the provided APM.
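
    A minimal sketch of the classification step described above: principal components analysis followed by linear discriminant analysis, evaluated with leave-one-out cross-validation and summarized by the area under the ROC curve. Feature construction (band-difference ratios and annulus statistics) is assumed to have been done already; the arrays below are synthetic, and the second parameter-selection loop from the paper is omitted.

    ```python
    # PCA + LDA classifier with leave-one-out cross-validation on synthetic features.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(133, 28))          # 133 locations x 28 band-difference features
    y = np.r_[np.ones(33), np.zeros(100)]   # 33 lithic sites, 100 non-sites (hypothetical)
    X[y == 1] += 0.6                        # give sites a weak synthetic signal

    clf = make_pipeline(StandardScaler(), PCA(n_components=10),
                        LinearDiscriminantAnalysis())
    scores = cross_val_predict(clf, X, y, cv=LeaveOneOut(),
                               method="predict_proba")[:, 1]
    print(round(roc_auc_score(y, scores), 2))   # AUC of posterior site probabilities
    ```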

  19. Estimated flow-duration curves for selected ungaged sites in Kansas

    USGS Publications Warehouse

    Studley, S.E.

    2001-01-01

    Flow-duration curves for 1968-98 were estimated for 32 ungaged sites in the Missouri, Smoky Hill-Saline, Solomon, Marais des Cygnes, Walnut, Verdigris, and Neosho River Basins in Kansas. Also included from a previous report are estimated flow-duration curves for 16 ungaged sites in the Cimarron and lower Arkansas River Basins in Kansas. The method of estimation used six unique factors of flow duration: (1) mean streamflow and percentage duration of mean streamflow, (2) ratio of 1-percent-duration streamflow to mean streamflow, (3) ratio of 0.1-percent-duration streamflow to 1-percent-duration streamflow, (4) ratio of 50-percent-duration streamflow to mean streamflow, (5) percentage duration of appreciable streamflow (0.10 cubic foot per second), and (6) average slope of the flow-duration curve. These factors were previously developed from a regionalized study of flow-duration curves using streamflow data for 1921-76 from streamflow-gaging stations with drainage areas of 100 to 3,000 square miles. The method was tested on a currently (2001) measured, continuous-record streamflow-gaging station on Salt Creek near Lyndon, Kansas, with a drainage area of 111 square miles and was found to adequately estimate the computed flow-duration curve for the station. The method also was tested on a currently (2001) measured, continuous-record, streamflow-gaging station on Soldier Creek near Circleville, Kansas, with a drainage area of 49.3 square miles. The results of the test on Soldier Creek near Circleville indicated that the method could adequately estimate flow-duration curves for sites with drainage areas of less than 100 square miles. The low-flow parts of the estimated flow-duration curves were verified or revised using 137 base-flow discharge measurements made during 1999-2000 at the 32 ungaged sites that were correlated with base-flow measurements and flow-duration analyses performed at nearby, long-term, continuous-record, streamflow-gaging stations (index stations). The method did not adequately estimate the flow-duration curves for two sites in the western one-third of the State because of substantial changes in farming practices (terracing and intensive ground-water withdrawal) that were not accounted for in the two previous studies (Furness, 1959; Jordan, 1983). For these two sites, there was enough historic, continuous-streamflow record available to perform record-extension techniques correlated to their respective index stations for the development of the estimated flow-duration curves. The estimated flow-duration curves at the ungaged sites can be used for projecting future flow frequencies for assessment of total maximum daily loads (TMDLs) or other water-quality constituents, water-availability studies, and for basin-characteristic studies.
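
    For readers unfamiliar with the underlying construct, a flow-duration curve simply ranks a streamflow record in descending order and assigns each flow a percent exceedance; the regionalized factors described above are then used where no gaged record exists. The sketch below uses a synthetic daily record.

    ```python
    # Minimal flow-duration curve from a (synthetic) daily streamflow record.
    import numpy as np

    def flow_duration_curve(daily_flows_cfs):
        q = np.sort(np.asarray(daily_flows_cfs, float))[::-1]        # descending flows
        exceedance_pct = 100.0 * np.arange(1, q.size + 1) / (q.size + 1)
        return exceedance_pct, q

    rng = np.random.default_rng(2)
    flows = rng.lognormal(mean=3.0, sigma=1.2, size=365 * 10)         # ~10 years, cfs
    pct, q = flow_duration_curve(flows)
    for target in (1, 50, 90):                                        # selected durations
        print(f"{target}%-duration flow ~ {q[np.searchsorted(pct, target)]:.1f} cfs")
    ```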

  20. A Simplified 4-Site Economical Intradermal Post-Exposure Rabies Vaccine Regimen: A Randomised Controlled Comparison with Standard Methods

    PubMed Central

    Warrell, Mary J.; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J.; Fooks, Anthony R.; Audry, Laurent; Brookes, Sharon M.; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J.; Warrell, David A.

    2008-01-01

    Background The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Methods Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. Findings All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. Conclusions This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. Trial Registration Controlled-Trials.com ISRCTN 30087513 PMID:18431444

  1. Timescale Correlation between Marine Atmospheric Exposure and Accelerated Corrosion Testing - Part 2

    NASA Technical Reports Server (NTRS)

    Montgomery, Eliza L.; Calle, Luz Marina; Curran, Jerome C.; Kolody, Mark R.

    2012-01-01

    Evaluation of metals to predict service life of metal-based structures in corrosive environments has long relied on atmospheric exposure test sites. Traditional accelerated corrosion testing relies on mimicking the exposure conditions, often incorporating salt spray and ultraviolet (UV) radiation, and exposing the metal to continuous or cyclic conditions similar to those of the corrosive environment. Their reliability to correlate to atmospheric exposure test results is often a concern when determining the timescale to which the accelerated tests can be related. Accelerated corrosion testing has yet to be universally accepted as a useful tool in predicting the long-term service life of a metal, despite its ability to rapidly induce corrosion. Although visual and mass loss methods of evaluating corrosion are the standard, and their use is crucial, a method that correlates timescales from accelerated testing to atmospheric exposure would be very valuable. This paper presents work that began with the characterization of the atmospheric environment at the Kennedy Space Center (KSC) Beachside Corrosion Test Site. The chemical changes that occur on low carbon steel, during atmospheric and accelerated corrosion conditions, were investigated using surface chemistry analytical methods. The corrosion rates and behaviors of panels subjected to long-term and accelerated corrosion conditions, involving neutral salt fog and alternating seawater spray, were compared to identify possible timescale correlations between accelerated and long-term corrosion performance. The results, as well as preliminary findings on the correlation investigation, are presented.

  2. Statistical models for incorporating data from routine HIV testing of pregnant women at antenatal clinics into HIV/AIDS epidemic estimates.

    PubMed

    Sheng, Ben; Marsh, Kimberly; Slavkovic, Aleksandra B; Gregson, Simon; Eaton, Jeffrey W; Bao, Le

    2017-04-01

    HIV prevalence data collected from routine HIV testing of pregnant women at antenatal clinics (ANC-RT) are potentially available from all facilities that offer testing services to pregnant women and can be used to improve estimates of national and subnational HIV prevalence trends. We develop methods to incorporate this new data source into the Joint United Nations Programme on HIV/AIDS Estimation and Projection Package in Spectrum 2017. We develop a new statistical model for incorporating ANC-RT HIV prevalence data, aggregated either to the health facility level (site-level) or regionally (census-level), to estimate HIV prevalence alongside existing sources of HIV prevalence data from ANC unlinked anonymous testing (ANC-UAT) and household-based national population surveys. Synthetic data are generated to understand how the availability of ANC-RT data affects the accuracy of various parameter estimates. We estimate HIV prevalence and additional parameters using both ANC-RT and other existing data. Fitting HIV prevalence using synthetic data generally gives precise estimates of the underlying trend and other parameters. More years of ANC-RT data should improve prevalence estimates. More ANC-RT sites, and continuation of existing ANC-UAT sites, may improve the estimate of the calibration between ANC-UAT and ANC-RT sites. We have proposed methods to incorporate ANC-RT data into Spectrum to obtain more precise estimates of prevalence and other measures of the epidemic. Many assumptions about the accuracy, consistency, and representativeness of ANC-RT prevalence underlie the use of these data for monitoring HIV epidemic trends and should be tested as more data become available from national ANC-RT programs.

  3. [Implementation of transgingival antibacterial photodynamic therapy (PDT) supplementary to scaling and root planing. A controlled clinical proof-of-principle study].

    PubMed

    Mettraux, Gérald; Hüsler, Jürg

    2011-01-01

    Antibacterial photodynamic therapy (PDT) has been shown to be effective in periodontal therapy. The laser-light application reported in the literature so far is the subgingival placement of a light fibre. The aim was to study the effect of PDT with a transgingival laser application. In 19 patients with untreated periodontitis, one test and one control site were selected. Both pockets were treated by scaling and root planing; the test site received additional PDT (LASOTRONIC MED 701 by ORCOS MEDICAL, Switzerland) at baseline and after 2 and 6 months, while the control sites were rinsed with Ringer solution. Clinical parameters (ST, BOP, CAL) and bacterial monitoring (PADO, IAI, Switzerland) were recorded at baseline, 2 and 6 months. Mean pocket reduction after 6 months was 2.1 mm (+/-1.4) in the test group and 1.5 mm (+/-1.6) in the control group, a significant difference. The 95% confidence interval for the difference of the mean reductions of the test and control groups after 6 months is (1.5, 3). Mean CAL gain after 6 months was 1.5 mm (+/-1.3) in the test group and 0.9 mm (+/-1.7) in the control group. T. denticola showed lower numbers after 2 and 6 months in the test sites versus the controls. The total bacterial load (TBL) showed a significantly better reduction in the test group at 6 months. The transgingival application of PDT with the MED 701 showed clinical and bacteriological effects comparable to those reported in the literature for the subgingival method. The transgingival method is convenient, harmless and easy to perform.

  4. Problem-Solving Test: Restriction Endonuclease Mapping

    ERIC Educational Resources Information Center

    Szeberenyi, Jozsef

    2011-01-01

    The term "restriction endonuclease mapping" covers a number of related techniques used to identify specific restriction enzyme recognition sites on small DNA molecules. A method for restriction endonuclease mapping of a 1,000-basepair (bp)-long DNA molecule is described in the fictitious experiment of this test. The most important fact needed to…

  5. 75 FR 81878 - Imazosulfuron; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-29

    ... e-CFR site at http://www.gpoaccess.gov/ecfr . To access the harmonized test guidelines referenced in this document electronically, please go http://www.epa.gov/ocspp and select ``Test Methods and... reliability as well as the relationship of the results of the studies to human risk. EPA has also considered...

  6. 75 FR 69353 - Isoxaben; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... e-CFR site at http://www.gpoaccess.gov/ecfr . To access the harmonized test guidelines referenced in this document electronically, please go to http://www.epa.gov/ocspp and select ``Test Methods and... reliability as well as the relationship of the results of the studies to human risk. EPA has also considered...

  7. Testing Mars-inspired operational strategies for semi-autonomous rovers on the Moon: The GeoHeuristic Operational Strategies Test in New Mexico

    PubMed Central

    Yingst, R. Aileen; Cohen, B. A.; Crumpler, L.; Schmidt, M. E.; Schrader, C. M.

    2017-01-01

    Background We tested the science operational strategy used for the Mars Exploration Rover (MER) mission on Mars to determine its suitability for conducting remote geology on the Moon by conducting a field test at Cerro de Santa Clara, New Mexico. This region contains volcanic and sedimentary products from a variety of provenances, mimicking the variety that might be found at a lunar site such as South Pole-Aitken Basin. Method At each site a Science Team broke down observational “days” into a sequence of observations of features and targets of interest. The number, timing, and sequence of observations was chosen to mimic those used by the MERs when traversing. Images simulating high-resolution stereo and hand lens-scale images were taken using a professional SLR digital camera; multispectral and XRD data were acquired from samples to mimic the availability of geochemical data. A separate Tiger Team followed the Science Team and examined each site using traditional terrestrial field methods, facilitating comparison between what was revealed by human versus rover-inspired methods. Lessons Learned We conclude from this field test that MER-inspired methodology is not conducive to utilizing all acquired data in a timely manner for the case of any lunar architecture that involves the acquisition of rover data in near real-time. We additionally conclude that a methodology similar to that used for MER can be adapted for use on the Moon if mission goals are focused on reconnaissance. If the goal is to locate and identify a specific feature or material, such as water ice, a different methodology will likely be needed. PMID:29309066

  8. Gas Transport and Detection Following Underground Nuclear Explosions

    NASA Astrophysics Data System (ADS)

    Carrigan, C. R.; Sun, Y.; Wagoner, J. L.; Zucca, J. J.

    2011-12-01

    Some extremely rare radioactive noble gases are by-products of underground nuclear explosions, and the detection of significant levels of these gases (e.g., Xe-133 and Ar-37) at the surface is a very strong indicator of the occurrence of an underground nuclear event. Because of their uniqueness, such noble gas signatures can be confirmatory of the nuclear nature of an event, while signatures from other important detection methods, such as anomalous seismicity, generally are not. As a result, noble gas detection at a suspected underground nuclear test site is considered the most important technique available to inspectors operating under the On-Site Inspection protocol of the Comprehensive Nuclear-Test-Ban Treaty. A one-kiloton chemical underground explosion, the Non-Proliferation Experiment (NPE), was carried out at the Nevada Test Site in 1993 and represented the first On-Site-Inspection-oriented test of subsurface gas transport with subsequent detection at the surface using soil-gas sampling methods. A major conclusion of the experiment was that noble gases from underground nuclear tests have a good possibility of being detected even if the test is well contained. From this experiment and from computer simulations, we have also learned significant lessons about the modes of gas transport to the surface and the importance of careful subsurface sampling to optimize the detected noble gas signature. Understanding transport and sampling processes for a very wide range of geologic and testing scenarios presents significant challenges that we are currently addressing using sensitivity studies, which we attempt to verify using experiments such as the NPE and a new subsurface gas migration experiment now being undertaken at the National Center for Nuclear Security. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  9. HIV misdiagnosis in sub-Saharan Africa: performance of diagnostic algorithms at six testing sites

    PubMed Central

    Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng’ang’a, Anne; Andre, Bita; Zahinda, Jean-Paul BN; Fransen, Katrien; Page, Anne-Laure

    2017-01-01

    Abstract Introduction: We evaluated the diagnostic accuracy of HIV testing algorithms at six programmes in five sub-Saharan African countries. Methods: In this prospective multisite diagnostic evaluation study (Conakry, Guinea; Kitgum, Uganda; Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon and Baraka, Democratic Republic of Congo), samples from clients (greater than or equal to five years of age) testing for HIV were collected and compared to a state-of-the-art algorithm from the AIDS reference laboratory at the Institute of Tropical Medicine, Belgium. The reference algorithm consisted of an enzyme-linked immuno-sorbent assay, a line-immunoassay, a single antigen-enzyme immunoassay and a DNA polymerase chain reaction test. Results: Between August 2011 and January 2015, over 14,000 clients were tested for HIV at 6 HIV counselling and testing sites. Of those, 2786 (median age: 30; 38.1% males) were included in the study. Sensitivity of the testing algorithms ranged from 89.5% in Arua to 100% in Douala and Conakry, while specificity ranged from 98.3% in Douala to 100% in Conakry. Overall, 24 (0.9%) clients, and as many as 8 per site (1.7%), were misdiagnosed, with 16 false-positive and 8 false-negative results. Six false-negative specimens were retested with the on-site algorithm on the same sample and were found to be positive. Conversely, 13 false-positive specimens were retested: 8 remained false-positive with the on-site algorithm. Conclusions: The performance of algorithms at several sites failed to meet expectations and thresholds set by the World Health Organization, with unacceptably high rates of false results. Alongside the careful selection of rapid diagnostic tests and the validation of algorithms, strictly observing correct procedures can reduce the risk of false results. In the meantime, to identify false-positive diagnoses at initial testing, patients should be retested upon initiating antiretroviral therapy. PMID:28691437
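
    The site-level figures above follow from standard confusion-matrix arithmetic. The sketch below (Python) shows how sensitivity, specificity, and the overall misdiagnosis rate would be computed from true/false positive and negative counts; the study reports only aggregate figures (16 false positives and 8 false negatives among 2786 clients), so the per-site counts used in the example are hypothetical.

```python
def algorithm_performance(tp, fp, tn, fn):
    """Return sensitivity, specificity, and misdiagnosis rate from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)            # true positives among all infected clients
    specificity = tn / (tn + fp)            # true negatives among all uninfected clients
    misdiagnosis_rate = (fp + fn) / (tp + fp + tn + fn)
    return sensitivity, specificity, misdiagnosis_rate

# Hypothetical counts for one testing site (not taken from the study).
sens, spec, err = algorithm_performance(tp=60, fp=3, tn=400, fn=2)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} misdiagnosed={err:.3%}")
```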

  10. Specific Yields Estimated from Gravity Change during Pumping Test

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Hwang, C.; Chang, L. C.

    2017-12-01

    Specific yield (Sy) is the most important parameter for describing the available groundwater capacity of an unconfined aquifer. When Sy is estimated from a field pumping test, aquifer heterogeneity and well performance cause large uncertainty. In this study, we use a gravity-based method to estimate Sy. During a pumping test, a significant mass of groundwater is removed. If the drawdown cone is large and close enough to a high-precision gravimeter, the resulting gravity change can be detected. The gravity-based method uses gravity observations that are independent of traditional flow computations; only the drawdown cone needs to be modeled with observed head and hydrogeology data. The gravity method can be used in most groundwater field tests, such as local pumping/injection tests or annual variations driven by natural sources. We apply our gravity method at a few sites in Taiwan situated over different unconfined aquifers, where pumping tests for Sy determination were also carried out. We discuss why the gravity method produces results that differ from traditional pumping tests, as well as field designs and limitations of the gravity method.
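
    As a rough illustration of the link between pumping-induced water-table decline and surface gravity, the sketch below uses the infinite Bouguer slab approximation, Δg ≈ 2πGρw·Sy·Δh (about 41.9 µGal per metre of water-table drop when Sy = 1). This is a simplification of the drawdown-cone modelling described in the abstract, and the example numbers are hypothetical.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
RHO_WATER = 1000.0     # kg m^-3
UGAL_PER_MS2 = 1e8     # 1 m/s^2 = 1e8 microGal

def specific_yield_from_gravity(delta_g_ugal, drawdown_m):
    """Estimate Sy from a gravity change (microGal) and water-table drawdown (m),
    treating the drained pore water as an infinite horizontal slab (Bouguer approximation)."""
    slab_coeff = 2.0 * math.pi * G * RHO_WATER * UGAL_PER_MS2   # ~41.9 microGal per metre
    return delta_g_ugal / (slab_coeff * drawdown_m)

# Hypothetical example: a 10 microGal decrease accompanying a 2 m drawdown near the well.
print(specific_yield_from_gravity(delta_g_ugal=10.0, drawdown_m=2.0))   # ~0.12
```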

  11. Comparison of rangeland vegetation sampling techniques in the Central Grasslands

    USGS Publications Warehouse

    Stohlgren, T.J.; Bull, K.A.; Otsuki, Yuka

    1998-01-01

    Maintaining native plant diversity, detecting exotic species, and monitoring rare species are becoming important objectives in rangeland conservation. Four rangeland vegetation sampling techniques were compared to see how well they captured local plant diversity. The methods tested included the commonly used Parker transects, Daubenmire transects as modified by the USDA Forest Service, a new transect and 'large quadrat' design proposed by the USDA Agricultural Research Service, and the Modified-Whittaker multi-scale vegetation plot. The 4 methods were superimposed in shortgrass steppe, mixed grass prairie, northern mixed prairie, and tallgrass prairie in the Central Grasslands of the United States with 4 replicates in each prairie type. Analysis of variance tests showed significant method effects and prairie type effects, but no significant method × type interactions for total species richness, the number of native species, the number of species with less than 1 % cover, and the time required for sampling. The methods behaved similarly in each prairie type under a wide variety of grazing regimens. The Parker, large quadrat, and Daubenmire transects significantly underestimated the total species richness and the number of native species in each prairie type, and the number of species with less than 1 % cover in all but the tallgrass prairie type. The transect techniques also consistently missed half the exotic species, including noxious weeds, in each prairie type. The Modified-Whittaker method, which included an exhaustive search for plant species in a 20 x 50 m plot, served as the baseline for species richness comparisons. For all prairie types, the Modified-Whittaker plot captured an average of 42 (± 2.4; 1 S.E.) plant species per site compared to 15.9 (± 1.3), 18.9 (± 1.2), and 22.8 (± 1.6) plant species per site using the Parker, large quadrat, and Daubenmire transect methods, respectively. The 4 methods captured most of the dominant species at each site and thus produced similar results for total foliar cover and soil cover. The detection and measurement of exotic plant species were greatly enhanced by using ten 1 m2 subplots in a multi-scale sampling design and searching a larger area (1,000 m2) at each site. Even with 4 replicate sites, the transect methods usually captured, and thus would monitor, 36 to 66 % of the plant species at each site. To evaluate the status and trends of common, rare, and exotic plant species at local, regional, and national scales, innovative, multi-scale methods must replace the commonly used transect methods of the past.

  12. The prediction of palmitoylation site locations using a multiple feature extraction method.

    PubMed

    Shi, Shao-Ping; Sun, Xing-Yu; Qiu, Jian-Ding; Suo, Sheng-Bao; Chen, Xiang; Huang, Shu-Yun; Liang, Ru-Ping

    2013-03-01

    As an extremely important and ubiquitous post-translational lipid modification, palmitoylation plays a significant role in a variety of biological and physiological processes. Unlike other lipid modifications, protein palmitoylation and depalmitoylation are highly dynamic and can regulate both protein function and localization. The dynamic nature of palmitoylation is poorly understood because of the limitations in current assay methods. The in vivo or in vitro experimental identification of palmitoylation sites is both time consuming and expensive. Due to the large volume of protein sequences generated in the post-genomic era, it is extraordinarily important in both basic research and drug discovery to rapidly identify the attributes of a new protein's palmitoylation sites. In this work, a new computational method, WAP-Palm, combining multiple feature extraction, has been developed to predict the palmitoylation sites of proteins. The performance of the WAP-Palm model is measured herein and was found to have a sensitivity of 81.53%, a specificity of 90.45%, an accuracy of 85.99% and a Matthews correlation coefficient of 72.26% in 10-fold cross-validation test. The results obtained from both the cross-validation and independent tests suggest that the WAP-Palm model might facilitate the identification and annotation of protein palmitoylation locations. The online service is available at http://bioinfo.ncu.edu.cn/WAP-Palm.aspx. Copyright © 2013 Elsevier Inc. All rights reserved.
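
    For reference, the performance figures quoted for WAP-Palm (sensitivity, specificity, accuracy, and the Matthews correlation coefficient) are all derived from a single confusion matrix. The sketch below shows the standard formulas; the counts used are hypothetical and are not taken from the WAP-Palm benchmark.

```python
import math

def classification_metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, accuracy, and MCC from confusion-matrix counts."""
    sn = tp / (tp + fn)
    sp = tn / (tn + fp)
    acc = (tp + tn) / (tp + tn + fp + fn)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return sn, sp, acc, mcc

# Hypothetical counts for a palmitoylation-site predictor evaluated by cross-validation.
print(classification_metrics(tp=90, tn=85, fp=15, fn=10))
```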

  13. Geophysical studies of the Syncline Ridge area, Nevada Test Site, Nye County, Nevada

    USGS Publications Warehouse

    Hoover, D.B.; Hanna, W.F.; Anderson, L.A.; Flanigan, V.J.; Pankratz, L.W.

    1982-01-01

    A wide variety of geophysical methods were employed to study a proposed nuclear waste site at Syncline Ridge on the Nevada Test Site, Nev. The proposed site was believed to be a relatively undisturbed synclinal structure containing a thick argillite unit of Misslsslppian age, the Eleana Formation unit J, which would be the emplacement medium. Data acquisition for the geophysical studies was constrained because of rugged topography in a block of Tipplpah Limestone overlying the central part of the proposed site. This study employed gravity, magnetic, seismic refraction and reflection, and four distinct electrical methods to try and define the structural integrity and shape of the proposed repository medium. Detailed and regional gravity work revealed complex structure at the site. Magnetics helped only in identifying small areas of Tertiary volcanic rocks because of low magnetization of the rocks. Seismic refraction assisted in identifying near surface faulting and bedrock structure. Difficulty was experienced in obtaining good quality reflection data. This implied significant structural complexity but also revealed the principal features that were supported by other data. Electrical methods were used for fault identification and for mapping of a thick argillaceous unit of the Eleana Formation in which nuclear waste was to be emplaced. The geophysical studies indicate that major faults along the axis of Syncline Ridge and on both margins have large vertical offsets displacing units so as not only to make mining difficult, but also providing potential paths for waste migration to underlying carbonate aquifers. The Eleana Formation appeared heterogeneous, which was inferred to be due to structural complexity. Only a small region in the northwest part of the study area was found to contain a thick and relatively undisturbed volume of host rock.

  14. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TOC emission rate, as specified under paragraph (f) of this section, the sampling site shall be after... process vent TOC mass flow rate is less than 33 kilograms per day for an existing source or less than 6.8... shall determine the TOC mass flow rate by the following procedures: (1) The sampling site shall be...

  15. 40 CFR 63.9622 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-specific operating limits according to the procedures in paragraphs (a)(1) through (3) of this section. (1... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this... site-specific operating limit according to the procedures in paragraphs (c)(1) or (2) of this section...

  16. 40 CFR 63.9622 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...-specific operating limits according to the procedures in paragraphs (a)(1) through (3) of this section. (1... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this... site-specific operating limit according to the procedures in paragraphs (c)(1) or (2) of this section...

  17. 40 CFR 63.9622 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-specific operating limits according to the procedures in paragraphs (a)(1) through (3) of this section. (1... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this... site-specific operating limit according to the procedures in paragraphs (c)(1) or (2) of this section...

  18. 40 CFR 63.9622 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-specific operating limits according to the procedures in paragraphs (a)(1) through (3) of this section. (1... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this... site-specific operating limit according to the procedures in paragraphs (c)(1) or (2) of this section...

  19. 40 CFR 63.9622 - What test methods and other procedures must I use to establish and demonstrate initial compliance...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-specific operating limits according to the procedures in paragraphs (a)(1) through (3) of this section. (1... establish site-specific operating limits according to the procedures in paragraphs (b)(1) and (2) of this... site-specific operating limit according to the procedures in paragraphs (c)(1) or (2) of this section...

  20. Linguistic Politeness and Interpersonal Ties among Bengalis on the Social Network Site Orkut[R]: The Bulge Theory Revisited

    ERIC Educational Resources Information Center

    Das, Anupam

    2010-01-01

    This study examined linguistic politeness behaviors and their relationship to social distance among members of a diasporic Bengali community on the social network site "Orkut"[R]. Using data from computer-mediated communication (CMC), specifically text messages posted on "Orkut"[R] "scrapbooks," it developed a method to test the claims of the…

  1. Assessing the value of different data sets and modeling schemes for flow and transport simulations

    NASA Astrophysics Data System (ADS)

    Hyndman, D. W.; Dogan, M.; Van Dam, R. L.; Meerschaert, M. M.; Butler, J. J., Jr.; Benson, D. A.

    2014-12-01

    Accurate modeling of contaminant transport has been hampered by an inability to characterize subsurface flow and transport properties at a sufficiently high resolution. However mathematical extrapolation combined with different measurement methods can provide realistic three-dimensional fields of highly heterogeneous hydraulic conductivity (K). This study demonstrates an approach to evaluate the time, cost, and efficiency of subsurface K characterization. We quantify the value of different data sets at the highly heterogeneous Macro Dispersion Experiment (MADE) Site in Mississippi, which is a flagship test site that has been used for several macro- and small-scale tracer tests that revealed non-Gaussian tracer behavior. Tracer data collected at the site are compared to models that are based on different types and resolution of geophysical and hydrologic data. We present a cost-benefit analysis of several techniques including: 1) flowmeter K data, 2) direct-push K data, 3) ground penetrating radar, and 4) two stochastic methods to generate K fields. This research provides an initial assessment of the level of data necessary to accurately simulate solute transport with the traditional advection dispersion equation; it also provides a basis to design lower cost and more efficient remediation schemes at highly heterogeneous sites.

  2. Archaeological investigations at Sample Unit U19aq, Nevada Test Site, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, R.C.; DuBarton, A.; Holz, B.A.

    1992-12-31

    This report documents the methods and results of archaeological investigations at sample unit U19aq on Pahute Mesa. Seven sites were studied: two lithic artifact scatters (26NY4577 and 26NY4584), two temporary camps (26NY4585 and 26NY4588), two rock rings (26NY4592 and 26NY4593), and two flakes (26NY7855). Surface artifacts were collected from all seven sites. Excavations were confined to one test pit at 26NY4584 and two test pits at 26NY4585. The data retrieved from these investigations include over eight thousand artifacts, such as projectile points, bifaces, debitage, groundstone, pottery and beads. The temporally diagnostic materials indicate periodic use of sample unit U19aq from 3250 B.P. to historic times. Most of the cultural remains reflect the specialized activities of hunters and gatherers occupying temporary camps.

  3. Archaeological investigations at Sample Unit U19aq, Nevada Test Site, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, R.C.; DuBarton, A.; Holz, B.A.

    1992-01-01

    This report documents the methods and results of archaeological investigations at sample unit U19aq on Pahute Mesa. Seven sites were studied: two lithic artifact scatters (26NY4577 and 26NY4584), two temporary camps (26NY4585 and 26NY4588), two rock rings (26NY4592 and 26NY4593), and two flakes (26NY7855). Surface artifacts were collected from all seven sites. Excavations were confined to one test pit at 26NY4584 and two test pits at 26NY4585. The data retrieved from these investigations include over eight thousand artifacts, such as projectile points, bifaces, debitage, groundstone, pottery and beads. The temporally diagnostic materials indicate periodic use of sample unit U19aq from 3250 B.P. to historic times. Most of the cultural remains reflect the specialized activities of hunters and gatherers occupying temporary camps.

  4. Signal peptide discrimination and cleavage site identification using SVM and NN.

    PubMed

    Kazemian, H B; Yusuf, S A; White, K

    2014-02-01

    About 15% of all proteins in a genome contain a signal peptide (SP) sequence, at the N-terminus, that targets the protein to intracellular secretory pathways. Once the protein is targeted correctly in the cell, the SP is cleaved, releasing the mature protein. Accurate prediction of the presence of these short amino-acid SP chains is crucial for modelling the topology of membrane proteins, since SP sequences can be confused with transmembrane domains due to similar composition of hydrophobic amino acids. This paper presents a cascaded Support Vector Machine (SVM)-Neural Network (NN) classification methodology for SP discrimination and cleavage site identification. The proposed method utilises a dual phase classification approach using SVM as a primary classifier to discriminate SP sequences from Non-SP. The methodology further employs NNs to predict the most suitable cleavage site candidates. In phase one, a SVM classification utilises hydrophobic propensities as a primary feature vector extraction using symmetric sliding window amino-acid sequence analysis for discrimination of SP and Non-SP. In phase two, a NN classification uses asymmetric sliding window sequence analysis for prediction of cleavage site identification. The proposed SVM-NN method was tested using Uni-Prot non-redundant datasets of eukaryotic and prokaryotic proteins with SP and Non-SP N-termini. Computer simulation results demonstrate an overall accuracy of 0.90 for SP and Non-SP discrimination based on Matthews Correlation Coefficient (MCC) tests using SVM. For SP cleavage site prediction, the overall accuracy is 91.5% based on cross-validation tests using the novel SVM-NN model. © 2013 Published by Elsevier Ltd.
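
    As an illustration of the kind of feature vector described in phase one, the sketch below computes a symmetric sliding-window average of Kyte-Doolittle hydropathy values along an N-terminal sequence. This is only a minimal stand-in for the paper's feature extraction: the window size and the example sequence are assumptions, and no SVM training is shown.

```python
# Kyte-Doolittle hydropathy scale.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5, 'E': -3.5,
      'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9, 'M': 1.9, 'F': 2.8,
      'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9, 'Y': -1.3, 'V': 4.2}

def hydropathy_profile(seq, window=7):
    """Symmetric sliding-window mean hydropathy (window must be odd)."""
    half = window // 2
    profile = []
    for i in range(half, len(seq) - half):
        segment = seq[i - half:i + half + 1]
        profile.append(sum(KD[a] for a in segment) / window)
    return profile

# Hypothetical N-terminal sequence with a hydrophobic core typical of signal peptides.
print(hydropathy_profile("MKKTAIAIAVALAGFATVAQA"))
```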

  5. GPR study of a prehistoric archaeological site near Point Barrow, Alaska

    NASA Astrophysics Data System (ADS)

    Herman, R. B.; Jensen, A. M.

    2012-12-01

    A ground penetrating radar (GPR) study was performed on the prehistoric Thule cemetery site near Point Barrow, Alaska. The goals of this study were (a) to test this technology in this type of polar environment, and (b) to search for burials and other archaeological features in a location in imminent danger from ocean erosion. The Nuvuk site is currently eroding at an average rate measured at over 6 m/year. Prior archaeological work at the site had recovered over 80 burials with nearly 100 individuals represented, all of which were less than 1 m below surface, and detectable with small test pits. In addition, the first coastal Ipiutak occupation known north of Point Hope had been recently discovered, at a depth of nearly 2m below surface, in the erosion face. The occupation appeared to have been terminated by a large storm which overwashed the site, leaving a strandline immediately superimposed on the living surface. After that, approximately 1.5 m of sterile gravels had been deposited before the surface on which the Thule people were living formed. Both occupations are of considerable scientific interest. The matrix at the site consists of unconsolidated beach gravels, which necessitates opening large surface areas or use of shoring to test even small units to the depths of the Ipiutak deposit (approximately 8m x 8m at the surface to test 1m x 1m at 2m depth). Such excavations promote erosion, and are very costly in terms of time and labor, so a means to detect features buried at depths greater than those exposed by shovel test pits was desirable. GPR seemed a likely candidate, but it had not been used in such conditions before, and thus it was necessary to test it thoroughly prior to relying on GPR to eliminate areas from physical testing. The GPR imaged the subsurface to a depth of 3 meters at a frequency of 500MHz. Meter-deep test pits were placed at 2-meter intervals in the survey area in a grid pattern since the efficacy of the technology had yet to be shown. The results of the test pits and the GPR were in agreement. It was anticipated that there might be few or no remaining burials in this location since the number of burials had been declining with distance from the center of the larger site. Thus it was surprising when the GPR detected an anomaly that turned out to be the deepest burial in the whole site. In fact, it was so deeply buried that the standard shovel test pitting method might not have detected it. It proved to be a very well-preserved individual, with fairly intact garments. In addition to the burial site, the GPR was used to image a number of "strandlines" as well as other deep (>1m) features in this area. These correspond in depth and orientation to two partial Ipiutak features which have been exposed and recorded in the erosion face in two separate field seasons. It was not possible to test to that depth, but subsequent coastal erosion has exposed additional strandline debris at the depth and location predicted by the GPR data. Two- and three-dimensional images of these features will be presented, along with a detailed technical description of the GPR methods used in this environment.

  6. Bone modelling at fresh extraction sockets: immediate implant placement versus spontaneous healing: an experimental study in the beagle dog.

    PubMed

    Vignoletti, Fabio; Discepoli, Nicola; Müller, Anna; de Sanctis, Massimo; Muñoz, Fernando; Sanz, Mariano

    2012-01-01

    The purpose of this investigation is to describe histologically the undisturbed healing of fresh extraction sockets when compared to immediate implant placement. In eight beagle dogs, after extraction of the 3P3 and 4P4, implants were inserted into the distal sockets of the premolars, while the mesial sockets were left to heal spontaneously. Each animal provided four socket sites (control) and four implant sites (test). After 6 weeks, animals were sacrificed and tissue blocks were dissected and prepared for ground sectioning. The relative vertical buccal bone resorption in relation to the lingual bone was similar in both test and control groups. At immediate implant sites, however, the absolute buccal bone loss observed was 2.32 (SD 0.36) mm, which may indicate that while an apical shift of both the buccal and lingual bone crest occurred at the implant sites, this may not happen in naturally healing sockets. The results from this investigation showed that after tooth extraction the buccal socket wall underwent bone resorption at both test and control sites. This resorption appeared to be more pronounced at the implant sites, although the limitations of the histological evaluation method utilized preclude a definite conclusion. © 2011 John Wiley & Sons A/S.

  7. Criteria for site selection and frequency allocation (keynote paper), part 5

    NASA Technical Reports Server (NTRS)

    Rottger, J.

    1985-01-01

    Technical aspects of mesosphere-stratosphere-troposphere (MST) radar site and frequency selection are discussed, and recommendations on site selection are presented. Interference tests should be conducted before a site is selected. A small directional antenna may be suitable for simulating the sidelobe sensitivity of a radar; however, sophisticated data-processing methods make overall system sensitivity extremely good, so the use of the complete data system to look for interference is recommended. Frequency allocation remains difficult: these radars will make almost continuous use of the 40 to 60 MHz band, which is allocated to other services.

  8. Hydrogeology and water quality in the Graces Quarters area of Aberdeen Proving Ground, Maryland

    USGS Publications Warehouse

    Tenbus, Frederick J.; Blomquist, Joel D.

    1995-01-01

    Graces Quarters was used for open-air testing of chemical-warfare agents from the late 1940's until 1971. Testing and disposal activities have resulted in the contamination of ground water and surface water. The hydrogeology and water quality were examined at three test areas, four disposal sites, a bunker, and a service area on Graces Quarters. Methods of investigation included surface and borehole geophysics, water-quality sampling, water-level measurement, and hydrologic testing. The hydrogeologic framework is complex and consists of a discontinuous surficial aquifer, one or more upper confining units, and a confined aquifer system. Directions of ground-water flow vary spatially and temporally, and results of site investigations show that ground-water flow is controlled by the geology of the area. The ground water and surface water at Graces Quarters generally are unmineralized; the ground water is mildly acidic (median pH is 5.38) and poorly buffered. Inorganic constituents in excess of certain Federal drinking-water regulations and ambient water-quality criteria were detected at some sites, but they probably were present naturally. Volatile and semivolatile organic compounds were detected in the ground water and surface water at seven of the nine sites that were investigated. Concentrations of organic compounds at two of the nine sites exceeded Federal drinking-water regulations. Volatile compounds in concentrations as high as 6,000 µg/L (micrograms per liter) were detected in the ground water at the site known as the primary test area. Concentrations of volatile compounds detected in the other areas ranged from 0.57 to 17 µg/L.

  9. 40 CFR 146.90 - Testing and monitoring requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... information about the geologic sequestration project, including injection rate and volume, geology, the... site-specific geology, that such methods are not appropriate; (h) The Director may require surface air...

  10. 40 CFR 146.90 - Testing and monitoring requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... information about the geologic sequestration project, including injection rate and volume, geology, the... site-specific geology, that such methods are not appropriate; (h) The Director may require surface air...

  11. 40 CFR 146.90 - Testing and monitoring requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... information about the geologic sequestration project, including injection rate and volume, geology, the... site-specific geology, that such methods are not appropriate; (h) The Director may require surface air...

  12. 40 CFR 146.90 - Testing and monitoring requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... information about the geologic sequestration project, including injection rate and volume, geology, the... site-specific geology, that such methods are not appropriate; (h) The Director may require surface air...

  13. Understanding the Geometry of Connected Fracture Flow with Multiperiod Oscillatory Hydraulic Tests.

    PubMed

    Sayler, Claire; Cardiff, Michael; Fort, Michael D

    2018-03-01

    An understanding of the spatial and hydraulic properties of fast preferential flow pathways in the subsurface is necessary in applications ranging from contaminant fate and transport modeling to design of energy extraction systems. One method for the characterization of fracture properties over interwellbore scales is Multiperiod Oscillatory Hydraulic (MOH) testing, in which the aquifer response to oscillatory pressure stimulations is observed. MOH tests were conducted on isolated intervals of wells in siliciclastic and carbonate aquifers in southern Wisconsin. The goal was to characterize the spatial properties of discrete fractures over interwellbore scales. MOH tests were conducted on two discrete fractured intervals intersecting two boreholes at one field site, and a nest of three piezometers at another field site. Fracture diffusivity estimates were obtained using analytical solutions that relate diffusivity to observed phase lag and amplitude decay. In addition, MOH tests were used to investigate the spatial extent of flow using different conceptual models of fracture geometry. Results indicated that fracture geometry at both field sites can be approximated by permeable two-dimensional fracture planes, oriented near-horizontally at one site, and near-vertically at the other. The technique used on MOH field data to characterize fracture geometry shows promise in revealing fracture network characteristics important to groundwater flow and transport. © 2017, National Ground Water Association.
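
    For a homogeneous one-dimensional diffusion model, a sinusoidal pressure signal of angular frequency ω decays as exp(-r√(ω/2D)) and lags by r√(ω/2D) radians over a distance r, so hydraulic diffusivity can be back-calculated from either the observed amplitude ratio or the phase lag. The sketch below implements those two textbook estimators; it is a simplified stand-in, not the specific analytical solutions used in the study, and the numbers are hypothetical.

```python
import math

def diffusivity_from_phase(r_m, period_s, phase_lag_rad):
    """D from phase lag: phi = r*sqrt(omega/(2D))  =>  D = omega*r^2 / (2*phi^2)."""
    omega = 2.0 * math.pi / period_s
    return omega * r_m**2 / (2.0 * phase_lag_rad**2)

def diffusivity_from_amplitude(r_m, period_s, amp_ratio):
    """D from amplitude decay: A/A0 = exp(-r*sqrt(omega/(2D)))."""
    omega = 2.0 * math.pi / period_s
    k = -math.log(amp_ratio) / r_m          # attenuation coefficient sqrt(omega/(2D))
    return omega / (2.0 * k**2)

# Hypothetical interwellbore test: 10 m spacing, 60 s oscillation period,
# observed phase lag of 0.8 rad and amplitude ratio of 0.45.
print(diffusivity_from_phase(10.0, 60.0, 0.8))
print(diffusivity_from_amplitude(10.0, 60.0, 0.45))
```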

  14. Prospective Evaluation of Multimodal Optical Imaging with Automated Image Analysis to Detect Oral Neoplasia In Vivo.

    PubMed

    Quang, Timothy; Tran, Emily Q; Schwarz, Richard A; Williams, Michelle D; Vigneswaran, Nadarajah; Gillenwater, Ann M; Richards-Kortum, Rebecca

    2017-10-01

    The 5-year survival rate for patients with oral cancer remains low, in part because diagnosis often occurs at a late stage. Early and accurate identification of oral high-grade dysplasia and cancer can help improve patient outcomes. Multimodal optical imaging is an adjunctive diagnostic technique in which autofluorescence imaging is used to identify high-risk regions within the oral cavity, followed by high-resolution microendoscopy to confirm or rule out the presence of neoplasia. Multimodal optical images were obtained from 206 sites in 100 patients. Histologic diagnosis, either from a punch biopsy or an excised surgical specimen, was used as the gold standard for all sites. Histopathologic diagnoses of moderate dysplasia or worse were considered neoplastic. Images from 92 sites in the first 30 patients were used as a training set to develop automated image analysis methods for identification of neoplasia. Diagnostic performance was evaluated prospectively using images from 114 sites in the remaining 70 patients as a test set. In the training set, multimodal optical imaging with automated image analysis correctly classified 95% of nonneoplastic sites and 94% of neoplastic sites. Among the 56 sites in the test set that were biopsied, multimodal optical imaging correctly classified 100% of nonneoplastic sites and 85% of neoplastic sites. Among the 58 sites in the test set that corresponded to a surgical specimen, multimodal imaging correctly classified 100% of nonneoplastic sites and 61% of neoplastic sites. These findings support the potential of multimodal optical imaging to aid in the early detection of oral cancer. Cancer Prev Res; 10(10); 563-70. ©2017 AACR . ©2017 American Association for Cancer Research.

  15. Detection of possible restriction sites for type II restriction enzymes in DNA sequences.

    PubMed

    Gagniuc, P; Cimponeriu, D; Ionescu-Tîrgovişte, C; Mihai, Andrada; Stavarachi, Monica; Mihai, T; Gavrilă, L

    2011-01-01

    To take a step forward in understanding the mechanisms operating in complex polygenic disorders such as diabetes and obesity, this paper proposes a new algorithm (PRSD - possible restriction site detection) and its implementation in the Applied Genetics software. This software can be used for in silico detection of potential (hidden) recognition sites for endonucleases and for nucleotide repeat identification. The recognition sites for endonucleases may result from hidden sequences through deletion or insertion of a specific number of nucleotides. Tests were conducted on DNA sequences downloaded from NCBI servers using specific recognition sites for common type II restriction enzymes introduced in the software database (n = 126). Each possible recognition site indicated by the PRSD algorithm implemented in Applied Genetics was checked and confirmed by NEBcutter V2.0 and Webcutter 2.0 software. In the sequence NG_008724.1 (which includes 63632 nucleotides) we found a high number of potential restriction sites for EcoRI that may be produced by deletion (n = 43 sites) or insertion (n = 591 sites) of one nucleotide. The second module of Applied Genetics has been designed to find simple repeat sizes, which may help in understanding the role of SNPs (Single Nucleotide Polymorphisms) in the pathogenesis of complex metabolic disorders. We tested for the presence of simple repetitive sequences in five DNA sequences. The software indicated the exact position of each repeat detected in the tested sequences. Future development of Applied Genetics can provide an alternative to powerful tools used to search for restriction sites or repetitive sequences, or to improve genotyping methods.
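
    The core idea of a PRSD-style scan (finding recognition sites that would appear after a single-nucleotide deletion or insertion) can be illustrated with a brute-force string search. The sketch below is not the published algorithm, only a minimal illustration; EcoRI's GAATTC site is used as the query and the DNA fragment is hypothetical.

```python
BASES = "ACGT"

def sites_after_single_deletion(seq, site):
    """Positions where deleting one nucleotide creates an exact recognition site."""
    hits = []
    n = len(site)
    for i in range(len(seq) - n):
        window = seq[i:i + n + 1]               # one base longer than the site
        for j in range(len(window)):
            if window[:j] + window[j + 1:] == site:
                hits.append((i, j))             # (window start, deleted offset)
                break
    return hits

def sites_after_single_insertion(seq, site):
    """Positions where inserting one nucleotide creates an exact recognition site."""
    hits = []
    n = len(site)
    for i in range(len(seq) - n + 2):
        window = seq[i:i + n - 1]               # one base shorter than the site
        for j in range(n):
            for b in BASES:
                if window[:j] + b + window[j:] == site:
                    hits.append((i, j, b))      # (window start, insert offset, base)
    return hits

dna = "TTGACGAATCTCCGGAATCAGGATTCAA"   # hypothetical fragment
print(sites_after_single_deletion(dna, "GAATTC"))
print(sites_after_single_insertion(dna, "GAATTC"))
```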

  16. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  17. Full-vector geomagnetic field records from the East Eifel, Germany

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn W. L.; Langemeijer, Jaap; Wiarda, Laura R.; Dekkers, Mark J.; Biggin, Andy J.; Hurst, Elliot A.; Groot, Lennart V. de

    2018-01-01

    To create meaningful models of the geomagnetic field, high-quality directional and intensity input data are needed. However, while it is fairly straightforward to obtain directional data, intensity data are much scarcer, especially for periods before the Holocene. Here, we present data from twelve flows (age range ∼ 200 to ∼ 470 ka) in the East Eifel volcanic field (Germany). These sites had been previously studied and are resampled to further test the recently proposed multi-method palaeointensity approach. Samples are first subjected to classic palaeomagnetic and rock magnetic analyses to optimise the subsequent palaeointensity experiments. Four different palaeointensity methods - IZZI-Thellier, the multispecimen method, calibrated pseudo-Thellier, and microwave-Thellier - are being used in the present study. The latter should be considered as supportive because only one or two specimens per site could be processed. Palaeointensities obtained for ten sites pass our selection criteria: two sites are successful with a single approach, four sites with two approaches, three more sites work with three approaches, and one site with all four approaches. Site-averaged intensity values typically range between 30 and 35 μT. No typically low palaeointensity values are found, in line with paleodirectional results which are compatible with regular palaeosecular variation of the Earth's magnetic field. Results from different methods are remarkably consistent and generally agree well with the values previously reported. They appear to be below the average for the Brunhes chron; there are no indications for relatively higher palaeointensities for units younger than 300 ka. However, our young sites could be close in age, and therefore may not represent the average intensity of the paleofield. Three of our sites are even considered coeval; encouragingly, these do yield the same palaeointensity within uncertainty bounds.

  18. LBSizeCleav: improved support vector machine (SVM)-based prediction of Dicer cleavage sites using loop/bulge length.

    PubMed

    Bao, Yu; Hayashida, Morihiro; Akutsu, Tatsuya

    2016-11-25

    Dicer is necessary for the process of mature microRNA (miRNA) formation because the Dicer enzyme cleaves pre-miRNA correctly to generate miRNA with correct seed regions. Nonetheless, the mechanism underlying the selection of a Dicer cleavage site is still not fully understood. To date, several studies have been conducted to solve this problem, for example, a recent discovery indicates that the loop/bulge structure plays a central role in the selection of Dicer cleavage sites. In accordance with this breakthrough, a support vector machine (SVM)-based method called PHDCleav was developed to predict Dicer cleavage sites which outperforms other methods based on random forest and naive Bayes. PHDCleav, however, tests only whether a position in the shift window belongs to a loop/bulge structure. In this paper, we used the length of loop/bulge structures (in addition to their presence or absence) to develop an improved method, LBSizeCleav, for predicting Dicer cleavage sites. To evaluate our method, we used 810 empirically validated sequences of human pre-miRNAs and performed fivefold cross-validation. In both 5p and 3p arms of pre-miRNAs, LBSizeCleav showed greater prediction accuracy than PHDCleav did. This result suggests that the length of loop/bulge structures is useful for prediction of Dicer cleavage sites. We developed a novel algorithm for feature space mapping based on the length of a loop/bulge for predicting Dicer cleavage sites. The better performance of our method indicates the usefulness of the length of loop/bulge structures for such predictions.

  19. The Joint Experiment for Crop Assessment and Monitoring (JECAM) Initiative: Developing methods and best practices for global agricultural monitoring

    NASA Astrophysics Data System (ADS)

    Champagne, C.; Jarvis, I.; Defourny, P.; Davidson, A.

    2014-12-01

    Agricultural systems differ significantly throughout the world, making a 'one size fits all' approach to remote sensing and monitoring of agricultural landscapes problematic. The Joint Experiment for Crop Assessment and Monitoring (JECAM) was established in 2009 to bring together the global scientific community to work towards a set of best practices and recommendations for using earth observation data to map, monitor and report on agricultural productivity globally across an array of diverse agricultural systems. These methods form the research and development component of the Group on Earth Observation Global Agricultural Monitoring (GEOGLAM) initiative to harmonize global monitoring efforts and increase market transparency. The JECAM initiative brings together researchers from a large number of globally distributed, well monitored agricultural test sites that cover a range of crop types, cropping systems and climate regimes. Each test site works independently as well as together across multiple sites to test methods, sensors and field data collection techniques to derive key agricultural parameters, including crop type, crop condition, crop yield and soil moisture. The outcome of this project will be a set of best practices that cover the range of remote sensing monitoring and reporting needs, including satellite data acquisition, pre-processing techniques, information retrieval and ground data validation. These outcomes provide the research and development foundation for GEOGLAM and will help to inform the development of the GEOGLAM "system of systems" for global agricultural monitoring. The outcomes of the 2014 JECAM science meeting will be discussed as well as examples of methods being developed by JECAM scientists.

  20. A non-intrusive screening methodology for environmental hazard assessment at waste disposal sites for water resources protection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simons, B.A.; Woldt, W.E.; Jones, D.D.

    The environmental and health risks posed by unregulated waste disposal sites are potential concerns of Pacific Rim regions and island areas because of the need to protect aquifers and other valuable water resources. A non-intrusive screening methodology to determine site characteristics including possible soil and/or groundwater contamination, areal extent of waste, etc. is being developed and tested at waste disposal sites in Nebraska. This type of methodology would be beneficial to Pacific Rim regions in investigating and/or locating unknown or poorly documented contamination areas for hazard assessment and groundwater protection. Traditional assessment methods are generally expensive, time consuming, and potentially exacerbate the problem. Ideally, a quick and inexpensive assessment method to reliably characterize these sites is desired. Electromagnetic (EM) conductivity surveying and soil-vapor sampling techniques, combined with innovative three-dimensional geostatistical methods are used to map the data to develop a site characterization of the subsurface and to aid in tracking any contaminant plumes. The EM data is analyzed to determine/estimate the extent and volume of waste and/or leachate. Soil-vapor data are analyzed to estimate a site's volatile organic compound (VOC) emission rate to the atmosphere. The combined information could then be incorporated as one part of an overall hazard assessment system.

  1. Analysis and Prediction of Myristoylation Sites Using the mRMR Method, the IFS Method and an Extreme Learning Machine Algorithm.

    PubMed

    Wang, ShaoPeng; Zhang, Yu-Hang; Huang, GuoHua; Chen, Lei; Cai, Yu-Dong

    2017-01-01

    Myristoylation is an important hydrophobic post-translational modification that is covalently bound to the amino group of Gly residues on the N-terminus of proteins. The many diverse functions of myristoylation on proteins, such as membrane targeting, signal pathway regulation and apoptosis, are largely due to the lipid modification, whereas abnormal or irregular myristoylation on proteins can lead to several pathological changes in the cell. To better understand the function of myristoylated sites and to correctly identify them in protein sequences, this study conducted a novel computational investigation on identifying myristoylation sites in protein sequences. A training dataset with 196 positive and 84 negative peptide segments was obtained. Four types of features derived from the peptide segments following the myristoylation sites were used to specify myristoylated and non-myristoylated sites. Then, feature selection methods including maximum relevance and minimum redundancy (mRMR), incremental feature selection (IFS), and a machine learning algorithm (extreme learning machine method) were adopted to extract optimal features for the algorithm to identify myristoylation sites in protein sequences, thereby building an optimal prediction model. As a result, 41 key features were extracted and used to build an optimal prediction model. The effectiveness of the optimal prediction model was further validated by its performance on a test dataset. Furthermore, detailed analyses were also performed on the extracted 41 features to gain insight into the mechanism of myristoylation modification. This study provided a new computational method for identifying myristoylation sites in protein sequences. We believe that it can be a useful tool to predict myristoylation sites from protein sequences. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
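
    A minimal sketch of the incremental feature selection (IFS) step is given below, using scikit-learn. Features are first ranked (here by mutual information, as a simple stand-in for the mRMR ranking), then nested feature subsets are scored by cross-validation and the best-performing subset size is kept. The classifier is a plain logistic regression rather than an extreme learning machine, and the data are randomly generated, so this illustrates the procedure only.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(280, 30))                 # hypothetical peptide-derived features
y = rng.integers(0, 2, size=280)               # hypothetical myristoylation labels

# Rank features once (stand-in for the mRMR ranking used in the paper).
ranking = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]

# Incremental feature selection: grow the feature set in ranked order and
# keep the subset size that maximises cross-validated accuracy.
scores = []
for k in range(1, len(ranking) + 1):
    subset = X[:, ranking[:k]]
    clf = LogisticRegression(max_iter=1000)
    scores.append(cross_val_score(clf, subset, y, cv=5).mean())

best_k = int(np.argmax(scores)) + 1
print(f"best subset size: {best_k}, CV accuracy: {scores[best_k - 1]:.3f}")
```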

  2. Comparison of U.S. Geological Survey and Ohio Environmental Protection Agency fish-collection methods using the index of biotic integrity and modified index of well-being, 1996-97

    USGS Publications Warehouse

    Covert, S. Alex

    2001-01-01

    The U.S. Geological Survey (USGS) and Ohio Environmental Protection Agency (OEPA) collected data on fish from 10 stream sites in 1996 and 3 stream sites in 1997 as part of a comparative study of fish community assessment methods. The sites sampled represent a wide range of basin sizes (ranging from 132 to 6,330 square kilometers) and surrounding land-use types (urban, agricultural, and mixed). Each agency used its own fish-sampling protocol. Using the Index of Biotic Integrity and Modified Index of Well-Being, differences between data sets were tested for significance by means of the Wilcoxon signed-ranks test (α = 0.05). Results showed that the median of Index of Biotic Integrity differences between data sets was not significantly different from zero (p = 0.2521); however, the same statistical test showed the median differences in the Modified Index of Well-Being scores to be significantly different from zero (p = 0.0158). The differences observed in the Index of Biotic Integrity scores are likely due to natural variability, increased variability at sites with degraded water quality, differences in sampling methods, and low-end adjustments in the Index of Biotic Integrity calculation when fewer than 50 fish were collected. The Modified Index of Well-Being scores calculated by OEPA were significantly higher than those calculated by the USGS. This finding was attributed to the comparatively large numbers and biomass of fish collected by the OEPA. By combining the two indices and viewing them in terms of the percentage attainment of Ohio Warmwater Habitat criteria, the two agencies' data seemed comparable, although the Index of Biotic Integrity scores were more similar than the Modified Index of Well-Being scores.
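
    The paired comparison described above is the standard use case for the Wilcoxon signed-rank test. A minimal sketch with SciPy is shown below; the paired site scores are hypothetical, not the USGS/OEPA values.

```python
from scipy.stats import wilcoxon

# Hypothetical paired index scores at the same sites from two sampling protocols.
scores_agency_a = [38, 42, 30, 26, 44, 35, 29, 41, 33, 37]
scores_agency_b = [36, 45, 31, 24, 46, 38, 27, 43, 36, 35]

stat, p_value = wilcoxon(scores_agency_a, scores_agency_b)
print(f"Wilcoxon statistic={stat}, p={p_value:.4f}")
# A p-value above 0.05 would indicate no significant median difference
# between the two methods' scores at the alpha = 0.05 level.
```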

  3. Rapid Estimation of TPH Reduction in Oil-Contaminated Soils Using the MED Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edenborn, H.M.; Zenone, V.A.

    2007-09-01

    Oil-contaminated soil and sludge generated during federal well plugging activities in northwestern Pennsylvania are currently remediated on small landfarm sites in lieu of more expensive landfill disposal. Bioremediation success at these sites in the past has been gauged by the decrease in total petroleum hydrocarbon (TPH) concentrations to less than 10,000 mg/kg measured using EPA Method 418.1. We tested the “molarity of ethanol droplet” (MED) water repellency test as a rapid indicator of TPH concentration in soil at one landfarm near Bradford, PA. MED was estimated by determining the minimum ethanol concentration (0 – 6 M) required to penetrate air-dried and sieved soil samples within 10 sec. TPH in soil was analyzed by rapid fluorometric analysis of methanol soil extracts, which correlated well with EPA Method 1664. Uncontaminated landfarm site soil amended with increasing concentrations of waste oil sludge showed a high correlation between MED and TPH. MED values exceeded the upper limit of 6 M as TPH estimates exceeded ca. 25,000 mg/kg. MED and TPH at the landfarm were sampled monthly during summer months over two years in a grid pattern that allowed spatial comparisons of site remediation effectiveness. MED and TPH decreased at a constant rate over time and remained highly correlated. Inexpensive alternatives to reagent-grade ethanol gave comparable results. The simple MED approach served as an inexpensive alternative to the routine laboratory analysis of TPH during the monitoring of oily waste bioremediation at this landfarm site.
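
    The reported MED-TPH relationship is essentially a calibration problem: regress TPH on the MED value over the range where MED is not saturated (below roughly 6 M / ca. 25,000 mg/kg, per the abstract). The sketch below fits such a calibration line with SciPy; the paired values are hypothetical, not the Bradford landfarm data.

```python
from scipy.stats import linregress

# Hypothetical calibration pairs: MED (molarity of ethanol droplet, M) vs TPH (mg/kg),
# restricted to the unsaturated MED range.
med = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
tph = [2500, 4800, 9200, 13500, 17800, 21500, 25500]

fit = linregress(med, tph)
print(f"TPH ~ {fit.slope:.0f} * MED + {fit.intercept:.0f}  (r = {fit.rvalue:.3f})")

# Predict TPH for a new field sample with MED = 2.5 M (hypothetical).
print(fit.slope * 2.5 + fit.intercept)
```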

  4. The impact evaluation of soil liquefaction on low-rise building in the Meinong earthquake

    NASA Astrophysics Data System (ADS)

    Lu, Chih-Chieh; Hwang, Jin-Hung; Hsu, Shang-Yi

    2017-08-01

    This paper presents major preliminary observations on the liquefaction-induced damage in the Meinong earthquake (ML = 6.4). The severe damage to buildings was centered on Huian and Sanmin Streets in Tainan City, in areas that were reclaimed fish or farm ponds many decades ago and have poor construction quality. To better understand the effect of soil liquefaction at these sites, information from 13 in situ Standard Penetration Test boreholes and 5 Cone Penetration Test soundings, together with PGAs derived from nearby seismographs, was used to evaluate soil liquefaction during the Meinong earthquake with the Seed method (Seed et al. in J Geotech Eng ASCE 111(12):1425-1445, 1985). The liquefaction potential index (LPI) was then evaluated accordingly. The results showed that the estimated damage severity was not consistent with field conditions unless the local site effect was taken into account. To better reflect the site response, the sites' PGAs in the PGA contour map were multiplied by 1.5 to quantify the amplification caused by the soft geological conditions. PGAs based on other simple approaches were also evaluated for comparison. The effects of fines content and the magnitude scaling factor are also discussed. Finally, several common simplified methods were used to calculate the LPI for the Meinong earthquake in order to evaluate the applicability of these simplified methods.
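
    For context, the liquefaction potential index of Iwasaki et al. integrates the factor-of-safety deficit over the top 20 m of soil with a depth weight w(z) = 10 − 0.5z. The sketch below shows that calculation for a discretised profile; the factor-of-safety values are hypothetical, and the Seed-method FS computation itself is not reproduced here.

```python
def liquefaction_potential_index(depths_m, fs_values, dz_m):
    """Iwasaki-type LPI: integrate F(z)*w(z) over 0-20 m, where
    F = 1 - FS when FS < 1 (else 0) and w(z) = 10 - 0.5*z."""
    lpi = 0.0
    for z, fs in zip(depths_m, fs_values):
        if z > 20.0:
            break
        severity = max(0.0, 1.0 - fs)
        weight = 10.0 - 0.5 * z
        lpi += severity * weight * dz_m
    return lpi

# Hypothetical 1 m layers from a CPT/SPT-based Seed-method evaluation.
depths = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5]
fs = [1.3, 0.9, 0.7, 0.6, 0.8, 1.1, 1.4, 1.6]
print(liquefaction_potential_index(depths, fs, dz_m=1.0))
```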

  5. Assessment of the announced North Korean nuclear test using long-range atmospheric transport and dispersion modelling.

    PubMed

    De Meutter, Pieter; Camps, Johan; Delcloo, Andy; Termonia, Piet

    2017-08-18

    On 6 January 2016, the Democratic People's Republic of Korea announced to have conducted its fourth nuclear test. Analysis of the corresponding seismic waves from the Punggye-ri nuclear test site showed indeed that an underground man-made explosion took place, although the nuclear origin of the explosion needs confirmation. Seven weeks after the announced nuclear test, radioactive xenon was observed in Japan by a noble gas measurement station of the International Monitoring System. In this paper, atmospheric transport modelling is used to show that the measured radioactive xenon is compatible with a delayed release from the Punggye-ri nuclear test site. An uncertainty quantification on the modelling results is given by using the ensemble method. The latter is important for policy makers and helps advance data fusion, where different nuclear Test-Ban-Treaty monitoring techniques are combined.

  6. Remote geologic structural analysis of Yucca Flat

    NASA Astrophysics Data System (ADS)

    Foley, M. G.; Heasler, P. G.; Hoover, K. A.; Rynes, N. J.; Thiessen, R. L.; Alfaro, J. L.

    1991-12-01

    The Remote Geologic Analysis (RGA) system was developed by Pacific Northwest Laboratory (PNL) to identify crustal structures that may affect seismic wave propagation from nuclear tests. Using automated methods, the RGA system identifies all valleys in a digital elevation model (DEM), fits three-dimensional vectors to valley bottoms, and catalogs all potential fracture or fault planes defined by coplanar pairs of valley vectors. The system generates a cluster hierarchy of planar features having greater-than-random density that may represent areas of anomalous topography manifesting structural control of erosional drainage development. Because RGA uses computer methods to identify zones of hypothesized control of topography, ground truth using a well-characterized test site was critical in our evaluation of RGA's characterization of inaccessible test sites for seismic verification studies. Therefore, we applied RGA to a study area centered on Yucca Flat at the Nevada Test Site (NTS) and compared our results with both mapped geology and geologic structures and with seismic yield-magnitude models. This is the final report of PNL's RGA development project for peer review within the U.S. Department of Energy Office of Arms Control (OAC) seismic-verification community. In this report, we discuss the Yucca Flat study area, the analytical basis of the RGA system and its application to Yucca Flat, the results of the analysis, and the relation of the analytical results to known topography, geology, and geologic structures.

  7. Incidence of childhood leukaemia and non-Hodgkin's lymphoma in the vicinity of nuclear sites in Scotland, 1968-93.

    PubMed Central

    Sharp, L; Black, R J; Harkness, E F; McKinney, P A

    1996-01-01

    OBJECTIVES: The primary aims were to investigate the incidence of leukaemia and non-Hodgkin's lymphoma in children resident near seven nuclear sites in Scotland and to determine whether there was any evidence of a gradient in risk with distance of residence from a nuclear site. A secondary aim was to assess the power of statistical tests for increased risk of disease near a point source when applied in the context of census data for Scotland. METHODS: The study data set comprised 1287 cases of leukaemia and non-Hodgkin's lymphoma diagnosed in children aged under 15 years in the period 1968-93, validated for accuracy and completeness. A study zone around each nuclear site was constructed from enumeration districts within 25 km. Expected numbers were calculated, adjusting for sex, age, and indices of deprivation and urban-rural residence. Six statistical tests were evaluated. Stone's maximum likelihood ratio (unconditional application) was applied as the main test for general increased incidence across a study zone. The linear risk score based on enumeration districts (conditional application) was used as a secondary test for declining risk with distance from each site. RESULTS: More cases were observed (O) than expected (E) in the study zones around Rosyth naval base (O/E 1.02), Chapelcross electricity generating station (O/E 1.08), and Dounreay reprocessing plant (O/E 1.99). The maximum likelihood ratio test reached significance only for Dounreay (P = 0.030). The linear risk score test did not indicate a trend in risk with distance from any of the seven sites, including Dounreay. CONCLUSIONS: There was no evidence of a generally increased risk of childhood leukaemia and non-Hodgkin's lymphoma around nuclear sites in Scotland, nor any evidence of a trend of decreasing risk with distance from any of the sites. There was a significant excess risk in the zone around Dounreay, which was only partially accounted for by the sociodemographic characteristics of the area. The statistical power of tests for localised increased risk of disease around a point source should be assessed in each new setting in which they are applied. PMID:8994402
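
    A simple way to gauge whether an observed case count around a site exceeds expectation is a Poisson exceedance probability, P(X ≥ O) with mean E. This is only a rough screening check, not Stone's maximum likelihood ratio test or the linear risk score test used in the study, and the counts below are hypothetical rather than the Scottish data.

```python
from scipy.stats import poisson

def poisson_excess_p(observed, expected):
    """One-sided p-value for observing at least `observed` cases when `expected` is the mean."""
    return poisson.sf(observed - 1, expected)   # P(X >= observed)

# Hypothetical observed/expected counts for two 25 km study zones.
for obs, exp in [(14, 12.5), (9, 4.5)]:
    print(obs, exp, f"p = {poisson_excess_p(obs, exp):.3f}")
```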

  8. Virtual Seismic Observation (VSO) with Sparsity-Promotion Inversion

    NASA Astrophysics Data System (ADS)

    Tiezhao, B.; Ning, J.; Jianwei, M.

    2017-12-01

    Large station spacing leads to low-resolution images and can prevent imaging of regions of interest. Sparsity-promotion inversion, a useful method for recovering missing data in industrial field acquisition, can be borrowed to interpolate seismic data at unsampled sites, forming Virtual Seismic Observations (VSOs). Traditional sparsity-promotion inversion struggles when arrival times differ greatly between adjacent sites, which is the case we are most concerned with; we therefore use a shift method to improve it. The interpolation procedure is as follows: we first apply a low-pass filter to obtain long-wavelength waveform data and shift the waveforms of the same wave in different seismograms to nearly the same arrival time. We then use wavelet-transform-based sparsity-promotion inversion to interpolate waveform data at the unsampled sites, filling a phase into each missing trace. Finally, we shift the waveforms back to their original arrival times. We call this method FSIS (Filtering, Shift, Interpolation, Shift) interpolation. In this way, different virtually observed seismic phases can be inserted at unsampled sites to obtain dense seismic observations. To test the method, we randomly hide the real data at a site and use the remaining data to interpolate the observation at that site, using either direct interpolation or FSIS. Compared with directly interpolated data, data interpolated with FSIS preserve amplitude better. The results also show that the arrival times and waveforms of the VSOs closely match the real data, which convinces us that the method of forming VSOs is applicable. In this way, we can provide the data needed for advanced seismic techniques such as RTM to illuminate shallow structures.
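
    As a rough illustration of the sparsity-promotion step, the sketch below interpolates missing traces in a gather by iterative thresholding (a POCS-style scheme). It substitutes a 2-D Fourier transform for the wavelet transform named in the abstract and omits the filtering and time-shift (FSIS) steps, so it is a simplified stand-in rather than the authors' method.

```python
import numpy as np

def pocs_interpolate(gather, live_mask, n_iter=50, p_max=99.0, p_min=50.0):
    """Fill dead traces in a 2-D gather (time x trace) by iterative thresholding.

    A Fourier-domain stand-in for the wavelet-based sparsity-promotion inversion
    described in the abstract; the FSIS filtering and time-shift steps are omitted.
    live_mask: boolean array, True for recorded (sampled) traces.
    """
    data = np.where(live_mask, gather, 0.0)
    x = data.copy()
    for k in range(n_iter):
        spec = np.fft.fft2(x)
        # Shrinking percentile threshold: keep only the strongest coefficients first
        pct = p_max - (p_max - p_min) * k / max(n_iter - 1, 1)
        thr = np.percentile(np.abs(spec), pct)
        spec[np.abs(spec) < thr] = 0.0
        x = np.real(np.fft.ifft2(spec))
        x[live_mask] = gather[live_mask]       # re-insert the observed traces
    return x

# Toy example: a dipping event with every third trace missing
nt, nx = 256, 48
t = np.arange(nt)[:, None]
gather = np.sin(2 * np.pi * (t - 2.0 * np.arange(nx)) / 40.0)
mask = np.ones((nt, nx), dtype=bool)
mask[:, ::3] = False                           # unsampled sites
recovered = pocs_interpolate(gather, mask)
print("residual on missing traces:", np.abs(recovered[~mask] - gather[~mask]).mean())
```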

  9. Marking multi-channel silicon-substrate electrode recording sites using radiofrequency lesions.

    PubMed

    Brozoski, Thomas J; Caspary, Donald M; Bauer, Carol A

    2006-01-30

    Silicon-substrate multi-channel electrodes (multiprobes) have proven useful in a variety of electrophysiological tasks. When using multiprobes it is often useful to identify the site of each channel, e.g., when recording single-unit activity from a heterogeneous structure. Lesion marking of electrode sites has been used for many years. Electrolytic, or direct current (DC) lesions, have been used successfully to mark multiprobe sites in rat hippocampus [Townsend G, Peloquin P, Kloosterman F, Hetke JF, Leung LS. Recording and marking with silicon multichannel electrodes. Brain Res Brain Res Protoc 2002;9:122-9]. The present method used radio-frequency (rf) lesions to distinctly mark each of the 16 recording sites of 16-channel linear array multiprobes, in chinchilla inferior colliculus. A commercial radio-frequency lesioner was used as the current source, in conjunction with custom connectors adapted to the multiprobe configuration. In vitro bench testing was used to establish current-voltage-time parameters, as well as to check multiprobe integrity and radio-frequency performance. In in vivo application, visualization of individual-channel multiprobe recording sites was clear in 21 out of 33 sets of collicular serial-sections (i.e., probe tracks) obtained from acute experimental subjects, i.e., maximum post-lesion survival time of 2h. Advantages of the rf method include well-documented methods of in vitro calibration as well as low impact on probe integrity. The rf method of marking individual-channel sites should be useful in a variety of applications.

  10. New methods for tightly regulated gene expression and highly efficient chromosomal integration of cloned genes for Methanosarcina species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guss, Adam M.; Rother, Michael; Zhang, Jun Kai

    A highly efficient method for chromosomal integration of cloned DNA into Methanosarcina spp. was developed utilizing the site-specific recombination system from the Streptomyces phage φC31. Host strains expressing the φC31 integrase gene and carrying an appropriate recombination site can be transformed with non-replicating plasmids carrying the complementary recombination site at efficiencies similar to those obtained with self-replicating vectors. We have also constructed a series of hybrid promoters that combine the highly expressed M. barkeri PmcrB promoter with binding sites for the tetracycline-responsive, bacterial TetR protein. These promoters are tightly regulated by the presence or absence of tetracycline in strains that express the tetR gene. The hybrid promoters can be used in genetic experiments to test gene essentiality by placing a gene of interest under their control. Thus, growth of strains with tetR-regulated essential genes becomes tetracycline-dependent. A series of plasmid vectors that utilize the site-specific recombination system for construction of reporter gene fusions and for tetracycline-regulated expression of cloned genes are reported. These vectors were used to test the efficiency of translation at a variety of start codons. Fusions using an ATG start site were the most active, whereas those using GTG and TTG were approximately one half or one fourth as active, respectively. The CTG fusion was 95% less active than the ATG fusion.

  11. New methods for tightly regulated gene expression and highly efficient chromosomal integration of cloned genes for Methanosarcina species

    DOE PAGES

    Guss, Adam M.; Rother, Michael; Zhang, Jun Kai; ...

    2008-01-01

    A highly efficient method for chromosomal integration of cloned DNA into Methanosarcina spp. was developed utilizing the site-specific recombination system from the Streptomyces phage φC31. Host strains expressing the φC31 integrase gene and carrying an appropriate recombination site can be transformed with non-replicating plasmids carrying the complementary recombination site at efficiencies similar to those obtained with self-replicating vectors. We have also constructed a series of hybrid promoters that combine the highly expressed M. barkeri PmcrB promoter with binding sites for the tetracycline-responsive, bacterial TetR protein. These promoters are tightly regulated by the presence or absence of tetracycline in strains that express the tetR gene. The hybrid promoters can be used in genetic experiments to test gene essentiality by placing a gene of interest under their control. Thus, growth of strains with tetR-regulated essential genes becomes tetracycline-dependent. A series of plasmid vectors that utilize the site-specific recombination system for construction of reporter gene fusions and for tetracycline-regulated expression of cloned genes are reported. These vectors were used to test the efficiency of translation at a variety of start codons. Fusions using an ATG start site were the most active, whereas those using GTG and TTG were approximately one half or one fourth as active, respectively. The CTG fusion was 95% less active than the ATG fusion.

  12. A composite docking approach for the identification and characterization of ectosteric inhibitors of cathepsin K.

    PubMed

    Law, Simon; Panwar, Preety; Li, Jody; Aguda, Adeleke H; Jamroz, Andrew; Guido, Rafael V C; Brömme, Dieter

    2017-01-01

    Cathepsin K (CatK) is a cysteine protease that plays an important role in mammalian intra- and extracellular protein turnover and is known for its unique and potent collagenase activity. Through studies on the mechanism of its collagenase activity, selective ectosteric sites were identified that are remote from the active site. Inhibitors targeting these ectosteric sites are collagenase selective and do not interfere with other proteolytic activities of the enzyme. Potential ectosteric inhibitors were identified using a computational approach to screen both the druggable subset and the entire 281,987-compound Chemical Repository library of the National Cancer Institute Developmental Therapeutics Program (NCI-DTP). Compounds were scored based on their affinity for the ectosteric site. Here we compared the scores of three individual molecular docking methods with a composite score of all three methods together. The composite docking method was up to five-fold more effective at identifying potent collagenase inhibitors (IC50 < 20 μM) than the individual methods. Of 160 top compounds tested in enzymatic assays, 28 compounds blocked the collagenase activity of CatK at 100 μM. Two compounds exhibited IC50 values below 5 μM, corresponding to a molar protease:inhibitor ratio of <1:12. Both compounds were subsequently tested in osteoclast bone resorption assays, where the most potent inhibitor, 10-[2-[bis(2-hydroxyethyl)amino]ethyl]-7,8-diethylbenzo[g]pteridine-2,4-dione (NSC-374902), displayed an inhibition of bone resorption with an IC50 value of approximately 300 nM and no cell toxicity effects.
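
    The abstract does not state how the three docking scores were combined, so the sketch below shows one generic consensus scheme: rank compounds within each method and average the ranks. The score table and the lower-is-better convention are assumptions for illustration only.

```python
import numpy as np

def composite_rank_score(score_table):
    """Combine docking scores from several methods into one consensus ranking.

    score_table: (n_compounds, n_methods) array where lower scores mean better
    predicted binding (a common docking convention, assumed here). The exact
    combination rule used in the study is not specified in the abstract; this
    sketch averages per-method ranks.
    """
    scores = np.asarray(score_table, dtype=float)
    ranks = scores.argsort(axis=0).argsort(axis=0)   # 0 = best within each method
    return ranks.mean(axis=1)                        # lower composite rank = better

# Illustrative scores for 5 hypothetical compounds docked with 3 methods
table = np.array([[-9.1, -55.2, -7.8],
                  [-6.3, -40.1, -6.9],
                  [-8.7, -58.0, -8.1],
                  [-5.9, -35.5, -5.2],
                  [-7.4, -47.3, -7.0]])
composite = composite_rank_score(table)
print("best compound index:", int(np.argmin(composite)))
```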

  13. A composite docking approach for the identification and characterization of ectosteric inhibitors of cathepsin K

    PubMed Central

    Law, Simon; Panwar, Preety; Li, Jody; Aguda, Adeleke H.; Jamroz, Andrew; Guido, Rafael V. C.

    2017-01-01

    Cathepsin K (CatK) is a cysteine protease that plays an important role in mammalian intra- and extracellular protein turnover and is known for its unique and potent collagenase activity. Through studies on the mechanism of its collagenase activity, selective ectosteric sites were identified that are remote from the active site. Inhibitors targeting these ectosteric sites are collagenase selective and do not interfere with other proteolytic activities of the enzyme. Potential ectosteric inhibitors were identified using a computational approach to screen both the druggable subset and the entire 281,987-compound Chemical Repository library of the National Cancer Institute Developmental Therapeutics Program (NCI-DTP). Compounds were scored based on their affinity for the ectosteric site. Here we compared the scores of three individual molecular docking methods with a composite score of all three methods together. The composite docking method was up to five-fold more effective at identifying potent collagenase inhibitors (IC50 < 20 μM) than the individual methods. Of 160 top compounds tested in enzymatic assays, 28 compounds blocked the collagenase activity of CatK at 100 μM. Two compounds exhibited IC50 values below 5 μM, corresponding to a molar protease:inhibitor ratio of <1:12. Both compounds were subsequently tested in osteoclast bone resorption assays, where the most potent inhibitor, 10-[2-[bis(2-hydroxyethyl)amino]ethyl]-7,8-diethylbenzo[g]pteridine-2,4-dione (NSC-374902), displayed an inhibition of bone resorption with an IC50 value of approximately 300 nM and no cell toxicity effects. PMID:29088253

  14. Problems in specimen collection for sexually transmitted diseases.

    PubMed

    Larsen, B

    1985-03-01

    Laboratory methods for the diagnosis of sexually transmitted diseases (STDs) are continuously undergoing improvement. It remains the responsibility of the clinician to become familiar with the tests available for the diagnosis of STDs. Those tests depend on obtaining clinical specimens from the proper site and on transporting them to the laboratory under satisfactory conditions.

  15. 78 FR 68076 - Request for Information on Alternative Skin Sensitization Test Methods and Testing Strategies and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... INFORMATION: Background: Allergic contact dermatitis (ACD), a skin reaction characterized by localized redness, swelling, blistering, or itching after direct contact with a skin allergen, is an important public health.... Web site: http://ntp.niehs.nih.gov/go/niceatm . FOR FURTHER INFORMATION CONTACT: Dr. Warren S. Casey...

  16. Regional Longleaf Pine (Pinus palustris) Natural Regeneration

    Treesearch

    William D. Boyer

    1998-01-01

    Duration: 1968-present Objective: Test the shelterwood system of longleaf pine natural regeneration. Methods: Longleaf pine natural regeneration tests were established from 1966 through 1970 at ten locations in seven states from North Carolina to Louisiana. One of these was established on a 50-acre flatwoods site on Eglin AFB in 1968. Regeneration was initially...

  17. ANALYSIS OF A GAS-PHASE PARTITIONING TRACER TEST CONDUCTED IN AN UNSATURATED FRACTURED-CLAY FORMATION

    EPA Science Inventory

    The gas-phase partitioning tracer method was used to estimate non-aqueous phase liquid (NAPL), water, and air saturations in the vadose zone at a chlorinated-solvent contaminated field site in Tucson, AZ. The tracer test was conducted in a fractured-clay system that is the confin...

  18. A Laboratory Exercise for Compatibility Testing of Hazardous Wastes in an Environmental Analysis Course.

    ERIC Educational Resources Information Center

    Chang, J. C.; And Others

    1986-01-01

    Discusses a new program at the University of Michigan in hazardous waste management. Describes a laboratory demonstration that deals with the reactivity and potential violence of several reactions that may be encountered on a hazardous waste site. Provides criteria for selecting particular compatibility testing methods. (TW)

  19. Reliability and Validity Testing of the Physical Resilience Measure

    ERIC Educational Resources Information Center

    Resnick, Barbara; Galik, Elizabeth; Dorsey, Susan; Scheve, Ann; Gutkin, Susan

    2011-01-01

    Objective: The purpose of this study was to test reliability and validity of the Physical Resilience Scale. Methods: A single-group repeated measure design was used and 130 older adults from three different housing sites participated. Participants completed the Physical Resilience Scale, Hardy-Gill Resilience Scale, 14-item Resilience Scale,…

  20. Vacuum decay container/closure integrity testing technology. Part 2. Comparison to dye ingress tests.

    PubMed

    Wolf, Heinz; Stauffer, Tony; Chen, Shu-Chen Y; Lee, Yoojin; Forster, Ronald; Ludzinski, Miron; Kamat, Madhav; Mulhall, Brian; Guazzo, Dana Morton

    2009-01-01

    Part 1 of this series demonstrated that a container closure integrity test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method using a VeriPac 325/LV vacuum decay leak tester by Packaging Technologies & Inspection, LLC (PTI) is capable of detecting leaks ≥ 5.0 µm (nominal diameter) in rigid, nonporous package systems, such as prefilled glass syringes. The current study compared USP, Ph.Eur. and ISO dye ingress integrity test methods to PTI's vacuum decay technology for the detection of these same 5-, 10-, and 15-µm laser-drilled hole defects in 1-mL glass prefilled syringes. The study was performed at three test sites using several inspectors and a variety of inspection conditions. No standard dye ingress method was found to reliably identify all holed syringes. Modifications to these standard dye tests' challenge conditions increased the potential for dye ingress, and adjustments to the visual inspection environment improved dye ingress detection. However, the risk of false positive test results with dye ingress tests remained. In contrast, the nondestructive vacuum decay leak test method reliably identified syringes with holes ≥ 5.0 µm.

  1. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    EPA Pesticide Factsheets

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  2. 76 FR 49737 - Takes of Marine Mammals Incidental to Specified Activities; Marine Geophysical Survey in the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-11

    ... stock(s) for subsistence uses (where relevant). The authorization must set forth the permissible methods... shooting a test pattern over an ocean bottom instrument in shallow water. This method is neither practical nor valid in water depths as great as 3,000 m (9,842.5 ft). The alternative method of conducting site...

  3. Development of the SAFE Checklist Tool for Assessing Site-Level Threats to Child Protection: Use of Delphi Methods and Application to Two Sites in India

    PubMed Central

    Betancourt, Theresa S.; Zuilkowski, Stephanie S.; Ravichandran, Arathi; Einhorn, Honora; Arora, Nikita; Bhattacharya Chakravarty, Aruna; Brennan, Robert T.

    2015-01-01

    Background The child protection community is increasingly focused on developing tools to assess threats to child protection and the basic security needs and rights of children and families living in adverse circumstances. Although tremendous advances have been made to improve measurement of individual child health status or household functioning for use in low-resource settings, little attention has been paid to a more diverse array of settings in which many children in adversity spend time and how context contributes to threats to child protection. The SAFE model posits that insecurity in any of the following fundamental domains threatens security in the others: Safety/freedom from harm; Access to basic physiological needs and healthcare; Family and connection to others; Education and economic security. Site-level tools are needed in order to monitor the conditions that can dramatically undermine or support healthy child growth, development and emotional and behavioral health. From refugee camps and orphanages to schools and housing complexes, site-level threats exist that are not well captured by commonly used measures of child health and well-being or assessments of single households (e.g., SDQ, HOME). Methods The present study presents a methodology and the development of a scale for assessing site-level child protection threats in various settings of adversity. A modified Delphi panel process was enhanced with two stages of expert review in core content areas as well as review by experts in instrument development, and field pilot testing. Results Field testing in two diverse sites in India—a construction site and a railway station—revealed that the resulting SAFE instrument was sensitive to the differences between the sites from the standpoint of core child protection issues. PMID:26540159

  4. Benchmarking passive seismic methods of estimating the depth of velocity interfaces down to ~300 m

    NASA Astrophysics Data System (ADS)

    Czarnota, Karol; Gorbatov, Alexei

    2016-04-01

    In shallow passive seismology it is generally accepted that the spatial autocorrelation (SPAC) method is more robust than the horizontal-over-vertical spectral ratio (HVSR) method at resolving the depth to surface-wave velocity (Vs) interfaces. Here we present results of a field test of these two methods over ten drill sites in western Victoria, Australia. The target interface is the base of Cenozoic unconsolidated to semi-consolidated clastic and/or carbonate sediments of the Murray Basin, which overlie Paleozoic crystalline rocks. Depths of this interface intersected in drill holes are between ~27 m and ~300 m. Seismometers were deployed in a three-arm spiral array, with a radius of 250 m, consisting of 13 Trillium Compact 120 s broadband instruments. Data were acquired at each site for 7-21 hours. The Vs architecture beneath each site was determined through nonlinear inversion of HVSR and SPAC data using the neighbourhood algorithm, implemented in the geopsy modelling package (Wathelet, 2005, GRL v35). The HVSR technique yielded depth estimates of the target interface (Vs > 1000 m/s) generally within ±20% error. Successful estimates were even obtained at a site with an inverted velocity profile, where Quaternary basalts overlie Neogene sediments which in turn overlie the target basement. Half of the SPAC estimates showed significantly higher errors than were obtained using HVSR. Joint inversion provided the most reliable estimates but was unstable at three sites. We attribute the surprising success of HVSR over SPAC to a low content of transient signals within the seismic record caused by low levels of anthropogenic noise at the benchmark sites. At a few sites SPAC waveform curves showed clear overtones suggesting that more reliable SPAC estimates may be obtained utilizing a multi-modal inversion. Nevertheless, our study indicates that reliable basin thickness estimates in the Australian conditions tested can be obtained utilizing HVSR data from a single seismometer, without a priori knowledge of the surface-wave velocity of the basin material, thereby negating the need to deploy cumbersome arrays.
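
    A minimal sketch of the HVSR computation is given below, using the geometric mean of the two horizontal amplitude spectra divided by the vertical spectrum, which is one common convention. Window averaging, smoothing, and the neighbourhood-algorithm inversion performed in geopsy are omitted, and the synthetic example is illustrative only.

```python
import numpy as np

def hvsr(ns, ew, v, dt):
    """Horizontal-over-vertical spectral ratio from three-component ambient noise.

    ns, ew, v: equal-length arrays (north-south, east-west, vertical); dt: sample
    interval in seconds. Uses the geometric mean of the horizontal amplitude
    spectra, one common convention; window averaging and smoothing are omitted.
    """
    ns, ew, v = (np.asarray(a, dtype=float) for a in (ns, ew, v))
    freqs = np.fft.rfftfreq(ns.size, d=dt)
    a_ns = np.abs(np.fft.rfft(ns))
    a_ew = np.abs(np.fft.rfft(ew))
    a_v = np.abs(np.fft.rfft(v)) + 1e-12          # avoid division by zero
    ratio = np.sqrt(a_ns * a_ew) / a_v
    return freqs, ratio

# Synthetic example: horizontals carry an amplified 2 Hz component
dt, n = 0.01, 4096
t = np.arange(n) * dt
rng = np.random.default_rng(0)
v = rng.standard_normal(n)
ns = rng.standard_normal(n) + 3.0 * np.sin(2 * np.pi * 2.0 * t)
ew = rng.standard_normal(n) + 3.0 * np.sin(2 * np.pi * 2.0 * t)
f, r = hvsr(ns, ew, v, dt)
print("peak H/V near (Hz):", f[np.argmax(r[1:]) + 1])
```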

  5. Influence of Stepped Osteotomy on Primary Stability of Implants Inserted in Low-Density Bone Sites: An In Vitro Study.

    PubMed

    Degidi, Marco; Daprile, Giuseppe; Piattelli, Adriano

    The aims of this study were to evaluate the ability of a stepped osteotomy to improve dental implant primary stability in low-density bone sites and to investigate possible correlations between primary stability parameters. The study was performed on fresh humid bovine bone classified as type III. The test group consisted of 30 Astra Tech EV implants inserted following the protocol provided by the manufacturer. The first control group consisted of 30 Astra Tech EV implants inserted in sites without the underpreparation of the apical portion. The second control group consisted of 30 Astra Tech TX implants inserted following the protocol provided by the manufacturer. Implant insertion was performed at the predetermined 30 rpm. The insertion torque data were recorded and exported as a curve; using a trapezoidal integration technique, the area underlying the curve was calculated: this area represents the variable torque work (VTW). Peak insertion torque (pIT) and resonance frequency analysis (RFA) were also recorded. A Mann-Whitney test showed that the mean VTW was significantly higher in the test group compared with the first control and second control groups; furthermore, statistical analysis showed that pIT also was significantly higher in the test group compared with the first and second control groups. Analyzing RFA values, only the difference between the test group and second control group showed statistical significance. Pearson correlation analysis showed a very strong positive correlation between pIT and VTW values in all groups; furthermore, it showed a positive correlation between pIT and RFA values and between VTW and RFA values only in the test group. Within the limitations of an in vitro study, the results show that stepped osteotomy can be a viable method to improve implant primary stability in low-density bone sites, and that, when a traditional osteotomy method is performed, RFA presents no correlation with pIT and VTW.
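
    The variable torque work described above is simply the area under the recorded insertion torque curve obtained by trapezoidal integration; a minimal sketch follows. The choice of insertion angle as the integration variable and all numbers are assumptions for illustration.

```python
import numpy as np

def variable_torque_work(angle_rad, torque_ncm):
    """Area under the insertion torque curve by trapezoidal integration.

    angle_rad: cumulative insertion angle (radians); torque_ncm: torque samples.
    Integrating torque over angle gives work; the abstract does not state the
    independent variable used, so insertion angle is assumed here.
    """
    return np.trapz(torque_ncm, angle_rad)

# Illustrative curve: torque rising during insertion at 30 rpm
t = np.linspace(0.0, 20.0, 200)                # seconds
angle = 2.0 * np.pi * (30.0 / 60.0) * t        # 30 rpm -> 0.5 rev/s
torque = 5.0 + 1.5 * t                         # N*cm, made-up ramp
print("VTW ≈", variable_torque_work(angle, torque), "N*cm*rad")
```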

  6. Development of radon-222 as a natural tracer for monitoring the remediation of NAPL contamination in the subsurface. 1998 annual progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Semprini, L.; Istok, J.

    The objective of this research is to develop a unique method of using naturally occurring radon-222 as a tracer for locating and quantitatively describing the presence of subsurface NAPL contamination. The research will evaluate using radon as an inexpensive, yet highly accurate, means of detecting NAPL contamination and assessing the effectiveness of NAPL remediation. Laboratory, field, and modeling studies are being performed to evaluate this technique, and to develop methods for its successful implementation in practice. This report summarizes work that has been accomplished after 1 year of a 3-year project. The research to date has included radon tracer tests in physical aquifer models (PAMs) and field studies at Site 300 of the Lawrence Livermore National Laboratory, CA, and Site 100D at the Hanford DOE Facility, WA. The PAM tests have evaluated the ability of radon as a tracer to monitor the remediation of TCE NAPL contamination using surfactant treatment and oxidation with permanganate. The surfactant tests were performed in collaboration with Dr. Jack Istok and Dr. Jennifer Field and their EMSP project "In-situ, Field-Scale Evaluation of Surfactant Enhanced DNAPL Recovery Using a Single-Well Push-Pull Test." This collaboration enabled the EMSP radon project to make rapid progress. The PAM surfactant tests were performed in a radial flow geometry to simulate the push-pull method that is being developed for surfactant field tests. The radon tests were easily incorporated into these experiments, since they simply rely on measuring the natural radon present in the subsurface fluids. Two types of radon tests were performed: (1) static tests, in which radon was permitted to build up to steady-state concentrations in the pore fluids and the groundwater concentrations were monitored, and (2) dynamic tests, in which the radon response during push-pull surfactant tests was measured. Both methods were found to be useful in determining how NAPL remediation was progressing.

  7. Terrestrial Eco-Toxicological Tests as Screening Tool to Assess Soil Contamination in Krompachy Area

    NASA Astrophysics Data System (ADS)

    Ol'ga, Šestinová; Findoráková, Lenka; Hančuľák, Jozef; Fedorová, Erika; Tomislav, Špaldon

    2016-10-01

    In this study, we present a screening tool for assessing heavy metal inputs to agricultural and permanent grassland soils in Krompachy. The study applies ecotoxicity tests, a terrestrial plant test (a modification of OECD 208, the Phytotoxkit microbiotest on Sinapis alba) and chronic earthworm tests (Dendrobaena veneta, a modification of OECD Guideline 317 for the Testing of Chemicals, Bioaccumulation in Terrestrial Oligochaetes), as practical and sensitive screening methods for assessing the effects of heavy metals in Krompachy soils. Total Cu, Zn, As, Pb and Hg concentrations and eco-toxicological tests were determined for soils from 4 sampling sites in the Krompachy area in 2015. An influence of sampling-site distance from the copper smelter on the absolute concentrations of metals was recorded for copper, lead, zinc, arsenic and mercury. The highest concentrations of these metals were detected at the sampling sites up to 3 km from the copper smelter. The soil samples were used to assess phytotoxic effects. Total earthworm mortality was assessed using a chronic toxicity test after 7 days of exposure. The results of our study confirmed that no mortality was observed in any of the study soils. Based on the phytotoxicity testing, phytotoxic effects on S. alba seeds were observed for the metal-contaminated soils from samples 3KR (7-9).

  8. A method for assessing the intrinsic value and management potentials of geomorphosites

    NASA Astrophysics Data System (ADS)

    Reynard, Emmanuel; Amandine, Perret; Marco, Buchmann; Jonathan, Bussard; Lucien, Grangier; Simon, Martin

    2014-05-01

    In 2007, we proposed a method for assessing the scientific and additional values of geomorphosites (Reynard et al., 2007). The evaluation methodology was divided into two steps: the evaluation of the scientific value of pre-selected sites, based on several criteria (rareness, integrity, representativeness, interest for reconstructing the regional morphogenesis), and the assessment of a set of so-called additional values (aesthetic, economic, ecological, and cultural). The method has proved to be quite robust and easy to use. The tests carried out in several geomorphological contexts allowed us to improve the implementation process of the method by refining the criteria used to assess the various values of selected sites. Nevertheless, two main problems remained unsolved: (1) the selection of sites was not clear and not really systematic; (2) some additional values, in particular the economic value, were difficult to assess, and others not considered in the method could have been evaluated (e.g. the educational value of sites). These factors prompted a series of modifications of the method, which are presented in this poster. First, the assessment procedure was divided into two main steps: (1) the evaluation of the intrinsic value, in two parts (the scientific and additional values, the latter limited to three kinds of values: cultural, ecological, aesthetic); (2) the documentation of the present use and management of the site, also divided into two parts: the sensitivity of the site (allowing us to assess the need for protection), and a series of factors influencing the (tourist) use of the site (visit conditions, educational interest, economic value). Second, a procedure was developed to select the potential geomorphosites, that is, the sites worth assessing with the evaluation method. The method was then tested in four regions in the Swiss and French Alps: the Chablais area (Switzerland, France), the Hérens valley (Switzerland), the Moesano valley (Switzerland), where a national park project is in preparation, and the Gruyère - Pays-d'Enhaut Regional Nature Park (Switzerland). The main conclusion of the research is that even if full objectivity in the evaluation process is difficult to reach, transparency is essential at three stages: (1) the selection of potential geomorphosites: it is important to develop criteria and a method for establishing a list of potential geomorphosites; in this study, we propose to carry out the selection by crossing two dimensions: a spatial one (the selection should reflect the regional geo(morpho)diversity) and a temporal one (the selection should allow reconstructing the regional geomorphological history); (2) the assessment of the intrinsic value of the selected geomorphosites, by establishing clear criteria for carrying out the evaluation; (3) the development of a clear management strategy oriented to the protection and tourist promotion of the sites and based on precise documentation of management potentials and needs, according to the assessment objectives. Reference: Reynard, E., Fontana, G., Kozlik, L., Scapozza, C. (2007). A method for assessing the scientific and additional values of geomorphosites, Geogr. Helv. 62(3), 148-158.

  9. A simplified 4-site economical intradermal post-exposure rabies vaccine regimen: a randomised controlled comparison with standard methods.

    PubMed

    Warrell, Mary J; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J; Fooks, Anthony R; Audry, Laurent; Brookes, Sharon M; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J; Warrell, David A

    2008-04-23

    The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. Controlled-Trials.com ISRCTN 30087513.

  10. Comparison of water absorption methods: testing the water absorption of recently quarried and weathered porous limestone on site and under laboratory conditions

    NASA Astrophysics Data System (ADS)

    Rozgonyi-Boissinot, Nikoletta; Agárdi, Tamás; Karolina Cebula, Ágnes; Török, Ákos

    2017-04-01

    The water absorption of weathering-sensitive stones is a critical parameter that influences durability. The current paper compares different methods of water absorption testing by using on-site and laboratory tests. The aims of the tests were to assess the water absorption of un-weathered quarry stones and of various weathering forms occurring on porous limestone monuments. For the tests, a Miocene porous limestone was used that occurs in Central and Western Hungary, especially near and in Budapest. Besides the Hungarian occurrences, the same or very similar porous limestones are found in Austria, Slovakia and the Czech Republic, where several quarries were operating. Due to its high workability, the stone has been intensively used as a construction material from the Roman period onward. The most prominent monuments made of this stone were built in Vienna and Budapest during the 18th-19th centuries and in the early 20th century. The high porosity and the micro-fabric of the stone make it prone to frost and salt weathering. Three different limestone types were tested, representing coarse-, medium- and fine-grained lithologies. The test methods included Rilem tube (Karsten tube) tests and capillary water absorption tests; the latter methodology is described in detail in EN 1925:2000. The results of on-site tests of weathered porous limestone clearly show that the water absorption of dissolved limestone surfaces and of crumbling or micro-cracked limestone is similar. The water absorption curves have similar slopes, marking a high amount of absorbed water. In contrast, stone blocks covered by white weathering crusts, as well as black crusts, have significantly lower water absorption, and many of these crusts are considered very tight, almost impermeable surfaces. Capillary water absorption tests in the laboratory allowed the determination of the maximum water absorption of quarried porous limestone. Specimens were placed in a 3 mm water column and the absorbed amount of water was recorded. The obtained water absorption values of 29-30 m%, compared with the 30-35 m% total porosity of the stone, clearly suggest that the pores can be saturated with water under standard barometric pressure and that the tested porous Miocene limestones are therefore very prone to salt attack.

  11. How Big Was It? Getting at Yield

    NASA Astrophysics Data System (ADS)

    Pasyanos, M.; Walter, W. R.; Ford, S. R.

    2013-12-01

    One of the most coveted pieces of information in the wake of a nuclear test is the explosive yield. Determining the yield from remote observations, however, is not necessarily a trivial thing. For instance, recorded observations of seismic amplitudes, used to estimate the yield, are significantly modified by the intervening media, which varies widely, and needs to be properly accounted for. Even after correcting for propagation effects such as geometrical spreading, attenuation, and station site terms, getting from the resulting source term to a yield depends on the specifics of the explosion source model, including material properties, and depth. Some formulas are based on assumptions of the explosion having a standard depth-of-burial and observed amplitudes can vary if the actual test is either significantly overburied or underburied. We will consider the complications and challenges of making these determinations using a number of standard, more traditional methods and a more recent method that we have developed using regional waveform envelopes. We will do this comparison for recent declared nuclear tests from the DPRK. We will also compare the methods using older explosions at the Nevada Test Site with announced yields, material and depths, so that actual performance can be measured. In all cases, we also strive to quantify realistic uncertainties on the yield estimation.
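
    Traditional teleseismic approaches of the kind mentioned above often rest on a magnitude-yield relation of the form mb = a + b·log10(W). The sketch below inverts such a relation for yield; the coefficients are illustrative placeholders rather than values from this work, and real estimates also require the propagation, depth-of-burial, and emplacement corrections discussed in the abstract.

```python
def yield_from_mb(mb, a=4.45, b=0.75):
    """Invert a generic magnitude-yield relation mb = a + b*log10(W) for W (kt).

    The coefficients a and b are illustrative placeholders; in practice they are
    calibrated per test site and corrected for attenuation, depth of burial,
    and emplacement material, as discussed in the abstract.
    """
    return 10.0 ** ((mb - a) / b)

for mb in (4.5, 5.0, 5.5):
    print(f"mb {mb}: ~{yield_from_mb(mb):.1f} kt (illustrative coefficients)")
```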

  12. Nuclear gauge application in road industry

    NASA Astrophysics Data System (ADS)

    Azmi Ismail, Mohd

    2017-11-01

    Soil compaction is essential in road construction. The evaluation of the degree of compaction relies on knowledge of the density and moisture of the compacted layers, which are very important to the performance of the pavement structure. Among the various tests used for making these determinations, the sand replacement density test and moisture content determination by oven drying are perhaps the most widely used. However, these methods are not only time consuming, requiring tedious procedures to obtain results, but also destructive, and the number of measurements that can be taken at any time is limited. Test results can only be fed back to the construction site the next day. To solve these problems, a nuclear technique has been introduced as a quicker and easier way of measuring the density and moisture of construction materials. Nuclear moisture-density gauges have been used for many years in pavement construction as a method of non-destructive density testing. The technique, which can determine both wet density and moisture content, offers an in situ method for construction control at the work site. Its simplicity, speed, and non-destructive nature offer a great advantage for quality control. This paper provides an overview of nuclear gauge application in road construction and presents a case study of monitoring compaction status in the Sedenak-Skudai, Johor rehabilitation project.
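
    In practice the gauge readings of wet density and moisture content are converted to a dry density and compared against a laboratory maximum (for example from a Proctor test) to give the degree of compaction; a minimal sketch, with made-up numbers, follows.

```python
def dry_density(wet_density, moisture_content):
    """Dry density from gauge readings: rho_d = rho_wet / (1 + w)."""
    return wet_density / (1.0 + moisture_content)

def relative_compaction(wet_density, moisture_content, max_dry_density):
    """Degree of compaction (%) against a laboratory maximum dry density,
    e.g. from a Proctor test; acceptance limits vary by specification."""
    return 100.0 * dry_density(wet_density, moisture_content) / max_dry_density

# Illustrative gauge reading: 2.15 Mg/m3 wet density at 8 % moisture,
# checked against a 2.02 Mg/m3 laboratory maximum dry density
print(f"{relative_compaction(2.15, 0.08, 2.02):.1f} % compaction")
```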

  13. Development of a standardized differential-reflective bioassay for microbial pathogens

    NASA Astrophysics Data System (ADS)

    Wilhelm, Jay; Auld, J. R. X.; Smith, James E.

    2008-04-01

    This research examines standardizing a method for the rapid, semi-automated identification of microbial contaminants. It introduces a method suited to food/water contamination testing, serology, urinalysis and saliva testing for any >1 micron sized molecule that can be exclusively bound to an identifying marker. This optical biosensor method seeks to integrate the semi-manual distribution of a collected sample onto a "transparent" substrate array of binding sites, which is then applied to a standard optical data disk and run for analysis. The detection of most microbe species is possible on this platform because their relative scale is greater than the resolution of the standard-scale digital information on a standard CD or DVD. This paper explains the critical first stage in the advance of this detection concept. This work has concentrated on developing the necessary software component needed to perform highly sensitive small-scale recognition using the standard optical disk as a detection platform. Physical testing has made significant progress in demonstrating the ability to utilize a standard optical drive for micro-scale detection through the exploitation of CIRC error correction. Testing has also shown a definable trend in the optimum scale and geometry of micro-arrayed attachment sites needed for the technology concept to succeed.

  14. Audit of Helicobacter pylori Testing in Microbiology Laboratories in England: To Inform Compliance with NICE Guidance and the Feasibility of Routine Antimicrobial Resistance Surveillance

    PubMed Central

    Allison, Rosalie; Lecky, Donna M.; Bull, Megan; Turner, Kim; Godbole, Gauri

    2016-01-01

    Introduction. The National Institute for Health and Clinical Excellence (NICE) guidance recommends that dyspeptic patients are tested for Helicobacter pylori using a urea breath test, stool antigen test, or serology. Antibiotic resistance in H. pylori is globally increasing, but treatment in England is rarely guided by susceptibility testing or surveillance. Aims. To determine compliance of microbiology laboratories in England with NICE guidance and whether laboratories perform culture and antibiotic susceptibility testing (AST). Methods. In 2015, 170 accredited English microbiology laboratories were surveyed, by email. Results. 121/170 (71%) laboratories responded; 96% provided H. pylori testing (78% on site). 94% provided H. pylori diagnosis using stool antigen; only four provided serology as their noninvasive test; 3/4 of these encouraged urea breath tests in their acute trusts. Only 22/94 (23%) of the laboratories performed H. pylori cultures from gastric biopsies on site; 9/22 performed AST, but the vast majority processed less than one specimen/week. Conclusions. Only five laboratories in England do not comply with NICE guidance; these will need the guidance reinforced. National surveillance needs to be implemented; culture-based AST would need to be centralised. Moving forward, detection of resistance in H. pylori from stool specimens using molecular methods (PCR) needs to be explored. PMID:27829836

  15. Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

    PubMed Central

    Steinwachs, Donald; Rubin, Haya R

    2003-01-01

    Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
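
    Lin's concordance correlation coefficient quoted above can be computed directly from the two raters' scores; a minimal sketch with made-up scores follows (the study data are not reproduced here).

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two raters' scores.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2),
    using population (biased) variance and covariance.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Made-up site scores (percent) from two raters, not the study data
rater_a = [15, 32, 48, 51, 60, 72, 95]
rater_b = [18, 30, 50, 49, 66, 70, 90]
print(f"Lin's CCC = {lins_ccc(rater_a, rater_b):.3f}")
```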

  16. Vacuum decay container/closure integrity testing technology. Part 1. ASTM F2338-09 precision and bias studies.

    PubMed

    Wolf, Heinz; Stauffer, Tony; Chen, Shu-Chen Y; Lee, Yoojin; Forster, Ronald; Ludzinski, Miron; Kamat, Madhav; Godorov, Phillip; Guazzo, Dana Morton

    2009-01-01

    ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method is applicable for leak-testing rigid and semi-rigid non-lidded trays; trays or cups sealed with porous barrier lidding materials; rigid, nonporous packages; and flexible, nonporous packages. Part 1 of this series describes the precision and bias studies performed in 2008 to expand this method's scope to include rigid, nonporous packages completely or partially filled with liquid. Round robin tests using three VeriPac 325/LV vacuum decay leak testers (Packaging Technologies & Inspection, LLC, Tuckahoe, NY) were performed at three test sites. Test packages were 1-mL glass syringes. Positive controls had laser-drilled holes in the barrel ranging from about 5 to 15 µm in nominal diameter. Two different leak test methods were performed at each site: a "gas leak test" performed at 250 mbar (absolute) and a "liquid leak test" performed at about 1 mbar (absolute). The gas leak test was used to test empty, air-filled syringes. All defects with holes ≥ 5.0 µm and all no-defect controls were correctly identified. The only false negative result was attributed to a single syringe with a < 5.0-µm hole. Tests performed using a calibrated air leak supported a 0.10 cm³·min⁻¹ (ccm) sensitivity limit (99/99 lower tolerance limit). The liquid leak test was used to test both empty, air-filled syringes and water-filled syringes. Test results were 100% accurate for all empty and water-filled syringes, both without holes and with holes (5, 10, and 15 µm). Tests performed using calibrated air flow leaks of 0, 0.05, and 0.10 ccm were also 100% accurate; data supported a 0.10-ccm sensitivity limit (99/99 lower tolerance limit). Quantitative differential pressure results strongly correlated to hole size using either liquid or gas vacuum decay leak tests. The higher vacuum liquid leak test gave noticeably higher pressure readings when water was present in the defect. Both the ASTM F2338-09 test method and the precision and bias study report are available by contacting ASTM International in West Conshohocken, PA, USA (www.astm.org).

  17. 3-D Characterization of Seismic Properties at the Smart Weapons Test Range, YPG

    NASA Astrophysics Data System (ADS)

    Miller, Richard D.; Anderson, Thomas S.; Davis, John C.; Steeples, Don W.; Moran, Mark L.

    2001-10-01

    The Smart Weapons Test Range (SWTR) lies within the Yuma Proving Ground (YPG), Arizona. SWTR is a new facility constructed specifically for the development and testing of futuristic intelligent battlefield sensor networks. In this paper, results are presented for an extensive high-resolution geophysical characterization study at the SWTR site along with validation using 3-D modeling. In this study, several shallow seismic methods and novel processing techniques were used to generate a 3-D grid of earth seismic properties, including compressional (P) and shear (S) body-wave speeds (Vp and Vs), and their associated body-wave attenuation parameters (Qp, and Qs). These experiments covered a volume of earth measuring 1500 m by 300 m by 25 m deep (11 million cubic meters), centered on the vehicle test track at the SWTR site. The study has resulted in detailed characterizations of key geophysical properties. To our knowledge, results of this kind have not been previously achieved, nor have the innovative methods developed for this effort been reported elsewhere. In addition to supporting materiel developers with important geophysical information at this test range, the data from this study will be used to validate sophisticated 3-D seismic signature models for moving vehicles.

  18. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In recent decades we have witnessed a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The state of the art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, which is probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs observed settlements was collected by Monaco et al. (2006), indicating that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates Beta probability distributions at research sites where DMT tests and observed settlements are available. References: Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B. Edelen, Ed., ASCE, Reston, VA, pp. 158-175. Failmezger, R.A., Till, P., Frizzell, J., Kight, S. (2015). "Redesign of shallow foundations using dilatometer tests - more case studies after DMT'06 conference", Proc. 2nd International Conference on the Flat Dilatometer, June 14-16 (paper accepted). Monaco, P., Totani, G., Calabrese, M. (2006). "DMT-predicted vs observed settlements: a review of the available experience". In "Flat Dilatometer Testing", Proc. 2nd International Conference on the Flat Dilatometer, Washington, D.C., USA, April 2-5, 244-252. R.A. Failmezger and J.B. Anderson (eds).
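
    As a rough illustration of the Beta-distribution idea, the sketch below fits a Beta by the method of moments to hypothetical DMT-predicted/observed settlement ratios rescaled to an assumed bounded interval. The bounds and ratios are invented for illustration and are not taken from the cited compilations.

```python
import numpy as np

def beta_moments_fit(values, lower, upper):
    """Method-of-moments Beta fit for a bounded design variable.

    values are rescaled to (0, 1) over [lower, upper]; returns (alpha, beta).
    A simple stand-in for the Beta-distribution approach of Failmezger et al.;
    the bounds and data below are illustrative, not from the cited studies.
    """
    x = (np.asarray(values, dtype=float) - lower) / (upper - lower)
    m, v = x.mean(), x.var()
    c = m * (1.0 - m) / v - 1.0
    return m * c, (1.0 - m) * c

# Hypothetical DMT-predicted / observed settlement ratios at several sites
ratios = [0.85, 0.95, 1.02, 1.10, 0.90, 1.05, 0.98]
alpha, beta = beta_moments_fit(ratios, lower=0.5, upper=1.5)
print(f"Beta parameters: alpha = {alpha:.2f}, beta = {beta:.2f}")
```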

  19. Unravelling the quality of HIV counselling and testing services in the private and public sectors in Zambia.

    PubMed

    Ron Levey, Ilana; Wang, Wenjuan

    2014-07-01

    Despite the substantial investment in providing HIV counselling and testing (VCT) services in Zambia, there has been little effort to systematically evaluate the quality of VCT services provided by various types of health providers. This study, conducted in 2009, examines VCT in the public and private sectors, including the private for-profit and NGO/faith-based sectors, in Copperbelt and Luapula. The study used five primary data collection methods to gauge the quality of VCT services: closed-ended interviews with clients exiting VCT sites; open-ended client interviews; interviews with facility managers; review of service statistics; and an observation of the physical environment for VCT by site. Over 400 clients and 87 facility managers were interviewed from almost 90 facilities. Sites were randomly selected and results are generalizable at the provincial level. The study shows concerning levels of underperformance in VCT services across the sectors. It reveals serious underperformance in counselling about key risk-reduction methods. Less than one-third of clients received counselling on reducing the number of sexual partners, and only approximately 5% of clients received counselling about disclosing test results to partners. In terms of client profiles, the NGO sector attracts the most educated clients, and less educated Zambians seek VCT services at very low rates (7%). The private for-profit sector performs as well as, or sometimes better than, the other sectors, even though it is not adequately integrated into the Zambian national response to HIV. The private for-profit sector provides VCT services on par in quality with the other sectors. Most clients did not receive counselling on partner reduction or disclosure of HIV test results to partners. In a generalized HIV epidemic where multiple concurrent sexual partnerships are a significant driver of transmission, risk-reduction methods and discussion should be a main focus of pre-test and post-test counselling. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  20. PrAS: Prediction of amidation sites using multiple feature extraction.

    PubMed

    Wang, Tong; Zheng, Wei; Wuyun, Qiqige; Wu, Zhenfeng; Ruan, Jishou; Hu, Gang; Gao, Jianzhao

    2017-02-01

    Amidation plays an important role in a variety of pathological processes and serious diseases like neural dysfunction and hypertension. However, identification of protein amidation sites through traditional experimental methods is time consuming and expensive. In this paper, we proposed a novel predictor for Prediction of Amidation Sites (PrAS), which is the first software package for academic users. The method incorporated four representative feature types, which are position-based features, physicochemical and biochemical properties features, predicted structure-based features and evolutionary information features. A novel feature selection method, positive contribution feature selection was proposed to optimize features. PrAS achieved AUC of 0.96, accuracy of 92.1%, sensitivity of 81.2%, specificity of 94.9% and MCC of 0.76 on the independent test set. PrAS is freely available at https://sourceforge.net/p/praspkg. Copyright © 2016 Elsevier Ltd. All rights reserved.
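
    The reported accuracy, sensitivity, specificity and MCC follow directly from the binary confusion matrix on the independent test set; a minimal sketch with made-up counts is shown below (this is not the PrAS code).

```python
import math

def binary_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, specificity and Matthews correlation coefficient
    from binary confusion-matrix counts (illustrative helper, not the PrAS code)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return accuracy, sensitivity, specificity, mcc

# Made-up counts for a small test set
acc, sens, spec, mcc = binary_metrics(tp=81, fp=5, tn=95, fn=19)
print(f"acc={acc:.3f} sens={sens:.3f} spec={spec:.3f} MCC={mcc:.3f}")
```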

  1. Risk based requirements for long term stewardship: A proof-of-principle analysis of an analytic method tested on selected Hanford locations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, T.T.; Andrews, W.B.; Buck, J.W.

    1998-03-01

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  2. Using ensemble of classifiers for predicting HIV protease cleavage sites in proteins.

    PubMed

    Nanni, Loris; Lumini, Alessandra

    2009-03-01

    The focus of this work is the use of ensembles of classifiers for predicting HIV protease cleavage sites in proteins. Due to the complex relationships in the biological data, several recent works show that often ensembles of learning algorithms outperform stand-alone methods. We show that the fusion of approaches based on different encoding models can be useful for improving the performance of this classification problem. In particular, in this work four different feature encodings for peptides are described and tested. An extensive evaluation on a large dataset according to a blind testing protocol is reported which demonstrates how different feature extraction methods and classifiers can be combined for obtaining a robust and reliable system. The comparison with other stand-alone approaches allows quantifying the performance improvement obtained by the ensembles proposed in this work.
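
    The abstract does not name the individual learners, so the sketch below illustrates a generic late-fusion ensemble: one classifier is trained per feature encoding and their predicted probabilities are averaged. The encodings, learner, and data are placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def ensemble_predict(encodings_train, y_train, encodings_test):
    """Fuse classifiers trained on different peptide encodings by averaging
    their predicted probabilities (a generic late-fusion sketch, not the
    specific encodings or learners used in the paper)."""
    probs = []
    for X_tr, X_te in zip(encodings_train, encodings_test):
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_train)
        probs.append(clf.predict_proba(X_te)[:, 1])
    fused = np.mean(probs, axis=0)
    return (fused >= 0.5).astype(int), fused

# Toy data: 2 hypothetical encodings of the same 100 peptides
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=100)
enc1 = rng.standard_normal((100, 8)) + y[:, None]        # weakly informative
enc2 = rng.standard_normal((100, 20)) + 0.5 * y[:, None]
labels, scores = ensemble_predict([enc1[:80], enc2[:80]], y[:80],
                                  [enc1[80:], enc2[80:]])
print("test accuracy:", (labels == y[80:]).mean())
```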

  3. The potential of seismic methods for detecting cavities and buried objects: experimentation at a test site

    NASA Astrophysics Data System (ADS)

    Grandjean, Gilles; Leparoux, Donatienne

    2004-06-01

    One of the recurring problems in civil engineering and landscape management is the detection of natural and man-made cavities in order to mitigate the problems of collapse and subsurface subsidence. In general, the position of the cavities is not known, either because they are not recorded in a database or because location maps are not available. In such cases, geophysical methods can provide an effective alternative for cavity detection, particularly ground-penetrating radar (GPR) and seismic methods, for which pertinent results have been recently obtained. Many studies carried out under real conditions have revealed that the signatures derived from interaction between seismic signals and voids are affected by complex geology, thus making them difficult to interpret. We decided to analyze this interaction under physical conditions as simple as possible, i.e., at a test site built specifically for that purpose. The test site was constructed of a homogeneous material and a void-equivalent body so that the ratio between wavelength and heterogeneity size was compatible with that encountered in reality. Numerical modeling was initially used to understand wave interaction with the body, prior to the design of various data-processing protocols. P-wave imagery and surface-wave sections were then acquired and processed. The work involved in this experiment and the associated results are presented, followed by a discussion concerning the reliability of such a study, and its consequences for future seismic projects.

  4. Incorporating availability/bioavailability in risk assessment and decision making of polluted sites, using Germany as an example.

    PubMed

    Kördel, Werner; Bernhardt, Cornelia; Derz, Kerstin; Hund-Rinke, Kerstin; Harmsen, Joop; Peijnenburg, Willie; Comans, Rob; Terytze, Konstantin

    2013-10-15

    Nearly all publications dealing with availability or bioavailability of soil pollutants start with the following statement: the determination of total pollutant content will lead to an over-estimation of risk. Accordingly, an assessment of contaminated sites should be based on the determination of the mobile fractions of pollutants, the fractions with potential for mobilisation that threaten groundwater and surface water, and the actual and potential fractions available for uptake by plants, soil microflora and soil organisms. After reviewing the literature for method proposals concerning the determination of available/bioavailable fractions of contaminants with respect to leaching, plants, microorganisms (biodegradation) and soil organisms, we propose a testing and assessment scheme for contaminated sites. The proposal includes (i) already accepted and used methods, (ii) methods which are under standardisation, and (iii) methods for which development has just started, in order to promote urgently needed research. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Are rapid population estimates accurate? A field trial of two different assessment methods.

    PubMed

    Grais, Rebecca F; Coulombier, Denis; Ampuero, Julia; Lucas, Marcelino E S; Barretto, Avertino T; Jacquier, Guy; Diaz, Francisco; Balandine, Serge; Mahoudeau, Claude; Brown, Vincent

    2006-09-01

    Emergencies resulting in large-scale displacement often lead to populations resettling in areas where basic health services and sanitation are unavailable. To plan relief-related activities quickly, rapid population size estimates are needed. The currently recommended Quadrat method estimates total population by extrapolating the average population size living in square blocks of known area to the total site surface. An alternative approach, the T-Square method, provides a population estimate based on analysis of the spatial distribution of housing units sampled throughout a site. We field-tested both methods and validated the results against a census in Esturro Bairro, Beira, Mozambique. Compared to the census (population: 9,479), the T-Square method yielded a better population estimate (9,523) than the Quadrat method (7,681; 95% confidence interval: 6,160-9,201), but was more difficult for field survey teams to implement. Although the results are directly applicable only to similar sites, several general conclusions can be drawn for emergency planning.
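
    For readers unfamiliar with the Quadrat approach, the sketch below illustrates the basic extrapolation it relies on (mean population per sampled block scaled to the full site area) together with a rough between-quadrat confidence interval. All numbers are invented for illustration; they are not the Mozambique data.

    ```python
    # Hedged illustration of the Quadrat extrapolation step (invented numbers).
    import math

    site_area_m2   = 300_000.0          # total surface of the settlement (assumed)
    quadrat_side_m = 25.0               # side of each sampled square block (assumed)
    counts = [14, 9, 22, 17, 11, 19, 8, 15, 13, 20]   # people counted per quadrat

    quadrat_area_m2 = quadrat_side_m ** 2
    mean_density    = sum(counts) / (len(counts) * quadrat_area_m2)   # people per m^2
    estimate        = mean_density * site_area_m2

    # Rough 95% CI from the between-quadrat variance of the counts.
    n = len(counts)
    mean_count = sum(counts) / n
    var_count  = sum((c - mean_count) ** 2 for c in counts) / (n - 1)
    se_total   = (site_area_m2 / quadrat_area_m2) * math.sqrt(var_count / n)
    print(round(estimate), round(estimate - 1.96 * se_total), round(estimate + 1.96 * se_total))
    ```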

  6. Testing high SPF sunscreens: a demonstration of the accuracy and reproducibility of the results of testing high SPF formulations by two methods and at different testing sites.

    PubMed

    Agin, Patricia Poh; Edmonds, Susan H

    2002-08-01

    The goals of this study were (i) to demonstrate that existing and widely used sun protection factor (SPF) test methodologies can produce accurate and reproducible results for high SPF formulations and (ii) to provide data on the number of test-subjects needed, the variability of the data, and the appropriate exposure increments needed for testing high SPF formulations. Three high SPF formulations were tested, according to the Food and Drug Administration's (FDA) 1993 tentative final monograph (TFM) 'very water resistant' test method and/or the 1978 proposed monograph 'waterproof' test method, within one laboratory. A fourth high SPF formulation was tested at four independent SPF testing laboratories, using the 1978 waterproof SPF test method. All laboratories utilized xenon arc solar simulators. The data illustrate that the testing conducted within one laboratory, following either the 1978 proposed or the 1993 TFM SPF test method, was able to reproducibly determine the SPFs of the formulations tested, using either the statistical analysis method in the proposed monograph or the statistical method described in the TFM. When one formulation was tested at four different laboratories, the anticipated variation in the data owing to the equipment and other operational differences was minimized through the use of the statistical method described in the 1993 monograph. The data illustrate that either the 1978 proposed monograph SPF test method or the 1993 TFM SPF test method can provide accurate and reproducible results for high SPF formulations. Further, these results can be achieved with panels of 20-25 subjects with an acceptable level of variability. Utilization of the statistical controls from the 1993 sunscreen monograph can help to minimize lab-to-lab variability for well-formulated products.

  7. Hydrologic testing of tight zones in southeastern New Mexico.

    USGS Publications Warehouse

    Dennehy, K.F.; Davis, P.A.

    1981-01-01

    Increased attention is being directed toward the investigation of tight zones in relation to the storage and disposal of hazardous wastes. Shut-in tests, slug tests, and pressure-slug tests are being used at the proposed Waste Isolation Pilot Plant site, New Mexico, to evaluate the fluid-transmitting properties of several zones above the proposed repository zone. All three testing methods were used in various combinations to obtain values for the hydraulic properties of the test zones. Multiple testing on the same zone produced similar results. -from Authors

  8. 21 CFR 118.8 - Testing methodology for Salmonella Enteritidis (SE).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Salmonella Web site is located at http://www.fda.gov/Food/ScienceResearch/LaboratoryMethods/ucm114716.htm... you may examine a copy at the Center for Food Safety and Applied Nutrition's Library, 5100 Paint... Edition, is located at http://www.fda.gov/Food/ScienceResearch/LaboratoryMethods/BacteriologicalAnalytical...

  9. A street intercept survey to assess HIV-testing attitudes and behaviors.

    PubMed

    Rotheram-Borus, M J; Mann, T; Newman, P A; Grusky, O; Frerichs, R R; Wight, R G; Kuklinski, M

    2001-06-01

    Nationally, it has been estimated that 44% of adults in the United States have been tested for HIV, with substantial individual and community-level variations in HIV-testing attitudes and behaviors. HIV-testing behaviors and intentions and attitudes toward HIV testing, particularly toward home tests, were assessed among 385 adults recruited in a street intercept survey from a gay-identified agency, a substance-abuse treatment program, and inner-city community venues (a shopping mall and community center). Across these Los Angeles sites, the proportion of persons who reported being tested for HIV in their lifetime (77%) was higher than the national estimate. Gay-identified agency (88%) and substance-abuse treatment program participants (99%) were more likely to have been tested than were the community participants (67%). Participants from the gay-identified agency were more likely to have had an anonymous test (51%) than were those from the substance-abuse treatment program (25%) or community sites (24%). Attitudes toward HIV testing, including mail-in home-test kits and instant home tests, were very positive. Most participants were willing to pay about $20 for a home-test kit. Participants from the community sites (82%) and the substance-abuse treatment program participants (87%) endorsed notification of HIV status to health departments and sexual partners more than did participants from the gay-identified agency (48%). The street intercept survey appears to be a quick and feasible method to assess HIV testing in urban areas.

  10. Use of Passive Diffusion Samplers for Monitoring Volatile Organic Compounds in Ground Water

    USGS Publications Warehouse

    Harte, Philip T.; Brayton, Michael J.; Ives, Wayne

    2000-01-01

    Passive diffusion samplers have been tested at a number of sites where volatile organic compounds (VOCs) are the principal contaminants in ground water. Test results generally show good agreement between concentrations of VOCs in samples collected with diffusion samplers and concentrations in samples collected by purging the water from a well. Diffusion samplers offer several advantages over conventional and low-flow ground-water sampling procedures: elimination of the need to purge a well before collecting a sample and to dispose of contaminated water; elimination of cross-contamination of samples associated with sampling with non-dedicated pumps or sample delivery tubes; reduction in sampling time by as much as 80 percent of that required for 'purge type' sampling methods; and an increase in the frequency and spatial coverage of monitoring at a site because of the associated savings in time and money. The successful use of diffusion samplers depends on the following three primary factors: (1) understanding site conditions and contaminants of interest (defining sample objectives), (2) validating results of diffusion samplers against more widely acknowledged sampling methods, and (3) applying diffusion samplers in the field.

  12. Corrective Action Decision Document/Corrective Action Plan for Corrective Action Unit 104: Area 7 Yucca Flat Atmospheric Test Sites Nevada National Security Site, Nevada, Revision 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patrick Matthews

    2012-10-01

    CAU 104 comprises the following corrective action sites (CASs): • 07-23-03, Atmospheric Test Site T-7C • 07-23-04, Atmospheric Test Site T7-1 • 07-23-05, Atmospheric Test Site • 07-23-06, Atmospheric Test Site T7-5a • 07-23-07, Atmospheric Test Site - Dog (T-S) • 07-23-08, Atmospheric Test Site - Baker (T-S) • 07-23-09, Atmospheric Test Site - Charlie (T-S) • 07-23-10, Atmospheric Test Site - Dixie • 07-23-11, Atmospheric Test Site - Dixie • 07-23-12, Atmospheric Test Site - Charlie (Bus) • 07-23-13, Atmospheric Test Site - Baker (Buster) • 07-23-14, Atmospheric Test Site - Ruth • 07-23-15, Atmospheric Test Site T7-4 • 07-23-16, Atmospheric Test Site B7-b • 07-23-17, Atmospheric Test Site - Climax. These 15 CASs include releases from 30 atmospheric tests conducted in the approximately 1 square mile of CAU 104. Because releases associated with the CASs included in this CAU overlap and are not separate and distinguishable, these CASs are addressed jointly at the CAU level. The purpose of this CADD/CAP is to evaluate potential corrective action alternatives (CAAs), provide the rationale for the selection of recommended CAAs, and provide the plan for implementation of the recommended CAA for CAU 104. Corrective action investigation (CAI) activities were performed from October 4, 2011, through May 3, 2012, as set forth in the CAU 104 Corrective Action Investigation Plan.

  13. Azimuth selection for sea level measurements using geodetic GPS receivers

    NASA Astrophysics Data System (ADS)

    Wang, Xiaolei; Zhang, Qin; Zhang, Shuangcheng

    2018-03-01

    Based on analysis of Global Positioning System (GPS) multipath signals recorded by geodetic GPS receivers, GPS Reflectometry (GPS-R) has demonstrated unique advantages for sea level monitoring. Building on multipath reflectometry theory, GPS-R measures sea level changes through spectral analysis of recorded signal-to-noise ratio data. However, prior to estimating multipath parameters, it is necessary to define azimuth and elevation-angle masks to ensure that the reflecting zones are on water. Here, a method is presented to address azimuth selection, a topic currently under active development in the field of GPS-R. Data from three test sites are analyzed: the Kachemak Bay GPS site PBAY in Alaska (USA), the Friday Harbor GPS site SC02 in the San Juan Islands (USA), and the Brest Harbor GPS site BRST in Brest (France). These sites are located in different multipath environments, from a rural coastal area to a busy harbor, and they experience different tidal ranges. Estimates by the GPS tide gauges at azimuths selected by the presented method are compared with measurements from physical tide gauges, and acceptable correspondence is found for all three sites.
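
    The core spectral step in GPS-R uses the standard relation h = λf/2, where f is the dominant frequency of the detrended SNR series expressed as a function of sin(elevation) and h is the antenna-to-reflector height. The hedged sketch below demonstrates that step with a Lomb-Scargle periodogram on synthetic data; the wavelength, arc geometry, and noise level are illustrative values, not data from the three sites above.

    ```python
    # Hedged sketch of the GPS-R spectral step: recover the antenna-to-water reflector
    # height from SNR oscillations expressed against sin(elevation). Synthetic data only.
    import numpy as np
    from scipy.signal import lombscargle

    lam = 0.1903                     # GPS L1 carrier wavelength (m)
    h_true = 5.0                     # reflector height used to synthesize the arc (m)

    rng = np.random.default_rng(0)
    elev = np.radians(np.linspace(5, 25, 400))          # one rising satellite arc
    x = np.sin(elev)                                    # the natural abscissa in GPS-R
    snr = np.cos(4 * np.pi * h_true / lam * x) + 0.1 * rng.standard_normal(x.size)

    # Each candidate height h maps to an angular frequency 4*pi*h/lam in x,
    # i.e. h = lam * f / 2 with f in cycles per unit sin(elevation).
    h_grid = np.linspace(1.0, 10.0, 500)
    omega = 4 * np.pi * h_grid / lam
    power = lombscargle(x, snr - snr.mean(), omega)

    h_est = h_grid[np.argmax(power)]                    # water surface ~h_est below the antenna
    print(h_est)
    ```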

  14. Attenuation-difference radar tomography: results of a multiple-plane experiment at the U.S. Geological Survey Fractured-Rock Research Site, Mirror Lake, New Hampshire

    USGS Publications Warehouse

    Lane, J.W.; Day-Lewis, F. D.; Harris, J.M.; Haeni, F.P.; Gorelick, S.M.

    2000-01-01

    Attenuation-difference borehole-radar tomography was used to monitor a series of sodium chloride tracer injection tests conducted within the FSE wellfield at the U.S. Geological Survey Fractured-Rock Hydrology Research Site in Grafton County, New Hampshire, USA. Borehole-radar tomography surveys were conducted using the sequential-scanning and injection method in three boreholes that form a triangular prism of adjoining tomographic image planes. Results indicate that time-lapse tomography methods provide high-resolution images of tracer distribution in permeable zones.

  15. Alternative Methods for Assessing Contaminant Transport from the Vadose Zone to Indoor Air

    NASA Astrophysics Data System (ADS)

    Baylor, K. J.; Lee, A.; Reddy, P.; Plate, M.

    2010-12-01

    Vapor intrusion, which is the transport of contaminant vapors from groundwater and the vadose zone to indoor air, has emerged as a significant human health risk near hazardous waste sites. Volatile organic compounds (VOCs) such as trichloroethylene (TCE) and tetrachloroethylene (PCE) can volatilize from groundwater and from residual sources in the vadose zone and enter homes and commercial buildings through cracks in the slab, plumbing conduits, or other preferential pathways. Assessment of the vapor intrusion pathway typically requires collection of groundwater, soil gas, and indoor air samples, a process which can be expensive and time-consuming. We evaluated three alternative vapor intrusion assessment methods, including 1) use of radon as a surrogate for vapor intrusion, 2) use of pressure differential measurements between indoor/outdoor and indoor/subslab to assess the potential for vapor intrusion, and 3) use of passive, longer-duration sorbent methods to measure indoor air VOC concentrations. The primary test site, located approximately 30 miles south of San Francisco, was selected due to the presence of TCE (10-300 ug/L) in shallow groundwater (5 to 10 feet bgs). At this test site, we found that radon was not a suitable surrogate to assess vapor intrusion and that pressure differential measurements are challenging to implement and equipment-intensive. More significantly, we found that the passive, longer-duration sorbent methods are easy to deploy and compared well quantitatively with standard indoor air sampling methods. The sorbent technique is less than half the cost of typical indoor air methods, and also provides a longer duration sample, typically 3 to 14 days rather than 8 to 24 hours for standard methods. The passive sorbent methods can be a reliable, cost-effective, and easy way to sample for TCE, PCE and other VOCs as part of a vapor intrusion investigation.

  16. Analysis and elimination method of the effects of cables on LVRT testing for offshore wind turbines

    NASA Astrophysics Data System (ADS)

    Jiang, Zimin; Liu, Xiaohao; Li, Changgang; Liu, Yutian

    2018-02-01

    The current state, characteristics, and necessity of low voltage ride through (LVRT) on-site testing for grid-connected offshore wind turbines are introduced first. The effects of submarine cables on LVRT testing are then analysed based on the equivalent circuit of the testing system, and a scheme for eliminating these effects in the proposed LVRT testing method is presented. The specified voltage dips are kept in compliance with the testing standards by adjusting the ratio between the current-limiting impedance and the short-circuit impedance according to the steady-state voltage relationship derived from the equivalent circuit. Finally, simulation results demonstrate that the voltage dips at the high-voltage side of the wind turbine transformer satisfy the requirements of the testing standards.
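
    As background on why the impedance ratio matters, the sketch below works through the basic voltage-divider relationship of a divider-type LVRT test unit, with an extra series impedance standing in for the submarine cable. The circuit placement of the cable and all impedance values are assumptions made for illustration, not the equivalent circuit from the paper.

    ```python
    # Hedged illustration: residual voltage of a divider-type LVRT test set,
    # with and without an additional series cable impedance (made-up values).
    Zgrid  = complex(0.5, 5.0)    # grid plus transformer series impedance (ohm)
    Zser   = complex(0.2, 8.0)    # current-limiting (series) reactor (ohm)
    Zsc    = complex(0.1, 2.0)    # short-circuit (shunt) reactor that creates the dip (ohm)
    Zcable = complex(0.8, 3.0)    # submarine cable series impedance (ohm)

    def residual_voltage(z_series_total, z_shunt):
        """Per-unit voltage at the shunt branch, neglecting the turbine load current."""
        return abs(z_shunt / (z_series_total + z_shunt))

    dip_without_cable = residual_voltage(Zgrid + Zser, Zsc)
    dip_with_cable    = residual_voltage(Zgrid + Zser + Zcable, Zsc)
    print(dip_without_cable, dip_with_cable)

    # The compensating idea described in the abstract amounts to rescaling the
    # series/shunt impedance ratio so the residual voltage with the cable in
    # circuit still matches the target dip level of the testing standard.
    ```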

  17. Seismic data acquisition at the FACT site for the CASPAR project.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Kyle R.; Chael, Eric Paul; Hart, Darren M.

    Since May 2010, we have been recording continuous seismic data at Sandia's FACT site. The collected signals provide us with a realistic archive for testing algorithms under development for local monitoring of explosive testing. Numerous small explosive tests are routinely conducted around Kirtland AFB by different organizations. Our goal is to identify effective methods for distinguishing these events from normal daily activity on and near the base, such as vehicles, aircraft, and storms. In this report, we describe the recording system, and present some observations of the varying ambient noise conditions at FACT. We present examples of various common, non-explosive sources. Next we show signals from several small explosions, and discuss their characteristic features.

  18. Effect of water quality on residential water heater life-cycle efficiency. Annual report, September 1983-August 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stickford, G.H.; Talbert, S.G.; Newman, D.C.

    A 3-year field test program is under way for the Gas Research Institute to quantify the effect of scale buildup on the performance of residential water heaters, and to determine the benefits and limitations of common water-treatment methods. In this program, the performance of gas and electric heaters is being monitored in test laboratories set up in selected U.S. cities. The efficiency of heaters operating on hard water is measured and compared with the performance of heaters operating on treated water. Corrosion tests are also being conducted on each type of water tested to determine the effect of water treatment on the corrosion of the water heating system. During this reporting period, Battelle has established operating hard water test facilities at four test sites: (1) Battelle, (2) the Roswell Test Facility in Roswell, New Mexico, (3) the Water Quality Association in Lisle, Illinois, and (4) the Marshall Municipal Utilities in Marshall, Minnesota. At each of these sites 12 water heaters have been installed and are operating on accelerated draw cycles. The recovery efficiency of each heater has been carefully measured, and the heaters have been operating from 4 months at one site to 7 months at another. At two of the test sites, the recovery efficiency of each heater has been remeasured after 6 months of operation. No significant degradation in heater performance due to scale buildup was observed in these heaters for the equivalent of 2 to 3 years of typical residential use.

  19. Identification and characterization of conservative organic tracers for use as hydrologic tracers for the Yucca Mountain site characterization study. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stetzenbach, K.; Farnham, I.

    1996-06-01

    Extensive tracer testing is expected to take place at the C-well complex on the Nevada Test Site as part of the Yucca Mountain Site Characterization Project. The C-well complex consists of one pumping well, C3, and two injection wells, C1 and C2, into which tracers will be introduced. The goal of this research was to provide USGS with numerous tracers to complete these tests. Several classes of fluorinated organic acids have been evaluated. These include numerous isomers of fluorinated benzoic acids, cinnamic acids, and salicylic acids. Also, several derivatives of 2-hydroxy nicotinic acid (pyridone) have been tested. The stability of these compounds was determined using batch and column tests. Ames testing (mutagenicity/carcinogenicity) was conducted on the fluorinated benzoic acids, and a literature review of the toxicity of the fluorobenzoates and three perfluoro aliphatic acids was prepared. Solubilities were measured and method development work was performed to optimize the detection of these compounds. A Quality Assurance (QA) Program was developed under existing DOE and USGS guidelines. The program includes QA procedures and technical standard operating procedures. A tracer test, using sodium iodide, was performed at the C-well complex. HRC chemists performed analyses on site, to provide real-time data for the USGS hydrologists, and in the laboratories at UNLV. Over 2,500 analyses were performed. This report provides the results of the laboratory experiments and literature reviews used to evaluate the potential tracers and reports on the results of the iodide C-well tracer test.

  20. A comparison of two visual inspection methods for cervical cancer screening among HIV-infected women in Kenya

    PubMed Central

    Sneden, Jennifer; Leslie, Hannah H; Abdulrahim, Naila; Maloba, May; Bukusi, Elizabeth; Cohen, Craig R

    2014-01-01

    Objective: To determine the optimal strategy for cervical cancer screening in women with human immunodeficiency virus (HIV) infection by comparing two strategies: visual inspection of the cervix with acetic acid (VIA) and VIA followed immediately by visual inspection with Lugol's iodine (VIA/VILI) in women with a positive VIA result. Methods: Data from a cervical cancer screening programme embedded in two HIV clinic sites in western Kenya were evaluated. Women at a central site underwent VIA, while women at a peripheral site underwent VIA/VILI. All women positive for cervical intraepithelial neoplasia grade 2 or worse (CIN 2+) on VIA and/or VILI had a confirmatory colposcopy, with a biopsy if necessary. Overall test positivity, positive predictive value (PPV) and the CIN 2+ detection rate were calculated for the two screening methods, with biopsy being the gold standard. Findings: Between October 2007 and October 2010, 2338 women were screened with VIA and 1124 with VIA/VILI. In the VIA group, 26.4% of the women tested positive for CIN 2+; in the VIA/VILI group, 21.7% tested positive (P < 0.01). Histologically confirmed CIN 2+ was detected in 8.9% and 7.8% (P = 0.27) of women in the VIA and VIA/VILI groups, respectively. The PPV of VIA for biopsy-confirmed CIN 2+ in a single round of screening was 35.2%, compared with 38.2% for VIA/VILI (P = 0.41). Conclusion: The absence of any differences between VIA and VIA/VILI in detection rates or PPV for CIN 2+ suggests that VIA, an easy testing procedure, can be used alone as a cervical cancer screening strategy in low-income settings. PMID:24700979
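
    The positivity comparison quoted above (26.4% vs 21.7%, P < 0.01) is the kind of result a standard two-proportion test produces. The sketch below reruns such a test on counts reconstructed approximately from the reported percentages, purely as an illustration of the calculation rather than a reanalysis of the study data.

    ```python
    # Hedged illustration: chi-square comparison of screen positivity between the two
    # arms, with counts reconstructed (approximately) from the reported percentages.
    from scipy.stats import chi2_contingency

    n_via, n_viavili = 2338, 1124
    pos_via     = round(0.264 * n_via)       # ~617 VIA-positive women
    pos_viavili = round(0.217 * n_viavili)   # ~244 VIA/VILI-positive women

    table = [[pos_via, n_via - pos_via],
             [pos_viavili, n_viavili - pos_viavili]]
    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p)   # p comes out well below 0.01, consistent with the abstract
    ```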

  1. Toward the establishment of standardized in vitro tests for lipid-based formulations, part 1: method parameterization and comparison of in vitro digestion profiles across a range of representative formulations.

    PubMed

    Williams, Hywel D; Sassene, Philip; Kleberg, Karen; Bakala-N'Goma, Jean-Claude; Calderone, Marilyn; Jannin, Vincent; Igonin, Annabel; Partheil, Anette; Marchaud, Delphine; Jule, Eduardo; Vertommen, Jan; Maio, Mario; Blundell, Ross; Benameur, Hassan; Carrière, Frédéric; Müllertz, Anette; Porter, Christopher J H; Pouton, Colin W

    2012-09-01

    The Lipid Formulation Classification System Consortium is an industry-academia collaboration, established to develop standardized in vitro methods for the assessment of lipid-based formulations (LBFs). In this first publication, baseline conditions for the conduct of digestion tests are suggested and a series of eight model LBFs are described to probe test performance across different formulation types. Digestion experiments were performed in vitro using a pH-stat apparatus and danazol employed as a model poorly water-soluble drug. LBF digestion (rate and extent) and drug solubilization patterns on digestion were examined. To evaluate cross-site reproducibility, experiments were conducted at two sites and highly consistent results were obtained. In a further refinement, bench-top centrifugation was explored as a higher throughput approach to separation of the products of digestion (and compared with ultracentrifugation), and conditions under which this method was acceptable were defined. Drug solubilization was highly dependent on LBF composition, but poorly correlated with simple performance indicators such as dispersion efficiency, confirming the utility of the digestion model as a means of formulation differentiation. Copyright © 2012 Wiley Periodicals, Inc.

  2. Establishing a range-wide provenance test in valley oak (Quercus lobata Née) at two California sites

    Treesearch

    Annette Delfino-Mix; Jessica W. Wright; Paul F. Gugger; Christina Liang; Victoria L. Sork

    2015-01-01

    We present the methods used to establish a provenance test in valley oak, Quercus lobata. Nearly 11,000 acorns were planted and 88 percent of those germinated. The resulting seedlings were measured after 1 and 2 years of growth, and were outplanted in the field in the winter of 2014-2015. This test represents a long-term resource for both research...

  3. 242-A Evaporator/plutonium uranium extraction (PUREX) effluent treatment facility (ETF) nonradioactive air emission test report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, J.S., Westinghouse Hanford

    1996-05-10

    This report shows the methods used to test the stack gas outlet concentration and emission rate of Volatile Organic Compounds as Total Non-Methane Hydrocarbons in parts per million by volume, grams per dry standard cubic meter, and grams per minute from the PUREX ETF stream number G6 on the Hanford Site. Test results are shown in Appendix B.1.

  4. Comparison of shear-wave slowness profiles at 10 strong-motion sites from noninvasive SASW measurements and measurements made in boreholes

    USGS Publications Warehouse

    Brown, L.T.; Boore, D.M.; Stokoe, K.H.

    2002-01-01

    The spectral-analysis-of-surface-waves (SASW) method is a relatively new in situ method for determining shear-wave slownesses. All measurements are made on the ground surface, making it much less costly than methods that require boreholes. The SASW method uses a number of active sources (ranging from a commercial Vibroseis truck to a small handheld hammer for the study conducted here) and different receiver spacings to map a curve of apparent phase velocity versus frequency. With the simplifying assumption that the phase velocities correspond to fundamental mode surface waves, forward modeling yields an estimate of the sub-surface shear-wave slownesses. To establish the reliability of this indirect technique, we conducted a blind evaluation of the SASW method. SASW testing was performed at 10 strong-motion stations at which borehole seismic measurements were previously or subsequently made; if previously made, the borehole results were not used for the interpretation of the SASW data, and vice-versa. Comparisons of the shear-wave slownesses from the SASW and borehole measurements are generally very good. The differences in predicted ground-motion amplifications are less than about 15% for most frequencies. In addition, both methods gave the same NEHRP site classification for seven of the sites. For the other three sites the average velocities from the downhole measurements were only 5-13 m/sec larger than the velocity defining the class C/D boundary. This study demonstrates that in many situations the SASW method can provide subsurface information suitable for site response predictions.
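
    The central SASW measurement converts the cross-power-spectrum phase between two receivers at spacing d into an apparent phase velocity, Vph(f) = 2*pi*f*d/phi(f). The hedged sketch below demonstrates that step on a synthetic, non-dispersive record; it is not the processing used in this study, and the sampling values are arbitrary.

    ```python
    # Hedged sketch of the SASW dispersion step: apparent phase velocity from the
    # cross-spectral phase between two surface receivers (synthetic, non-dispersive data).
    import numpy as np
    from scipy.signal import csd

    fs = 1000.0                      # sample rate (Hz)
    d = 10.0                         # receiver spacing (m)
    v_true = 150.0                   # wave speed used to synthesize the records (m/s)

    rng = np.random.default_rng(0)
    src = rng.standard_normal(4096)                     # broadband source signal
    delay = int(round(d / v_true * fs))                 # travel time between receivers (samples)
    rec1 = src
    rec2 = np.concatenate([np.zeros(delay), src[:-delay]])

    f, Pxy = csd(rec1, rec2, fs=fs, nperseg=512)
    phase = np.unwrap(np.angle(Pxy))                    # |phase| grows as 2*pi*f*d/v
    band = (f > 5) & (f < 100)
    v_phase = 2 * np.pi * f[band] * d / np.abs(phase[band])
    print(v_phase.mean())                               # close to 150 m/s
    ```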

  5. Teaching older adults by adapting for aging changes.

    PubMed

    Weinrich, S P; Weinrich, M C; Boyd, M D; Atwood, J; Cervenka, B

    1994-12-01

    Few teaching programs are geared to meet the special learning needs of the elderly. This pilot study used a quasi-experimental pretest-posttest design to measure the effect of the Adaptation for Aging Changes (AAC) Method on fecal occult blood screening (FOBS) at meal sites for the elderly in the South. The AAC Method uses techniques that adjust the presentation to accommodate normal aging changes and includes a demonstration of the procedure for collection of the stool blood test, memory reminders of the date to return the stool blood test, and written materials adapted to the 5th grade reading level. In addition, actual practice of the FOBS with the use of peanut butter was added to the AAC Method, making it the AAC with Practice Method (AACP) in two sites. The American Cancer Society's colorectal cancer educational slide-tape show served as the basis for all of the methods. Hemoccult II kits were distributed at no cost to the participants. Descriptive statistics, chi-square tests, and logistic regressions were used to analyze data from 135 participants at Council on Aging meal sites. The average age of the participants was 72 years; the average educational level was 8th grade; over half the sample was African-American; and half of the participants had incomes below the poverty level. Results support a significant increase in participation in FOBS among participants taught by the AACP Method [χ²(1, n = 56) = 5.34, p = 0.02; odds ratio = 6.2]. This research provides support for teaching that makes adaptations for aging changes, especially adaptations that include actual practice of the procedure.

  6. Formulation and evaluation of chitosan/polyethylene oxide nanofibers loaded with metronidazole for local infections.

    PubMed

    Zupančič, Špela; Potrč, Tanja; Baumgartner, Saša; Kocbek, Petra; Kristl, Julijana

    2016-12-01

    Nanofibers combined with an antimicrobial represent a powerful strategy for treatment of various infections. Local infections usually have a low fluid volume available for drug release, whereas pharmacopoeian dissolution tests include a much larger receptor volume. Therefore, the development of novel drug-release methods that more closely resemble the in-vivo conditions is necessary. We first developed novel biocompatible and biodegradable chitosan/polyethylene oxide nanofibers using environmentally friendly electrospinning of aqueous polymer solutions, with the inclusion of the antimicrobial metronidazole. Here, the focus is on the characterization of these nanofibers, which have high potential for bioadhesion and retention at the site of application. These can be used where prolonged retention of the delivery system at an infected target site is needed. Drug release was studied using three in-vitro methods: a dissolution apparatus (Apparatus 1 of the European Pharmacopoeia), vials, and a Franz diffusion cell. In contrast to other studies, here the Franz diffusion cell method was modified to introduce a small volume of medium with the nanofibers in the donor compartment, where the nanofibers swelled, eroded, and released the metronidazole, which then diffused into the receptor compartment. This set-up with nanofibers in a limited amount of medium released the drug more slowly compared to the other two in-vitro methods that included larger volumes of medium. These findings show that drug release from nanofibers strongly depends on the release method used. Therefore, in-vitro test methods should closely resemble the in-vivo conditions for more accurate prediction of drug release at a therapeutic site. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Alternatives to animal testing: information resources via the Internet and World Wide Web.

    PubMed

    Hakkinen, P J Bert; Green, Dianne K

    2002-04-25

    Many countries, including the United States, Canada, European Union member states, and others, require that a comprehensive search for possible alternatives be completed before beginning some or all research involving animals. Completing comprehensive alternatives searches and keeping current with information associated with alternatives to animal testing is a challenge that will be made easier as people throughout the world gain access to the Internet and World Wide Web. Numerous Internet and World Wide Web resources are available to provide guidance and other information on in vitro and other alternatives to animal testing. A comprehensive Web site is Alternatives to Animal Testing on the Web (Altweb), which serves as an online clearinghouse for resources, information, and news about alternatives to animal testing. Examples of other important Web sites include the joint one for the (US) Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) and the Norwegian Reference Centre for Laboratory Animal Science and Alternatives (The NORINA database). Internet mailing lists and online access to bulletin boards, discussion areas, newsletters, and journals are other ways to access and share information to stay current with alternatives to animal testing.

  8. A New Test Method of Circuit Breaker Spring Telescopic Characteristics Based Image Processing

    NASA Astrophysics Data System (ADS)

    Huang, Huimin; Wang, Feifeng; Lu, Yufeng; Xia, Xiaofei; Su, Yi

    2018-06-01

    This paper applies computer vision technology to the fatigue condition monitoring of springs and proposes a new telescopic-characteristics test method for circuit breaker operating mechanism springs based on image processing. A high-speed camera is used to capture image sequences of spring movement while the high-voltage circuit breaker operates. An image-matching method is then used to obtain the deformation-time and speed-time curves, from which the spring expansion and deformation parameters are extracted, laying a foundation for subsequent spring force analysis and matching-state evaluation. Simulation tests at the experimental site show that this image-analysis method avoids the installation complexity of traditional mechanical sensors and supports online monitoring and status assessment of the circuit breaker spring.
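
    A minimal sketch of the image-matching step described above is given below, using normalized cross-correlation template matching to track a marker on the spring across frames and differentiating the track to get a speed curve. The OpenCV calls are real, but the frame filenames, frame rate, and template location are placeholders, not details from the paper.

    ```python
    # Hedged sketch: track a spring-mounted marker across high-speed frames with
    # normalized cross-correlation, then differentiate to get a speed-time curve.
    # Frame files, frame rate, and template coordinates are placeholders.
    import cv2
    import numpy as np

    fps = 2000.0                                   # assumed high-speed camera rate
    frames = [cv2.imread(f"frame_{i:04d}.png", cv2.IMREAD_GRAYSCALE) for i in range(500)]
    template = frames[0][100:140, 200:240]         # marker patch chosen in the first frame

    positions = []
    for frame in frames:
        res = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(res)      # best-match top-left corner (x, y)
        positions.append(max_loc[1])               # vertical coordinate of the marker

    positions = np.asarray(positions, dtype=float)
    displacement = positions - positions[0]        # pixels; scale by mm/pixel if calibrated
    speed = np.gradient(displacement) * fps        # pixels per second
    ```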

  9. Algorithms for Ocean Bottom Albedo Determination from In-Water Natural Light Measurements

    NASA Technical Reports Server (NTRS)

    Leathers, Robert A.; McCormick, Norman J.

    1999-01-01

    A method for determining ocean bottom optical albedo, R-sub b, from in-water upward and downward irradiance measurements at a shallow site is presented, tested, and compared with a more familiar approach that requires additional measurements at a nearby deep-water site. Also presented are two new algorithms for the estimation of R-sub b from measurements of the downward irradiance and vertically upward radiance.

  10. Characterization and identification of ubiquitin conjugation sites with E3 ligase recognition specificities.

    PubMed

    Nguyen, Van-Nui; Huang, Kai-Yao; Huang, Chien-Hsun; Chang, Tzu-Hao; Bretaña, Neil; Lai, K; Weng, Julia; Lee, Tzong-Yi

    2015-01-01

    In eukaryotes, ubiquitin conjugation is an important mechanism underlying proteasome-mediated degradation of proteins and, as such, plays an essential role in the regulation of many cellular processes. In the ubiquitin-proteasome pathway, E3 ligases play important roles by recognizing a specific protein substrate and catalyzing the attachment of ubiquitin to a lysine (K) residue. As more and more experimental data on ubiquitin conjugation sites become available, it becomes possible to develop prediction models that can be scaled to big data. However, little work has focused on investigating the substrate site specificities of ubiquitinated proteins. Herein, we present an approach that exploits an iterative statistical method to identify ubiquitin conjugation sites with substrate site specificities. In this investigation, a total of 6259 experimentally validated ubiquitinated proteins were obtained from dbPTM. After having filtered out homologous fragments with 40% sequence identity, the training data set contained 2658 ubiquitination sites (positive data) and 5532 non-ubiquitinated sites (negative data). Due to the difficulty of characterizing the substrate site specificities of E3 ligases by conventional sequence logo analysis, a recursive statistical method was applied to obtain significantly conserved motifs. The profile hidden Markov model (profile HMM) was adopted to construct the predictive models learned from the identified substrate motifs. A five-fold cross validation was then used to evaluate the predictive model, achieving sensitivity, specificity, and accuracy of 73.07%, 65.46%, and 67.93%, respectively. Additionally, an independent testing set, completely blind to the training data of the predictive model, was used to demonstrate that the proposed method could provide a promising accuracy (76.13%) and outperform other ubiquitination site prediction tools. A case study demonstrated the effectiveness of the characterized substrate motifs for identifying ubiquitination sites. The proposed method presents a practical means of preliminary analysis and greatly diminishes the total number of potential targets required for further experimental confirmation. This method may help unravel their mechanisms and roles in E3 recognition and ubiquitin-mediated protein degradation.
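
    As a simplified illustration of what "substrate site specificity" means in practice, the sketch below extracts fixed-length windows around lysine residues and builds a position frequency matrix. This is a generic motif summary on toy sequences, not the recursive statistical method or profile HMM used by the authors.

    ```python
    # Generic illustration (not the authors' method): extract sequence windows
    # centred on lysine (K) and summarise position-specific residue frequencies.
    from collections import Counter

    AA = "ACDEFGHIKLMNPQRSTVWY"
    HALF = 5                                    # window of 2*HALF+1 residues

    def windows_around_K(sequence, half=HALF):
        seq = sequence.upper()
        for i, aa in enumerate(seq):
            if aa == "K":
                left = seq[max(0, i - half):i].rjust(half, "-")
                right = seq[i + 1:i + 1 + half].ljust(half, "-")
                yield left + "K" + right        # '-' pads windows near the termini

    def position_frequency_matrix(windows):
        windows = list(windows)
        length = len(windows[0])
        pfm = []
        for pos in range(length):
            counts = Counter(w[pos] for w in windows)
            total = sum(v for k, v in counts.items() if k in AA)
            pfm.append({a: counts.get(a, 0) / total if total else 0.0 for a in AA})
        return pfm

    toy_substrates = ["MKTAYIAKQRQISFVKSHFSRQ", "MSDKIIHLTDDSFDTDVLKADG"]
    pfm = position_frequency_matrix(w for s in toy_substrates for w in windows_around_K(s))
    print(len(pfm), sorted(pfm[HALF].items(), key=lambda kv: -kv[1])[:3])
    ```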

  11. Service life of treated and untreated Black Hills ponderosa pine fenceposts

    Treesearch

    Donald C. Markstrom; Lee R. Gjovik

    1992-01-01

    Service-life tests indicate that ponderosa pine fenceposts treated with preservatives performed well after field exposure of 30 years. Treating plants in the Black Hills area used commercial methods to treat the posts with creosote, pentachlorophenol, and waterborne arsenicals. Test sites were in the northern Great Plains-one in the semiarid western portion near Scenic...

  12. Rapid Surface Detection of CO2 Leaks from Geologic Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Moriarty, D. M.; Krevor, S. C.; Benson, S. M.

    2013-12-01

    Carbon sequestration is becoming a viable option for global CO2 mitigation, but effective monitoring methods are needed to ensure that the carbon dioxide stays underground. Above-surface monitoring using a mobile gas analyzer is one such method (e.g., Krevor et al., 2010). The Picarro gas analyzer uses wavelength-scanned cavity ring down spectroscopy to accurately identify concentrations of various atmospheric gases including their isotopic composition. These measurements can then be used for anomaly (leak) detection and source attribution. Leaks are detected by anomalous absolute concentration of CO2 and anomalous δ13C values. Source attribution is determined by the isotopic concentrations of the identified leaking gas. To distinguish between noise from ambient signals and leaks, a method based on mixing ratios has been developed. A newly acquired data set presented here has been collected from a 3.7 km2 area with naturally occurring CO2 springs near Green River, Utah. All of the areas of known leakage were readily detected using this method, along with several other areas that showed significant signs of leakage. In addition, testing on the Stanford campus has shown that this method is sensitive enough to distinguish between open fields and roadways. Another data set is being collected at Montana State University at the ZERT monitoring test site, where an artificial leak has been created for the purpose of testing leak detection and quantification methods. Data collected from this site are being used for (1) assessing detection levels and how they depend on environmental parameters such as wind speed and acquisition variables such as sample rate and traverse speed, (2) optimizing acquisition parameters to increase detection levels and increase confidence in leak detection, (3) evaluating the potential for quantifying the magnitude of the leak, and (4) spatial data analysis to identify the most probable leak locations.
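
    For source attribution from paired concentration and δ13C measurements, one standard mixing-based tool is the Keeling plot, in which the isotopic signature of the added (leaking) CO2 is estimated as the intercept of δ13C regressed against 1/[CO2]. The sketch below applies that generic technique to synthetic numbers; it is not necessarily the exact anomaly metric used by the authors, and the background and source values are assumptions.

    ```python
    # Generic Keeling-plot sketch (synthetic data): the intercept of d13C vs 1/CO2
    # estimates the isotopic signature of the CO2 being added to background air.
    import numpy as np

    rng = np.random.default_rng(1)
    background_co2, background_d13c = 400.0, -8.5     # ppm, permil (assumed ambient values)
    source_d13c = -35.0                               # assumed signature of the leaking CO2

    added = rng.uniform(5, 200, size=60)              # ppm of leak-derived CO2 per sample
    co2 = background_co2 + added
    # Isotopic mass balance for each mixed sample, plus measurement noise.
    d13c = (background_co2 * background_d13c + added * source_d13c) / co2
    d13c += rng.normal(0, 0.1, size=co2.size)

    slope, intercept = np.polyfit(1.0 / co2, d13c, 1)
    print(intercept)                                  # recovers ~ -35 permil
    ```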

  13. A Comparison of Two Sampling Strategies to Assess Discomycete Diversity in Wet Tropical Forests

    Treesearch

    SHARON A. CANTRELL

    2004-01-01

    Most of the fungal diversity studies that have used a systematic collecting scheme have not included the discomycetes, so optimal sampling methods are not available for this group. In this study, I tested two sampling methods at each of two sites: the Caribbean National Forest, Puerto Rico, and the Ebano Verde Reserve, Dominican Republic. For a plot-based sampling method, 10 ×...

  14. Methods for using clinical laboratory test results as baseline confounders in multi-site observational database studies when missing data are expected.

    PubMed

    Raebel, Marsha A; Shetterly, Susan; Lu, Christine Y; Flory, James; Gagne, Joshua J; Harrell, Frank E; Haynes, Kevin; Herrinton, Lisa J; Patorno, Elisabetta; Popovic, Jennifer; Selvan, Mano; Shoaibi, Azadeh; Wang, Xingmei; Roy, Jason

    2016-07-01

    Our purpose was to quantify missing baseline laboratory results, assess predictors of missingness, and examine performance of missing data methods. Using the Mini-Sentinel Distributed Database from three sites, we selected three exposure-outcome scenarios with laboratory results as baseline confounders. We compared hazard ratios (HRs) or risk differences (RDs) and 95% confidence intervals (CIs) from models that omitted laboratory results, included only available results (complete cases), and included results after applying missing data methods (multiple imputation [MI] regression, MI predictive mean matching [PMM], and indicator methods). Scenario 1 considered glucose among second-generation antipsychotic users and diabetes. Across sites, glucose was available for 27.7-58.9%. Results differed between complete case and missing data models (e.g., olanzapine: HR 0.92 [CI 0.73, 1.12] vs 1.02 [0.90, 1.16]). Across-site models employing different MI approaches provided similar HR and CI; site-specific models provided differing estimates. Scenario 2 evaluated creatinine among individuals starting high versus low dose lisinopril and hyperkalemia. Creatinine availability: 44.5-79.0%. Results differed between complete case and missing data models (e.g., HR 0.84 [CI 0.77, 0.92] vs. 0.88 [0.83, 0.94]). HR and CI were identical across MI methods. Scenario 3 examined international normalized ratio (INR) among warfarin users starting interacting versus noninteracting antimicrobials and bleeding. INR availability: 20.0-92.9%. Results differed between ignoring INR versus including INR using missing data methods (e.g., RD 0.05 [CI -0.03, 0.13] vs 0.09 [0.00, 0.18]). Indicator and PMM methods gave similar estimates. Multi-site studies must consider site variability in missing data. Different missing data methods performed similarly. Copyright © 2016 John Wiley & Sons, Ltd.
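
    Below is a hedged sketch of the general multiple-imputation workflow the abstract refers to: impute the missing laboratory confounder several times, fit the outcome model on each completed data set, and pool the estimates. It uses scikit-learn's IterativeImputer and a logistic outcome model on synthetic data as stand-ins for the study's actual MI regression/PMM implementations and hazard-ratio or risk-difference models.

    ```python
    # Hedged sketch of multiple imputation with pooling of the point estimate.
    # (Full Rubin's rules would also combine within-imputation variances, which
    # are omitted here because scikit-learn does not report standard errors.)
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    exposure = rng.integers(0, 2, n)                    # e.g. high- vs low-dose drug
    lab = 1.0 + 0.3 * exposure + rng.normal(0, 0.5, n)  # baseline lab confounder
    logit = -2.0 + 0.5 * exposure + 0.8 * lab
    outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    lab_obs = lab.copy()
    lab_obs[rng.random(n) < 0.4] = np.nan               # ~40% of lab values missing

    X_full = np.column_stack([exposure, lab_obs, outcome.astype(float)])
    m, estimates = 10, []
    for i in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=i)
        completed = imputer.fit_transform(X_full)        # imputes the lab column
        X = completed[:, :2]                             # exposure + imputed lab
        model = LogisticRegression(max_iter=1000).fit(X, outcome)
        estimates.append(model.coef_[0][0])              # log-odds for exposure

    pooled = np.mean(estimates)                          # pooled point estimate
    between = np.var(estimates, ddof=1)                  # between-imputation variance
    print(pooled, between)
    ```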

  15. State Waste Discharge Permit Application: Electric resistance tomography testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-04-01

    This permit application documentation is for a State Waste Discharge Permit issued in accordance with requirements of Washington Administrative Code 173-216. The activity being permitted is a technology test using electrical resistance tomography. The electrical resistance tomography technology was developed at Lawrence Livermore National Laboratory and has been used at other waste sites to track underground contamination plumes. The electrical resistance tomography technology measures soil electrical resistance between two electrodes. If a fluid contaminated with electrolytes is introduced into the soil, the soil resistance is expected to drop. By using an array of measurement electrodes in several boreholes, the areal extent of contamination can be estimated. At the Hanford Site, the purpose of the testing is to determine if the electrical resistance tomography technology can be used in the vicinity of large underground metal tanks without the metal tank interfering with the test. It is anticipated that the electrical resistance tomography technology will provide a method for accurately detecting leaks from the bottom of underground tanks, such as the Hanford Site single-shell tanks.

  16. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    DOE PAGES

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; ...

    2017-12-28

    A method is outlined and tested to detect low level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead, evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.

  17. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    NASA Astrophysics Data System (ADS)

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; Chiswell, S. R.

    2018-03-01

    A method is outlined and tested to detect low level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead, evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.

  18. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.

    A method is outlined and tested to detect low level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the Feb 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources, but instead, evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher resolution data sets, arbitrary sampling, and time-varying sources is discussed along with a path to evaluate uncertainty in the calculated probabilities.
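
    The screening step described in the three records above (regressing a modelled signature against successive segments of the measured series) can be illustrated with a toy rolling-fit computation. The sketch below uses synthetic data and a squared correlation as a stand-in for the authors' regression metrics.

    ```python
    # Hedged sketch: compare a modelled signature against successive segments of a
    # measured concentration series and keep a goodness-of-fit time series (synthetic data).
    import numpy as np

    rng = np.random.default_rng(2)
    signature = np.exp(-0.5 * ((np.arange(24) - 8) / 3.0) ** 2)   # modelled 24-h plume shape
    measured = 0.2 * rng.random(240)                              # background-only series
    measured[100:124] += 1.5 * signature                          # bury one true arrival

    r2 = np.full(measured.size - signature.size + 1, np.nan)
    for start in range(r2.size):
        segment = measured[start:start + signature.size]
        r = np.corrcoef(signature, segment)[0, 1]
        r2[start] = r * r                                         # fit metric for this offset

    print(int(np.nanargmax(r2)))                                  # ~100, the injected arrival time
    ```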

  19. DOUBLE TRACKS Test Site interim corrective action plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The DOUBLE TRACKS site is located on Range 71 north of the Nellis Air Force Range, northwest of the Nevada Test Site (NTS). DOUBLE TRACKS was the first of four experiments that constituted Operation ROLLER COASTER. On May 15, 1963, weapons-grade plutonium and depleted uranium were dispersed using 54 kilograms of trinitrotoluene (TNT) explosive. The explosion occurred in the open, 0.3 m above a steel plate. No fission yield was detected from the test, and the total amount of plutonium deposited on the ground surface was estimated to be between 980 and 1,600 grams. The test device was composed primarily of uranium-238 and plutonium-239. The mass ratio of uranium to plutonium was 4.35. The objective of the corrective action is to reduce the potential risk to human health and the environment and to demonstrate technically viable and cost-effective excavation, transportation, and disposal. To achieve these objectives, Bechtel Nevada (BN) will remove soil with a total transuranic activity greater than 200 pCi/g, containerize the soil in "supersacks," transport the filled "supersacks" to the NTS, and dispose of them in the Area 3 Radioactive Waste Management Site. During this interim corrective action, BN will also conduct a limited demonstration of an alternative method for excavation of radioactive near-surface soil contamination.

  20. Methylated site display (MSD)-AFLP, a sensitive and affordable method for analysis of CpG methylation profiles.

    PubMed

    Aiba, Toshiki; Saito, Toshiyuki; Hayashi, Akiko; Sato, Shinji; Yunokawa, Harunobu; Maruyama, Toru; Fujibuchi, Wataru; Kurita, Hisaka; Tohyama, Chiharu; Ohsako, Seiichiroh

    2017-03-09

    It has been pointed out that environmental factors or chemicals can cause diseases that are developmental in origin. To detect abnormal epigenetic alterations in DNA methylation, such research requires convenient and cost-effective methods in which multiple samples can be processed simultaneously. We here present methylated site display (MSD), a unique technique for the preparation of DNA libraries. By combining it with amplified fragment length polymorphism (AFLP) analysis, we developed a new method, MSD-AFLP. Methylated site display libraries consist only of DNAs derived from DNA fragments that are CpG methylated at the 5' end in the original genomic DNA sample. To test the effectiveness of this method, CpG methylation levels in liver, kidney, and hippocampal tissues of mice were compared to examine whether MSD-AFLP can detect subtle differences in the levels of tissue-specific differentially methylated CpGs. As a result, many CpG sites suspected to be tissue-specific differentially methylated were detected. Nucleotide sequences adjacent to these methyl-CpG sites were identified, and the methylation levels were determined by methylation-sensitive restriction endonuclease (MSRE)-PCR analysis to confirm the accuracy of the AFLP analysis. The differences in methylation level among tissues were almost identical between the two methods. By MSD-AFLP analysis, we detected many CpGs showing statistically significant tissue-specific differences of less than 5% and variability of less than 10%. Additionally, MSD-AFLP analysis could be used to identify CpG methylation sites in other organisms, including humans. MSD-AFLP analysis can potentially be used to measure slight changes in CpG methylation level. Given the remarkable precision, sensitivity, and throughput of MSD-AFLP analysis, this method will be advantageous in a variety of epigenetics-based research.

  1. Rippability Assessment of Weathered Sedimentary Rock Mass using Seismic Refraction Methods

    NASA Astrophysics Data System (ADS)

    Ismail, M. A. M.; Kumar, N. S.; Abidin, M. H. Z.; Madun, A.

    2018-04-01

    Rippability, or ease of excavation, in sedimentary rocks is a significant aspect of the preliminary work of any civil engineering project. A rippability assessment was performed in this study to select a suitable ripping machine for excavating earth materials, using the seismic velocity chart provided by Caterpillar. The research area is the proposed construction site for a water reservoir and related infrastructure at Kampus Pauh Putra, Universiti Malaysia Perlis. The research was aimed at obtaining the P-wave seismic velocity (Vp) using a seismic refraction method to produce a 2D tomography model. The 2D seismic model was used to delineate the layers of the velocity profile. The conventional geotechnical method of using a borehole was integrated with the seismic velocity method to provide appropriate correlation. The correlated data can be used to select machinery for excavation activities based on a systematic analysis procedure for predicting rock rippability. The seismic velocity profile obtained was used to interpret rock layers within the ranges labelled as rippable, marginal, and non-rippable. Based on the seismic velocity method, the site can be classified as ranging from loose sandstone to moderately weathered rock. Laboratory test results show that the site’s rock material falls between low and high strength. Results suggest that Caterpillar’s smallest ripper, the D8R, can successfully excavate the materials, based on the integration of results from the seismic velocity method and laboratory tests.
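
    As a rough illustration of how a velocity profile is turned into a rippability call, the sketch below bins layer P-wave velocities into rippable, marginal, and non-rippable classes. The thresholds and layer velocities are placeholders, not values taken from the Caterpillar chart or from this study.

```python
# Minimal sketch: classify P-wave velocities (Vp) into rippability classes.
# The velocity thresholds below are illustrative placeholders, NOT the actual
# Caterpillar D8R chart values, which must be read from the manufacturer's chart.

RIPPABLE_MAX = 2000     # m/s, assumed upper bound for "rippable" material
MARGINAL_MAX = 2500     # m/s, assumed upper bound for "marginal" material

def rippability_class(vp_m_per_s: float) -> str:
    """Return a rippability label for a layer's P-wave velocity."""
    if vp_m_per_s <= RIPPABLE_MAX:
        return "rippable"
    if vp_m_per_s <= MARGINAL_MAX:
        return "marginal"
    return "non-rippable"

# Example: classify the layers of a velocity profile, one Vp per layer.
layer_velocities = [600, 1800, 2300, 3100]   # hypothetical Vp values in m/s
for depth_index, vp in enumerate(layer_velocities):
    print(f"layer {depth_index}: Vp = {vp} m/s -> {rippability_class(vp)}")
```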

  2. Diagnosis of metastatic neoplasms: a clinicopathologic and morphologic approach.

    PubMed

    Marchevsky, Alberto M; Gupta, Ruta; Balzer, Bonnie

    2010-02-01

    The diagnosis of the site of origin of metastatic neoplasms often poses a challenge to practicing pathologists. A variety of immunohistochemical and molecular tests have been proposed for the identification of tumor site of origin, but these methods are no substitute for careful attention to the pathologic features of tumors and their correlation with imaging findings and other clinical data. The current trend in anatomic pathology is to overly rely on immunohistochemical and molecular tests to identify the site of origin of metastatic neoplasms, but this "shotgun approach" is often costly and can result in contradictory and even erroneous conclusions about the site of origin of a metastatic neoplasm. Objective: To describe the use of a systematic approach to the evaluation of metastatic neoplasms. Data sources: Literature review and personal experience. A systematic approach can frequently help to narrow down differential diagnoses for a patient to a few likely tumor sites of origin that can be confirmed or excluded with the use of selected immunohistochemistry and/or molecular tests. This approach involves the qualitative evaluation of the "pretest and posttest probabilities" of various diagnoses before the immunohistochemical and molecular tests are ordered. Pretest probabilities are qualitatively estimated for each individual by taking into consideration the patient's age, sex, clinical history, imaging findings, and location of the metastases. This estimate is further narrowed by qualitatively evaluating, through careful observation of a variety of gross pathology and histopathologic features, the posttest probabilities of the most likely tumor sites of origin. Multiple examples of the use of this systematic approach for the evaluation of metastatic lesions are discussed.
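
    The pretest/posttest reasoning described above can be made explicit with Bayes' rule in odds form. The sketch below is a generic illustration with invented numbers, not a calculation from the article.

```python
# Minimal sketch of the pretest/posttest update, expressed quantitatively with
# Bayes' rule in odds form. The probabilities and likelihood ratio below are
# hypothetical; the article applies the same logic qualitatively.

def posttest_probability(pretest_prob: float, likelihood_ratio: float) -> float:
    """Update a pretest probability with a test's likelihood ratio via odds."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# Example: a 30% pretest probability of a given primary site, then a marker
# panel with an assumed positive likelihood ratio of 8.
print(round(posttest_probability(0.30, 8.0), 2))  # -> 0.77
```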

  3. Hazard characterization and identification of a former ammunition site using microarrays, bioassays, and chemical analysis.

    PubMed

    Eisentraeger, Adolf; Reifferscheid, Georg; Dardenne, Freddy; Blust, Ronny; Schofer, Andrea

    2007-04-01

    More than 100,000 tons of 2,4,6-trinitrotoluene were produced at the former ammunition site Werk Tanne in Clausthal-Zellerfeld, Germany. The production of explosives, and the subsequent detonation of the site by the Allies in approximately 1944, caused severe pollution in this area. Four soil samples and three water samples were taken from this site and characterized by applying chemical-analytical methods and several bioassays. Ecotoxicological test systems, such as the algal growth inhibition assay with Desmodesmus subspicatus, and genotoxicity tests, such as the umu and NM2009 tests, were performed. Also applied were the Ames test, according to International Organization for Standardization 16240, and an Ames fluctuation test. The toxic mode of action was examined using bacterial gene profiling assays with a battery of Escherichia coli strains and with the human liver cell line hepG2 using the PIQOR Toxicology cDNA microarray. Additionally, the molecular mechanism of 2,4,6-trinitrotoluene in hepG2 cells was analyzed. The present assessment indicates a danger of pollutant leaching along the soil-groundwater path. A possible impact on human health is discussed, because the groundwater in this area serves as drinking water.

  4. A full-potential approach to the relativistic single-site Green's function

    DOE PAGES

    Liu, Xianglin; Wang, Yang; Eisenbach, Markus; ...

    2016-07-07

    One major purpose of studying the single-site scattering problem is to obtain the scattering matrices and differential equation solutions indispensable to multiple scattering theory (MST) calculations. On the other hand, the single-site scattering itself is also appealing because it reveals the physical environment experienced by electrons around the scattering center. In this study, we demonstrate a new formalism to calculate the relativistic full-potential single-site Green's function. We implement this method to calculate the single-site density of states and electron charge densities. Lastly, the code is rigorously tested, and with the help of Krein's theorem, the relativistic and full-potential effects in group V elements and noble metals are thoroughly investigated.

  5. Ground Characterization Studies in Canakkale Pilot Site of LIQUEFACT Project

    NASA Astrophysics Data System (ADS)

    Ozcep, F.; Oztoprak, S.; Aysal, N.; Bozbey, I.; Tezel, O.; Ozer, C.; Sargin, S.; Bekin, E.; Almasraf, M.; Cengiz Cinku, M.; Ozdemir, K.

    2017-12-01

    Our aim is to outline the ground characterisation studies at the Canakkale test site. The study is based on the EU H2020 LIQUEFACT project entitled "Liquefact: Assessment and mitigation of liquefaction potential across Europe: a holistic approach to protect structures / infrastructures for improved resilience to earthquake-induced liquefaction disasters". Ground characterization for the Canakkale test site comprises pre-existing soil investigation studies and complementary field studies. Several SPT and geophysical tests had already been carried out in the study area. Within the context of the complementary tests, six (6) study areas in the test site were chosen and complementary tests were carried out in these areas. In these areas, additional boreholes were opened and SPT tests were performed. It was decided that additional CPT (CPTU and SCPT) and Marchetti Dilatometer (DMT) tests should be carried out within the scope of the complementary testing. Seismic refraction, MASW, and micro tremor measurements had been carried out in the pre-existing studies, and shear wave velocities obtained from the MASW measurements were evaluated to the most rigorous level. The complementary geophysical tests comprised downhole seismic, PS-logging, seismic refraction, 2D-ReMi, MASW, micro tremor (H/V Nakamura method), 2D resistivity, and resonance acoustic profiling (RAP). RAP is a new technique which will be explained briefly in the relevant section. Dynamic soil properties had not been measured in the pre-existing studies; therefore, these properties were investigated within the scope of the complementary tests. Selection of the specific experimental tests of the complementary campaign was based on cost-benefit considerations. Within the context of the complementary field studies, dynamic soil properties were measured using resonant column and cyclic direct shear tests. Several sieve analyses and Atterberg Limits tests documented in the pre-existing studies were evaluated, and additional sieve analyses and Atterberg Limits tests were carried out in the complementary study. The aim was to establish correlations between geophysical measurements and other field measurements, such as SPT blow count values.

  6. Low Dimensional Embedding of Climate Data for Radio Astronomical Site Testing in the Colombian Andes

    NASA Astrophysics Data System (ADS)

    Chaparro Molano, Germán; Ramírez Suárez, Oscar Leonardo; Restrepo Gaitán, Oscar Alberto; Marcial Martínez Mercado, Alexander

    2017-10-01

    We set out to evaluate the potential of the Colombian Andes for millimeter-wave astronomical observations. Previous studies for astronomical site testing in this region have suggested that nighttime humidity and cloud cover conditions make most sites unsuitable for professional visible-light observations. Millimeter observations can be done during the day, but require that the precipitable water vapor column above a site stays below ~10 mm. Due to a lack of direct radiometric or radiosonde measurements, we present a method for correlating climate data from weather stations to sites with a low precipitable water vapor column. We use unsupervised learning techniques to low dimensionally embed climate data (precipitation, rain days, relative humidity, and sunshine duration) in order to group together stations with similar long-term climate behavior. The data were taken over a period of 30 years by 2046 weather stations across the Colombian territory. We find six regions with unusually dry, clear-sky conditions, ranging in elevation from 2200 to 3800 masl. We evaluate the suitability of each region using a quality index derived from a Bayesian probabilistic analysis of the station type and elevation distributions. Two of these regions show a high probability of having an exceptionally low precipitable water vapor column. We compared our results with global precipitable water vapor maps and found a plausible geographical correlation with regions with low water vapor columns (~10 mm) at an accuracy of ~20 km. Our methods can be applied to similar data sets taken in other countries as a first step toward astronomical site evaluation.
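
    A minimal sketch of the general workflow described above (standardize station climate variables, embed them in a low-dimensional space, and cluster stations with similar long-term behavior), using scikit-learn rather than the authors' own pipeline; the synthetic data and the parameter choices are assumptions.

```python
# Minimal sketch (not the authors' pipeline): standardize station climate
# variables, embed them in 2D, and group stations with similar long-term
# behaviour. The synthetic data and column meanings are assumptions.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical matrix: one row per station, columns = precipitation,
# rain days, relative humidity, sunshine duration (long-term means).
stations = rng.normal(size=(2046, 4))

features = StandardScaler().fit_transform(stations)
embedding = PCA(n_components=2).fit_transform(features)      # 2D embedding
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(embedding)

# Stations sharing a label have similar climate profiles; the driest,
# clearest cluster would then be screened further for site suitability.
print(np.bincount(labels))
```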

  7. The Growth of Multi-Site Fatigue Damage in Fuselage Lap Joints

    NASA Technical Reports Server (NTRS)

    Piascik, Robert S.; Willard, Scott A.

    1999-01-01

    Destructive examinations were performed to document the progression of multi-site damage (MSD) in three lap joint panels that were removed from a full-scale fuselage test article that was tested to 60,000 full pressurization cycles. Similar fatigue crack growth characteristics were observed for small cracks (50 microns to 10 mm) emanating from counter bore rivets, straight shank rivets, and 100 deg counter sink rivets. Good correlation between the fatigue crack growth database obtained in this study and FASTRAN code predictions shows that the growth of MSD in the fuselage lap joint structure can be predicted by fracture-mechanics-based methods.
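
    For readers unfamiliar with fracture-mechanics-based life prediction, the sketch below integrates the generic Paris crack-growth law over the 50 micron to 10 mm range quoted above. It is not the FASTRAN closure model used in the study, and the material constants and stress range are illustrative assumptions.

```python
# Minimal sketch of a fracture-mechanics growth prediction using the Paris law,
# da/dN = C * (dK)^m, with dK approximated for a small through-crack in a wide
# sheet, dK = dS * sqrt(pi * a). This is NOT the FASTRAN crack-closure model
# used in the study; C, m and the stress range below are illustrative values.

import math

C = 2.0e-10      # Paris coefficient (assumed), m/cycle per (MPa*sqrt(m))^m
m = 3.0          # Paris exponent (assumed)
delta_S = 90.0   # hoop stress range per pressurization cycle, MPa (assumed)

a0, af = 50e-6, 10e-3          # crack length from 50 microns to 10 mm

# Closed-form cycle count for m != 2, obtained by integrating da/dN.
coef = C * (delta_S * math.sqrt(math.pi)) ** m
exponent = 1.0 - m / 2.0
N = (af**exponent - a0**exponent) / (coef * exponent)
print(f"predicted cycles for growth from 50 um to 10 mm: {N:,.0f}")
```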

  8. "iSS-Hyb-mRMR": Identification of splicing sites using hybrid space of pseudo trinucleotide and pseudo tetranucleotide composition.

    PubMed

    Iqbal, Muhammad; Hayat, Maqsood

    2016-05-01

    Gene splicing is a vital source of protein diversity. Precise removal of introns and joining of exons is a central task in eukaryotic gene expression, as exons are usually interrupted by introns. Identification of splicing sites through experimental techniques is a complicated and time-consuming task. With the avalanche of genome sequences generated in the post-genomic age, it remains a complicated and challenging task to develop an automatic, robust, and reliable computational method for fast and effective identification of splicing sites. In this study, a hybrid model, "iSS-Hyb-mRMR", is proposed for quick and accurate identification of splicing sites. Two sample representation methods, namely pseudo trinucleotide composition (PseTNC) and pseudo tetranucleotide composition (PseTetraNC), were used to extract numerical descriptors from DNA sequences. A hybrid model was developed by concatenating PseTNC and PseTetraNC. In order to select highly discriminative features, the minimum redundancy maximum relevance (mRMR) algorithm was applied to the hybrid feature space. The performance of these feature representation methods was tested using various classification algorithms, including K-nearest neighbor, probabilistic neural network, general regression neural network, and fitting network. The jackknife test was used to evaluate performance on two benchmark datasets, S1 and S2. The predictor proposed in the current study achieved an accuracy of 93.26%, sensitivity of 88.77%, and specificity of 97.78% for S1, and an accuracy of 94.12%, sensitivity of 87.14%, and specificity of 98.64% for S2. The performance of the proposed model is higher than that of existing methods in the literature so far, and the model should be useful for studying the mechanism of RNA splicing and for related research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
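
    The sketch below illustrates the general idea of hybrid nucleotide-composition features feeding a K-nearest-neighbor classifier. It uses plain trinucleotide and tetranucleotide frequencies and omits the paper's pseudo-composition correlation terms and the mRMR feature selection; the sequences and labels are invented.

```python
# Minimal sketch: trinucleotide + tetranucleotide composition features feeding
# a K-nearest-neighbour classifier via scikit-learn. The "pseudo" correlation
# terms and mRMR selection of the paper are omitted; the data are invented.

from itertools import product
from sklearn.neighbors import KNeighborsClassifier

def kmer_composition(seq: str, k: int) -> list:
    """Normalized k-mer frequency vector over the A/C/G/T alphabet."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {km: 0 for km in kmers}
    for i in range(len(seq) - k + 1):
        window = seq[i:i + k]
        if window in counts:
            counts[window] += 1
    total = max(1, len(seq) - k + 1)
    return [counts[km] / total for km in kmers]

def hybrid_features(seq: str) -> list:
    # Concatenate 64 trinucleotide + 256 tetranucleotide frequencies (320 dims).
    return kmer_composition(seq, 3) + kmer_composition(seq, 4)

# Toy example with made-up sequences; real use needs labelled splice-site
# and non-splice-site windows such as the S1/S2 benchmark sets.
train_seqs = ["ACGTGGTAAGTACGT", "TTTTCCCCAAAAGGG", "ACGTGGTAAGTTCGT", "GGGGCCCCTTTTAAA"]
train_labels = [1, 0, 1, 0]   # 1 = splice site, 0 = non-site (hypothetical)

clf = KNeighborsClassifier(n_neighbors=1)
clf.fit([hybrid_features(s) for s in train_seqs], train_labels)
print(clf.predict([hybrid_features("ACGTGGTAAGTACGA")]))
```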

  9. Using pre-screening methods for an effective and reliable site characterization at megasites.

    PubMed

    Algreen, Mette; Kalisz, Mariusz; Stalder, Marcel; Martac, Eugeniu; Krupanek, Janusz; Trapp, Stefan; Bartke, Stephan

    2015-10-01

    This paper illustrates the usefulness of pre-screening methods for an effective characterization of polluted sites. We applied a sequence of site characterization methods to a former Soviet military airbase with likely fuel and benzene, toluene, ethylbenzene, and xylene (BTEX) contamination in shallow groundwater and subsoil. The methods were (i) phytoscreening with tree cores; (ii) soil gas measurements for CH4, O2, and photoionization detector (PID); (iii) direct-push with membrane interface probe (MIP) and laser-induced fluorescence (LIF) sensors; (iv) direct-push sampling; and (v) sampling from soil and from groundwater monitoring wells. Phytoscreening and soil gas measurements are rapid and inexpensive pre-screening methods. Both indicated subsurface pollution and hot spots successfully. The direct-push sensors yielded 3D information about the extension and the volume of the subsurface plume. This study also expanded the applicability of tree coring to BTEX compounds and tested the use of high-resolution direct-push sensors for light hydrocarbons. Comparison of the screening results with results from conventional soil and groundwater sampling yielded high rank correlations in most cases and confirmed the findings. The large-scale application of non- or low-invasive pre-screening can help direct and focus the subsequent, more expensive investigation methods. The rapid pre-screening methods also yielded useful information about potential remediation methods. Overall, we see several benefits of a stepwise screening and site characterization scheme, which we propose in conclusion.

  10. Evaluation of offshore stocking of Lake Trout in Lake Ontario

    USGS Publications Warehouse

    Lantry, B.F.; O'Gorman, R.; Strang, T.G.; Lantry, J.R.; Connerton, M.J.; Schanger, T.

    2011-01-01

    Restoration stocking of hatchery-reared lake trout Salvelinus namaycush has occurred in Lake Ontario since 1973. In U.S. waters, fish stocked through 1990 survived well and built a large adult population. Survival of yearlings stocked from shore declined during 1990–1995, and adult numbers fell during 1998–2005. Offshore stocking of lake trout was initiated in the late 1990s in response to its successful mitigation of predation losses to double-crested cormorants Phalacrocorax auritus and the results of earlier studies that suggested it would enhance survival in some cases. The current study was designed to test the relative effectiveness of three stocking methods at a time when poststocking survival for lake trout was quite low and losses due to fish predators was a suspected factor. The stocking methods tested during 2000–2002 included May offshore, May onshore, and June onshore. Visual observations during nearshore stockings and hydroacoustic observations of offshore stockings indicated that release methods were not a direct cause of fish mortality. Experimental stockings were replicated for 3 years at one site in the southwest and for 2 years at one site in the southeast. Offshore releases used a landing craft to transport hatchery trucks from 3 to 6 km offshore out to 55–60-m-deep water. For the southwest site, offshore stocking significantly enhanced poststocking survival. Among the three methods, survival ratios were 1.74 : 1.00 : 1.02 (May offshore : May onshore : June onshore). Although not statistically significant owing to the small samples, the trends were similar for the southeast site, with survival ratios of 1.67 : 1.00 : 0.72. Consistent trends across years and sites indicated that offshore stocking of yearling lake trout during 2000–2002 provided nearly a twofold enhancement in survival; however, this increase does not appear to be great enough to achieve the 12-fold enhancement necessary to return population abundance to restoration targets.

  11. A Novel Quantum Dots-Based Point of Care Test for Syphilis

    NASA Astrophysics Data System (ADS)

    Yang, Hao; Li, Ding; He, Rong; Guo, Qin; Wang, Kan; Zhang, Xueqing; Huang, Peng; Cui, Daxiang

    2010-05-01

    One-step lateral flow testing is recommended as the first-line screening of syphilis for primary healthcare settings in developing countries. However, it generally shows low sensitivity. We describe here the development of a novel fluorescent POC (Point Of Care) test method to be used for screening for syphilis. The method was designed to combine the rapidness of a lateral flow test with the sensitivity of a fluorescent method. Fifty syphilis-positive specimens and 50 healthy specimens confirmed by Treponema pallidum particle agglutination (TPPA) were tested with quantum dot-labeled and colloidal gold-labeled lateral flow test strips, respectively. The results showed that both the sensitivity and specificity of the quantum dot-based method reached 100% (95% confidence interval [CI], 91-100%), while those of the colloidal gold-based method were 82% (95% CI, 68-91%) and 100% (95% CI, 91-100%), respectively. In addition, the naked-eye detection limit of the quantum dot-based method reached 2 ng/ml of anti-TP47 polyclonal antibodies purified by affinity chromatography with TP47 antigen, a tenfold improvement over the colloidal gold-based method. In conclusion, quantum dots were found to be suitable labels for lateral flow test strips. Their ease of use, sensitivity, and low cost make the method well-suited for population-based, on-site syphilis screening.

  12. Pore-water and epibenthic exposures in contaminated sediments using embryos of two estuarine fish species

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jelinski, J.A.; Anderson, S.L.

    1995-12-31

    The authors' objectives were to determine the feasibility of using embryos of two fish species, Menidia beryllina and Atherinops affinis, in estuarine sediment toxicity tests at ambient temperatures and salinities, and to compare pore-water and sediment-water interface corer (SWIC) exposure techniques using these same species. The ultimate goal is to determine whether these pore-water and SWIC methods can be used in in situ exposure studies. Sediment samples were collected at both a reference and a contaminated site at the Mare Island Naval Shipyard (MINSY) in San Francisco Bay. Pore-water tests were conducted using methods developed in the laboratory, and SWIC tests were conducted using a modification of the method of B. Anderson et al. Salinity and temperature tolerance experiments revealed that M. beryllina embryos can tolerate temperatures between 16°C and 24°C and salinities of 10 ppt to 25 ppt, whereas A. affinis has a temperature range between 16°C and 20°C. Comparisons between pore-water and SWIC exposures at a reference site within MINSY showed no significant difference in hatching success. However, hatching success in SWIC exposures was significantly lower than in pore-water exposures at a previously characterized contaminated site. In conclusion, both M. beryllina and A. affinis embryos may be useful for sediment and in situ toxicity testing in estuarine environments. Their wide temperature and salinity tolerances allow for minimal test manipulations, and M. beryllina showed excellent hatching success in reference sediments for both types of exposures.

  13. Efficient Blockwise Permutation Tests Preserving Exchangeability

    PubMed Central

    Zhou, Chunxiao; Zwilling, Chris E.; Calhoun, Vince D.; Wang, Michelle Y.

    2014-01-01

    In this paper, we present a new blockwise permutation test approach based on the moments of the test statistic. The method is of importance to neuroimaging studies. In order to preserve the exchangeability condition required in permutation tests, we divide the entire set of data into certain exchangeability blocks. In addition, computationally efficient moments-based permutation tests are performed by approximating the permutation distribution of the test statistic with the Pearson distribution series. This involves the calculation of the first four moments of the permutation distribution within each block and then over the entire set of data. The accuracy and efficiency of the proposed method are demonstrated through a simulated experiment on magnetic resonance imaging (MRI) brain data, specifically a multi-site voxel-based morphometry analysis of structural MRI (sMRI). PMID:25289113
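
    A brute-force counterpart of this approach is sketched below: labels are permuted only within exchangeability blocks and the group-difference statistic is recomputed each time. The paper replaces this explicit enumeration with a four-moment Pearson approximation; the toy data here are invented.

```python
# Minimal sketch of a blockwise permutation test: labels are shuffled only
# within exchangeability blocks (e.g., within each acquisition site), and the
# group-difference statistic is recomputed for each permutation.

import numpy as np

def blockwise_permutation_test(values, groups, blocks, n_perm=5000, seed=0):
    rng = np.random.default_rng(seed)
    values = np.asarray(values, float)
    groups = np.asarray(groups)
    blocks = np.asarray(blocks)

    def stat(g):
        return values[g == 1].mean() - values[g == 0].mean()

    observed = stat(groups)
    count = 0
    for _ in range(n_perm):
        permuted = groups.copy()
        for b in np.unique(blocks):
            idx = np.where(blocks == b)[0]
            permuted[idx] = rng.permutation(permuted[idx])
        if abs(stat(permuted)) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)

# Toy data: two sites (blocks), two groups, a small shift in group 1.
rng = np.random.default_rng(1)
blocks = np.repeat([0, 1], 20)
groups = np.tile([0, 1], 20)
values = rng.normal(size=40) + 0.8 * groups + 0.5 * blocks
print(blockwise_permutation_test(values, groups, blocks))
```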

  14. Development of a general method for detection and quantification of the P35S promoter based on assessment of existing methods

    PubMed Central

    Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang

    2014-01-01

    The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two further methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some test events, because SNPs in the primer/probe binding sites would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each test event. This study provides a general P35S screening method with greater coverage than existing methods. PMID:25483893
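
    A minimal sketch of standard-curve quantification of the kind used for such quantitative PCR methods: fit Ct against log10 copy number for the standards and invert the line for unknowns. The Ct values below are invented for illustration and are not data from the paper.

```python
# Minimal sketch of absolute quantification from a real-time PCR standard
# curve. The standards and Ct values are hypothetical placeholders.

import numpy as np

std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])      # plasmid standards
std_ct     = np.array([17.1, 20.4, 23.8, 27.2, 30.6]) # hypothetical Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                # amplification efficiency

def copies_from_ct(ct: float) -> float:
    """Invert the fitted standard curve to estimate template copy number."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"unknown with Ct=25.0 -> ~{copies_from_ct(25.0):,.0f} P35S copies")
```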

  15. Minor surgery in microgravity

    NASA Technical Reports Server (NTRS)

    Billica, Roger; Krupa, Debra T.; Stonestreet, Robert; Kizzee, Victor D.

    1991-01-01

    The purpose is to investigate and demonstrate equipment and techniques proposed for minor surgery on Space Station Freedom (SSF). The objectives are: (1) to test and evaluate methods of surgical instrument packaging and deployment; (2) to test and evaluate methods of surgical site preparation and draping; (3) to evaluate techniques of sterile procedure and maintaining a sterile field; (4) to evaluate methods of trash management during medical/surgical procedures; and (5) to gain experience in techniques for performing surgery in microgravity. A KC-135 parabolic flight test was performed on March 30, 1990 with the goal of investigating and demonstrating surgical equipment and techniques under consideration for use on SSF. The flight followed the standard 40 parabola profile with 20 to 25 seconds of near-zero gravity in each parabola.

  16. Testing a simple field method for assessing nitrate removal in riparian zones

    Treesearch

    Philippe Vidon; Michael G. Dosskey

    2008-01-01

    Being able to identify riparian sites that function better for nitrate removal from groundwater is critical to using efficiently the riparian zones for water quality management. For this purpose, managers need a method that is quick, inexpensive, and accurate enough to enable effective management decisions. This study assesses the precision and accuracy of a simple...

  17. Application of phytoscreening to three hazardous waste sites in Arizona.

    PubMed

    Duncan, Candice M; Mainhagu, Jon; Virgone, Kayla; Ramírez, Denise Moreno; Brusseau, Mark L

    2017-12-31

    The great majority of prior phytoscreening applications have been conducted in humid and temperate environments wherein groundwater is relatively shallow (~1-6 m deep). The objective of this research is to evaluate its use in semi-arid environments for sites with deeper groundwater (>10 m). To that end, phytoscreening is applied to three chlorinated-solvent hazardous-waste sites in Arizona. Contaminant concentrations were quantifiable in tree-tissue samples collected from two of the sites (Nogales, Park-Euclid). Contaminant concentrations were detectable, but not quantifiable, for the third site. Tree-tissue concentrations of tetrachloroethene (PCE) ranged from approximately 400 to 5,000 µg/kg wet weight for burrobrush, cottonwood, palo verde, and velvet mesquite at the Nogales site. In addition to standard trunk-core samples, leaf samples were collected to test the effectiveness of a less invasive sampling method. Leaf-sample concentrations were quantifiable, but several times lower than the corresponding core-sample concentrations. Comparison of results obtained for the test sites to those reported in the literature suggests that tree species is a major factor mediating observed results. One constraint faced for the Arizona sites was the relative scarcity of mature trees available for sampling, particularly in areas adjacent to industrial zones. The results of this study illustrate that phytoscreening can be used effectively to characterize the presence of groundwater contamination for semi-arid sites with deeper groundwater. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Quantum coupled mutation finder: predicting functionally or structurally important sites in proteins using quantum Jensen-Shannon divergence and CUDA programming.

    PubMed

    Gültas, Mehmet; Düzgün, Güncel; Herzog, Sebastian; Jäger, Sven Joachim; Meckbach, Cornelia; Wingender, Edgar; Waack, Stephan

    2014-04-03

    The identification of functionally or structurally important non-conserved residue sites in protein MSAs is an important challenge for understanding the structural basis and molecular mechanism of protein functions. Despite the rich literature on compensatory mutations as well as sequence conservation analysis for the detection of those important residues, previous methods often rely on classical information-theoretic measures. However, these measures usually do not take into account dis/similarities of amino acids, which are likely to be crucial for those residues. In this study, we present a new method, the Quantum Coupled Mutation Finder (QCMF), that incorporates significant dis/similar amino acid pair signals in the prediction of functionally or structurally important sites. The results of this study are twofold. First, using the essential sites of two human proteins, namely epidermal growth factor receptor (EGFR) and glucokinase (GCK), we tested the QCMF method. QCMF includes two metrics based on the quantum Jensen-Shannon divergence to measure both sequence conservation and compensatory mutations. We found that QCMF reaches an improved performance in identifying essential sites from MSAs of both proteins, with a significantly higher Matthews correlation coefficient (MCC) value in comparison to previous methods. Second, using a data set of 153 proteins, we made a pairwise comparison between QCMF and three conventional methods. This comparison study strongly suggests that QCMF complements the conventional methods for the identification of correlated mutations in MSAs. QCMF utilizes the notion of entanglement, which is a major resource of quantum information, to model significant dissimilar and similar amino acid pair signals in the detection of functionally or structurally important sites. Our results suggest that, on the one hand, QCMF significantly outperforms the previous method, which mainly focuses on dissimilar amino acid signals, in detecting essential sites in proteins. On the other hand, it is complementary to the existing methods for the identification of correlated mutations. The QCMF method is computationally intensive. To ensure a feasible computation time for the QCMF algorithm, we leveraged Compute Unified Device Architecture (CUDA). The QCMF server is freely accessible at http://qcmf.informatik.uni-goettingen.de/.
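
    The quantum Jensen-Shannon divergence at the core of QCMF can be written as QJSD(ρ, σ) = S((ρ+σ)/2) - (S(ρ)+S(σ))/2, with S the von Neumann entropy. The sketch below computes this quantity for two toy density matrices; how the paper encodes amino-acid pair signals into density matrices is not reproduced here.

```python
# Minimal sketch of the quantum Jensen-Shannon divergence between two density
# matrices. The example states are toy single-qubit density matrices.

import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # drop (near-)zero eigenvalues
    return float(-np.sum(eigvals * np.log2(eigvals)))

def qjsd(rho: np.ndarray, sigma: np.ndarray) -> float:
    mix = 0.5 * (rho + sigma)
    return von_neumann_entropy(mix) - 0.5 * (
        von_neumann_entropy(rho) + von_neumann_entropy(sigma)
    )

# Example: two pure single-qubit states, |0> and |+>.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
sigma = np.array([[0.5, 0.5], [0.5, 0.5]])
print(round(qjsd(rho, sigma), 4))   # 0 <= QJSD <= 1 for qubit states
```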

  19. Remarkable experiences of the nuclear tests in residents near the Semipalatinsk nuclear test site: analysis based on the questionnaire surveys.

    PubMed

    Kawano, Noriyuki; Ohtaki, Megu

    2006-02-01

    The main objective of this paper is to identify salient experiences of those who were exposed to radiation by the nuclear tests at the Semipalatinsk Nuclear Test Site (SNTS). In 2002, our research team at the Research Institute for Radiation Biology and Medicine, Hiroshima University, began field research by means of a questionnaire survey. Through this, we expected to examine the health condition of the residents near the SNTS, identify their experiences of the nuclear tests, and understand the exposure path. This attempt at clarifying the reality of radiation exposure at Semipalatinsk through the use of a survey research method is the first of its kind. Among the responses to our survey, the present paper focuses mainly upon responses to the questions concerning experiences of the nuclear tests. It deals mainly with the residents' direct experiences of the nuclear tests characteristic of Semipalatinsk, including some experiences hitherto unnoticed. The present paper touches upon their concrete direct experiences of flash, bomb blast, heat, rain, and dust. We also discuss distinct experiences in Semipalatinsk, such as evacuation, through the additional use of their testimonies. The data have been compared with the results obtained in a similar survey made in Hiroshima and Nagasaki. For the data analysis, a statistical method called logistic multiple linear regression analysis has been used.

  20. Ensuring a reliable satellite

    NASA Astrophysics Data System (ADS)

    Johnson, Charles E.; Persinger, Randy R.; Lemon, James J.; Volkert, Keith J.

    Comprehensive testing and monitoring approaches have been formulated and implemented for Intelsat VI, which is the largest commercial satellite in service. An account is given of the ground test program from unit level through launch site activities, giving attention to the test data handling system. Test methods unique to Intelsat VI encompass near-field anechoic chamber antenna measurements, offloading 1-g deployment of solar cell and deflector antennas, and electrostatic discharge measurements. The problems accruing to the sheer size of this spacecraft are stressed.

  1. Combining stakeholder analysis and spatial multicriteria evaluation to select and rank inert landfill sites.

    PubMed

    Geneletti, Davide

    2010-02-01

    This paper presents a method based on the combination of stakeholder analysis and spatial multicriteria evaluation (SMCE) to first design possible sites for an inert landfill, and then rank them according to their suitability. The method was tested for the siting of an inert landfill in the Sarca's Plain, located in south-western Trentino, an alpine region in northern Italy. Firstly, stakeholder analysis was conducted to identify a set of criteria to be satisfied by new inert landfill sites. SMCE techniques were then applied to combine the criteria and obtain a suitability map of the study region. Subsequently, the most suitable sites were extracted by also taking into account thresholds based on size and shape. These sites were then compared and ranked according to their visibility, accessibility, and dust pollution. All these criteria were assessed through GIS modelling. Sensitivity analyses were performed on the results to assess the stability of the ranking with respect to variations in the input (criterion scores and weights). The study concluded that the three top-ranking sites are located close to each other, in the northernmost sector of the study area. A more general finding was that the use of different criteria in the different stages of the analysis made it possible to differentiate the suitability of the potential landfill sites more effectively.
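
    A minimal sketch of the weighted-summation step of such an SMCE ranking: criterion scores are standardized to [0, 1] and combined with stakeholder weights. The criteria, weights, and site scores below are invented, not those of the Sarca's Plain study.

```python
# Minimal sketch of weighted-summation multicriteria ranking. All numbers are
# hypothetical; higher raw scores are assumed to mean worse performance here.

import numpy as np

criteria = ["visibility", "accessibility", "dust_pollution"]
weights = np.array([0.3, 0.4, 0.3])            # assumed stakeholder weights, sum to 1

# Rows = candidate sites, columns = raw criterion scores.
raw = np.array([
    [0.20, 3.5, 0.10],    # site A
    [0.55, 1.2, 0.40],    # site B
    [0.35, 2.0, 0.25],    # site C
])

# Standardize each criterion to [0, 1] and invert so that 1 = most suitable.
ranges = raw.max(axis=0) - raw.min(axis=0)
standardized = 1.0 - (raw - raw.min(axis=0)) / ranges

suitability = standardized @ weights
ranking = np.argsort(-suitability)
for rank, site in enumerate(ranking, start=1):
    print(f"rank {rank}: site {'ABC'[site]} (score {suitability[site]:.2f})")
```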

  2. Toxicity and bioaccumulation of sediment-associated contaminants using freshwater invertebrates: A review of methods and applications

    USGS Publications Warehouse

    Ingersoll, C.G.; Ankley, G.T.; Benoit, D.A.; Brunson, E.L.; Burton, G.A.; Dwyer, F.J.; Hoke, R.A.; Landrum, P.F.; Norberg-King, T. J.; Winger, P.V.

    1995-01-01

    This paper reviews recent developments in methods for evaluating the toxicity and bioaccumulation of contaminants associated with freshwater sediments and summarizes example case studies demonstrating the application of these methods. Over the past decade, research has emphasized development of more specific testing procedures for conducting 10-d toxicity tests with the amphipod Hyalella azteca and the midge Chironomus tentans. Toxicity endpoints measured in these tests are survival for H. azteca and survival and growth for C. tentans. Guidance has also been developed for conducting 28-d bioaccumulation tests with the oligochaete Lumbriculus variegatus, including determination of bioaccumulation kinetics for different compound classes. These methods have been applied to a variety of sediments to address issues ranging from site assessments to bioavailability of organic and inorganic contaminants using field-collected and laboratory-spiked samples. Survival and growth of controls routinely meet or exceed test acceptability criteria. Results of laboratory bioaccumulation studies with L. variegatus have been confirmed with comparisons to residues (PCBs, PAHs, DDT) present from synoptically collected field populations of oligochaetes. Additional method development is currently underway to develop chronic toxicity tests and to provide additional data-confirming responses observed in laboratory sediment tests with natural benthic populations.

  3. Do regional methods really help reduce uncertainties in flood frequency analyses?

    NASA Astrophysics Data System (ADS)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the substantial increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to generate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate to which extent the results obtained for these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that exploiting information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
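
    The kind of Monte Carlo experiment described can be sketched as follows: simulate homogeneous flood series, then compare the error of a 100-year quantile estimated from one short record against an index-flood regional estimate that pools all sites. The Gumbel distribution and every parameter value below are assumptions, not those of the Ardèche/Var case studies.

```python
# Minimal sketch: local vs. regional (index-flood) 100-year quantile estimation
# error under perfect regional homogeneity. All parameter values are invented.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_years, n_sites, n_trials = 30, 10, 200
true_loc, true_scale = 100.0, 30.0
q_true = stats.gumbel_r.ppf(0.99, loc=true_loc, scale=true_scale)  # 100-yr flood

local_err, regional_err = [], []
for _ in range(n_trials):
    series = stats.gumbel_r.rvs(loc=true_loc, scale=true_scale,
                                size=(n_sites, n_years), random_state=rng)
    # Local estimate: fit only the first site's record.
    loc_l, scale_l = stats.gumbel_r.fit(series[0])
    local_err.append(stats.gumbel_r.ppf(0.99, loc=loc_l, scale=scale_l) - q_true)
    # Regional (index-flood) estimate: rescale each site by its mean, fit pooled.
    index = series.mean(axis=1, keepdims=True)
    loc_r, scale_r = stats.gumbel_r.fit((series / index).ravel())
    growth = stats.gumbel_r.ppf(0.99, loc=loc_r, scale=scale_r)
    regional_err.append(growth * series[0].mean() - q_true)

print("local RMSE:   ", round(float(np.sqrt(np.mean(np.square(local_err)))), 1))
print("regional RMSE:", round(float(np.sqrt(np.mean(np.square(regional_err)))), 1))
```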

  4. Design and methods of the Southeast Stream Quality Assessment (SESQA), 2014

    USGS Publications Warehouse

    Journey, Celeste A.; Van Metre, Peter C.; Bell, Amanda H.; Button, Daniel T.; Garrett, Jessica D.; Nakagaki, Naomi; Qi, Sharon L.; Bradley, Paul M.

    2015-07-15

    This report provides a detailed description of the SESQA study components, including surveys of ecological conditions, routine water sampling, deployment of passive polar organic compound integrative samplers for pesticides and contaminants of emerging concern, and synoptic sediment sampling and toxicity testing at all urban, confined animal feeding operation, and reference sites. Continuous water-quality monitoring and daily pesticide sampling efforts conducted at a subset of urban sites are also described.

  5. Analysis and recognition of 5′ UTR intron splice sites in human pre-mRNA

    PubMed Central

    Eden, E.; Brunak, S.

    2004-01-01

    Prediction of splice sites in non-coding regions of genes is one of the most challenging aspects of gene structure recognition. We perform a rigorous analysis of such splice sites embedded in human 5′ untranslated regions (UTRs), and investigate correlations between this class of splice sites and other features found in the adjacent exons and introns. By restricting the training of neural network algorithms to ‘pure’ UTRs (not extending partially into protein coding regions), we for the first time investigate the predictive power of the splicing signal proper, in contrast to conventional splice site prediction, which typically relies on the change in sequence at the transition from protein coding to non-coding. By doing so, the algorithms were able to pick up subtler splicing signals that were otherwise masked by ‘coding’ noise, thus enhancing significantly the prediction of 5′ UTR splice sites. For example, the non-coding splice site predicting networks pick up compositional and positional bias in the 3′ ends of non-coding exons and 5′ non-coding intron ends, where cytosine and guanine are over-represented. This compositional bias at the true UTR donor sites is also visible in the synaptic weights of the neural networks trained to identify UTR donor sites. Conventional splice site prediction methods perform poorly in UTRs because the reading frame pattern is absent. The NetUTR method presented here performs 2–3-fold better compared with NetGene2 and GenScan in 5′ UTRs. We also tested the 5′ UTR trained method on protein coding regions, and discovered, surprisingly, that it works quite well (although it cannot compete with NetGene2). This indicates that the local splicing pattern in UTRs and coding regions is largely the same. The NetUTR method is made publicly available at www.cbs.dtu.dk/services/NetUTR. PMID:14960723

  6. Archaeological data recovery at drill pad U19au, Nye County, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henton, G.H.; Pippin, L.C.

    1991-01-01

    Construction activities accompanying underground nuclear tests result in the disturbance of the surface terrain at the Nevada Test Site. In compliance with Federal legislation (National Historic Preservation Act of 1966 (PL 89-665) and National Environmental Policy Act of 1969 (PL 91-190)), the US Department of Energy (DOE), Field Office, Nevada, has long required that cultural resources studies must precede all land-disturbing activities on the Nevada Test Site. In accordance with 36 CFR Part 800, these studies consist of archaeological surveys conducted prior to the land-disturbing activities. The intent of these surveys is to identify and evaluate all cultural resources that might be adversely affected by the proposed construction activity. This report presents the final analysis of the data recovered from archaeological investigations conducted at the U19au drill site and access road. This report includes descriptions of the archaeological sites as recorded during the original survey, the research design used to guide the investigations, the method and techniques used to collect and analyze the data, and the results and interpretations of the analysis. 200 refs., 112 figs., 53 tabs.

  7. Detecting Malaria Hotspots: A Comparison of Rapid Diagnostic Test, Microscopy, and Polymerase Chain Reaction

    PubMed Central

    Mogeni, Polycarp; Williams, Thomas N; Omedo, Irene; Kimani, Domtila; Ngoi, Joyce M; Mwacharo, Jedida; Morter, Richard; Nyundo, Christopher; Wambua, Juliana; Nyangweso, George; Kapulu, Melissa; Fegan, Gregory; Bejon, Philip

    2017-01-01

    Background: Malaria control strategies need to respond to geographical hotspots of transmission. Detection of hotspots depends on the sensitivity of the diagnostic tool used. Methods: We conducted cross-sectional surveys in 3 sites within Kilifi County, Kenya, that had variable transmission intensities. Rapid diagnostic test (RDT), microscopy, and polymerase chain reaction (PCR) were used to detect asymptomatic parasitemia, and hotspots were detected using the spatial scan statistic. Results: Eight thousand five hundred eighty-one study participants were surveyed in the 3 sites. There were statistically significant malaria hotspots by RDT, microscopy, and PCR for all sites except by microscopy in 1 low-transmission site. Pooled data analysis of hotspots by PCR overlapped with hotspots by microscopy at a moderate-transmission setting but not at 2 lower-transmission settings. However, variations in the degree of overlap were noted when data were analyzed by year. Hotspots by RDT were predictive of PCR/microscopy at the moderate setting, but not at the 2 low-transmission settings. We observed long-term stability of hotspots by PCR and microscopy but not by RDT. Conclusion: Malaria control programs may consider PCR testing to guide asymptomatic malaria hotspot detection once the prevalence of infection falls. PMID:28973672

  8. Field evaluation of a horizontal well recirculation system for groundwater treatment: Field demonstration at X-701B Portsmouth Gaseous Diffusion Plant, Piketon, Ohio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korte, N.; Muck, M.; Kearl, P.

    1998-08-01

    This report describes the field-scale demonstration performed as part of the project In Situ Treatment of Mixed Contaminants in Groundwater. This project was a 3.5-year effort comprising laboratory work performed at Oak Ridge National Laboratory and fieldwork performed at the US Department of Energy (DOE) Portsmouth Gaseous Diffusion Plant (PORTS). The overall goal of the project was to evaluate in situ treatment of groundwater using horizontal recirculation coupled with treatment modules. Specifically, horizontal recirculation was tested because of its application to thin, interbedded aquifer zones. Mixed contaminants were targeted because of their prominence at DOE sites and because they cannot be treated with conventional methods. The project involved several research elements, including treatment process evaluation, hydrodynamic flow and transport modeling, pilot testing at an uncontaminated site, and full-scale testing at a contaminated site. This report presents the results of the work at the contaminated site, X-701B at PORTS. Groundwater contamination at X-701B consists of trichloroethene (TCE) (concentrations up to 1800 mg/L) and technetium-99 (Tc-99) (activities up to 926 pCi/L).

  9. Developing and Testing an Online Tool for Teaching GIS Concepts Applied to Spatial Decision-Making

    ERIC Educational Resources Information Center

    Carver, Steve; Evans, Andy; Kingston, Richard

    2004-01-01

    The development and testing of a Web-based GIS e-learning resource is described. This focuses on the application of GIS for siting a nuclear waste disposal facility and the associated principles of spatial decision-making using Boolean and weighted overlay methods. Initial student experiences in using the system are analysed as part of a research…

  10. Do soil tests help forecast nitrogen response in first-year corn following alfalfa on fine-textured soils?

    USDA-ARS?s Scientific Manuscript database

    Improved methods of predicting grain yield response to fertilizer N for first-year corn (Zea mays L.) following alfalfa (Medicago sativa L.) on fine-textured soils are needed. Data from 21 site-years in the North Central Region were used to (i) determine how Illinois soil nitrogen test (ISNT) and pr...

  11. Comparison of American Fisheries Society (AFS) standard fish sampling techniques and environmental DNA for characterizing fish communities in a large reservoir

    USGS Publications Warehouse

    Perez, Christina R.; Bonar, Scott A.; Amberg, Jon J.; Ladell, Bridget; Rees, Christopher B.; Stewart, William T.; Gill, Curtis J.; Cantrell, Chris; Robinson, Anthony

    2017-01-01

    Recently, methods involving examination of environmental DNA (eDNA) have shown promise for characterizing fish species presence and distribution in waterbodies. We evaluated the use of eDNA for standard fish monitoring surveys in a large reservoir. Specifically, we compared the presence, relative abundance, biomass, and relative percent composition of Largemouth Bass Micropterus salmoides and Gizzard Shad Dorosoma cepedianum measured through eDNA methods and established American Fisheries Society standard sampling methods for Theodore Roosevelt Lake, Arizona. Catches at electrofishing and gillnetting sites were compared with eDNA water samples at sites, within spatial strata, and over the entire reservoir. Gizzard Shad were detected at a higher percentage of sites with eDNA methods than with boat electrofishing in both spring and fall. In contrast, spring and fall gillnetting detected Gizzard Shad at more sites than eDNA. Boat electrofishing and gillnetting detected Largemouth Bass at more sites than eDNA; the exception was fall gillnetting, for which the number of sites of Largemouth Bass detection was equal to that for eDNA. We observed no relationship between relative abundance and biomass of Largemouth Bass and Gizzard Shad measured by established methods and eDNA copies at individual sites or lake sections. Reservoirwide catch composition for Largemouth Bass and Gizzard Shad (numbers and total weight [g] of fish) as determined through a combination of gear types (boat electrofishing plus gillnetting) was similar to the proportion of total eDNA copies from each species in spring and fall field sampling. However, no similarity existed between proportions of fish caught via spring and fall boat electrofishing and the proportion of total eDNA copies from each species. Our study suggests that eDNA field sampling protocols, filtration, DNA extraction, primer design, and DNA sequencing methods need further refinement and testing before incorporation into standard fish sampling surveys.

  12. The impact of parental consent on the HIV testing of minors.

    PubMed Central

    Meehan, T M; Hansen, H; Klein, W C

    1997-01-01

    OBJECTIVES: This investigation assessed change in use of human immunodeficiency virus (HIV) testing by minors after removal of the parental consent requirement in Connecticut. METHODS: HIV counseling and testing records for 13- to 17-year-olds who accessed publicly funded testing sites were analyzed. RESULTS: The number of visits increased by 44% from the 12-month period before the statutory change (n = 656) to the 12-month period thereafter (n = 965). The number of HIV tests increased twofold. Visits and tests of high-risk minors tripled. CONCLUSIONS: Minors should have the right to consent to HIV testing. PMID:9279271

  13. Evaluation of Mapping Methodologies at a Legacy Test Site

    NASA Astrophysics Data System (ADS)

    Sussman, A. J.; Schultz-Fellenz, E. S.; Roback, R. C.; Kelley, R. E.; Drellack, S.; Reed, D.; Miller, E.; Cooper, D. I.; Sandoval, M.; Wang, R.

    2013-12-01

    On June 12th, 1985, a nuclear test with an announced yield between 20 and 150 kt was detonated in rhyolitic lava in a vertical emplacement borehole at a depth of 608 m below the surface. This test did not collapse to the surface and form a crater, but rather resulted in a subsurface collapse with more subtle surface expressions of deformation, providing an opportunity to evaluate the site using a number of surface mapping methodologies. The site was investigated over a two-year time span by several mapping teams. In order to determine the most time-efficient and accurate approach for mapping post-shot surface features at a legacy test site, a number of different techniques were employed. The site was initially divided into four quarters, with teams applying various methodologies, techniques, and instrumentations to each quarter. Early methods included transect lines and site gridding with a Brunton pocket transit, flagging tape, measuring tape, and stakes; surveying using a hand-held personal GPS to locate observed features with an accuracy of ±5-10 m; and extensive photo-documentation. More recent methods have incorporated the use of near-survey-grade GPS devices to allow careful location and mapping of surface features. Initially, gridding was employed along with the high-resolution GPS surveys, but this was found to be time-consuming and of little observational value. Raw visual observation (VOB) data included GPS coordinates for artifacts or features of interest, field notes, and photographs. A categorization system was used to organize the myriad of items, in order to aid in database searches and for visual presentation of findings. The collected data set was imported into a geographic information system (GIS) as points, lines, or polygons and overlain onto a digital color orthophoto map of the test site. Once these data were mapped, spectral data were collected using a high-resolution field spectrometer. In addition to geo-locating the field observations with 10 cm resolution GPS, LiDAR and hyperspectral imagery were also acquired. The LiDAR and hyperspectral data are being processed and will be added to the existing geo-referenced database as separate information layers for remote sensing analysis of surface features associated with the legacy test. By consolidating the various components of a VOB data point (coordinates, photo, and item description) into a standalone database, searching or querying for other components or collects, such as subsurface geophysical and/or airborne imagery, is made much easier. Work by Los Alamos National Laboratory was sponsored by the National Nuclear Security Administration Award No. DE-AC52-06NA25946/NST10-NCNS-PD00. Work by National Security Technologies, LLC, was performed under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy.

  14. Corrective Action Plan for Corrective Action Unit 453: Area 9 UXO Landfill, Tonopah Test Range, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bechtel Nevada

    1998-09-30

    This corrective action plan proposes the closure method for the Area 9 Unexploded Ordnance (UXO) Landfill, Corrective Action Unit 453, located at the Tonopah Test Range. The Area 9 UXO Landfill consists of Corrective Action Site No. 09-55-001-0952 and is comprised of three individual landfill cells designated as A9-1, A9-2, and A9-3. The three landfill cells received wastes from daily operations at Area 9 and from range cleanups which were performed after weapons testing. Cell locations and contents were not well documented due to the unregulated disposal practices commonly associated with early landfill operations. However, site process knowledge indicates that the landfill cells were used for solid waste disposal, including disposal of UXO.

  15. Radiology compared with xenon-133 scanning and bronchoscopic lobar sampling as methods for assessing regional lung function in patients with emphysema

    PubMed Central

    Barter, C. E.; Hugh-Jones, P.; Laws, J. W.; Crosbie, W. A.

    1973-01-01

    Regional lung function was assessed by radiographic methods, by regional function studies using xenon-133 scans, and by lobar sampling with a mass spectrometer flow-meter at bronchoscopy in 12 patients who subsequently had bullae resected at operation. The information given by these three methods of regional assessment was subsequently compared with the findings at operation. When only one lobe was abnormal on the radiographs, these alone were adequate to locate the major site of the emphysema and the regional tests gave relatively little extra information. The xenon scan was sometimes helpful in assessing the state of the remaining lung, but this information could be deduced from the radiographs and overall lung function tests, especially the carbon monoxide transfer and mechanical measurements. Bronchoscopic sampling was helpful in determining whether the affected lobe was acting as a ventilated dead-space. When more than one lobe was affected the regional function tests supplemented the radiographs in defining the site of bullous change as well as locating dead space. Xenon scans, although widely employed for such preoperative assessments, added little to the topographical information obtained by careful radiology. The combination of radiology, lobar sampling, and overall function tests is recommended for assessing which emphysematous patients are likely to benefit from surgery. PMID:4685209

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xianglin; Wang, Yang; Eisenbach, Markus

    One major purpose of studying the single-site scattering problem is to obtain the scattering matrices and differential equation solutions indispensable to multiple scattering theory (MST) calculations. On the other hand, the single-site scattering itself is also appealing because it reveals the physical environment experienced by electrons around the scattering center. In this study, we demonstrate a new formalism to calculate the relativistic full-potential single-site Green's function. We implement this method to calculate the single-site density of states and electron charge densities. Lastly, the code is rigorously tested, and with the help of Krein's theorem, the relativistic and full-potential effects in group V elements and noble metals are thoroughly investigated.

  17. Profiling a multiplex short tandem repeat loci from human urine with use of low cost on-site technology for verification of sample authenticity.

    PubMed

    Pires, Nuno M M; Tao Dong; Berntzen, Lasse; Lonningdal, Torill

    2017-07-01

    This work focuses on the development of a sophisticated technique, based on STR typing, to unequivocally verify the authenticity of urine samples before they are sent to laboratories. STR profiling was conducted with the CSF1PO, TPOX, TH01 Multiplex System coupled with a smartphone-based detection method. The promising capability of the method to identify distinct STR profiles from the urine of different persons opens the possibility of conducting sample authenticity tests. On-site STR profiling could be realized with a self-contained autonomous device with an integrated PCR microchip, as shown here.

  18. Searching for Life with Rovers: Exploration Methods & Science Results from the 2004 Field Campaign of the "Life in the Atacama" Project and Applications to Future Mars Missions

    NASA Technical Reports Server (NTRS)

    Cabrol, N. A.; Wettergreen, D. S.; Whittaker, R.; Grin, E. A.; Moersch, J.; Diaz, G. Chong; Cockell, C.; Coppin, P.; Dohm, J. M.; Fisher, G.

    2005-01-01

    The Life In The Atacama (LITA) project develops and field tests a long-range, solar-powered, automated rover platform (Zoë) and a science payload assembled to search for microbial life in the Atacama desert. Life is barely detectable over most of the driest desert on Earth. Its unique geological, climatic, and biological evolution has created a unique training site for designing and testing exploration strategies and life detection methods for the robotic search for life on Mars.

  19. 40 CFR 60.74 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... select the sampling site, and the sampling point shall be the centroid of the stack or duct or at a point... the production rate (P) of 100 percent nitric acid for each run. Material balance over the production...

  20. A method for the automated detection of phishing websites through both site characteristics and image analysis

    NASA Astrophysics Data System (ADS)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results that demonstrate the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing such as resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
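    The visual-comparison step described above (hashing page screenshots and comparing Hamming distances) can be illustrated with a short sketch. This is not the authors' implementation; the average-hash scheme, the 8x8 hash size, and the file names are illustrative assumptions, and Pillow is assumed to be available.

    ```python
    # Minimal sketch of screenshot comparison via average hashing and Hamming
    # distance (illustrative only; hash size, threshold, and file names are
    # assumptions, not the authors' parameters).
    from PIL import Image

    def average_hash(path, hash_size=8):
        """Downscale to hash_size x hash_size grayscale and threshold at the mean."""
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(h1, h2):
        """Number of differing bits between two integer hashes."""
        return bin(h1 ^ h2).count("1")

    if __name__ == "__main__":
        h_legit = average_hash("legitimate_page.png")      # hypothetical file names
        h_suspect = average_hash("suspect_screenshot.png")
        # A small Hamming distance suggests the two rendered pages look alike.
        print(hamming_distance(h_legit, h_suspect))
    ```

    A small Hamming distance between a suspect screenshot and a known legitimate page suggests a visual spoof; the cutoff separating "similar" from "different" would have to be tuned empirically.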

  1. Shape-intensity prior level set combining probabilistic atlas and probability map constrains for automatic liver segmentation from abdominal CT images.

    PubMed

    Wang, Jinke; Cheng, Yuanzhi; Guo, Changyong; Wang, Yadong; Tamura, Shinichi

    2016-05-01

    We propose a fully automatic 3D segmentation framework to segment the liver in challenging abdominal CT images characterized by low contrast with adjacent organs and the presence of pathologies. First, all of the atlases in the selected training datasets are weighted by calculating the similarities between the atlases and the test image to dynamically generate a subject-specific probabilistic atlas for the test image. The most likely liver region of the test image is further determined based on the generated atlas. A rough segmentation is obtained by a maximum a posteriori classification of the probability map, and the final liver segmentation is produced by a shape-intensity prior level set in the most likely liver region. Our method is evaluated and demonstrated on 25 test CT datasets from our partner site, and its results are compared with two state-of-the-art liver segmentation methods. Moreover, our performance results on 10 MICCAI test datasets are submitted to the organizers for comparison with the other automatic algorithms. Using the 25 test CT datasets, the average symmetric surface distance is [Formula: see text] mm (range 0.62-2.12 mm), the root mean square symmetric surface distance error is [Formula: see text] mm (range 0.97-3.01 mm), and the maximum symmetric surface distance error is [Formula: see text] mm (range 12.73-26.67 mm) by our method. On the 10 MICCAI test data sets, our method ranks 10th among the 47 automatic algorithms on the site as of July 2015. Quantitative results, as well as qualitative comparisons of segmentations, indicate that our method is a promising tool to improve the efficiency of liver segmentation. The applicability of the proposed method to some challenging clinical problems and to segmentation of the liver is demonstrated with good results in both quantitative and qualitative experiments. This study suggests that the proposed framework can be good enough to replace the time-consuming and tedious slice-by-slice manual segmentation approach.
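    The atlas-weighting idea in this abstract can be sketched in a few lines: each training atlas contributes its liver label map weighted by its similarity to the test image, and the weighted average acts as a subject-specific spatial prior. The SSD-based similarity, the Gaussian weighting, and the toy volumes below are assumptions for illustration, not the paper's formulation.

    ```python
    # Sketch of a subject-specific probabilistic atlas built by similarity
    # weighting (illustrative; the SSD-based similarity and its normalization
    # are assumptions, not the paper's exact method).
    import numpy as np

    def similarity_weight(test_img, atlas_img, sigma=0.1):
        """Convert mean squared intensity difference into a positive weight."""
        ssd = np.mean((test_img - atlas_img) ** 2)
        return np.exp(-ssd / (2.0 * sigma ** 2))

    def subject_specific_atlas(test_img, atlas_imgs, atlas_labels):
        """Weighted average of binary liver label maps -> probability map."""
        weights = np.array([similarity_weight(test_img, a) for a in atlas_imgs])
        weights /= weights.sum()
        prob = np.zeros_like(test_img, dtype=float)
        for w, lab in zip(weights, atlas_labels):
            prob += w * lab
        return prob  # values in [0, 1]; usable as a spatial prior

    # Toy usage with random volumes standing in for registered CT images.
    rng = np.random.default_rng(0)
    test = rng.random((16, 16, 16))
    atlases = [rng.random((16, 16, 16)) for _ in range(5)]
    labels = [(rng.random((16, 16, 16)) > 0.5).astype(float) for _ in range(5)]
    atlas_prob = subject_specific_atlas(test, atlases, labels)
    ```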

  2. Descriptive analysis and spatial epidemiology of porcine reproductive and respiratory syndrome (PRRS) for swine sites participating in area regional control and elimination programs from 3 regions of Ontario

    PubMed Central

    Arruda, Andreia G.; Poljak, Zvonimir; Friendship, Robert; Carpenter, Jane; Hand, Karen

    2015-01-01

    The objectives of this study were to describe demographics, basic biosecurity practices, ownership structure, and prevalence of porcine reproductive and respiratory syndrome (PRRS) in swine sites located in 3 regions in Ontario, and investigate the presence of spatial clustering and clusters of PRRS positive sites in the 3 regions. A total of 370 swine sites were enrolled in Area Regional Control and Elimination projects in Niagara, Watford, and Perth from 2010 to 2013. Demographics, biosecurity, and site ownership data were collected using a standardized questionnaire and site locations were obtained from an industry organization. Status was assigned on the basis of available diagnostic tests and/or assessment by site veterinarians. Spatial dependence was investigated using the D-function, the spatial scan statistic test and the spatial relative risk method. Results showed that the use of strict all-in all-out (AIAO) pig flow and shower before entry are uncommon biosecurity practices in swine sites, but a larger proportion of sites reported having a Danish entry. The prevalence of PRRS in the 3 regions ranged from 17% to 48% and localized high and low risk clusters were detected. Sites enrolled in the PRRS control projects were characterized by membership in multiple and overlapping ownership structures and networks, which complicates the way the results of monitoring and disease management measures are communicated to the target population. PMID:26424906

  3. A method for reducing the largest relative errors in Monte Carlo iterated-fission-source calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, J. L.; Sutton, T. M.

    2013-07-01

    In Monte Carlo iterated-fission-source calculations relative uncertainties on local tallies tend to be larger in lower-power regions and smaller in higher-power regions. Reducing the largest uncertainties to an acceptable level simply by running a larger number of neutron histories is often prohibitively expensive. The uniform fission site method has been developed to yield a more spatially-uniform distribution of relative uncertainties. This is accomplished by biasing the density of fission neutron source sites while not biasing the solution. The method is integrated into the source iteration process, and does not require any auxiliary forward or adjoint calculations. For a given amount of computational effort, the use of the method results in a reduction of the largest uncertainties relative to the standard algorithm. Two variants of the method have been implemented and tested. Both have been shown to be effective. (authors)
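    The abstract describes biasing the density of banked fission sites while leaving the mean solution unbiased. The sketch below illustrates one schematic way such a fair-game adjustment could look: regions bank a number of sites proportional to their volume fraction rather than their power fraction, and each site carries a compensating weight. The volume-proportional target and the specific weight factor are illustrative assumptions, not the authors' implementation.

    ```python
    # Schematic illustration of biasing fission-site density without biasing
    # the solution: bank more (lower-weight) sites in low-power regions and
    # fewer (higher-weight) sites in high-power regions.  The volume-based
    # target and weight factor are assumptions for illustration only.
    import random

    def bank_fission_sites(regions, total_sites):
        """regions: list of dicts with 'power_frac', 'volume_frac', 'positions'."""
        bank = []
        for reg in regions:
            # Unbiased sampling would allocate sites roughly in proportion to the
            # power fraction; allocating by volume fraction instead flattens the
            # spatial distribution of relative uncertainties.
            n_target = max(1, round(total_sites * reg["volume_frac"]))
            # Compensating weight keeps the expected fission source unchanged.
            weight = reg["power_frac"] / reg["volume_frac"]
            for _ in range(n_target):
                site = random.choice(reg["positions"])  # stand-in for real sampling
                bank.append((site, weight))
        return bank

    regions = [
        {"power_frac": 0.7, "volume_frac": 0.5,
         "positions": [(0, 0, z) for z in range(10)]},
        {"power_frac": 0.3, "volume_frac": 0.5,
         "positions": [(1, 0, z) for z in range(10)]},
    ]
    sites = bank_fission_sites(regions, total_sites=1000)
    ```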

  4. SELECTION AND TREATMENT OF STRIPPER GAS WELLS FOR PRODUCTION ENHANCEMENT IN THE MID-CONTINENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott Reeves

    2003-03-01

    Stripper gas wells are an important source of domestic energy supply and under constant threat of permanent loss (shut-in) due to marginal economics. In 1998, 192 thousand stripper gas wells produced over a Tcf of gas, at an average rate of less than 16 Mcfd. This represents about 57% of all producing gas wells in the onshore lower-48 states, yet only 8% of production. Reserves of stripper gas wells are estimated to be only 1.6 Tcf, or slightly over 1% of the onshore lower-48 total (end of year 1996 data). Obviously, stripper gas wells are at the very margin of economic sustenance. As the demand for natural gas in the U.S. grows to the forecasted estimate of over 30 Tcf annually by the year 2010, supply from current conventional sources is expected to decline. Therefore, an important need exists to fully exploit known domestic resources of natural gas, including those represented by stripper gas wells. The overall objectives of this project are to develop an efficient and low-cost methodology to broadly categorize the well performance characteristics for a stripper gas field, identify the high-potential candidate wells for remediation, and diagnose the specific causes for well underperformance. With this capability, stripper gas well operators can more efficiently and economically produce these resources and maximize these gas reserves. A further objective is to identify/develop, evaluate and test 'new and novel,' economically viable remediation options. Finally, it is the objective of this project that all the methods and technologies developed in this project, while being tested in the Mid-Continent, be widely applicable to stripper gas wells of all types across the country. The project activities during the reporting period were: (1) Continued to solicit industry research partners to provide test sites. A Cooperative Research Agreement has been signed with Oneok, for a test site in the Mocane-Laverne field in the Anadarko basin (Oklahoma). The site consists of about 150 wells producing primarily from the Morrow, but also the Chester, Hoover, and Tonkawa. The Morrow is the second largest gas play in the Anadarko basin (next to Hugoton), and the Mocane-Laverne field is the largest Morrow field, so any new methods developed at this site will have broad application throughout the Mid-Continent Morrow play (which has tens of thousands of wells). (2) Discussions are also ongoing with EOG Resources and Patina Oil and Gas to obtain test sites. In these cases, however, the sites are in the Rocky Mountains. The difficulty being encountered in securing Mid-Continent test sites has forced us to expand the geographic scope of our search. (3) Data collection for the Mocane-Laverne site has recently begun, and will be completed soon. At that time the various analyses will be performed to identify enhancement potential.

  5. Problematic use of social networking sites among urban school going teenagers

    PubMed Central

    Meena, Parth Singh; Mittal, Pankaj Kumar; Solanki, Ram Kumar

    2012-01-01

    Background: Social networking sites like Facebook, Orkut and Twitter are virtual communities where users can create individual public profiles, interact with real-life friends and meet other people based on shared interests. An exponential rise in the usage of social networking sites has been seen within the last few years. Their ease of use and immediate gratification effect on users has changed the way people in general and students in particular spend their time. Young adults, particularly teenagers, tended to be unaware of just how much time they really spent on social networking sites. Negative correlates of social networking site usage include the decrease in real-life social community participation and academic achievement, as well as relationship problems, each of which may be indicative of potential addiction. Aims: The aim of the study was to find out whether teenagers, especially those living in cities, spend too much time on social networking websites. Materials and Methods: 200 subjects, both boys and girls, were included in the cross-sectional study and were given a 20-item Young's Internet Addiction Test modified for social networking sites. The responses were analyzed using the chi-square test and Fisher's exact test. Results: 24.74% of the students were having occasional or frequent problems while 2.02% of them were experiencing severe problems due to excessive time spent using social networking sites. Conclusion: With the ever-increasing popularity of social media, teenagers are devoting significant time to social networking on websites and are prone to get 'addicted' to such forms of online social interaction. PMID:24250039
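    The statistical analysis mentioned (chi-square and Fisher's exact tests on categorical survey responses) can be reproduced with standard routines; the sketch below assumes SciPy is available, and the 2x2 counts are invented for demonstration, not the study's data.

    ```python
    # Illustrative contingency-table analysis of survey responses
    # (counts are made up for demonstration; they are not the study's data).
    from scipy.stats import chi2_contingency, fisher_exact

    # Rows: boys, girls; columns: problematic use (yes, no) -- hypothetical counts.
    table = [[30, 70],
             [23, 77]]

    chi2, p_chi2, dof, expected = chi2_contingency(table)
    odds_ratio, p_fisher = fisher_exact(table)

    print(f"chi-square = {chi2:.2f}, p = {p_chi2:.3f}, dof = {dof}")
    print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")
    ```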

  6. Selecting optimal monitoring site locations for peak ambient particulate material concentrations using the MM5-CAMx4 numerical modelling system.

    PubMed

    Sturman, Andrew; Titov, Mikhail; Zawar-Reza, Peyman

    2011-01-15

    Installation of temporary or long term monitoring sites is expensive, so it is important to rationally identify potential locations that will achieve the requirements of regional air quality management strategies. A simple, but effective, numerical approach to selecting ambient particulate matter (PM) monitoring site locations has therefore been developed using the MM5-CAMx4 air pollution dispersion modelling system. A new method, 'site efficiency,' was developed to assess the ability of any monitoring site to provide peak ambient air pollution concentrations that are representative of the urban area. 'Site efficiency' varies from 0 to 100%, with the latter representing the most representative site location for monitoring peak PM concentrations. Four heavy pollution episodes in Christchurch (New Zealand) during winter 2005, representing 4 different aerosol dispersion patterns, were used to develop and test this site assessment technique. Evaluation of the efficiency of monitoring sites was undertaken for night and morning aerosol peaks for 4 different particulate material (PM) spatial patterns. The results demonstrate that the existing long term monitoring site at Coles Place is quite well located, with a site efficiency value of 57.8%. A temporary ambient PM monitoring site (operating during winter 2006) showed a lower ability to capture night and morning peak aerosol concentrations. Evaluation of multiple site locations used during an extensive field campaign in Christchurch (New Zealand) in 2000 indicated that the maximum efficiency achieved by any site in the city would be 60-65%, while the efficiency of a virtual background site is calculated to be about 7%. This method of assessing the appropriateness of any potential monitoring site can be used to optimize monitoring site locations for any air pollution measurement programme. Copyright © 2010 Elsevier B.V. All rights reserved.

  7. Carboxylator: incorporating solvent-accessible surface area for identifying protein carboxylation sites

    NASA Astrophysics Data System (ADS)

    Lu, Cheng-Tsung; Chen, Shu-An; Bretaña, Neil Arvin; Cheng, Tzu-Hsiu; Lee, Tzong-Yi

    2011-10-01

    In proteins, glutamate (Glu) residues are transformed into γ-carboxyglutamate (Gla) residues in a process called carboxylation. The process of protein carboxylation catalyzed by γ-glutamyl carboxylase is deemed to be important due to its involvement in biological processes such as the blood clotting cascade and bone growth. There is an increasing interest within the scientific community to identify protein carboxylation sites. However, experimental identification of carboxylation sites via mass spectrometry-based methods is observed to be expensive, time-consuming, and labor-intensive. Thus, we were motivated to design a computational method for identifying protein carboxylation sites. This work aims to investigate protein carboxylation by considering the composition of amino acids that surround modification sites. Given that a modified residue tends to be accessible on the surface of a protein, the solvent-accessible surface area (ASA) around carboxylation sites is also investigated. A radial basis function network is then employed to build a predictive model using various features for identifying carboxylation sites. Based on a five-fold cross-validation evaluation, a predictive model trained using the combined features of amino acid sequence (AA20D), amino acid composition, and ASA yields the highest accuracy at 0.874. Furthermore, an independent test involving data not included in the cross-validation process indicates that in silico identification is a feasible means of preliminary analysis. Additionally, the predictive method presented in this work is implemented as Carboxylator (http://csb.cse.yzu.edu.tw/Carboxylator/), a web-based tool for identifying carboxylated proteins with modification sites in order to help users in investigating γ-glutamyl carboxylation.
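    A simplified stand-in for this kind of workflow is sketched below: sequence windows centered on candidate Glu residues are one-hot encoded and classified with cross-validation. An RBF-kernel SVM from scikit-learn is used here as a stand-in for the paper's radial basis function network, and the window length, toy sequences, and omission of the ASA feature are assumptions for illustration.

    ```python
    # Sketch: window-based features around candidate Glu (E) sites with an
    # RBF-kernel SVM and cross-validation.  The SVM is a stand-in for the
    # paper's radial basis function network; window size and toy data are
    # illustrative assumptions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    AA = "ACDEFGHIKLMNPQRSTVWY"
    AA_INDEX = {a: i for i, a in enumerate(AA)}

    def one_hot_window(window):
        """Encode a fixed-length amino-acid window as a flat one-hot vector."""
        vec = np.zeros(len(window) * len(AA))
        for pos, aa in enumerate(window):
            if aa in AA_INDEX:
                vec[pos * len(AA) + AA_INDEX[aa]] = 1.0
        return vec

    # Toy examples: 13-residue windows centered on a Glu residue (hypothetical).
    positives = ["ANKGFLEEVRKGN", "QCSYRAEVFENKT"]
    negatives = ["LLVAAGETAKRSQ", "PDGSTREILQAAM"]
    X = np.array([one_hot_window(w) for w in positives + negatives])
    y = np.array([1] * len(positives) + [0] * len(negatives))

    clf = SVC(kernel="rbf", C=1.0, gamma="scale")
    # With real data one would use cv=5 as in the paper; cv=2 here only because
    # the toy set is tiny.
    scores = cross_val_score(clf, X, y, cv=2)
    print("cross-validation accuracy:", scores.mean())
    ```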

  8. Predicting protein-binding RNA nucleotides with consideration of binding partners.

    PubMed

    Tuvshinjargal, Narankhuu; Lee, Wook; Park, Byungkyu; Han, Kyungsook

    2015-06-01

    In recent years several computational methods have been developed to predict RNA-binding sites in protein. Most of these methods do not consider interacting partners of a protein, so they predict the same RNA-binding sites for a given protein sequence even if the protein binds to different RNAs. Unlike the problem of predicting RNA-binding sites in protein, the problem of predicting protein-binding sites in RNA has received little attention mainly because it is much more difficult and shows a lower accuracy on average. In our previous study, we developed a method that predicts protein-binding nucleotides from an RNA sequence. In an effort to improve the prediction accuracy and usefulness of the previous method, we developed a new method that uses both RNA and protein sequence data. In this study, we identified effective features of RNA and protein molecules and developed a new support vector machine (SVM) model to predict protein-binding nucleotides from RNA and protein sequence data. The new model that used both protein and RNA sequence data achieved a sensitivity of 86.5%, a specificity of 86.2%, a positive predictive value (PPV) of 72.6%, a negative predictive value (NPV) of 93.8% and a Matthews correlation coefficient (MCC) of 0.69 in a 10-fold cross-validation; it achieved a sensitivity of 58.8%, a specificity of 87.4%, a PPV of 65.1%, an NPV of 84.2% and an MCC of 0.48 in independent testing. For comparative purposes, we built another prediction model that used RNA sequence data alone and ran it on the same dataset. In a 10-fold cross-validation it achieved a sensitivity of 85.7%, a specificity of 80.5%, a PPV of 67.7%, an NPV of 92.2% and an MCC of 0.63; in independent testing it achieved a sensitivity of 67.7%, a specificity of 78.8%, a PPV of 57.6%, an NPV of 85.2% and an MCC of 0.45. In both cross-validation and independent testing, the new model that used both RNA and protein sequences showed a better performance than the model that used RNA sequence data alone in most performance measures. To the best of our knowledge, this is the first sequence-based prediction of protein-binding nucleotides in RNA which considers the binding partner of RNA. The new model will provide valuable information for designing biochemical experiments to find putative protein-binding sites in RNA with unknown structure. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
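    The performance measures quoted above follow directly from a confusion matrix; the short sketch below shows the standard formulas (the example counts are invented and are not the study's data).

    ```python
    # Standard binary-classification measures from a confusion matrix
    # (example counts are invented; the formulas match the measures reported).
    import math

    def metrics(tp, fp, tn, fn):
        sens = tp / (tp + fn)                 # sensitivity (recall)
        spec = tn / (tn + fp)                 # specificity
        ppv = tp / (tp + fp)                  # positive predictive value
        npv = tn / (tn + fn)                  # negative predictive value
        mcc = (tp * tn - fp * fn) / math.sqrt(
            (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return sens, spec, ppv, npv, mcc

    sens, spec, ppv, npv, mcc = metrics(tp=865, fp=327, tn=2045, fn=135)
    print(f"Sens {sens:.3f}  Spec {spec:.3f}  PPV {ppv:.3f}  "
          f"NPV {npv:.3f}  MCC {mcc:.2f}")
    ```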

  9. a Hyperspectral Based Method to Detect Cannabis Plantation in Inaccessible Areas

    NASA Astrophysics Data System (ADS)

    Houmi, M.; Mohamadi, B.; Balz, T.

    2018-04-01

    The increase in drug use worldwide has led to sophisticated illegal planting methods. Most countries depend on helicopters and local knowledge to identify such illegal plantations. However, remote sensing techniques can provide special advantages for monitoring the extent of illegal drug production. This paper sought to assess the ability of satellite remote sensing to detect Cannabis plantations. This was achieved in two stages: (1) preprocessing of EO-1 hyperspectral data and testing the capability to collect the spectral signature of Cannabis in different sites of the study area (Morocco) from well-known Cannabis plantation fields; (2) applying the Spectral Angle Mapper (SAM) method, based on a specific angle threshold, to EO-1 Hyperion data in well-known Cannabis plantation sites, and in other sites with no Cannabis plantation in another study area (Algeria), to avoid any false Cannabis detection using these spectra. This study emphasizes the benefits of using hyperspectral remote sensing data as an effective detection tool for illegal Cannabis plantation in inaccessible areas based on the SAM classification method with a maximum angle of less than 0.03 radians.
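    SAM classification reduces to computing the angle between each pixel spectrum and a reference spectrum and keeping pixels below a threshold; a minimal sketch follows. The 0.03 radian threshold is taken from the abstract, while the array shapes, band count, and reference spectrum are illustrative assumptions.

    ```python
    # Spectral Angle Mapper sketch: flag pixels whose spectral angle to a
    # reference (endmember) spectrum is below a threshold in radians.
    # The 0.03 rad threshold comes from the abstract; shapes and the reference
    # spectrum itself are illustrative assumptions.
    import numpy as np

    def spectral_angle(cube, reference):
        """cube: (rows, cols, bands); reference: (bands,). Returns angles in rad."""
        dot = np.tensordot(cube, reference, axes=([2], [0]))
        norm_pix = np.linalg.norm(cube, axis=2)
        norm_ref = np.linalg.norm(reference)
        cos = np.clip(dot / (norm_pix * norm_ref + 1e-12), -1.0, 1.0)
        return np.arccos(cos)

    rng = np.random.default_rng(1)
    hyperion_cube = rng.random((100, 100, 198))   # stand-in for EO-1 Hyperion data
    cannabis_ref = rng.random(198)                # stand-in for a field-collected signature
    angles = spectral_angle(hyperion_cube, cannabis_ref)
    detection_mask = angles < 0.03                # SAM threshold from the study
    print("pixels flagged:", int(detection_mask.sum()))
    ```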

  10. Ultimate pier and contraction scour prediction in cohesive soils at selected bridges in Illinois

    USGS Publications Warehouse

    Straub, Timothy D.; Over, Thomas M.; Domanski, Marian M.

    2013-01-01

    The Scour Rate In COhesive Soils-Erosion Function Apparatus (SRICOS-EFA) method includes an ultimate scour prediction that is the equilibrium maximum pier and contraction scour of cohesive soils over time. The purpose of this report is to present the results of testing the ultimate pier and contraction scour methods for cohesive soils on 30 bridge sites in Illinois. Comparison of the ultimate cohesive and noncohesive methods, along with the Illinois Department of Transportation (IDOT) cohesive soil reduction-factor method and measured scour are presented. Also, results of the comparison of historic IDOT laboratory and field values of unconfined compressive strength of soils (Qu) are presented. The unconfined compressive strength is used in both ultimate cohesive and reduction-factor methods, and knowing how the values from field methods compare to the laboratory methods is critical to the informed application of the methods. On average, the non-cohesive method results predict the highest amount of scour, followed by the reduction-factor method results; and the ultimate cohesive method results predict the lowest amount of scour. The 100-year scour predicted for the ultimate cohesive, noncohesive, and reduction-factor methods for each bridge site and soil are always larger than observed scour in this study, except 12% of predicted values that are all within 0.4 ft of the observed scour. The ultimate cohesive scour prediction is smaller than the non-cohesive scour prediction method for 78% of bridge sites and soils. Seventy-six percent of the ultimate cohesive predictions show a 45% or greater reduction from the non-cohesive predictions that are over 10 ft. Comparing the ultimate cohesive and reduction-factor 100-year scour predictions methods for each bridge site and soil, the scour predicted by the ultimate cohesive scour prediction method is less than the reduction-factor 100-year scour prediction method for 51% of bridge sites and soils. Critical shear stress remains a needed parameter in the ultimate scour prediction for cohesive soils. The unconfined soil compressive strength measured by IDOT in the laboratory was found to provide a good prediction of critical shear stress, as measured by using the erosion function apparatus in a previous study. Because laboratory Qu analyses are time-consuming and expensive, the ability of field-measured Rimac data to estimate unconfined soil strength in the critical shear–soil strength relation was tested. A regression analysis was completed using a historic IDOT dataset containing 366 data pairs of laboratory Qu and field Rimac measurements from common sites with cohesive soils. The resulting equations provide a point prediction of Qu, given any Rimac value with the 90% confidence interval. The prediction equations are not significantly different from the identity Qu = Rimac. The alternative predictions of ultimate cohesive scour presented in this study assume Qu will be estimated using Rimac measurements that include computed uncertainty. In particular, the ultimate cohesive predicted scour is greater than observed scour for the entire 90% confidence interval range for predicting Qu at the bridges and soils used in this study, with the exception of the six predicted values that are all within 0.6 ft of the observed scour.
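    The Qu-Rimac relation described above is, in essence, an ordinary least-squares fit whose prediction equation is compared against the identity line. The sketch below shows that kind of fit on synthetic data; the real analysis used 366 paired IDOT measurements and reported 90% confidence intervals, which are not reproduced here, and the units and data values are assumptions.

    ```python
    # Sketch of regressing laboratory Qu on field Rimac values and checking how
    # close the fit is to the identity line.  Data here are synthetic; the study
    # used 366 paired IDOT measurements and a 90% confidence interval.
    import numpy as np

    rng = np.random.default_rng(42)
    rimac = rng.uniform(0.5, 6.0, size=100)          # field Rimac values (synthetic)
    qu = rimac + rng.normal(0.0, 0.4, size=100)      # lab Qu scattered about identity

    slope, intercept = np.polyfit(rimac, qu, deg=1)
    residuals = qu - (slope * rimac + intercept)
    resid_sd = residuals.std(ddof=2)

    print(f"Qu ~ {slope:.2f} * Rimac + {intercept:.2f}  (residual SD {resid_sd:.2f})")
    # A slope near 1 and intercept near 0 would be consistent with the finding
    # that the prediction equation is not significantly different from Qu = Rimac.
    ```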

  11. Is there a relationship between geographic distance and uptake of HIV testing services? A representative population-based study of Chinese adults in Guangzhou, China.

    PubMed

    Chen, Wen; Zhou, Fangjing; Hall, Brian J; Tucker, Joseph D; Latkin, Carl; Renzaho, Andre M N; Ling, Li

    2017-01-01

    Achieving high coverage of HIV testing services is critical in many health systems, especially where HIV testing services remain centralized and inconvenient for many. As a result, planning the optimal spatial distribution of HIV testing sites is increasingly important. We aimed to assess the relationship between geographic distance and uptake of HIV testing services among the general population in Guangzhou, China. Utilizing spatial epidemiological methods and stratified household random sampling, we studied 666 adults aged 18-59. Computer-assisted interviews assessed self-reported HIV testing history. Spatial scan statistic assessed the clustering of participants who have ever been tested for HIV, and two-level logistic regression models assessed the association between uptake of HIV testing and the mean driving distance from the participant's residence to all HIV testing sites in the research sites. The percentage of participants who have ever been tested for HIV was 25.2% (168/666, 95%CI: 21.9%, 28.5%), and the majority (82.7%) of participants tested for HIV in Centres for Disease Control and Prevention, public hospitals or STIs clinics. None reported using self-testing. Spatial clustering analyses found a hotspot included 48 participants who have ever been tested for HIV and 25.8 expected cases (Rate Ratio = 1.86, P = 0.002). Adjusted two-level logistic regression found an inverse relationship between geographic distance (kilometers) and ever being tested for HIV (aOR = 0.90, 95%CI: 0.84, 0.96). Married or cohabiting participants (aOR = 2.14, 95%CI: 1.09, 4.20) and those with greater social support (aOR = 1.04, 95%CI: 1.01, 1.07) were more likely to be tested for HIV. Our findings underscore the importance of considering the geographical distribution of HIV testing sites to increase testing. In addition, expanding HIV testing coverage by introducing non-facility based HIV testing services and self-testing might be useful to achieve the goal that 90% of people living with HIV knowing their HIV status by the year 2020.
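    The reported adjusted odds ratio of 0.90 per kilometre implies a multiplicative decline in the odds of ever testing as mean driving distance increases; the short sketch below works through that interpretation. The baseline odds value is an illustrative assumption, not an estimate from the study.

    ```python
    # Interpreting the adjusted odds ratio of 0.90 per kilometre of mean driving
    # distance: odds of ever testing for HIV shrink multiplicatively with distance.
    # The baseline odds below is an illustrative assumption, not a study estimate.
    AOR_PER_KM = 0.90

    def odds_at_distance(baseline_odds, distance_km):
        """Odds of ever testing at a given mean driving distance."""
        return baseline_odds * AOR_PER_KM ** distance_km

    baseline_odds = 0.25 / 0.75   # hypothetical: ~25% testing uptake near a site
    for d in (0, 2, 5, 10):
        odds = odds_at_distance(baseline_odds, d)
        prob = odds / (1 + odds)
        print(f"{d:2d} km: odds {odds:.3f}, implied uptake {prob:.1%}")
    ```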

  12. Timescale Correlation between Marine Atmospheric Exposure and Accelerated Corrosion Testing

    NASA Technical Reports Server (NTRS)

    Montgomery, Eliza L.; Calle, Luz Marina; Curran, Jerone C.; Kolody, Mark R.

    2011-01-01

    Evaluation of metal-based structures has long relied on atmospheric exposure test sites to determine corrosion resistance in marine environments. Traditional accelerated corrosion testing relies on mimicking the exposure conditions, often incorporating salt spray and ultraviolet (UV) radiation, and exposing the metal to continuous or cyclic conditions of the corrosive environment. Their success for correlation to atmospheric exposure is often a concern when determining the timescale to which the accelerated tests can be related. Accelerated laboratory testing, which often focuses on the electrochemical reactions that occur during corrosion conditions, has yet to be universally accepted as a useful tool in predicting the long term service life of a metal despite its ability to rapidly induce corrosion. Although visual and mass loss methods of evaluating corrosion are the standard and their use is imperative, a method that correlates timescales from atmospheric exposure to accelerated testing would be very valuable. This work uses surface chemistry to interpret the chemical changes occurring on low carbon steel during atmospheric and accelerated corrosion conditions with the objective of finding a correlation between its accelerated and long-term corrosion performance. The current results of correlating data from marine atmospheric exposure conditions at the Kennedy Space Center beachside corrosion test site, alternating seawater spray, and immersion in typical electrochemical laboratory conditions, will be presented. Key words: atmospheric exposure, accelerated corrosion testing, alternating seawater spray, marine, correlation, seawater, carbon steel, long-term corrosion performance prediction, X-ray photoelectron spectroscopy.

  13. Application of Phytoscreening to Three Hazardous Waste Sites in Arizona

    NASA Astrophysics Data System (ADS)

    Duncan, C.

    2017-12-01

    The great majority of prior phytoscreening applications have been conducted in humid and temperate environments wherein groundwater is relatively shallow (1-6 m deep). The objective of this research is to evaluate its use in semi-arid environments for sites with deeper groundwater (>10 m). To that end, phytoscreening is applied to three chlorinated-solvent hazardous-waste sites in Arizona. Contaminant concentrations were quantifiable in tree-tissue samples collected from two of the sites (Nogales, Park-Euclid). Contaminant concentrations were detectable, but not quantifiable, for the third site. Tree-tissue concentrations of tetrachloroethene (PCE) ranged from approximately 400-5000 ug/kg wet weight for burrobrush, cottonwood, palo verde, and velvet mesquite at the Nogales site. In addition to standard trunk-core samples, leaf samples were collected to test the effectiveness of a less invasive sampling method. Leaf-sample concentrations were quantifiable, but several times lower than the corresponding core-sample concentrations. Comparison of results obtained for the test sites to those reported in the literature suggests that tree species is a major factor mediating observed results. One constraint faced for the Arizona sites was the relative scarcity of mature trees available for sampling, particularly in areas adjacent to industrial zones. The results of this study illustrate that phytoscreening can be used effectively to characterize the presence of groundwater contamination for semi-arid sites with deeper groundwater.

  14. 2012 Groundwater Monitoring and Inspection Report Gnome-Coach, New Mexico, Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-03-01

    Gnome-Coach was the site of a 3-kiloton underground nuclear test conducted in 1961. Surface and subsurface contamination resulted from the underground nuclear testing, post-test drilling, and a groundwater tracer test performed at the site. Surface reclamation and remediation began after the underground testing. A Completion Report was prepared, and the State of New Mexico is currently proceeding with a conditional certificate of completion for the surface. Subsurface corrective action activities began in 1972 and have generally consisted of annual sampling and monitoring of wells near the site. In 2008, the annual site inspections were refined to include hydraulic head monitoring and collection of samples from groundwater monitoring wells onsite using the low-flow sampling method. These activities were conducted during this monitoring period on January 18, 2012. Analytical results from this sampling event indicate that concentrations of tritium, strontium-90, and cesium-137 were generally consistent with concentrations from historical sampling events. The exceptions are the decreases in concentrations of strontium-90 in samples from wells USGS-4 and USGS-8, which were more than 2.5 times lower than last year's results. Well USGS-1 provides water for livestock belonging to area ranchers, and a dedicated submersible pump cycles on and off to maintain a constant volume in a nearby water tank. Water levels in wells USGS-4 and USGS-8 respond to the on/off cycling of the water supply pumping from well USGS-1. Well LRL-7 was not sampled in January, and water levels were still increasing when the transducer data were downloaded in September. A seismic reflection survey was also conducted this year. The survey acquired approximately 13.9 miles of seismic reflection data along 7 profiles on and near the site. These activities were conducted from February 23 through March 10, 2012. The site roads, monitoring well heads, and the monument at surface ground zero were in good condition at the time of the site inspection. However, it was reported in September 2012 that the USGS-1 well head had been damaged by a water truck in April 2012.

  15. Tanzania Health Information Technology (T-HIT) System: Pilot Test of a Tablet-Based System to Improve Prevention of Mother-to-Child Transmission of HIV

    PubMed Central

    Bull, Sheana; Nyanza, Elias C; Ngallaba, Sospatro E

    2018-01-01

    Background The prevention of mother-to-child transmission (PMTCT) of HIV requires innovative solutions. Although routine monitoring is effective in some areas, standardized and easy-to-scale solutions to identify and monitor pregnant women, test them for HIV, and treat them and their children are still lacking. Mobile health (mHealth) offers opportunities for surveillance and reporting in rural areas of low- and middle-income countries. Objective The aim of this study was to document the preliminary impacts of the Tanzania Health Information Technology (T-HIT) system mHealth intervention aimed at health workers for PMTCT care delivery and capacity building in a rural area of Tanzania. Methods We developed T-HIT as a tablet-based electronic data collection system designed to capture and report PMTCT data during antenatal, delivery, and postnatal visits in Misungwi, Tanzania. T-HIT was tested by health workers in a pilot randomized trial comparing seven sites using T-HIT assigned at random to seven control sites; all sites maintained standard paper record-keeping during the pilot intervention period. We compared numbers of antenatal visits, numbers of HIV tests administered, and women testing positive across all sites. Results Health workers recorded data from antenatal visits for 1530 women; of these, 695 (45.42%) were tested for HIV and 3.59% (55/1530) tested positive. Health workers were unable to conduct an HIV test for 103 women (6.73%, 103/1530) because of lack of reagent, which is not captured on paper logs. There was no difference in testing activity when comparing T-HIT sites to non-T-HIT sites. We observed a significant postintervention increase in the number of women testing positive for HIV compared with the preintervention period (P=.04), but this was likely not attributable to the T-HIT system. Conclusions T-HIT had a high degree of acceptability and feasibility and is perceived as useful by health workers, who documented more antenatal visits during the pilot intervention compared with a traditional system of paper logs, suggesting potential for improvements in antenatal care for women at risk for HIV. PMID:29335236

  16. Trends in pesticide concentrations in corn-belt streams, 1996-2006

    USGS Publications Warehouse

    Sullivan, Daniel J.; Vecchia, Aldo V.; Lorenz, David L.; Gilliom, Robert J.; Martin, Jeffrey D.

    2009-01-01

    Trends in the concentrations of commonly occurring pesticides in the Corn Belt of the United States were assessed, and the performance and application of several statistical methods for trend analysis were evaluated. Trends in the concentrations of 11 pesticides with sufficient data for trend assessment were assessed at up to 31 stream sites for two time periods: 1996–2002 and 2000–2006. Pesticides included in the trend analyses were atrazine, acetochlor, metolachlor, alachlor, cyanazine, EPTC, simazine, metribuzin, prometon, chlorpyrifos, and diazinon. The statistical methods applied and compared were (1) a modified version of the nonparametric seasonal Kendall test (SEAKEN), (2) a modified version of the Regional Kendall test, (3) a parametric regression model with seasonal wave (SEAWAVE), and (4) a version of SEAWAVE with adjustment for streamflow (SEAWAVE-Q). The SEAKEN test is a statistical hypothesis test for detecting monotonic trends in seasonal time-series data such as pesticide concentrations at a particular site. Trends across a region, represented by multiple sites, were evaluated using the regional seasonal Kendall test, which computes a test for an overall trend within a region by computing a score for each season at each site and adding the scores to compute the total for the region. The SEAWAVE model is a parametric regression model specifically designed for analyzing seasonal variability and trends in pesticide concentrations. The SEAWAVE-Q model accounts for the effect of changing flow conditions in order to separate changes caused by hydrologic trends from changes caused by other factors, such as pesticide use. There was broad, general agreement between unadjusted trends (no adjustment for streamflow effects) identified by the SEAKEN and SEAWAVE methods, including the regional seasonal Kendall test. Only about 10 percent of the paired comparisons between SEAKEN and SEAWAVE indicated a difference in the direction of trend, and none of these had differences significant at the 10-percent significance level. This consistency of results supports the validity and robustness of all three approaches as trend analysis tools. The SEAWAVE method is favored, however, because it has less restrictive data requirements, enabling analysis for more site/pesticide combinations, and can incorporate adjustment for streamflow (SEAWAVE-Q) with substantially fewer measurements than the flow-adjustment procedure used with SEAKEN. Analysis of flow-adjusted trends is preferable to analysis of non-adjusted trends for evaluating potential effects of changes in pesticide use or management practices because flow-adjusted trends account for the influence of flow-related variability. Analysis of flow-adjusted trends by SEAWAVE-Q showed that all of the pesticides assessed, except simazine and acetochlor, were dominated by varying degrees of concentration downtrends in one or both analysis periods. Atrazine, metolachlor, alachlor, cyanazine, EPTC, and metribuzin (all major corn herbicides), as well as prometon and chlorpyrifos, showed more prevalent concentration downtrends during 1996–2002 compared to 2000–2006. Diazinon had no clear trends during 1996–2002, but had predominantly downward trends during 2000–2006. Acetochlor trends were mixed during 1996–2002 and slightly upward during 2000–2006, but most of the trends were not statistically significant.
Simazine concentrations trended upward at most sites during both 1996–2002 and 2000–2006. Comparison of concentration trends to agricultural-use trends indicated similarity in direction and magnitude for acetochlor, metolachlor, alachlor, cyanazine, EPTC, and metribuzin. Concentration downtrends for atrazine, chlorpyrifos, and diazinon were steeper than agricultural-use downtrends at some sites, indicating the possibility that agricultural management practices may have increasingly reduced transport to streams (particularly atrazine) or, for chlorpyrifos and diazinon, that nonagricultural uses declined substantially. Concentration uptrends for simazine generally were steeper than agricultural-use uptrends, indicating the possibility that nonagricultural uses of this herbicide increased during the study period.
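    The seasonal Kendall approach referenced throughout this abstract sums Mann-Kendall statistics computed separately for each season. The compact sketch below shows only the textbook form of the test; the tie correction, serial-correlation handling, and flow adjustment of the modified SEAKEN and SEAWAVE-Q procedures are not reproduced, and the toy data are invented.

    ```python
    # Compact seasonal Kendall trend test: compute the Mann-Kendall S statistic
    # within each season and sum across seasons.  Ties, serial correlation, and
    # flow adjustment (as in the report's modified SEAKEN) are not handled here.
    import math

    def mann_kendall_s(values):
        """Mann-Kendall S and its variance (no tie correction) for one season."""
        n = len(values)
        s = sum(
            (values[j] > values[i]) - (values[j] < values[i])
            for i in range(n - 1) for j in range(i + 1, n)
        )
        var = n * (n - 1) * (2 * n + 5) / 18.0
        return s, var

    def seasonal_kendall(series_by_season):
        """series_by_season: list of per-season value lists (e.g. 12 months)."""
        s_total = 0.0
        var_total = 0.0
        for season_values in series_by_season:
            s, var = mann_kendall_s(season_values)
            s_total += s
            var_total += var
        # Continuity-corrected normal approximation for the test statistic.
        if s_total > 0:
            z = (s_total - 1) / math.sqrt(var_total)
        elif s_total < 0:
            z = (s_total + 1) / math.sqrt(var_total)
        else:
            z = 0.0
        return s_total, z

    # Toy data: concentrations for 4 seasons over 7 years with a downward drift.
    seasons = [[5.0 - 0.3 * yr + 0.1 * season for yr in range(7)]
               for season in range(4)]
    s_stat, z_stat = seasonal_kendall(seasons)
    print(f"S = {s_stat:.0f}, Z = {z_stat:.2f}  (negative Z suggests a downtrend)")
    ```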

  17. Design and Testing of a Method to Reach Agreement for Responsibilities in Collection Building Among Libraries. Final Report.

    ERIC Educational Resources Information Center

    Redfield, Gretchen

    As a first step towards resource sharing among libraries in the Cleveland Area Metropolitan Library System (CAMLS), a unique method, called the Site Appraisal for Area Resources Inventory (SAFARI), was developed to examine the library collections. This approach was different from others in that collections were compared by experts in a specific…

  18. Summer trapping method for mule deer. [Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giles, K.R.

    1979-07-01

    A summer mule deer trapping method which uses modified Clover traps in a circular corral with water as a bait is described. Drug restraint was used to facilitate safe handling of mule deer by the investigator. Fifteen mule deer were safely captured and outfitted with radio transmitters, ear tags, and reflective markers, and their movements monitored to determine migration patterns.

  19. Development of methods for the restoration of the American elm in forested landscapes

    Treesearch

    James M. Slavicek

    2013-01-01

    A project was initiated in 2003 to establish test sites to develop methods to reintroduce the American elm (Ulmus americana L.) in forested landscapes. American elm tree strains with high levels of tolerance to Dutch elm disease (DED) were established in areas where the trees can naturally regenerate and spread. The process of regeneration will...

  20. Integrated use of surface geophysical methods for site characterization — A case study in North Kingstown, Rhode Island

    USGS Publications Warehouse

    Johnson, Carole D.; Lane, John W.; Brandon, William C.; Williams, Christine A.P.; White, Eric A.

    2010-01-01

    A suite of complementary, non‐invasive surface geophysical methods was used to assess their utility for site characterization in a pilot investigation at a former defense site in North Kingstown, Rhode Island. The methods included frequency‐domain electromagnetics (FDEM), ground‐penetrating radar (GPR), electrical resistivity tomography (ERT), and multi‐channel analysis of surface‐wave (MASW) seismic. The results of each method were compared to each other and to drive‐point data from the site. FDEM was used as a reconnaissance method to assess buried utilities and anthropogenic structures; to identify near‐surface changes in water chemistry related to conductive leachate from road‐salt storage; and to investigate a resistive signature possibly caused by groundwater discharge. Shallow anomalies observed in the GPR and ERT data were caused by near‐surface infrastructure and were consistent with anomalies observed in the FDEM data. Several parabolic reflectors were observed in the upper part of the GPR profiles, and a fairly continuous reflector that was interpreted as bedrock could be traced across the lower part of the profiles. MASW seismic data showed a sharp break in shear wave velocity at depth, which was interpreted as the overburden/bedrock interface. The MASW profile indicates the presence of a trough in the bedrock surface in the same location where the ERT data indicate lateral variations in resistivity. Depths to bedrock interpreted from the ERT, MASW, and GPR profiles were similar and consistent with the depths of refusal identified in the direct‐push wells. The interpretations of data collected using the individual methods yielded non‐unique solutions with considerable uncertainty. Integrated interpretation of the electrical, electromagnetic, and seismic geophysical profiles produced a more consistent and unique estimation of depth to bedrock that is consistent with ground‐truth data at the site. This test case shows that using complementary techniques that measure different properties can be more effective for site characterization than a single‐method investigation.

  1. Defining the local nerve blocks for feline distal thoracic limb surgery: a cadaveric study

    PubMed Central

    Enomoto, Masataka; Lascelles, B Duncan X; Gerard, Mathew P

    2016-01-01

    Objectives Though controversial, onychectomy remains a commonly performed distal thoracic limb surgical procedure in cats. Peripheral nerve block techniques have been proposed in cats undergoing onychectomy but evidence of efficacy is lacking. Preliminary tests of the described technique using cadavers resulted in incomplete staining of nerves. The aim of this study was to develop nerve block methods based on cadaveric dissections and test these methods with cadaveric dye injections. Methods Ten pairs of feline thoracic limbs (n = 20) were dissected and superficial branches of the radial nerve (RSbr nn.), median nerve (M n.), dorsal branch of ulnar nerve (UDbr n.), superficial branch of palmar branch of ulnar nerve (UPbrS n.) and deep branch of palmar branch of ulnar nerve (UPbrDp n.) were identified. Based on these dissections, a four-point block was developed and tested using dye injections in another six pairs of feline thoracic limbs (n = 12). Using a 25 G × 5/8 inch needle and 1 ml syringe, 0.07 ml/kg methylene blue was injected at the site of the RSbr nn., 0.04 ml/kg at the injection site of the UDbr n., 0.08 ml/kg at the injection site of the M n. and UPbrS n., and 0.01 ml/kg at the injection site of the UPbrDp n. The length and circumference of each nerve that was stained was measured. Results Positive staining of all nerves was observed in 12/12 limbs. The lengths stained for RSbr nn., M n., UDbr n., UPbrS n. and UPbrDp n. were 34.9 ± 5.3, 26.4 ± 4.8, 29.2 ± 4.0, 39.1 ± 4.3 and 17.5 ± 3.3 mm, respectively. The nerve circumferences stained were 93.8 ± 15.5, 95.8 ± 9.7, 100 ± 0.0, 100 ± 0.0 and 93.8 ± 15.5%, respectively. Conclusions and relevance This described four-point injection method may be an effective perioperative analgesia technique for feline distal thoracic limb procedures. PMID:26250858

  2. General Review On The Investigations Conducted At Institute Saint-Louis (ISL) In The Field Of Holographic Nondestructive Testing

    NASA Astrophysics Data System (ADS)

    Smigielski, P.

    1982-10-01

    Among the various methods presently used in the field of nondestructive testing, optical holography is expected to become a very useful and promising tool in the near future. In fact, holography offers a number of advantages which should be briefly outlined here: direct and overall visualization of defects (disbonding, formation of cracks, inhomogeneities...) on large surfaces (of several square meters). Furthermore, there is no interaction with the object under test and the surface to be studied does not have to be treated. Finally, holography is characterized by a high spatial resolution and a great sensitivity (it is possible to detect deformations as small as a few microns). In contrast to other modern techniques, holography is relatively inexpensive and can be used on-site with pulsed lasers. The general principles of holography and of methods using holographic interferometry will be recalled (double-exposure holographic interferometry, real-time holographic interferometry, "time-average" holographic interferometry). Thereafter the activities in which ISL is presently engaged will be reported briefly, that is, laboratory feasibility tests and experiments conducted on-site in an industrial environment with the aid, in general, of pulsed ruby lasers: testing of adhesive bonding in solid propellant rockets and in aircraft structures, detection and observation of cracking in fatigue tests, visualization of the modes of vibration of mechanical structures, experiments conducted on aircraft subjected to maintenance checking, etc.

  3. Ultra-portable field transfer radiometer for vicarious calibration of earth imaging sensors

    NASA Astrophysics Data System (ADS)

    Thome, Kurtis; Wenny, Brian; Anderson, Nikolaus; McCorkel, Joel; Czapla-Myers, Jeffrey; Biggar, Stuart

    2018-06-01

    A small portable transfer radiometer has been developed as part of an effort to ensure the quality of upwelling radiance from test sites used for vicarious calibration in the solar reflective. The test sites are used to predict top-of-atmosphere reflectance relying on ground-based measurements of the atmosphere and surface. The portable transfer radiometer is designed for one-person operation for on-site field calibration of instrumentation used to determine ground-leaving radiance. The current work describes the detector- and source-based radiometric calibration of the transfer radiometer highlighting the expected accuracy and SI-traceability. The results indicate differences between the detector-based and source-based results greater than the combined uncertainties of the approaches. Results from recent field deployments of the transfer radiometer using a solar radiation based calibration agree with the source-based laboratory calibration within the combined uncertainties of the methods. The detector-based results show a significant difference to the solar-based calibration. The source-based calibration is used as the basis for a radiance-based calibration of the Landsat-8 Operational Land Imager that agrees with the OLI calibration to within the uncertainties of the methods.

  4. Cross-site strain comparison of pharmacological deficits in the touchscreen visual discrimination test.

    PubMed

    Mohler, Eric G; Ding, Zhiyong; Rueter, Lynne E; Chapin, Douglas; Young, Damon; Kozak, Rouba

    2015-11-01

    The low rate of success for identifying effective treatments for cognitive dysfunction has prompted recent efforts to improve pharmaceutical discovery and development. In particular, investigators have emphasized improving translation from pre-clinical to clinical research. A specific area of focus has been touchscreen technology; this computer-automated behavioral testing method provides an objective assessment of performance that can be used across species. As part of a larger multi-site study with partners from the Innovative Medicines Initiative (IMI), two US sites, AbbVie and Pfizer, conducted a cross-site experiment with a common protocol for the visual discrimination (VD) task using identical testing equipment, stimuli, and rats of the same strains, sex, and age from the same supplier. As most touchscreen-based rodent experiments have used Lister-Hooded rats that are not readily available outside of Europe, a strain comparison with male Long-Evans rats was conducted as part of the study. Rats were trained for asymptotic performance, and test sessions were performed once per week in a full crossover design with cognition-impairing drugs. Drugs tested were phencyclidine and S-ketamine (N-methyl-D-aspartate (NMDA) antagonists), D-amphetamine (indirect dopamine agonist), and scopolamine (muscarinic antagonist). Satellite brain and plasma samples were taken to confirm appropriate exposures. Results indicate that both rat strains show similar patterns of impairment, although Lister-Hooded rats were more sensitive than Long-Evans rats to three out of four drugs tested. This suggests that researchers should fully explore dose-response relationships in their strain of choice and use care in the interpretation of reversal of cognitive impairment.

  5. Online clinical reasoning assessment with the Script Concordance test: a feasibility study

    PubMed Central

    Sibert, Louis; Darmoni, Stefan J; Dahamna, Badisse; Weber, Jacques; Charlin, Bernard

    2005-01-01

    Background The script concordance (SC) test is an assessment tool that measures capacity to solve ill-defined problems, that is, reasoning in context of uncertainty. This tool has been used up to now mainly in medicine. The purpose of this pilot study is to assess the feasibility of the test delivered on the Web to French urologists. Methods The principle of SC test construction and the development of the Web site are described. A secure Web site was created with two sequential modules: (a) The first one for the reference panel (n = 26) with two sub-tasks: to validate the content of the test and to elaborate the scoring system; (b) The second for candidates with different levels of experience in Urology: Board certified urologists, residents, medical students (5 or 6th year). Minimum expected number of participants is 150 for urologists, 100 for residents and 50 for medical students. Each candidate is provided with an individual access code to this Web site. He/she may complete the Script Concordance test several times during his/her curriculum. Results The Web site has been operational since April 2004. The reference panel validated the test in June of the same year during the annual seminar of the French Society of Urology. The Web site is available for the candidates since September 2004. In six months, 80% of the target figure for the urologists, 68% of the target figure for the residents and 20% of the target figure for the student passed the test online. During these six months, no technical problem was encountered. Conclusion The feasibility of the web-based SC test is successful as two-thirds of the expected number of participants was included within six months. Psychometric properties (validity, reliability) of the test will be evaluated on a large scale (N = 300). If positive, educational impact of this assessment tool will be useful to help urologists during their curriculum for the acquisition of clinical reasoning skills, which is crucial for professional competence. PMID:15967034

  6. Experimental Detection and Characterization of Void using Time-Domain Reflection Wave

    NASA Astrophysics Data System (ADS)

    Zahari, M. N. H.; Madun, A.; Dahlan, S. H.; Joret, A.; Zainal Abidin, M. H.; Mohammad, A. H.; Omar, A. H.

    2018-04-01

    Recent technologies in engineering have brought significant improvements in performance and precision. One of these improvements is in geophysical studies for subsurface detection. The reflection method has been demonstrated in previous studies to be able to detect and locate subsurface anomalies, including voids. Conventional approaches involve field testing only over limited areas, which may leave void positions undiscovered. Problems arise when voids are not recognised at an early stage, causing hazards and increased costs, and potentially leading to serious accidents and structural damage. Therefore, to achieve greater certainty in the site investigation, a dynamic approach needs to be implemented. To better estimate and characterize the anomaly signal, an attempt has been made to model an air-filled void in experimental testing at the site. Robust, low-cost detection and characterization of voids using the reflection method is proposed to improve the detectability and characterization of the void. The results show 2-dimensional and 3-dimensional analyses of the void based on reflection data with a P-wave velocity of 454.54 m/s.
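    With the reported P-wave velocity, the depth of a reflector follows from the two-way travel time of the reflected arrival; a one-function sketch is given below. The travel-time values are hypothetical and the flat-reflector geometry is an assumption.

    ```python
    # Converting two-way reflection travel time to depth using the reported
    # P-wave velocity of 454.54 m/s (travel times below are hypothetical).
    V_P = 454.54  # m/s, from the field test

    def reflector_depth(two_way_time_s, velocity=V_P):
        """Depth of a horizontal reflector: the wave travels down and back up."""
        return velocity * two_way_time_s / 2.0

    for t in (0.005, 0.010, 0.020):  # seconds
        print(f"t = {t*1000:.0f} ms -> depth ~ {reflector_depth(t):.2f} m")
    ```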

  7. Testing Mars-inspired operational strategies for semi-autonomous rovers on the Moon: The GeoHeuristic Operational Strategies Test in New Mexico.

    PubMed

    Yingst, R Aileen; Cohen, B A; Crumpler, L; Schmidt, M E; Schrader, C M

    2011-01-01

    We tested the science operational strategy used for the Mars Exploration Rover (MER) mission on Mars to determine its suitability for conducting remote geology on the Moon by conducting a field test at Cerro de Santa Clara, New Mexico. This region contains volcanic and sedimentary products from a variety of provenances, mimicking the variety that might be found at a lunar site such as South Pole-Aitken Basin. At each site a Science Team broke down observational "days" into a sequence of observations of features and targets of interest. The number, timing, and sequence of observations was chosen to mimic those used by the MERs when traversing. Images simulating high-resolution stereo and hand lens-scale images were taken using a professional SLR digital camera; multispectral and XRD data were acquired from samples to mimic the availability of geochemical data. A separate Tiger Team followed the Science Team and examined each site using traditional terrestrial field methods, facilitating comparison between what was revealed by human versus rover-inspired methods. We conclude from this field test that MER-inspired methodology is not conducive to utilizing all acquired data in a timely manner for the case of any lunar architecture that involves the acquisition of rover data in near real-time. We additionally conclude that a methodology similar to that used for MER can be adapted for use on the Moon if mission goals are focused on reconnaissance. If the goal is to locate and identify a specific feature or material, such as water ice, a different methodology will likely be needed.

  8. Identifying protein phosphorylation sites with kinase substrate specificity on human viruses.

    PubMed

    Bretaña, Neil Arvin; Lu, Cheng-Tsung; Chiang, Chiu-Yun; Su, Min-Gang; Huang, Kai-Yao; Lee, Tzong-Yi; Weng, Shun-Long

    2012-01-01

    Viruses infect humans and progress inside the body, leading to various diseases and complications. The phosphorylation of viral proteins catalyzed by host kinases plays crucial regulatory roles in enhancing replication and inhibiting normal host-cell functions. Due to its biological importance, there is a desire to identify the protein phosphorylation sites on human viruses. However, mass spectrometry-based experiments have proven to be expensive and labor-intensive. Furthermore, previous studies that identified phosphorylation sites in human viruses did not investigate the responsible kinases. Thus, we are motivated to propose a new method to identify protein phosphorylation sites, together with their kinase substrate specificity, on human viruses. The experimentally verified phosphorylation data were extracted from virPTM, a database containing 301 experimentally verified phosphorylation sites on 104 human kinase-phosphorylated virus proteins. To investigate kinase substrate specificities in viral protein phosphorylation sites, maximal dependence decomposition (MDD) is employed to cluster a large set of phosphorylation data into subgroups containing significantly conserved motifs. The experimental human phosphorylation sites are collected from Phospho.ELM, grouped according to their kinase annotations, and compared with the virus MDD clusters. This investigation identifies human kinases such as CK2, PKB, CDK, and MAPK as potential kinases for catalyzing virus protein substrates, as confirmed by published literature. A profile hidden Markov model (HMM) is then applied to learn a predictive model for each subgroup. A five-fold cross-validation of the MDD-clustered HMMs yields an average accuracy of 84.93% for serine and 78.05% for threonine. Furthermore, an independent testing data set collected from UniProtKB and Phospho.ELM is used to compare predictive performance against three popular kinase-specific phosphorylation site prediction tools. In the independent testing, the high sensitivity and specificity of the proposed method demonstrate the predictive effectiveness of the identified substrate motifs and the importance of investigating potential kinases for viral protein phosphorylation sites.
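    The five-fold cross-validation reported above can be reproduced generically as follows. The sketch below is not the authors' profile-HMM pipeline; it only shows the evaluation scheme, with the training and prediction functions left as user-supplied placeholders.

    ```python
    # Generic five-fold cross-validation sketch (not the authors' profile-HMM
    # pipeline): split labelled sites into five folds, train on four folds,
    # evaluate accuracy on the held-out fold, then average the five accuracies.
    import random

    def five_fold_accuracy(samples, labels, train_fn, predict_fn, k=5, seed=0):
        """samples/labels: parallel lists; train_fn(X, y) -> model;
        predict_fn(model, X) -> predicted labels."""
        idx = list(range(len(samples)))
        random.Random(seed).shuffle(idx)
        folds = [idx[i::k] for i in range(k)]
        accuracies = []
        for held_out in folds:
            held = set(held_out)
            train_idx = [i for i in idx if i not in held]
            model = train_fn([samples[i] for i in train_idx],
                             [labels[i] for i in train_idx])
            preds = predict_fn(model, [samples[i] for i in held_out])
            correct = sum(p == labels[i] for p, i in zip(preds, held_out))
            accuracies.append(correct / len(held_out))
        return sum(accuracies) / k

    # Toy demonstration with a trivial majority-class "model".
    xs = list(range(20))
    ys = [1] * 12 + [0] * 8
    train = lambda X, y: max(set(y), key=y.count)   # majority label
    predict = lambda model, X: [model] * len(X)
    print(five_fold_accuracy(xs, ys, train, predict))
    ```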

  9. The Use of Tooth Particles as a Biomaterial in Post-Extraction Sockets. Experimental Study in Dogs.

    PubMed

    Calvo-Guirado, José Luis; Maté-Sánchez de Val, José Eduardo; Ramos-Oltra, María Luisa; Pérez-Albacete Martínez, Carlos; Ramírez-Fernández, María Piedad; Maiquez-Gosálvez, Manuel; Gehrke, Sergio A; Fernández-Domínguez, Manuel; Romanos, Georgios E; Delgado-Ruiz, Rafael Arcesio

    2018-05-06

    Objectives: The objective of this study was to evaluate new bone formation derived from freshly crushed extracted teeth, grafted immediately in post-extraction sites in an animal model, compared with sites left unfilled, evaluated at 30 and 90 days. Material and Methods: The bilateral premolars P2, P3, P4 and the first mandibular molar were extracted atraumatically from six Beagle dogs. The clean, dry teeth were ground immediately using the Smart Dentin Grinder. The tooth particles obtained were then sieved through a special sorting filter into two compartments: the upper container isolated particles over 1200 μm, and the lower container isolated particles over 300 μm. The crushed teeth were grafted into the post-extraction sockets at P3, P4 and M1 (test group; larger and smaller post-extraction alveoli), while P2 sites were left unfilled and acted as the control group. Tissue healing and bone formation were evaluated by histological and histomorphometric analysis after 30 and 90 days. Results: At 30 days, bone formation was greater in the test group than in the control group (p < 0.05); less immature bone was observed in the test group (25.71%) than in the control group (55.98%). At 90 days, significantly more new bone formation was found in the test group than in the control group. No significant differences in new bone formation were found when comparing the small and large post-extraction alveoli. Conclusions: Tooth particles obtained from dogs' teeth and grafted immediately after extraction can be considered a suitable biomaterial for socket preservation.

  10. Factors associated with degree of atopy in Latino children in a nationwide pediatric sample: The GALA II Study

    PubMed Central

    Kumar, Rajesh; Nguyen, Elizabeth A; Roth, Lindsey A; Oh, Sam S; Gignoux, Christopher R.; Huntsman, Scott; Eng, Celeste; Moreno-Estrada, Andres; Sandoval, Karla; Peñaloza-Espinosa, Rosenda; López-López, Marisol; Avila, Pedro C.; Farber, Harold J.; Tcheurekdjian, Haig; Rodriguez-Cintron, William; Rodriguez-Santana, Jose R; Serebrisky, Denise; Thyne, Shannon M.; Williams, L. Keoki; Winkler, Cheryl; Bustamante, Carlos D.; Pérez-Stable, Eliseo J.; Borrell, Luisa N.; Burchard, Esteban G

    2013-01-01

    Background Atopy varies by ethnicity, even within Latino groups. This variation may be due to environmental, socio-cultural, or genetic factors. Objective To examine risk factors for atopy within a nationwide study of U.S. Latino children with and without asthma. Methods Aeroallergen skin test response was analyzed in 1830 US Latino subjects. Key determinants of atopy included: country/region of origin, generation in the U.S., acculturation, genetic ancestry, and the site to which individuals migrated. Serial multivariate zero-inflated negative binomial regressions, stratified by asthma status, examined the association of each key determinant variable with the number of positive skin tests. In addition, the independent effect of each key variable was determined by including all key variables in the final models. Results In baseline analyses, African ancestry was associated with 3 times as many positive skin tests in participants with asthma (95% CI: 1.62–5.57) and 3.26 times as many positive skin tests in control participants (95% CI: 1.02–10.39). Generation and recruitment site were also associated with atopy in crude models. In final models adjusted for key variables, Puerto Rican [exp(β) (95% CI): 1.31 (1.02–1.69)] and mixed-ethnicity [exp(β) (95% CI): 1.27 (1.03–1.56)] asthmatics had a greater probability of positive skin tests compared to Mexican asthmatics. Ancestry associations were abrogated by recruitment site, but not by region of origin. Conclusions Puerto Rican ethnicity and mixed origin were associated with degree of atopy among U.S. Latino children with asthma. African ancestry was not associated with degree of atopy after adjusting for recruitment site. Local environmental variation, represented by site, was associated with degree of sensitization. PMID:23684070
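    The core model above is a zero-inflated negative binomial regression on a count outcome (the number of positive skin tests). A minimal, self-contained sketch of that model class, fitted on synthetic data with statsmodels, is shown below; the variable names and data are placeholders, not the GALA II data set.

    ```python
    # Sketch of a zero-inflated negative binomial regression for a count
    # outcome (number of positive skin tests), using statsmodels. The data
    # frame and column names below are synthetic placeholders.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "african_ancestry": rng.uniform(0, 1, n),   # proportion of ancestry
        "generation": rng.integers(1, 4, n),        # generation in the U.S.
    })
    # Synthetic count outcome with excess zeros.
    lam = np.exp(0.5 + 1.0 * df["african_ancestry"])
    counts = rng.poisson(lam)
    counts[rng.uniform(size=n) < 0.3] = 0
    df["positive_tests"] = counts

    X = sm.add_constant(df[["african_ancestry", "generation"]])
    model = ZeroInflatedNegativeBinomialP(
        df["positive_tests"], X,
        exog_infl=sm.add_constant(df[["generation"]]))
    result = model.fit(maxiter=200, disp=0)
    # Exponentiated coefficients; for the count part these are rate ratios,
    # analogous to the exp(beta) values quoted in the abstract.
    print(np.exp(result.params))
    ```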

  11. Using Geographic Information Systems to Determine Site Suitability for a Low-Level Radioactive Waste Storage Facility.

    PubMed

    Wilson, Charles A; Matthews, Kennith; Pulsipher, Allan; Wang, Wei-Hsung

    2016-02-01

    Radioactive waste is an inevitable product of using radioactive material in education and research activities, medical applications, energy generation, and weapons production. Low-level radioactive waste (LLW) makes up the majority of the radioactive waste produced in the United States. In 2010, over two million cubic feet of LLW were shipped to disposal sites. Despite efforts from several states and compacts as well as from private industry, the options for proper disposal of LLW remain limited. New methods for quickly identifying potential storage locations could alleviate current challenges, eventually provide additional sites, and allow for adequate regional disposal of LLW. Furthermore, these methods need to be designed so that they are easily communicated to the public. A Geographic Information Systems (GIS)-based method was developed to determine the suitability of potential LLW disposal (or storage) sites. Criteria and other parameters of suitability were based on Code of Federal Regulations (CFR) requirements as well as supporting literature and reports. The resulting method was used to assess areas suitable for further evaluation as prospective disposal sites in Louisiana. Criteria were derived from the 10 minimum requirements in 10 CFR Part 61.50, the Nuclear Regulatory Commission's Regulatory Guide 0902, and studies at existing disposal sites. A suitability formula was developed permitting the use of weighting factors and normalization of all criteria. Data were compiled into GIS data sets and analyzed on a cell grid of approximately 14,000 cells (covering 181,300 square kilometers) using the suitability formula. Requirements were analyzed for each cell using multiple criteria and sub-criteria, as well as surrogates for unavailable datasets; additional criteria were added when appropriate. The method designed in this project proved sufficient for initial screening to determine the most suitable areas for prospective disposal (or storage) sites. The cells above 90%, 95%, and 99% suitability numbered 404, 88, and 4, respectively, and are suitable for further analysis. With these areas identified, the next step in siting an LLW storage facility would be on-site analysis using additional requirements as specified by relevant regulatory guidelines. The GIS-based method provides an easy, economical, efficient, and effective means of evaluating potential sites for LLW storage facilities where sufficient GIS data exist.
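    A weighted, normalized suitability score of the kind described above can be sketched as follows; the criterion names, weights, and cell values are purely illustrative and are not taken from the Louisiana study.

    ```python
    # Minimal sketch of a weighted, normalized suitability score per grid cell.
    # Criterion names, weights, and values are illustrative only.
    import numpy as np

    def suitability(criteria: dict, weights: dict) -> np.ndarray:
        """criteria: raw criterion values per cell (higher = better);
        weights: relative importance of each criterion. Returns 0-1 scores.
        Assumes each criterion actually varies across cells."""
        total_w = sum(weights.values())
        score = np.zeros_like(next(iter(criteria.values())), dtype=float)
        for name, values in criteria.items():
            v = np.asarray(values, dtype=float)
            norm = (v - v.min()) / (v.max() - v.min())   # rescale to 0-1
            score += weights[name] / total_w * norm
        return score

    # Example: five cells, two hypothetical criteria.
    cells = suitability(
        {"depth_to_groundwater_m": np.array([3, 10, 25, 8, 15]),
         "distance_to_fault_km":   np.array([1, 5, 20, 2, 12])},
        {"depth_to_groundwater_m": 2.0, "distance_to_fault_km": 1.0},
    )
    print(cells > 0.9)   # cells above 90% suitability
    ```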

  12. MICROFLORA INVOLVED IN PHYTOREMEDIATION OF POLYAROMATIC HYDROCARBONS

    EPA Science Inventory

    This research was accomplished by conducting a series of integrated studies starting with field work at contaminated sites, followed by laboratory studies based on the field work, and concluded with development of preliminary molecular monitoring methods tested on field sam...

  13. Acceleration of Binding Site Comparisons by Graph Partitioning.

    PubMed

    Krotzky, Timo; Klebe, Gerhard

    2015-08-01

    The comparison of protein binding sites is a prominent task in computational chemistry and has been studied in many different ways. For the automatic detection and comparison of putative binding cavities, the Cavbase system has been developed, which uses a coarse-grained set of pseudocenters to represent the physicochemical properties of a binding site and employs a graph-based procedure to calculate similarities between two binding sites. However, the comparison of two graphs is computationally quite demanding, which makes large-scale studies such as the rapid screening of entire databases hardly feasible. In a recent work, we proposed the Local Cliques (LC) method for the efficient comparison of Cavbase binding sites. It employs a clique heuristic to detect the maximum common subgraph of two binding sites and an extended graph model to additionally compare the shape of individual surface patches. In this study, we present an alternative that further accelerates the LC method by partitioning the binding-site graphs into disjoint components prior to their comparison. The pseudocenter sets are split with regard to their assigned physicochemical type, which leads to seven much smaller graphs than the original one. Applying this approach to the same test scenarios as in the former comprehensive comparison results in a significant speed-up without sacrificing accuracy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
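    The partitioning step can be illustrated with a small sketch: split a node-attributed graph into one subgraph per physicochemical type and compare the pieces instead of the full graphs. This is not the Cavbase/LC implementation; the attribute name and the seven type labels are assumptions for illustration.

    ```python
    # Sketch of splitting a binding-site graph into per-type subgraphs before
    # comparison, in the spirit of the partitioning described above.
    # The "ptype" attribute and the seven labels are assumed for illustration.
    import networkx as nx

    PSEUDOCENTER_TYPES = ["donor", "acceptor", "doneptor", "aromatic",
                          "aliphatic", "pi", "metal"]   # illustrative labels

    def partition_by_type(site_graph: nx.Graph) -> dict:
        """Return one (much smaller) subgraph per physicochemical type."""
        parts = {}
        for t in PSEUDOCENTER_TYPES:
            nodes = [n for n, data in site_graph.nodes(data=True)
                     if data.get("ptype") == t]
            parts[t] = site_graph.subgraph(nodes).copy()
        return parts

    # Toy usage: a three-pseudocenter graph.
    g = nx.Graph()
    g.add_node(1, ptype="donor")
    g.add_node(2, ptype="acceptor")
    g.add_node(3, ptype="donor")
    g.add_edge(1, 3, distance=4.2)
    print({t: p.number_of_nodes() for t, p in partition_by_type(g).items()})
    ```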

  14. iSS-PseDNC: identifying splicing sites using pseudo dinucleotide composition.

    PubMed

    Chen, Wei; Feng, Peng-Mian; Lin, Hao; Chou, Kuo-Chen

    2014-01-01

    In eukaryotic genes, exons are generally interrupted by introns. Accurately removing introns and joining exons together are essential processes in eukaryotic gene expression. With the avalanche of genome sequences generated in the postgenomic age, it is highly desired to develop automated methods for rapid and effective detection of splice sites that play important roles in gene structure annotation and even in RNA splicing. Although a series of computational methods were proposed for splice site identification, most of them neglected the intrinsic local structural properties. In the present study, a predictor called "iSS-PseDNC" was developed for identifying splice sites. In the new predictor, the sequences were formulated by a novel feature-vector called "pseudo dinucleotide composition" (PseDNC) into which six DNA local structural properties were incorporated. It was observed by the rigorous cross-validation tests on two benchmark datasets that the overall success rates achieved by iSS-PseDNC in identifying splice donor site and splice acceptor site were 85.45% and 87.73%, respectively. It is anticipated that iSS-PseDNC may become a useful tool for identifying splice sites and that the six DNA local structural properties described in this paper may provide novel insights for in-depth investigations into the mechanism of RNA splicing.
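    The plain dinucleotide-composition part of such a feature vector can be sketched as below; the full PseDNC descriptor additionally weaves in the six DNA local structural properties, which are omitted here for brevity.

    ```python
    # Sketch of the 16-element dinucleotide composition of a DNA sequence,
    # the simple counting core behind descriptors like PseDNC (the structural
    # property terms of the full PseDNC feature are not included here).
    from itertools import product

    DINUCLEOTIDES = ["".join(p) for p in product("ACGT", repeat=2)]  # 16 pairs

    def dinucleotide_composition(seq: str) -> list:
        seq = seq.upper()
        counts = {d: 0 for d in DINUCLEOTIDES}
        for i in range(len(seq) - 1):
            pair = seq[i:i + 2]
            if pair in counts:        # skips pairs containing N or other codes
                counts[pair] += 1
        total = max(sum(counts.values()), 1)
        return [counts[d] / total for d in DINUCLEOTIDES]

    print(dinucleotide_composition("AGGTAAGTCT"))  # toy sequence around a donor site
    ```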

  15. CATE 2016 Indonesia: Image Calibration, Intensity Calibration, and Drift Scan

    NASA Astrophysics Data System (ADS)

    Hare, H. S.; Kovac, S. A.; Jensen, L.; McKay, M. A.; Bosh, R.; Watson, Z.; Mitchell, A. M.; Penn, M. J.

    2016-12-01

    The citizen Continental America Telescopic Eclipse (CATE) experiment aims to provide equipment for 60 sites across the path of totality for the August 21st, 2017 total solar eclipse over the United States. The opportunity to gather ninety minutes of continuous images of the solar corona is unmatched by any previous eclipse event. In March 2016, five teams were sent to Indonesia to test CATE equipment and procedures on the March 9th, 2016 total solar eclipse. A further goal of the trip was to practice procedures and to gather data for testing data-reduction methods. Of the five teams, four collected data. While in Indonesia, each group participated in community outreach at the location of its site. The 2016 eclipse allowed CATE to test the calibration techniques for the 2017 eclipse. Calibration dark-current and flat-field images were collected to remove variation across the cameras. Drift-scan observations provided information to rotationally align the images from each site, and their intensity values allowed for intensity calibration of each site. A GPS at each site corrected for major computer errors in the time measurement of images. Further refinement of these processes is required before the 2017 eclipse. This work was made possible through the NSO Training for the 2017 Citizen CATE Experiment funded by NASA (NASA NNX16AB92A).
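    The dark-current and flat-field step mentioned above is the standard image calibration (raw - dark) divided by a normalized (flat - dark); a minimal sketch with plain NumPy arrays is given below, with file handling and exposure matching omitted.

    ```python
    # Sketch of the standard dark/flat calibration step referred to above.
    # Inputs are plain NumPy arrays of equal shape; exposure matching and
    # file handling are omitted for brevity.
    import numpy as np

    def calibrate(raw: np.ndarray, dark: np.ndarray, flat: np.ndarray) -> np.ndarray:
        """Subtract dark current, then divide by the normalized flat field."""
        flat_corrected = flat - dark
        flat_norm = flat_corrected / np.median(flat_corrected)
        return (raw - dark) / flat_norm
    ```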

  16. Pore-water extraction from unsaturated tuff by triaxial and one-dimensional compression methods, Nevada Test Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mower, T.E.; Higgins, J.D.; Yang, In C.

    1994-07-01

    The hydrologic system in the unsaturated tuff at Yucca Mountain, Nevada, is being evaluated for the US Department of Energy by the Yucca Mountain Project Branch of the US Geological Survey as a potential site for a high-level radioactive-waste repository. Part of this investigation includes a hydrochemical study that is being made to assess characteristics of the hydrologic system such as: traveltime, direction of flow, recharge and source relations, and types and magnitudes of chemical reactions in the unsaturated tuff. In addition, this hydrochemical information will be used in the study of the dispersive and corrosive effects of unsaturated-zone water on the radioactive-waste storage canisters. This report describes the design and validation of laboratory experimental procedures for extracting representative samples of uncontaminated pore water from welded and nonwelded, unsaturated tuffs from the Nevada Test Site.

  17. Dutch X-band SLAR calibration

    NASA Technical Reports Server (NTRS)

    Groot, J. S.

    1990-01-01

    In August 1989 the NASA/JPL airborne P/L/C-band DC-8 SAR participated in several remote sensing campaigns in Europe. Among other test sites, data were obtained over the Flevopolder test site in the Netherlands on August 16th. The Dutch X-band SLAR was flown on the same date and imaged parts of the same area as the SAR. To calibrate the two imaging radars, a set of 33 calibration devices was deployed; 16 trihedrals were used to calibrate part of the SLAR data. This short paper outlines the X-band SLAR characteristics, the experimental set-up, and the calibration method used to calibrate the SLAR data. Finally, some preliminary results are given.

  18. On the use of haplotype phylogeny to detect disease susceptibility loci

    PubMed Central

    Bardel, Claire; Danjean, Vincent; Hugot, Jean-Pierre; Darlu, Pierre; Génin, Emmanuelle

    2005-01-01

    Background The cladistic approach proposed by Templeton has been presented as promising for the study of the genetic factors involved in common diseases. This approach allows the joint study of multiple markers within a gene by considering haplotypes and grouping them in nested clades. The idea is to search for clades with an excess of cases as compared to the whole sample and to identify the mutations defining these clades as potential candidate disease susceptibility sites. However, the performance of this approach for the study of the genetic factors involved in complex diseases has never been studied. Results In this paper, we propose a new method to perform such a cladistic analysis and we estimate its power through simulations. We show that under models where the susceptibility to the disease is caused by a single genetic variant, the cladistic test is neither substantially more powerful at detecting an association nor more efficient at localizing the susceptibility site than individual SNP testing. However, when two interacting sites are responsible for the disease, the cladistic analysis greatly improves the probability of finding the two susceptibility sites. The impact of linkage disequilibrium and of tree characteristics on the efficiency of the cladistic analysis is also discussed. An application to a real data set concerning the CARD15 gene and Crohn disease shows that the method can successfully identify the three variant sites that are involved in disease susceptibility. Conclusion The use of phylogenies to group haplotypes is especially interesting for pinpointing the sites that are likely to be involved in disease susceptibility among the different markers identified within a gene. PMID:15904492

  19. Cone penetrometer testing and discrete-depth ground water sampling techniques: A cost-effective method of site characterization in a multiple-aquifer setting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zemo, D.A.; Pierce, Y.G.; Gallinatti, J.D.

    Cone penetrometer testing (CPT), combined with discrete-depth ground water sampling methods, can significantly reduce the time and expense required to characterize large sites that have multiple aquifers. Results from the screening site characterization can then be used to design and install a cost-effective monitoring well network. At a site in northern California, it was necessary to characterize the stratigraphy and the distribution of volatile organic compounds (VOCs). To expedite characterization, a five-week field screening program was implemented that consisted of a shallow ground water survey, CPT soundings and pore-pressure measurements, and discrete-depth ground water sampling. Based on continuous lithologic information provided by the CPT soundings, four predominantly coarse-grained, water yielding stratigraphic packages were identified. Seventy-nine discrete-depth ground water samples were collected using either shallow ground water survey techniques, the BAT Enviroprobe, or the QED HydroPunch I, depending on subsurface conditions. Using results from these efforts, a 20-well monitoring network was designed and installed to monitor critical points within each stratigraphic package. Good correlation was found for hydraulic head and chemical results between discrete-depth screening data and monitoring well data. Understanding the vertical VOC distribution and concentrations produced substantial time and cost savings by minimizing the number of permanent monitoring wells and reducing the number of costly conductor casings that had to be installed. Additionally, significant long-term cost savings will result from reduced sampling costs, because fewer wells comprise the monitoring network. The authors estimate these savings to be 50% for site characterization costs, 65% for site characterization time, and 60% for long-term monitoring costs.

  20. Evaluating weather factors and material response during outdoor exposure to determine accelerated test protocols for predicting service life

    Treesearch

    R. Sam Williams; Steven Lacher; Corey Halpin; Christopher White

    2005-01-01

    To develop service life prediction methods for the study of sealants, a fully instrumented weather station was installed at an outdoor test site near Madison, WI. Temperature, relative humidity, rainfall, ultraviolet (UV) radiation at 18 wavelengths, and wind speed and direction are being continuously measured and stored. The weather data can be integrated over time to...

  1. Molecular Modeling in Drug Design for the Development of Organophosphorus Antidotes/Prophylactics.

    DTIC Science & Technology

    1986-06-01

    multidimensional statistical QSAR analysis techniques to suggest new structures for synthesis and evaluation. C. Application of quantum chemical techniques to...compounds for synthesis and testing for antidotal potency. E. Use of computer-assisted methods to determine the steric constraints at the active site...modeling techniques to model the enzyme acetylcholinester-se. H. Suggestion of some novel compounds for synthesis and testing for reactivating

  2. Electrochemical polishing of thread fastener test specimens of nickel-chromium iron alloys

    DOEpatents

    Kephart, Alan R.

    1991-01-01

    An electrochemical polishing device and method for selective anodic dissolution of the surface of test specimens comprised, for example, of nickel-chromium-iron alloys, which provides for uniform dissolution at localized sites to remove metal through the use of a coiled wire electrode (cathode) placed in immediate proximity to the working surface, resulting in a polished and uniform grain boundary.

  3. The United Kingdom SATMaP program

    NASA Technical Reports Server (NTRS)

    Towshend, J. R.; Cushnie, J.; Atkinson, P.; Hardy, J. R.; Wilson, A.; Harrison, A.; Baker, J. R.; Jackson, M.

    1983-01-01

    Data from test tapes from the United States (specifically the August Arkansas scene) and the first tape of the UK test site which came from ESRIN are analyzed. Methods for estimating spatial resolution are discussed and some preliminary results are included. The characteristics of the ESRIN data are examined and the utility of the various spectral bands of the thematic mapper for land cover mapping are outlined.

  4. Reduction of Microbial Contaminants in Drinking Water by Ultraviolet Light Technology: ETS UV MODEL UVL-200-4 (Report and Statement)

    EPA Science Inventory

    Final technical report provides test methods used and verification results to be published on ETV web sites. The ETS UV System Model UVL-200-4 was tested to validate the UV dose delivered by the system using biodosimetry and a set line approach. The set line for 40 mJ/cm2 Red...

  5. Three-dimensional geostatistical inversion of flowmeter and pumping test data.

    PubMed

    Li, Wei; Englert, Andreas; Cirpka, Olaf A; Vereecken, Harry

    2008-01-01

    We jointly invert field data of flowmeter and multiple pumping tests in fully screened wells to estimate hydraulic conductivity using a geostatistical method. We use the steady-state drawdowns of pumping tests and the discharge profiles of flowmeter tests as our data in the inference. The discharge profiles need not be converted to absolute hydraulic conductivities. Consequently, we do not need measurements of depth-averaged hydraulic conductivity at well locations. The flowmeter profiles contain information about relative vertical distributions of hydraulic conductivity, while drawdown measurements of pumping tests provide information about horizontal fluctuation of the depth-averaged hydraulic conductivity. We apply the method to data obtained at the Krauthausen test site of the Forschungszentrum Jülich, Germany. The resulting estimate of our joint three-dimensional (3D) geostatistical inversion shows an improved 3D structure in comparison to the inversion of pumping test data only.
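    The statement that flowmeter profiles carry information about the relative vertical distribution of hydraulic conductivity can be illustrated with a small sketch: the incremental inflow of each screened interval, obtained by differencing the cumulative discharge profile, is taken as proportional to that interval's conductivity. The numbers below are illustrative, not data from the Krauthausen site.

    ```python
    # Sketch of turning a cumulative flowmeter discharge profile into a
    # relative vertical profile of hydraulic conductivity. Values are
    # illustrative only.
    import numpy as np

    depth_m = np.array([2, 4, 6, 8, 10, 12], dtype=float)     # measurement depths
    cum_discharge = np.array([0.0, 0.8, 1.1, 2.6, 3.0, 3.2])  # cumulative Q (L/s)

    incremental_q = np.diff(cum_discharge)           # inflow per depth interval
    relative_k = incremental_q / incremental_q.sum() # relative conductivity, sums to 1
    for (top, bottom), rk in zip(zip(depth_m[:-1], depth_m[1:]), relative_k):
        print(f"{top:4.1f}-{bottom:4.1f} m : relative K = {rk:.2f}")
    ```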

  6. Räumliche Charakterisierung der hydraulischen Leitfähigkeit in alluvialen Schotter-Grundwasserleitern: Ein Methodenvergleich (Spatial characterization of hydraulic conductivity in alluvial gravel aquifers: a comparison of methods)

    NASA Astrophysics Data System (ADS)

    Diem, Samuel; Vogt, Tobias; Hoehn, Eduard

    2010-12-01

    For groundwater transport modeling on a scale of 10-100 m, detailed information about the spatial distribution of hydraulic conductivity is of great importance. At a test site (10×20 m) in the alluvial gravel-and-sand aquifer of the perialpine Thur valley (Switzerland), four different methods were applied on different scales to assess this parameter. A comparison of the results showed that multilevel slug tests give reliable results at the required scale. For their analysis, a plausible value of the anisotropy ratio of hydraulic conductivity (Kv/Kh) is needed, which was calculated using a pumping test. The integral results of pumping tests provide an upper boundary of the natural spectrum of hydraulic conductivity at the scale of the test site. Flowmeter logs are recommended if the relative distribution of hydraulic conductivity is of primary importance, while sieve analyses can be used if only a rough estimate of hydraulic conductivity is acceptable.

  7. Limitation and applicability of microtremor records for site-response estimation

    NASA Astrophysics Data System (ADS)

    Song, G.; Kang, T.; Park, S.

    2010-12-01

    Site effects are the modifications of seismic motions as they travel through near-surface materials. The impedance contrast between the topmost layer and bedrock may significantly amplify ground motions and lengthen their durations. Inelastic behavior of geological media such as highly fractured or weathered rocks and unconsolidated sediments may absorb seismic energy and thus damp the resulting ground motions. It is inherently most desirable to evaluate site effects using seismic records from large earthquakes. If only small events are recorded by several seismograph stations, it becomes difficult to evaluate site effects using earthquake data. Recently, a number of studies have turned to microtremor records to assess site effects. The main reason for such efforts is that measurements are relatively easy regardless of site condition and are cost-effective, without the need to wait for earthquakes or to use active sources. Microtremor measurements are an especially useful option for assessing site effects, and thus seismic microzonation, in urban areas and in regions of low to moderate seismicity. Spectral ratios of the horizontal components to the vertical component (HVSR) of microtremor records have been popular for estimating the site resonant frequency. Although some studies have shown that the amplitude of the spectral ratio is an indicator of site amplification relative to bedrock motion, this is still debated. The discrepancy may originate from deficiencies in our understanding of the nature of microtremors. It is therefore important to understand the limitations and applicability of microtremor records for site-effect assessment. The key questions are how microtremors respond to subsurface structures and their physical properties, and how parameters deduced from microtremor analyses relate to site response during earthquake ground motion. To investigate how these issues play out in real cases, results obtained using the microtremor method were compared with results from a field test, a spectral inversion method, and the reference-station method for sites of strong-motion stations in the southern Korean Peninsula.
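    A bare-bones HVSR computation, offered only as an illustration of the spectral ratio described above (window selection, detrending, tapering, and more careful smoothing are omitted), could look like this:

    ```python
    # Minimal HVSR sketch: smoothed Fourier amplitude spectra of the two
    # horizontal components combined and divided by the vertical spectrum.
    import numpy as np

    def hvsr(north, east, vertical, dt, smooth=11):
        """Return (frequencies, H/V spectral ratio) for equal-length records."""
        n = len(vertical)
        freqs = np.fft.rfftfreq(n, d=dt)
        amp = lambda x: np.abs(np.fft.rfft(np.asarray(x, dtype=float)))
        # Quadratic mean of the two horizontal amplitude spectra.
        h = np.sqrt(0.5 * (amp(north) ** 2 + amp(east) ** 2))
        v = amp(vertical)
        kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
        h_s = np.convolve(h, kernel, mode="same")
        v_s = np.convolve(v, kernel, mode="same")
        return freqs, h_s / np.maximum(v_s, 1e-12)

    # Toy usage with synthetic noise records sampled at 100 Hz.
    rng = np.random.default_rng(0)
    n_rec, e_rec, z_rec = (rng.standard_normal(4096) for _ in range(3))
    f, ratio = hvsr(n_rec, e_rec, z_rec, dt=0.01)
    print(f[np.argmax(ratio)])   # frequency of the largest H/V peak
    ```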

  8. Closure Report for Corrective Action Unit 104: Area 7 Yucca Flat Atmospheric Test Sites, Nevada National Security Site, Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2013-06-27

    This Closure Report (CR) presents information supporting closure of Corrective Action Unit (CAU) 104, Area 7 Yucca Flat Atmospheric Test Sites, and provides documentation supporting the completed corrective actions and confirmation that closure objectives for CAU 104 were met. This CR complies with the requirements of the Federal Facility Agreement and Consent Order (FFACO) that was agreed to by the State of Nevada; the U.S. Department of Energy (DOE), Environmental Management; the U.S. Department of Defense; and DOE, Legacy Management. CAU 104 consists of the following 15 Corrective Action Sites (CASs), located in Area 7 of the Nevada National Security Site: · CAS 07-23-03, Atmospheric Test Site T-7C · CAS 07-23-04, Atmospheric Test Site T7-1 · CAS 07-23-05, Atmospheric Test Site · CAS 07-23-06, Atmospheric Test Site T7-5a · CAS 07-23-07, Atmospheric Test Site - Dog (T-S) · CAS 07-23-08, Atmospheric Test Site - Baker (T-S) · CAS 07-23-09, Atmospheric Test Site - Charlie (T-S) · CAS 07-23-10, Atmospheric Test Site - Dixie · CAS 07-23-11, Atmospheric Test Site - Dixie · CAS 07-23-12, Atmospheric Test Site - Charlie (Bus) · CAS 07-23-13, Atmospheric Test Site - Baker (Buster) · CAS 07-23-14, Atmospheric Test Site - Ruth · CAS 07-23-15, Atmospheric Test Site T7-4 · CAS 07-23-16, Atmospheric Test Site B7-b · CAS 07-23-17, Atmospheric Test Site - Climax. Closure activities began in October 2012 and were completed in April 2013. Activities were conducted according to the Corrective Action Decision Document/Corrective Action Plan for CAU 104. The corrective actions included No Further Action and Clean Closure. Closure activities generated sanitary waste, mixed waste, and recyclable material. Some wastes exceeded land disposal limits and required treatment prior to disposal. Other wastes met land disposal restrictions and were disposed in appropriate onsite landfills. The U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office (NNSA/NFO) requests the following: · A Notice of Completion from the Nevada Division of Environmental Protection to NNSA/NFO for closure of CAU 104 · The transfer of CAU 104 from Appendix III to Appendix IV, Closed Corrective Action Units, of the FFACO.

  9. Protocol for the use of a rapid real-time PCR method for the detection of HIV-1 proviral DNA using double-stranded primer.

    PubMed

    Pau, Chou-Pong; Wells, Susan K; Granade, Timothy C

    2012-01-01

    This chapter describes a real-time PCR method for the detection of HIV-1 proviral DNA in whole blood samples using a novel double-stranded primer system. The assay utilizes a simple commercially available DNA extraction method and a rapid and easy-to-perform real-time PCR protocol to consistently detect a minimum of four copies of HIV-1 group M proviral DNA in as little as 90 min after sample (whole blood) collection. Co-amplification of the human RNase P gene serves as an internal control to monitor the efficiency of both the DNA extraction and amplification. Once the assay is validated properly, it may be suitable as an alternative confirmation test for HIV-1 infections in a variety of HIV testing venues including the mother-to-child transmission testing sites, clinics, and diagnostic testing centers.

  10. Harmonisation of microbial sampling and testing methods for distillate fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hill, G.C.; Hill, E.C.

    1995-05-01

    Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of The Institute of Petroleum Microbiology Fuels Group to address these issues and, in particular, to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory based and on-site, are discussed.

  11. Sample preparation for radiocarbon (14C) measurements of carbonyl compounds in the atmosphere: quantifying the biogenic contribution

    NASA Astrophysics Data System (ADS)

    Larsen, B. R.; Brussol, C.; Kotzias, D.; Veltkamp, T.; Zwaagstra, O.; Slanina, J.

    A method has been developed for the preparation of samples for radiocarbon (14C) measurements of carbonyl compounds in the atmosphere. Sampling on 25 ml 2,4-dinitrophenylhydrazine (DNPH)-coated silica gel cartridges can be carried out with up to 10,000 ℓ of ambient air with no adverse effects on sample integrity. Methods for the selective clean-up of the extracts have been investigated; this is a necessary step in preparing ambient carbonyl samples for a measurement of the radiocarbon (14C) content. The method which gave the best results includes extraction of the DNPH cartridge with CH3CN and purification of the carbonyl hydrazones over activated silica gel to remove excess DNPH and non-target compounds. This method has been validated with laboratory samples and has proved to give reliable results. The radiocarbon data from the first field experiment showed that ambient air over a semi-rural test site in Ispra, Italy, on a late summer day contained mainly five carbonyls (formaldehyde > acetaldehyde > acetone > propanal > butanal) of mixed biogenic (41-57%) and anthropogenic (43-59%) origin. The method will be used in future monitoring of radiocarbon (14C) at a number of test sites in Europe.

  12. A Bayesian hierarchical model to detect differentially methylated loci from single nucleotide resolution sequencing data

    PubMed Central

    Feng, Hao; Conneely, Karen N.; Wu, Hao

    2014-01-01

    DNA methylation is an important epigenetic modification that has essential roles in cellular processes including gene regulation, development and disease and is widely dysregulated in most types of cancer. Recent advances in sequencing technology have enabled the measurement of DNA methylation at single nucleotide resolution through methods such as whole-genome bisulfite sequencing and reduced representation bisulfite sequencing. In DNA methylation studies, a key task is to identify differences under distinct biological contexts, for example, between tumor and normal tissue. A challenge in sequencing studies is that the number of biological replicates is often limited by the costs of sequencing. The small number of replicates leads to unstable variance estimation, which can reduce accuracy to detect differentially methylated loci (DML). Here we propose a novel statistical method to detect DML when comparing two treatment groups. The sequencing counts are described by a lognormal-beta-binomial hierarchical model, which provides a basis for information sharing across different CpG sites. A Wald test is developed for hypothesis testing at each CpG site. Simulation results show that the proposed method yields improved DML detection compared to existing methods, particularly when the number of replicates is low. The proposed method is implemented in the Bioconductor package DSS. PMID:24561809
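    The final per-site test described above is a Wald test. A generic sketch of that last step is given below (the full lognormal-beta-binomial shrinkage model implemented in the DSS Bioconductor package is not reproduced here); the numbers are illustrative.

    ```python
    # Generic Wald test sketch for one CpG site: compare estimated methylation
    # levels of two groups given their variances. This only illustrates the
    # final test step, not the DSS hierarchical model itself.
    import math
    from scipy.stats import norm

    def wald_test(mu1, mu2, var1, var2):
        """mu: estimated methylation level per group; var: its variance."""
        w = (mu1 - mu2) / math.sqrt(var1 + var2)
        p_value = 2 * norm.sf(abs(w))   # two-sided p-value
        return w, p_value

    print(wald_test(0.80, 0.55, 0.004, 0.006))   # illustrative numbers
    ```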

  13. On the objective identification of flood seasons

    NASA Astrophysics Data System (ADS)

    Cunderlik, Juraj M.; Ouarda, Taha B. M. J.; Bobée, Bernard

    2004-01-01

    The determination of seasons of high and low probability of flood occurrence is a task with many practical applications in contemporary hydrology and water resources management. Flood seasons are generally identified subjectively by visually assessing the temporal distribution of flood occurrences and then, at a regional scale, verified by comparing the temporal distribution with distributions obtained at hydrologically similar neighboring sites. This approach is subjective, time-consuming, and potentially unreliable. The main objective of this study is therefore to introduce a new, objective, and systematic method for the identification of flood seasons. The proposed method tests the significance of flood seasons by comparing the observed variability of flood occurrences with the theoretical flood variability in a nonseasonal model. The method also addresses the uncertainty resulting from sampling variability by quantifying the probability associated with the identified flood seasons. The performance of the method was tested on an extensive number of samples with different record lengths generated from several theoretical models of flood seasonality. The proposed approach was then applied to real data from a large set of sites with different flood regimes across Great Britain. The results show that the method can efficiently identify flood seasons from both theoretical and observed distributions of flood occurrence. The results were used for the determination of the main flood seasonality types in Great Britain.
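    One common way to quantify how strongly flood occurrences cluster within the year, offered here only as an illustration of the general idea and not as the paper's own test statistic, is to map flood dates onto a circle and compute the mean resultant length, which is near 0 for a nonseasonal record and approaches 1 for a strongly seasonal one:

    ```python
    # Illustrative circular-statistics sketch for flood seasonality (not the
    # authors' test): convert Julian flood dates to angles and compute the
    # mean flood date and the mean resultant length r.
    import numpy as np

    def seasonal_concentration(days_of_year):
        """Return (mean flood day, mean resultant length r) from days 1-365."""
        theta = 2 * np.pi * (np.asarray(days_of_year, dtype=float) - 0.5) / 365.25
        x, y = np.cos(theta).mean(), np.sin(theta).mean()
        mean_day = (np.degrees(np.arctan2(y, x)) % 360) / 360 * 365.25
        return mean_day, float(np.hypot(x, y))

    # Toy record: most floods in late autumn, one in mid-January.
    print(seasonal_concentration(np.array([300, 310, 295, 330, 15, 305])))
    ```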

  14. Building a Playground: General Guidelines for Creating Educational Web Sites for Children

    PubMed Central

    Meloncon, Lisa; Haynes, Erin; Varelmann, Megan; Groh, Lisa

    2015-01-01

    Purpose Since 2004, the number of children online has increased 18%, compared with a 10% increase in total users. Not only do children represent a growing segment of Internet users, much of what they do online has a specific purpose: education. To help technical communicators create educational Web sites for children, we offer a set of guidelines to direct the design process. Method Nine children participated in a usability test of the CARES Playground, an educational Web site geared toward 7- to 9-year-olds. The site was designed by a group of graduate students in professional writing based on a review of the (admittedly limited) literature dealing with designing Web sites for children. This paper matches common themes from existing literature to the results of the usability tests. Results Since all the information on designing Web sites for children emerged from the literature of designing Web sites for adults, the themes of navigation, appearance, and content are not unfamiliar. However, the interpretation of those common issues for children—as well as the children’s reaction to them—may be surprising. Conclusion Technical communicators need to be conscious and deliberate when designing Web sites for children. To ensure that educational Web sites are able to meet their learning goals, careful consideration of children’s developmental abilities and Web preferences must be considered. We present several guidelines as a starting point, though further research is needed to confirm and expand upon them. PMID:26633909

  15. TarPmiR: a new approach for microRNA target site prediction.

    PubMed

    Ding, Jun; Li, Xiaoman; Hu, Haiyan

    2016-09-15

    The identification of microRNA (miRNA) target sites is fundamentally important for studying gene regulation. There are dozens of computational methods available for miRNA target site prediction. Despite their existence, we still cannot reliably identify miRNA target sites, partially due to our limited understanding of the characteristics of miRNA target sites. The recently published CLASH (crosslinking ligation and sequencing of hybrids) data provide an unprecedented opportunity to study the characteristics of miRNA target sites and improve miRNA target site prediction methods. Applying four different machine learning approaches to the CLASH data, we identified seven new features of miRNA target sites. Combining these new features with those commonly used by existing miRNA target prediction algorithms, we developed an approach called TarPmiR for miRNA target site prediction. Testing on two human and one mouse non-CLASH datasets, we showed that TarPmiR predicted more than 74.2% of true miRNA target sites in each dataset. Compared with three existing approaches, we demonstrated that TarPmiR is superior to these existing approaches in terms of better recall and better precision. The TarPmiR software is freely available at http://hulab.ucf.edu/research/projects/miRNA/TarPmiR/. Contacts: haihu@cs.ucf.edu or xiaoman@mail.ucf.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
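    Recall and precision, the two quantities used in the comparison above, can be computed from predicted and experimentally supported target-site sets as in this small generic sketch (not TarPmiR's internal code):

    ```python
    # Generic recall/precision computation for predicted miRNA target sites.
    def precision_recall(predicted: set, true_sites: set):
        tp = len(predicted & true_sites)                     # true positives
        precision = tp / len(predicted) if predicted else 0.0
        recall = tp / len(true_sites) if true_sites else 0.0
        return precision, recall

    # Toy example with hypothetical site identifiers.
    p, r = precision_recall({"siteA", "siteB", "siteC"}, {"siteA", "siteC", "siteD"})
    print(f"precision={p:.2f}, recall={r:.2f}")
    ```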

  16. Evaluation of four rapid methods for hemoglobin screening of whole blood donors in mobile collection settings.

    PubMed

    Gómez-Simón, Antonia; Navarro-Núñez, Leyre; Pérez-Ceballos, Elena; Lozano, María L; Candela, María J; Cascales, Almudena; Martínez, Constantino; Corral, Javier; Vicente, Vicente; Rivera, José

    2007-06-01

    Predonation hemoglobin measurement is a problematic requirement in mobile donation settings, where accurate determination of venous hemoglobin by hematology analyzers is not available. We evaluated hemoglobin screening in prospective donors by the semiquantitative copper sulphate test and by capillary blood samples analyzed with three portable photometers: HemoCue, STAT-Site MHgb, and the CompoLab HB system. Capillary blood samples were obtained from 380 donors and tested by the copper sulphate test and by at least one of the named portable photometers. Predonation venous hemoglobin was also determined in all donors using a Coulter Max-M analyzer. The three photometers provided acceptable reproducibility (CV below 5%) and displayed a significant correlation between the capillary blood samples and venous hemoglobin (R² = 0.5-0.8). HemoCue showed the best agreement with the venous hemoglobin determination, followed by STAT-Site MHgb and the CompoLab HB system. The copper sulphate test produced the highest rate of donor acceptance (83%) despite unacceptable hemoglobin levels, and the lowest rate of donor deferral (1%) despite acceptable hemoglobin levels. The percentage of donors correctly categorized for blood donation by the portable hemoglobinometers was 85%, 82%, and 76% for the CompoLab HB system, HemoCue, and STAT-Site, respectively. Our data suggest that hemoglobin determination remains a problematic issue in donor selection in the mobile setting. Without appropriate performance control, capillary hemoglobin screening by either the copper sulphate method or the novel portable hemoglobinometers can be inaccurate, potentially affecting both donor safety and the blood supply.

  17. Design of the HPTN 065 (TLC-Plus) study: A study to evaluate the feasibility of an enhanced test, link-to-care, plus treat approach for HIV prevention in the United States

    PubMed Central

    Gamble, Theresa; Branson, Bernard; Donnell, Deborah; Hall, H Irene; King, Georgette; Cutler, Blayne; Hader, Shannon; Burns, David; Leider, Jason; Wood, Angela Fulwood; G. Volpp, Kevin; Buchacz, Kate; El-Sadr, Wafaa M

    2017-01-01

    Background/Aims HIV continues to be a major public health threat in the United States, and mathematical modeling has demonstrated that the universal effective use of antiretroviral therapy among all HIV-positive individuals (i.e. the “test and treat” approach) has the potential to control HIV. However, to accomplish this, all the steps that define the HIV care continuum must be achieved at high levels, including HIV testing and diagnosis, linkage to and retention in clinical care, antiretroviral medication initiation, and adherence to achieve and maintain viral suppression. The HPTN 065 (Test, Link-to-Care Plus Treat [TLC-Plus]) study was designed to determine the feasibility of the “test and treat” approach in the United States. Methods HPTN 065 was conducted in two intervention communities, Bronx, NY, and Washington, DC, along with four non-intervention communities, Chicago, IL; Houston, TX; Miami, FL; and Philadelphia, PA. The study consisted of five components: (1) exploring the feasibility of expanded HIV testing via social mobilization and the universal offer of testing in hospital settings, (2) evaluating the effectiveness of financial incentives to increase linkage to care, (3) evaluating the effectiveness of financial incentives to increase viral suppression, (4) evaluating the effectiveness of a computer-delivered intervention to decrease risk behavior in HIV-positive patients in healthcare settings, and (5) administering provider and patient surveys to assess knowledge and attitudes regarding the use of antiretroviral therapy for prevention and the use of financial incentives to improve health outcomes. The study used observational cohorts, cluster and individual randomization, and made novel use of the existing national HIV surveillance data infrastructure. All components were developed with input from a community advisory board, and pragmatic methods were used to implement and assess the outcomes for each study component. Results A total of 76 sites in Washington, DC, and the Bronx, NY, participated in the study: 37 HIV test sites, including 16 hospitals, and 39 HIV care sites. Between September 2010 and December 2014, all study components were successfully implemented at these sites and resulted in valid outcomes. Our pragmatic approach to the study design, implementation, and the assessment of study outcomes allowed the study to be conducted within established programmatic structures and processes. In addition, it was successfully layered on the ongoing standard of care and existing data infrastructure without disrupting health services. Conclusion The HPTN 065 study demonstrated the feasibility of implementing and evaluating a multi-component “test and treat” trial that included a large number of community sites and involved pragmatic approaches to study implementation and evaluation. PMID:28627929

  18. Design of the HPTN 065 (TLC-Plus) study: A study to evaluate the feasibility of an enhanced test, link-to-care, plus treat approach for HIV prevention in the United States.

    PubMed

    Gamble, Theresa; Branson, Bernard; Donnell, Deborah; Hall, H Irene; King, Georgette; Cutler, Blayne; Hader, Shannon; Burns, David; Leider, Jason; Wood, Angela Fulwood; G Volpp, Kevin; Buchacz, Kate; El-Sadr, Wafaa M

    2017-08-01

    Background/Aims HIV continues to be a major public health threat in the United States, and mathematical modeling has demonstrated that the universal effective use of antiretroviral therapy among all HIV-positive individuals (i.e. the "test and treat" approach) has the potential to control HIV. However, to accomplish this, all the steps that define the HIV care continuum must be achieved at high levels, including HIV testing and diagnosis, linkage to and retention in clinical care, antiretroviral medication initiation, and adherence to achieve and maintain viral suppression. The HPTN 065 (Test, Link-to-Care Plus Treat [TLC-Plus]) study was designed to determine the feasibility of the "test and treat" approach in the United States. Methods HPTN 065 was conducted in two intervention communities, Bronx, NY, and Washington, DC, along with four non-intervention communities, Chicago, IL; Houston, TX; Miami, FL; and Philadelphia, PA. The study consisted of five components: (1) exploring the feasibility of expanded HIV testing via social mobilization and the universal offer of testing in hospital settings, (2) evaluating the effectiveness of financial incentives to increase linkage to care, (3) evaluating the effectiveness of financial incentives to increase viral suppression, (4) evaluating the effectiveness of a computer-delivered intervention to decrease risk behavior in HIV-positive patients in healthcare settings, and (5) administering provider and patient surveys to assess knowledge and attitudes regarding the use of antiretroviral therapy for prevention and the use of financial incentives to improve health outcomes. The study used observational cohorts, cluster and individual randomization, and made novel use of the existing national HIV surveillance data infrastructure. All components were developed with input from a community advisory board, and pragmatic methods were used to implement and assess the outcomes for each study component. Results A total of 76 sites in Washington, DC, and the Bronx, NY, participated in the study: 37 HIV test sites, including 16 hospitals, and 39 HIV care sites. Between September 2010 and December 2014, all study components were successfully implemented at these sites and resulted in valid outcomes. Our pragmatic approach to the study design, implementation, and the assessment of study outcomes allowed the study to be conducted within established programmatic structures and processes. In addition, it was successfully layered on the ongoing standard of care and existing data infrastructure without disrupting health services. Conclusion The HPTN 065 study demonstrated the feasibility of implementing and evaluating a multi-component "test and treat" trial that included a large number of community sites and involved pragmatic approaches to study implementation and evaluation.

  19. Improving 1D Site Specific Velocity Profiles for the Kik-Net Network

    NASA Astrophysics Data System (ADS)

    Holt, James; Edwards, Benjamin; Pilz, Marco; Fäh, Donat; Rietbrock, Andreas

    2017-04-01

    Ground motion prediction equations (GMPEs) form the cornerstone of modern seismic hazard assessments. When produced to a high standard they provide reliable estimates of ground motion/spectral acceleration for a given site and earthquake scenario. This information is crucial for engineers to optimise design and for regulators who enforce legal minimum safe design capacities. Classically, GMPEs were built upon the assumption that variability around the median model could be treated as aleatory. As understanding improved, it was noted that the propagation could be segregated into the response of the average path from the source and the response of the site, because the heterogeneity of the near-surface lithology is significantly different from that of the bulk path. It was then suggested that a semi-ergodic approach could be taken if the site response could be determined, moving uncertainty from aleatory to epistemic. The determination of reliable site-specific response models is therefore becoming increasingly critical for ground motion models used in engineering practice. Today it is common practice to include proxies for site response within the scope of a GMPE, such as Vs30 or site classification, in an effort to reduce the overall uncertainty of the prediction at a given site. However, these proxies are not always reliable enough to give confident ground motion estimates, owing to the complexity of the near-surface. Other approaches to quantifying the response of the site include detailed numerical simulations (1/2/3D; linear, equivalent-linear, non-linear, etc.). However, in order to be reliable, they require highly detailed and accurate velocity models and, for non-linear analyses, material property models. It is possible to obtain this information through invasive methods, but this is expensive and not feasible for most projects. Here we propose an alternative method to derive reliable velocity profiles (and their uncertainty), calibrated using almost 20 years of recorded data from the Kik-Net network. First, using a reliable subset of sites, the empirical surface-to-borehole (S/B) ratio is calculated in the frequency domain using all events recorded at each site. In a subsequent step, we use numerical simulation to produce 1D SH transfer function curves for a suite of stochastic velocity models. Comparing the resulting amplification with the empirical S/B ratio, we find optimal 1D velocity models and their uncertainty. The method will be tested to determine the level of initial information required to obtain a reliable Vs profile (e.g., starting Vs model, only Vs30, site class, H/V ratio, etc.) and then applied and tested against data from other regions using site-to-reference or empirical spectral model amplification.
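    The forward model being matched against the empirical S/B ratios is a 1D SH transfer function. As an illustration of the simplest case, a single uniform soil layer over an elastic half-space with vertically incident SH waves and no damping, a sketch is given below; the study itself fits full stochastic multi-layer profiles, and the layer properties here are arbitrary.

    ```python
    # Toy undamped 1D SH transfer function for one uniform soil layer over an
    # elastic half-space, the simplest forward model one could compare against
    # an empirical surface-to-borehole ratio. Layer properties are arbitrary.
    import numpy as np

    def sh_amplification(freq_hz, h_m, vs_layer, rho_layer, vs_rock, rho_rock):
        """Surface / rock-outcrop amplification for vertically incident SH waves."""
        impedance_ratio = (rho_layer * vs_layer) / (rho_rock * vs_rock)
        kh = 2 * np.pi * np.asarray(freq_hz) * h_m / vs_layer
        return 1.0 / np.sqrt(np.cos(kh) ** 2 + (impedance_ratio * np.sin(kh)) ** 2)

    f = np.linspace(0.1, 20, 500)
    amp = sh_amplification(f, h_m=30.0, vs_layer=200.0, rho_layer=1800.0,
                           vs_rock=1500.0, rho_rock=2400.0)
    # Fundamental resonance at Vs/(4H); peak amplification is the inverse of
    # the impedance ratio for the undamped case.
    print(f"fundamental frequency ~ {200.0 / (4 * 30.0):.2f} Hz, "
          f"peak amplification ~ {amp.max():.1f}")
    ```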

  20. A comparison of change detection methods using multispectral scanner data

    USGS Publications Warehouse

    Seevers, Paul M.; Jones, Brenda K.; Qiu, Zhicheng; Liu, Yutong

    1994-01-01

    Change detection methods were investigated as a cooperative activity between the U.S. Geological Survey and the National Bureau of Surveying and Mapping, People's Republic of China. Subtraction of band 2, band 3, normalized difference vegetation index, and tasseled cap bands 1 and 2 data from two multispectral scanner images was tested using two sites in the United States and one in the People's Republic of China. A new statistical method was also tested. Band 2 subtraction gives the best results for detecting change from vegetative cover to urban development. The statistical method identifies areas that have changed and uses a fast classification algorithm to classify the original data of the changed areas by the land cover type present on each image date.
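    Band differencing of the kind found most effective above can be sketched in a few lines: subtract the band-2 values of two co-registered images and flag pixels whose difference departs from the scene mean by more than a chosen number of standard deviations. The arrays and threshold below are synthetic and purely illustrative.

    ```python
    # Sketch of band-subtraction change detection: difference two co-registered
    # band images and flag pixels with unusually large change. Data are synthetic.
    import numpy as np

    def band_difference_change(band_date1, band_date2, n_std=2.0):
        """Return a boolean change mask from two co-registered band images."""
        diff = band_date2.astype(float) - band_date1.astype(float)
        threshold = n_std * diff.std()
        return np.abs(diff - diff.mean()) > threshold

    rng = np.random.default_rng(1)
    img1 = rng.integers(20, 60, size=(100, 100))
    img2 = img1 + rng.integers(-3, 4, size=(100, 100))   # background noise
    img2[40:50, 40:50] += 30                              # simulated new development
    print(band_difference_change(img1, img2).sum(), "changed pixels")
    ```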
