NASA Astrophysics Data System (ADS)
Rouillon, M.; Taylor, M. P.; Dong, C.
2016-12-01
This research assesses the advantages of integrating field portable X-ray fluorescence (pXRF) technology to reduce risk and increase confidence in decision making for metal-contaminated site assessments. Metal-contaminated sites are often highly heterogeneous and require a high sampling density to accurately characterize the distribution and concentration of contaminants. Current regulatory assessment approaches rely on a small number of samples processed using standard wet-chemistry methods. In New South Wales (NSW), Australia, the current notification trigger for characterizing metal-contaminated sites requires the upper 95% confidence interval (CI) of the site mean to equal or exceed the relevant guidelines. The method's low 'minimum' sampling requirements can misclassify sites due to the heterogeneous nature of soil contamination, leading to inaccurate decision making. To address this issue, we propose integrating in-field pXRF analysis with the established sampling method to overcome these sampling limitations. This approach increases the minimum sampling resolution and reduces the 95% CI of the site mean. In-field pXRF analysis at contamination hotspots enhances sample resolution efficiently and without the need to return to the site. In this study, the current and proposed pXRF site assessment methods are compared at five heterogeneous metal-contaminated sites by analysing the spatial distribution of contaminants, the 95% confidence intervals of site means, and the sampling and analysis uncertainty associated with each method. Finally, an analysis of the costs associated with both the current and proposed methods is presented to demonstrate the advantages of incorporating pXRF into metal-contaminated site assessments. The data show that pXRF-integrated site assessments allow faster, more cost-efficient characterisation of metal-contaminated sites with greater confidence for decision making.
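As a rough sketch of the confidence-interval test this abstract describes: the upper 95% CI bound of the site mean is compared against a guideline value, and denser sampling narrows the interval. All concentrations, the guideline value, and the normal-approximation shortcut below are illustrative assumptions, not the study's data or exact regulatory formula (which typically uses Student's t).

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def upper_95_ci(concs):
    """Upper bound of the 95% CI of the site mean.

    Normal approximation; regulatory practice typically uses Student's t,
    which widens the interval slightly at small n.
    """
    n = len(concs)
    z = NormalDist().inv_cdf(0.975)  # ~1.96
    return mean(concs) + z * stdev(concs) / sqrt(n)

# Hypothetical Pb concentrations (mg/kg) across a heterogeneous site.
sparse = [120, 450, 90, 300, 1100]          # a few wet-chemistry samples
dense = sparse + [150, 80, 400, 210, 95,    # extra in-field pXRF readings
                  620, 130, 310, 180, 240]

guideline = 600  # hypothetical notification trigger (mg/kg)
sparse_triggers = upper_95_ci(sparse) >= guideline
dense_triggers = upper_95_ci(dense) >= guideline
```

With these made-up numbers, the sparse survey's wide CI pushes the upper bound over the guideline, while the denser pXRF-augmented survey's narrower CI does not, which is the misclassification risk the abstract discusses.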
78 FR 56749 - Site Characteristics and Site Parameters for Nuclear Power Plants
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-13
..., "Geologic Characterization Information" (currently titled "Basic Geologic and Seismic Information"); Section 2.5.2, "Vibratory Ground Motion"; Section 2.5.3, "Surface Deformation" (currently titled as... the following methods (unless this document describes a different method for submitting comments on a...
Focused intracochlear electric stimulation with phased array channels.
van den Honert, Chris; Kelsall, David C
2007-06-01
A method is described for producing focused intracochlear electric stimulation using an array of N electrodes. For each electrode site, N weights are computed that define the ratios of positive and negative electrode currents required to produce cancellation of the voltage within scala tympani at all of the N-1 other sites. Multiple sites can be stimulated simultaneously by superposition of their respective current vectors. The method allows N independent stimulus waveforms to be delivered to each of the N electrode sites without spatial overlap. Channel interaction from current spread associated with monopolar stimulation is substantially eliminated. The method operates by inverting the spread functions of individual monopoles as measured with the other electrodes. The method was implemented and validated with data from three human subjects implanted with 22-electrode perimodiolar arrays. Results indicate that (1) focusing is realizable with realistic precision; (2) focusing comes at the cost of increased total stimulation current; (3) uncanceled voltages that arise beyond the ends of the array are weak except when stimulating the two end channels; and (4) close perimodiolar positioning of the electrodes may be important for minimizing stimulation current and sensitivity to measurement errors.
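The linear-algebra core the abstract describes, inverting the measured monopolar spread functions to find current weights, can be sketched as follows. The 4-electrode transimpedance matrix, its values, and the function names are hypothetical illustrations; the paper works with N = 22 perimodiolar electrodes and measured in vivo data.

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def focusing_currents(Z, site):
    """Electrode currents producing unit voltage at `site` and zero
    (cancelled) voltage at every other site; Z is the spread matrix."""
    e = [1.0 if i == site else 0.0 for i in range(len(Z))]
    return solve(Z, e)

# Hypothetical 4-electrode transimpedance matrix: monopolar spread
# decaying with distance from the stimulating electrode.
Z = [[1.0, 0.6, 0.3, 0.1],
     [0.6, 1.0, 0.6, 0.3],
     [0.3, 0.6, 1.0, 0.6],
     [0.1, 0.3, 0.6, 1.0]]
I1 = focusing_currents(Z, 1)  # current vector focused on site 1
```

Because the system is linear, current vectors for several sites can simply be summed to stimulate them simultaneously, which is the superposition property the abstract notes.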
Electrical energy consumption control apparatuses and electrical energy consumption control methods
Hammerstrom, Donald J.
2012-09-04
Electrical energy consumption control apparatuses and electrical energy consumption control methods are described. According to one aspect, an electrical energy consumption control apparatus includes processing circuitry configured to receive a signal which is indicative of current of electrical energy which is consumed by a plurality of loads at a site, to compare the signal which is indicative of current of electrical energy which is consumed by the plurality of loads at the site with a desired substantially sinusoidal waveform of current of electrical energy which is received at the site from an electrical power system, and to use the comparison to control an amount of the electrical energy which is consumed by at least one of the loads of the site.
In-Situ Transfer Standard and Coincident-View Intercomparisons for Sensor Cross-Calibration
NASA Technical Reports Server (NTRS)
Thome, Kurt; McCorkel, Joel; Czapla-Myers, Jeff
2013-01-01
There exist numerous methods for accomplishing on-orbit calibration. Methods include the reflectance-based approach, relying on measurements of surface and atmospheric properties at the time of a sensor overpass, as well as invariant-scene approaches relying on knowledge of the temporal characteristics of the site. The current work examines typical cross-calibration methods and discusses their expected uncertainties. Data from the Advanced Land Imager (ALI), Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), Moderate Resolution Imaging Spectroradiometer (MODIS), and Thematic Mapper (TM) are used to demonstrate the limits of relative sensor-to-sensor calibration as applied to current sensors, while Landsat-5 TM and Landsat-7 ETM+ are used to evaluate the limits of in situ site characterizations for SI-traceable cross-calibration. The current work examines the difficulties in trending results from cross-calibration approaches, taking into account sampling issues, site-to-site variability, and the accuracy of the method. Special attention is given to the differences caused by cross-comparing sensors in radiance space as opposed to reflectance space. The results show that cross-calibrations with absolute uncertainties of less than 1.5 percent (1 sigma) are currently achievable, even for sensors without coincident views.
Molecular detection of airborne Coccidioides in Tucson, Arizona
Chow, Nancy A.; Griffin, Dale W.; Barker, Bridget M.; Loparev, Vladimir N.; Litvintseva, Anastasia P.
2016-01-01
Environmental surveillance of the soil-dwelling fungus Coccidioides is essential for the prevention of Valley fever, a disease primarily caused by inhalation of the arthroconidia. Methods for collecting and detecting Coccidioides in soil samples are currently in use by several laboratories; however, a method utilizing current air sampling technologies has not been formally demonstrated for the capture of airborne arthroconidia. In this study, we collected air/dust samples at two sites (Site A and Site B) in the endemic region of Tucson, Arizona, and tested a variety of air samplers and membrane matrices. We then employed a single-tube nested qPCR assay for molecular detection. At both sites, numerous soil samples were collected (n = 10 at Site A and n = 24 at Site B), and Coccidioides was detected in two samples (20%) at Site A and in eight samples (33%) at Site B. Of the 25 air/dust samples collected at both sites using five different air sampling methods, we detected Coccidioides in three samples from Site B. All three samples were collected using a high-volume sampler with glass-fiber filters. In this report, we describe these methods and propose the use of these air sampling and molecular detection strategies for environmental surveillance of Coccidioides.
Warrell, Mary J.; Riddell, Anna; Yu, Ly-Mee; Phipps, Judith; Diggle, Linda; Bourhy, Hervé; Deeks, Jonathan J.; Fooks, Anthony R.; Audry, Laurent; Brookes, Sharon M.; Meslin, François-Xavier; Moxon, Richard; Pollard, Andrew J.; Warrell, David A.
2008-01-01
Background The need for economical rabies post-exposure prophylaxis (PEP) is increasing in developing countries. Implementation of the two currently approved economical intradermal (ID) vaccine regimens is restricted due to confusion over different vaccines, regimens and dosages, lack of confidence in intradermal technique, and pharmaceutical regulations. We therefore compared a simplified 4-site economical PEP regimen with standard methods. Methods Two hundred and fifty-four volunteers were randomly allocated to a single blind controlled trial. Each received purified vero cell rabies vaccine by one of four PEP regimens: the currently accepted 2-site ID; the 8-site regimen using 0.05 ml per ID site; a new 4-site ID regimen (on day 0, approximately 0.1 ml at 4 ID sites, using the whole 0.5 ml ampoule of vaccine; on day 7, 0.1 ml ID at 2 sites and at one site on days 28 and 90); or the standard 5-dose intramuscular regimen. All ID regimens required the same total amount of vaccine, 60% less than the intramuscular method. Neutralising antibody responses were measured five times over a year in 229 people, for whom complete data were available. Findings All ID regimens showed similar immunogenicity. The intramuscular regimen gave the lowest geometric mean antibody titres. Using the rapid fluorescent focus inhibition test, some sera had unexpectedly high antibody levels that were not attributable to previous vaccination. The results were confirmed using the fluorescent antibody virus neutralisation method. Conclusions This 4-site PEP regimen proved as immunogenic as current regimens, and has the advantages of requiring fewer clinic visits, being more practicable, and having a wider margin of safety, especially in inexperienced hands, than the 2-site regimen. It is more convenient than the 8-site method, and can be used economically with vaccines formulated in 1.0 or 0.5 ml ampoules. 
The 4-site regimen now meets all requirements of immunogenicity for PEP and can be introduced without further studies. Trial Registration Controlled-Trials.com ISRCTN 30087513 PMID:18431444
The Path to Graduation: A Model Interactive Web Site Design Supporting Doctoral Students
ERIC Educational Resources Information Center
Simmons-Johnson, Nicole
2012-01-01
Objective. This 2-phase mixed method study assessed 2nd-year doctoral students' and dissertation students' perceptions of the current Graduate School of Education dissertation support Web site, with implications for designing a model dissertation support Web site. Methods. Phase 1 collected quantitative and qualitative data through an…
Managing dense nonaqueous phase liquid (DNAPL) contaminated sites continues to be among the most pressing environmental problems currently faced. One approach that has recently been investigated for use in DNAPL site characterization and remediation is mass flux (mass per unit ar...
Marking multi-channel silicon-substrate electrode recording sites using radiofrequency lesions.
Brozoski, Thomas J; Caspary, Donald M; Bauer, Carol A
2006-01-30
Silicon-substrate multi-channel electrodes (multiprobes) have proven useful in a variety of electrophysiological tasks. When using multiprobes it is often useful to identify the site of each channel, e.g., when recording single-unit activity from a heterogeneous structure. Lesion marking of electrode sites has been used for many years. Electrolytic, or direct current (DC), lesions have been used successfully to mark multiprobe sites in rat hippocampus [Townsend G, Peloquin P, Kloosterman F, Hetke JF, Leung LS. Recording and marking with silicon multichannel electrodes. Brain Res Brain Res Protoc 2002;9:122-9]. The present method used radio-frequency (rf) lesions to distinctly mark each of the 16 recording sites of 16-channel linear array multiprobes in chinchilla inferior colliculus. A commercial radio-frequency lesioner was used as the current source, in conjunction with custom connectors adapted to the multiprobe configuration. In vitro bench testing was used to establish current-voltage-time parameters, as well as to check multiprobe integrity and radio-frequency performance. In in vivo application, visualization of individual-channel multiprobe recording sites was clear in 21 of 33 sets of collicular serial sections (i.e., probe tracks) obtained from acute experimental subjects (maximum post-lesion survival time of 2 h). Advantages of the rf method include well-documented methods of in vitro calibration as well as low impact on probe integrity. The rf method of marking individual-channel sites should be useful in a variety of applications.
Environmental impact assessment of Gonabad municipal waste landfill site using Leopold Matrix
Sajjadi, Seyed Ali; Aliakbari, Zohreh; Matlabi, Mohammad; Biglari, Hamed; Rasouli, Seyedeh Samira
2017-01-01
Introduction An environmental impact assessment (EIA) before embarking on any project, including a landfill, is a useful tool for reducing the project's potential effects where possible. The main objective of this study was to assess the environmental impact of the current municipal solid waste disposal site of Gonabad by using the Iranian Leopold matrix method. Methods This cross-sectional study was conducted in 2015 to assess the environmental impacts of a landfill site in Gonabad using an Iranian matrix (modified Leopold matrix). The study was based on field visits to the landfill and information collected from various sources; five available options were analyzed and compared: continuation of the current disposal practices, construction of a new sanitary landfill, recycling plans, composting, and incineration plants. The best option was proposed to replace the existing landfill. Results The current approach scored 2.35, construction of a new sanitary landfill 1.59, the compost plant 1.57, and the recycling and incineration plants 1.68 and 2.3, respectively. Conclusion Continuation of the current disposal method is rejected because of its severe environmental damage and health problems. A compost plant, with the lowest negative score, is the best option for the waste disposal site of Gonabad City and has priority over the other four options. PMID:28465797
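The option-ranking step can be sketched directly from the scores reported in this abstract: the option with the lowest negative-impact score wins. The scores below are the study's reported values; the `matrix_score` helper showing how a Leopold-style matrix might be condensed to one number is a hypothetical illustration (the abstract does not give the cell-level aggregation rule).

```python
def matrix_score(cells):
    """Mean of the nonzero Leopold-matrix cells for one option.

    Hypothetical aggregation: each cell is a magnitude x importance
    product; the abstract does not specify the exact rule."""
    vals = [v for row in cells for v in row if v]
    return sum(vals) / len(vals)

# Negative-impact scores reported in the study (lower = less harm).
options = {
    "continue current disposal": 2.35,
    "new sanitary landfill": 1.59,
    "compost plant": 1.57,
    "recycling plan": 1.68,
    "incineration plant": 2.30,
}

def rank_options(scores):
    """Options ordered from least to most environmental impact."""
    return sorted(scores, key=scores.get)

best = rank_options(options)[0]
```

With the reported scores, `best` is the compost plant, matching the study's conclusion.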
Inventory Data Package for Hanford Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kincaid, Charles T.; Eslinger, Paul W.; Aaberg, Rosanne L.
2006-06-01
This document presents the basis for a compilation of inventory for radioactive contaminants of interest by year for all potentially impactive waste sites on the Hanford Site for which inventory data exist in records or could be reasonably estimated. This document also includes discussions of the historical, current, and reasonably foreseeable (1944 to 2070) future radioactive waste and waste sites; the inventories of radionuclides that may have a potential for environmental impacts; a description of the method(s) for estimating inventories where records are inadequate; a description of the screening method(s) used to select those sites and contaminants that might make a substantial contribution to impacts; a listing of the remedial actions and their completion dates for waste sites; and tables showing the best estimate inventories available for Hanford assessments.
Using Dirichlet Processes for Modeling Heterogeneous Treatment Effects across Sites
ERIC Educational Resources Information Center
Miratrix, Luke; Feller, Avi; Pillai, Natesh; Pati, Debdeep
2016-01-01
Modeling the distribution of site-level effects is an important problem, but it is also an incredibly difficult one. Current methods rely on distributional assumptions in multilevel models for estimation. It is hoped that the partial pooling of site-level estimates with overall estimates, designed to take into account individual variation as…
Sriram, K. K.; Yeh, Jia-Wei; Lin, Yii-Lih; Chang, Yi-Ren; Chou, Chia-Fu
2014-01-01
Mapping transcription factor (TF) binding sites along a DNA backbone is crucial in understanding the regulatory circuits that control cellular processes. Here, we deployed a method adopting bioconjugation, nanofluidic confinement and fluorescence single molecule imaging for direct mapping of TF (RNA polymerase) binding sites on field-stretched single DNA molecules. Using this method, we have mapped out five of the TF binding sites of E. coli RNA polymerase to bacteriophage λ-DNA, where two promoter sites and three pseudo-promoter sites are identified with the corresponding binding frequency of 45% and 30%, respectively. Our method is quick, robust and capable of resolving protein-binding locations with high accuracy (∼ 300 bp), making our system a complementary platform to the methods currently practiced. It is advantageous in parallel analysis and less prone to false positive results over other single molecule mapping techniques such as optical tweezers, atomic force microscopy and molecular combing, and could potentially be extended to general mapping of protein–DNA interaction sites. PMID:24753422
NASA Astrophysics Data System (ADS)
Yogeshwar, P.; Tezkan, B.; Israil, M.; Candansayar, M. E.
2012-01-01
The impacts of sewage irrigation and groundwater contamination were investigated near Roorkee in northern India using the Direct Current Resistivity (DCR) method and the Radiomagnetotelluric (RMT) method. Intensive field measurements were carried out in the vicinity of a waste disposal site that was extensively irrigated with sewage water. For comparison, a profile was measured at a reference site where no contamination was expected. In addition to conventional 1D and 2D inversion, the measured data sets were interpreted using a 2D joint inversion algorithm. The inversion results from the sewage-irrigated site indicate a decrease in resistivity of up to 75% in comparison with the reference site. The depth range from 5 to 15 m is identified as a shallow unconfined aquifer, and the decreased resistivities are ascribed to the influence of contamination. Furthermore, a systematic increase in the resistivities of the shallow unconfined aquifer is detected with increasing distance from the waste disposal site. The advantages of both the DCR and RMT methods are quantitatively integrated by the 2D joint inversion of the two data sets, leading to a joint model that explains both.
On-site or off-site treatment of medical waste: a challenge
2014-01-01
Treating hazardous-infectious medical waste can be carried out on-site or off-site of health-care establishments. Nevertheless, the choice between on-site and off-site locations for treating medical waste is sometimes controversial. Currently in Iran, owing to Health Ministry policies, hospitals have adopted on-site treatment as the preferred method. The objectives of this study were to assess the current condition of on-site medical waste treatment facilities, compare on-site facilities with off-site systems, and determine the best location for medical waste treatment. To assess the current on-site facilities, four provinces (and 40 active hospitals) were selected to participate in the survey. For the comparison of on-site and off-site facilities (owing to the non-availability of an installed off-site facility), the Analytical Hierarchy Process (AHP) was employed. The results indicated that most on-site medical waste treatment systems have problems in financing, planning, determining installation capacity, operation, and maintenance. The AHP synthesis (with an inconsistency ratio of 0.01 < 0.1) revealed that, overall, off-site treatment of medical waste had a much higher priority than on-site treatment (64.1% versus 35.9%). Based on these results, centralized off-site treatment can be considered as an alternative. An amendment could be made to Iran's current medical waste regulations to have infectious-hazardous waste sent to a central off-site installation for treatment. To initiate and test this plan and obtain official approval, a central off-site facility could be put into practice, at least as a pilot in one province; if practically successful, it could then be expanded to other provinces and cities. PMID:24739145
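The AHP priority computation the abstract reports can be sketched as follows. A common approximation to the principal eigenvector of the pairwise comparison matrix is the normalized row geometric mean; the single pairwise judgment below is a hypothetical value chosen so the weights reproduce the reported 64.1% vs. 35.9% split (the study's actual hierarchy had multiple criteria, not shown here).

```python
from math import prod

def ahp_priorities(M):
    """Priority weights from a pairwise comparison matrix, using the
    row geometric-mean approximation to the principal eigenvector."""
    n = len(M)
    gm = [prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Off-site judged ~1.79x preferable to on-site overall (hypothetical
# single judgment chosen to reproduce the reported split).
a = 64.1 / 35.9
w_off, w_on = ahp_priorities([[1.0, a], [1.0 / a, 1.0]])
```

For a 2x2 matrix the comparison is always perfectly consistent (inconsistency ratio 0); the study's reported ratio of 0.01 reflects its larger multi-criteria matrices.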
Alquezar-Planas, David E.; Ishida, Yasuko; Courtiol, Alexandre; Timms, Peter; Johnson, Rebecca N.; Lenz, Dorina; Helgen, Kristofer M.; Roca, Alfred L.; Hartman, Stefanie
2016-01-01
Background. Retroviral integration into the host germline results in permanent viral colonization of vertebrate genomes. The koala retrovirus (KoRV) is currently invading the germline of the koala (Phascolarctos cinereus) and provides a unique opportunity for studying retroviral endogenization. Previous analysis of KoRV integration patterns in modern koalas demonstrates that they share integration sites primarily if they are related, indicating that the process is currently driven by vertical transmission rather than infection. However, due to methodological challenges, KoRV integrations have not been comprehensively characterized. Results. To overcome these challenges, we applied and compared three target enrichment techniques coupled with next-generation sequencing (NGS) and a newly customized sequence-clustering-based computational pipeline to determine the integration sites for 10 museum Queensland and New South Wales (NSW) koala samples collected between the 1870s and the late 1980s. A secondary aim of this study was to identify common integration sites across modern and historical specimens by comparing our dataset to previously published studies. Several million sequences were processed, and the KoRV integration sites in each koala were characterized. Conclusions. Although the three enrichment methods each exhibited bias in integration site retrieval, a combination of two methods, Primer Extension Capture and hybridization capture, is recommended for future studies on historical samples. Moreover, identification of integration sites shows that the proportion of integration sites shared between any two koalas is quite small. PMID:27069793
Minter, Kelsey M; Jannik, G Timothy; Stagich, Brooke H; Dixon, Kenneth L; Newton, Joseph R
2018-04-01
The U.S. Environmental Protection Agency (EPA) requires the use of the model CAP88 to estimate the total effective dose (TED) to an offsite maximally exposed individual (MEI) for demonstrating compliance with 40 CFR 61, Subpart H: The National Emission Standards for Hazardous Air Pollutants (NESHAP) regulations. For NESHAP compliance at the Savannah River Site (SRS), the EPA, the U.S. Department of Energy (DOE), South Carolina's Department of Health and Environmental Control, and SRS approved a dose assessment method in 1991 that models all radiological emissions as if originating from a generalized center of site (COS) location at two allowable stack heights (0 m and 61 m). However, due to changes in SRS missions, radiological emissions are no longer evenly distributed about the COS. An area-specific simulation of the 2015 SRS radiological airborne emissions was conducted to compare to the current COS method. The results produced a slightly higher dose estimate (2.97 × 10 mSv vs. 2.22 × 10 mSv), marginally changed the overall MEI location, and noted that H-Area tritium emissions dominated the dose. Thus, an H-Area dose model was executed as a potential simplification of the area-specific simulation by adopting the COS methodology and modeling all site emissions from a single location in H-Area using six stack heights that reference stacks specific to the tritium production facilities within H-Area. This "H-Area Tritium Stacks" method produced a small increase in TED estimates (3.03 × 10 mSv vs. 2.97 × 10 mSv) when compared to the area-specific simulation. This suggests that the current COS method is still appropriate for demonstrating compliance with NESHAP regulations but that changing to the H-Area Tritium Stacks assessment method may now be a more appropriate representation of operations at SRS.
75 FR 53999 - Notice of Request for the Extension of Currently Approved Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-02
... identified by the docket number by only one of the following methods: 1. Web site: www.regulations.gov . Follow the instructions for submitting comments on the U.S. Government electronic docket site. (Note: The... comments.) All electronic submissions must be made to the U.S. Government electronic docket site at http...
Wen, Huan Fei; Li, Yan Jun; Arima, Eiji; Naitoh, Yoshitaka; Sugawara, Yasuhiro; Xu, Rui; Cheng, Zhi Hai
2017-03-10
We propose a new multi-image method for obtaining the frequency shift, tunneling current and local contact potential difference (LCPD) on a TiO2(110) surface with atomic resolution. The tunneling current image reveals rarely observed surface oxygen atoms contrary to the conventional results. We analyze how the surface and subsurface defects affect the distribution of the LCPD. In addition, the subsurface defects are observed clearly in the tunneling current image, in contrast to a topographic image. To clarify the origin of the atomic contrast, we perform site-dependent spectroscopy as a function of the tip-sample distance. The multi-image method is expected to be widely used to investigate the charge transfer phenomena between the nanoparticles and surface sites, and it is useful for elucidating the mechanisms of catalytic reactions.
Site remediation techniques in India: a review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anomitra Banerjee; Miller Jothi
India is one of the developing countries that has operated site remediation techniques for waste from the entire nuclear fuel cycle for the last three decades. In this paper we provide an overview of remediation methods currently utilized at various hazardous waste sites in India, along with their advantages and disadvantages. Over the years the site remediation techniques have been well characterized, and different processes for treatment, conditioning, and disposal are being practiced. Remediation methods, categorized as biological, chemical, or physical, are summarized for contaminated soils and environmental waters. This paper covers the site remediation techniques implemented for the treatment and conditioning of wastelands arising from the operation of nuclear power plants, research reactors, and fuel reprocessing units. (authors)
78 FR 8149 - Extension of a Currently Approved Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-05
... in achieving sustainable, systemic change that results in greater safety, permanency, and well-being... agreements. The cross-site evaluation uses a mixed-method, longitudinal approach. Data collection methods...
Andrew B. Self; Andrew W. Ezell; Damon B. Hollis; Derek. Alkire
2011-01-01
Mechanical site preparation is frequently proposed to alleviate poor soil conditions when afforesting retired agricultural fields. Without management of soil problems, oak seedlings planted in these areas may exhibit poor survival. While mechanical site preparation methods currently employed in hardwood afforestation are proven, there is a substantial void in research...
NASA Astrophysics Data System (ADS)
Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David
2017-03-01
An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate the potential impact on ecological resources at the U.S. Department of Energy's Hanford Site in south-central Washington State, and we describe its application in two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining resource levels from 0 to 5. We also developed a risk rating scale from non-discernible to very high. Field assessment is the critical step for determining resource levels, or for determining whether current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared with previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes, and other stakeholders. Achieving consistency across Department of Energy sites in valuing ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
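The abstract defines a 0-5 resource-level scale and a rating scale from non-discernible to very high, but not how the two are combined. Purely as an illustration, one plausible combination rule, scaling the resource level by the fraction of the resource a remediation activity would disturb, might look like this; the rule, the threshold mapping, and all values are assumptions, not the authors' methodology.

```python
RESOURCE_LEVELS = range(6)  # 0 (none) .. 5 (highest ecological value)
RATINGS = ["non-discernible", "low", "moderate", "high", "very high"]

def risk_rating(resource_level, disturbance_fraction):
    """Hypothetical combination rule: scale the site's resource level
    (0-5) by the fraction of it the activity would disturb (0-1)."""
    if resource_level not in RESOURCE_LEVELS:
        raise ValueError("resource level must be 0-5")
    idx = min(round(resource_level * disturbance_fraction), len(RATINGS) - 1)
    return RATINGS[idx]
```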
Environmental impact assessment of Gonabad municipal waste landfill site using Leopold Matrix.
Sajjadi, Seyed Ali; Aliakbari, Zohreh; Matlabi, Mohammad; Biglari, Hamed; Rasouli, Seyedeh Samira
2017-02-01
An environmental impact assessment (EIA) before embarking on any project, including a landfill, is a useful tool to reduce the project's potential effects where possible. The main objective of this study was to assess the environmental impact of the current municipal solid waste disposal site of Gonabad using the Iranian Leopold matrix method. This cross-sectional study was conducted in 2015 to assess the environmental impacts of a landfill site in Gonabad using an Iranian matrix (modified Leopold matrix). The study was based on field visits to the landfill and information collected from various sources; five available options were analyzed and compared: continuation of the current disposal practices, construction of a new sanitary landfill, a recycling plan, a composting plant, and an incineration plant. The best option was proposed to replace the existing landfill. The current approach scored 2.35, the construction of a new sanitary landfill 1.59, the compost plant 1.57, and the recycling and incineration plants 1.68 and 2.3, respectively. Results showed that continuation of the current method of disposal is rejected due to severe environmental damage and health problems. A compost plant, with the lowest negative score, is the best option for the waste disposal site of Gonabad City and has priority over the other four options.
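The Leopold-matrix comparison described above reduces, in essence, to scoring each disposal option against a set of environmental components and ranking the options by mean impact score. A minimal sketch, with hypothetical scores chosen only to mirror the reported ranking:

```python
# Leopold-style matrix scoring sketch. The per-component impact scores below
# are invented for illustration (chosen so the option means echo the reported
# totals); they are not the study's actual matrix entries.

options = {
    "continue current disposal": [2.1, 2.6, 2.4, 2.3],
    "new sanitary landfill":     [1.5, 1.7, 1.6, 1.6],
    "compost plant":             [1.4, 1.6, 1.7, 1.6],
    "recycling plant":           [1.6, 1.8, 1.7, 1.6],
    "incineration plant":        [2.2, 2.4, 2.3, 2.3],
}

def mean_score(scores):
    """Mean negative-impact score across environmental components."""
    return sum(scores) / len(scores)

# lower mean impact is better; the first entry is the preferred option
ranked = sorted(options, key=lambda o: mean_score(options[o]))
best = ranked[0]
```

With these illustrative values the compost plant ranks first and continuing current disposal ranks last, matching the study's conclusion.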
Development of variable LRFD φ factors for deep foundation design due to site variability.
DOT National Transportation Integrated Search
2012-04-01
The current design guidelines of Load and Resistance Factor Design (LRFD) specify constant values for deep foundation design, based on the analytical method selected and the degree of redundancy of the pier. However, investigation of multiple sites in ...
Verbal autopsy: current practices and challenges.
Soleman, Nadia; Chandramohan, Daniel; Shibuya, Kenji
2006-01-01
Cause-of-death data derived from verbal autopsy (VA) are increasingly used for health planning, priority setting, monitoring and evaluation in countries with incomplete or no vital registration systems. In some regions of the world it is the only method available to obtain estimates on the distribution of causes of death. Currently, the VA method is routinely used at over 35 sites, mainly in Africa and Asia. In this paper, we present an overview of the VA process and the results of a review of VA tools and operating procedures used at demographic surveillance sites and sample vital registration systems. We asked for information from 36 field sites about field-operating procedures and reviewed 18 verbal autopsy questionnaires and 10 cause-of-death lists used in 13 countries. The format and content of VA questionnaires, field-operating procedures, cause-of-death lists and the procedures to derive causes of death from VA process varied substantially among sites. We discuss the consequences of using varied methods and conclude that the VA tools and procedures must be standardized and reliable in order to make accurate national and international comparisons of VA data. We also highlight further steps needed in the development of a standard VA process. PMID:16583084
78 FR 35072 - Proposed Revisions to Reliability Assurance Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-11
... current staff review methods and practices based on lessons learned from NRC reviews of design... following methods (unless this document describes a different method for submitting comments on a specific... possesses and is publicly-available, by the following methods: Federal Rulemaking Web site: Go to http://www...
Local density approximation in site-occupation embedding theory
NASA Astrophysics Data System (ADS)
Senjean, Bruno; Tsuchiizu, Masahisa; Robert, Vincent; Fromager, Emmanuel
2017-01-01
Site-occupation embedding theory (SOET) is a density functional theory (DFT)-based method which aims at modelling strongly correlated electrons. It is in principle exact and applicable to model and quantum chemical Hamiltonians. The theory is presented here for the Hubbard Hamiltonian. In contrast to conventional DFT approaches, the site (or orbital) occupations are deduced in SOET from a partially interacting system consisting of one (or more) impurity site(s) and non-interacting bath sites. The correlation energy of the bath is then treated implicitly by means of a site-occupation functional. In this work, we propose a simple impurity-occupation functional approximation based on the two-level (2L) Hubbard model which is referred to as two-level impurity local density approximation (2L-ILDA). Results obtained on a prototypical uniform eight-site Hubbard ring are promising. The extension of the method to larger systems and more sophisticated model Hamiltonians is currently in progress.
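The two-level (2L) Hubbard model underlying the 2L-ILDA functional can be made concrete with a small sketch: the half-filled two-site Hubbard Hamiltonian in the singlet sector (two doubly occupied states plus the covalent singlet) diagonalized numerically. This illustrates the model itself, not the SOET implementation:

```python
import numpy as np

def two_site_hubbard_ground_energy(t, U):
    """Ground-state energy of the half-filled two-site Hubbard model.
    Singlet-sector basis: |doubly occupied site 1>, |doubly occupied site 2>,
    and the covalent singlet; hopping t couples the covalent state to each
    ionic state with amplitude -sqrt(2)*t, and U penalizes double occupancy."""
    s2t = np.sqrt(2.0) * t
    H = np.array([[U,     0.0,  -s2t],
                  [0.0,   U,    -s2t],
                  [-s2t,  -s2t,  0.0]])
    return np.linalg.eigvalsh(H)[0]  # eigvalsh returns ascending eigenvalues
```

The result reproduces the known closed form E0 = (U - sqrt(U^2 + 16 t^2)) / 2, and reduces to -2t in the non-interacting limit U = 0.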
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tegen, Suzanne
This presentation provides an overview of findings from a report published in 2016 by researchers at the National Renewable Energy Laboratory, An Initial Evaluation of Siting Considerations on Current and Future Wind Deployment. The presentation covers the background for research, the Energy Department's Wind Vision, research methods, siting considerations, the wind project deployment process, and costs associated with siting considerations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneemann, Matthias; Carius, Reinhard; Rau, Uwe
2015-05-28
This paper studies the effective electrical size and carrier multiplication of breakdown sites in multi-crystalline silicon solar cells. The local series resistance limits the current of each breakdown site and thereby linearizes the current-voltage characteristic. This fact allows the effective electrical diameters to be estimated to be as low as 100 nm. Using a laser beam induced current (LBIC) measurement with a high spatial resolution, we find carrier multiplication factors on the order of 30 (Zener-type breakdown) and 100 (avalanche breakdown) as new lower limits. Hence, we prove that the so-called Zener-type breakdown is also followed by avalanche multiplication. We explain that previous measurements of the carrier multiplication using thermography yield results higher than unity only if the spatial defect density is high enough and the illumination intensity is lower than what was used for the LBIC method. The individual series resistances of the breakdown sites limit the current through these breakdown sites. Therefore, the measured multiplication factors depend on the applied voltage as well as on the injected photocurrent. Both dependencies are successfully simulated using a series-resistance-limited diode model.
The Prevention of Surgical Site Infection in Elective Colon Surgery
Fry, Donald E.
2013-01-01
Infections at the surgical site continue to occur in as many as 20% of elective colon resection cases. Methods to reduce these infections are inconsistently applied. Surgical site infection (SSI) is the result of multiple interactive variables including the inoculum of bacteria that contaminate the site, the virulence of the contaminating microbes, and the local environment at the surgical site. These variables that promote infection are potentially offset by the effectiveness of the host defense. Reduction in the inoculum of bacteria is achieved by appropriate surgical site preparation, systemic preventive antibiotics, and use of mechanical bowel preparation in conjunction with the oral antibiotic bowel preparation. Intraoperative reduction of hematoma, necrotic tissue, foreign bodies, and tissue dead space will reduce infections. Enhancement of the host may be achieved by perioperative supplemental oxygenation, maintenance of normothermia, and glycemic control. These methods require additional research to identify optimum application. Uniform application of currently understood methods and continued research into new methods to reduce microbial contamination and enhancement of host responsiveness can lead to better outcomes. PMID:24455434
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umari, A.M.J.; Geldon, A.; Patterson, G.
1994-12-31
Yucca Mountain, Nevada, currently is being investigated by the U.S. Geological Survey as a potential site for a high-level nuclear waste repository. Planned hydraulic-stress and tracer tests in fractured, tuffaceous rocks below the water table at Yucca Mountain will require work at depths in excess of 1,300 feet. To facilitate prototype testing of equipment and methods to be used in aquifer tests at Yucca Mountain, an analog site was selected in the foothills of the Sierra Nevada near Raymond, California. Two of nine 250- to 300-feet deep wells drilled into fractured, granitic rocks at the Raymond site have been instrumented with packers, pressure transducers, and other equipment that will be used at Yucca Mountain. Aquifer tests conducted at the Raymond site to date have demonstrated a need to modify some of the equipment and methods conceived for use at Yucca Mountain.
Current reversals and metastable states in the infinite Bose-Hubbard chain with local particle loss
NASA Astrophysics Data System (ADS)
Kiefer-Emmanouilidis, M.; Sirker, J.
2017-12-01
We present an algorithm which combines the quantum trajectory approach to open quantum systems with a density-matrix renormalization-group scheme for infinite one-dimensional lattice systems. We apply this method to investigate the long-time dynamics in the Bose-Hubbard model with local particle loss starting from a Mott-insulating initial state with one boson per site. While the short-time dynamics can be described even quantitatively by an equation of motion (EOM) approach at the mean-field level, many-body interactions lead to unexpected effects at intermediate and long times: local particle currents far away from the dissipative site start to reverse direction ultimately leading to a metastable state with a total particle current pointing away from the lossy site. An alternative EOM approach based on an effective fermion model shows that the reversal of currents can be understood qualitatively by the creation of holon-doublon pairs at the edge of the region of reduced particle density. The doublons are then able to escape while the holes move towards the dissipative site, a process reminiscent—in a loose sense—of Hawking radiation.
Donnelly, Aoife; Misstear, Bruce; Broderick, Brian
2011-02-15
Background concentrations of nitrogen dioxide (NO(2)) are not constant but vary temporally and spatially. The current paper presents a powerful tool for the quantification of the effects of wind direction and wind speed on background NO(2) concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving methods for predicting background concentrations adopted in air quality modelling studies. The relationship between measured NO(2) concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO(2) in Ireland. The method was expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis and circular statistics employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO(2) at all three sites. Frequently environmental impact assessments are based on short term baseline monitoring producing a limited dataset. The presented non-parametric regression methods, in contrast to the frequently used methods such as binning of the data, allow concentrations for missing data pairs to be estimated and distinction between spurious and true peaks in concentrations to be made. The methods were found to provide a realistic estimation of long term concentration variation with wind direction and speed, even for cases where the data set is limited. 
Accurate identification of the actual variation at each location and causative factors could be made, thus supporting the improved definition of background concentrations for use in air quality modelling studies. Copyright © 2010 Elsevier B.V. All rights reserved.
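The core estimator described here, Gaussian-kernel Nadaraya-Watson regression with circular handling of wind direction, can be sketched as follows; the bandwidth and data are illustrative, not the paper's:

```python
import numpy as np

def circular_kernel_regression(theta_obs_deg, y_obs, theta_grid_deg, bandwidth=0.5):
    """Nadaraya-Watson regression of concentration on wind direction.
    A Gaussian kernel is applied to the angular distance, wrapped so that
    350 deg and 10 deg are treated as close (circular statistics).
    Illustrative sketch only, not the study's exact estimator."""
    theta_obs = np.radians(np.asarray(theta_obs_deg, dtype=float))
    theta_grid = np.radians(np.asarray(theta_grid_deg, dtype=float))
    y_obs = np.asarray(y_obs, dtype=float)
    # smallest signed angular difference in radians, via the unit circle
    diff = np.angle(np.exp(1j * (theta_grid[:, None] - theta_obs[None, :])))
    w = np.exp(-0.5 * (diff / bandwidth) ** 2)  # Gaussian kernel weights
    return (w @ y_obs) / w.sum(axis=1)

# wrap-around check: observations at 350 deg and 10 deg both lie near north
est = circular_kernel_regression([350.0, 10.0], [4.0, 6.0], [0.0])
```

Because the angular distance is wrapped, a prediction at 0 degrees weights the 350-degree and 10-degree observations equally, which plain (non-circular) kernel regression would not.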
DOT National Transportation Integrated Search
1993-12-01
The Alternating Current Potential Drop (ACPD) method is investigated as a means of making measurements in laboratory experiments on the initiation and growth of multiple site damage (MSD) cracks in a common aluminum alloy used for aircraft constructi...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, Joanna; Gochfeld, Michael; Bunn, Amoret
An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy’s Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and assigning resource levels from 0 to 5. We also developed a risk rating scale from nondiscernable to very high. Field assessment is the critical step to determine resource levels or to determine whether current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy sites in the valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
CisMapper: predicting regulatory interactions from transcription factor ChIP-seq data
O'Connor, Timothy; Bodén, Mikael
2017-01-01
Identifying the genomic regions and regulatory factors that control the transcription of genes is an important, unsolved problem. The current method of choice predicts transcription factor (TF) binding sites using chromatin immunoprecipitation followed by sequencing (ChIP-seq), and then links the binding sites to putative target genes solely on the basis of the genomic distance between them. Evidence from chromatin conformation capture experiments shows that this approach is inadequate due to long-distance regulation via chromatin looping. We present CisMapper, which predicts the regulatory targets of a TF using the correlation between a histone mark at the TF's bound sites and the expression of each gene across a panel of tissues. Using both chromatin conformation capture and differential expression data, we show that CisMapper is more accurate at predicting the target genes of a TF than the distance-based approaches currently used, and is particularly advantageous for predicting the long-range regulatory interactions typical of tissue-specific gene expression. CisMapper also predicts which TF binding sites regulate a given gene more accurately than using genomic distance. Unlike distance-based methods, CisMapper can predict which transcription start site of a gene is regulated by a particular binding site of the TF. PMID:28204599
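The scoring principle described, correlating a histone-mark signal at a bound site with a gene's expression across tissues, reduces to a per-pair correlation coefficient. A plain Pearson sketch with invented tissue-panel values (CisMapper's actual scoring may differ):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical signals across a panel of six tissues for one (site, gene) pair
histone_at_site = [1.0, 3.0, 2.0, 8.0, 5.0, 9.0]
expression_gene = [0.5, 1.8, 1.1, 4.0, 2.9, 4.6]
score = pearson(histone_at_site, expression_gene)
```

A high score links the site to the gene regardless of the genomic distance between them, which is the point of departure from distance-based assignment.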
Optimal number of stimulation contacts for coordinated reset neuromodulation
Lysyansky, Borys; Popovych, Oleksandr V.; Tass, Peter A.
2013-01-01
In this computational study we investigate coordinated reset (CR) neuromodulation designed for an effective control of synchronization by multi-site stimulation of neuronal target populations. This method was suggested to effectively counteract pathological neuronal synchrony characteristic for several neurological disorders. We study how many stimulation sites are required for optimal CR-induced desynchronization. We found that a moderate increase of the number of stimulation sites may significantly prolong the post-stimulation desynchronized transient after the stimulation is completely switched off. This can, in turn, reduce the amount of the administered stimulation current for the intermittent ON–OFF CR stimulation protocol, where time intervals with stimulation ON are recurrently followed by time intervals with stimulation OFF. In addition, we found that the optimal number of stimulation sites essentially depends on how strongly the administered current decays within the neuronal tissue with increasing distance from the stimulation site. In particular, for a broad spatial stimulation profile, i.e., for a weak spatial decay rate of the stimulation current, CR stimulation can optimally be delivered via a small number of stimulation sites. Our findings may contribute to an optimization of therapeutic applications of CR neuromodulation. PMID:23885239
Miyoshi, S; Sakajiri, M; Ifukube, T; Matsushima, J
1997-01-01
We have proposed the Tripolar Electrode Stimulation Method (TESM), which may enable us to narrow the stimulation region and to move the stimulation site continuously for cochlear implants. We evaluated whether or not TESM works according to theory, based on numerical analysis using an auditory nerve fiber model. In this simulation, the sum of the excited model fibers was compared with the compound action potentials obtained from animal experiments. As a result, this experiment showed that TESM could narrow a stimulation region by controlling the sum of the currents emitted from the electrodes on both sides, and continuously move a stimulation site by changing the ratio of the currents emitted from the electrodes on both sides.
NASA Astrophysics Data System (ADS)
Harris, Courtney K.; Wiberg, Patricia L.
1997-09-01
Modeling shelf sediment transport rates and bed reworking depths is problematic when the wave and current forcing conditions are not precisely known, as is usually the case when long-term sedimentation patterns are of interest. Two approaches to modeling sediment transport under such circumstances are considered. The first relies on measured or simulated time series of flow conditions to drive model calculations. The second approach uses as model input probability distribution functions of bottom boundary layer flow conditions developed from wave and current measurements. Sediment transport rates, frequency of bed resuspension by waves and currents, and bed reworking calculated using the two methods are compared at the mid-shelf STRESS (Sediment TRansport on Shelves and Slopes) site on the northern California continental shelf. Current, wave and resuspension measurements at the site are used to generate model inputs and test model results. An 11-year record of bottom wave orbital velocity, calculated from surface wave spectra measured by the National Data Buoy Center (NDBC) Buoy 46013 and verified against bottom tripod measurements, is used to characterize the frequency and duration of wave-driven transport events and to estimate the joint probability distribution of wave orbital velocity and period. A 109-day record of hourly current measurements 10 m above bottom is used to estimate the probability distribution of bottom boundary layer current velocity at this site and to develop an auto-regressive model to simulate current velocities for times when direct measurements of currents are not available. Frequency of transport, the maximum volume of suspended sediment, and average flux calculated using measured wave and simulated current time series agree well with values calculated using measured time series. 
A probabilistic approach is more amenable to calculations over time scales longer than existing wave records, but it tends to underestimate net transport because it does not capture the episodic nature of transport events. Both methods enable estimates to be made of the uncertainty in transport quantities that arise from an incomplete knowledge of the specific timing of wave and current conditions.
Sando, Roy; Chase, Katherine J.
2017-03-23
A common statistical procedure for estimating streamflow statistics at ungaged locations is to develop a relational model between streamflow and drainage basin characteristics at gaged locations using least squares regression analysis; however, least squares regression methods are parametric and make constraining assumptions about the data distribution. The random forest regression method provides an alternative nonparametric method for estimating streamflow characteristics at ungaged sites and requires that the data meet fewer statistical conditions than least squares regression methods. Random forest regression analysis was used to develop predictive models for 89 streamflow characteristics using Precipitation-Runoff Modeling System simulated streamflow data and drainage basin characteristics at 179 sites in central and eastern Montana. The predictive models were developed from streamflow data simulated for current (baseline, water years 1982–99) conditions and three future periods (water years 2021–38, 2046–63, and 2071–88) under three different climate-change scenarios. These predictive models were then used to predict streamflow characteristics for baseline conditions and three future periods at 1,707 fish sampling sites in central and eastern Montana. The average root mean square error for all predictive models was about 50 percent. When streamflow predictions at 23 fish sampling sites were compared to nearby locations with simulated data, the mean relative percent difference was about 43 percent. When predictions were compared to streamflow data recorded at 21 U.S. Geological Survey streamflow-gaging stations outside of the calibration basins, the average mean absolute percent error was about 73 percent.
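The modelling step, fitting a random forest to basin characteristics at gaged or simulated sites and predicting at ungaged ones, can be sketched with scikit-learn; the feature set and synthetic data below are hypothetical, not the study's:

```python
# Random-forest regression sketch: train on (basin characteristics -> streamflow
# statistic) at "gaged" sites, then predict at new sites. All values synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 179
# hypothetical basin characteristics: area (km2), mean precip (mm), elevation (m)
X = np.column_stack([
    rng.uniform(10, 5000, n),
    rng.uniform(250, 900, n),
    rng.uniform(600, 2500, n),
])
# synthetic target: a nonlinear function of area and precipitation plus noise
y = 0.002 * X[:, 0] * X[:, 1] / 500.0 + rng.normal(0.0, 1.0, n)

# nonparametric fit: no assumed functional form or error distribution
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)
pred = model.predict(X[:5])  # stand-in for prediction at ungaged sites
```

The appeal over least squares regression, as the abstract notes, is that no parametric form or distributional assumption is imposed on the streamflow-characteristic relationship.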
NASA Technical Reports Server (NTRS)
Canfield, Richard C.; De La Beaujardiere, J.-F.; Fan, Yuhong; Leka, K. D.; Mcclymont, A. N.; Metcalf, Thomas R.; Mickey, Donald L.; Wuelser, Jean-Pierre; Lites, Bruce W.
1993-01-01
Electric current systems in solar active regions and their spatial relationship to sites of electron precipitation and high-pressure in flares were studied with the purpose of providing observational evidence for or against the flare models commonly discussed in the literature. The paper describes the instrumentation, the data used, and the data analysis methods, as well as improvements made upon earlier studies. Several flare models are overviewed, and the predictions yielded by each model for the relationships of flares to the vertical current systems are discussed.
Detecting defects in marine structures by using eddy current infrared thermography.
Swiderski, W
2016-12-01
Eddy current infrared (IR) thermography is a new nondestructive testing (NDT) technique used for the detection of cracks in electroconductive materials. By combining the well-established inspection methods of eddy current NDT and IR thermography, this technique uses induced eddy currents to heat test samples. In this way, IR thermography allows the visualization of eddy current distribution that is distorted in defect sites. This paper discusses the results of numerical modeling of eddy current IR thermography procedures in application to marine structures.
Non-Invasive Seismic Methods for Earthquake Site Classification Applied to Ontario Bridge Sites
NASA Astrophysics Data System (ADS)
Bilson Darko, A.; Molnar, S.; Sadrekarimi, A.
2017-12-01
How a site responds to earthquake shaking, and the damage that results, is largely influenced by the underlying ground conditions through which the seismic waves propagate. The effects of site conditions on propagating seismic waves can be predicted from measurements of the shear wave velocity (Vs) of the soil layer(s) and the impedance ratio between bedrock and soil. Currently, the seismic design of new buildings and bridges (2015 Canadian building and bridge codes) requires determination of the time-averaged shear-wave velocity of the upper 30 metres (Vs30) of a given site. In this study, two in situ Vs profiling methods, Multichannel Analysis of Surface Waves (MASW) and Ambient Vibration Array (AVA), are used to determine Vs30 at chosen bridge sites in Ontario, Canada. Both active-source (MASW) and passive-source (AVA) surface wave methods are used at each bridge site to obtain Rayleigh-wave phase velocities over a wide frequency bandwidth. The dispersion curve is jointly inverted with each site's amplification function (microtremor horizontal-to-vertical spectral ratio) to obtain shear-wave velocity profile(s). We apply our non-invasive testing at three major infrastructure projects, including five bridge sites along the Rt. Hon. Herb Gray Parkway in Windsor, Ontario. Our non-invasive testing is co-located with previous invasive testing, including Standard Penetration Test (SPT), Cone Penetration Test and downhole Vs data. Correlations between SPT blowcount and Vs are developed for the different soil types sampled at our Ontario bridge sites. A robust earthquake site classification procedure (reliable Vs30 estimates) for bridge sites across Ontario is evaluated from available combinations of invasive and non-invasive site characterization methods.
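The Vs30 statistic used above is the time-averaged (travel-time-weighted, not arithmetic-mean) shear-wave velocity of the top 30 m: Vs30 = 30 / Σ(h_i / Vs_i). A short sketch with a hypothetical two-layer profile:

```python
def vs30(layers):
    """Time-averaged shear-wave velocity of the upper 30 m.
    layers: list of (thickness_m, vs_m_per_s) tuples, top-down.
    The travel path is truncated at exactly 30 m depth."""
    remaining = 30.0
    travel_time = 0.0
    for h, vs in layers:
        d = min(h, remaining)       # use only the part of the layer above 30 m
        travel_time += d / vs       # shear-wave travel time through that part
        remaining -= d
        if remaining <= 0.0:
            break
    if remaining > 0.0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# hypothetical profile: 10 m of soft soil (200 m/s) over stiff till (600 m/s)
profile = [(10.0, 200.0), (40.0, 600.0)]
```

Note that the slow surface layer dominates: the travel-time average here is 360 m/s, well below the thickness-weighted mean of the two velocities.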
NASA Astrophysics Data System (ADS)
Wu, Yunna; Chen, Kaifeng; Xu, Hu; Xu, Chuanbo; Zhang, Haobo; Yang, Meng
2017-12-01
There is insufficient research relating to offshore wind farm site selection in China, and the current methods for site selection have some defects. First, information loss arises from two sources: the implicit assumption that the probability distribution over an interval number is uniform, and the neglect of decision makers' (DMs') common opinion in evaluating criteria information. Second, the difference in DMs' utility functions has failed to receive attention. An innovative method is proposed in this article to address these drawbacks. First, a new form of interval number and its weighted operator are proposed to reflect uncertainty and reduce information loss. Second, a new stochastic dominance degree is proposed to quantify interval numbers with a probability distribution. Third, a two-stage method integrating the weighted operator with the stochastic dominance degree is proposed to evaluate the alternatives. Finally, a case from China proves the effectiveness of this method.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-14
...'s Web site at http://www.batstrading.com , at the principal office of the Exchange, and at the...-brand one of its routing strategies, currently referred to as ``DART,'' as the ``Dark Routing Technique... only one method. The Commission will post all comments on the Commission's Internet Web site ( http...
Preliminary Evaluation of Method to Monitor Landfills Resilience against Methane Emission
NASA Astrophysics Data System (ADS)
Chusna, Noor Amalia; Maryono, Maryono
2018-02-01
Methane emission from landfill sites contributes to global warming, and improper methane treatment can pose an explosion hazard. Stakeholders and city governments in Indonesia have found it significantly difficult to monitor the resilience of landfills against methane emission. Moreover, the management of methane gas has always been a challenging issue for long-term waste management services and operations. Landfills are a significant contributor to anthropogenic methane emissions. This study conducted a preliminary evaluation of methods to manage methane gas emission by assessing the LandGEM and IPCC methods. From this preliminary evaluation, the study found that the IPCC method is based on the availability of current and historical country-specific data regarding the waste disposed of in landfills, while LandGEM is an automated tool for estimating emission rates for total landfill gas, accounting for methane, carbon dioxide, and other gases. The method can be used either with site-specific data to estimate emissions or with default parameters if no site-specific data are available. Both methods could be utilized to monitor methane emission from landfill sites in the cities of Central Java.
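LandGEM automates a first-order decay model in which each year's disposed waste continues to generate methane at a rate proportional to exp(-k * age). A simplified annual-increment sketch with illustrative (not default) parameter values:

```python
import math

def methane_generation(year, disposal, k=0.05, L0=170.0):
    """Annual CH4 generation in `year` from waste disposed in prior years,
    first-order decay style (simplified to annual increments).
    disposal: dict {year: mass disposed (Mg)}; k: decay rate (1/yr);
    L0: methane generation potential (m3 CH4 per Mg waste).
    Parameter values are illustrative, not site-specific defaults."""
    q = 0.0
    for y, m in disposal.items():
        age = year - y
        if age >= 0:  # waste contributes only after it is disposed
            q += k * L0 * m * math.exp(-k * age)
    return q

# hypothetical disposal history for a small landfill (Mg per year)
disposal = {2010: 10000.0, 2011: 12000.0, 2012: 11000.0}
q_2012 = methane_generation(2012, disposal)
```

Once disposal stops, generation decays year by year, which is why the evaluation year matters when comparing sites.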
NASA Technical Reports Server (NTRS)
Campbell, W. H.; Zimmerman, J. E.
1979-01-01
The field gradient method for observing the electric currents in the Alaska pipeline provided consistent values for both the fluxgate and SQUID methods of observation. These currents were linearly related to the regularly measured electric and magnetic field changes. Determinations of pipeline current were consistent with values obtained by a direct-connection, current-shunt technique at a pipeline site about 9.6 km away. The gradient method has the distinct advantages of portability and buried-pipe capability. Field gradients due to the pipe magnetization, geological features, or ionospheric source currents do not seem to contribute a measurable error to such pipe current determination. The SQUID gradiometer is inherently sensitive enough to detect very small currents in a linear conductor at 10 meters, or conversely, to detect currents of one ampere or more at relatively great distances. It is fairly straightforward to achieve an imbalance of less than one part in ten thousand, and with extreme care, one part in one million or better.
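The inference behind the gradient method can be illustrated with the field of a long straight conductor, B(r) = mu0 * I / (2 * pi * r): measuring at two distances cancels any spatially uniform background, leaving only the pipe current. An idealized sketch, not the actual instrument processing:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def pipe_current(b1, r1, b2, r2):
    """Infer pipe current I (A) from total fields b1, b2 (T) measured at
    radii r1, r2 (m), assuming B_total(r) = MU0*I/(2*pi*r) + B_background
    with a spatially uniform background that cancels in the difference:
    b1 - b2 = (MU0*I/(2*pi)) * (1/r1 - 1/r2)."""
    return 2.0 * math.pi * (b1 - b2) / (MU0 * (1.0 / r1 - 1.0 / r2))

# forward check with a hypothetical 50 A pipeline current and a uniform
# 20 microtesla background field (roughly geomagnetic magnitude)
I_true, bg = 50.0, 2e-5
field = lambda r: MU0 * I_true / (2.0 * math.pi * r) + bg
```

Because only the field difference enters, a uniform background of any size drops out, which is the practical appeal of a gradiometer over a single magnetometer.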
Fire history, population, and calcium cycling in the current river watershed
Richard P. Guyette; Bruce E. Cutter
1997-01-01
Quantitative details about past anthropogenic fire regimes and their effects have been lacking in the central hardwood region. Here, we present fire scar chronologies from 23 oak-shortleaf pine (Quercus spp. and Pinus echinata Mill.) sites in the upper Current River watershed of the Missouri Ozarks. Dendrochronological methods were...
Analysis strategies for longitudinal attachment loss data.
Beck, J D; Elter, J R
2000-02-01
The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site vs person level of analysis, including statistical adjustment for correlated data; the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology to use for periodontal studies with more than one period of follow-up and that the use of studies not employing methods for dealing with complex samples, correlated data, and repeated measures does not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
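Incidence density, the rate the review recommends for studies with more than one follow-up period, is simply events divided by accumulated site-time at risk. A minimal sketch with made-up numbers:

```python
def incidence_density(events, time_at_risk):
    """Incidence density: total events per unit of accumulated time at risk.
    events[i]: attachment-loss events observed in follow-up interval i;
    time_at_risk[i]: site-years contributed in interval i (the denominator
    shrinks over time as teeth, and hence sites, are lost)."""
    return sum(events) / sum(time_at_risk)

# hypothetical cohort: three follow-up periods with 12, 9, and 7 ALOSS events
# over progressively fewer site-years at risk
rate = incidence_density([12, 9, 7], [400.0, 360.0, 310.0])
```

Unlike cumulative incidence, this denominator credits each site only for the time it was actually observable, which is why it handles tooth loss and multiple follow-up periods gracefully.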
Francy, Donna S.; Stelzer, Erin A.; Duris, Joseph W.; Brady, Amie M.G.; Harrison, John H.; Johnson, Heather E.; Ware, Michael W.
2013-01-01
Predictive models, based on environmental and water quality variables, have been used to improve the timeliness and accuracy of recreational water quality assessments, but their effectiveness has not been studied in inland waters. Sampling at eight inland recreational lakes in Ohio was done in order to investigate using predictive models for Escherichia coli and to understand the links between E. coli concentrations, predictive variables, and pathogens. Based upon results from 21 beach sites, models were developed for 13 sites, and the most predictive variables were rainfall, wind direction and speed, turbidity, and water temperature. Models were not developed at sites where the E. coli standard was seldom exceeded. Models were validated at nine sites during an independent year. At three sites, the model resulted in increased correct responses, sensitivities, and specificities compared to use of the previous day's E. coli concentration (the current method). Drought conditions during the validation year precluded being able to adequately assess model performance at most of the other sites. Cryptosporidium, adenovirus, eaeA (E. coli), ipaH (Shigella), and spvC (Salmonella) were found in at least 20% of samples collected for pathogens at five sites. The presence or absence of the three bacterial genes was related to some of the model variables but was not consistently related to E. coli concentrations. Predictive models were not effective at all inland lake sites; however, their use at two lakes with high swimmer densities will provide better estimates of public health risk than current methods and will be a valuable resource for beach managers and the public.
The Spleen as an Optimal Site for Islet Transplantation and a Source of Mesenchymal Stem Cells.
Sakata, Naoaki; Yoshimatsu, Gumpei; Kodama, Shohta
2018-05-07
This review demonstrates the unique potential of the spleen as an optimal site for islet transplantation and as a source of mesenchymal stem cells. Islet transplantation is a cellular replacement therapy used to treat severe diabetes mellitus; however, its clinical outcome is currently unsatisfactory. Selection of the most appropriate transplantation site is a major factor affecting the clinical success of this therapy. The spleen has long been studied as a candidate site for islet transplantation. Its advantages include physiological insulin drainage and regulation of immunity, and it has recently also been shown to contribute to the regeneration of transplanted islets. However, the efficacy of transplantation in the spleen is lower than that of intraportal transplantation, which is the current representative method of clinical islet transplantation. Safer and more effective methods of islet transplantation need to be established to allow the spleen to be used for clinical transplantation. The spleen is also of interest as a mesenchymal stem cell reservoir. Splenic mesenchymal stem cells contribute to the repair of damaged tissue, and their infusion may thus be a promising therapy for autoimmune diseases, including type 1 diabetes mellitus and Sjogren’s syndrome.
Smoking-Associated Site-Specific Differential Methylation in Buccal Mucosa in the COPDGene Study
Qiu, Weiliang; Carey, Vincent J.; Morrow, Jarrett; Bacherman, Helene; Foreman, Marilyn G.; Hokanson, John E.; Bowler, Russell P.; Crapo, James D.; DeMeo, Dawn L.
2015-01-01
DNA methylation is a complex, tissue-specific phenomenon that can reflect both endogenous factors and exogenous exposures. Buccal brushings represent an easily accessible source of DNA, which may be an appropriate surrogate tissue in the study of environmental exposures and chronic respiratory diseases. Buccal brushings were obtained from a subset of current and former smokers from the COPDGene study. Genome-wide DNA methylation data were obtained in the discovery cohort (n = 82) using the Illumina HumanMethylation450K array. Empirical Bayes methods were used to test for differential methylation by current smoking status at 468,219 autosomal CpG sites using linear models adjusted for age, sex, and race. Pyrosequencing was performed in a nonoverlapping replication cohort (n = 130). Current smokers were significantly younger than former smokers in both the discovery and replication cohorts. Seven CpG sites were associated with current smoking at a false discovery rate less than 0.05 in the discovery cohort. Six of the seven significant sites were pyrosequenced in the replication cohort; five CpG sites, including sites annotated to CYP1B1 and PARVA, were replicated. Correlations between cumulative smoke exposure and time since smoking cessation were observed in a subset of the significantly associated CpG sites. A significant correlation between reduced lung function and increased radiographic emphysema with methylation at cg02162897 (CYP1B1) was observed among female subjects. Site-specific methylation of DNA isolated from buccal mucosa is associated with exposure to cigarette smoke, and may provide insights into the mechanisms underlying differential susceptibility toward the development of smoking-related chronic respiratory diseases. PMID:25517428
Automated prediction of protein function and detection of functional sites from structure.
Pazos, Florencio; Sternberg, Michael J E
2004-10-12
Current structural genomics projects are yielding structures for proteins whose functions are unknown. Accordingly, there is a pressing requirement for computational methods for function prediction. Here we present PHUNCTIONER, an automatic method for structure-based function prediction using automatically extracted functional sites (residues associated with functions). The method relates proteins with the same function through structural alignments and extracts 3D profiles of conserved residues. Functional features to train the method are extracted from the Gene Ontology (GO) database. The method extracts these features from the entire GO hierarchy and hence is applicable across the whole range of function specificity. 3D profiles associated with 121 GO annotations were extracted. We tested the power of the method both for the prediction of function and for the extraction of functional sites. The success of function prediction by our method was compared with that of the standard homology-based method. In the zone of low sequence similarity (approximately 15%), our method assigns the correct GO annotation in 90% of the protein structures considered, approximately 20% higher than inheritance of function from the closest homologue.
Currently, there is limited guidance on selecting test sites to measure surface infiltration rates in permeable pavement systems to determine maintenance frequency. The ASTM method (ASTM C1701) for measuring the infiltration rate of in-place pervious concrete suggests either to (1) p...
Electron microscopy methods in studies of cultural heritage sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vasiliev, A. L., E-mail: a.vasiliev56@gmail.com; Kovalchuk, M. V.; Yatsishina, E. B.
The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient “nanotechnologies”; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.
Electron microscopy methods in studies of cultural heritage sites
NASA Astrophysics Data System (ADS)
Vasiliev, A. L.; Kovalchuk, M. V.; Yatsishina, E. B.
2016-11-01
The history of the development and application of scanning electron microscopy (SEM), transmission electron microscopy (TEM), and energy-dispersive X-ray microanalysis (EDXMA) in studies of cultural heritage sites is considered. In fact, investigations based on these methods began when electron microscopes became a commercial product. Currently, these methods, being developed and improved, help solve many historical enigmas. To date, electron microscopy combined with microanalysis makes it possible to investigate any object, from parchment and wooden articles to pigments, tools, and objects of art. Studies by these methods have revealed that some articles were made by ancient masters using ancient "nanotechnologies"; hence, their comprehensive analysis calls for the latest achievements in the corresponding instrumental methods and sample preparation techniques.
Studies of reaction geometry in oxidation and reduction of the alkaline silver electrode
NASA Technical Reports Server (NTRS)
Butler, E. A.; Blackham, A. U.
1971-01-01
Two methods of surface-area estimation for sintered silver electrodes have given roughness factors of 58 and 81: one based on constant-current oxidation, the other on potentiostatic oxidation. Examination of both wire and sintered silver electrodes via scanning electron microscopy at various stages of oxidation has shown that the important structural features are mounds of oxide. In potentiostatic oxidations these appear to form on instantaneously nucleated sites, while in constant-current oxidations progressive nucleation is indicated.
Characterization of Leachate at Simpang Renggam Landfill Site, Johor, Malaysia
NASA Astrophysics Data System (ADS)
Zailani, L. W. M.; Amdan, N. S. M.; Zin, N. S. M.
2018-04-01
Today, the world faces a major problem in managing the ever-increasing volume of solid waste. Malaysia is no exception, with 296 landfills open to deal with this problem. Landfilling is currently the preferred option for managing solid waste because of its low cost. Its disadvantage is that the leachate it produces can pollute ground and surface water resources. This study focuses on analysing the leachate composition at the Simpang Renggam Landfill (SRL) site for seven parameters: COD, BOD, SS, turbidity, pH, BOD5/COD, and ammonia (NH3-N). All the data obtained were compared with previous research and the Malaysian Environmental Quality Act 1974. Based on the results, the SRL leachate was categorized as partially stabilized, with BOD5/COD > 0.1. Because the leachate composition is classified as old and the aerated lagoon method is not adequate for treating aging leachate, a physical-chemical treatment method is recommended for the SRL site.
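The BOD5/COD ratio used above is a standard indicator of leachate age. The cut-offs in the sketch below are common literature values, not taken from this study (which reports only that BOD5/COD > 0.1 indicates partially stabilized leachate):

```python
def leachate_stage(bod5: float, cod: float) -> str:
    """Rough leachate-age classification from the BOD5/COD ratio.
    Thresholds are typical literature values and vary by source."""
    ratio = bod5 / cod
    if ratio > 0.5:
        return "young (readily biodegradable)"
    if ratio > 0.1:
        return "partially stabilized (intermediate)"
    return "old / stabilized"
```

A low ratio means little of the organic load is biodegradable, which is why biological treatment such as an aerated lagoon underperforms on old leachate.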
Methods for determining manning's coefficients for Illinois streams
Soong, D.T.; Halfar, T.M.; Jupin, M.A.; Wobig, L.A.; ,
2004-01-01
Determination of Manning's coefficient, n, for natural streams remains a challenge in practice. One approach that has received practitioners' attention is to present n-values determined from field data (measured discharge and water-surface slope) together with photographs and site descriptions (ancillary information). Further improvements to this visual approach can be made in how site characteristics and ancillary information are presented, so that users can apply the published information to sites of interest with similar features. A current project applying this approach to Illinois streams is discussed.
Hurtado-Chong, Anahí; Joeris, Alexander; Hess, Denise; Blauth, Michael
2017-07-12
A considerable number of clinical studies experience delays, which result in increased duration and costs. In multicentre studies, patient recruitment is among the leading causes of delays. Poor site selection can result in low recruitment and bad data quality. Site selection is therefore crucial for study quality and completion, but currently no specific guidelines are available. Selection of sites adequate to participate in a prospective multicentre cohort study was performed through an open call using a newly developed, objective, multistep approach. The method is based on the use of a network, the definition of objective criteria, and a systematic screening process. Out of 266 interested sites, 24 were shortlisted and 12 were finally selected to participate in the study. The steps in the process included an open call through a network, selection questionnaires tailored to the study, evaluation of responses using objective criteria, and scripted telephone interviews. At each step, the number of candidate sites was quickly reduced, leaving only the most promising candidates. Recruitment and data quality met expectations despite the contracting problems faced with some sites. The results of our first experience with a standardised and objective method of site selection are encouraging. The site selection method described here can serve as a guideline for other researchers performing multicentre studies. ClinicalTrials.gov: NCT02297581. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Alternative right ventricular pacing sites.
Łuciuk, Dariusz; Łuciuk, Marek; Gajek, Jacek
2015-01-01
The main adverse effect of chronic stimulation is stimulation-induced heart failure in the case of ventricular contraction dyssynchrony. Because of this, new stimulation techniques should be considered to optimize electrotherapy. One such method is pacing from alternative right ventricular sites. The purpose of this article is to review the currently accumulated data on alternative sites of cardiac pacing. The Medline and PubMed databases were used to search recently published English and Polish reports. Recent studies report a deleterious effect of long-term apical pacing. It is suggested that permanent apical stimulation, by bypassing the physiological conduction pattern of the His-Purkinje network, may lead to electrical and mechanical dyssynchrony of heart muscle contraction. In the long term, this pathological situation can lead to severe heart failure and death. Scientists have therefore begun to search for alternative sites of cardiac pacing to reduce the deleterious effects of stimulation. Based on currently accumulated data, the right ventricular outflow tract, right ventricular septum, direct His-bundle, or biventricular pacing are suggested as better alternatives due to more physiological electrical impulse propagation within the heart and a reduction of the dyssynchrony effect. These methods should better preserve left ventricular function and prevent the development of heart failure in permanently paced patients. As there are still not enough long-term, randomized, prospective, cross-over, multicenter studies, further research is required to validate the benefits of this kind of therapy. This article draws attention to new sites of cardiac stimulation as a better and safer method of treatment.
Impact of germline and somatic missense variations on drug binding sites.
Yan, C; Pattabiraman, N; Goecks, J; Lam, P; Nayak, A; Pan, Y; Torcivia-Rodriguez, J; Voskanian, A; Wan, Q; Mazumder, R
2017-03-01
Advancements in next-generation sequencing (NGS) technologies are generating a vast amount of data. This exacerbates the current challenge of translating NGS data into actionable clinical interpretations. We have comprehensively combined germline and somatic nonsynonymous single-nucleotide variations (nsSNVs) that affect drug binding sites in order to investigate their prevalence. The integrated data thus generated in conjunction with exome or whole-genome sequencing can be used to identify patients who may not respond to a specific drug because of alterations in drug binding efficacy due to nsSNVs in the target protein's gene. To identify the nsSNVs that may affect drug binding, protein-drug complex structures were retrieved from Protein Data Bank (PDB) followed by identification of amino acids in the protein-drug binding sites using an occluded surface method. Then, the germline and somatic mutations were mapped to these amino acids to identify which of these alter protein-drug binding sites. Using this method we identified 12 993 amino acid-drug binding sites across 253 unique proteins bound to 235 unique drugs. The integration of amino acid-drug binding sites data with both germline and somatic nsSNVs data sets revealed 3133 nsSNVs affecting amino acid-drug binding sites. In addition, a comprehensive drug target discovery was conducted based on protein structure similarity and conservation of amino acid-drug binding sites. Using this method, 81 paralogs were identified that could serve as alternative drug targets. In addition, non-human mammalian proteins bound to drugs were used to identify 142 homologs in humans that can potentially bind to drugs. In the current protein-drug pairs that contain somatic mutations within their binding site, we identified 85 proteins with significant differential gene expression changes associated with specific cancer types. 
Information on protein-drug binding predicted drug target proteins and prevalence of both somatic and germline nsSNVs that disrupt these binding sites can provide valuable knowledge for personalized medicine treatment. A web portal is available where nsSNVs from individual patient can be checked by scanning against DrugVar to determine whether any of the SNVs affect the binding of any drug in the database.
Liu, Ya-Jun; Zhang, Jie; Cui, Gu-Zhen; Cui, Qiu
2015-06-01
Targetrons are mobile group II introns that can recognize their DNA target sites by base-pairing RNA-DNA interactions with the aid of site-specific binding reverse transcriptases. Targetron technology stands out from recently developed gene targeting methods because of its flexibility, feasibility, and efficiency, and is particularly suitable for the genetic engineering of difficult microorganisms, including cellulolytic bacteria that are considered promising candidates for biomass conversion via consolidated bioprocessing. Along with the development of the thermotargetron method for thermophiles, targetron technology is becoming increasingly important for the metabolic engineering of industrial microorganisms aimed at biofuel/chemical production. To summarize the current progress of targetron technology and provide new insights on its use, this paper reviews the retrohoming mechanisms of both mesophilic and thermophilic targetron methods based on various group II introns, investigates the improvement of targetron tools for high targeting efficiency and specificity, and discusses current applications in the metabolic engineering of bacterial producers. Although there are still intellectual property and technical restrictions on targetron applications, we propose that targetron technology will contribute to both biochemistry research and metabolic engineering for industrial production. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Clinton, J.
2017-12-01
Much of Hawaii's history is recorded in archeological sites. Researchers and cultural practitioners have been studying and reconstructing significant archeological sites for generations. Climate change, and more specifically, sea level rise may threaten these sites. Our research records current sea levels and then projects possible consequences to these cultural monuments due to sea level rise. In this mixed methods study, research scientists, cultural practitioners, and secondary students use plane-table mapping techniques to create maps of coastlines and historic sites. Students compare historical records to these maps, analyze current sea level rise trends, and calculate future sea levels. They also gather data through interviews with community experts and kupuna (elders). If climate change continues at projected rates, some historic sites will be in danger of negative impact due to sea level rise. Knowing projected sea levels at specific sites allows for preventative action and contributes to raised awareness of the impacts of climate change to the Hawaiian Islands. Students will share results with the community and governmental agencies in hopes of inspiring action to minimize climate change. It will take collaboration between scientists and cultural communities to inspire future action on climate change.
Adjusting slash pine growth and yield for silvicultural treatments
Stephen R. Logan; Barry D. Shiver
2006-01-01
With intensive silvicultural treatments such as fertilization and competition control now commonplace in today's slash pine (Pinus elliottii Engelm.) plantations, a method to adjust current growth and yield models is required to accurately account for yield increases due to these practices. Some commonly used ad-hoc methods, such as raising site...
Calibration of collection procedures for the determination of precipitation chemistry
James N. Galloway; Gene E. Likens
1976-01-01
Precipitation is currently collected by several methods, including several different designs of collection apparatus. We are investigating these differing methods and designs to determine which gives the most representative sample of precipitation for the analysis of some 25 chemical parameters. The experimental site, located in Ithaca, New York, has 22 collectors of...
Analytical methods for characterization of explosives-contaminated sites on U.S. Army installations
NASA Astrophysics Data System (ADS)
Jenkins, Thomas F.; Walsh, Marianne E.; Thorne, Philip G.
1995-10-01
The U.S. Army manufactures munitions at facilities throughout the United States. Many of these facilities are contaminated with explosives residues from production and from the disposal of off-specification and out-of-date munitions. The first step in remediating these sites is careful characterization. Currently, sites are characterized using a combination of on-site field screening and off-site laboratory analysis. Most of the contamination is associated with TNT (2,4,6-trinitrotoluene) and RDX (hexahydro-1,3,5-trinitro-1,3,5-triazine) and their manufacturing impurities and environmental transformation products. Both colorimetric and enzyme immunoassay-based field screening methods have been used successfully for on-site characterization. These methods have similar detection capabilities but differ in their selectivity. Although field screening is very cost-effective, laboratory analysis is still required to fully characterize a site. Laboratory analysis for explosives residues in the United States is generally conducted using high-performance liquid chromatography with a UV detector. Air-dried soils are extracted with acetonitrile in an ultrasonic bath. Water is analyzed directly if detection limits in the range of 10-20 µg/L are acceptable, or preconcentrated using either salting-out solvent extraction with acetonitrile or solid-phase extraction.
Digital signal processing methods for biosequence comparison.
Benson, D C
1990-01-01
A method is discussed for DNA or protein sequence comparison using a finite-field fast Fourier transform, a digital signal processing technique, and statistical methods are discussed for analyzing the output of this algorithm. This method compares two sequences of length N in computing time proportional to N log N, compared with N² for currently used methods, making it feasible to compare very long sequences. An example is given to show that the method correctly identifies sites of known homology. PMID:2349096
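The idea can be sketched with an ordinary floating-point FFT (the paper itself uses a finite-field transform, which avoids rounding error; the function below is a hypothetical illustration, not the paper's implementation): encode each symbol as an indicator vector, and one cross-correlation per symbol yields the number of matching characters at every alignment offset in O(N log N) time.

```python
import numpy as np

def match_counts(a: str, b: str, alphabet: str = "ACGT") -> np.ndarray:
    """Number of matching characters between `a` and `b` at every
    alignment offset, via FFT cross-correlation in O(N log N)."""
    n = len(a) + len(b) - 1                  # number of possible offsets
    size = 1 << (n - 1).bit_length()         # FFT length: next power of two
    total = np.zeros(n)
    for ch in alphabet:
        x = np.array([c == ch for c in a], dtype=float)  # indicator of ch in a
        y = np.array([c == ch for c in b], dtype=float)  # indicator of ch in b
        # Correlation = convolution with one sequence reversed.
        fx = np.fft.rfft(x, size)
        fy = np.fft.rfft(y[::-1], size)
        total += np.fft.irfft(fx * fy, size)[:n]
    return np.rint(total).astype(int)

# Offset len(b)-1 aligns both sequences at their starts;
# "ACGT" fully matches the prefix of "ACGTACGT" there.
counts = match_counts("ACGTACGT", "ACGT")
```

A peak in `counts` flags a candidate homologous alignment offset, which is how sites of known homology show up in the output.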
The calcium current of Helix neuron
1978-01-01
Calcium current, Ica, was studied in isolated nerve cell bodies of Helix aspersa after suppression of Na+ and K+ currents. The suction pipette method described in the preceding paper was used. Ica rises to a peak value and then subsides exponentially and has a null potential of 150 mV or more and a relationship with [Ca2+]o that is hyperbolic over a small range of [Ca2+]o's. When [Ca2+]i is increased, Ica is reduced disproportionately, but the effect is not hyperbolic. Ica is blocked by extracellular Ni2+, La3+, Cd2+, and Co2+ and is greater when Ba2+ and Sr2+ carry the current. Saturation and blockage are described by a Langmuir adsorption relationship similar to that found in Balanus. Thus, the calcium conductance probably contains a site which binds the ions referred to. The site also appears to be voltage-dependent. Activation and inactivation of Ica are described by first order kinetics, and there is evidence that the processes are coupled. For example, inactivation is delayed slightly in its onset and tau inactivation depends upon the method of study. However, the currents are described equally well by either a noncoupled Hodgkin-Huxley mh scheme or a coupled reaction. Facilitation of Ica by prepulses was not observed. For times up to 50 ms, currents even at small depolarizations were accounted for by suitable adjustment of the activation and inactivation rate constants. PMID:660160
A Coupled model for ERT monitoring of contaminated sites
NASA Astrophysics Data System (ADS)
Wang, Yuling; Zhang, Bo; Gong, Shulan; Xu, Ya
2018-02-01
The performance of an electrical resistivity tomography (ERT) system is usually investigated in numerical simulation studies using a fixed resistivity distribution model. In this paper, a method to construct a time-varying resistivity model by coupling water transport, solute transport, and a constant current field is proposed for ERT monitoring of contaminated sites. Using the proposed method, a monitoring model is constructed for a contaminated site with a pollution region on the surface, and ERT monitoring results at different times are calculated by the finite element method. The results show that ERT monitoring profiles can effectively reflect the growth of the polluted area caused by the diffusion of pollutants, although the inferred extent of the pollution does not exactly match the actual situation. The model can be extended to other cases and can be used for survey design and result analysis in ERT monitoring.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-16
... in achieving sustainable, systemic change that results in greater safety, permanency, and well-being... agreements. The cross-site evaluation uses a mixed-method, longitudinal approach to examine the ICs (funded...
Current approaches for the assessment of in situ biodegradation.
Bombach, Petra; Richnow, Hans H; Kästner, Matthias; Fischer, Anko
2010-04-01
Considering the high costs and technical difficulties associated with conventional remediation strategies, in situ biodegradation has become a promising approach for cleaning up contaminated aquifers. To verify if in situ biodegradation of organic contaminants is taking place at a contaminated site and to determine if these processes are efficient enough to replace conventional cleanup technologies, a comprehensive characterization of site-specific biodegradation processes is essential. In recent years, several strategies including geochemical analyses, microbial and molecular methods, tracer tests, metabolite analysis, compound-specific isotope analysis, and in situ microcosms have been developed to investigate the relevance of biodegradation processes for cleaning up contaminated aquifers. In this review, we outline current approaches for the assessment of in situ biodegradation and discuss their potential and limitations. We also discuss the benefits of research strategies combining complementary methods to gain a more comprehensive understanding of the complex hydrogeological and microbial interactions governing contaminant biodegradation in the field.
NASA Astrophysics Data System (ADS)
Hagras, Muhammad Ahmed
Electron transfer occurs in many biological systems that are imperative to sustain life; oxidative phosphorylation in prokaryotes and eukaryotes and photophosphorylation in photosynthetic and plant cells are well-balanced, complementary processes. Investigating electron transfer in these natural systems provides detailed knowledge of the atomistic events that ultimately lead to the production of ATP or the harvesting of light energy. The ubiquinol:cytochrome c oxidoreductase complex (also known as the bc1 complex, or respiratory complex III) is a middle player in the electron-transport proton-pumping orchestra. Located in the inner mitochondrial membrane in eukaryotes or the plasma membrane in prokaryotes, it converts the free energy of redox reactions into an electrochemical proton gradient across the membrane, following the fundamental chemiosmotic principle discovered by Peter Mitchell [1]. In humans, a malfunctioning bc1 complex plays a major role in many neurodegenerative diseases, stress-induced aging, and cancer development, because it produces most of the reactive oxygen species, which are also involved in cellular signaling [2]. The mitochondrial bc1 complex has an intertwined dimeric structure comprising 11 subunits per monomer, but only three of them have catalytic function, and these are the only domains found in the bacterial bc1 complex. The core subunits include: the Rieske domain, which incorporates the iron-sulfur cluster [2Fe-2S]; the transmembrane cytochrome b domain, incorporating a low-potential heme group (heme bL) and a high-potential heme group (heme bH); and the cytochrome c1 domain, containing the heme c1 group. There are two separate binding sites: the Qo (or QP) site, where the hydrophobic electron carrier ubihydroquinol (QH2) is oxidized, and the Qi (or QN) site, where a ubiquinone molecule (Q) is reduced [3]. Electrons and protons in the bc1 complex flow according to the proton-motive Q-cycle proposed by Mitchell, which includes a unique electron-flow bifurcation at the Qo site.
At this site, one electron of a bound QH2 molecule transfers to the [2Fe-2S] cluster of the Rieske domain, docked at the proximal docking site, and the other electron transfers to heme bL, which subsequently passes it to heme bH and finally to a Q or SQ molecule bound at the Qi site [4]. The Rieske domain undergoes a 22 Å domain movement to bind at the distal docking site, where the [2Fe-2S] cluster passes its electron to heme c1, which in turn passes it to heme c of the water-soluble cytochrome c carrier [3c, 5] (which shuttles it to cytochrome c oxidase, complex IV). In the work compiled in the subsequent chapters, we deploy a stacked-tiers hierarchy in which each chapter's work provides a foundation for the next. In chapter 1, we first present different methods to calculate tunneling currents in proteins, including a new derivation of the interatomic tunneling current method. In addition, we show the results of the interatomic tunneling current theory on models based on the heme bL-heme bH redox pair system in the bc1 complex. In chapter 2, we examine the electron tunneling pathways [6] between different intra-monomeric and inter-monomeric redox centers of the bc1 complex, including its electron carriers (ubiquinol, ubiquinone, and cytochrome c molecules), using the well-studied coarse-grained interatomic method of tunneling current theory [7]. Going through the different tunneling pathways in the bc1 complex, we discovered a pair of internal switches that modulate the electron transfer rate, which we discuss in full detail in chapter 3. Motivated by the discovery of these internal switches, we describe in chapter 4 the discovery of a new binding pocket (designated the NonQ-site, or NQ-site for short) in the bc1 complex, located on the opposite side of the enzyme with respect to the Qo site. In contrast to the Qo site, however, the NQ-site penetrates deep into the cytochrome b domain and closely approaches the LH region.
Hence the NQ-site provides a suitable binding pocket for ligands that can influence the orientation of the Phe90 residue and thereby modulate the corresponding ET rate between heme bL and heme bH. Finally, in chapter 5 we present our integrated software package (called the Electron Tunneling in Proteins program, or ETP), which provides an environment with capabilities such as tunneling current calculation, semi-empirical quantum mechanical calculation, and molecular modeling simulation for the calculation and analysis of electron transfer reactions in proteins.
Grunebaum, Lisa Danielle; Reiter, David
2006-01-01
To determine current practice for use of perioperative antibiotics among facial plastic surgeons, to determine the extent of use of literature support for preferences of facial plastic surgeons, and to compare patterns of use with nationally supported evidence-based guidelines. A link to a Web site containing a questionnaire on perioperative antibiotic use was e-mailed to more than 1000 facial plastic surgeons in the United States. Responses were archived in a dedicated database and analyzed to determine patterns of use and methods of documenting that use. Current literature was used to develop evidence-based recommendations for perioperative antibiotic use, emphasizing current nationally supported guidelines. Preferences varied significantly for medication used, dosage and regimen, time of first dose relative to incision time, setting in which medication was administered, and procedures for which perioperative antibiotic was deemed necessary. Surgical site infection in facial plastic surgery can be reduced by better conformance to currently available evidence-based guidelines. We offer specific recommendations that are supported by the current literature.
Are rapid population estimates accurate? A field trial of two different assessment methods.
Grais, Rebecca F; Coulombier, Denis; Ampuero, Julia; Lucas, Marcelino E S; Barretto, Avertino T; Jacquier, Guy; Diaz, Francisco; Balandine, Serge; Mahoudeau, Claude; Brown, Vincent
2006-09-01
Emergencies resulting in large-scale displacement often lead to populations resettling in areas where basic health services and sanitation are unavailable. To plan relief-related activities quickly, rapid population size estimates are needed. The currently recommended Quadrat method estimates total population by extrapolating the average population size living in square blocks of known area to the total site surface. An alternative approach, the T-Square, provides a population estimate based on analysis of the spatial distribution of housing units taken throughout a site. We field tested both methods and validated the results against a census in Esturro Bairro, Beira, Mozambique. Compared to the census (population: 9,479), the T-Square yielded a better population estimate (9,523) than the Quadrat method (7,681; 95% confidence interval: 6,160-9,201), but was more difficult for field survey teams to implement. Although applicable only to similar sites, several general conclusions can be drawn for emergency planning.
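The Quadrat extrapolation described above can be sketched in a few lines: average the population counted in sampled blocks of known area, then scale by the ratio of total site area to block area. The counts and areas below are invented for illustration, not the Beira survey data.

```python
# Quadrat-style rapid population estimate: extrapolate the mean population of
# sampled square blocks to the whole site. Hypothetical numbers, not the
# Esturro Bairro data.
import statistics

def quadrat_estimate(block_counts, block_area_m2, site_area_m2):
    mean_count = statistics.mean(block_counts)
    return mean_count * (site_area_m2 / block_area_m2)

counts = [12, 9, 15, 11, 13, 10]   # people counted in six 25 m x 25 m blocks
est = quadrat_estimate(counts, block_area_m2=625, site_area_m2=500_000)
print(round(est))   # ~9,333 people
```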
NASA Astrophysics Data System (ADS)
Nomade, Sebastien; Pereira, Alison; Voinchet, Pierre; Bahain, Jean-Jacques; Aureli, Daniele; Arzarello, Marta; Anzidei, Anna-Paola; Biddittu, Italo; Bulgarelli, Maria-Grazia; Falguères, Christophe; Giaccio, Biagio; Guillou, Hervé; Manzi, Giorgio; Moncel, Marie-Hélène; Nicoud, Elisa; Pagli, Maria; Parenti, Fabio; Peretto, Carlo; Piperno, Marcello; Rocca, Roxane
2017-04-01
European Middle-Pleistocene archaeological and/or paleontological sites lack a unified and precise chronological framework. Despite recent efforts mostly focused on methods such as OSL, ESR/U-series or cosmogenic nuclides, the age of numerous sites from this period fundamentally still relies on qualitative and speculative palaeoenvironmental and/or palaeontological/palaeoanthropological considerations. The lack of robust chronologies, along with the scarcity of human fossils, prevents coherent correlations between European sites, which in turn limits our ability to understand human diffusion dynamics and techno-cultural evolution or to correlate archaeological sites with palaeoclimatic and environmental records. With the goal of providing an accurate and precise chronological framework based on a multi-method approach, a research network including geochronologists, archaeologists and paleoanthropologists from various French and Italian institutions launched in 2010 a wide-ranging study of Middle-Pleistocene archaeological sites of central and southern Italy. This study, combining the 40Ar/39Ar method with palaeo-dosimetric methods applied to European sites in the age range of 700 ka to 300 ka, is unprecedented. In parallel, a large effort has been made to improve the regional Middle-Pleistocene tephrostratigraphic database through a massive application of both high-precision 40Ar/39Ar geochronological and geochemical investigations. We illustrate our approach and results for several key sites such as Notarchirico, Valle Giumentina, Ceprano-Campogrande and La Polledrara di Cecanibbio. The accurate and precise chronological framework we built permits us to place all the investigated archaeological and palaeontological records in a coherent climatic and environmental context. Furthermore, our work provides the opportunity to compare lithic industries from a technical and evolutionary point of view within a homogeneous temporal frame.
These preliminary results push the current limits of the 40Ar/39Ar method and will guide the advances needed to apply our approach to other European sites.
NASA Astrophysics Data System (ADS)
Marcantonio, Franco; Lyle, Mitchell; Ibrahim, Rami
2014-08-01
The 230Th method of determining mass accumulation rates (MARs) assumes that little to no fractionation occurs during sediment redistribution processes at the seafloor. We examine 230Th inventories in radiocarbon-dated multicore sediments from paired winnowed and focused sites at Cocos and Carnegie Ridges, Panama Basin. Radiocarbon-derived sand MARs, which likely represent the vertical rain of particles poorly transported by bottom currents, are similar at each of the paired sites but are different using 230Th normalization. 230Th-normalized MARs are about 60% lower at focused sites and likely underestimate vertical MARs, while the reverse is true for winnowed sites. We hypothesize that size fractionation occurs most frequently at lower current velocities, resulting in the coarse fraction being left behind and primarily the fine 230Th-rich grains being transported downslope. 230Th-normalization works well for recording fine-grained (detrital and opal), but not coarse-grained (carbonate), fluxes in regions that have undergone sediment redistribution.
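The normalization whose assumptions this study tests is conventionally written as F = β·z / [xs230Th], i.e. vertical flux equals the water-column 230Th production (production constant times water depth) divided by the measured excess 230Th activity. The sketch below assumes this standard formulation; the production constant and sample values are illustrative, not the paper's data.

```python
# 230Th-normalized mass accumulation rate in the standard formulation
# F = beta * z / xs230Th. Values are illustrative, not the Panama Basin data.

BETA = 0.0267  # dpm m^-3 yr^-1, a commonly used 230Th production rate (assumed)

def th230_normalized_flux(water_depth_m, xs_th230_dpm_per_g):
    """Vertical rain rate in g m^-2 yr^-1 from excess 230Th activity."""
    return BETA * water_depth_m / xs_th230_dpm_per_g

# A focused site accumulates 230Th-rich fines, inflating xs230Th and thereby
# lowering the inferred vertical flux: the bias the study quantifies.
f_normal  = th230_normalized_flux(3000, 40.0)
f_focused = th230_normalized_flux(3000, 100.0)
print(f_normal, f_focused)  # the focused site's normalized flux is lower
```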
Clark, Adelaide E; Yoon, Subin; Sheesley, Rebecca J; Usenko, Sascha
2016-12-01
The atmospheric concentrations of seven current-use pesticides in particulate matter were determined at four locations throughout the Houston metropolitan area in TSP and PM2.5 samples from September 2013. Atmospheric concentrations in both TSP and PM2.5 ranged from below method detection limits (MDLs) to nearly 1100 pg m-3. The three compounds most frequently detected above MDLs were chlorothalonil, bifenthrin, and λ-cyhalothrin. Atmospheric chlorothalonil concentrations were above 800 pg m-3 in several TSP samples, but
Balzarolo, Manuela; Anderson, Karen; Nichol, Caroline; Rossini, Micol; Vescovo, Loris; Arriga, Nicola; Wohlfahrt, Georg; Calvet, Jean-Christophe; Carrara, Arnaud; Cerasoli, Sofia; Cogliati, Sergio; Daumard, Fabrice; Eklundh, Lars; Elbers, Jan A.; Evrendilek, Fatih; Handcock, Rebecca N.; Kaduk, Joerg; Klumpp, Katja; Longdoz, Bernard; Matteucci, Giorgio; Meroni, Michele; Montagnani, Leonardo; Ourcival, Jean-Marc; Sánchez-Cañete, Enrique P.; Pontailler, Jean-Yves; Juszczak, Radoslaw; Scholes, Bob; Martín, M. Pilar
2011-01-01
This paper reviews the currently available optical sensors, their limitations, and the opportunities for their deployment at Eddy Covariance (EC) sites in Europe. This review is based on the results of an online survey designed and disseminated by the Cooperation in Science and Technology (COST) Action ES0903, "Spectral Sampling Tools for Vegetation Biophysical Parameters and Flux Measurements in Europe", which provided a complete view of the spectral sampling activities carried out by the different research teams in European countries. The results highlighted that a wide variety of optical sensors are in use at flux sites across Europe, and responses further demonstrated that users were not always fully aware of the key issues underpinning the repeatability and reproducibility of their spectral measurements. The key findings of this survey point towards the need for greater awareness of standardisation and for the development of a common protocol for optical sampling at European EC sites. PMID:22164055
Azimuth selection for sea level measurements using geodetic GPS receivers
NASA Astrophysics Data System (ADS)
Wang, Xiaolei; Zhang, Qin; Zhang, Shuangcheng
2018-03-01
Based on analysis of Global Positioning System (GPS) multipath signals recorded by a geodetic GPS receiver, GPS Reflectometry (GPS-R) has demonstrated unique advantages for sea level monitoring. Founded on multipath reflectometry theory, sea level changes can be measured by GPS-R through spectral analysis of recorded signal-to-noise ratio data. However, prior to estimating multipath parameters, it is necessary to define an azimuth and elevation angle mask to ensure the reflecting zones are on water. Here, a method is presented to address azimuth selection, a topic currently under active development in the field of GPS-R. Data from three test sites are analyzed: the Kachemak Bay GPS site PBAY in Alaska (USA), the Friday Harbor GPS site SC02 in the San Juan Islands (USA), and the Brest Harbor GPS site BRST in Brest (France). These sites are located in different multipath environments, from a rural coastal area to a busy harbor, and they experience different tidal ranges. Estimates by the GPS tide gauges at azimuths selected by the presented method are compared with measurements from physical tide gauges, and acceptable correspondence is found for all three sites.
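The spectral analysis behind this kind of retrieval rests on the standard GPS-R relation h = λf/2, where f is the dominant frequency of the detrended SNR series with respect to the sine of the elevation angle. The toy periodogram below uses synthetic data and an assumed L1 wavelength; it is a sketch of the principle, not the authors' processing chain.

```python
# Toy GPS-R retrieval: recover reflector height from the oscillation frequency
# of SNR versus sin(elevation). Synthetic, noise-free data; geometry assumed.
import math

LAM = 0.1903          # GPS L1 carrier wavelength in metres
H_TRUE = 5.0          # synthetic antenna height above the water surface

# Simulated detrended SNR samples over elevations 5..25 degrees
elevs = [5 + 0.1 * i for i in range(201)]
x = [math.sin(math.radians(e)) for e in elevs]
snr = [math.cos(4 * math.pi * H_TRUE / LAM * xi) for xi in x]

def power_at_height(h):
    """Periodogram power for a candidate reflector height h (freq = 2h/lambda)."""
    c = sum(s * math.cos(4 * math.pi * h / LAM * xi) for s, xi in zip(snr, x))
    q = sum(s * math.sin(4 * math.pi * h / LAM * xi) for s, xi in zip(snr, x))
    return c * c + q * q

h_grid = [2 + 0.01 * i for i in range(601)]   # candidate heights 2..8 m
h_est = max(h_grid, key=power_at_height)
print(h_est)   # close to 5.0
```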
SITE-SPECIFIC DIAGNOSTIC TOOLS
US EPA's Office of Water is proposing Combined Assessment and Listing Methods (CALM) to
meet reporting requirements under both Sections 305b and 303d for chemical and nonchemical
stressors in the nation's waterbodies. Current Environmental Monitoring and Assessment
Prog...
Design and biocompatibility of endovascular aneurysm filling devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Jennifer N.; Hwang, Wonjun; Horn, John; Landsman, Todd L.; Boyle, Anthony; Wierzbicki, Mark A.; Hasan, Sayyeda M.; Follmer, Douglas; Bryant, Jesse; Small, Ward; Maitland, Duncan J.
2014-08-04
The rupture of an intracranial aneurysm, which can result in severe mental disabilities or death, affects approximately 30,000 people in the United States annually. The traditional surgical method of treating these arterial malformations involves a full craniotomy procedure, wherein a clip is placed around the aneurysm neck. In recent decades, research and device development have focused on new endovascular treatment methods to occlude the aneurysm void space. These methods, some of which are currently in clinical use, utilize metal, polymeric, or hybrid devices delivered via catheter to the aneurysm site. In this review, we present several such devices, including those that have been approved for clinical use, and some that are currently in development. We present several design requirements for a successful aneurysm filling device and discuss the success or failure of current and past technologies. We also present novel polymeric-based aneurysm filling methods that are currently being tested in animal models and that could result in superior healing. PMID:25044644
Ban, Tomohiro; Ohue, Masahito; Akiyama, Yutaka
2018-04-01
The identification of comprehensive drug-target interactions is important in drug discovery. Although numerous computational methods have been developed over the years, a gold-standard technique has not been established. Computational ligand docking and structure-based drug design allow researchers to predict the binding affinity between a compound and a target protein, and thus they are often used to virtually screen compound libraries. In addition, docking techniques have also been applied to the virtual screening of target proteins (inverse docking) to predict the target proteins of a drug candidate. Nevertheless, a more accurate docking method is currently required. In this study, we propose a method in which a predicted ligand-binding site is covered by multiple grids, termed multiple grid arrangement. Notably, multiple grid arrangement facilitates the conformational search of grid-based ligand docking software and can be applied to the state-of-the-art commercial docking software Glide (Schrödinger, LLC). We validated the proposed method by re-docking with the Astex diverse benchmark dataset and in blind binding-site situations, which improved the correct prediction rate of the top-scoring docking pose from 27.1% to 34.1%; however, only a slight improvement in target prediction accuracy was observed in inverse docking scenarios. These findings highlight the limitations and challenges of current scoring functions and the need for more accurate docking methods. The proposed multiple grid arrangement method was implemented in Glide by modifying a cross-docking script for Glide, xglide.py. The script of our method is freely available online at http://www.bi.cs.titech.ac.jp/mga_glide/. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
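The core idea of covering one predicted pocket with several overlapping grids can be sketched as simple geometry: box centers offset around the predicted site center. The offsets and 27-box layout below are an arbitrary illustration, not the authors' actual Glide grid parameters.

```python
# Sketch of a "multiple grid arrangement": cover one predicted binding site
# with overlapping docking boxes. Offset distance and layout are illustrative
# assumptions and do not reproduce the paper's Glide setup.
from itertools import product

def grid_centers(site_center, offset=5.0):
    """Return the site center plus centers shifted by +/- offset on each axis."""
    cx, cy, cz = site_center
    shifts = [-offset, 0.0, offset]
    return [(cx + dx, cy + dy, cz + dz)
            for dx, dy, dz in product(shifts, repeat=3)]

centers = grid_centers((10.0, 4.0, -2.0))
print(len(centers))   # 27 overlapping grids covering the predicted pocket
```

Each center would then seed one docking grid, and the best-scoring pose across all grids is kept, which is how a larger region can be searched without enlarging any single grid.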
Liao, Wenjie; van der Werf, Hayo M G; Salmon-Monviola, Jordy
2015-09-15
One of the major challenges in environmental life cycle assessment (LCA) of crop production is the nonlinearity between nitrogen (N) fertilizer inputs and on-site N emissions resulting from complex biogeochemical processes. A few studies have addressed this nonlinearity by combining process-based N simulation models with LCA, but none accounted for nitrate (NO3(-)) flows across fields. In this study, we present a new method, TNT2-LCA, that couples the topography-based simulation of nitrogen transfer and transformation (TNT2) model with LCA, and compare the new method with a current LCA method based on a French life cycle inventory database. Application of the two methods to a case study of crop production in a catchment in France showed that, compared to the current method, TNT2-LCA allows delineation of more appropriate temporal limits when developing data for on-site N emissions associated with specific crops in this catchment. It also improves estimates of NO3(-) emissions by better consideration of agricultural practices, soil-climatic conditions, and spatial interactions of NO3(-) flows across fields, and by providing predicted crop yield. The new method presented in this study provides improved LCA of crop production at the catchment scale.
Pan, Keyao; Deem, Michael W.
2011-01-01
Many viruses evolve rapidly. For example, haemagglutinin (HA) of the H3N2 influenza A virus evolves to escape antibody binding. This evolution of the H3N2 virus means that people who have previously been exposed to an influenza strain may be infected by a newly emerged virus. In this paper, we use Shannon entropy and relative entropy to measure the diversity and selection pressure by an antibody in each amino acid site of H3 HA between the 1992–1993 season and the 2009–2010 season. Shannon entropy and relative entropy are two independent state variables that we use to characterize H3N2 evolution. The entropy method estimates future H3N2 evolution and migration using currently available H3 HA sequences. First, we show that the rate of evolution increases with the virus diversity in the current season. The Shannon entropy of the sequence in the current season predicts relative entropy between sequences in the current season and those in the next season. Second, a global migration pattern of H3N2 is assembled by comparing the relative entropy flows of sequences sampled in China, Japan, the USA and Europe. We verify this entropy method by describing two aspects of historical H3N2 evolution. First, we identify 54 amino acid sites in HA that have evolved in the past to evade the immune system. Second, the entropy method shows that epitopes A and B on the top of HA evolve most vigorously to escape antibody binding. Our work provides a novel entropy-based method to predict and quantify future H3N2 evolution and to describe the evolutionary history of H3N2. PMID:21543352
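The two state variables used above are standard information-theoretic quantities: per-site Shannon entropy within a season, and per-site relative entropy (Kullback-Leibler divergence) between seasons. A minimal sketch follows, with toy alignment columns and a small pseudocount; the pseudocount handling is an assumption, not necessarily the paper's estimator.

```python
# Per-site Shannon entropy and season-to-season relative entropy for a toy
# alignment column. Pseudocount smoothing is an illustrative assumption.
import math
from collections import Counter

def site_distribution(column, alphabet, pseudo=0.01):
    """Smoothed amino-acid frequency distribution at one alignment site."""
    counts = Counter(column)
    total = len(column) + pseudo * len(alphabet)
    return {a: (counts[a] + pseudo) / total for a in alphabet}

def shannon_entropy(p):
    """Diversity at a site, in bits."""
    return -sum(v * math.log2(v) for v in p.values() if v > 0)

def relative_entropy(p, q):
    """KL divergence D(p||q): how far the new season drifted from the old."""
    return sum(p[a] * math.log2(p[a] / q[a]) for a in p)

alphabet = "ACDE"                      # toy alphabet standing in for 20 residues
season1 = list("AAAACC")               # residues at one HA site, season 1
season2 = list("CCCCDD")               # same site, next season
p = site_distribution(season1, alphabet)
q = site_distribution(season2, alphabet)
print(shannon_entropy(p), relative_entropy(q, p))
```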
Karapanagioti, Hrissi K.; Childs, Jeffrey; Sabatini, David A.
2001-01-01
Organic petrography has been proposed as a tool for characterizing the heterogeneous organic matter present in soil and sediment samples. A new simplified method is proposed as a quantitative means of interpreting observed sorption behavior for phenanthrene and different soils and sediments based on their organic petrographical characterization. This method is tested under single-solute conditions and at a phenanthrene concentration of 1 μg/L. Since the opaque organic matter fraction dominates the sorption process, we propose that by quantifying this fraction one can interpret organic-content-normalized sorption distribution coefficient (Koc) values for a sample. While this method was developed and tested for various samples within the same aquifer, in the current study the method is validated for soil and sediment samples from different sites that cover a wide range of organic matter origin, age, and organic content. All 10 soil and sediment samples studied had log Koc values for the opaque particles between 5.6 and 6.8. This range of Koc values illustrates the heterogeneity of opaque particles between sites and geological formations and thus the need to characterize the opaque fraction of materials on a site-by-site basis.
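The normalization at the heart of this record is the standard definition Koc = Kd / foc (distribution coefficient divided by the organic-carbon fraction). The sketch below uses invented sample values, not the study's measurements, to show how an observed Kd maps into the reported log Koc range.

```python
# Organic-carbon-normalized sorption coefficient: Koc = Kd / foc.
# Sample values are invented for illustration only.
import math

def log_koc(kd_L_per_kg, f_oc):
    """log10 of the organic-carbon-normalized distribution coefficient."""
    return math.log10(kd_L_per_kg / f_oc)

# A sediment with Kd = 5000 L/kg and 0.5% organic carbon:
print(log_koc(5000, 0.005))   # 6.0, inside the reported 5.6-6.8 range
```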
Shi, Lei; Jiang, Fan; Ouyang, Fengxiu; Zhang, Jun; Wang, Zhimin; Shen, Xiaoming
2018-03-01
Age estimation is critical in forensic science, in competitive sports and games and in other age-related fields, but the current methods are suboptimal. The combination of age-associated DNA methylation markers with skeletal age (SA) and dental age (DA) may improve the accuracy and precision of age estimation, but no study has examined this topic. In the current study, we measured SA (GP, TW3-RUS, and TW3-Carpal methods) and DA (Demirjian and Willems methods) by X-ray examination in 124 Chinese children (78 boys and 46 girls) aged 6-15 years. To identify age-associated CpG sites, we analyzed methylome-wide DNA methylation profiling using the Illumina HumanMethylation450 BeadChip system in 48 randomly selected children. Five CpG sites were identified as associated with chronologic age (CA), with an absolute value of Pearson's correlation coefficient (r) > 0.5 (p < 0.01) and a false discovery rate < 0.01. The validation of age-associated CpG sites was performed using droplet digital PCR techniques in all 124 children. After validation, four CpG sites for boys and five CpG sites for girls were further adopted to build the age estimation model with SA and DA using multivariate linear stepwise regressions. These CpG sites were located in four known genes (DDO, PRPH2, DHX8, and ITGA2B) and one unknown gene with the Illumina ID number 22398226. The accuracy of the age estimation methods was compared according to the mean absolute error (MAE) and root mean square error (RMSE). The best single measure for SA was the TW3-RUS method (MAE = 0.69 years, RMSE = 0.95 years) in boys, and the GP method (MAE = 0.74 years, RMSE = 0.94 years) in girls. For DA, the Willems method was the best single measure for both boys (MAE = 0.63 years, RMSE = 0.78 years) and girls (MAE = 0.54 years, RMSE = 0.68 years).
The models that incorporated SA and DA with the methylation levels of age-associated CpG sites provided the highest accuracy of age estimation in both boys (MAE = 0.47 years, R2 = 0.886) and girls (MAE = 0.33 years, R2 = 0.941). Cross-validation of the results confirmed the reliability and validity of the models. In conclusion, age-associated DNA methylation markers in combination with SA and DA greatly improve the accuracy of age estimation in Chinese children. This method may be applied in forensic science, in competitive sports and games, and in other age-related fields. Copyright © 2017. Published by Elsevier B.V.
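The two accuracy metrics used throughout this comparison are the usual ones; a minimal sketch with invented ages:

```python
# Mean absolute error and root mean square error, the two metrics used to
# compare the age-estimation models. Ages below are invented for illustration.
import math

def mae(predicted, actual):
    """Mean absolute error in the same units as the ages (years)."""
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

def rmse(predicted, actual):
    """Root mean square error; penalizes large misses more than MAE."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

pred = [8.2, 10.5, 12.9, 14.1]   # estimated ages
true = [8.0, 11.0, 12.5, 14.0]   # chronologic ages
print(mae(pred, true), rmse(pred, true))
```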
Kuhn, Alexandre; Ong, Yao Min; Quake, Stephen R; Burkholder, William F
2015-07-08
Like other structural variants, transposable element insertions can be highly polymorphic across individuals. Their functional impact, however, remains poorly understood. Current genome-wide approaches for genotyping insertion-site polymorphisms based on targeted or whole-genome sequencing remain very expensive and can lack accuracy, hence new large-scale genotyping methods are needed. We describe a high-throughput method for genotyping transposable element insertions and other types of structural variants that can be assayed by breakpoint PCR. The method relies on next-generation sequencing of multiplex, site-specific PCR amplification products and read count-based genotype calls. We show that this method is flexible, efficient (it does not require rounds of optimization), cost-effective and highly accurate. This method can benefit a wide range of applications from the routine genotyping of animal and plant populations to the functional study of structural variants in humans.
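The read-count-based genotype call described above can be sketched as a simple ratio rule on the reads supporting the insertion junction versus the empty-site junction. The depth and fraction thresholds below are arbitrary placeholders, not the authors' calibrated cutoffs.

```python
# Sketch of read-count genotype calling at a transposable-element insertion
# site from multiplexed breakpoint-PCR sequencing. Thresholds are illustrative
# assumptions, not the paper's values.

def call_genotype(ins_reads, ref_reads, min_depth=20):
    """Call hom_ref / het / hom_ins from junction-supporting read counts."""
    depth = ins_reads + ref_reads
    if depth < min_depth:
        return "no_call"          # too little evidence at this site
    frac = ins_reads / depth
    if frac < 0.15:
        return "hom_ref"
    if frac > 0.85:
        return "hom_ins"
    return "het"

print(call_genotype(3, 97))    # hom_ref
print(call_genotype(52, 48))   # het
print(call_genotype(96, 2))    # hom_ins
print(call_genotype(4, 5))     # no_call (depth below threshold)
```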
Centler, Florian; Heße, Falk; Thullner, Martin
2013-09-01
At field sites with varying redox conditions, different redox-specific microbial degradation pathways contribute to total contaminant degradation. The identification of pathway-specific contributions to total contaminant removal is of high practical relevance, yet difficult to achieve with current methods. Current stable-isotope-fractionation-based techniques focus on the identification of dominant biodegradation pathways under constant environmental conditions. We present an approach based on dual stable isotope data to estimate the individual contributions of two redox-specific pathways. We apply this approach to carbon and hydrogen isotope data obtained from reactive transport simulations of an organic contaminant plume in a two-dimensional aquifer cross section to test the applicability of the method. To take aspects typically encountered at field sites into account, additional simulations addressed the effects of transverse mixing, diffusion-induced stable-isotope fractionation, heterogeneities in the flow field, and mixing in sampling wells on isotope-based estimates for aerobic and anaerobic pathway contributions to total contaminant biodegradation. Results confirm the general applicability of the presented estimation method which is most accurate along the plume core and less accurate towards the fringe where flow paths receive contaminant mass and associated isotope signatures from the core by transverse dispersion. The presented method complements the stable-isotope-fractionation-based analysis toolbox. At field sites with varying redox conditions, it provides a means to identify the relative importance of individual, redox-specific degradation pathways. © 2013.
Virtual Planetary Analysis Environment for Remote Science
NASA Technical Reports Server (NTRS)
Keely, Leslie; Beyer, Ross; Edwards, Laurence; Lees, David
2009-01-01
All of the data for NASA's current planetary missions and most data for field experiments are collected via orbiting spacecraft, aircraft, and robotic explorers. Mission scientists are unable to employ traditional field methods when operating remotely. We have developed a virtual exploration tool for remote sites with data analysis capabilities that extend human perception quantitatively and qualitatively. Scientists and mission engineers can use it to explore a realistic representation of a remote site. It also provides software tools to "touch" and "measure" remote sites with an immediacy that boosts scientific productivity and is essential for mission operations.
Haas, Edwin Gerard; Beauman, Ronald; Palo, Jr., Stefan
2013-01-29
The invention provides a device and method for actuating electrical switches remotely. The device is removably attached to the switch and is actuated through the transfer of a user's force. The user is able to remain physically removed from the switch site, obviating the need for protective equipment. The device and method allow rapid, safe actuation of high-voltage or high-current-carrying electrical switches or circuit breakers.
Vadose zone transport field study: Detailed test plan for simulated leak tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
AL Ward; GW Gee
2000-06-23
The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from thesemore » uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose transport uncertainly include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). 
The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.
University education and nuclear criticality safety professionals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, R.E.; Stachowiak, R.V.; Knief, R.A.
1996-12-31
The problem of developing a productive criticality safety specialist at a nuclear fuel facility has long been with us. The normal practice is to hire a recent undergraduate or graduate degree recipient and invest at least a decade in on-the-job training. In the early 1980s, the U.S. Department of Energy (DOE) developed a model intern program in an attempt to speed up the process. The program involved working at assigned projects for extended periods at a working critical mass laboratory, a methods development group, and a fuel cycle facility. This never gained support as it involved extended time away from the job. At the Rocky Flats Environmental Technology Site, the training method is currently the traditional one involving extensive experience. The flaw is that the criticality safety staff turnover has been such that few individuals continue for the decade some consider necessary for maturity in the discipline. To maintain quality evaluations and controls as well as interpretation decisions, extensive group review is used. This has proved costly to the site and professionally unsatisfying to the current staff. The site contractor has proposed a training program to remedy the basic problem.
Effects of nano red elemental selenium on sodium currents in rat dorsal root ganglion neurons.
Yuan, Huijun; Lin, Jiarui; Lan, Tonghan
2006-01-01
Nano red elemental selenium (Nano-Se) has been demonstrated to be useful in medical and scientific research. Here, we investigated the effects of Nano-Se on sodium currents in rat dorsal root ganglion (DRG) neurons using the whole-cell patch clamp method. Nano-Se reversibly decreased I(Na)(TTX-S) in a concentration-dependent, time-dependent, and open-channel block manner without affecting I(Na)(TTX-R). It shifted the steady-state activation and inactivation curves for I(Na) to more negative potentials. The time constant of recovery from inactivation was longer in the presence of Nano-Se. Nano-Se had a weaker inhibitory effect on I(Na) than the marked decrease caused by selenite, indicating that Nano-Se is less neurotoxic than selenite in short-term/large-dose treatments while having bioavailability similar to that of sodium selenite. The interaction between the effects of Nano-Se and selenite on sodium currents indicated either a negative allosteric interaction between the selenite binding site and the Nano-Se binding site or competition for the same binding site.
NASA Astrophysics Data System (ADS)
Maalek, R.; Lichti, D. D.; Ruwanpura, J.
2015-08-01
The application of terrestrial laser scanners (TLSs) on construction sites for automating construction progress monitoring and controlling structural dimension compliance is growing markedly. However, current research in construction management relies on the planned building information model (BIM) to assign the accumulated point clouds to their corresponding structural elements, which may not be reliable in cases where the dimensions of the as-built structure differ from those of the planned model and/or the planned model is not available with sufficient detail. In addition, outliers exist in construction site datasets due to data artefacts caused by moving objects, occlusions and dust. In order to overcome the aforementioned limitations, a novel method for robust classification and segmentation of planar and linear features is proposed to reduce the effects of outliers present in the LiDAR data collected from construction sites. First, coplanar and collinear points are classified through a robust principal components analysis procedure. The classified points are then grouped using a robust clustering method. A method is also proposed to robustly extract the points belonging to the flat-slab floors and/or ceilings without performing the aforementioned stages in order to preserve computational efficiency. The applicability of the proposed method is investigated in two scenarios, namely, a laboratory with 30 million points and an actual construction site with over 150 million points. The results obtained by the two experiments validate the suitability of the proposed method for robust segmentation of planar and linear features in contaminated datasets, such as those collected from construction sites.
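The coplanar/collinear classification step can be sketched with ordinary (non-robust) PCA on a local neighborhood's covariance eigenvalues; this is a minimal illustration, not the authors' robust procedure, and the thresholds are invented:

```python
import numpy as np

def classify_neighborhood(points, linear_thresh=0.95, planar_thresh=0.95):
    """Classify a local point neighborhood as 'linear', 'planar', or 'other'
    from the eigenvalues of its covariance matrix. The paper's robust variant
    down-weights outliers before this step; thresholds here are illustrative."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # l1 >= l2 >= l3
    total = evals.sum()
    if evals[0] / total > linear_thresh:               # variance along one axis
        return "linear"
    if (evals[0] + evals[1]) / total > planar_thresh:  # variance within a plane
        return "planar"
    return "other"

# Synthetic check: noisy points on a plane vs. along a line
rng = np.random.default_rng(0)
plane = np.c_[rng.uniform(size=(500, 2)), 0.001 * rng.normal(size=500)]
line = np.c_[rng.uniform(size=500), 0.001 * rng.normal(size=(500, 2))]
print(classify_neighborhood(plane))  # planar
print(classify_neighborhood(line))   # linear
```

In practice such a test runs on small k-nearest-neighbor patches of the scan, and the robust version replaces the plain covariance with an outlier-resistant estimate.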
Knott, Jayne Fifield; Olimpio, Julio C.
1986-01-01
Estimation of the average annual rate of ground-water recharge to sand and gravel aquifers using elevated tritium concentrations in ground water is an alternative to traditional steady-state and water-balance recharge-rate methods. The concept of the tritium tracer method is that the average annual rate of ground-water recharge over a period of time can be calculated from the depth of the peak tritium concentration in the aquifer. Assuming that ground-water flow is vertically downward and that aquifer properties are reasonably homogeneous, and knowing the date of maximum tritium concentration in precipitation and the current depth to the tritium peak from the water table, the average recharge rate can be calculated. The method, which is a direct-measurement technique, was applied at two sites on Nantucket Island, Massachusetts. At site 1, the average annual recharge rate between 1964 and 1983 was 26.1 inches per year, or 68 percent of the average annual precipitation, and the estimated uncertainty is ±15 percent. At site 2, the multilevel water samplers were not constructed deep enough to determine the peak concentration of tritium in ground water. The tritium profile at site 2 resembles the upper part of the tritium profile at site 1 and indicates that the average recharge rate was at least 16.7 inches per year, or at least 44 percent of the average annual precipitation. The Nantucket tritium recharge rates clearly are higher than rates determined elsewhere in southeastern Massachusetts using the tritium, water-table-fluctuation, and water-balance (Thornthwaite) methods, regardless of the method or the area. Because the recharge potential on Nantucket is so high (runoff is only 2 percent of the total water balance), the tritium recharge rates probably represent the effective upper limit for ground-water recharge in this region.
The recharge-rate values used by Guswa and LeBlanc (1985) and LeBlanc (1984) in their ground-water-flow computer models of Cape Cod are 20 to 30 percent lower than this upper limit. The accuracy of the tritium method is dependent on two key factors: the accuracy of the effective-porosity data, and the sampling interval used at the site. For some sites, the need for recharge-rate data may require a determination as statistically accurate as that which can be provided by the tritium method. However, the tritium method is more costly and more time consuming than the other methods because numerous wells must be drilled and installed and because many water samples must be analyzed for tritium, to a very small level of analytical detection. For many sites, a less accurate, less expensive, and faster method of recharge-rate determination might be more satisfactory. The factor that most seriously limits the usefulness of the tritium tracer method is the current depth of the tritium peak. Water with peak concentrations of tritium entered the ground more than 20 years ago, and, according to the Nantucket data, that water now is more than 100 feet below the land surface. This suggests that the tracer method will work only in sand and gravel aquifers that are exceedingly thick by New England standards. Conversely, the results suggest that the method may work in areas where saturated thicknesses are less than 100 feet and the rate of vertical ground-water movement is relatively slow, such as in till and in silt- and clay-rich sand and gravel deposits.
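The tracer calculation described above reduces to simple arithmetic: recharge equals the water equivalent stored above the tritium peak divided by the elapsed time. The sketch below uses hypothetical inputs (the depth, porosity, and elapsed time are illustrative, chosen to reproduce the 26.1 inches-per-year scale of site 1, not the report's measured values):

```python
def tritium_recharge_rate(depth_to_peak_ft, effective_porosity, years_elapsed):
    """Average annual recharge (inches/yr) from the depth of the bomb-tritium
    peak, assuming vertical flow through a reasonably homogeneous aquifer:
    recharge = (depth below water table * effective porosity) / elapsed time."""
    water_column_inches = depth_to_peak_ft * 12.0 * effective_porosity
    return water_column_inches / years_elapsed

# Illustrative inputs: a peak 138 ft below the water table after 19 years
# in sand with an assumed effective porosity of 0.30
rate = tritium_recharge_rate(138, 0.30, 19)
print(round(rate, 1))  # 26.1
```

The dependence on effective porosity is why the abstract flags porosity accuracy as one of the two factors controlling the method's accuracy.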
Spencer, Paula; Bowman, Michelle F; Dubé, Monique G
2008-07-01
It is not known if current chemical and biological monitoring methods are appropriate for assessing the impacts of growing industrial development on ecologically sensitive northern waters. We used a multitrophic level approach to evaluate current monitoring methods and to determine whether metal-mining activities had affected 2 otherwise pristine rivers that flow into the South Nahanni River, Northwest Territories, a World Heritage Site. We compared upstream reference conditions in the rivers to sites downstream and further downstream of mines. The endpoints we evaluated included concentrations of metals in river water, sediments, and liver and flesh of slimy sculpin (Cottus cognatus); benthic algal and macroinvertebrate abundance, richness, diversity, and community composition; and various slimy sculpin measures, our sentinel forage fish species. Elevated concentrations of copper and iron in liver tissue of sculpin from the Flat River were associated with high concentrations of mine-derived iron in river water and copper in sediments that were above national guidelines. In addition, sites downstream of the mine on the Flat River had increased algal abundances and altered benthic macroinvertebrate communities, whereas the sites downstream of the mine on Prairie Creek had increased benthic macroinvertebrate taxa richness and improved sculpin condition. Biological differences in both rivers were consistent with mild enrichment of the rivers downstream of current and historical mining activity. We recommend that monitoring in these northern rivers focus on indicators in epilithon and benthic macroinvertebrate communities due to their responsiveness and as alternatives to lethal fish sampling in habitats with low fish abundance. We also recommend monitoring of metal burdens in periphyton and benthic invertebrates for assessment of exposure to mine effluent and causal association. 
Although the effects of mining activities on riverine biota currently are limited, our results show that there is potential for effects to occur with proposed growth in mining activities.
Simultaneous distribution of AC and DC power
Polese, Luigi Gentile
2015-09-15
A system and method for the transport and distribution of both AC (alternating current) power and DC (direct current) power over wiring infrastructure normally used for distributing AC power only, for example, residential and/or commercial buildings' electrical wires is disclosed and taught. The system and method permits the combining of AC and DC power sources and the simultaneous distribution of the resulting power over the same wiring. At the utilization site a complementary device permits the separation of the DC power from the AC power and their reconstruction, for use in conventional AC-only and DC-only devices.
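In idealized form, the superposition and later separation of the two components can be shown numerically. This sketch stands in for the patented hardware and assumes ideal sampling over whole AC cycles; all values are invented:

```python
import math

def combine(dc_volts, ac_amplitude, freq_hz, t):
    """Superimpose a DC offset on an AC waveform sharing the same wiring."""
    return dc_volts + ac_amplitude * math.sin(2 * math.pi * freq_hz * t)

def separate(samples):
    """Recover the DC component (mean over whole cycles) and the AC residual.
    A simplified numerical stand-in for the complementary device at the
    utilization site that filters the two components apart."""
    dc = sum(samples) / len(samples)
    ac = [s - dc for s in samples]
    return dc, ac

# Sample exactly one 60 Hz cycle of 48 VDC combined with 120 V-amplitude AC
n = 600
samples = [combine(48.0, 120.0, 60.0, i / (60.0 * n)) for i in range(n)]
dc, ac = separate(samples)
print(round(dc, 6))  # 48.0
```

Because a sine averages to zero over a full cycle, the mean isolates the DC term exactly; real hardware does the equivalent with filtering rather than averaging.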
Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel
2017-01-01
Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.
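The variability comparison in the study rests on the relative standard deviation. A minimal sketch (the replicate concentrations below are made up for illustration, not the study's data):

```python
import statistics

def relative_std_dev(values):
    """Relative standard deviation (%): sample standard deviation over the
    mean, the statistic used to compare variability across soil methods."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical replicate asbestos concentrations (mass %) from two methods
fbas_tem = [0.011, 0.004, 0.019, 0.002]  # sensitive but scattered
plm = [0.010, 0.011, 0.009, 0.010]       # tighter replicates
print(relative_std_dev(fbas_tem) > relative_std_dev(plm))  # True
```

A high RSD, as reported for the FBAS/TEM combination, means repeated measurements of the same soil spread widely even when the method detects asbestos that others miss.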
Site Characterization at a Tidal Energy Site in the East River, NY (USA)
NASA Astrophysics Data System (ADS)
Gunawan, B.; Neary, V. S.; Colby, J.
2012-12-01
A comprehensive tidal energy site characterization is performed using ADV measurements of instantaneous horizontal current magnitude and direction at the planned hub centerline of a tidal turbine over a two-month period, contributing to the growing database of tidal energy site hydrodynamic conditions. The temporal variation, mean current statistics, and turbulence of the key tidal hydrodynamic parameters are examined in detail and compared to estimates from two tidal energy sites in Puget Sound. Tidal hydrodynamic conditions, including mean annual current (at hub height), the speed of extreme gusts (instantaneous horizontal currents acting normal to the rotor plane), and turbulence intensity (as proposed here, relative to a mean current of 2 m s-1), can vary greatly among tidal energy sites. Comparison of hydrodynamic conditions measured in the East River tidal strait in New York City with those reported for two tidal energy sites in Puget Sound indicates differences in mean annual current speeds, in the instantaneous current speeds of extreme gusts, and in turbulence intensities. Significant differences in these parameters among the tidal energy sites, and with the tidal resource assessment map, highlight the importance of conducting site resource characterization with ADV measurements at the machine scale. As with the wind industry, which adopted an International Electrotechnical Commission (IEC) wind class standard to aid in the selection of wind turbines for a particular site, it is recommended that the tidal energy industry adopt an appropriate standard for tidal current classes.
Such a standard requires a comprehensive field campaign at multiple tidal energy sites that can identify the key hydrodynamic parameters for tidal current site classification, select a list of tidal energy sites that exhibit the range of hydrodynamic conditions that will be encountered, and adopt consistent measurement practices (standards) for site classification.
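The turbulence intensity metric proposed above can be sketched directly: the standard deviation of instantaneous speed over either the burst mean or a fixed 2 m/s reference, the latter making sites comparable. The ADV burst values below are hypothetical:

```python
import statistics

def turbulence_intensity(speeds, reference_speed=None):
    """Turbulence intensity: standard deviation of instantaneous current
    speed divided by the mean speed, or by a fixed reference speed (the
    abstract proposes a 2 m/s reference so sites can be compared)."""
    sigma = statistics.pstdev(speeds)
    denom = reference_speed if reference_speed is not None else statistics.fmean(speeds)
    return sigma / denom

# Hypothetical ADV burst of horizontal current speeds (m/s)
burst = [1.8, 2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 1.7]
print(round(turbulence_intensity(burst, reference_speed=2.0), 3))  # 0.115
```

Normalizing by a common reference avoids the artifact where a slack-water burst with a small mean inflates the intensity even though the absolute fluctuations are modest.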
Luo, Jake; Chen, Weiheng; Wu, Min; Weng, Chunhua
2018-01-01
Background: Prior studies of clinical trial planning indicate that it is crucial to search and screen recruitment sites before starting to enroll participants. However, there is currently no systematic method to support clinical investigators in searching candidate recruitment sites according to the clinical trial factors of interest. Objective: In this study, we aim to develop a new approach to integrating the location data of over one million heterogeneous recruitment sites that are stored in clinical trial documents. The integrated recruitment location data can be searched and visualized using a map-based information retrieval method. The method enables systematic search and analysis of recruitment sites across a large number of clinical trials. Methods: The location data of more than 1.4 million recruitment sites of over 183,000 clinical trials were normalized and integrated using a geocoding method. The integrated data can be used to support geographic information retrieval of recruitment sites. Additionally, information on over 6000 clinical trial target disease conditions and close to 4000 interventions was also integrated into the system and linked to the recruitment locations. This data integration enabled the construction of a novel map-based query system that allows clinical investigators to search and visualize candidate recruitment sites for clinical trials based on target conditions and interventions. Results: The evaluation results showed that the coverage of the geographic location mapping for the 1.4 million recruitment sites was 99.8%. The evaluation of 200 randomly retrieved recruitment sites showed that the correctness of geographic information mapping was 96.5%. The recruitment intensities of the top 30 countries were also retrieved and analyzed. The data analysis results indicated that the recruitment intensity varied significantly across different countries and geographic areas.
Conclusion: This study contributed a new data processing framework to extract and integrate the location data of heterogeneous recruitment sites from clinical trial documents. The developed system can support effective retrieval and analysis of potential recruitment sites using target clinical trial factors. PMID:29132636
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunawan, Budi; Neary, Vincent Sinclair; Mortensen, Josh
Hydrokinetic energy from flowing water in open channels has the potential to support local electricity needs with lower regulatory or capital investment than impounding water by more conventional means. MOU agencies involved in federal hydropower development have identified the need to better understand the opportunities for hydrokinetic (HK) energy development within existing canal systems that may already have integrated hydropower plants. This document provides an overview of the main considerations, tools, and assessment methods for implementing field tests in an open-channel water system to characterize current energy converter (CEC) device performance and hydrodynamic effects. It describes open-channel processes relevant to an HK site and the pertinent analyses that guide siting and CEC layout design, with the goal of streamlining the evaluation process and reducing the risk of interfering with existing uses of the site. This document outlines key site parameters of interest and effective tools and methods for measurement and analysis, with examples drawn from the Roza Main Canal in Yakima, WA, to illustrate a site application.
Survey of phosphorylation near drug binding sites in the Protein Data Bank (PDB) and their effects.
Smith, Kyle P; Gifford, Kathleen M; Waitzman, Joshua S; Rice, Sarah E
2015-01-01
While it is currently estimated that 40 to 50% of eukaryotic proteins are phosphorylated, little is known about the frequency and local effects of phosphorylation near pharmaceutical inhibitor binding sites. In this study, we investigated how frequently phosphorylation may affect the binding of drug inhibitors to target proteins. We examined the 453 non-redundant structures of soluble mammalian drug target proteins bound to inhibitors currently available in the Protein Data Bank (PDB). We cross-referenced these structures with phosphorylation data available from the PhosphoSitePlus database. Three hundred twenty-two of 453 (71%) drug targets have evidence of phosphorylation that has been validated by multiple methods or labs. For 132 of the 453 (29%), the phosphorylation site is within 12 Å of the small molecule-binding site, where it would likely alter small molecule binding affinity. We propose a framework for distinguishing between drug-phosphorylation site interactions that are likely to alter the efficacy of drugs versus those that are not. In addition, we highlight examples of well-established drug targets, such as estrogen receptor alpha, for which phosphorylation may affect drug affinity and clinical efficacy. Our data suggest that phosphorylation may affect drug binding and efficacy for a significant fraction of drug target proteins. © 2014 Wiley Periodicals, Inc.
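The 12 Å screening step amounts to a minimum-distance test between phosphosite atoms and bound-inhibitor atoms. A minimal sketch with made-up coordinates (the survey itself works from full PDB structures):

```python
import math

def min_distance(site_atoms, ligand_atoms):
    """Smallest pairwise distance (angstroms) between two sets of
    3-D atomic coordinates."""
    return min(math.dist(a, b) for a in site_atoms for b in ligand_atoms)

def near_binding_site(phospho_atoms, inhibitor_atoms, cutoff=12.0):
    """Flag a phosphorylation site whose atoms fall within the 12 angstrom
    cutoff of a bound inhibitor, as in the survey's screening criterion.
    Coordinates here are invented for illustration."""
    return min_distance(phospho_atoms, inhibitor_atoms) <= cutoff

ser_og = [(10.0, 4.0, 2.0)]                           # hypothetical Ser hydroxyl oxygen
inhibitor = [(3.0, 4.0, 2.0), (21.0, 4.0, 2.0)]       # hypothetical ligand atoms
print(near_binding_site(ser_og, inhibitor))  # True (closest atom is 7 A away)
```

Applied across all phosphosites of a structure, this yields the 132-of-453 style tally reported above.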
Velocity profile, water-surface slope, and bed-material size for selected streams in Colorado
Marchand, J.P.; Jarrett, R.D.; Jones, L.L.
1984-01-01
Existing methods for determining the mean velocity in a vertical sampling section do not address the conditions present in high-gradient, shallow-depth streams common to mountainous regions such as Colorado. The report presents velocity-profile data that were collected for 11 streamflow-gaging stations in Colorado using both a standard Price type AA current meter and a prototype Price Model PAA current meter. Computational results are compiled that will enable mean velocities calculated from measurements by the two current meters to be compared with each other and with existing methods for determining mean velocity. Water-surface slope, bed-material size, and flow-characteristic data for the 11 sites studied also are presented. (USGS)
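The "existing methods" being evaluated are the standard point-velocity rules for estimating the mean velocity in a vertical. A minimal sketch of the two common ones (the current-meter readings are hypothetical):

```python
def mean_velocity_two_point(v_02, v_08):
    """Standard two-point method: mean velocity in a vertical taken as the
    average of readings at 0.2 and 0.8 of the depth below the surface."""
    return (v_02 + v_08) / 2.0

def mean_velocity_six_tenths(v_06):
    """Shallow-water alternative: a single reading at 0.6 of the depth
    is taken as the vertical's mean velocity."""
    return v_06

# Hypothetical meter readings (ft/s) at 0.2 and 0.8 depth
print(round(mean_velocity_two_point(3.2, 2.4), 2))  # 2.8
```

Both rules assume a velocity profile typical of low-gradient channels, which is exactly what breaks down in the high-gradient, shallow-depth streams the report targets.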
Antibiotic, Pharmaceutical, and Wastewater-Compound Data for Michigan, 1998-2005
Haack, Sheridan Kidd
2010-01-01
Beginning in the late 1990's, the U.S. Geological Survey began to develop analytical methods to detect, at concentrations less than 1 microgram per liter (ug/L), emerging water contaminants such as pharmaceuticals, personal-care chemicals, and a variety of other chemicals associated with various human and animal sources. During 1998-2005, the U.S. Geological Survey analyzed the following Michigan water samples: 41 samples for antibiotic compounds, 28 samples for pharmaceutical compounds, 46 unfiltered samples for wastewater compounds (dissolved and suspended compounds), and 113 filtered samples for wastewater compounds (dissolved constituents only). The purpose of this report is to summarize the status of emerging contaminants in Michigan waters based on data from several different project-specific sample-collection efforts in Michigan during an 8-year period. During the course of the 8-year sampling effort, antibiotics were determined at 20 surface-water sites and 2 groundwater sites, pharmaceuticals were determined at 11 surface-water sites, wastewater compounds in unfiltered water were determined at 31 surface-water sites, and wastewater compounds in filtered water were determined at 40 surface-water and 4 groundwater sites. Some sites were visited only once, but others were visited multiple times. A variety of quality-assurance samples also were collected. This report describes the analytical methods used, describes the variations in analytical methods and reporting levels during the 8-year period, and summarizes all data using current (2009) reporting criteria. Very few chemicals were detected at concentrations greater than current laboratory reporting levels, which currently vary from a low of 0.005 ug/L for some antibiotics to 5 ug/L for some wastewater compounds. 
Nevertheless, 10 of 51 chemicals in the antibiotics analysis, 9 of 14 chemicals in the pharmaceuticals analysis, 34 of 67 chemicals in the unfiltered-wastewater analysis, and 56 of 62 chemicals in the filtered-wastewater analysis were detected. Antibiotics were detected at 7 of 20 tested surface-water sites, but none were detected in 2 groundwater samples. Pharmaceuticals were detected at 7 of 11 surface-water sites. Wastewater compounds were detected at 25 of 31 sites for which unfiltered water samples were analyzed and at least once at all 40 surface-water sites and all 4 groundwater sites for which filtered water samples were analyzed. Overall, the chemicals detected most frequently in Michigan waters were similar to those reported frequently in other studies nationwide. Patterns of chemical detections were site specific and appear to be related to local sources, overall land use, and hydrologic conditions at the time of sampling. Field-blank results provide important information for the design of future sampling programs in Michigan and demonstrate the need for careful field-study design. Field-replicate results indicated substantial confidence regarding the presence or absence of the many chemicals tested. Overall, data reported herein indicate that a wide array of antibiotic, pharmaceutical, and organic wastewater compounds occur in Michigan waters. Patterns of occurrence, with respect to hydrologic, land use, and source variables, generally appear to be similar for Michigan as for other sampled waters across the United States. The data reported herein can serve as a basis for future studies in Michigan.
ERIC Educational Resources Information Center
Jørgensen, Clara Rübner
2015-01-01
This paper discusses the strengths of using ethnographic research methods in cross-national comparative research. It focuses particularly on the potential of applying such methods to the study of migrants and minority ethnic youth in education, where large-scale quantitative studies or single-sited ethnographies are currently dominant. By linking…
Gordon W. Shaw; Daniel C. Dey; John Kabrick; Jennifer Grabner; Rose-Marie Muzika
2003-01-01
Regenerating oak in floodplains is problematic and current silvicultural methods are not always reliable. We are evaluating the field performance of a new nursery product, the RPM seedling, and the benefit of soil mounding and a cover crop of redtop grass to the survival and growth of pin oak and swamp white oak regeneration on former bottomland cropfields....
Supply of human allograft tissue in Canada.
Lakey, Jonathan R T; Mirbolooki, Mohammadreza; Rogers, Christina; Mohr, Jim
2007-01-01
There is relatively little known about the supply of allograft tissues in Canada. The major aim of this study is to quantify the current or "Known Supply" of human allograft tissue (bone, tendons, soft tissue, cardiovascular, ocular and skin) from known tissue banks in Canada, to estimate the "Unknown Supply" of human allograft tissue available to Canadian users from other sources, and to investigate the nature and source of these tissue products. Two surveys were developed; one for tissue banks processing one or more tissue types and the other specific to eye banks. Thirty-nine sites were initially identified as potential tissue bank respondent sites. Of the 39 sites, 29 indicated that they were interested in participating or would consider completing the survey. A survey package and a self-addressed courier envelope were couriered to each of the 29 sites. A three-week response time was indicated. The project consultants conducted telephone and email follow-up for incomplete data. Unknown supply was estimated by 5 methods. Twenty-eight of 29 sites (97%) completed and returned surveys. Over the past year, respondents reported a total of 5,691 donors (1,550 living and 4,141 cadaveric donors). Including cancellous ground bone, there were 10,729 tissue products produced by the respondent banks. Of these, 71% were produced by accredited banks and 32% were ocular tissues. The total predicted shortfall of allograft tissues was 31,860-66,481 grafts. By estimating current supply and compiling additional qualitative information, this study has provided a snapshot of the current Canadian supply and shortfall of allograft tissue grafts.
Experience of wrong site surgery and surgical marking practices among clinicians in the UK
Giles, Sally J; Rhodes, Penny; Clements, Gill; Cook, Gary A; Hayton, Ruth; Maxwell, Melanie J; Sheldon, Trevor A; Wright, John
2006-01-01
Background Little is known about the incidence of “wrong site surgery”, but the consequences of this type of medical error can be severe. Guidance from both the USA and more recently the UK has highlighted the importance of preventing error by marking patients before surgery. Objective To investigate the experiences of wrong site surgery and current marking practices among clinicians in the UK before the release of a national Correct Site Surgery Alert. Methods 38 telephone or face‐to‐face interviews were conducted with consultant surgeons in ophthalmology, orthopaedics and urology in 14 National Health Service hospitals in the UK. The interviews were coded and analysed thematically using the software package QSR Nud*ist 6. Results Most surgeons had experience of wrong site surgery, but there was no clear pattern of underlying causes. Marking practices varied considerably. Surgeons were divided on the value of marking and varied in their practices. Orthopaedic surgeons reported that they marked before surgery; however, some urologists and ophthalmologists reported that they did not. There seemed to be no formal hospital policies in place specifically relating to wrong site surgery, and there were problems associated with implementing a system of marking in some cases. The methods used to mark patients also varied. Some surgeons believed that marking was a limited method of preventing wrong site surgery and may even increase the risk of wrong site surgery. Conclusion Marking practices are variable and marking is not always used. Introducing standard guidance on marking may reduce the overall risk of wrong site surgery, especially as clinicians work at different hospital sites. However, the more specific needs of people and specialties must also be considered. PMID:17074875
REMEDIATION OF SOILS CONTAMINATED WITH WOOD-TREATMENT CHEMICALS (PCP AND CREOSOTE)
PCP and creosote PAHs are found in most of the contaminated soils at wood-treatment sites. The treatment methods currently being used for such soils include soil washing, incineration, and biotreatment. Soil washing involves removal of the hazardous chemicals from soils ...
NASA Astrophysics Data System (ADS)
Wu, Yunwen; Momma, Toshiyuki; Ahn, Seongki; Yokoshima, Tokihiko; Nara, Hiroki; Osaka, Tetsuya
2017-10-01
This work reports a new chemical pre-lithiation method to fabricate a lithium sulfide (Li2S) cathode. The pre-lithiation takes place simply by dropping the organolithium reagent lithium naphthalenide (Li+Naph-) onto the prepared sulfur cathode. This is the first realization of a room-temperature chemical pre-lithiation reaction, enabled by the 3D nanostructured carbon nanotube (CNT) current collector. The Li2S cathode fabricated at room temperature shows higher capacity and lower hysteresis than a Li2S cathode fabricated by high-temperature pre-lithiation. The room-temperature pre-lithiated Li2S cathode shows stable cycling performance, with a capacity of 600 mAh g-1 after 100 cycles at 0.1 C-rate and a high capacity of 500 mAh g-1 at 2 C-rate. This simple on-site room-temperature pre-lithiation method is demonstrated to be applicable to in-situ pre-lithiation in a Li-metal-free battery.
Inokuchi, Go; Yajima, Daisuke; Hayakawa, Mutsumi; Motomura, Ayumi; Chiba, Fumiko; Torimitsu, Suguru; Makino, Yohsuke; Iwase, Hirotaro
2014-12-01
One of the advantages of postmortem imaging is its ability to obtain diagnostic findings in a non-destructive manner when autopsy is either difficult or may destroy forensic evidence. In recent years, efforts have been made to incorporate computed tomography (CT) based postmortem angiography into forensic pathology; however, it is not currently clear how well the modality can determine sites of bleeding in cases of subarachnoid hemorrhage. Therefore, in this study, we investigated the utility of postmortem cerebral angiography using multi-detector row CT (MDCT) by injecting a contrast medium through a catheter inserted into the internal carotid and vertebral arteries of 10 subarachnoid hemorrhage cases. While postmortem MDCT angiography (PMCTA) was capable of detecting aneurysms in a non-destructive manner, it was sometimes difficult to identify the aneurysm and bleeding sites because of a large amount of contrast medium leaking into the extravascular space. To overcome this problem, we developed the novel contrast imaging method "dynamic cerebral angiography," which involves scanning the same area multiple times while injecting contrast medium to enable real-time observation of the contrasted vasculature. Using multiphase contrast images acquired by this method, we successfully captured the moment when contrast medium leaked from the hemorrhage site. This method will be useful for identifying exact bleeding sites on PMCTA.
1993-12-30
projectile fragments from target materials, principally sand. Phase I activities included (1) a literature review of separations technology, (2) site visits, (3) ...the current operation, evaluation of alternative means for separation of DU from sand, and a review of uranium mining technology for possible...
You, Leiming; Wu, Jiexin; Feng, Yuchao; Fu, Yonggui; Guo, Yanan; Long, Liyuan; Zhang, Hui; Luan, Yijie; Tian, Peng; Chen, Liangfu; Huang, Guangrui; Huang, Shengfeng; Li, Yuxin; Li, Jie; Chen, Chengyong; Zhang, Yaqing; Chen, Shangwu; Xu, Anlong
2015-01-01
An increasing number of genes has been shown to utilize alternative polyadenylation (APA) 3′-processing sites depending on cell and tissue type and/or physiological and pathological conditions at the time of processing, and a genome-wide APA database is urgently needed for better understanding poly(A) site selection and APA-directed gene expression regulation in a given biological context. Here we present a web-accessible database, named APASdb (http://mosas.sysu.edu.cn/utr), which visualizes the precise map and usage quantification of the different APA isoforms of all genes. The datasets are deeply profiled by the sequencing alternative polyadenylation sites (SAPAS) method, which is capable of high-throughput sequencing of the 3′-ends of polyadenylated transcripts. Thus, APASdb details all the heterogeneous cleavage sites downstream of poly(A) signals and maintains near-complete coverage of APA sites, much better than previous databases built with conventional methods. Furthermore, APASdb quantifies a given APA variant among transcripts with different APA sites by computing their corresponding normalized reads, making the database more useful. In addition, APASdb supports URL-based retrieval, browsing, and display of exon-intron structure, poly(A) signals, poly(A) site locations and usage reads, and 3′-untranslated regions (3′-UTRs). Currently, APASdb covers APA in various biological processes and diseases in human, mouse, and zebrafish. PMID:25378337
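The usage quantification described above can be illustrated with a short sketch (this is not APASdb code, and the gene, site positions, and read counts are hypothetical): each APA isoform's usage is its normalized read count divided by the gene's total.

```python
# Sketch: relative usage of alternative polyadenylation (APA) sites for
# one gene, computed from normalized 3'-end read counts per site.

def apa_usage(site_reads):
    """Map each poly(A) site to its fraction of the gene's total reads.

    site_reads: dict of {site_position: normalized_read_count}
    """
    total = sum(site_reads.values())
    if total == 0:
        return {site: 0.0 for site in site_reads}
    return {site: count / total for site, count in site_reads.items()}

# Hypothetical gene with a proximal (1200) and a distal (1850) poly(A) site
usage = apa_usage({1200: 30.0, 1850: 70.0})
```

A downstream comparison of such usage fractions across tissues is what reveals APA switching events.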
NASA Astrophysics Data System (ADS)
Chen, Tsung-Wei; Hsiao, Chin-Lun; Hu, Chong-Der
2016-07-01
We investigate the change in the non-zero Chern number and out-of-plane spin polarization of the edge currents in a honeycomb lattice with the Haldane-Rashba interaction. This interaction breaks time-reversal symmetry through the Haldane phase caused by a current loop at the site-I and site-II atoms, and also accounts for the Rashba-type spin-orbit interaction. The Rashba spin-orbit interaction increases the number of Dirac points, and the band-touching phenomenon can be generated by tuning the on-site potential in the non-zero Haldane phase. By using the Pontryagin winding number and numerical Berry curvature methods, we find that the Chern number pattern is {+2, -1, 0} and {-2, +1, 0} for the positive and negative Haldane phase, respectively. A non-zero Chern number corresponds to a Chern-insulating phase. We discovered that changes in both the Haldane phase and the on-site potential lead to a change in the orientation of the bulk spin polarization of site-I and site-II atoms. Interestingly, in a ribbon with a zigzag edge, which naturally has site-I atoms at one outer edge and site-II atoms at the opposite outer edge, the spin polarization of the edge states approximately obeys the properties of the bulk spin polarization regardless of the change in the Chern number. In addition, even when the Chern number changes from +2 to -1 (or -2 to +1) by tuning the strength of the on-site potential, the sign of the spin polarization of the edge states persists. This approximate bulk-edge correspondence of the spin polarization in the Haldane-Rashba system could play an important role in spintronics, because it enables control of the orientation of the spin polarization within a single Chern-insulating phase.
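The numerical Berry-curvature calculation mentioned above can be illustrated on the plain two-band Haldane model (a simplification: the Rashba term of the four-band Haldane-Rashba system is omitted, and all parameter values here are chosen for illustration, not taken from the paper). This sketch uses the standard Fukui-Hatsugai-Suzuki lattice method, summing Berry fluxes over plaquettes of a discretized Brillouin zone.

```python
import numpy as np

def haldane_bloch(x, y, t1=1.0, t2=0.1, phi=np.pi/2, M=0.0):
    """Bloch Hamiltonian of the two-band Haldane model in reduced BZ
    coordinates (x, y) in [0, 1), written in a periodic gauge so that
    H(x+1, y) = H(x, y+1) = H(x, y)."""
    f = 1 + np.exp(-2j*np.pi*y) + np.exp(2j*np.pi*(x - y))   # NN hopping
    g = (np.sin(2*np.pi*x) + np.sin(2*np.pi*(y - x))
         - np.sin(2*np.pi*y))                                # NNN, odd part
    h = (np.cos(2*np.pi*x) + np.cos(2*np.pi*(y - x))
         + np.cos(2*np.pi*y))                                # NNN, even part
    d0 = 2*t2*np.cos(phi)*h
    d3 = M + 2*t2*np.sin(phi)*g    # site-I / site-II sublattice imbalance
    return np.array([[d0 + d3, t1*f],
                     [t1*np.conj(f), d0 - d3]])

def chern_number(M=0.0, t2=0.1, phi=np.pi/2, n=24):
    """Chern number of the lower band by the Fukui-Hatsugai-Suzuki
    lattice method on an n x n grid over the BZ torus."""
    u = np.empty((n, n, 2), dtype=complex)
    for i in range(n):
        for j in range(n):
            _, vecs = np.linalg.eigh(haldane_bloch(i/n, j/n, t2=t2,
                                                   phi=phi, M=M))
            u[i, j] = vecs[:, 0]                 # lower-band eigenvector
    total_flux = 0.0
    for i in range(n):
        for j in range(n):
            u00, u10 = u[i, j], u[(i+1) % n, j]
            u11, u01 = u[(i+1) % n, (j+1) % n], u[i, (j+1) % n]
            # product of link variables around one plaquette
            loop = (np.vdot(u00, u10) * np.vdot(u10, u11) *
                    np.vdot(u11, u01) * np.vdot(u01, u00))
            total_flux += np.angle(loop)
    return round(total_flux / (2*np.pi))
```

For |M| < 3*sqrt(3)*t2*|sin(phi)| the model is in a Chern-insulating phase (|C| = 1); a larger on-site potential M drives it to the trivial phase (C = 0), mirroring the on-site-potential tuning discussed in the abstract.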
TargetSpy: a supervised machine learning approach for microRNA target prediction.
Sturm, Martin; Hackenberg, Michael; Langenberger, David; Frishman, Dmitrij
2010-05-28
Virtually all currently available microRNA target site prediction algorithms require the presence of a (conserved) seed match to the 5' end of the microRNA. Recently, however, it has been shown that this requirement might be too stringent, leading to a substantial number of missed target sites. We developed TargetSpy, a novel computational approach for predicting target sites regardless of the presence of a seed match. It is based on machine learning and automatic feature selection using a wide spectrum of compositional, structural, and base pairing features covering current biological knowledge. Our model does not rely on evolutionary conservation, which allows the detection of species-specific interactions and makes TargetSpy suitable for analyzing unconserved genomic sequences. In order to allow for an unbiased comparison of TargetSpy to other methods, we classified all algorithms into three groups: I) no seed match requirement, II) seed match requirement, and III) conserved seed match requirement. TargetSpy predictions for classes II and III are generated by appropriate postfiltering. On a human dataset revealing fold-change in protein production for five selected microRNAs, our method shows superior performance in all classes. In Drosophila melanogaster, not only are our class II and III predictions on par with other algorithms, but notably the class I (no-seed) predictions are only marginally less accurate. We estimate that TargetSpy predicts between 26 and 112 functional target sites without a seed match per microRNA that are missed by all other currently available algorithms. Only a few algorithms can predict target sites without demanding a seed match, and TargetSpy demonstrates a substantial improvement in prediction accuracy in that class. Furthermore, when conservation and the presence of a seed match are required, the performance is comparable with state-of-the-art algorithms.
TargetSpy was trained on mouse and performs well in human and drosophila, suggesting that it may be applicable to a broad range of species. Moreover, we have demonstrated that the application of machine learning techniques in combination with upcoming deep sequencing data results in a powerful microRNA target site prediction tool http://www.targetspy.org.
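The feature-based approach can be made concrete with a deliberately simplified sketch (TargetSpy itself selects automatically from a much wider feature spectrum; the three features, the position-2-to-8 seed definition, and the sequences below are illustrative only):

```python
# Toy extraction of TargetSpy-style features for one miRNA/site pair:
# a seed-match flag, overall complementarity, and site AU content.

COMP = {"A": "U", "U": "A", "G": "C", "C": "G"}  # RNA Watson-Crick pairs

def site_features(mirna, site):
    """mirna and site are 5'->3' RNA strings of equal length."""
    # the miRNA read 5'->3' pairs with the site read 3'->5'
    paired = [COMP.get(m) == s for m, s in zip(mirna, reversed(site))]
    seed_match = all(paired[1:8])            # miRNA positions 2-8
    complementarity = sum(paired) / len(paired)
    au_content = (site.count("A") + site.count("U")) / len(site)
    return {"seed_match": seed_match,
            "complementarity": complementarity,
            "au_content": au_content}
```

A classifier trained on many such feature vectors can then score candidate sites without hard-requiring the seed-match flag, which is the point of TargetSpy's class I predictions.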
The Savannah River Site's Groundwater Monitoring Program. Second quarter, 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-01-10
The Environmental Protection Department/Environmental Monitoring Section (EPD/EMS) administers the Savannah River Site's (SRS) Groundwater Monitoring Program. During second quarter 1991, EPD/EMS conducted extensive sampling of monitoring wells. EPD/EMS established two sets of flagging criteria in 1986 to assist in the management of sample results. The flagging criteria do not define contamination levels; instead, they aid personnel in sample scheduling, interpretation of data, and trend identification. Beginning in 1991, the flagging criteria are based on EPA drinking water standards and method detection limits. A detailed explanation of the current flagging criteria is presented in the Flagging Criteria section of this document. Analytical results from second quarter 1991 are listed in this report.
NASA Astrophysics Data System (ADS)
Bouldin, J.
2010-12-01
In the reconstruction of past climates from tree rings over multi-decadal to multi-centennial periods, one longstanding problem is the confounding of the natural biological growth trend of the tree with any existing long-term trends in the climate. No existing analytical method is capable of fully resolving these two components of change, so it remains unclear how accurate existing ring-series standardizations are, and by implication, the climate reconstructions based upon them. For example, dendrochronological series at the ITRDB are typically standardized by detrending, at each site, each individual tree core using a relatively stiff deterministic function such as a negative exponential curve or smoothing spline. Another approach, referred to as RCS (Regional Curve Standardization), attempts to solve some problems of individual-series detrending by constructing a single growth curve from the aggregated cambial ages of the rings of the cores at a site (or collection of sites). This curve is presumed to represent the “ideal” or expected growth of the trees from which it is derived. Although an improvement in some respects, this method is degraded in direct proportion to the lack of a mixture of tree sizes or ages throughout the span of the chronology. I present a new method of removing the biological curve from tree-ring series, such that temporal changes better represent the environmental variation captured by the tree rings. The method institutes several new approaches, such as correction for the estimated number of missed rings near the pith, and the use of tree size and ring-area relationships instead of the traditional tree ages and ring widths. The most important innovation is a careful extraction of the existing information on the relationship between tree size (basal area) and ring area that exists within each single year of the chronology.
This information is, by definition, not contaminated by temporal climatic changes, and so when removed, leaves the climatically caused and random error components of the chronology. A sophisticated algorithm, based on pairwise ring comparisons in which tree size is standardized both within and between years, forms the basis of the method. Evaluations of the method are underway with both simulated and actual (ITRDB) data, to evaluate the potentials and drawbacks of the method relative to existing methods. The ITRDB test data consist of a set of about 50 primarily high-elevation sites from across western North America. Most of these sites show a pronounced 20th Century warming relative to earlier centuries, in accordance with current understanding, albeit at a non-global scale. A relative minority show cooling, occasionally strongly. Current and future work emphasizes evaluation of the method with varying, simulated data, and more thorough empirical evaluations of the method in situations where the type, and intensity, of the primary environmentally limiting factor varies (e.g., temperature versus soil moisture limited sites).
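The RCS baseline that the new method aims to improve on can be sketched in a few lines (a toy version: the regional curve here is an unsmoothed mean, whereas real implementations smooth it, and every core is assumed to begin at cambial age 1 with no missed rings near the pith):

```python
# Minimal sketch of Regional Curve Standardization (RCS): pool ring
# widths by cambial age, average them into one expected growth curve,
# and divide each ring by the curve value at its age to get indices.
from collections import defaultdict

def rcs_indices(series):
    """series: list of ring-width lists, each starting at cambial age 1.

    Returns (indices, regional_curve)."""
    by_age = defaultdict(list)
    for tree in series:
        for age, width in enumerate(tree, start=1):
            by_age[age].append(width)
    # regional curve = mean width at each cambial age (no smoothing here)
    curve = {age: sum(w) / len(w) for age, w in by_age.items()}
    indices = [[w / curve[age] for age, w in enumerate(tree, start=1)]
               for tree in series]
    return indices, curve
```

The abstract's criticism applies directly to this sketch: if all sampled trees are of similar age, the mean at each cambial age conflates the biological trend with whatever climate the trees shared at that age.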
Tube cutter tool and method of use for coupon removal
Nachbar, H.D.; Etten, M.P. Jr.; Kurowski, P.A.
1997-05-06
A tube cutter tool is insertable into a tube for cutting a coupon from a damaged site on the exterior of the tube. Prior to using the tool, the damaged site is first located from the interior of the tube using a multi-coil pancake eddy current test probe. The damaged site is then marked. A fiber optic probe is used to monitor the subsequent cutting procedure which is performed using a hole saw mounted on the tube cutter tool. Prior to completion of the cutting procedure, a drill in the center of the hole saw is drilled into the coupon to hold it in place. 4 figs.
RBind: computational network method to predict RNA binding sites.
Wang, Kaili; Jian, Yiren; Wang, Huiwen; Zeng, Chen; Zhao, Yunjie
2018-04-26
Non-coding RNA molecules play essential roles by interacting with other molecules to perform various biological functions. However, it is difficult to determine RNA structures because of their flexibility, and at present the number of experimentally solved RNA-ligand and RNA-protein structures is still insufficient. Therefore, binding-site prediction for non-coding RNA is required to understand their functions. Current RNA binding site prediction algorithms produce many false-positive nucleotides that are distant from the binding sites. Here, we present a network approach, RBind, to predict RNA binding sites. We benchmarked RBind on RNA-ligand and RNA-protein datasets. The average accuracy of 0.82 on RNA-ligand and 0.63 on RNA-protein testing showed that this network strategy has reliable accuracy for binding site prediction. The code and datasets are available at https://zhaolab.com.cn/RBind. yjzhaowh@mail.ccnu.edu.cn. Supplementary data are available at Bioinformatics online.
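The network idea can be sketched as follows. Note the cutoff distance and the above-average-degree criterion are assumptions for illustration, not RBind's published parameters, and real predictors use richer centrality measures than plain degree:

```python
# Sketch of a contact-network binding-site predictor: build a network
# whose nodes are nucleotides and whose edges connect spatially close
# pairs, then flag highly connected nucleotides as candidate sites.
import math

def predict_sites(coords, cutoff=8.0):
    """coords: list of (x, y, z), one representative atom per nucleotide.

    Returns indices of nucleotides whose contact degree exceeds the mean."""
    n = len(coords)
    degree = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(coords[i], coords[j]) <= cutoff:
                degree[i] += 1
                degree[j] += 1
    mean_deg = sum(degree) / n
    return [i for i in range(n) if degree[i] > mean_deg]
```

The appeal of such structure-network methods is that they need only coordinates, not co-crystallized partners, which matters given how few solved RNA complexes exist.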
Participant verification: prevention of co-enrolment in clinical trials in South Africa.
Harichund, C; Haripersad, K; Ramjee, R
2013-05-15
As KwaZulu-Natal Province is the epicentre of the HIV epidemic in both South Africa (SA) and globally, it is an ideal location to conduct HIV prevention and therapeutic trials. Numerous prevention trials are currently being conducted here; the potential for participant co-enrolment may compromise the validity of these studies and is therefore of great concern. To report the development and feasibility of a digital, fingerprint-based participant identification method to prevent co-enrolment at multiple clinical trial sites. The Medical Research Council (MRC) HIV Prevention Research Unit (HPRU) developed the Biometric Co-enrolment Prevention System (BCEPS), which uses fingerprint-based biometric technology to identify participants. A trial website was used to determine the robustness and usability of the system. After successful testing, the BCEPS was piloted in July 2010 across 7 HPRU clinical research sites. The BCEPS was pre-loaded with study names and clinical trial sites, with new participant information loaded at first visit to a trial site. We successfully implemented the BCEPS at the 7 HPRU sites. Using the BCEPS, we performed real-time 'flagging' of women who were already enrolled in another study as they entered a trial at an HPRU site and, where necessary, excluded them from participation on site. This system has promise in reducing co-enrolment in clinical trials and represents a valuable tool for future implementation by all groups conducting trials. The MRC is currently co-ordinating this effort with clinical trial sites nationally.
NASA Astrophysics Data System (ADS)
Hassellöv, Ida-Maja; Tengberg, Anders
2017-04-01
The Baltic Sea region contains a dark legacy of about 100,000 tons of dumped chemical warfare agents. As time passes, the gun shells corrode and the risk of contaminant release increases. A major goal of the EU flagship project Daimon is to support governmental organisations with case-by-case adapted methods for sustainable management of dumped toxic munitions. At the Chalmers University of Technology, a partner in Daimon, a unique ISO 31000-adapted method was developed to provide decision support regarding potentially oil-polluting shipwrecks. The method, called VRAKA, is based on probability calculations and includes site-specific information as well as expert knowledge. VRAKA is now being adapted to dumped chemical munitions. To estimate the corrosion potential of gun shells and shipwrecks, along with sediment resuspension and transport, multiparameter instruments are deployed at dump sites. Parameters measured include currents, salinity, temperature, oxygen, depth, waves, and suspended particles. These measurements have revealed that trawling at dump sites appears to play a large role in spreading toxic substances (arsenic) over larger areas. This presentation will briefly describe the decision-support model and the instrumentation used, and discuss some of the obtained results.
USEFULNESS OF CURRENT SEDIMENT TOXICITY TESTS TO INDICATE CONTAMINATION IN GULF OF MEXICO ESTUARIES.
Sediment toxicity evaluations were conducted during a three-year period in several Gulf of Mexico near-coastal areas using a variety of laboratory and field methods. The sediments were collected adjacent to Superfund sites, urban runoff discharges, treated municipal and industria...
Estimated flow-duration curves for selected ungaged sites in Kansas
Studley, S.E.
2001-01-01
Flow-duration curves for 1968-98 were estimated for 32 ungaged sites in the Missouri, Smoky Hill-Saline, Solomon, Marais des Cygnes, Walnut, Verdigris, and Neosho River Basins in Kansas. Also included from a previous report are estimated flow-duration curves for 16 ungaged sites in the Cimarron and lower Arkansas River Basins in Kansas. The method of estimation used six unique factors of flow duration: (1) mean streamflow and percentage duration of mean streamflow, (2) ratio of 1-percent-duration streamflow to mean streamflow, (3) ratio of 0.1-percent-duration streamflow to 1-percent-duration streamflow, (4) ratio of 50-percent-duration streamflow to mean streamflow, (5) percentage duration of appreciable streamflow (0.10 cubic foot per second), and (6) average slope of the flow-duration curve. These factors were previously developed from a regionalized study of flow-duration curves using streamflow data for 1921-76 from streamflow-gaging stations with drainage areas of 100 to 3,000 square miles. The method was tested on a currently (2001) measured, continuous-record streamflow-gaging station on Salt Creek near Lyndon, Kansas, with a drainage area of 111 square miles and was found to adequately estimate the computed flow-duration curve for the station. The method also was tested on a currently (2001) measured, continuous-record, streamflow-gaging station on Soldier Creek near Circleville, Kansas, with a drainage area of 49.3 square miles. The results of the test on Soldier Creek near Circleville indicated that the method could adequately estimate flow-duration curves for sites with drainage areas of less than 100 square miles. The low-flow parts of the estimated flow-duration curves were verified or revised using 137 base-flow discharge measurements made during 1999-2000 at the 32 ungaged sites that were correlated with base-flow measurements and flow-duration analyses performed at nearby, long-term, continuous-record, streamflow-gaging stations (index stations). 
The method did not adequately estimate the flow-duration curves for two sites in the western one-third of the State because of substantial changes in farming practices (terracing and intensive ground-water withdrawal) that were not accounted for in the two previous studies (Furness, 1959; Jordan, 1983). For these two sites, there was enough historic, continuous-streamflow record available to perform record-extension techniques correlated to their respective index stations for the development of the estimated flow-duration curves. The estimated flow-duration curves at the ungaged sites can be used for projecting future flow frequencies for assessment of total maximum daily loads (TMDLs) or other water-quality constituents, water-availability studies, and for basin-characteristic studies.
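The flow-duration computation that underlies the six estimation factors can be sketched directly (the ranked-exceedance indexing convention below is one common choice, not necessarily the exact procedure of the cited studies):

```python
# Sketch: the p-percent-duration flow is the discharge equaled or
# exceeded p percent of the time in the daily streamflow record.
import math

def duration_flow(flows, pct):
    """Discharge equaled or exceeded `pct` percent of the time."""
    ranked = sorted(flows, reverse=True)   # largest flow first
    # smallest rank position that still covers pct percent of the record
    i = max(0, math.ceil(pct / 100.0 * len(ranked)) - 1)
    return ranked[i]
```

Factors such as (2) and (4) in the report are then simple ratios, e.g. `duration_flow(flows, 1) / mean(flows)` for the ratio of the 1-percent-duration streamflow to the mean streamflow.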
Design of an ultra-portable field transfer radiometer supporting automated vicarious calibration
NASA Astrophysics Data System (ADS)
Anderson, Nikolaus; Thome, Kurtis; Czapla-Myers, Jeffrey; Biggar, Stuart
2015-09-01
The University of Arizona Remote Sensing Group (RSG) began outfitting the radiometric calibration test site (RadCaTS) at Railroad Valley Nevada in 2004 for automated vicarious calibration of Earth-observing sensors. RadCaTS was upgraded to use RSG custom 8-band ground viewing radiometers (GVRs) beginning in 2011 and currently four GVRs are deployed providing an average reflectance for the test site. This measurement of ground reflectance is the most critical component of vicarious calibration using the reflectance-based method. In order to ensure the quality of these measurements, RSG has been exploring more efficient and accurate methods of on-site calibration evaluation. This work describes the design of, and initial results from, a small portable transfer radiometer for the purpose of GVR calibration validation on site. Prior to deployment, RSG uses high accuracy laboratory calibration methods in order to provide radiance calibrations with low uncertainties for each GVR. After deployment, a solar radiation based calibration has typically been used. The method is highly dependent on a clear, stable atmosphere, requires at least two people to perform, is time consuming in post processing, and is dependent on several large pieces of equipment. In order to provide more regular and more accurate calibration monitoring, the small portable transfer radiometer is designed for quick, one-person operation and on-site field calibration comparison results. The radiometer is also suited for laboratory calibration use and thus could be used as a transfer radiometer calibration standard for ground viewing radiometers of a RadCalNet site.
Methods and strategies for future reactor safety goals
NASA Astrophysics Data System (ADS)
Arndt, Steven Andrew
There have been significant discussions over the past few years by the United States Nuclear Regulatory Commission (NRC), the Advisory Committee on Reactor Safeguards (ACRS), and others as to the adequacy of the NRC safety goals for use with the next generation of nuclear power reactors to be built in the United States. The NRC, in its safety goals policy statement, has provided general qualitative safety goals and basic quantitative health objectives (QHOs) for nuclear reactors in the United States. Risk metrics such as core damage frequency (CDF) and large early release frequency (LERF) have been used as surrogates for the QHOs. In its review of the new plant licensing policy the ACRS has looked at the safety goals, as has the NRC. A number of issues have been raised including what the Commission had in mind when it drafted the safety goals and QHOs, how risk from multiple reactors at a site should be combined for evaluation, how the combination of a new and old reactor at the same site should be evaluated, what the criteria for evaluating new reactors should be, and whether new reactors should be required to be safer than current generation reactors. As part of the development and application of the NRC safety goal policy statement the Commissioners laid out the expectations for the safety of a nuclear power plant but did not address the risk associated with current multi-unit sites, potential modular reactor sites, and hybrid sites that could contain current generation reactors, new passive reactors, and/or modular reactors. The NRC safety goals and the QHOs refer to a "nuclear power plant," but do not discuss whether a "plant" refers to only a single unit or all of the units on a site. There has been much discussion on this issue recently due to the development of modular reactors. 
Additionally, the risk of multiple reactor accidents on the same site has been largely ignored in the probabilistic risk assessments (PRAs) done to date, and in most risk-informed analyses and discussions. This dissertation examines potential approaches to updating the safety goals, including the establishment of a new quantitative safety goal based on the comparative risk of generating electricity by viable competing technologies, modifications of the goals to account for multi-plant reactor sites, and issues associated with the use of safety goals in both initial licensing and operational decision making. This research develops a new quantitative health objective that uses a comparable-benefit risk metric based on the life-cycle risk of the construction, operation, and decommissioning of a comparable non-nuclear electric generation facility, as well as the risks associated with mining and transportation. This dissertation also evaluates the effects of using various methods for aggregating site risk as a safety metric, as opposed to using single-plant safety goals. Additionally, a number of important assumptions inherent in the current safety goals are explored, including the effect that other potential negative societal impacts, such as the generation of greenhouse gases (e.g., carbon dioxide), have on the risk of electric power production and on the setting of safety goals. Finally, the role risk perception should play in establishing safety goals has been explored. To complete this evaluation, a new method to analytically compare alternative technologies for generating electricity was developed, including a new way to evaluate risk perception, and a new method was developed for evaluating the risk of multiple units on a single site.
To test these modifications to the safety goals a number of possible reactor designs and configurations were evaluated using these new proposed safety goals to determine the goals' usefulness and utility. The results of the analysis showed that the modifications provide measures that more closely evaluate the potential risk to the public from the operation of nuclear power plants than the current safety goals, while still providing a straight-forward process for assessment of reactor design and operation.
Limitations of on-site dairy farm regulatory debits as milk quality predictors.
Borneman, Darand L; Stiegert, Kyle; Ingham, Steve
2015-03-01
In the United States, compliance with grade A raw fluid milk regulatory standards is assessed via laboratory milk quality testing and by on-site inspection of producers (farms). This study evaluated the correlation between on-site survey debits being marked and somatic cell count (SCC) or standard plate count (SPC) laboratory results for 1,301 Wisconsin grade A dairy farms in 2012. Debits recorded on the survey form were tested as predictors of laboratory results utilizing ordinary least squares regression to determine if results of the current method for on-site evaluation of grade A dairy farms accurately predict SCC and SPC test results. Such a correlation may indicate that current methods of on-site inspection serve the primary intended purpose of assuring availability of high-quality milk. A model for predicting SCC was estimated using ordinary least squares regression methods. Step-wise selected regressors of grouped debit items were able to predict SCC levels with some degree of accuracy (adjusted R2=0.1432). Specific debit items, seasonality, and farm size were the best predictors of SCC levels. The SPC data presented an analytical challenge because over 75% of the SPC observations were at or below a 25,000 cfu/mL threshold but were recorded by testing laboratories as at the threshold value. This classic censoring problem necessitated the use of a Tobit regression approach. Even with this approach, prediction of SPC values based on on-site survey criteria was much less successful (adjusted R2=0.034) and provided little support for the on-site survey system as a way to inform farmers about making improvements that would improve SPC. The lower level of correlation with SPC may indicate that factors affecting SPC are more varied and differ from those affecting SCC. 
Further, unobserved deficiencies in postmilking handling and storage sanitation could enhance bacterial growth and increase SPC, whereas postmilking sanitation will have no effect on SCC because somatic cells do not reproduce in stored milk. Results suggest that close examination, and perhaps redefinition, of survey debits, along with making the survey coincident with SCC and SPC sampling, could make the on-site survey a better tool for ensuring availability of high-quality milk. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
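The censoring treatment described above can be sketched as a Tobit-type log-likelihood (a generic construction, not the authors' code: observations recorded at the reporting threshold contribute the probability mass below it via the normal CDF, while uncensored observations contribute the usual density; variable names are illustrative):

```python
# Sketch: log-likelihood for left-censored (Tobit-type) regression,
# as needed when >75% of SPC values are recorded at a threshold.
import math

def _log_norm_pdf(z):  # standard normal log-density
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def _log_norm_cdf(z):  # standard normal log-CDF via erfc
    return math.log(0.5 * math.erfc(-z / math.sqrt(2)))

def tobit_loglik(b0, b1, sigma, data, threshold):
    """data: list of (x, y); y is recorded at `threshold` when censored."""
    ll = 0.0
    for x, y in data:
        mu = b0 + b1 * x
        if y <= threshold:   # censored: probability of falling below
            ll += _log_norm_cdf((threshold - mu) / sigma)
        else:                # observed: normal density term
            ll += _log_norm_pdf((y - mu) / sigma) - math.log(sigma)
    return ll
```

Maximizing this likelihood (e.g., with a numerical optimizer) instead of running OLS on the censored values is what keeps the threshold pile-up from biasing the fitted slope.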
Cryptic binding sites on proteins: definition, detection, and druggability.
Vajda, Sandor; Beglov, Dmitri; Wakefield, Amanda E; Egbert, Megan; Whitty, Adrian
2018-05-22
Many proteins in their unbound structures lack surface pockets appropriately sized for drug binding. Hence, a variety of experimental and computational tools have been developed for the identification of cryptic sites that are not evident in the unbound protein but form upon ligand binding, and can provide tractable drug target sites. The goal of this review is to discuss the definition, detection, and druggability of such sites, and their potential value for drug discovery. Novel methods based on molecular dynamics simulations are particularly promising and yield a large number of transient pockets, but it has been shown that only a minority of such sites are generally capable of binding ligands with substantial affinity. Based on recent studies, current methodology can be improved by combining molecular dynamics with fragment docking and machine learning approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.
Capesius, Joseph P.; Sullivan, Joseph R.; O'Neill, Gregory B.; Williams, Cory A.
2005-01-01
Accurate ice-affected streamflow records are difficult to obtain for several reasons, which makes the management of instream-flow water rights in the wintertime a challenging endeavor. This report documents a method to improve ice-affected streamflow records for two gaging stations in Colorado. In January and February 2002, the U.S. Geological Survey, in cooperation with the Colorado Water Conservation Board, conducted an experiment using a sodium chloride tracer to measure streamflow under ice cover by the tracer-dilution discharge method. The purpose of this study was to determine the feasibility of obtaining accurate ice-affected streamflow records by using a sodium chloride tracer that was injected into the stream. The tracer was injected at two gaging stations once per day for approximately 20 minutes for 25 days. Multiple-parameter water-quality sensors at the two gaging stations monitored background and peak chloride concentrations. These data were used to determine discharge at each site. A comparison of the current-meter streamflow record to the tracer-dilution streamflow record shows different levels of accuracy and precision of the tracer-dilution streamflow record at the two sites. At the lower elevation and warmer site, Brandon Ditch near Whitewater, the tracer-dilution method overestimated flow by an average of 14 percent, but this average is strongly biased by outliers. At the higher elevation and colder site, Keystone Gulch near Dillon, the tracer-dilution method experienced problems with the tracer solution partially freezing in the injection line. The partial freezing of the tracer contributed to the tracer-dilution method underestimating flow by 52 percent at Keystone Gulch. In addition, a tracer-pump-reliability test was conducted to test how accurately the tracer pumps can discharge the tracer solution in conditions similar to those used at the gaging stations. 
Although the pumps were reliable and consistent throughout the 25-day study period, they underdischarged the tracer by 5.8-15.9 percent relative to the initial pumping-rate setting, which may explain some of the error in the tracer-dilution streamflow record compared with the current-meter streamflow record.
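The tracer-dilution discharge computation referenced above follows from a steady-state mass balance for constant-rate injection. A minimal sketch, with illustrative numbers rather than the study's measurements:

```python
def tracer_dilution_discharge(q_inj, c_tracer, c_plateau, c_background):
    """Stream discharge from a constant-rate tracer injection.

    Mass balance: q_inj * c_tracer + Q * c_background = (Q + q_inj) * c_plateau
    Solving for stream discharge Q:
        Q = q_inj * (c_tracer - c_plateau) / (c_plateau - c_background)
    Units must be consistent (e.g., L/s for rates, mg/L for concentrations).
    """
    if c_plateau <= c_background:
        raise ValueError("plateau concentration must exceed background")
    return q_inj * (c_tracer - c_plateau) / (c_plateau - c_background)

# Illustrative values: 1 L/s injection of 1001 mg/L tracer raises the
# downstream plateau to 2 mg/L over a 1 mg/L background.
Q = tracer_dilution_discharge(q_inj=1.0, c_tracer=1001.0,
                              c_plateau=2.0, c_background=1.0)
```

An underdischarging pump (lower effective q_inj than assumed) inflates the computed Q, consistent with the overestimation reported at the warmer site.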
Streeter, K A; Sunshine, M D; Patel, S R; Liddell, S S; Denholtz, L E; Reier, P J; Fuller, D D; Baekey, D M
2017-03-01
Midcervical spinal interneurons form a complex and diffuse network and may be involved in modulating phrenic motor output. The intent of the current work was to enable a better understanding of midcervical "network-level" connectivity by pairing the neurophysiological multielectrode array (MEA) data with histological verification of the recording locations. We first developed a method to deliver 100-nA currents to electroplate silver onto and subsequently deposit silver from electrode tips after obtaining midcervical (C3-C5) recordings using an MEA in anesthetized and ventilated adult rats. Spinal tissue was then fixed, harvested, and histologically processed to "develop" the deposited silver. Histological studies verified that the silver deposition method discretely labeled (50-μm resolution) spinal recording locations between laminae IV and X in cervical segments C3-C5. Using correlative techniques, we next tested the hypothesis that midcervical neuronal discharge patterns are temporally linked. Cross-correlation histograms produced few positive peaks (5.3%) in the range of 0-0.4 ms, but 21.4% of neuronal pairs had correlogram peaks with a lag of ≥0.6 ms. These results are consistent with synchronous discharge involving mono- and polysynaptic connections among midcervical neurons. We conclude that there is a high degree of synaptic connectivity in the midcervical spinal cord and that the silver-labeling method can reliably mark metal electrode recording sites and "map" interneuron populations, thereby providing a low-cost and effective tool for use in MEA experiments. We suggest that this method will be useful for further exploration of midcervical network connectivity. NEW & NOTEWORTHY We describe a method that reliably identifies the locations of multielectrode array (MEA) recording sites while preserving the surrounding tissue for immunohistochemistry. 
To our knowledge, this is the first cost-effective method to identify the anatomic locations of neuronal ensembles recorded with a MEA during acute preparations without the requirement of specialized array electrodes. In addition, evaluation of activity recorded from silver-labeled sites revealed a previously unappreciated degree of connectivity between midcervical interneurons. Copyright © 2017 the American Physiological Society.
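A cross-correlation histogram of the kind analyzed above can be sketched as follows; the spike times, window, and bin width are synthetic examples, not recorded data.

```python
import numpy as np

def cross_correlogram(ref, target, window=0.005, bin_width=0.0002):
    """Histogram of spike-time lags (target - ref), in seconds,
    within +/- window of each reference spike. A sharp peak at a
    nonzero lag suggests a mono- or polysynaptic connection."""
    edges = np.arange(-window, window + bin_width, bin_width)
    lags = []
    for t in ref:
        near = target[(target >= t - window) & (target <= t + window)]
        lags.extend(near - t)
    counts, _ = np.histogram(lags, bins=edges)
    return counts, edges

# Two synthetic trains: the target unit fires 0.65 ms after the reference,
# so the correlogram should peak in the bin starting at +0.6 ms.
ref = np.arange(0.0, 1.0, 0.01)   # 100 reference spikes
target = ref + 0.00065            # fixed 0.65-ms lag
counts, edges = cross_correlogram(ref, target)
```

In this toy example every lag falls in the bin beginning at +0.6 ms, mirroring the >= 0.6 ms correlogram peaks the study interprets as synaptic connectivity.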
Yu, Tsung-Hsien; Tung, Yu-Chi; Chung, Kuo-Piao
2015-08-01
Volume-infection relation studies have been published for high-risk surgical procedures, although the conclusions remain controversial. Inconsistent results may be caused by inconsistent categorization methods, differing definitions of service volume, and different statistical approaches. The purpose of this study was to examine whether a relation exists between provider volume and coronary artery bypass graft (CABG) surgical site infection (SSI) using different categorization methods. A population-based cross-sectional multi-level study was conducted. A total of 10,405 patients who received CABG surgery between 2006 and 2008 in Taiwan were recruited. The outcome of interest was surgical site infection after CABG surgery. The associations among several patient, surgeon, and hospital characteristics were examined. Surgeons' and hospitals' service volume was defined as the cumulative CABG service volume in the year preceding each CABG operation and categorized by three types of approaches: continuous, quartile, and k-means clustering. The results of multi-level mixed-effects modeling showed that hospital volume had no association with SSI. Although the relation between surgeon volume and surgical site infection was negative, it was inconsistent among the different categorization methods. Categorization of service volume is an important issue in volume-infection studies. The findings of the current study suggest that different categorization methods might influence the observed relation between volume and SSI. The selection of an optimal cutoff point should be taken into account in future research.
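Two of the three categorization approaches named above can be illustrated on toy volumes; the values below are hypothetical, and the quantile-initialized 1-D k-means is only one possible implementation, not the study's exact procedure.

```python
import numpy as np

def quartile_groups(volumes):
    """Assign each provider volume to a quartile group (0-3)."""
    q = np.quantile(volumes, [0.25, 0.5, 0.75])
    return np.searchsorted(q, volumes, side="right")

def kmeans_1d(volumes, k=3, iters=50):
    """Plain 1-D k-means with deterministic quantile initialization,
    an alternative way to cut volume into k categories."""
    volumes = np.asarray(volumes, dtype=float)
    centers = np.quantile(volumes, np.linspace(0, 1, k))  # spread initial centers
    for _ in range(iters):
        labels = np.argmin(np.abs(volumes[:, None] - centers[None, :]), axis=1)
        centers = np.array([volumes[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Hypothetical annual CABG volumes for nine providers, in three clear tiers.
volumes = np.array([1, 2, 3, 100, 101, 102, 1000, 1001, 1002])
labels, centers = kmeans_1d(volumes, k=3)
```

Note how the two schemes cut the same data differently (quartiles split the top tier; k-means keeps it together), which is exactly why the categorization choice can change an observed volume-SSI relation.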
Local antimicrobial administration for prophylaxis of surgical site infections.
Huiras, Paul; Logan, Jill K; Papadopoulos, Stella; Whitney, Dana
2012-11-01
Despite a lack of consensus guidelines, local antibiotic administration for prophylaxis of surgical site infections is used during many surgical procedures. The rationale behind this practice is to provide high antibiotic concentrations at the site of surgery while minimizing systemic exposure and adverse effects. Local antibiotic administration for surgical site prophylaxis has inherent limitations in that antibiotics are applied after the incision is made, rather than the current standard for surgical site prophylaxis that recommends providing adequate antibiotic concentrations at the site before the incision. The efficacy and safety of local application of antibiotics for surgical site prophylaxis have been assessed in different types of surgery with a variety of antibiotic agents and methods of application. We identified 22 prospective, randomized, controlled trials that evaluated local application of antibiotics for surgical site prophylaxis. These trials were subsequently divided and analyzed based on the type of surgical procedure: dermatologic, orthopedic, abdominal, colorectal, and cardiothoracic. Methods of local application analyzed included irrigations, powders, ointments, pastes, beads, sponges, and fleeces. Overall, there is a significant lack of level I evidence supporting this practice for any of the surgical genres evaluated. In addition, the literature spans several decades, and changes in surgical procedures, systemic antibiotic prophylaxis, and microbial flora make conclusions difficult to determine. Based on available data, the efficacy of local antibiotic administration for the prophylaxis of surgical site infections remains uncertain, and recommendations supporting this practice for surgical site prophylaxis cannot be made. © 2012 Pharmacotherapy Publications, Inc.
Using single cell sequencing data to model the evolutionary history of a tumor.
Kim, Kyung In; Simon, Richard
2014-01-24
The introduction of next-generation sequencing (NGS) technology has made it possible to detect genomic alterations within tumor cells on a large scale. However, most applications of NGS show the genetic content of mixtures of cells. Recently developed single cell sequencing technology can identify variation within a single cell. Characterization of multiple samples from a tumor using single cell sequencing can potentially provide information on the evolutionary history of that tumor. This may facilitate understanding how key mutations accumulate and evolve in lineages to form a heterogeneous tumor. We provide a computational method to infer an evolutionary mutation tree based on single cell sequencing data. Our approach differs from traditional phylogenetic tree approaches in that our mutation tree directly describes temporal order relationships among mutation sites. Our method also accommodates sequencing errors. Furthermore, we provide a method for estimating the proportion of time from the earliest mutation event of the sample to the most recent common ancestor of the sample of cells. Finally, we discuss current limitations on modeling with single cell sequencing data and possible improvements under those limitations. Inferring the temporal ordering of mutational sites using current single cell sequencing data is a challenge. Our proposed method may help elucidate relationships among key mutations and their role in tumor progression.
Simulation of a model nanopore sensor: Ion competition underlies device behavior.
Mádai, Eszter; Valiskó, Mónika; Dallos, András; Boda, Dezső
2017-12-28
We study a model nanopore sensor with which a very low concentration of analyte molecules can be detected on the basis of the selective binding of the analyte molecules to the binding sites on the pore wall. The bound analyte ions partially replace the current-carrier cations in a thermodynamic competition. This competition depends both on the properties of the nanopore and the concentrations of the competing ions (through their chemical potentials). The output signal given by the device is the current reduction caused by the presence of the analyte ions. The concentration of the analyte ions can be determined through calibration curves. We model the binding site with the square-well potential and the electrolyte as charged hard spheres in an implicit background solvent. We study the system with a hybrid method in which we compute the ion flux with the Nernst-Planck (NP) equation coupled with the Local Equilibrium Monte Carlo (LEMC) simulation technique. The resulting NP+LEMC method is able to handle both strong ionic correlations inside the pore (including finite size of ions) and bulk concentrations as low as micromolar. We analyze the effect of bulk ion concentrations, pore parameters, binding site parameters, electrolyte properties, and voltage on the behavior of the device.
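The Nernst-Planck flux that the NP+LEMC scheme iterates to self-consistency can be sketched on a fixed concentration and potential profile; the grid, units, and profiles below are illustrative, not the paper's model parameters.

```python
import numpy as np

F, R, T = 96485.0, 8.314, 298.15  # Faraday constant, gas constant, temperature (K)

def np_flux(x, c, phi, D, z):
    """Nernst-Planck flux J(x) = -D * (dc/dx + z*F/(R*T) * c * dphi/dx)
    for one ionic species, using finite differences on grid x.
    Diffusion term: concentration gradient; drift term: electric field."""
    dcdx = np.gradient(c, x)
    dphidx = np.gradient(phi, x)
    return -D * (dcdx + z * F / (R * T) * c * dphidx)

# Illustrative profiles: uniform concentration, linear potential drop.
# For this case the diffusive term vanishes and the flux is constant.
x = np.linspace(0.0, 1.0, 101)
c = np.full_like(x, 10.0)
phi = -0.01 * x
J = np_flux(x, c, phi, D=1e-9, z=1)
```

In the device model, bound analyte ions lower the effective c of the current-carrier cation inside the pore, and the reduced flux is the measured signal.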
Health technology assessment: Off-site sterilization
Dehnavieh, Reza; Mirshekari, Nadia; Ghasemi, Sara; Goudarzi, Reza; Haghdoost, AliAkbar; Mehrolhassani, Mohammad Hossain; Moshkani, Zahra; Noori Hekmat, Somayeh
2016-01-01
Background: Every year, millions of dollars are spent to equip and maintain hospital sterilization centers, and Iran is no exception. It is therefore important to adopt more effective technologies and methods in the health system in order to improve effectiveness and reduce costs. This study was conducted to evaluate the technology of regional (off-site) sterilization centers. Methods: The study was carried out in four steps. In the first step, the safety and effectiveness of the technology were examined through a systematic review of the evidence. The second step evaluated the economic aspect of off-site sterilization technology, using data gathered from a systematic review of the literature on the technology and on the costs of off-site and on-site hospital sterilization. The third step collected experiences with the technology from selected hospitals around the world. In the last step, different aspects of the acceptance and use of this technology in Iran were evaluated. Results: Review of the selected articles indicated that the efficacy and effectiveness of this technology are confirmed. The results also showed that using this method is not economical in Iran. Conclusion: Given the available evidence and the cost analysis, and due to the shortage of the necessary infrastructure, installing off-site sterilization technology in hospitals is not currently feasible in Iran. However, this method could be used to provide sterilization services for clinics and outpatient centers. PMID:27390714
Abraham, Jared D.; Lucius, Jeffrey E.
2004-01-01
In order to study the distribution of water in the unsaturated zone and the potential for ground-water recharge near the Amargosa Desert Research Site (ADRS) south of Beatty, Nevada, the U.S. Geological Survey collected direct-current resistivity measurements along three profiles in May 2003 using an eight-channel resistivity imaging system. Resistivity data were collected along profiles across the ADRS, across a poorly incised (distributary) channel system of the Amargosa River southwest of the ADRS, and across a well-incised flood plain of the Amargosa River northwest of the ADRS. This report describes results of an initial investigation to estimate the distribution of water in the unsaturated zone and to evaluate the shallow subsurface stratigraphy near the ADRS. The geophysical method of dc resistivity was employed by using automated data collection with numerous electrodes. Cross sections of resistivity at the three field sites, produced by applying an inversion algorithm to the field data, are presented and interpreted.
EVALUATION OF BACTERIOLOGICAL INDICATORS OF BIOSOLIDS DISINFECTION
Under the current regulations (CFR 503), Class B biosolids may be land applied with certain site restrictions. One method for achieving Class B status is to raise the pH of the sludge to >12 for a minimum of 2 hours with an alkaline material (normally lime). Alternately, a Clas...
Droplet digital PCR quantifies host inflammatory transcripts in feces reliably and reproducibly
USDA-ARS?s Scientific Manuscript database
The gut is the most extensive, interactive, and complex interface between the human host and the environment and therefore a critical site of immunological activity. Non-invasive methods to assess the host response in this organ are currently lacking. Feces are the available analyte which have been ...
Plotting landscape perspectives of clearcut units.
Roger H. Twito
1978-01-01
The aesthetics of clearcut units potentially visible from a distance can be improved by judicious design. Screening clearcuts from view or shaping them to characterize natural openings are current methods used for this purpose. Perspective plots illustrating how proposed clearcut units will look from specific off-site viewing points provide an...
Copper clusters capture and convert carbon dioxide to make fuel | Argonne
…copper cluster sites: the current method of reduction creates high-pressure conditions to facilitate stronger bonds.
ERIC Educational Resources Information Center
Levin, Sarah
This paper describes a method for designing, implementing, and evaluating a work-site physical activity campaign aimed at employees who are currently sedentary in their leisure time. Inactivity is a major but modifiable risk factor for coronary heart disease. Increasing the activity levels of underactive adults would have a positive impact on…
Current Treatment of Lower Gastrointestinal Hemorrhage
Raphaeli, Tal; Menon, Raman
2012-01-01
Massive lower gastrointestinal bleeding is a significant and expensive problem that requires methodical evaluation, management, and treatment. After initial resuscitation, care should be taken to localize the site of bleeding. Once localized, lesions can then be treated with endoscopic or angiographic interventions, reserving surgery for ongoing or recurrent bleeding. PMID:24294124
Wilson, Charles A; Matthews, Kennith; Pulsipher, Allan; Wang, Wei-Hsung
2016-02-01
Radioactive waste is an inevitable product of using radioactive material in education and research activities, medical applications, energy generation, and weapons production. Low-level radioactive waste (LLW) makes up a majority of the radioactive waste produced in the United States. In 2010, over two million cubic feet of LLW were shipped to disposal sites. Despite efforts from several states and compacts as well as from private industry, the options for proper disposal of LLW remain limited. New methods for quickly identifying potential storage locations could alleviate current challenges and eventually provide additional sites and allow for adequate regional disposal of LLW. Furthermore, these methods need to be designed so that they are easily communicated to the public. A Geographic Information Systems (GIS) based method was developed to determine suitability of potential LLW disposal (or storage) sites. Criteria and other parameters of suitability were based on the Code of Federal Regulation (CFR) requirements as well as supporting literature and reports. The resultant method was used to assess areas suitable for further evaluation as prospective disposal sites in Louisiana. Criteria were derived from the 10 minimum requirements in 10 CFR Part 61.50, the Nuclear Regulatory Commission's Regulatory Guide 0902, and studies at existing disposal sites. A suitability formula was developed permitting the use of weighting factors and normalization of all criteria. Data were compiled into GIS data sets and analyzed on a cell grid of approximately 14,000 cells (covering 181,300 square kilometers) using the suitability formula. Requirements were analyzed for each cell using multiple criteria/sub-criteria as well as surrogates for unavailable datasets. Additional criteria were also added when appropriate. 
The method designed in this project proved sufficient for initial screening in determining the most suitable areas for prospective disposal (or storage) sites. The 90%, 95%, and 99% suitability levels contain 404, 88, and 4 cells, respectively, that are suitable for further analysis. With these areas identified, the next step in siting a LLW storage facility would be on-site analysis using additional requirements as specified by relevant regulatory guidelines. The GIS-based method provides an easy, economical, efficient, and effective means of evaluating potential sites for LLW storage facilities where sufficient GIS data exist.
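A weighted-sum suitability score of the kind described above can be sketched as follows; the criterion names, weights, and min-max normalization are assumptions for illustration, not the study's exact formula or its 10 CFR 61.50-derived criteria.

```python
import numpy as np

def suitability(criteria, weights):
    """Per-cell suitability score, scaled to 0-100%.

    criteria: dict of name -> array of raw scores per grid cell
              (higher = more suitable)
    weights:  dict of name -> relative importance
    Each criterion is min-max normalized to [0, 1] before weighting,
    so incommensurate units can be combined.
    """
    names = sorted(criteria)
    w = np.array([weights[n] for n in names], dtype=float)
    w /= w.sum()  # weights sum to 1
    score = 0.0
    for n, wi in zip(names, w):
        v = np.asarray(criteria[n], dtype=float)
        spread = v.max() - v.min()
        norm = (v - v.min()) / spread if spread else np.ones_like(v)
        score = score + wi * norm
    return 100.0 * score

# Hypothetical three-cell grid scored on two equally weighted criteria.
crit = {"geology": np.array([0.0, 5.0, 10.0]),
        "hydrology": np.array([10.0, 0.0, 10.0])}
wts = {"geology": 1.0, "hydrology": 1.0}
scores = suitability(crit, wts)
```

Cells scoring above a chosen cutoff (e.g., 90%) would then be flagged for the on-site analysis stage described in the abstract.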
Barord, Gregory J; Dooley, Frederick; Dunstan, Andrew; Ilano, Anthony; Keister, Karen N; Neumeister, Heike; Preuss, Thomas; Schoepfer, Shane; Ward, Peter D
2014-01-01
The extant species of Nautilus and Allonautilus (Cephalopoda) inhabit fore-reef slope environments across a large geographic area of the tropical western Pacific and eastern Indian Oceans. While many aspects of their biology and behavior are now well-documented, uncertainties concerning their current populations and ecological role in the deeper, fore-reef slope environments remain. Given the historical to current-day presence of nautilus fisheries at various locales across the Pacific and Indian Oceans, a comparative assessment of the current state of nautilus populations is critical to determine whether conservation measures are warranted. We used baited remote underwater video systems (BRUVS) to make quantitative photographic records as a means of estimating population abundance of Nautilus sp. at sites in the Philippine Islands, American Samoa, Fiji, and along an approximately 125 km transect on the fore-reef slope of the Great Barrier Reef from east of Cairns to east of Lizard Island, Australia. Each site was selected based on its geography, historical abundance, and the presence (Philippines) or absence (other sites) of Nautilus fisheries. The results from these observations indicate that there are significantly fewer nautiluses observable with this method at the Philippine Islands site. While there may be multiple explanations for this difference, the most parsimonious is that the Philippine Islands population has been reduced by fishing. When compared to historical trap records from the same site, the data suggest there were far more nautiluses at this site in the past. The BRUVS proved to be a valuable tool to measure Nautilus abundance in the deep sea (300-400 m) while reducing our overall footprint on the environment.
Current perspective on assessing site of digestion in ruminants.
Merchen, N R; Elizalde, J C; Drackley, J K
1997-08-01
The site of nutrient digestion in the gastrointestinal tract of ruminants affects nutrient availability and the nature of digestive end-products supplied to the host animal. Methods to study site of digestion in vivo provide a tool to obtain data that enhance the ability to interpret or predict animal responses to some feeding practices. Examples are discussed for which site of digestion data provided insights that might not have been evident from other approaches. The use of site of digestion techniques may provide interpretation regarding digestion of N by ruminants different from those derived from measurement of total tract N digestion. Site of digestion measurements have been particularly important in studying effects of heat processing of protein sources and in understanding the nature of digestion of N in high-quality, fresh forages. Moreover, site of digestion techniques have been instrumental in interpreting the influences of supplemental fat sources on ruminal digestion and ruminal biohydrogenation and small intestinal digestion of long-chain fatty acids.
Luo, Jake; Chen, Weiheng; Wu, Min; Weng, Chunhua
2017-12-01
Prior studies of clinical trial planning indicate that it is crucial to search and screen recruitment sites before starting to enroll participants. However, currently there is no systematic method developed to support clinical investigators to search candidate recruitment sites according to their interested clinical trial factors. In this study, we aim at developing a new approach to integrating the location data of over one million heterogeneous recruitment sites that are stored in clinical trial documents. The integrated recruitment location data can be searched and visualized using a map-based information retrieval method. The method enables systematic search and analysis of recruitment sites across a large amount of clinical trials. The location data of more than 1.4 million recruitment sites of over 183,000 clinical trials was normalized and integrated using a geocoding method. The integrated data can be used to support geographic information retrieval of recruitment sites. Additionally, the information of over 6000 clinical trial target disease conditions and close to 4000 interventions was also integrated into the system and linked to the recruitment locations. Such data integration enabled the construction of a novel map-based query system. The system will allow clinical investigators to search and visualize candidate recruitment sites for clinical trials based on target conditions and interventions. The evaluation results showed that the coverage of the geographic location mapping for the 1.4 million recruitment sites was 99.8%. The evaluation of 200 randomly retrieved recruitment sites showed that the correctness of geographic information mapping was 96.5%. The recruitment intensities of the top 30 countries were also retrieved and analyzed. The data analysis results indicated that the recruitment intensity varied significantly across different countries and geographic areas. 
This study contributed a new data processing framework to extract and integrate the location data of heterogeneous recruitment sites from clinical trial documents. The developed system can support effective retrieval and analysis of potential recruitment sites using target clinical trial factors. Copyright © 2017 Elsevier B.V. All rights reserved.
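A minimal sketch of the normalization-plus-lookup step behind geocoding free-text recruitment-site locations; the gazetteer entries and site strings below are hypothetical, and a production system of the scale described would call an external geocoding service rather than a lookup table.

```python
# Hypothetical mini-gazetteer mapping canonical location strings to
# (latitude, longitude); stands in for a real geocoding service.
GAZETTEER = {
    "boston, ma, united states": (42.36, -71.06),
    "lyon, france": (45.76, 4.84),
}

def normalize(site):
    """Canonicalize a free-text recruitment-site location string:
    lowercase, unify separators, collapse whitespace."""
    return " ".join(site.lower().replace(";", ",").split())

def geocode_sites(raw_sites):
    """Map each raw site string to coordinates; report coverage,
    analogous to the 99.8% mapping coverage evaluated in the study."""
    coords, misses = {}, []
    for s in raw_sites:
        key = normalize(s)
        if key in GAZETTEER:
            coords[s] = GAZETTEER[key]
        else:
            misses.append(s)
    coverage = len(coords) / len(raw_sites)
    return coords, coverage

coords, coverage = geocode_sites(
    ["Boston, MA,  United States", "LYON, France", "Atlantis"])
```

Once geocoded, the coordinates can be binned by country or region to compute the recruitment-intensity comparisons reported in the abstract.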
3D Model Generation from UAV: Historical Mosque (Masjid Lama Nilai)
NASA Astrophysics Data System (ADS)
Nasir, N. H. Mohd; Tahar, K. N.
2017-08-01
Preserving cultural heritage and historic sites is an important issue. These sites are subject to erosion and vandalism, and, as long-lived artifacts, they have gone through many phases of construction, damage, and repair. It is important to keep an accurate record of these sites as they currently are using 3D model-building technology, so that preservationists can track changes, foresee structural problems, and allow a wider audience to "virtually" see and tour these sites. Due to the complexity of these sites, building 3D models is time-consuming and difficult, usually involving much manual effort. This study discusses methods that can reduce modeling time by using an unmanned aerial vehicle (UAV), and aims to develop a 3D model of a historical mosque using UAV photogrammetry. To achieve this, imagery of Masjid Lama Nilai, Negeri Sembilan, was captured using a UAV. In addition, an accuracy assessment between the actual and measured values is made, and a comparison between the rendered 3D model and the textured 3D model is also carried out.
SPEER-SERVER: a web server for prediction of protein specificity determining sites
Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J.; Panchenko, Anna R.; Chakrabarti, Saikat
2012-01-01
Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids’ Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/. PMID:22689646
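A simplified stand-in for the subfamily-comparison idea behind SDS scoring: symmetrized Kullback-Leibler divergence between the residue distributions of two subfamilies at one alignment column. This is illustrative only and is not SPEER's actual combined score, which also weighs physico-chemical properties and evolutionary rates.

```python
import math
from collections import Counter

def column_divergence(col_a, col_b, alphabet="ACDEFGHIKLMNPQRSTVWY", pseudo=1.0):
    """Symmetrized KL divergence between two subfamilies' residue
    distributions at one alignment column, with pseudocounts.

    A large value flags a candidate specificity-determining site:
    the column is conserved differently in the two subfamilies."""
    def dist(col):
        counts = Counter(col)
        total = len(col) + pseudo * len(alphabet)
        return {a: (counts.get(a, 0) + pseudo) / total for a in alphabet}
    p, q = dist(col_a), dist(col_b)
    return sum(p[a] * math.log(p[a] / q[a]) + q[a] * math.log(q[a] / p[a])
               for a in alphabet)

# A column conserved as K in one subfamily but D in the other scores high;
# a column conserved identically in both scores zero.
sds_like = column_divergence("KKKKK", "DDDDD")
conserved = column_divergence("KKKKK", "KKKKK")
```

Columns are then ranked by such scores, and the top-ranked positions become SDS candidates for experimental follow-up.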
Recurrent bottlenecks in the malaria life cycle obscure signals of positive selection.
Chang, Hsiao-Han; Hartl, Daniel L
2015-02-01
Detecting signals of selection in the genome of malaria parasites is a key to identify targets for drug and vaccine development. Malaria parasites have a unique life cycle alternating between vector and host organism with a population bottleneck at each transition. These recurrent bottlenecks could influence the patterns of genetic diversity and the power of existing population genetic tools to identify sites under positive selection. We therefore simulated the site-frequency spectrum of a beneficial mutant allele through time under the malaria life cycle. We investigated the power of current population genetic methods to detect positive selection based on the site-frequency spectrum as well as temporal changes in allele frequency. We found that a within-host selective advantage is difficult to detect using these methods. Although a between-host transmission advantage could be detected, the power is decreased when compared with the classical Wright-Fisher (WF) population model. Using an adjusted null site-frequency spectrum that takes the malaria life cycle into account, the power of tests based on the site-frequency spectrum to detect positive selection is greatly improved. Our study demonstrates the importance of considering the life cycle in genetic analysis, especially in parasites with complex life cycles.
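The life-cycle effect studied above can be sketched with a toy Wright-Fisher simulation that inserts a severe bottleneck at each transmission cycle; all parameters below are illustrative, not fitted to malaria biology.

```python
import numpy as np

def wf_with_bottlenecks(p0=0.05, s=0.05, n_large=10_000, n_bottleneck=10,
                        cycles=50, gens_per_cycle=5, seed=1):
    """Frequency trajectory of a beneficial allele under binomial
    Wright-Fisher sampling, with a recurrent bottleneck ending each
    'cycle' (mimicking a host/vector transition).

    The bottleneck generation resamples only n_bottleneck individuals,
    amplifying drift and obscuring the signal of selection."""
    rng = np.random.default_rng(seed)
    p = p0
    traj = [p]
    for _ in range(cycles):
        for _ in range(gens_per_cycle):
            w = p * (1 + s) / (p * (1 + s) + (1 - p))  # selection step
            p = rng.binomial(n_large, w) / n_large      # drift, large N
        p = rng.binomial(n_bottleneck, p) / n_bottleneck  # bottleneck
        traj.append(p)
        if p in (0.0, 1.0):  # allele lost or fixed
            break
    return np.array(traj)

traj = wf_with_bottlenecks()
```

Comparing such trajectories against a classical Wright-Fisher run (set n_bottleneck = n_large) shows how recurrent bottlenecks distort the site-frequency spectrum that selection tests rely on.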
SPEER-SERVER: a web server for prediction of protein specificity determining sites.
Chakraborty, Abhijit; Mandloi, Sapan; Lanczycki, Christopher J; Panchenko, Anna R; Chakrabarti, Saikat
2012-07-01
Sites that show specific conservation patterns within subsets of proteins in a protein family are likely to be involved in the development of functional specificity. These sites, generally termed specificity determining sites (SDS), might play a crucial role in binding to a specific substrate or proteins. Identification of SDS through experimental techniques is a slow, difficult and tedious job. Hence, it is very important to develop efficient computational methods that can more expediently identify SDS. Herein, we present Specificity prediction using amino acids' Properties, Entropy and Evolution Rate (SPEER)-SERVER, a web server that predicts SDS by analyzing quantitative measures of the conservation patterns of protein sites based on their physico-chemical properties and the heterogeneity of evolutionary changes between and within the protein subfamilies. This web server provides an improved representation of results, adds useful input and output options and integrates a wide range of analysis and data visualization tools when compared with the original standalone version of the SPEER algorithm. Extensive benchmarking finds that SPEER-SERVER exhibits sensitivity and precision performance that, on average, meets or exceeds that of other currently available methods. SPEER-SERVER is available at http://www.hpppi.iicb.res.in/ss/.
NASA Astrophysics Data System (ADS)
Zdeb, T. F.
2012-12-01
In California, it is often challenging both to prepare a Perimeter Air Monitoring Plan that provides the essential information and evaluation methods needed to assure that the health of the surrounding community is adequately protected, and to adapt existing Cal/OSHA regulations to the protection of workers at sites involving the excavation of Naturally Occurring Asbestos (NOA). Current guidelines on what constitutes an effective air monitoring program often lack detail on what should be sampled and analyzed to characterize a site and on which evaluation techniques should be applied to the monitoring results, while the current Cal/OSHA asbestos regulations on worker protection pertain largely to the abatement of asbestos in buildings. An overview of the essential components of an effective Baseline and Perimeter Air Monitoring Plan will be presented, including a brief discussion of the asbestos types and fiber sizes that may need to be considered, possible approaches for evaluating temporal and spatial variability, a review of selected site boundary target concentrations, and consideration of the potential for airborne dust and asbestos-containing soil (and other contaminants) to migrate and accumulate offsite, eventually contributing to "background creep": the incremental increase of overall airborne asbestos concentrations in the areas surrounding the site due to re-entrainment of asbestos from settled dust and/or transported soil. In addition, the current Cal/OSHA asbestos regulations related to worker protection will be briefly discussed with respect to their relevance at NOA sites, with an overview of the adaptations to the regulations that were developed through lengthy discussions with representatives of Cal/OSHA.
These adaptations include, among other things, defining how regulated areas (asbestos concentrations over 1%) and "provisionally regulated" areas (less than 1%) are established and treated, what variables need to be considered when attempting to complete a personal exposure assessment, Cal/OSHA Permissible Exposure Limits (PELs) versus site-specific permissible exposure limits, allowable work practices, and the qualifications of personnel performing the sampling and analyses of data.
2013-01-01
Background Estimating the size of forcibly displaced populations is key to documenting their plight and allocating sufficient resources to their assistance, but is often not done, particularly during the acute phase of displacement, due to methodological challenges and inaccessibility. In this study, we explored the potential use of very high resolution satellite imagery to remotely estimate forcibly displaced populations. Methods Our method consisted of multiplying (i) manual counts of assumed residential structures on a satellite image and (ii) estimates of the mean number of people per structure (structure occupancy) obtained from publicly available reports. We computed population estimates for 11 sites in Bangladesh, Chad, Democratic Republic of Congo, Ethiopia, Haiti, Kenya and Mozambique (six refugee camps, three internally displaced persons’ camps and two urban neighbourhoods with a mixture of residents and displaced) ranging in population from 1,969 to 90,547, and compared these to “gold standard” reference population figures from census or other robust methods. Results Structure counts by independent analysts were reasonably consistent. Between one and 11 occupancy reports were available per site and most of these reported people per household rather than per structure. The imagery-based method had a precision relative to reference population figures of <10% in four sites and 10–30% in three sites, but severely over-estimated the population in an Ethiopian camp with implausible occupancy data and two post-earthquake Haiti sites featuring dense and complex residential layout. For each site, estimates were produced in 2–5 working person-days. Conclusions In settings with clearly distinguishable individual structures, the remote, imagery-based method had reasonable accuracy for the purposes of rapid estimation, was simple and quick to implement, and would likely perform better in more current application. 
However, it may have insurmountable limitations in settings featuring connected buildings or shelters, a complex pattern of roofs and multi-level buildings. Based on these results, we discuss possible ways forward for the method’s development. PMID:23343099
Checchi, Francesco; Stewart, Barclay T; Palmer, Jennifer J; Grundy, Chris
2013-01-23
Estimating the size of forcibly displaced populations is key to documenting their plight and allocating sufficient resources to their assistance, but is often not done, particularly during the acute phase of displacement, due to methodological challenges and inaccessibility. In this study, we explored the potential use of very high resolution satellite imagery to remotely estimate forcibly displaced populations. Our method consisted of multiplying (i) manual counts of assumed residential structures on a satellite image and (ii) estimates of the mean number of people per structure (structure occupancy) obtained from publicly available reports. We computed population estimates for 11 sites in Bangladesh, Chad, Democratic Republic of Congo, Ethiopia, Haiti, Kenya and Mozambique (six refugee camps, three internally displaced persons' camps and two urban neighbourhoods with a mixture of residents and displaced) ranging in population from 1,969 to 90,547, and compared these to "gold standard" reference population figures from census or other robust methods. Structure counts by independent analysts were reasonably consistent. Between one and 11 occupancy reports were available per site and most of these reported people per household rather than per structure. The imagery-based method had a precision relative to reference population figures of <10% in four sites and 10-30% in three sites, but severely over-estimated the population in an Ethiopian camp with implausible occupancy data and two post-earthquake Haiti sites featuring dense and complex residential layout. For each site, estimates were produced in 2-5 working person-days. In settings with clearly distinguishable individual structures, the remote, imagery-based method had reasonable accuracy for the purposes of rapid estimation, was simple and quick to implement, and would likely perform better in more current application. 
However, it may have insurmountable limitations in settings featuring connected buildings or shelters, a complex pattern of roofs and multi-level buildings. Based on these results, we discuss possible ways forward for the method's development.
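The core arithmetic of the imagery-based method (structure count multiplied by mean occupancy) can be sketched as follows; the figures in the example are invented for illustration and are not taken from the study sites.

```python
def estimate_population(structure_count, occupancy_reports):
    """Population = structures counted on imagery x mean people per
    structure, with a crude range from the min/max occupancy reports."""
    mean_occ = sum(occupancy_reports) / len(occupancy_reports)
    point = structure_count * mean_occ
    bounds = (structure_count * min(occupancy_reports),
              structure_count * max(occupancy_reports))
    return point, bounds

# Hypothetical camp: 1,000 visible structures, three occupancy reports.
print(estimate_population(1000, [4.0, 5.0, 6.0]))
# → (5000.0, (4000.0, 6000.0))
```

As the abstract notes, the weak link is the occupancy factor: implausible occupancy data propagate directly into the estimate.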
A multi-landing pad DNA integration platform for mammalian cell engineering
Gaidukov, Leonid; Wroblewska, Liliana; Teague, Brian; Nelson, Tom; Zhang, Xin; Liu, Yan; Jagtap, Kalpana; Mamo, Selamawit; Tseng, Wen Allen; Lowe, Alexis; Das, Jishnu; Bandara, Kalpanie; Baijuraj, Swetha; Summers, Nevin M; Zhang, Lin; Weiss, Ron
2018-01-01
Engineering mammalian cell lines that stably express many transgenes requires the precise insertion of large amounts of heterologous DNA into well-characterized genomic loci, but current methods are limited. To facilitate reliable large-scale engineering of CHO cells, we identified 21 novel genomic sites that supported stable long-term expression of transgenes, and then constructed cell lines containing one, two or three ‘landing pad’ recombination sites at selected loci. By using a highly efficient BxB1 recombinase along with different selection markers at each site, we directed recombinase-mediated insertion of heterologous DNA to selected sites, including targeting all three with a single transfection. We used this method to controllably integrate up to nine copies of a monoclonal antibody, representing about 100 kb of heterologous DNA in 21 transcriptional units. Because the integration was targeted to pre-validated loci, recombinant protein expression remained stable for weeks and additional copies of the antibody cassette in the integrated payload resulted in a linear increase in antibody expression. Overall, this multi-copy site-specific integration platform allows for controllable and reproducible insertion of large amounts of DNA into stable genomic sites, which has broad applications for mammalian synthetic biology, recombinant protein production and biomanufacturing. PMID:29617873
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgans, D. L.; Lindberg, S. L.
The purpose of this technical approach document (TAD) is to document the assumptions, equations, and methods used to perform the groundwater pathway radiological dose calculations for the revised Hanford Site Composite Analysis (CA). DOE M 435.1-1 states: “The composite analysis results shall be used for planning, radiation protection activities, and future use commitments to minimize the likelihood that current low-level waste disposal activities will result in the need for future corrective or remedial actions to adequately protect the public and the environment.”
Facilities and Methods Used in Full-scale Airplane Crash-fire Investigation
NASA Technical Reports Server (NTRS)
Black, Dugald O.
1952-01-01
The facilities and the techniques employed in the full-scale airplane crash-fire studies currently being conducted at the NACA Lewis laboratory are discussed herein. This investigation is part of a comprehensive study of the airplane crash-fire problem. The crash configuration chosen, the general physical layout of the crash site, the test methods, the instrumentation, the data-recording systems, and the post-crash examination procedure are described.
DeepSig: deep learning improves signal peptide detection in proteins.
Savojardo, Castrense; Martelli, Pier Luigi; Fariselli, Piero; Casadio, Rita
2018-05-15
The identification of signal peptides in protein sequences is an important step toward protein localization and function characterization. Here, we present DeepSig, an improved approach for signal peptide detection and cleavage-site prediction based on deep learning methods. Comparative benchmarks performed on an updated independent dataset of proteins show that DeepSig is the current best performing method, scoring better than other available state-of-the-art approaches on both signal peptide detection and precise cleavage-site identification. DeepSig is available as both a standalone program and a web server at https://deepsig.biocomp.unibo.it. All datasets used in this study can be obtained from the same website. Contact: pierluigi.martelli@unibo.it. Supplementary data are available at Bioinformatics online.
Kaya, M Ali; Ozürlan, Gülçin; Sengül, Ebru
2007-12-01
Direct current (DC) resistivity, self potential (SP) and very low frequency electromagnetic (VLF-EM) measurements were carried out to detect the spread of groundwater contamination and to locate possible pathways of leachate plumes that resulted from an open waste disposal site of the Canakkale municipality. There is no proper management of the waste disposal site, in which industrial and domestic wastes were improperly dumped. Furthermore, because the dumpsite is located at the catchment borders of a small creek and sits topographically higher than the urban area, the groundwater is expected to be hazardously contaminated. Interpretation of the DC resistivity geoelectrical data showed a low-resistivity zone (<5 ohm-m), which appears to be fully saturated with leachate from the open dumpsite. The VLF-EM and SP methods support the results of the geoelectrical method, indicating a contaminated zone in the survey area. There is good correlation between the geophysical investigations and the results of previously collected geochemical and hydrochemical measurements.
NASA Technical Reports Server (NTRS)
Kerczewski, Robert J.; Ivancic, William D.; Zuzek, John E.
1991-01-01
The development of new space communications technologies by NASA has included both commercial applications and space science requirements. At NASA's Lewis Research Center, methods and facilities have been developed for evaluating these new technologies in the laboratory. NASA's Systems Integration, Test and Evaluation (SITE) Space Communication System Simulator is a hardware-based laboratory simulator for evaluating space communications technologies at the component, subsystem, system, and network level, geared toward high frequency, high data rate systems. The SITE facility is well-suited for evaluation of the new technologies required for the Space Exploration Initiative (SEI) and advanced commercial systems. This paper describes the technology developments and evaluation requirements for current and planned commercial and space science programs. Also examined are the capabilities of SITE, the past, present, and planned future configurations of the SITE facility, and applications of SITE to evaluation of SEI technology.
Wilson, Mark G.; Goetzel, Ron Z.; Ozminkowski, Ronald J.; DeJoy, Dave M.; Della, Lindsay; Roemer, Enid Chung; Schneider, Jennifer; Tully, Karen J.; White, John M.; Baase, Catherine M.
2010-01-01
Objective This paper presents the formative research phase of a large multi-site intervention study conducted to inform the feasibility of introducing environmental and ecological interventions. Methods Using mixed methods that included an environmental assessment, climate survey, leadership focus groups and interviews, and archival data, information was collected on employee health and job factors, the physical environment, social-organizational environment, and current health programs. Results Results show that 83% of employees at the study sites were overweight or obese. Leadership was very supportive of health initiatives and felt integrating the strategies into organizational operations would increase their likelihood of success. Environmental assessment scores ranged from 47 to 19 on a 100 point scale. Health services personnel tended to view the organizational climate for health more positively than site leadership (mean of 3.6 vs 3.0 respectively). Conclusions Intervention strategies chosen included increasing healthy food choices in vending, cafeterias, and company meetings, providing a walking path, targeting messages, developing site goals, training leaders, and establishing leaders at the work group level. PMID:18073340
Axono-cortical evoked potentials: A proof-of-concept study.
Mandonnet, E; Dadoun, Y; Poisson, I; Madadaki, C; Froelich, S; Lozeron, P
2016-04-01
Awake surgery is currently considered the best method to tailor intraparenchymatous resections according to functional boundaries. However, the exact mechanisms by which electrical stimulation disturbs behavior remain largely unknown. In this case report, we describe a new method to explore the propagation toward cortical sites of a brief pulse applied to an eloquent white matter pathway. We present a patient operated on in awake condition for removal of a cavernoma of the left ventral premotor cortex. At the end of the resection, the application of 60 Hz stimulation in the white matter of the operculum induced anomia. Stimulating the same site at a frequency of 1 Hz for 70 seconds allowed responses to be recorded from electrodes placed over Broca's area and around the inferior part of the central sulcus. Axono-cortical evoked potentials were then obtained by averaging unitary responses time-locked to the stimulus. We then discuss the origin of these axono-cortical evoked potentials and the likely pathway connecting the stimulation site to the recorded cortical sites.
Grimm, Amanda G.; Brooks, Colin N.; Binder, Thomas R.; Riley, Stephen C.; Farha, Steve A.; Shuchman, Robert A.; Krueger, Charles C.
2016-01-01
The availability and quality of spawning habitat may limit lake trout recovery in the Great Lakes, but little is known about the location and characteristics of current spawning habitats. Current methods used to identify lake trout spawning locations are time- and labor-intensive and spatially limited. Due to the observation that some lake trout spawning sites are relatively clean of overlaying algae compared to areas not used for spawning, we suspected that spawning sites could be identified using satellite imagery. Satellite imagery collected just before and after the spawning season in 2013 was used to assess whether lake trout spawning habitat could be identified based on its spectral characteristics. Results indicated that Pléiades high-resolution multispectral satellite imagery can be successfully used to estimate algal coverage of substrates and temporal changes in algal coverage, and that models developed from processed imagery can be used to identify potential lake trout spawning sites based on comparison of sites where lake trout eggs were and were not observed after spawning. Satellite imagery is a potential new tool for identifying lake trout spawning habitat at large scales in shallow nearshore areas of the Great Lakes.
Application of hidden Markov models to biological data mining: a case study
NASA Astrophysics Data System (ADS)
Yin, Michael M.; Wang, Jason T.
2000-04-01
In this paper we present an example of biological data mining: the detection of splicing junction acceptors in eukaryotic genes. Identification or prediction of transcribed sequences from within genomic DNA has been a major rate-limiting step in the pursuit of genes. Programs currently available are far from being powerful enough to elucidate the gene structure completely. Here we develop a hidden Markov model (HMM) to represent the degeneracy features of splicing junction acceptor sites in eukaryotic genes. The HMM system is fully trained using an expectation maximization (EM) algorithm and the system performance is evaluated using the 10-way cross-validation method. Experimental results show that our HMM system can correctly classify more than 94% of the candidate sequences (including true and false acceptor sites) into the right categories. About 90% of the true acceptor sites and 96% of the false acceptor sites in the test data are classified correctly. These results are very promising considering that only the local information in DNA is used. The proposed model will be a very important component of an effective and accurate gene structure detection system currently being developed in our lab.
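As a greatly simplified stand-in for the trained HMM, the degeneracy of an acceptor-site window can be captured by per-position log-probabilities, and candidate sequences classified by comparing scores under a "true site" model and a "false site" model. The training windows and function names below are invented for illustration; the paper's actual model is an EM-trained HMM, not this position-specific score.

```python
import math

def train_model(windows, alphabet="ACGT", pseudo=1.0):
    """Per-position log-probabilities from aligned training windows
    (a position-specific score model, far simpler than a full HMM)."""
    model = []
    for i in range(len(windows[0])):
        counts = {b: pseudo for b in alphabet}   # pseudocounts avoid log(0)
        for w in windows:
            counts[w[i]] += 1
        total = sum(counts.values())
        model.append({b: math.log(c / total) for b, c in counts.items()})
    return model

def log_score(model, window):
    """Log-likelihood of a window under the position-specific model."""
    return sum(col[b] for col, b in zip(model, window))

def is_acceptor(window, true_model, false_model):
    """Classify by which model assigns the higher log-likelihood."""
    return log_score(true_model, window) > log_score(false_model, window)
```

True acceptor sites in eukaryotes characteristically end in AG, so a model trained on such windows will out-score a background model at those positions.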
Keita, K; Camara, D; Barry, Y; Ossè, R; Wang, L; Sylla, M; Miller, D; Leite, L; Schopp, P; Lawrence, G G; Akogbéto, M; Dotson, E M; Guilavogui, T; Keita, M; Irish, S R
2017-05-01
Insecticide resistance is one of the primary threats to the recent gains in malaria control. This is especially true in Guinea, where long-lasting insecticidal nets are currently the primary vector control intervention. To better inform the national malaria control program on the current status of insecticide resistance in Guinea, resistance bioassays were conducted on Anopheles gambiae s.l. Giles in three sites. Molecular analyses were also performed on An. gambiae s.l. to determine the species present and whether the target-site mutations kdr and Ace1R occurred. Susceptibility tests revealed resistance to DDT and pyrethroids, although mosquitoes were susceptible to deltamethrin in two of the three sites tested. Mosquitoes were susceptible to bendiocarb, except in Kissidougou, Guinea. The kdr-west mutation was widespread, with a frequency of 60% or more in all sites, whereas the Ace1R mutation was present at low frequencies. Insecticide susceptibility should continue to be monitored in Guinea to ensure insecticide-based vector control methods remain effective.
Anophelism in a Former Malaria Area of Northeastern Spain
Bueno-Marí, Rubén; Jiménez-Peydró, Ricardo
2013-01-01
Background: A field study on the diversity and distribution of anophelines currently present in a formerly endemic malaria area of Spain was carried out in order to identify possible risk areas of local disease transmission. Methods: Multiple larval sites were sampled from June to October of 2011 in the Region of Somontano de Barbastro (Northeastern Spain). The sampling effort was fixed at 10 minutes of active larval searching in each biotope visited. Results: A total of 237 larval specimens belonging to four Anopheles species (Anopheles atroparvus, An. claviger, An. maculipennis and An. petragnani) were collected and identified. Conclusions: Malaria receptivity in the study area is high, especially in the Cinca river valley, due to the abundance of breeding sites of An. atroparvus very close to human settlements. Although current socio-economic conditions in Spain reduce the possibility of a re-emergence of malaria transmission, certain entomological and epidemiological vigilance must be maintained and even increased in the context of ongoing climate change and globalization. PMID:24409440
Zhao, Xue Jiao; Zhu, Guang; Fan, You Jun; Li, Hua Yang; Wang, Zhong Lin
2015-07-28
We report a flexible and area-scalable energy-harvesting technique for converting kinetic wave energy. Triboelectrification as a result of direct interaction between a dynamic wave and a large-area nanostructured solid surface produces an induced current among an array of electrodes. An integration method ensures that the induced current between any pair of electrodes can be constructively added up, which enables significant enhancement in output power and realizes area-scalable integration of electrode arrays. Internal and external factors that affect the electric output are comprehensively discussed. The produced electricity not only drives small electronics but also achieves effective impressed current cathodic protection. This type of thin-film-based device is a potentially practical solution of on-site sustained power supply at either coastal or off-shore sites wherever a dynamic wave is available. Potential applications include corrosion protection, pollution degradation, water desalination, and wireless sensing for marine surveillance.
Clean-ups at Aberdeen Proving Ground
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardenuto, R.A.
1994-12-31
The Department of Defense has utilized radioactive material in numerous applications over several decades. Aberdeen Proving Ground has been an integral player in the Army's research, development, and testing of items incorporating radionuclides, as well as in developing new and innovative applications. As new information becomes available and society progresses, we find that the best management practices used decades, or sometimes even years, earlier are inadequate to meet current demands. Aberdeen Proving Ground is committed to remediating historic disposal sites and to utilizing the best available technology in current operations to prevent future adverse impact. Two projects currently ongoing at Aberdeen Proving Ground illustrate these points. The first, the remediation of contaminated metal storage areas, shows how available technology has provided a means for recycling material, thereby preventing continued stockpiling and allowing for the decommissioning of the areas. The second, the 26th Street Disposal Site Removal Action, shows how historic methods of disposition were inadequate to meet today's needs.
What is the lifetime risk of developing cancer?: the effect of adjusting for multiple primaries
Sasieni, P D; Shelton, J; Ormiston-Smith, N; Thomson, C S; Silcocks, P B
2011-01-01
Background: The 'lifetime risk' of cancer is generally estimated by combining current incidence rates with current all-cause mortality (the 'current probability' method) rather than by describing the experience of a birth cohort. As individuals may get more than one type of cancer, what is generally estimated is the average (mean) number of cancers over a lifetime. This is not the same as the probability of getting cancer. Methods: We describe a method for estimating lifetime risk that corrects for the inclusion of multiple primary cancers in the incidence rates routinely published by cancer registries. The new method applies cancer incidence rates to the estimated probability of being alive without a previous cancer. The new method is illustrated using data from the Scottish Cancer Registry and is compared with 'gold-standard' estimates that use (unpublished) data on first primaries. Results: The effect of this correction is to make the estimated 'lifetime risk' smaller. The new estimates are extremely similar to those obtained using incidence based on first primaries. The usual 'current probability' method considerably overestimates the lifetime risk of all cancers combined, although the correction for any single cancer site is minimal. Conclusion: Estimation of the lifetime risk of cancer should either be based on first primaries or should use the new method. PMID:21772332
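The correction described in the Methods section (applying incidence only to the fraction of the cohort still alive and cancer-free) can be sketched numerically. The age-band rates below are invented for illustration, not registry data.

```python
def lifetime_risk_adjusted(incidence, other_mortality):
    """At each age band, first-cancer incidence acts only on the cohort
    fraction that is alive and has had no previous cancer, so multiple
    primaries cannot inflate the cumulative risk. Rates are per person
    per age band."""
    alive_cancer_free = 1.0
    risk = 0.0
    for inc, mort in zip(incidence, other_mortality):
        new_cases = alive_cancer_free * inc
        risk += new_cases
        alive_cancer_free = (alive_cancer_free - new_cases) * (1 - mort)
    return risk

# Two illustrative age bands with 50% incidence each and no other-cause
# mortality: a naive sum of rates gives 1.0, the adjusted risk 0.75.
print(lifetime_risk_adjusted([0.5, 0.5], [0.0, 0.0]))  # → 0.75
```

This mirrors the abstract's conclusion: the adjusted estimate is smaller because people already diagnosed no longer contribute new first cancers.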
NASA Technical Reports Server (NTRS)
Nichols, Jonathan E.; Peteet, Dorothy M.; Frolking, Steve; Karavias, John
2017-01-01
Arctic peatlands are an important part of the global carbon cycle, having accumulated atmospheric carbon as organic matter since the Late Glacial. Current methods for understanding the changing efficiency of the peatland carbon sink rely on peatlands with an undisturbed stratigraphy. Here we present a method for estimating the primary carbon accumulation rate at a site where permafrost processes have vertically or horizontally translocated nearby carbon-rich sediment out of stratigraphic order. Briefly, our new algorithm estimates the probability of the age of deposition of a random increment of sediment in the core. The method assumes that if sediment age is measured at even depth increments, dates are more likely to fall during intervals of higher accumulation rate, and vice versa. Multiplying the estimated sedimentation rate by the measured carbon density yields the carbon accumulation rate. We perform this analysis at the Imnavait Creek Peatland, near the Arctic Long Term Ecological Research network site at Toolik Lake, Alaska. Using classical radiocarbon age modeling, we find unreasonably high rates of carbon accumulation at various Holocene intervals. With our new method, we find accumulation rate changes that are in improved agreement with other sites throughout Alaska and the rest of the Circum-Arctic region.
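The final multiplication step (sedimentation rate times carbon density) is straightforward to sketch for a well-ordered core; the probabilistic age model for disturbed sediment is not reproduced here, and the depths, ages, and densities below are invented.

```python
def carbon_accumulation_rates(depths_cm, ages_yr, carbon_density_g_cm3):
    """For each interval between dated horizons, sedimentation rate
    (cm/yr) times carbon density (g C/cm^3) gives the carbon
    accumulation rate (g C/cm^2/yr)."""
    rates = []
    for i in range(1, len(depths_cm)):
        sed_rate = (depths_cm[i] - depths_cm[i - 1]) / (ages_yr[i] - ages_yr[i - 1])
        rates.append(sed_rate * carbon_density_g_cm3[i - 1])
    return rates

# Invented example: 10 cm in the first century, 20 cm in the second,
# at a constant 0.05 g C/cm^3.
rates = carbon_accumulation_rates([0, 10, 30], [0, 100, 200], [0.05, 0.05])
```

The point of the paper's algorithm is to supply a defensible sedimentation rate when the depth-age relationship itself is disturbed; the arithmetic above is the uncontroversial last step.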
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-27
.... NSCC believes it can offer a number of control improvements to the current manual, decentralized, and...: Electronic Comments Use the Commission's Internet comment form ( http://www.sec.gov/rules/sro.shtml ); or... one method. The Commission will post all comments on the Commission's Internet Web site ( http://www...
Alternate Methods of Effluent Disposal for On-Lot Home Sewage Systems. Special Circular 214.
ERIC Educational Resources Information Center
Wooding, N. Henry
This circular provides current information for homeowners who must repair or replace existing on-lot sewage disposal systems. Several alternatives such as elevated sand mounds, sand-lined beds and trenches and oversized absorption areas are discussed. Site characteristics and preparation are outlined. Each alternative is accompanied by a diagram…
Psychiatric Syndromes in Adolescents with Marijuana Abuse and Dependency in Outpatient Treatment
ERIC Educational Resources Information Center
Diamond, Guy; Panichelli-Mindel, Susan M.; Shera, David; Dennis, Mike; Tims, Frank; Ungemack, Jane
2006-01-01
Objective: The purpose of the current study was to assist in understanding the prevalence and clinical correlates of psychiatric distress in adolescents seeking outpatient services for marijuana abuse or dependency. Methods: In a multi-site randomized clinical trial, 600 adolescents and their parents were assessed at intake using the Global Appraisals…
Regenerating shortleaf pine in clearcuts in the Missouri Ozark Highlands
David Gwaze; Mark Johanson
2013-01-01
A shortleaf pine (Pinus echinata Mill.) regeneration study was established by the Missouri Department of Conservation in 1986 at the Current River Conservation Area. The objective of the study was to compare natural with artificial regeneration methods, and prescribed burning with bulldozing as site preparation, for shortleaf pine establishment and growth.
NASA Astrophysics Data System (ADS)
Wyjadłowski, Marek
2017-12-01
The constant development of geotechnical technologies imposes the necessity of monitoring techniques to ensure proper quality and the safe execution of geotechnical works. Several monitoring methods enable the preliminary design of the work process and ongoing control of hydrotechnical works (pile driving, sheet piling, ground improvement methods). Measurements of wave parameters and/or continuous histogram recording of shocks and vibrations, and of their dynamic impact on engineering structures in the close vicinity of the building site, enable the modification of technology parameters such as vibrator frequency or hammer drop height. Many examples of practical applications have already been published and provide a basis for formulating guidelines for work on subsequent sites. The present work describes the author's experience gained during sheet piling for the reconstruction of the City Channel in Wrocław (Poland). The examples chosen describe procedures for new and old residential buildings whose concrete or masonry walls were exposed to vibrations, and for hydrotechnical structures (sluices, bridges).
Simulation of Electromigration Based on Resistor Networks
NASA Astrophysics Data System (ADS)
Patrinos, Anthony John
A two-dimensional computer simulation of electromigration based on resistor networks was designed and implemented. The model utilizes a realistic grain structure generated by the Monte Carlo method and takes specific account of the local effects through which electromigration damage progresses. The dynamic evolution of the simulated thin film is governed by the local current and temperature distributions. The current distribution is calculated by superimposing a two-dimensional electrical network on the lattice, whose nodes correspond to the particles in the lattice and whose branches correspond to interparticle bonds. Current is assumed to flow from site to site via nearest-neighbor bonds. The current distribution problem is solved by applying Kirchhoff's rules to the resulting electrical network. The calculation of the temperature distribution in the lattice proceeds by discretizing the partial differential equation for heat conduction, with appropriate material parameters chosen for the lattice and its defects. SEReNe (for Simulation of Electromigration using Resistor Networks) was tested by applying it to common situations arising in experiments with real films, with satisfactory results. Specifically, the model successfully reproduces the expected grain size, line width and bamboo effects, the lognormal failure-time distribution and the relationship between current density exponent and current density. It has also been modified to simulate temperature-ramp experiments, but with mixed results in this case.
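The Kirchhoff step described above amounts to solving a nodal-analysis linear system G·v = i for the node voltages of the resistor network. The sketch below is illustrative only (plain Gauss-Jordan elimination on a tiny hand-built conductance matrix), not the SEReNe implementation.

```python
def solve_linear(a, b):
    """Gauss-Jordan elimination with partial pivoting (plain lists)."""
    n = len(a)
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[r][col] != 0.0:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def node_voltages(conductance_matrix, injected_currents):
    """Kirchhoff's current law at every non-ground node: G.v = i,
    where G is the nodal conductance matrix (ground eliminated)."""
    return solve_linear(conductance_matrix, injected_currents)

# Three 1-siemens bonds in a chain (ground-1-2-ground) with 1 A injected
# at node 1: nodal matrix [[2,-1],[-1,2]], solution v = [2/3, 1/3].
v = node_voltages([[2.0, -1.0], [-1.0, 2.0]], [1.0, 0.0])
```

The current in the bond from node 1 to node 2 is then (v[0] - v[1]) times its conductance; in a SEReNe-like model that per-bond current would feed the Joule-heating and damage rules.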
Bonizzoni, Paola; Rizzi, Raffaella; Pesole, Graziano
2005-10-05
Currently available methods to predict splice sites are mainly based on the independent and progressive alignment of transcript data (mostly ESTs) to the genomic sequence. Apart from often being computationally expensive, this approach is vulnerable to several problems; hence the need to develop novel strategies. We propose a method, based on a novel multiple genome-EST alignment algorithm, for the detection of splice sites. To avoid limitations of splice site prediction (mainly, over-predictions) due to independent single EST alignments to the genomic sequence, our approach performs a multiple alignment of transcript data to the genomic sequence based on the combined analysis of all available data. We recast the problem of predicting constitutive and alternative splicing as an optimization problem, where the optimal multiple transcript alignment minimizes the number of exons and hence of splice site observations. We have implemented a splice site predictor based on this algorithm in the software tool ASPIC (Alternative Splicing PredICtion). It is distinguished from other methods based on BLAST-like tools by the incorporation of entirely new ad hoc procedures for accurate and computationally efficient transcript alignment, and adopts dynamic programming for the refinement of intron boundaries. ASPIC also provides the minimal set of non-mergeable transcript isoforms compatible with the detected splicing events. The ASPIC web resource is dynamically interconnected with the Ensembl and Unigene databases and also implements an upload facility. Extensive benchmarking shows that ASPIC outperforms other existing methods in the detection of novel splicing isoforms and in the minimization of over-predictions. ASPIC also requires a lower computation time for processing a single gene and an EST cluster. The ASPIC web resource is available at http://aspic.algo.disco.unimib.it/aspic-devel/.
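The idea of minimizing the number of exon observations can be illustrated, in a heavily simplified form, by merging overlapping exon intervals reported by independent EST alignments into a minimal set of candidate exons. This toy reduction is not ASPIC's multiple-alignment algorithm, and the interval data below are invented.

```python
def minimal_exons(observations):
    """Merge overlapping exon intervals from many EST-to-genome
    alignments into a minimal set of candidate exons.

    Each observation is a (start, end) genomic interval; overlapping
    observations are assumed to support the same exon.  Sorting by start
    and sweeping left to right merges every overlap in one pass.
    """
    merged = []
    for start, end in sorted(observations):
        if merged and start <= merged[-1][1]:
            # Overlaps the last exon: extend it instead of adding a new one.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

# Five independent EST alignments collapse to three exons.
ests = [(100, 220), (150, 230), (400, 520), (410, 515), (700, 800)]
exons = minimal_exons(ests)
```

Independent single-EST alignment would report five exon observations here; the combined view reports three, which is the over-prediction effect the abstract describes.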
Orthodontic extrusion for pre-implant site enhancement: Principles and clinical guidelines.
Alsahhaf, Abdulaziz; Att, Wael
2016-07-01
The aim of this paper is to provide a concise overview of the principles of pre-implant orthodontic extrusion, describe the methods and techniques available, and provide clinicians with guidelines for its application. A number of reports describe orthodontic extrusion as a reliable method for pre-implant site enhancement. However, no standard protocols have been provided for the application of this technique. The literature database was searched for studies involving implant site enhancement by means of orthodontic extrusion. Information about the principles, indications and contraindications of this method, type of anchorage, force and time were obtained from the literature. Although the available data are scarce and largely limited to case reports and case series, implant site enhancement by means of orthodontic extrusion seems to be a promising option to improve soft and hard tissue conditions prior to implant placement. Orthodontic extrusion is being implemented as a treatment alternative to enhance hard and soft tissue prior to implant placement. While the current literature does not provide clear guidelines, the decision making for a specific approach seems to be based on the clinician's preferences. Clinical studies are needed to verify the validity of this treatment option. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vuichard, N.; Papale, D.
In this study, exchanges of carbon, water and energy between the land surface and the atmosphere are monitored by the eddy covariance technique at the ecosystem level. Currently, the FLUXNET database contains more than 500 registered sites, and up to 250 of them share data (free fair-use data set). Many modelling groups use the FLUXNET data set for evaluating ecosystem models' performance, but this requires uninterrupted time series for the meteorological variables used as input. Because original in situ data often contain gaps, from very short (a few hours) up to relatively long (some months) ones, we develop a new and robust method for filling the gaps in meteorological data measured at site level. Our approach has the benefit of making use of continuous data available globally (ERA-Interim) and a high temporal resolution spanning from 1989 to today. These data are, however, not measured at site level, and for this reason a method to downscale and correct the ERA-Interim data is needed. We apply this method to the level 4 data (L4) from the La Thuile collection, freely available after registration under a fair-use policy. The performance of the developed method varies across sites and is also a function of the meteorological variable. On average over all sites, applying the bias correction method to the ERA-Interim data reduced the mismatch with the in situ data by 10 to 36 %, depending on the meteorological variable considered. In comparison to the internal variability of the in situ data, the root mean square error (RMSE) between the in situ data and the unbiased ERA-I (ERA-Interim) data remains relatively large (on average over all sites, from 27 to 76 % of the standard deviation of in situ data, depending on the meteorological variable considered). The performance of the method remains poor for the wind speed field, in particular regarding its capacity to conserve a standard deviation similar to the one measured at FLUXNET stations.
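The downscaling idea can be sketched with a simple variance-scaling bias correction that maps a reanalysis series onto the in situ mean and standard deviation, then measures the RMSE reduction. This is a generic illustration of the concept, not necessarily the paper's actual correction method, and the sample series below are invented.

```python
import math

def bias_correct(reanalysis, in_situ):
    """Rescale a reanalysis series to the in situ mean and standard
    deviation -- a simple variance-scaling bias correction, shown only
    to illustrate the idea of unbiasing gridded data against a site."""
    n = len(in_situ)
    mu_r = sum(reanalysis) / n
    mu_s = sum(in_situ) / n
    sd_r = math.sqrt(sum((x - mu_r) ** 2 for x in reanalysis) / n)
    sd_s = math.sqrt(sum((x - mu_s) ** 2 for x in in_situ) / n)
    return [mu_s + (x - mu_r) * sd_s / sd_r for x in reanalysis]

def rmse(a, b):
    """Root mean square error between two equal-length series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# Invented site temperatures and a warm-biased "reanalysis" of them.
site = [10.0, 12.0, 15.0, 11.0, 9.0, 14.0]
era = [12.5, 14.0, 17.5, 13.0, 11.5, 16.0]
corrected = bias_correct(era, site)
```

In practice the in situ statistics would be computed only over gap-free periods, and the corrected reanalysis would then fill the gaps.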
Generic protease detection technology for monitoring periodontal disease.
Zheng, Xinwei; Cook, Joseph P; Watkinson, Michael; Yang, Shoufeng; Douglas, Ian; Rawlinson, Andrew; Krause, Steffi
2011-01-01
Periodontal diseases are inflammatory conditions that affect the supporting tissues of teeth and can lead to destruction of the bone support and ultimately tooth loss if untreated. Progression of periodontitis is usually site specific but not uniform, and currently there are no accurate clinical methods for distinguishing sites where there is active disease progression from sites that are quiescent. Consequently, unnecessary and costly treatment of periodontal sites that are not progressing may occur. Three proteases have been identified as suitable markers for distinguishing sites with active disease progression and quiescent sites: human neutrophil elastase, cathepsin G and MMP8. Generic sensor materials for the detection of these three proteases have been developed based on thin dextran hydrogel films cross-linked with peptides. Degradation of the hydrogel films was monitored using impedance measurements. The target proteases were detected in the clinically relevant range within a time frame of 3 min. Good specificity for different proteases was achieved by choosing appropriate peptide cross-linkers.
Applications of fuzzy ranking methods to risk-management decisions
NASA Astrophysics Data System (ADS)
Mitchell, Harold A.; Carter, James C., III
1993-12-01
The Department of Energy is making significant improvements to its nuclear facilities as a result of more stringent regulation, internal audits, and recommendations from external review groups. A large backlog of upgrades has resulted. Currently, a prioritization method is used that relies on a matrix of potential consequence and probability of occurrence. The attributes of the potential consequences considered include likelihood, exposure, public health and safety, environmental impact, site personnel safety, public relations, legal liability, and business loss. This paper describes an improved method which utilizes fuzzy multiple attribute decision methods to rank proposed improvement projects.
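A minimal sketch of fuzzy multiple-attribute ranking is shown below, using triangular fuzzy numbers for the attribute ratings and centroid defuzzification to order projects. The attribute names, weights, and example projects are invented for illustration; this is a generic fuzzy-ranking scheme, not the specific DOE method described in the paper.

```python
def tfn_scale(t, w):
    """Multiply a triangular fuzzy number (l, m, u) by a crisp weight."""
    return tuple(w * x for x in t)

def tfn_add(a, b):
    """Add two triangular fuzzy numbers component-wise."""
    return tuple(x + y for x, y in zip(a, b))

def centroid(t):
    """Defuzzify a triangular fuzzy number (l, m, u) by its centroid."""
    return sum(t) / 3.0

def rank_projects(projects, weights):
    """Rank projects by the centroid of their fuzzy weighted score.

    Each attribute rating is a triangular fuzzy number (low, mode, high)
    capturing judgement uncertainty about the project's consequences.
    """
    scored = []
    for name, ratings in projects.items():
        total = (0.0, 0.0, 0.0)
        for attr, w in weights.items():
            total = tfn_add(total, tfn_scale(ratings[attr], w))
        scored.append((centroid(total), name))
    return [name for _, name in sorted(scored, reverse=True)]

weights = {"safety": 0.5, "environment": 0.3, "cost": 0.2}
projects = {
    "ventilation upgrade": {"safety": (7, 8, 9), "environment": (4, 5, 6), "cost": (2, 3, 4)},
    "tank liner":          {"safety": (3, 4, 5), "environment": (6, 7, 8), "cost": (5, 6, 7)},
}
order = rank_projects(projects, weights)
```

Keeping the ratings fuzzy until the final defuzzification step preserves the uncertainty in each attribute judgement, which a crisp consequence/probability matrix discards.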
Agricultural areas in potentially contaminated sites: characterization, risk, management.
Vanni, Fabiana; Scaini, Federica; Beccaloni, Eleonora
2016-01-01
In Italy, the current legislation for contaminants in soils provides for two land uses: residential/public or private gardens, and commercial/industrial; there are no specific reference values for agricultural soils, even though a special decree has been developed and is currently going through the legislative approval process. The topic of agricultural areas is relevant, also in consideration of their presence near potentially contaminated sites. In this paper, contamination sources and transport modes of contaminants from sources to targets in agricultural areas are examined, and a suitable "conceptual model" to define appropriate characterization methods and risk assessment procedures is proposed. These procedures have already been used by the National Institute of Health in various Italian areas characterized by different agricultural settings. Finally, specific remediation techniques are suggested to preserve soil resources and, if possible, their particular land use.
Methods to Reduce Forest Residue Volume after Timber Harvesting and Produce Black Carbon.
Page-Dumroese, Deborah S; Busse, Matt D; Archuleta, James G; McAvoy, Darren; Roussel, Eric
2017-01-01
Forest restoration often includes thinning to reduce tree density and improve ecosystem processes and function while also reducing the risk of wildfire or insect and disease outbreaks. However, one drawback of these restoration treatments is that slash is often burned in piles that may damage the soil and require further restoration activities. Pile burning is currently used on many forest sites as the preferred method for residue disposal because piles can be burned at various times of the year and are usually more controlled than broadcast burns. In many cases, fire can be beneficial to site conditions and soil properties, but slash piles, with a large concentration of wood, needles, forest floor, and sometimes mineral soil, can cause long-term damage. We describe several alternative methods for reducing nonmerchantable forest residues that will help remove excess woody biomass, minimize detrimental soil impacts, and create charcoal for improving soil organic matter and carbon sequestration.
Accurate identification of RNA editing sites from primitive sequence with deep neural networks.
Ouyang, Zhangyi; Liu, Feng; Zhao, Chenghui; Ren, Chao; An, Gaole; Mei, Chuan; Bo, Xiaochen; Shu, Wenjie
2018-04-16
RNA editing is a post-transcriptional RNA sequence alteration. Current methods have identified editing sites and facilitated research but require sufficient genomic annotations and prior-knowledge-based filtering steps, resulting in a cumbersome, time-consuming identification process. Moreover, these methods have limited generalizability and applicability in species with insufficient genomic annotations or in conditions of limited prior knowledge. We developed DeepRed, a deep learning-based method that identifies RNA editing from primitive RNA sequences without prior-knowledge-based filtering steps or genomic annotations. DeepRed achieved 98.1% and 97.9% area under the curve (AUC) in training and test sets, respectively. We further validated DeepRed using experimentally verified U87 cell RNA-seq data, achieving 97.9% positive predictive value (PPV). We demonstrated that DeepRed offers better prediction accuracy and computational efficiency than current methods with large-scale, mass RNA-seq data. We used DeepRed to assess the impact of multiple factors on editing identification with RNA-seq data from the Association of Biomolecular Resource Facilities and Sequencing Quality Control projects. We explored developmental RNA editing pattern changes during human early embryogenesis and evolutionary patterns in Drosophila species and the primate lineage using DeepRed. Our work illustrates DeepRed's state-of-the-art performance; it may decipher the hidden principles behind RNA editing, making editing detection convenient and effective.
Self-presentation 2.0: narcissism and self-esteem on Facebook.
Mehdizadeh, Soraya
2010-08-01
Online social networking sites have revealed an entirely new method of self-presentation. This cyber social tool provides a new site of analysis to examine personality and identity. The current study examines how narcissism and self-esteem are manifested on the social networking Web site Facebook.com. Self-esteem and narcissistic personality self-reports were collected from 100 Facebook users at York University. Participant Web pages were also coded for self-promotional content features. Correlation analyses revealed that higher narcissism and lower self-esteem were associated with greater online activity as well as some self-promotional content. Gender differences were found to influence the type of self-promotional content presented by individual Facebook users. Implications and future research directions for narcissism and self-esteem on social networking Web sites are discussed.
Jia, Xianbo; Lin, Xinjian; Chen, Jichen
2017-11-02
Current genome walking methods are very time consuming, and many produce non-specific amplification products. To amplify the flanking sequences that are adjacent to Tn5 transposon insertion sites in Serratia marcescens FZSF02, we developed a genome walking method based on TAIL-PCR. This PCR method added a 20-cycle linear amplification step before the exponential amplification step to increase the concentration of the target sequences. Products of the linear amplification and the exponential amplification were diluted 100-fold to decrease the concentration of the templates that cause non-specific amplification. Fast DNA polymerase with a high extension speed was used in this method, and an amplification program was used to rapidly amplify long specific sequences. With this linear and exponential TAIL-PCR (LETAIL-PCR), we successfully obtained products larger than 2 kb from Tn5 transposon insertion mutant strains within 3 h. This method can be widely used in genome walking studies to amplify unknown sequences that are adjacent to known sequences.
Estimating climate resilience for conservation across geophysical settings.
Anderson, Mark G; Clark, Melissa; Sheldon, Arlene Olivero
2014-08-01
Conservationists need methods to conserve biological diversity while allowing species and communities to rearrange in response to a changing climate. We developed and tested such a method for northeastern North America based on physical features associated with ecological diversity and site resilience to climate change. We comprehensively mapped 30 distinct geophysical settings based on geology and elevation. Within each geophysical setting, we identified sites that were both connected by natural cover and had relatively more microclimates, indicated by diverse topography and elevation gradients. We did this by scoring every 405 ha hexagon in the region for these two characteristics and selecting those that scored >0.5 SD above the mean combined score for each setting. We hypothesized that these high-scoring sites had the greatest resilience to climate change, and we compared them with sites selected by The Nature Conservancy for their high-quality rare species populations and natural community occurrences. High-scoring sites captured significantly more of the biodiversity sites than expected by chance (p < 0.0001): 75% of the 414 target species, 49% of the 4592 target species locations, and 53% of the 2170 target community locations. Calcareous bedrock, coarse sand, and fine silt settings scored markedly lower for estimated resilience and had low levels of permanent land protection (average 7%). Because our method identifies, for every geophysical setting, the sites most likely to retain species and functions longer under a changing climate, it reveals natural strongholds for future conservation that would also capture substantial existing biodiversity and correct the bias in current secured lands. © 2014 The Authors. Conservation Biology published by Wiley Periodicals, Inc., on behalf of the Society for Conservation Biology.
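The selection rule above (keep hexagons whose combined score exceeds the setting mean by more than 0.5 standard deviations) can be sketched directly. The hexagon identifiers and scores below are invented; the paper applies this per geophysical setting across 405 ha hexagons.

```python
def high_scoring_sites(scores):
    """Within one geophysical setting, keep hexagons whose combined
    score exceeds the setting mean by more than 0.5 standard
    deviations (population SD over the setting's hexagons)."""
    n = len(scores)
    mean = sum(scores.values()) / n
    sd = (sum((s - mean) ** 2 for s in scores.values()) / n) ** 0.5
    cutoff = mean + 0.5 * sd
    return {hex_id for hex_id, s in scores.items() if s > cutoff}

# Toy combined connectedness + microclimate scores for one setting.
setting = {"hex1": 0.2, "hex2": 0.9, "hex3": 0.5, "hex4": 0.4, "hex5": 1.0}
resilient = high_scoring_sites(setting)
```

Scoring within each setting, rather than region-wide, is what keeps low-relief settings such as coarse sand represented at all; a single regional cutoff would exclude them entirely.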
Loyola Briceno, Ana Carolina; Kawatu, Jennifer; Saul, Katie; DeAngelis, Katie; Frederiksen, Brittni; Moskosky, Susan B; Gavin, Lorrie
2017-09-01
The objective was to describe a Performance Measure Learning Collaborative (PMLC) designed to help Title X family planning grantees use new clinical performance measures for contraceptive care. Twelve Title X grantee-service site teams participated in an 8-month PMLC from November 2015 to June 2016; baseline was assessed in October 2015. Each team documented their selected best practices and strategies to improve performance, and calculated the contraceptive care performance measures at baseline and for each of the subsequent 8 months. PMLC sites implemented a mix of best practices: (a) ensuring access to a broad range of methods (n=7 sites), (b) supporting women through client-centered counseling and reproductive life planning (n=8 sites), (c) developing systems for same-day provision of all methods (n=10 sites) and (d) utilizing diverse payment options to reduce cost as a barrier (n=4 sites). Ten sites (83%) observed an increase in the clinical performance measures focused on most and moderately effective methods (MME), with a median percent change of 6% for MME (from a median of 73% at baseline to 77% post-PMLC). Evidence suggests that the PMLC model is an approach that can be used to improve the quality of contraceptive care offered to clients in some settings. Further replication of the PMLC among other groups and beyond the Title X network will help strengthen the current model through lessons learned. Using the performance measures in the context of a learning collaborative may be a useful strategy for other programs (e.g., Federally Qualified Health Centers, Medicaid, private health plans) that provide contraceptive care. Expanded use of the measures may help increase access to contraceptive care to achieve national goals for family planning. Published by Elsevier Inc.
Spatial characteristics of magnetotail reconnection
NASA Astrophysics Data System (ADS)
Genestreti, Kevin J.
We examine the properties of magnetic reconnection as it occurs in the Earth's magnetosphere, first focusing on the spatial characteristics of the near-Earth magnetotail reconnection site, then analyzing the properties of cold plasma that may affect reconnection at the dayside magnetopause. Two models are developed that empirically map the position and occurrence rate of the nightside ion diffusion region, which are based upon Geotail data (first model) and a combination of Geotail and Cluster data (second model). We use these empirical models to estimate that NASA's MMS mission will encounter the ion-scale reconnection site 11+/-4 times during its upcoming magnetotail survey phase. We also find that the occurrence of magnetotail reconnection is localized and asymmetric, with reconnection occurring most frequently at the duskside magnetotail neutral sheet near YGSM* = 5 RE. To determine the physics that governs this asymmetry and localization, we analyze the time history of the solar wind, the instantaneous properties of the magnetotail lobes and current sheet, as well as the geomagnetic activity levels, all for a larger set of Geotail and Cluster reconnection site observations. We find evidence in our own results and in the preexisting literature that localized (small DeltaY) reconnection sites initially form near YGSM* = 5 RE due to an asymmetry in the current sheet thickness. If the solar wind driving remains strong, then localized reconnection sites may expand in the +/-Y direction. The DeltaY extent of the reconnection site appears to be positively correlated with the geomagnetic activity level, which is to be expected for a simplified "energy in equals energy out"-type picture of 3D reconnection. We develop two new methods for determining the temperatures of plasmas that are largely below the energy detection range of electrostatic analyzer instruments.
The first method involves the direct application of a theoretical fit to the visible, high-energy portion of the distribution function. The second method for determining temperatures involves a comparison of the energy-dependent and total plasma number densities. Both methods assume an infinitely thin sheath model for spacecraft charging, a Maxwellian-type plasma, and bulk velocities that are strictly governed by ExB drift, which we model with a dipole magnetic field and a Volland-Stern electric potential field. The two methods are applied to RBSP observations of the plasmasphere proper. We find positive agreement with existing measurements of the temperatures, which were based upon data from low-altitude polar orbiting spacecraft. We also find evidence for in situ heating of the plasmasphere at the equator in the ring current overlap region. Finally, we apply these techniques to a single conjunction event, where MMS and RBSP provided simultaneous and nearly continuous coverage of the plasmasphere and plume from its equatorial base to the reconnecting magnetopause. We develop scaling laws for the temperature and density of the plasmasphere as a function of geocentric distance, showing that it is heated and density depleted by factors of 20 and 200 (respectively) from L = 5 to the magnetospheric side of the reconnection boundary layer.
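The first method, fitting the visible high-energy portion of a Maxwellian-type distribution, can be sketched as a least-squares line through (E, ln f): for a Maxwellian tail, ln f(E) = const - E/(k_B T), so the fitted slope gives the temperature. The synthetic tail below is an assumption used only to check the recovery; the real analysis also involves the spacecraft-charging and drift corrections described above, which are not shown.

```python
import math

def fit_tail_temperature(energies_eV, ln_f):
    """Estimate k_B T (in eV) from the visible high-energy tail of a
    Maxwellian-like distribution, where ln f(E) = const - E / (k_B T).
    A least-squares line through (E, ln f) has slope -1 / (k_B T)."""
    n = len(energies_eV)
    mx = sum(energies_eV) / n
    my = sum(ln_f) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(energies_eV, ln_f))
    sxx = sum((x - mx) ** 2 for x in energies_eV)
    slope = sxy / sxx
    return -1.0 / slope

# Synthetic tail of a 1 eV plasma, visible only above a 5 eV threshold
# (mimicking a detector whose low-energy range is cut off).
kT_true = 1.0
E = [5.0 + 0.5 * i for i in range(10)]
ln_f = [10.0 - e / kT_true for e in E]
kT_est = fit_tail_temperature(E, ln_f)
```

The point of the method is exactly this: the slope of the tail is set by the temperature alone, so the invisible low-energy core is not needed.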
DOE Office of Scientific and Technical Information (OSTI.GOV)
Check, C.E.; Bach, S.B.H.
1995-12-31
The contamination of air, water, and soils by a myriad of sources generates a large volume of samples. Currently, the sample volume for hazardous constituent analyses is approximately half a million samples per year. The total analytical costs associated with this are astronomical. The analysis of these samples is vital in terms of assessing the types of contamination present and to what degree a site has been contaminated. The results of these analyses are very important for making an informed, knowledgeable decision as to the need for remediation and what type of remediation processes should be initiated based on site suitability versus non-action for the various sample sites. With an ever-growing environmental consciousness in today's society, the assessment and subsequent remediation of a site needs to be accomplished promptly despite the time constraints traditional methods place on such actions. In order to facilitate a rapid assessment, it is desirable to utilize instrumentation and equipment which afford the most information about a site, allowing for optimization of environmental assessment while maintaining a realistic time schedule for the resulting remediation process. Because there are various types of environmental samples that can be taken at a site, different combinations of instrumentation and methods are required for assessing the level and type of contamination present, whether it is in air, water, or soils. This study is limited to analyzing soil-like media that would normally fall under EPA Method 8270, which is used to analyze solid waste matrices, soils, and groundwater for semi-volatile organic compounds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, M.S.; Whaley, J.E.; McCain, W.C.
1995-12-31
Current methods used for evaluating ecological risk to vertebrate receptors have come under increasing criticism in that they neglect factors influencing population sustainability (i.e., predator avoidance and mate recognition behavior, reproductive performance, indirect effects, etc.). Further, recent declines in the densities of species of amphibians, combined with the fact that they are highly exposed to soil, sediment, and surface water contaminants, indicate that amphibians are conservative indicators of environmental stress. The authors marked 20 craters at J-Field, Edgewood Area, Aberdeen Proving Ground, Maryland, which were created by high explosives or impact, or were excavated to receive surface water runoff from hazardous waste sites. Each of these sites was chosen a priori as habitat likely to be used by amphibians. Contaminants of concern were explosives, organochlorines (PCBs, chlordane, DDE), and some metals (As, Hg, Pb, Ba, Cd, Cr). The authors compared relative abundance and reproductive performance (defined by pooled egg mass weight) to measures of contaminant concentrations. Subsequent qualitative assessments of embryo development were also made. The authors contend that these methods are valuable in that they reduce uncertainty (including effects of selection), and provide a novel, yet feasible alternative to current ecological assessment methods.
Caries detection: current status and future prospects using lasers
NASA Astrophysics Data System (ADS)
Longbottom, Christopher
2000-03-01
Caries detection currently occupies a good deal of attention in the arena of dental research for a number of reasons. In searching for caries detection methods with greater accuracy than conventional techniques, researchers have used a variety of optical methods and have increasingly turned to the use of lasers. Several laser-based methods have been and are being assessed for both imaging and disease quantification techniques. The phenomenon of fluorescence of teeth and caries in laser light, and the different effects produced by different wavelengths, has been investigated by a number of workers in Europe. With argon ion laser excitation, QLF (Quantified Laser Fluorescence) demonstrated a high correlation between loss of fluorescence intensity and enamel mineral loss in white spot lesions on free smooth surfaces, both in vitro and in vivo. Recent work with a red laser diode source (655 nm), which appears to stimulate bacterial porphyrins to fluoresce, has demonstrated that a relatively simple device based on this phenomenon can provide sensitivity and specificity values of the order of 80% in vitro and in vivo for primary caries at occlusal sites. In vitro studies using a simulated in vivo methodology indicate that the device can produce sensitivity values of the order of 90% for primary caries at approximal sites.
Analysis of munitions constituents in groundwater using a field-portable GC-MS.
Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K
2012-05-01
The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.
Identifying Interactions that Determine Fragment Binding at Protein Hotspots.
Radoux, Chris J; Olsson, Tjelvar S G; Pitt, Will R; Groom, Colin R; Blundell, Tom L
2016-05-12
Locating a ligand-binding site is an important first step in structure-guided drug discovery, but current methods do little to suggest which interactions within a pocket are the most important for binding. Here we illustrate a method that samples atomic hotspots with simple molecular probes to produce fragment hotspot maps. These maps specifically highlight fragment-binding sites and their corresponding pharmacophores. For ligand-bound structures, they provide an intuitive visual guide within the binding site, directing medicinal chemists where to grow the molecule and alerting them to suboptimal interactions within the original hit. The fragment hotspot map calculation is validated using experimental binding positions of 21 fragments and subsequent lead molecules. The ligands are found in high scoring areas of the fragment hotspot maps, with fragment atoms having a median percentage rank of 97%. Protein kinase B and pantothenate synthetase are examined in detail. In each case, the fragment hotspot maps are able to rationalize a Free-Wilson analysis of SAR data from a fragment-based drug design project.
Cao, Shan; Liu, Bing; Cheng, Baozhen; Lu, Fuping; Wang, Yanping; Li, Yu
2017-01-05
An eco-friendly combination tanning process based on zinc tanning agents has been developed in previous research to reduce chromium use; it can be considered a reduced-chrome substitute for the current tanning process. To gain a deeper understanding of the binding mechanisms of the zinc-collagen interaction, which are affected by tanning pH, experiments were carried out. The analysis in this paper reveals how chemical bonds from collagen's main functional groups combine with zinc. XPS and NIR data were analyzed for further understanding of where the zinc binding sites lie on collagen fibers at different pH. The results indicate that high pH favors amino binding sites while low pH promotes carboxyl binding sites on collagen fibers. Furthermore, the zinc-chrome combination tanning results show that the new method reduces the chromium dosage in the tanning process compared to the conventional chrome tanning method. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunawan, Budi; Neary, Vincent S.; Colby, Jonathan
2014-06-22
This study demonstrates a site resource assessment examining the temporal variation of the mean current, turbulence intensities, and power densities for a tidal energy site in the East River tidal strait. These variables were derived from two months of acoustic Doppler velocimeter (ADV) measurements at the design hub height of the Verdant Power Gen5 hydrokinetic turbine. The study site is a tidal strait that exhibits semi-diurnal tidal current characteristics, with a mean horizontal current speed of 1.4 m s⁻¹ and a turbulence intensity of 15% at a reference mean current of 2 m s⁻¹. Flood and ebb flow directions are nearly bi-directional, with higher current magnitude during flood tide, which skews power production towards the flood tide period. The tidal hydrodynamics at the site are highly regular, as indicated by a tidal current time series that resembles a sinusoidal function. This study also shows that the theoretical force and power densities derived from the current measurements can be significantly influenced by the length of the time window used for averaging the current data. Furthermore, the theoretical power density at the site, derived from the current measurements, is one order of magnitude greater than that reported in the U.S. national resource assessment. This discrepancy highlights the importance of conducting site resource assessments based on measurements at the tidal energy converter device scale.
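The abstract's point about the averaging window can be illustrated with a sketch that computes the theoretical power density 0.5·ρ·⟨|ū|³⟩ from a synthetic along-channel velocity record. The tidal amplitude, noise level, and window lengths are illustrative assumptions, not the East River measurements.

```python
import numpy as np

# Synthetic along-channel velocity: a semi-diurnal tide (~12.42 h
# period) plus random turbulent fluctuations, sampled at 1 Hz.
rng = np.random.default_rng(1)
t = np.arange(0, 3 * 12.42 * 3600)                  # three tidal cycles, 1 s steps
u = 1.9 * np.sin(2 * np.pi * t / (12.42 * 3600))    # tidal signal (m/s)
u += rng.normal(0, 0.25, t.size)                    # turbulent fluctuations (m/s)

rho = 1025.0  # seawater density, kg/m^3

def power_density(u, window):
    """Mean theoretical power density 0.5*rho*<|u_bar|^3>, with the
    current first averaged over non-overlapping windows of `window`
    samples. Longer windows smooth out fluctuations before the speed
    is cubed, lowering the result."""
    n = u.size // window
    u_bar = u[: n * window].reshape(n, window).mean(axis=1)
    return 0.5 * rho * np.mean(np.abs(u_bar) ** 3)

for window in (1, 60, 600):   # 1 s, 1 min, 10 min averaging
    print(window, round(power_density(u, window), 1))
```

Because the speed is cubed, averaging the current before cubing systematically lowers the estimate, which is the sensitivity to window length the study reports.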
Anderson, Cathy M.; Jackson, Jennifer; Lucy, S. Deborah; Prendergast, Monique; Sinclair, Susanne
2010-01-01
ABSTRACT Purpose: To determine current Canadian physical therapy practice for adult patients requiring routine care following cardiac surgery. Methods: A telephone survey was conducted of a selected sample (n=18) of Canadian hospitals performing cardiac surgery to determine the cardiorespiratory care, mobility, exercises, and education provided to patients undergoing cardiac surgery. Results: An average of 21 cardiac surgeries per week (range: 6–42) was performed, with an average length of stay of 6.4 days (range: 4.0–10.6). Patients were seen preoperatively at 7 of 18 sites and on postoperative day 1 (POD-1) at 16 of 18 sites. On POD-1, 16 sites performed deep breathing and coughing, 7 used incentive spirometers, 13 did upper-extremity exercises, and 12 did lower-extremity exercises. Nine sites provided cardiorespiratory treatment on POD-3. On POD-1, patients were dangled at 17 sites and mobilized out of bed at 13. By POD-3, patients ambulated 50–120 m per session 2–5 times per day. Sternal precautions were variable, but the lifting limit was reported as ranging between 5 lb and 10 lb. Conclusions: Canadian physical therapists reported the provision of cardiorespiratory treatment after POD-1. According to currently available evidence, this level of care may be unnecessary for uncomplicated patients following cardiac surgery. In addition, some sites provide cardiorespiratory treatment techniques that are not supported by evidence in the literature. Further research is required. PMID:21629599
NASA Astrophysics Data System (ADS)
Simoes Correa, Thiago Barreto
Scleractinian cold-water corals are widely distributed in seaways and basins of the North Atlantic Ocean, including the Straits of Florida. These corals can form extensive biogenic mounds, which are biodiversity hotspots in the deep ocean. The processes that lead to the genesis of such cold-water coral mounds and control their distribution and morphology are poorly understood. This work uses an innovative mapping approach that combines 130 km² of high-resolution geophysical and oceanographic data collected using an Autonomous Underwater Vehicle (AUV) from five cold-water coral habitats in the Straits of Florida. These AUV data, together with ground-truthing observations from eleven submersible dives, are used to investigate fine-scale mound parameters and their relationships with environmental factors. Based on these datasets, automated methods are developed for extracting and analyzing mound morphometrics and coral cover. These analyses reveal that mound density is 14 mounds/km² for the three surveyed sites on the toe-of-slope of Great Bahama Bank (GBB); this density is higher than previously documented (0.3 mounds/km²) in nearby mound fields. Morphometric analyses further indicate that mounds vary significantly in size, from one meter up to 110 m in relief and from 81 to 600,000 m² in footprint area. In addition to individual mounds, cold-water corals also develop in some areas as elongated low-relief ridges that are up to 25 m high and 2000 m long. These ridges cover approximately 60 and 70% of the mapped seafloor at the sites in the center of the Straits and at the base of the Miami Terrace, respectively. Morphometric and current data analyses across the five surveyed fields indicate that mounds and ridges are not aligned with the dominant current directions. These findings contradict previous studies that described streamlined mounds parallel to the northward Florida Current.
In contrast, this study shows that the sites dominated by coral ridges are influenced by a unidirectionally flowing current, whereas the mounds on the GBB slope are influenced by a tidal current regime. The GBB mounds also experience higher sedimentation rates relative to the sites away from the GBB slope. Sub-surface data document partially or completely buried mounds at the GBB sites. The sediments burying mounds are off-bank material transported downslope by mass gravity flow. Mass gravity transport creates complex slope architecture on the toe-of-slope of GBB, with canyons, slump scars, and gravity flow deposits. Cold-water corals use all three of these features as locations for colonization. Coral mounds growing on such pre-existing topography keep up with off-bank sedimentation. In contrast, away from the GBB slope, off-bank sedimentation is absent and coral ridges grow independently of antecedent topography. At the sediment-starved Miami Terrace site, coral ridge initiation is related to a cemented mid-Miocene unconformity. In the center of the Straits, coral ridges and knobs develop over an unconsolidated sand sheet at the tail of the Pourtales drift. Coral features at the Miami Terrace and center of the Straits have intricate morphologies, including waveform and chevron-like ridges, which result from asymmetrical coral growth. Dense coral frameworks and living coral colonies grow preferentially on the current-facing ridge sides to optimize food particle capture, whereas coral rubble and mud-sized sediments accumulate on the ridge lee sides. Finally, this study provides a method using solely acoustic data for discriminating habitats in which cold-water corals are actively growing. Results from this method can guide future research on and management of cold-water coral ecosystems.
Taken together, spatial quantitative analyses of the large-scale, high-resolution integrated surveys indicate that cold-water coral habitats in the Straits of Florida: (1) are significantly more diverse and abundant than previously thought, and (2) can be influenced in their distribution and development by current regime, sedimentation, and/or antecedent topography.
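A minimal sketch of the kind of automated morphometric extraction described above: label connected patches of relief above a threshold in a gridded bathymetry and report their density per square kilometre. The grid, cell size, and relief threshold below are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy import ndimage

def mound_density(relief, cell_area_m2, min_relief=1.0):
    """Count connected patches of seafloor standing at least
    `min_relief` metres above the local baseline, and express the
    count per square kilometre of mapped area."""
    mask = relief >= min_relief
    _, n_mounds = ndimage.label(mask)          # connected-component labelling
    area_km2 = relief.size * cell_area_m2 / 1e6
    return n_mounds / area_km2

# Illustrative grid: flat seafloor with three isolated bumps,
# 10 m cells (100 m^2 each), so the grid spans 1 km^2
relief = np.zeros((100, 100))
relief[10:15, 10:15] = 5.0
relief[40:60, 40:45] = 20.0
relief[80:82, 80:82] = 2.0
print(mound_density(relief, cell_area_m2=100.0))  # → 3.0 mounds/km^2
```

A real workflow would first detrend the AUV bathymetry to obtain relief above a local baseline before thresholding.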
Eisenhower National Historic Site visitor transportation and access study
DOT National Transportation Integrated Search
2017-11-01
This study evaluates the current shuttle system at Eisenhower National Historic Site, which is currently the sole means of access to the site. Visitation at Eisenhower has been declining since the site opened, and the study looks at the impacts of this trend...
A model-based method for estimating Ca2+ release fluxes from linescan images in Xenopus oocytes.
Baran, Irina; Popescu, Anca
2009-09-01
We propose a model-based method of interpreting linescan images observed in Xenopus oocytes with the use of Oregon Green-1 as a fluorescent dye. We use a detailed modeling formalism based on numerical simulations that incorporate physical barriers for local diffusion, and, by assuming a Gaussian distribution of release durations, we derive the distributions of release Ca(2+) amounts and currents, fluorescence amplitudes, and puff widths. We analyze a wide set of available data collected from 857 and 281 events observed in the animal and the vegetal hemispheres of the oocyte, respectively. A relatively small fraction of events appear to involve coupling of two or three adjacent clusters of Ca(2+) releasing channels. In the animal hemisphere, the distribution of release currents with a mean of 1.4 pA presents a maximum at 1.0 pA and a rather long tail extending up to 5 pA. The overall distribution of liberated Ca(2+) amounts exhibits a dominant peak at 120 fC, a smaller peak at 375 fC, and an average of 166 fC. Ca(2+) amounts and release fluxes in the vegetal hemisphere appear to be 3.6 and 1.6 times smaller than in the animal hemisphere, respectively. Predicted diameters of elemental release sites are approximately 1.0 microm in the animal and approximately 0.5 microm in the vegetal hemisphere, but the side-to-side separation between adjacent sites appears to be identical (approximately 0.4 microm). By fitting the model to individual puffs we can estimate the quantity of liberated calcium, the release current, the orientation of the scan line, and the dimension of the corresponding release site.
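The unit bookkeeping behind the reported amounts can be sketched by Monte Carlo: if release durations are Gaussian-distributed, as the model assumes, and the liberated amount is the release current integrated over the duration, then Q = I·T with pA × ms = fC. The parameter values below are illustrative, not the fitted values from the study.

```python
import numpy as np

# Monte Carlo sketch: Gaussian release durations, variable release
# currents, liberated charge Q = I * T (pA * ms = fC). All parameter
# values are illustrative assumptions.
rng = np.random.default_rng(2)
n_events = 100_000
duration_ms = np.clip(rng.normal(120, 40, n_events), 1, None)      # release durations
current_pA = np.clip(rng.normal(1.4, 0.8, n_events), 0.05, None)   # release currents

charge_fC = current_pA * duration_ms  # liberated Ca2+ amount per event
print(round(float(np.mean(charge_fC)), 1), round(float(np.median(charge_fC)), 1))
```

The same bookkeeping links the paper's numbers: a 1.4 pA mean current sustained for roughly 120 ms liberates on the order of 170 fC, consistent with the reported 166 fC average.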
Pierce, M L; Ruffner, D E
1998-01-01
Antisense-mediated gene inhibition uses short complementary DNA or RNA oligonucleotides to block expression of any mRNA of interest. A key parameter in the success or failure of an antisense therapy is the identification of a suitable target site on the chosen mRNA. Ultimately, the accessibility of the target to the antisense agent determines target suitability. Since accessibility is a function of many complex factors, it is currently beyond our ability to predict. Consequently, identification of the most effective target(s) requires examination of every site. Towards this goal, we describe a method to construct directed ribozyme libraries against any chosen mRNA. The library contains nearly equal amounts of ribozymes targeting every site on the chosen transcript and the library only contains ribozymes capable of binding to that transcript. Expression of the ribozyme library in cultured cells should allow identification of optimal target sites under natural conditions, subject to the complexities of a fully functional cell. Optimal target sites identified in this manner should be the most effective sites for therapeutic intervention. PMID:9801305
Under the current regulations (CFR 503), Class B biosolids may be land applied with certain site restrictions. One method for achieving Class B status is to raise the pH of the sludge to >12 for a minimum of 2 hours with an alkaline material (normally lime). Alternately, a Clas...
Risk assessment is a crucial component of the site remediation decision-making process. Some current EPA methods do not have detection limits low enough for risk assessment of many VOCs (e.g., EPA Region 3 Risk Based Concentration levels, EPA Region 9 Preliminary Remediation Goa...
Fragment-Based Approaches to Enhance GTP Competitive KRAS G12C Inhibitors
During the current period we completed work on a series of guanine nucleotide mimetics and published the results. As part of this we developed and reported a novel method of measuring small molecule binding to the KRAS G12C active site. We also published 2 additional manuscripts about KRAS G12C directed...
1997-04-30
Currently there are no systems available which allow for economical and accurate subsurface imaging of remediation sites. In some cases, high...system to address this need. This project has been very successful in showing a promising new direction for high-resolution subsurface imaging. Our...
Conceptual assessment framework for forested wetland restoration: the Pen Branch experience
Randy K. Kolka; E. A. Nelson; C. C. Trettin
2000-01-01
Development of an assessment framework and associated indicators that can be used to evaluate the effectiveness of a wetland restoration is critical to demonstrating the sustainability of restored sites. Current wetland restoration assessment techniques such as the index of biotic integrity (IBI) or the hydrogeomorphic method (HGM) generally focus on either the biotic...
Using nocturnal cold air drainage flow to monitor ecosystem processes in complex terrain
Thomas G. Pypker; Michael H. Unsworth; Alan C. Mix; William Rugh; Troy Ocheltree; Karrin Alstad; Barbara J. Bond
2007-01-01
This paper presents initial investigations of a new approach to monitor ecosystem processes in complex terrain on large scales. Metabolic processes in mountainous ecosystems are poorly represented in current ecosystem monitoring campaigns because the methods used for monitoring metabolism at the ecosystem scale (e.g., eddy covariance) require flat study sites. Our goal...
NASA Astrophysics Data System (ADS)
Wu, S.; Romanak, K.; Yang, C.
2009-12-01
We report the development of two methods for subsurface monitoring of CO2 in both air and water phases at sequestration sites. The first method is based on line-of-sight (LOS) tunable laser spectroscopy. Funded by DOE, we demonstrated Phase Insensitive Two Tone Frequency Modulation spectroscopy (PITTFM). FM reduces low-frequency noise in the beam path due to scintillations, while the PI design gives ease of installation. We demonstrated measurement over a 1 mile distance with an accuracy of 3 ppm of CO2 in normal air. Built-in switches shoot the laser beam in multiple directions, forming a cellular monitoring network covering 10 km^2. The system cost is under $100K, and COTS telecom components guarantee reliability in the field over decades. Software will log the data and translate it into a 2D CO2 profile. When coupled with other parameters, it will be able to locate the point and rate of leakage. Field tests at the SECARB sequestration site are proposed. The system also monitors other greenhouse gases (GHG), e.g. CH4, which is also needed where EOR is pursued along with CO2 sequestration. Figures 1 and 2 give the results of this method. The second method is based on the latest advances in quantum cascade lasers (QCLs). The current state-of-the-art technology for measuring Total/Dissolved Inorganic Carbon (TIC/DIC) in water is manometry, which is both time consuming and costly and cannot be used underground, i.e. at high pressure and temperature. We propose to use high-brightness QC lasers to extend the current mid-IR optical path from 30 microns to over 500 microns, making it possible to measure dissolved (aqueous-phase) CO2 with an accuracy of 0.2 mg/L. Preliminary results will be presented.
Detection of Influenza A viruses at migratory bird stopover sites in Michigan, USA.
Lickfett, Todd M; Clark, Erica; Gehring, Thomas M; Alm, Elizabeth W
2018-01-01
Introduction: Influenza A viruses have the potential to cause devastating illness in humans and domestic poultry. Wild birds are the natural reservoirs of Influenza A viruses and migratory birds are implicated in their global dissemination. High concentrations of this virus are excreted in the faeces of infected birds and faecal contamination of shared aquatic habitats can lead to indirect transmission among birds via the faecal-oral route. The role of migratory birds in the spread of avian influenza has led to large-scale surveillance efforts of circulating avian influenza viruses through direct sampling of live and dead wild birds. Environmental monitoring of bird habitats using molecular detection methods may provide additional information on the persistence of influenza virus at migratory stopover sites distributed across large spatial scales. Materials and methods: In the current study, faecal and water samples were collected at migratory stopover sites and evaluated for Influenza A by real-time quantitative reverse transcriptase PCR. Results and Discussion: This study found that Influenza A was detected at 53% of the evaluated stopover sites, and 7% and 4.8% of the faecal and water samples, respectively, tested positive for Influenza A virus. Conclusion: Environmental monitoring detected Influenza A at stopover sites used by migratory birds.
NASA Astrophysics Data System (ADS)
Bour, O.; Le Borgne, T.; Longuevergne, L.; Lavenant, N.; Jimenez-Martinez, J.; De Dreuzy, J. R.; Schuite, J.; Boudin, F.; Labasque, T.; Aquilina, L.
2014-12-01
Characterizing the hydraulic properties of heterogeneous and complex aquifers often requires field-scale investigations at multiple space and time scales to better constrain hydraulic property estimates. Here, we present and discuss results from the site of Ploemeur (Brittany, France), where complementary hydrological and geophysical approaches have been combined to characterize the hydrogeological functioning of this highly fractured crystalline rock aquifer. In particular, we show how cross-borehole flowmeter tests, pumping tests and frequency domain analysis of groundwater levels allow the hydraulic properties of the aquifer to be quantified at different scales. In complement, we used groundwater temperature as an excellent tracer for characterizing groundwater flow. At the site scale, measurements of ground surface deformation through long-base tiltmeters provide robust estimates of aquifer storage and allow identification of the active structures where groundwater pressure changes occur, including those acting during the recharge process. Finally, a numerical model of the site that combines hydraulic data and groundwater ages confirms the geometry of this complex aquifer and the consistency of the different datasets. The Ploemeur site, which has been used for water supply at a rate of about 10⁶ m³ per year since 1991, belongs to the French network of hydrogeological sites H+ and is currently used for monitoring groundwater changes and testing innovative field methods.
2012-01-01
Background High quality program data is critical for managing, monitoring, and evaluating national HIV treatment programs. By 2009, the Malawi Ministry of Health had initiated more than 270,000 patients on HIV treatment at 377 sites. Quarterly supervision of these antiretroviral therapy (ART) sites ensures high quality care, but the time currently dedicated to exhaustive record review and data cleaning detracts from other critical components. The exhaustive record review is unlikely to be sustainable long term because of the resources required and the increasing number of patients on ART. This study quantifies the current levels of data quality and evaluates Lot Quality Assurance Sampling (LQAS) as a tool to prioritize sites with low data quality, thus lowering costs while maintaining sufficient quality for program monitoring and patient care. Methods In January 2010, a study team joined supervision teams at 19 sites purposely selected to reflect the variety of ART sites. During the exhaustive data review, the time allocated to data cleaning and the data discrepancies were documented. The team then randomly sampled 76 records from each site, recording secondary outcomes and the time required for sampling. Results At the 19 sites, only 1.2% of records had discrepancies in patient outcomes and 0.4% in treatment regimen. However, data cleaning took 28.5 hours in total, suggesting that data cleaning for all 377 ART sites would require over 350 supervision-hours quarterly. The LQAS tool accurately identified the sites with low data quality, reduced the time for data cleaning by 70%, and allowed for reporting on secondary outcomes. Conclusions Most sites maintained high quality records. In spite of this, data cleaning required significant amounts of time with little effect on program estimates of patient outcomes. LQAS conserves resources while maintaining sufficient data quality for program assessment and management to allow for quality patient care. PMID:22776745
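The LQAS step of randomly sampling 76 records per site and estimating a discrepancy rate can be sketched as follows; the record structure and field names are hypothetical.

```python
import random

def lqas_sample(records, n=76, seed=0):
    """Randomly sample n patient records from a site's register and
    return the observed discrepancy rate between the register and the
    reported outcome -- a sketch of the LQAS screening step used to
    flag sites needing full record review. Field names are illustrative."""
    rng = random.Random(seed)
    sample = rng.sample(records, min(n, len(records)))
    flagged = sum(1 for r in sample if r["register"] != r["report"])
    return flagged / len(sample)

# Illustrative register: 500 records, 2% with mismatched outcomes
records = [{"register": "alive", "report": "alive"} for _ in range(490)]
records += [{"register": "alive", "report": "died"} for _ in range(10)]
rate = lqas_sample(records)
print(rate)
```

In practice the observed rate would be compared against a decision threshold chosen from LQAS operating-characteristic tables; sites above it get the exhaustive review.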
Bender, Kelly S; Rice, Melissa R; Fugate, William H; Coates, John D; Achenbach, Laurie A
2004-09-01
Natural attenuation of the environmental contaminant perchlorate is a cost-effective alternative to current removal methods. The success of natural perchlorate remediation is dependent on the presence and activity of dissimilatory (per)chlorate-reducing bacteria (DPRB) within a target site. To detect DPRB in the environment, two degenerate primer sets targeting the chlorite dismutase (cld) gene were developed and optimized. A nested PCR approach was used in conjunction with these primer sets to increase the sensitivity of the molecular detection method. Screening of environmental samples indicated that all products amplified by this method were cld gene sequences. These sequences were obtained from pristine sites as well as contaminated sites from which DPRB were isolated. More than one cld phylotype was also identified from some samples, indicating the presence of more than one DPRB strain at those sites. The use of these primer sets represents a direct and sensitive molecular method for the qualitative detection of (per)chlorate-reducing bacteria in the environment, thus offering another tool for monitoring natural attenuation. Sequences of cld genes isolated in the course of this project were also generated from various DPRB and provided the first opportunity for a phylogenetic treatment of this metabolic gene. Comparisons of the cld and 16S ribosomal DNA (rDNA) gene trees indicated that the cld gene does not track 16S rDNA phylogeny, further implicating the possible role of horizontal transfer in the evolution of (per)chlorate respiration.
2013-01-01
Background Currently it is uncertain how to define osteoporosis and whom to treat after a hip fracture. There is little to support the universal treatment of all such patients, but how to select those most in need of treatment is not clear. In this study we have compared cortical and trabecular bone status between patients with spinal fractures and those with hip fracture with or without spinal fracture, with the aim of beginning to identify, by a simple clinical method (spine x-ray), a group of hip fracture patients likely to be more responsive to treatment with current antiresorptive agents. Methods Comparison of convenience samples of three groups of 50 patients: one with spinal fractures, one with a hip fracture, and one with both. Measurements consist of bone mineral density at the lumbar spine and at the four standard hip sites; the number, distribution and severity of spinal fractures by the method of Genant; cortical bone thickness at the infero-medial femoral neck site; and femoral neck and axis length and femoral neck width. Results Patients with spinal fractures alone have the most deficient bone at both trabecular and cortical sites; those with hip fracture and no spinal fractures the best at trabecular bone and most cortical bone sites; and those with both hip and spinal fractures are intermediate on most measurements. Hip axis length and neck width did not differ between groups. Conclusion The presence of a spinal fracture indicates poor trabecular bone status in hip fracture patients. Hip fracture patients without spinal fractures have a bone mass similar to the reference range for their age and gender. Poor trabecular bone in hip fracture patients may point to a category of patient more likely to benefit from therapy and may be indicated by the presence of spinal fractures. PMID:23432767
Rajamani, Sripriya; Roche, Erin; Soderberg, Karen; Bieringer, Aaron
2014-01-01
Background: Immunization information systems (IIS) operate in an evolving health care landscape, with technology changes driven by initiatives such as the Centers for Medicare and Medicaid Services EHR incentive program, which promotes adoption and use of electronic health record (EHR) systems, including standards-based public health reporting. There is flux in organizational affiliations to support models such as accountable care organizations (ACO). These changes affect the institutional structure of how immunization reporting occurs and the methods adopted. Objectives: To evaluate the technical and organizational characteristics of healthcare provider reporting of immunizations to public health in Minnesota and to assess the adoption of standardized codes, formats and transport. Methods: Data on organizations and reporting status were obtained from the Minnesota IIS (Minnesota Immunization Information Connection: MIIC) by collating information from existing lists, specialized queries and review of annual reports. EHR adoption data for clinics were obtained in collaboration with the informatics office supporting the Minnesota e-Health Initiative. These data from various sources were merged and checked for quality to create a current-state assessment of immunization reporting, and the results were validated with subject matter experts. Results: Standards-based reporting of immunizations to MIIC increased to 708 sites over the last 3 years. A growth in automated real-time reporting occurred in 2013, with 143 new sites adopting the method. Though the uptake of message standards (HL7) has increased, the adoption of the current version of HL7 and web services transport remains low. The EHR landscape is dominated by a single vendor (used by 40% of clinics) in the state. There is a trend towards centralized reporting of immunizations, with an organizational unit reporting for many sites, ranging from 4 to 140 sites.
Conclusion: High EHR adoption in Minnesota, predominance of a vendor in the market, and centralized reporting models present opportunities for better interoperability and also adaptation of strategies to fit this landscape. It is essential for IIS managers to have a good understanding of their constituent landscape for technical assistance and program planning purposes. PMID:25598866
Frederick, Thomas E; Peng, Jeffrey W
2018-01-01
Increasing evidence shows that active sites of proteins have non-trivial conformational dynamics. These dynamics include active site residues sampling different local conformations that allow for multiple, and possibly novel, inhibitor binding poses. Yet, active site dynamics garner only marginal attention in most inhibitor design efforts and exert little influence on synthesis strategies. This is partly because synthesis requires a level of atomic structural detail that is frequently missing in current characterizations of conformational dynamics. In particular, while the identity of the mobile protein residues may be clear, the specific conformations they sample remain obscure. Here, we show how an appropriate choice of ligand can significantly sharpen our abilities to describe the interconverting binding poses (conformations) of protein active sites. Specifically, we show how 2-(2'-carboxyphenyl)-benzoyl-6-aminopenicillanic acid (CBAP) exposes otherwise hidden dynamics of a protein active site that binds β-lactam antibiotics. When CBAP acylates (binds) the active site serine of the β-lactam sensor domain of BlaR1 (BlaRS), it shifts the time scale of the active site dynamics to the slow exchange regime. Slow exchange enables direct characterization of inter-converting protein and bound ligand conformations using NMR methods. These methods include chemical shift analysis, 2-d exchange spectroscopy, off-resonance ROESY of the bound ligand, and reduced spectral density mapping. The active site architecture of BlaRS is shared by many β-lactamases of therapeutic interest, suggesting CBAP could expose functional motions in other β-lactam binding proteins. More broadly, CBAP highlights the utility of identifying chemical probes common to structurally homologous proteins to better expose functional motions of active sites.
The applications of nanotechnology in food industry.
Rashidi, Ladan; Khosravi-Darani, Kianoush
2011-09-01
Nanotechnology has potential applications in the food industry and processing as new tools for pathogen detection, disease treatment delivery systems, food packaging, and delivery of bioactive compounds to target sites. The application of nanotechnology in food systems will provide new methods to improve the safety and nutritional value of food products. This article reviews current advances in the application of nanotechnology in food science and technology. It also describes current food laws for nanofood and recent articles in the field of risk assessment of using nanotechnology in the food industry.
Meallem, Ilana; Garb, Yaakov; Cwikel, Julie
2010-01-01
The Bedouin of the Negev region of Israel are a formerly nomadic, indigenous, ethnic minority, of which 40% currently live in unrecognized villages without organized, solid waste disposal. This study, using both quantitative and qualitative methods, explored the transition from traditional rubbish production and disposal to current uses, the current composition of rubbish, methods of waste disposal, and the extent of exposure to waste-related environmental hazards in the village of Um Batim. The modern, consumer lifestyle produced both residential and construction waste that was dumped very close to households. Waste was tended to by women who predominantly used backyard burning for disposal, exposing villagers to corrosive, poisonous, and dangerously flammable items at these burn sites. Village residents expressed a high level of concern over environmental hazards, yet no organized waste disposal or environmental hazards reduction was implemented.
Classifying aerosol type using in situ surface spectral aerosol optical properties
NASA Astrophysics Data System (ADS)
Schmeisser, Lauren; Andrews, Elisabeth; Ogren, John A.; Sheridan, Patrick; Jefferson, Anne; Sharma, Sangeeta; Kim, Jeong Eun; Sherman, James P.; Sorribas, Mar; Kalapov, Ivo; Arsov, Todor; Angelov, Christo; Mayol-Bracero, Olga L.; Labuschagne, Casper; Kim, Sang-Woo; Hoffer, András; Lin, Neng-Huei; Chia, Hao-Ping; Bergin, Michael; Sun, Junying; Liu, Peng; Wu, Hao
2017-10-01
Knowledge of aerosol size and composition is important for determining radiative forcing effects of aerosols, identifying aerosol sources and improving aerosol satellite retrieval algorithms. The ability to extrapolate aerosol size and composition, or type, from intensive aerosol optical properties can help expand the current knowledge of spatiotemporal variability in aerosol type globally, particularly where chemical composition measurements do not exist concurrently with optical property measurements. This study uses medians of the scattering Ångström exponent (SAE), absorption Ångström exponent (AAE) and single scattering albedo (SSA) from 24 stations within the NOAA/ESRL Federated Aerosol Monitoring Network to infer aerosol type using previously published aerosol classification schemes. Three methods are implemented to obtain a best estimate of dominant aerosol type at each station using aerosol optical properties. The first method plots station medians into an AAE vs. SAE plot space, so that a unique combination of intensive properties corresponds with an aerosol type. The second typing method expands on the first by introducing a multivariate cluster analysis, which aims to group stations with similar optical characteristics and thus similar dominant aerosol type. The third and final classification method pairs 3-day backward air mass trajectories with median aerosol optical properties to explore the relationship between trajectory origin (a proxy for likely aerosol type) and aerosol intensive parameters, while allowing for multiple dominant aerosol types at each station. The three aerosol classification methods have some common, and thus robust, results.
In general, estimating dominant aerosol type using optical properties is best suited for site locations with a stable and homogeneous aerosol population, particularly continental polluted (carbonaceous aerosol), marine polluted (carbonaceous aerosol mixed with sea salt) and continental dust/biomass sites (dust and carbonaceous aerosol); however, current classification schemes perform poorly when predicting dominant aerosol type at remote marine and Arctic sites and at stations with more complex locations and topography where variable aerosol populations are not well represented by median optical properties. Although the aerosol classification methods presented here provide new ways to reduce ambiguity in typing schemes, more work is needed to find aerosol typing methods that are useful for a larger range of geographic locations and aerosol populations.
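The first typing method places each station's median (SAE, AAE) pair into a plot space where each region corresponds to an aerosol type. That step can be sketched as a simple threshold classifier; the boundary values below are illustrative placeholders loosely patterned on published AAE-vs-SAE schemes, not the values used in the study:

```python
def classify_aerosol(sae, aae):
    """Illustrative aerosol typing from intensive optical properties.

    sae: scattering Angstrom exponent (size proxy: high = small particles)
    aae: absorption Angstrom exponent (composition proxy: high = dust or
         brown carbon).  The thresholds below are hypothetical.
    """
    if sae < 1.0 and aae > 1.5:
        return "dust-dominated"
    if sae < 1.0:
        return "marine / large-particle"
    if aae > 1.5:
        return "biomass burning / brown carbon"
    return "urban-industrial / black carbon"

# one dominant-type label per station median
print(classify_aerosol(sae=0.5, aae=2.1))  # dust-dominated
```

In the first method, each of the 24 station medians would be passed through a rule set of this shape to yield one dominant aerosol type per station; the cluster-analysis method instead groups stations whose medians fall close together in the same plot space.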
Collins, Brian D.; Corbett, Skye C.; Sankey, Joel B.; Fairley, Helen C.
2014-01-01
Along the Colorado River corridor between Glen Canyon Dam and Lees Ferry, Arizona, located some 25 km downstream from the dam, archaeological sites dating from 8,000 years before present through the modern era are located within and on top of fluvial and alluvial terraces of the prehistorically undammed river. These terraces are known to have undergone significant erosion and retreat since emplacement of Glen Canyon Dam in 1963. Land managers and policy makers associated with managing the flow of the Colorado River are interested in understanding how the operations of Glen Canyon Dam have affected the archeological sites associated with these terraces and how dam-controlled flows currently interact with other landscape-shaping processes. In 2012, the U.S. Geological Survey initiated a research project in Glen Canyon to study the types and causes of erosion of the terraces. This report provides the first step towards this understanding by presenting comparative analyses of several types of high-resolution topographic data (airborne lidar, terrestrial lidar, and airborne photogrammetry) that can be used in the future to document and analyze changes to terrace-based archaeological sites. Herein, we present topographic and geomorphologic data for four archaeological sites within a 14 km segment of Glen Canyon using each of the three data sources. In addition to comparing each method’s suitability for adequately representing the topography of the sites, we also analyze the data within each site’s context and describe the geomorphological processes responsible for erosion. Our results show that each method has its own strengths and weaknesses, and that terrestrial and airborne lidar are essentially interchangeable for many important topographic characterization and monitoring purposes.
However, whereas terrestrial lidar provides enhanced capacity for feature recognition and gully morphology delineation, airborne methods (whether by way of laser or optical sensors) are better suited for reach- and regional-scale mapping. Our site-specific geomorphic analyses of the four archeological sites indicate that their current topographical conditions are a result of different and sometimes competing erosional agents, including bedrock- and terrace-based overland flow, fluvial-induced terrace bank collapse, and alluvial-fan-generated debris flows. Although the influences of anthropogenically induced erosion from dam operations are not specifically analyzed in this report, we do identify geomorphic settings where dam operations are either more or less likely to affect archeological site stability. This information can be used to assist with future monitoring efforts of these sites and identification of similar conditions for other archeological sites along the Colorado River corridor in Glen Canyon.
Mixtures and their risk assessment in toxicology.
Mumtaz, Moiz M; Hansen, Hugh; Pohl, Hana R
2011-01-01
For communities generally and for persons living in the vicinity of waste sites specifically, potential exposures to chemical mixtures are genuine concerns. Such concerns often arise from perceptions of a site's higher than anticipated toxicity due to synergistic interactions among chemicals. This chapter outlines some historical approaches to mixtures risk assessment. It also outlines ATSDR's current approach to toxicity risk assessment. The ATSDR's joint toxicity assessment guidance for chemical mixtures addresses interactions among components of chemical mixtures. The guidance recommends a series of steps that include simple calculations for a systematic analysis of data leading to conclusions regarding any hazards chemical mixtures might pose. These conclusions can, in turn, lead to recommendations such as targeted research to fill data gaps, development of new methods using current science, and health education to raise awareness of residents and health care providers. The chapter also provides examples of future trends in chemical mixtures assessment.
A tale of two sequences: microRNA-target chimeric reads.
Broughton, James P; Pasquinelli, Amy E
2016-04-04
In animals, a functional interaction between a microRNA (miRNA) and its target RNA requires only partial base pairing. The limited number of base pair interactions required for miRNA targeting provides miRNAs with broad regulatory potential and also makes target prediction challenging. Computational approaches to target prediction have focused on identifying miRNA target sites based on known sequence features that are important for canonical targeting and may miss non-canonical targets. Current state-of-the-art experimental approaches, such as CLIP-seq (cross-linking immunoprecipitation with sequencing), PAR-CLIP (photoactivatable-ribonucleoside-enhanced CLIP), and iCLIP (individual-nucleotide resolution CLIP), require inference of which miRNA is bound at each site. Recently, the development of methods to ligate miRNAs to their target RNAs during the preparation of sequencing libraries has provided a new tool for the identification of miRNA target sites. The chimeric, or hybrid, miRNA-target reads that are produced by these methods unambiguously identify the miRNA bound at a specific target site. The information provided by these chimeric reads has revealed extensive non-canonical interactions between miRNAs and their target mRNAs, and identified many novel interactions between miRNAs and noncoding RNAs.
Surveying the Commons: Current Implementation of Information Commons Web sites
ERIC Educational Resources Information Center
Leeder, Christopher
2009-01-01
This study assessed the content of 72 academic library Information Commons (IC) Web sites using content analysis, quantitative assessment and qualitative surveys of site administrators to analyze current implementation by the academic library community. Results show that IC Web sites vary widely in content, design and functionality, with few…
megaTALs: a rare-cleaving nuclease architecture for therapeutic genome engineering.
Boissel, Sandrine; Jarjour, Jordan; Astrakhan, Alexander; Adey, Andrew; Gouble, Agnès; Duchateau, Philippe; Shendure, Jay; Stoddard, Barry L; Certo, Michael T; Baker, David; Scharenberg, Andrew M
2014-02-01
Rare-cleaving endonucleases have emerged as important tools for making targeted genome modifications. While multiple platforms are now available to generate reagents for research applications, each existing platform has significant limitations in one or more of three key properties necessary for therapeutic application: efficiency of cleavage at the desired target site, specificity of cleavage (i.e. rate of cleavage at 'off-target' sites), and efficient/facile means for delivery to desired target cells. Here, we describe the development of a single-chain rare-cleaving nuclease architecture, which we designate 'megaTAL', in which the DNA binding region of a transcription activator-like (TAL) effector is used to 'address' a site-specific meganuclease adjacent to a single desired genomic target site. This architecture allows the generation of extremely active and hyper-specific compact nucleases that are compatible with all current viral and nonviral cell delivery methods.
LASIC: Light Activated Site-Specific Conjugation of Native IgGs.
Hui, James Z; Tamsen, Shereen; Song, Yang; Tsourkas, Andrew
2015-08-19
Numerous biological applications, from diagnostic assays to immunotherapies, rely on the use of antibody-conjugates. The efficacy of these conjugates can be significantly influenced by the site at which Immunoglobulin G (IgG) is modified. Current methods that provide control over the conjugation site, however, suffer from a number of shortfalls and often require large investments of time and cost. We have developed a novel adapter protein that, when activated by long wavelength UV light, can covalently and site-specifically label the Fc region of nearly any native, full-length IgG, including all human IgG subclasses. Labeling occurs with unprecedented efficiency and speed (>90% after 30 min), with no effect on IgG affinity. The adapter domain can be bacterially expressed and customized to contain a variety of moieties (e.g., biotin, azide, fluorophores), making reliable and efficient conjugation of antibodies widely accessible to researchers at large.
Brady, Amie M.G.; Bushon, Rebecca N.; Plona, Meg B.
2009-01-01
The Cuyahoga River within Cuyahoga Valley National Park (CVNP) in Ohio is often impaired for recreational use because of elevated concentrations of bacteria, which are indicators of fecal contamination. During the recreational seasons (May through August) of 2004 through 2007, samples were collected at two river sites, one upstream of and one centrally located within CVNP. Bacterial concentrations and turbidity were determined, and streamflow at the time of sampling and rainfall amounts over the 24 hours prior to sampling were ascertained. Statistical models to predict Escherichia coli (E. coli) concentrations were developed for each site (with data from 2004 through 2006) and tested during an independent year (2007). At Jaite, a sampling site near the center of CVNP, the predictive model performed better than the traditional method of determining the current day's water quality using the previous day's E. coli concentration. During 2007, the Jaite model, based on turbidity, produced more correct responses (81 percent) and fewer false negatives (3.2 percent) than the traditional method (68 and 26 percent, respectively). At Old Portage, a sampling site just upstream from CVNP, a predictive model with turbidity and rainfall as explanatory variables did not perform as well as the traditional method. The Jaite model was used to estimate water quality at three other sites in the park; although it did not perform as well as the traditional method, it performed well, yielding between 68 and 91 percent correct responses. Further research would be necessary to determine whether using the Jaite model to predict recreational water quality elsewhere on the river would provide accurate results.
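The "correct responses" and "false negatives" reported above come down to counting, for each day, whether the predicted concentration and the observed concentration fall on the same side of a recreational threshold. A minimal sketch, using a hypothetical threshold and invented concentrations (not the study's criterion or data):

```python
# Illustrative scoring logic for the predictive-model vs. "traditional"
# (previous-day persistence) comparison.  THRESHOLD and the concentrations
# below are invented for illustration.
THRESHOLD = 235  # hypothetical E. coli threshold, CFU/100 mL

def response_rates(predicted, observed, threshold=THRESHOLD):
    """Return (fraction of correct exceedance calls, fraction of false
    negatives: predicted safe while the observed value exceeded)."""
    n = len(predicted)
    correct = sum((p >= threshold) == (o >= threshold)
                  for p, o in zip(predicted, observed))
    false_neg = sum(p < threshold <= o for p, o in zip(predicted, observed))
    return correct / n, false_neg / n

# four hypothetical days: model prediction vs. what was actually measured
correct, fn = response_rates([100, 300, 50, 400], [240, 310, 40, 100])
print(correct, fn)  # 0.5 0.25
```

For the persistence method, `predicted` would simply be the observed series shifted by one day; the same scoring function then makes the two methods directly comparable, as in the 81-vs-68-percent comparison at Jaite.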
Bridge scour countermeasure assessments at select bridges in the United States, 2014–16
Dudunake, Taylor J.; Huizinga, Richard J.; Fosness, Ryan L.
2017-05-23
In 2009, the Federal Highway Administration published Hydraulic Engineering Circular No. 23 (HEC-23) to provide specific design and implementation guidelines for bridge scour and stream instability countermeasures. However, the effectiveness of countermeasures implemented over the past decade following those guidelines has not been evaluated. Therefore, in 2013, the U.S. Geological Survey, in cooperation with the Federal Highway Administration, began a study to assess the current condition of bridge-scour countermeasures at selected sites to evaluate their effectiveness. Bridge-scour countermeasures were assessed during 2014-2016. Site assessments included reviewing countermeasure design plans, summarizing the peak and daily streamflow history, and surveying each site. Each site survey included a photo log summary, field form, and topographic and bathymetric geospatial data and metadata. This report documents the study area and site-selection criteria, explains the survey methods used to evaluate the condition of countermeasures, and presents the complete documentation for each countermeasure assessment.
Accurate and sensitive quantification of protein-DNA binding affinity.
Rastogi, Chaitanya; Rube, H Tomas; Kribelbauer, Judith F; Crocker, Justin; Loker, Ryan E; Martini, Gabriella D; Laptenko, Oleg; Freed-Pastor, William A; Prives, Carol; Stern, David L; Mann, Richard S; Bussemaker, Harmen J
2018-04-17
Transcription factors (TFs) control gene expression by binding to genomic DNA in a sequence-specific manner. Mutations in TF binding sites are increasingly found to be associated with human disease, yet we currently lack robust methods to predict these sites. Here, we developed a versatile maximum likelihood framework named No Read Left Behind (NRLB) that infers a biophysical model of protein-DNA recognition across the full affinity range from a library of in vitro selected DNA binding sites. NRLB predicts human Max homodimer binding in near-perfect agreement with existing low-throughput measurements. It can capture the specificity of the p53 tetramer and distinguish multiple binding modes within a single sample. Additionally, we confirm that newly identified low-affinity enhancer binding sites are functional in vivo, and that their contribution to gene expression matches their predicted affinity. Our results establish a powerful paradigm for identifying protein binding sites and interpreting gene regulatory sequences in eukaryotic genomes. Copyright © 2018 the Author(s). Published by PNAS.
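The core of a biophysical model like the one NRLB infers is a position-specific binding-energy matrix, from which the relative affinity of any candidate site follows by Boltzmann weighting. A toy sketch with an invented three-position matrix (NRLB's actual parameters are estimated by maximum likelihood from selection libraries and are not reproduced here):

```python
import math

# Invented three-position energy matrix (ddG in kT; 0 = preferred base).
# NRLB estimates parameters of this kind by maximum likelihood; the
# numbers here are placeholders for illustration only.
ENERGY = [
    {"A": 0.0, "C": 1.2, "G": 0.8, "T": 1.5},
    {"A": 1.0, "C": 0.0, "G": 1.3, "T": 0.9},
    {"A": 0.7, "C": 1.1, "G": 0.0, "T": 1.4},
]

def relative_affinity(site):
    """Affinity relative to the optimal site, via Boltzmann weighting of
    the summed per-position binding energies."""
    ddg = sum(ENERGY[i][base] for i, base in enumerate(site))
    return math.exp(-ddg)

print(relative_affinity("ACG"))  # 1.0 (the optimal site)
print(round(relative_affinity("TCG"), 4))  # 0.2231
```

Because affinities are continuous rather than thresholded, a model of this form can rank low-affinity sites, which is what allows the newly identified low-affinity enhancer binding sites to be matched against their measured contribution to gene expression.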
Integrating Building Information Modeling and Health and Safety for Onsite Construction
Ganah, Abdulkadir; John, Godfaurd A.
2014-01-01
Background Health and safety (H&S) on a construction site can either make or break a contractor, if not properly managed. The usage of Building Information Modeling (BIM) for H&S on construction execution has the potential to augment practitioner understanding of their sites, and by so doing reduce the probability of accidents. This research explores BIM usage within the construction industry in relation to H&S communication. Methods In addition to an extensive literature review, a questionnaire survey was conducted to gather information on the embedment of H&S planning with the BIM environment for site practitioners. Results The analysis of responses indicated that BIM will enhance the current approach of H&S planning for construction site personnel. Conclusion From the survey, toolbox talk will have to be integrated with the BIM environment, because it is the predominantly used procedure for enhancing H&S issues within construction sites. The advantage is that personnel can visually understand H&S issues as work progresses during the toolbox talk onsite. PMID:25830069
Li, Jingkun; Alsudairi, Amell; Ma, Zi-Feng; Mukerjee, Sanjeev; Jia, Qingying
2017-02-01
Proper understanding of the major limitations of current catalysts for the oxygen reduction reaction (ORR) is essential for further advancement. Herein, by studying representative Pt and non-Pt ORR catalysts with a wide range of redox potentials (E_redox) via combined electrochemical, theoretical, and in situ spectroscopic methods, we demonstrate that the role of the site-blocking effect in limiting the ORR varies drastically depending on the E_redox of the active sites, and that the intrinsic activity of active sites with low E_redox has been markedly underestimated because this effect has been overlooked. Accordingly, we establish a general asymmetric volcano trend in ORR activity: the ORR of catalysts on the overly high E_redox side of the volcano is limited by intrinsic activity, whereas the ORR of catalysts on the low E_redox side is limited by the site-blocking effect, the intrinsic activity, or both, depending on E_redox.
The design and construction of an interactive website concerning biomedical photography.
Williams, Robin; Williams, Gigi
2003-06-01
The purpose of this communication is to make readers aware of what the authors believe is an important online resource about medical and scientific photography for doctors, scientists and students. It is a freely accessible website; its URL is http://msp.rmit.edu.au. The site is designed as a resource base: it is not meant to be a 'course', but the reader will find much practical information about the technique and applications of scientific imaging methods. The site is currently a comprehensive collection of resources relating to invisible radiation photography, but there are plans to expand the site to a range of clinical recording topics, and other potential contributors are asked to join the project. It contains a vast collection of photographs from many photographers as well as graphs, diagrams, tables and references. This paper also discusses some of the important issues surrounding the 'publication' of such a site, including currency, access versus credibility, technological obsolescence, site design and usage.
NASA Astrophysics Data System (ADS)
Van Damme, T.
2015-04-01
Computer Vision Photogrammetry allows archaeologists to accurately record underwater sites in three dimensions using simple two-dimensional picture or video sequences, automatically processed in dedicated software. In this article, I share my experience in working with one such software package, namely PhotoScan, to record a Dutch shipwreck site. In order to demonstrate the method's reliability and flexibility, the site in question is reconstructed from simple GoPro footage, captured in low-visibility conditions. Based on the results of this case study, Computer Vision Photogrammetry compares very favourably to manual recording methods both in recording efficiency and in the quality of the final results. In a final section, the significance of Computer Vision Photogrammetry is then assessed from a historical perspective, by placing the current research in the wider context of about half a century of successful use of Analytical and later Digital photogrammetry in the field of underwater archaeology. I conclude that while photogrammetry has been used in our discipline for several decades now, for various reasons the method was only ever used by a relatively small percentage of projects. This is likely to change in the near future since, compared to the 'traditional' photogrammetry approaches employed in the past, today Computer Vision Photogrammetry is easier to use, more reliable and more affordable than ever before, while at the same time producing more accurate and more detailed three-dimensional results.
Jones, Hendrée E.; Fischer, Gabriele; Heil, Sarah H.; Kaltenbach, Karol; Martin, Peter R.; Coyle, Mara G.; Selby, Peter; Stine, Susan M.; O’Grady, Kevin E.; Arria, Amelia M.
2015-01-01
Aims The Maternal Opioid Treatment: Human Experimental Research (MOTHER) project, an eight-site randomized, double-blind, double-dummy, flexible-dosing, parallel-group clinical trial is described. This study is the most current – and single most comprehensive – research effort to investigate the safety and efficacy of maternal and prenatal exposure to methadone and buprenorphine. Methods The MOTHER study design is outlined, and its basic features are presented. Conclusions At least seven important lessons have been learned from the MOTHER study: (1) an interdisciplinary focus improves the design and methods of a randomized clinical trial; (2) multiple sites in a clinical trial present continuing challenges to the investigative team due to variations in recruitment goals, patient populations, and hospital practices that in turn differentially impact recruitment rates, treatment compliance, and attrition; (3) study design and protocols must be flexible in order to meet the unforeseen demands of both research and clinical management; (4) staff turnover needs to be addressed with a proactive focus on both hiring and training; (5) the implementation of a protocol for the treatment of a particular disorder may identify important ancillary clinical issues worthy of investigation; (6) timely tracking of data in a multi-site trial is both demanding and unforgiving; and, (7) complex multi-site trials pose unanticipated challenges that complicate the choice of statistical methods, thereby placing added demands on investigators to effectively communicate their results. PMID:23106924
Gaffron, Philine; Niemeier, Deb
2015-01-01
It has been shown that the location of schools near heavily trafficked roads can have detrimental effects on the health of children attending those schools. It is therefore desirable to screen both existing school locations and potential new school sites to assess either the need for remedial measures or suitability for the intended use. Current screening tools and public guidance on school siting are either too coarse in their spatial resolution for assessing individual sites or are highly resource intensive in their execution (e.g., through dispersion modeling). We propose a new method to help bridge the gap between these two approaches. Using this method, we also examine the public K-12 schools in the Sacramento Area Council of Governments Region, California (USA) from an environmental justice perspective. We find that PM2.5 emissions from road traffic affecting a school site are significantly positively correlated with the following metrics: the percent share of Black, Hispanic, and multi-ethnic students, and the percent share of students eligible for subsidized meals. The emissions metric correlates negatively with the schools’ Academic Performance Index, the share of White students, and average parental education levels. Our PM2.5 metric also correlates with the traffic-related, census-tract-level screening indicators from the California Communities Environmental Health Screening Tool and the tool’s tract-level rate of asthma-related emergency department visits. PMID:25679341
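The positive and negative associations reported above are correlations between a school-level traffic-PM2.5 metric and each demographic or performance metric, with the sign carrying the environmental-justice interpretation. A minimal Pearson-correlation sketch with invented per-school data (the paper's statistics, data, and choice of correlation measure are not reproduced here):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient; a positive r means the metric
    rises with the traffic-PM2.5 metric, a negative r means it falls."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical per-school data: traffic PM2.5 metric vs. % subsidized meals
pm25 = [1.2, 2.5, 3.1, 4.0, 5.2]
meals = [20, 35, 40, 55, 60]
print(round(pearson_r(pm25, meals), 3))  # 0.984
```

The same computation, run against the Academic Performance Index or parental-education series, would yield the negative coefficients described in the abstract.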
The prediction of palmitoylation site locations using a multiple feature extraction method.
Shi, Shao-Ping; Sun, Xing-Yu; Qiu, Jian-Ding; Suo, Sheng-Bao; Chen, Xiang; Huang, Shu-Yun; Liang, Ru-Ping
2013-03-01
As an extremely important and ubiquitous post-translational lipid modification, palmitoylation plays a significant role in a variety of biological and physiological processes. Unlike other lipid modifications, protein palmitoylation and depalmitoylation are highly dynamic and can regulate both protein function and localization. The dynamic nature of palmitoylation is poorly understood because of the limitations in current assay methods. The in vivo or in vitro experimental identification of palmitoylation sites is both time consuming and expensive. Due to the large volume of protein sequences generated in the post-genomic era, it is extraordinarily important in both basic research and drug discovery to rapidly identify the attributes of a new protein's palmitoylation sites. In this work, a new computational method, WAP-Palm, combining multiple feature extraction, has been developed to predict the palmitoylation sites of proteins. The performance of the WAP-Palm model is measured herein and was found to have a sensitivity of 81.53%, a specificity of 90.45%, an accuracy of 85.99% and a Matthews correlation coefficient of 72.26% in 10-fold cross-validation test. The results obtained from both the cross-validation and independent tests suggest that the WAP-Palm model might facilitate the identification and annotation of protein palmitoylation locations. The online service is available at http://bioinfo.ncu.edu.cn/WAP-Palm.aspx. Copyright © 2013 Elsevier Inc. All rights reserved.
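The performance figures quoted above (sensitivity, specificity, accuracy, and the Matthews correlation coefficient) all derive from confusion-matrix counts of the 10-fold cross-validation. A small sketch of how they are computed; the counts below are invented for illustration, not WAP-Palm's actual results:

```python
import math

def classification_metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, accuracy and the Matthews correlation
    coefficient (MCC) from confusion-matrix counts."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + tn + fp + fn)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return sens, spec, acc, mcc

# invented counts for illustration
sens, spec, acc, mcc = classification_metrics(tp=50, tn=90, fp=10, fn=50)
print(sens, spec, acc, round(mcc, 3))  # 0.5 0.9 0.7 0.436
```

Note that the abstract reports MCC as a percentage (72.26%); the coefficient itself ranges from -1 to 1, so that figure corresponds to an MCC of about 0.72.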
Carter, J.L.; Purcell, A.H.; Fend, S.V.; Resh, V.H.
2009-01-01
Research that explores the biological response to urbanization on a site-specific scale is necessary for management of urban basins. Recent studies have proposed a method to characterize the biological response of benthic macroinvertebrates along an urban gradient for several climatic regions in the USA. Our study demonstrates how this general framework can be refined and applied on a smaller scale to an urbanized basin, the Santa Clara Basin (surrounding San Jose, California, USA). Eighty-four sampling sites on 14 streams in the Santa Clara Basin were used for assessing local stream conditions. First, an urban index composed of human population density, road density, and urban land cover was used to determine the extent of urbanization upstream from each sampling site. Second, a multimetric biological index was developed to characterize the response of macroinvertebrate assemblages along the urban gradient. The resulting biological index included metrics from 3 ecological categories: taxonomic composition (Ephemeroptera, Plecoptera, and Trichoptera), functional feeding group (shredder richness), and habit (clingers). The 90th-quantile regression line was used to define the best available biological conditions along the urban gradient, which we define as the predicted biological potential. This descriptor was then used to determine the relative condition of sites throughout the basin. Hierarchical partitioning of variance revealed that several site-specific variables (dissolved O2 and temperature) were significantly related to a site's deviation from its predicted biological potential. Spatial analysis of each site's deviation from its biological potential indicated geographic heterogeneity in the distribution of impaired sites. The presence and operation of local dams optimize water use, but modify natural flow regimes, which in turn influence stream habitat, dissolved O2, and temperature.
Current dissolved O2 and temperature regimes deviate from natural conditions and appear to affect benthic macroinvertebrate assemblages. The assessment methods presented in our study provide finer-scale assessment tools for managers in urban basins. © North American Benthological Society.
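The "predicted biological potential" is the 90th-quantile regression line of the biological index against the urban index: a line below which roughly 90% of sites fall, tracing the best available condition at each level of urbanization. Quantile regression minimises an asymmetric "pinball" loss; the coarse grid-search fit below is a toy stand-in for a proper solver, with invented data:

```python
import itertools

def pinball_loss(y_true, y_pred, q=0.9):
    """Asymmetric check loss whose minimiser is the q-th conditional quantile."""
    loss = 0.0
    for yt, yp in zip(y_true, y_pred):
        err = yt - yp
        loss += q * err if err >= 0 else (q - 1.0) * err
    return loss

def fit_quantile_line(x, y, q=0.9):
    """Brute-force grid search for the straight line minimising the
    q-quantile loss.  A toy stand-in for real quantile regression: the
    fitted 90th-quantile line plays the role of the predicted biological
    potential along the urban gradient."""
    slopes = [s / 10.0 for s in range(-50, 51)]
    intercepts = [i / 10.0 for i in range(-100, 201)]
    best = min(
        ((pinball_loss(y, [m * xi + b for xi in x], q), m, b)
         for m, b in itertools.product(slopes, intercepts)),
        key=lambda t: t[0],
    )
    return best[1], best[2]  # slope, intercept

# invented sites: biological index declines with the urban index
x = list(range(10))
y = [10 - 0.5 * xi - (xi % 3) for xi in x]
m, b = fit_quantile_line(x, y)
print(m, b)
```

A site's condition would then be scored by its vertical deviation from this envelope, which is the descriptor the hierarchical-partitioning step relates to dissolved O2 and temperature.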
2013-01-01
Background Daily pain and multi-site pain are both associated with reduction in work ability and health-related quality of life (HRQoL) among adults. However, no population-based studies have yet investigated the prevalence of daily and multi-site pain among adolescents and how these are associated with respondent characteristics. The purpose of this study was to investigate the prevalence of self-reported daily and multi-site pain among adolescents aged 12–19 years and associations of almost daily pain and multi-site pain with respondent characteristics (sex, age, body mass index, HRQoL and sports participation). Methods A population-based cross-sectional study was conducted among 4,007 adolescents aged 12–19 years in Denmark. Adolescents answered an online questionnaire during physical education lessons. The questionnaire contained a mannequin divided into 12 regions on which the respondents indicated their current pain sites and pain frequency (rarely, monthly, weekly, more than once per week, almost daily pain), characteristics, sports participation and HRQoL measured by the EuroQoL 5D. Multivariate regression was used to calculate the odds ratio for the association between almost daily pain, multi-site pain and respondent characteristics. Results The response rate was 73.7%. A total of 2,953 adolescents (62% females) answered the questionnaire. 33.3% reported multi-site pain (pain in >1 region) while 19.8% reported almost daily pain. 61% reported current pain in at least one region with knee and back pain being the most common sites. Female sex (OR: 1.35-1.44) and a high level of sports participation (OR: 1.51-2.09) were associated with increased odds of having almost daily pain and multi-site pain. Better EQ-5D score was associated with decreased odds of having almost daily pain or multi-site pain (OR: 0.92-0.94). 
Conclusion In this population-based cohort of school-attending Danish adolescents, nearly two out of three reported current pain and, on average, one out of three reported pain in more than one body region. Female sex and a high level of sports participation were associated with increased odds of having almost daily pain and multi-site pain. The study highlights an important health issue that calls for investigations to improve our understanding of adolescent pain and our capacity to prevent and treat this condition. PMID:24252440
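The odds ratios quoted above came from multivariate regression, but the underlying quantity, with a Wald confidence interval, can be sketched from a 2x2 table. The counts below are invented for illustration and are not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# invented counts: almost-daily pain among females (a, b) vs. males (c, d)
or_, (lo, hi) = odds_ratio_ci(40, 60, 25, 75)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.0 1.09 3.66
```

An interval whose lower bound stays above 1, as in the reported ranges (e.g., OR 1.35-1.44 for female sex), is what supports calling the association an increase in odds.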
A summation of online recruiting practices for health care organizations.
Gautam, Kanak S
2005-01-01
Worker shortage is among the foremost challenges facing US health care today. Health care organizations are also confronted with rising costs of recruiting and compensating scarce workers in times of declining reimbursement. Many health care organizations are adopting online recruitment as a nontraditional, low-cost method for hiring staff. Online recruitment is the fastest growing method of recruitment today, and has advantages over traditional recruiting in terms of cost, reach, and time-saving. Several health care organizations have achieved great success in recruiting online. Yet awareness of online recruiting remains lower among health care managers than managers in other industries. Many health care organizations still search for job candidates within a 30-mile radius using traditional methods. This article describes the various aspects of online recruitment for health care organizations. It is meant to help health care managers currently recruiting online by answering frequently asked questions (e.g., Should I be advertising on national job sites? Why is my Web site not attracting job seekers? Is my online ad effective?). It is also meant to educate health care managers who are not yet recruiting online, in the hope that they will try it. The article discusses the salient aspects of online recruiting: (a) using commercial job boards; (b) building one's own career center; (c) building one's own job board; (d) collecting and storing resumes; (e) attracting job seekers to one's Web site; (f) creating online job ads; (g) screening and evaluating candidates online; and (h) building long-term relationships with candidates. Job seekers in health care are adopting the Internet faster than health care employers. To recruit successfully during the current labor shortage, it is imperative that employers adopt and expand online recruiting.
Ratheal, Ian M.; Virgin, Gail K.; Yu, Haibo; Roux, Benoît; Gatto, Craig; Artigas, Pablo
2010-01-01
The Na/K pump is a P-type ATPase that exchanges three intracellular Na+ ions for two extracellular K+ ions through the plasmalemma of nearly all animal cells. The mechanisms involved in cation selection by the pump's ion-binding sites (site I and site II bind either Na+ or K+; site III binds only Na+) are poorly understood. We studied cation selectivity by outward-facing sites (high K+ affinity) of Na/K pumps expressed in Xenopus oocytes, under voltage clamp. Guanidinium+, methylguanidinium+, and aminoguanidinium+ produced two phenomena possibly reflecting actions at site III: (i) voltage-dependent inhibition (VDI) of outwardly directed pump current at saturating K+, and (ii) induction of pump-mediated, guanidinium-derivative–carried inward current at negative potentials without Na+ and K+. In contrast, formamidinium+ and acetamidinium+ induced K+-like outward currents. Measurement of ouabain-sensitive ATPase activity and radiolabeled cation uptake confirmed that these cations are external K+ congeners. Molecular dynamics simulations indicate that bound organic cations induce minor distortion of the binding sites. Among tested metals, only Li+ induced Na+-like VDI, whereas all metals tested except Na+ induced K+-like outward currents. Pump-mediated K+-like organic cation transport challenges the concept of rigid structural models in which ion specificity at site I and site II arises from a precise and unique arrangement of coordinating ligands. Furthermore, actions by guanidinium+ derivatives suggest that Na+ binds to site III in a hydrated form and that the inward current observed without external Na+ and K+ represents cation transport when normal occlusion at sites I and II is impaired. These results provide insights on external ion selectivity at the three binding sites. PMID:20937860
Huang, Weiyan; Zhao, Mei; Wei, Na; Wang, Xiaoxia; Cao, Huqing; Du, Quan; Liang, Zicai
2014-01-01
Potent RNase activities have been found in the serum of mammals, but the physiological function of these RNases has never been well characterized, largely due to limitations in existing methods of RNase activity measurement: none of them can distinguish between RNases with different target specificities. A systematic study was recently carried out in our lab to investigate the site-specificity of serum RNases on double-stranded RNA substrates, which found that serum RNases cleave double-stranded RNAs predominantly at 5′-U/A-3′ and 5′-C/A-3′ dinucleotide sites, in a manner closely resembling RNase A. Based on this finding, a FRET assay was developed in the current study to measure this site-specific serum RNase activity in human samples using a double-stranded RNA substrate. We demonstrated that the method has a dynamic range of 10−5 to 10−1 mg/ml using serial dilutions of RNase A. Sera from 303 cancer patients were compared with those of 128 healthy controls, and serum RNase activities visualized with this site-specific double-stranded probe were found to be significantly reduced in patients with gastric, liver, pancreatic, esophageal, ovarian, cervical, bladder, kidney and lung cancer, while only minor changes were found in breast and colon cancer patients. This is the first report using double-stranded RNA as a probe to quantify site-specific activities of RNase A in serum. The results suggest that RNase A might be further evaluated to determine whether it can serve as a new class of biomarker for certain cancer types. PMID:24805924
Review of methods used by chiropractors to determine the site for applying manipulation
2013-01-01
Background With the development of increasing evidence for the use of manipulation in the management of musculoskeletal conditions, there is growing interest in identifying the appropriate indications for care. Recently, attempts have been made to develop clinical prediction rules, however the validity of these clinical prediction rules remains unclear and their impact on care delivery has yet to be established. The current study was designed to evaluate the literature on the validity and reliability of the more common methods used by doctors of chiropractic to inform the choice of the site at which to apply spinal manipulation. Methods Structured searches were conducted in Medline, PubMed, CINAHL and ICL, supported by hand searches of archives, to identify studies of the diagnostic reliability and validity of common methods used to identify the site of treatment application. To be included, studies were to present original data from studies of human subjects and be designed to address the region or location of care delivery. Only English language manuscripts from peer-reviewed journals were included. The quality of evidence was ranked using QUADAS for validity and QAREL for reliability, as appropriate. Data were extracted and synthesized, and were evaluated in terms of strength of evidence and the degree to which the evidence was favourable for clinical use of the method under investigation. Results A total of 2594 titles were screened from which 201 articles met all inclusion criteria. The spectrum of manuscript quality was quite broad, as was the degree to which the evidence favoured clinical application of the diagnostic methods reviewed. The most convincing favourable evidence was for methods which confirmed or provoked pain at a specific spinal segmental level or region. There was also high quality evidence supporting the use, with limitations, of static and motion palpation, and measures of leg length inequality. 
Evidence of mixed quality supported the use, with limitations, of postural evaluation. The evidence was unclear on the applicability of measures of stiffness and the use of spinal x-rays. The evidence was of mixed quality, but unfavourable for the use of manual muscle testing, skin conductance, surface electromyography and skin temperature measurement. Conclusions A considerable range of methods is in use for determining where in the spine to administer spinal manipulation. The currently published evidence falls across a spectrum ranging from strongly favourable to strongly unfavourable in regard to using these methods. In general, the stronger and more favourable evidence is for those procedures which take a direct measure of the presumptive site of care: methods involving pain provocation upon palpation or localized tissue examination. Procedures which involve some indirect assessment for identifying the manipulable lesion of the spine, such as skin conductance or thermography, tend not to be supported by the available evidence. PMID:24499598
Under the current regulations (CFR 503), Class B biosolids may be land applied with certain site restrictions. One method for achieving Class B status is to raise the pH of the sludge to >12 for a minimum of 2 hours with an alkaline material (normally lime). Alternately, a Clas...
ERIC Educational Resources Information Center
Malesky, L. Alvin; Peters, Chris
2012-01-01
The vast majority of university students have profiles on social networking sites (e.g., Myspace, Facebook) (Salaway et al. 2008). However, it is yet to be determined what role this rapidly evolving method of communication will play in an academic setting. Data for the current study was collected from 459 university students and 159 university…
Alternatives to Retroorbital Blood Collection in Hispid Cotton Rats (Sigmodon hispidus)
Ayers, Jessica D; Rota, Paul A; Collins, Marcus L; Drew, Clifton P
2012-01-01
Cotton rats (Sigmodon hispidus) are a valuable animal model for many human viral diseases, including polio virus, measles virus, respiratory syncytial virus, and herpes simplex virus. Although cotton rats have been used in research since 1939, few publications address handling and sampling techniques for this species, and the retroorbital sinus remains the recommended blood sampling site. Here we assessed blood sampling methods that are currently used in other species and a novel subzygomatic sampling site for their use in S. hispidus. The subzygomatic approach accesses a venous sinus that possibly is unique to this species and that lies just below the zygomatic arch of the maxilla and deep to the masseter muscle. We report that both the novel subzygomatic approach and the sublingual vein method can be used effectively in cotton rats. PMID:22776125
Degnan, James R.; Moore, Richard Bridge; Mack, Thomas J.
2001-01-01
Bedrock-fracture zones near high-yield bedrock wells in southern New Hampshire well fields were located and characterized using seven surface and six borehole geophysical survey methods. Detailed surveys of six sites with various methods provide an opportunity to integrate and compare survey results. Borehole geophysical surveys were conducted at three of the sites to confirm subsurface features. Hydrogeologic settings, including a variety of bedrock and surface geologic materials, were sought to gain an insight into the usefulness of the methods in varied terrains. Results from 15 survey lines, 8 arrays, and 3 boreholes were processed and interpreted from the 6 sites. The surface geophysical methods used provided physical properties of fractured bedrock. Seismic refraction and ground-penetrating radar (GPR) primarily were used to characterize the overburden materials, but in a few cases indicated bedrock-fracture zones. Magnetometer surveys were used to obtain background information about the bedrock to compare with other results, and to search for magnetic lows, which may result from weathered fractured rock. Electromagnetic terrain conductivity surveys (EM) and very-low-frequency electromagnetic surveys (VLF) were used as rapid reconnaissance techniques with the primary purpose of identifying electrical anomalies, indicating potential fracture zones in bedrock. Direct-current (dc) resistivity methods were used to gather detailed subsurface information about fracture depth and orientation. Two-dimensional (2-D) dc-resistivity surveys using dipole-dipole and Schlumberger arrays located and characterized the overburden, bedrock, and bedrock-fracture zones through analysis of data inversions. Azimuthal square array dc-resistivity survey results indicated orientations of conductive steep-dipping bedrock-fracture zones that were located and characterized by previously applied geophysical methods. 
Various available data sets were used for site selection, characterizations, and interpretations. Lineament data, developed as a part of a statewide and regional scale investigation of the bedrock aquifer, were available to identify potential near-vertical fracture zones. Geophysical surveys indicated fracture zones coincident with lineaments at 4 of the sites. Geologic data collected as a part of the regional scale investigation provided outcrop fracture measurements, ductile fabric, and contact information. Dominant fracture trends correspond to the trends of geophysical anomalies at 4 of the sites. Water-well drillers' logs from water supply and environmental data sets also were used where available to characterize sites. Regional overburden information was compiled from stratified-drift aquifer maps and surficial-geological maps.
Bayraktarov, Elisa; Pizarro, Valeria; Eidens, Corvin; Wilke, Thomas; Wild, Christian
2013-01-01
Coral bleaching events are globally occurring more frequently and with higher intensity, mainly caused by increases in seawater temperature. In Tayrona National Natural Park (TNNP) in the Colombian Caribbean, local coral communities are subjected to seasonal wind-triggered upwelling events coinciding with stronger water currents depending on location. This natural phenomenon offers the unique opportunity to study potential water current-induced mitigation mechanisms of coral bleaching in an upwelling influenced region. Therefore, coral bleaching susceptibility and recovery patterns were compared during a moderate and a mild bleaching event in December 2010 and 2011, and at the end of the subsequent upwelling periods at a water current-exposed and -sheltered site of an exemplary bay using permanent transect and labeling tools. This was accompanied by parallel monitoring of key environmental variables. Findings revealed that in 2010 overall coral bleaching before upwelling was significantly higher at the sheltered (34%) compared to the exposed site (8%). Whereas 97% of all previously bleached corals at the water current-exposed site had recovered from bleaching by April 2011, only 77% recovered at the sheltered site, but 12% had died there. In December 2011, only mild bleaching (<10% at both sites) was observed, but corals recovered significantly at both sites in the course of upwelling. No differences in water temperatures between sites occurred, but water current exposure and turbidity were significantly higher at the exposed site, suggesting that these variables may be responsible for the observed site-specific mitigation of coral bleaching. This indicates the existence of local resilience patterns against coral bleaching in Caribbean reefs. PMID:24282551
A New Global Open Source Marine Hydrocarbon Emission Site Database
NASA Astrophysics Data System (ADS)
Onyia, E., Jr.; Wood, W. T.; Barnard, A.; Dada, T.; Qazzaz, M.; Lee, T. R.; Herrera, E.; Sager, W.
2017-12-01
Hydrocarbon emission sites (e.g. seeps) discharge large volumes of fluids and gases into the oceans that are not only important for biogeochemical budgets, but also support abundant chemosynthetic communities. Documenting the locations of modern emissions is a first step towards understanding and monitoring how they affect the global state of the seafloor and oceans. Currently, no global open source (i.e. non-proprietary) detailed maps of emission sites are available. As a solution, we have created a database that is housed within an Excel spreadsheet and use the latest versions of Earthpoint and Google Earth for position coordinate conversions and data mapping, respectively. To date, approximately 1,000 data points have been collected from referenceable sources across the globe, and we are continually expanding the dataset. Because the spatial extents encountered vary widely, each site is identified in one of two ways: 1) a point (x, y, z) location for an individual site, or 2) a delineated area where sites are clustered. Certain well-known areas, such as the Gulf of Mexico and the Mediterranean Sea, have a greater abundance of information, whereas significantly less information is available in other regions due to the absence of emission sites, lack of data, or because the existing data are proprietary. Although the geographical extent of the data is currently restricted to regions where the most data are publicly available, as the database matures we expect to have more complete coverage of the world's oceans. This database is an information resource that consolidates and organizes the existing literature on hydrocarbons released into the marine environment, thereby providing a comprehensive reference for future work. We expect that the availability of seafloor hydrocarbon emission maps will benefit scientific understanding of hydrocarbon-rich areas as well as potentially aiding hydrocarbon exploration and environmental impact assessments.
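One routine normalization step such a database requires is converting the degrees-minutes-seconds coordinates reported in source papers into the decimal degrees used for mapping (the role Earthpoint plays in the project). A minimal sketch of that conversion, with an invented example location:

```python
# Hypothetical coordinate-normalization helper; the example seep location
# below is invented for illustration, not a database entry.

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert a degrees-minutes-seconds coordinate to signed decimal degrees."""
    value = abs(degrees) + minutes / 60.0 + seconds / 3600.0
    return -value if hemisphere in ("S", "W") else value

# A seep location reported as 27 deg 46' 12" N, 91 deg 30' 36" W:
lat = dms_to_decimal(27, 46, 12, "N")   # 27.77
lon = dms_to_decimal(91, 30, 36, "W")   # -91.51
```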
Detection of heavy metal by paper-based microfluidics.
Lin, Yang; Gritsenko, Dmitry; Feng, Shaolong; Teh, Yi Chen; Lu, Xiaonan; Xu, Jie
2016-09-15
Heavy metal pollution poses a great threat to the environment and public health worldwide. Current methods for the detection of heavy metals require expensive instrumentation and laborious operation, which can only be accomplished in centralized laboratories. Various microfluidic paper-based analytical devices have been developed recently as simple, cheap and disposable alternatives to conventional ones for on-site detection of heavy metals. In this review, we first summarize the current development of paper-based analytical devices and discuss the selection of paper substrates, methods of device fabrication, and relevant theories in these devices. We then compare and categorize recent reports on the detection of heavy metals using paper-based microfluidic devices on the basis of various detection mechanisms, such as colorimetric, fluorescent, and electrochemical methods. Finally, future developments and trends in this field are discussed. Copyright © 2016 Elsevier B.V. All rights reserved.
Development and Application of Fiber Bragg Grating Clinometer
NASA Astrophysics Data System (ADS)
Guo, Xin; Li, Wen; Wang, Wentao; Feng, Xiaoyu
2017-06-01
Using FBG (fiber Bragg grating) technology in clinometers can solve technological problems faced by wireless transmission devices, such as large data-transfer volumes and poor stability, and has therefore been receiving more and more attention. This paper discusses a new clinometer designed by upgrading current clinometers, installing fiber grating strain gauges and fiber thermometers, and carrying out studies on equipment upgrading, on-site setup, and data acquisition and analysis. In addition, it presents a method for calculating displacement change from wavelength change; this method was used in safety monitoring of the right side slope of the Longyong Expressway ZK56+860 ~ ZK56+940 Section. The data show that the device operates well with high accuracy, and that the slope is currently in a steady state. Together, the equipment improvement and the method provide reference data for safety analysis of the side slope.
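The wavelength-to-displacement idea rests on the standard FBG strain relation. A minimal sketch of the first step, assuming a typical photo-elastic coefficient for silica fiber and temperature compensation by the fiber thermometers (the paper's actual calibration constants are not reproduced here):

```python
# Illustrative FBG strain calculation; P_E is an assumed typical value,
# not a constant from the paper.

P_E = 0.22   # photo-elastic coefficient of silica fiber (typical literature value)

def strain_from_shift(lambda_0_nm, delta_lambda_nm):
    """Axial strain inferred from a temperature-compensated Bragg wavelength shift."""
    return delta_lambda_nm / (lambda_0_nm * (1.0 - P_E))

# A 1550 nm grating shifting by +0.012 nm gives roughly 10 microstrain:
eps = strain_from_shift(1550.0, 0.012)
```

A gauge-specific calibration factor would then map strain readings along the casing to the displacement change reported to the monitoring system.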
Pen rearing and imprinting of fall Chinook salmon. Annual report 1989
Beeman, J.W.; Novotny, J.F.
1990-01-01
The goal of this project is to compare net-pen rearing methods to traditional hatchery methods of rearing upriver bright fall chinook salmon (Oncorhynchus tshawytscha). Fish were reared at several densities in net pens at three Columbia River backwater sites during 1984-1987, and in a barrier net at one site during 1984-1986; methods included both fed and unfed treatments. The purpose of this report is to summarize the results obtained from the unfed treatments and the current return of adults from all fed treatments and the barrier net. Zooplankton were the primary food item of unfed fish. Fish reared in net pens utilized insects colonizing the nets as an additional food source, whereas those reared in the barrier net did not. Growth and production of fish reared in the unfed treatments were low. Instantaneous growth rates of unfed fish were much lower than those of the fed treatments and hatchery controls except when zooplankton densities were high and chironomid larvae were important in the diet of unfed fish reared in pens. Only fish in the barrier net treatment showed consistent net gains in growth and production over the rearing periods. Adult returns of fish from all fed and unfed treatments are lower than those of control fish reared at the hatchery. Returns appear to be inversely related to rearing density. Even though adult returns are lower than those of traditional hatchery methods, a cost-benefit analysis, as return data become more complete, may prove these methods to be an economical means of expanding current hatchery production, particularly if "thinning" releases were used.
NASA Astrophysics Data System (ADS)
Formetta, Giuseppe; Bell, Victoria; Stewart, Elizabeth
2018-02-01
Regional flood frequency analysis is one of the most commonly applied methods for estimating extreme flood events at ungauged sites or locations with short measurement records. It is based on: (i) the definition of a homogeneous group (pooling-group) of catchments, and on (ii) the use of the pooling-group data to estimate flood quantiles. Although many methods to define a pooling-group (pooling schemes, PS) are based on catchment physiographic similarity measures, in the last decade methods based on flood seasonality similarity have been contemplated. In this paper, two seasonality-based PS are proposed and tested both in terms of the homogeneity of the pooling-groups they generate and in terms of the accuracy in estimating extreme flood events. The method has been applied in 420 catchments in Great Britain (considered as both gauged and ungauged) and compared against the current Flood Estimation Handbook (FEH) PS. Results for gauged sites show that, compared to the current PS, the seasonality-based PS performs better both in terms of homogeneity of the pooling-group and in terms of the accuracy of flood quantile estimates. For ungauged locations, a national-scale hydrological model has been used for the first time to quantify flood seasonality. Results show that in 75% of the tested locations the seasonality-based PS provides an improvement in the accuracy of the flood quantile estimates. The remaining 25% were located in highly urbanized, groundwater-dependent catchments. The promising results support the aspiration that large-scale hydrological models complement traditional methods for estimating design floods.
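The flood-seasonality similarity that underpins the proposed pooling schemes can be sketched with circular statistics: each gauge's annual-maximum flood dates are placed on a circle, and the circular mean direction and concentration summarize its seasonality. The metric below is illustrative, not the paper's exact definition:

```python
import math

# Sketch of a flood-seasonality summary; the sample flood dates are invented.

def flood_seasonality(days_of_year, year_length=365.25):
    """Circular mean flood date and concentration (1 = all floods on one day)."""
    angles = [2 * math.pi * d / year_length for d in days_of_year]
    x = sum(math.cos(a) for a in angles) / len(angles)
    y = sum(math.sin(a) for a in angles) / len(angles)
    mean_day = (math.atan2(y, x) % (2 * math.pi)) * year_length / (2 * math.pi)
    concentration = math.hypot(x, y)
    return mean_day, concentration

# A winter-flood catchment: annual maxima clustered around the turn of the year
mean_day, r = flood_seasonality([360, 5, 15, 350, 10])
```

Distances between such (mean day, concentration) summaries, computed from observed records at gauged sites or from national-scale model output at ungauged ones, can then define the seasonality-based pooling groups.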
Williams, Cory A.; Richards, Rodney J.; Collins, Kent L.
2015-01-01
The U.S. Bureau of Reclamation (USBR) and local stakeholder groups are evaluating reservoir-management strategies within Paonia Reservoir. This small reservoir fills to capacity each spring and requires approximately half of the snowmelt-runoff volume from its sediment-laden source waters, Muddy Creek. The U.S. Geological Survey is currently conducting high-resolution (15-minute data-recording interval) sediment monitoring to characterize incoming and outgoing sediment flux during reservoir operations at two sites on Muddy Creek. The high-resolution monitoring is being used to establish current rates of reservoir sedimentation, support USBR sediment transport and storage models, and assess the viability of water-storage recovery in Paonia Reservoir. These sites are equipped with in situ, single-frequency, side-looking acoustic Doppler current meters in conjunction with turbidity sensors to monitor sediment flux. This project serves as a demonstration of the capability of using surrogate techniques to predict suspended-sediment concentrations in small streams (less than 20 meters in width and 2 meters in depth). These two sites provide the ability to report near real-time suspended-sediment concentrations through the U.S. Geological Survey National Water Information System (NWIS) web interface and National Real-Time Water Quality websites (NRTWQ) to aid in reservoir operations and assessments.
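The surrogate approach described above typically works by fitting a site-specific rating between sensor turbidity and laboratory suspended-sediment concentration (SSC), then applying it to the continuous 15-minute record. A minimal sketch with invented calibration pairs (not USGS data):

```python
import math

# Hypothetical turbidity-SSC rating: least-squares fit of
# log10(SSC) = a + b * log10(turbidity) on invented paired samples.

pairs = [(10, 18), (30, 55), (60, 115), (120, 240), (250, 500)]  # (FNU, mg/L)

xs = [math.log10(t) for t, _ in pairs]
ys = [math.log10(s) for _, s in pairs]
n = len(pairs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

def ssc_from_turbidity(turbidity_fnu):
    """Estimated suspended-sediment concentration (mg/L) from turbidity."""
    return 10 ** (a + b * math.log10(turbidity_fnu))
```

In practice the published ratings also carry prediction intervals and are re-verified against periodic physical samples.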
Temporal variation of velocity and turbulence characteristics at a tidal energy site
NASA Astrophysics Data System (ADS)
Gunawan, B.; Neary, V. S.; Colby, J.
2013-12-01
This study examines the temporal variability, frequency, direction and magnitude of the mean current, turbulence, hydrodynamic force and tidal power availability at a proposed tidal energy site in a tidal channel located in the East River, NY, USA. The channel has a width of 190 m, a mean water level of 9.8 m and a mean tidal range of 1.3 m. A two-month velocity measurement was conducted at the design hub-height of a tidal turbine using an acoustic Doppler velocimeter (ADV). The site has semi-diurnal tidal characteristics, with a tidal current pattern resembling a sinusoidal function. The five-minute mean currents at the site varied between 0 and 2.4 m s-1. Flood current magnitudes were typically higher than the ebb current magnitudes, which skewed the tidal energy production towards the flood period. The effect of small-scale turbulence on the computed velocity, hydrodynamic load and power density time series was investigated. Excluding the small-scale turbulence may lead to a significant underestimation of the mean and maximum values of the analyzed variables. Comparison of hydrodynamic conditions with other tidal energy sites indicates that the key parameters for tidal energy site development are likely to be site-specific, which highlights the need to develop a classification system for tidal energy sites. Such a classification system would enable a direct comparison of key parameters between potential project locations and ultimately help investors in the decision-making process.
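Three of the quantities discussed above can be sketched directly from a hub-height velocity series: the mean current, the turbulence intensity TI = sigma_u / U, and the kinetic power density 0.5 * rho * U^3 that tidal energy availability scales with. The sample values are invented; an actual ADV record supplies thousands of points per averaging window:

```python
import math

# Minimal sketch with invented velocities (m/s); rho is a typical
# seawater density, not a measured value from the site.

def mean_ti_power(velocities, rho=1025.0):
    """Mean current, turbulence intensity, and kinetic power density (W/m^2)."""
    n = len(velocities)
    u_mean = sum(velocities) / n
    sigma = math.sqrt(sum((u - u_mean) ** 2 for u in velocities) / n)
    power_density = 0.5 * rho * u_mean ** 3
    return u_mean, sigma / u_mean, power_density

u_mean, ti, p = mean_ti_power([1.9, 2.1, 2.0, 2.2, 1.8, 2.0])
```

The cubic dependence of power density on velocity is why excluding turbulent fluctuations from the mean can noticeably bias the estimated energy yield.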
Rapid Estimation of TPH Reduction in Oil-Contaminated Soils Using the MED Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edenborn, H.M.; Zenone, V.A.
2007-09-01
Oil-contaminated soil and sludge generated during federal well plugging activities in northwestern Pennsylvania are currently remediated on small landfarm sites in lieu of more expensive landfill disposal. Bioremediation success at these sites has previously been gauged by the decrease of total petroleum hydrocarbon (TPH) concentrations to less than 10,000 mg/kg, measured using EPA Method 418.1. We tested the “molarity of ethanol droplet” (MED) water repellency test as a rapid indicator of TPH concentration in soil at one landfarm near Bradford, PA. MED was estimated by determining the minimum ethanol concentration (0–6 M) required to penetrate air-dried and sieved soil samples within 10 sec. TPH in soil was analyzed by rapid fluorometric analysis of methanol soil extracts, which correlated well with EPA Method 1664. Uncontaminated landfarm site soil amended with increasing concentrations of waste oil sludge showed a high correlation between MED and TPH. MED values exceeded the method's upper limit of 6 M as TPH estimates exceeded ca. 25,000 mg/kg. MED and TPH at the landfarm were sampled monthly during the summer months over two years in a grid pattern that allowed spatial comparisons of site remediation effectiveness. MED and TPH decreased at a constant rate over time and remained highly correlated. Inexpensive alternatives to reagent-grade ethanol gave comparable results. The simple MED approach served as an inexpensive alternative to routine laboratory analysis of TPH during the monitoring of oily waste bioremediation at this landfarm site.
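The MED scoring step itself is simple to express: droplets of increasing ethanol molarity are applied until one penetrates the air-dried soil within 10 s, and the lowest penetrating molarity is the MED value. A sketch with an invented penetration response (the molarity steps are assumed, not the study's exact series):

```python
# Hypothetical MED scoring; `penetrated` stands in for the field observation
# of whether a droplet of a given molarity soaks in within 10 seconds.

def med_value(penetrated,
              molarities=(0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5, 5.5, 6)):
    """Lowest tested molarity whose droplet penetrated, or None if even 6 M
    failed (i.e. TPH beyond the method's upper limit, ca. 25,000 mg/kg here)."""
    for m in molarities:
        if penetrated(m):
            return m
    return None

# A soil where droplets penetrate only at >= 2.5 M ethanol:
med = med_value(lambda m: m >= 2.5)
```

A site-specific regression of MED against laboratory TPH then turns these field scores into concentration estimates.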
NASA Astrophysics Data System (ADS)
Violette, Aurélie; Heinesch, Bernard; Erpicum, Michel; Carnol, Monique; Aubinet, Marc; François, Louis
2013-04-01
For 15 years, networks of flux towers have been developed to determine accurate carbon balances with the eddy-covariance method and to determine whether forests are sinks or sources of carbon. However, for prediction of the evolution of the carbon cycle and climate, major uncertainties remain in the ecosystem respiration (Reco, which includes the respiration of the above-ground parts of trees, root respiration and mineralization of soil organic matter), the gross primary productivity (GPP) and their difference, the net ecosystem exchange (NEE) of forests. These uncertainties are consequences of spatial and inter-annual variability, driven by previous and current climatic conditions, as well as by the particular history of the site (management, diseases, etc.). In this study we focus on the carbon cycle in two mixed forests in the Belgian Ardennes. The first site, Vielsalm, is a mature stand mostly composed of beeches (Fagus sylvatica) and Douglas fir (Pseudotsuga menziesii) from 80 to 100 years old. The second site, La Robinette, was covered before 1995 with spruces. After an important windfall and a clear cutting, the site was replanted, between 1995 and 2000, with spruces (Picea abies) and deciduous species (mostly Betula pendula, Alnus glutinosa and Salix aurita). The challenge here is to highlight how initial conditions can influence the current behavior of the carbon cycle in a growing stand compared to a mature one, where initial conditions are supposed to be forgotten. A modeling approach is particularly well suited to sensitivity tests and to estimating the temporal lag between an event and the ecosystem response. We use the forest ecosystem model ASPECTS (Rasse et al., Ecological Modelling 141, 35-52, 2001). This model predicts long-term forest growth by calculating, over time, hourly NEE. It was developed and already validated on the Vielsalm forest. Modelling results are compared with eddy-covariance data from both sites from 2006 to 2011.
The main difference between the two sites appears to lie in soil respiration, which is probably in part a legacy of the previous ecosystem at the young forest site.
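The three flux quantities named above are linked by a simple budget, sketched here using the micrometeorological sign convention in which positive NEE is a release of carbon to the atmosphere (sign conventions differ between communities, and the annual sums below are invented, not site data):

```python
# NEE = Reco - GPP under the atmospheric sign convention;
# values are illustrative annual sums in gC m^-2 yr^-1.

def net_ecosystem_exchange(reco, gpp):
    """Net ecosystem exchange: respiration minus gross primary productivity."""
    return reco - gpp

nee = net_ecosystem_exchange(reco=1100.0, gpp=1450.0)  # negative: a carbon sink
```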
Deng, Zhongyuan; Zhang, Shen; Gu, Shaohua; Ni, Xinzhi; Zeng, Wenxian; Li, Xianchun
2018-01-17
The link between polyadenylation (pA) and various biological, behavioral, and pathological events in eukaryotes underlines the need to develop in vivo polyadenylation assay methods for characterization of the cis-acting elements, trans-acting factors and environmental stimuli that affect polyadenylation efficiency and/or the relative usage of two alternative polyadenylation (APA) sites. The current protein-based CAT or luciferase reporter systems can measure the polyadenylation efficiency of a single pA site or candidate cis element but not the choice between two APA sites. To address this issue, we developed a set of four new bicistronic reporter vectors that harbor either two luciferase or two fluorescent protein open reading frames connected by one Internal Ribosome Entry Site (IRES). Transfection of single- or dual-insertion constructs of these vectors into mammalian cells demonstrated that they can be utilized not only to quantify the strength of a single candidate pA site or cis element, but also to accurately measure the relative usage of two APA sites at both the mRNA (qRT-PCR) and protein levels. This represents the first reporter system that can study the polyadenylation efficiency of a single pA site or element and the regulation of two APA sites at both the mRNA and protein levels.
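One way relative usage might be derived from such a bicistronic reporter: termination at the proximal pA site prevents expression of the second (IRES-driven) cistron, so the downstream/upstream signal ratio, normalized to a control construct lacking the proximal site, estimates read-through. This normalization scheme is an assumption for illustration, not the authors' published analysis, and the signal values are invented:

```python
# Hypothetical dual-reporter analysis; all signal values are invented.

def proximal_site_usage(downstream_signal, upstream_signal,
                        control_downstream, control_upstream):
    """Estimated fraction of transcripts terminated at the proximal pA site."""
    read_through = (downstream_signal / upstream_signal) / \
                   (control_downstream / control_upstream)
    return 1.0 - read_through

usage = proximal_site_usage(downstream_signal=20.0, upstream_signal=100.0,
                            control_downstream=80.0, control_upstream=100.0)
```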
Airborne and Ground-Based Optical Characterization of Legacy Underground Nuclear Test Sites
NASA Astrophysics Data System (ADS)
Vigil, S.; Craven, J.; Anderson, D.; Dzur, R.; Schultz-Fellenz, E. S.; Sussman, A. J.
2015-12-01
Detecting, locating, and characterizing suspected underground nuclear test sites is a U.S. security priority. Currently, global underground nuclear explosion monitoring relies on seismic and infrasound sensor networks to provide rapid initial detection of potential underground nuclear tests. While seismic and infrasound might be able to generally locate potential underground nuclear tests, additional sensing methods might be required to further pinpoint test site locations. Optical remote sensing is a robust approach for site location and characterization due to the ability it provides to search large areas relatively quickly, resolve surface features in fine detail, and perform these tasks non-intrusively. Optical remote sensing provides both cultural and surface geological information about a site, for example, operational infrastructure, surface fractures. Surface geological information, when combined with known or estimated subsurface geologic information, could provide clues concerning test parameters. We have characterized two legacy nuclear test sites on the Nevada National Security Site (NNSS), U20ak and U20az using helicopter-, ground- and unmanned aerial system-based RGB imagery and light detection and ranging (lidar) systems. The multi-faceted information garnered from these different sensing modalities has allowed us to build a knowledge base of how a nuclear test site might look when sensed remotely, and the standoff distances required to resolve important site characteristics.
History and progress of the North American Soil Geochemical Landscapes Project, 2001-2010
Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Rivera, Francisco Moreira; Rencz, Andrew N.; Garrett, Robert G.
2012-01-01
In 2007, the U.S. Geological Survey, the Geological Survey of Canada, and the Mexican Geological Survey initiated a low-density (1 site per 1,600 km2, 13,323 sites) geochemical and mineralogical survey of North American soils (North American Soil Geochemical Landscapes Project). Sampling and analytical protocols were developed at a series of workshops in 2003-2004 and pilot studies were conducted from 2004-2007. The ideal sampling protocol at each site includes a sample from 0-5 cm depth, a composite of the soil A horizon, and a sample from the soil C horizon. The <2-mm fraction of each sample is analyzed for major and trace elements following a near-total digestion with HNO3, HClO4, and HF. Separate methods are used for As, Hg, Se, and total C on this same size fraction. The major mineralogical components are determined by a quantitative X-ray diffraction method. Sampling in the conterminous U.S. was completed in 2010 (c. 4800 sites) with chemical and mineralogical analysis currently underway. In Mexico, approximately 66% of the sampling (871 sites) had been done by the end of 2010 with completion expected in 2012. After completing sampling in the Maritime provinces and portions of other provinces (472 sites, 7.6% of the total), Canada withdrew from the project in 2010. Preliminary results for a swath from the central U.S. to Florida clearly show the effects of soil parent material and climate on the chemical and mineralogical composition of soils. A sample archive will be established and made available for future investigations.
Senning, Eric N.; Aman, Teresa K.
2016-01-01
Biological membranes are complex assemblies of lipids and proteins that serve as platforms for cell signaling. We have developed a novel method for measuring the structure and dynamics of the membrane based on fluorescence resonance energy transfer (FRET). The method marries four technologies: (1) unroofing cells to isolate and access the cytoplasmic leaflet of the plasma membrane; (2) patch-clamp fluorometry (PCF) to measure currents and fluorescence simultaneously from a membrane patch; (3) a synthetic lipid with a metal-chelating head group to decorate the membrane with metal-binding sites; and (4) transition metal ion FRET (tmFRET) to measure short distances between a fluorescent probe and a transition metal ion on the membrane. We applied this method to measure the density and affinity of native and introduced metal-binding sites in the membrane. These experiments pave the way for measuring structural rearrangements of membrane proteins relative to the membrane. PMID:26755772
Detecting Past Positive Selection through Ongoing Negative Selection
Bazykin, Georgii A.; Kondrashov, Alexey S.
2011-01-01
Detecting positive selection is a challenging task. We propose a method for detecting past positive selection through ongoing negative selection, based on comparison of the parameters of intraspecies polymorphism at functionally important and selectively neutral sites where a nucleotide substitution of the same kind occurred recently. Reduced occurrence of recently replaced ancestral alleles at functionally important sites indicates that negative selection currently acts against these alleles and, therefore, that their replacements were driven by positive selection. Application of this method to the Drosophila melanogaster lineage shows that the fraction of adaptive amino acid replacements remained approximately 0.5 for a long time. In the Homo sapiens lineage, however, this fraction drops from approximately 0.5 before the Ponginae–Homininae divergence to approximately 0 after it. The proposed method is based on essentially the same data as the McDonald–Kreitman test but is free from some of its limitations, which may open new opportunities, especially when many genotypes within a species are known. PMID:21859804
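The abstract above builds on the McDonald–Kreitman framework, in which the fraction of adaptive substitutions is estimated from divergence (D) and polymorphism (P) counts at nonsynonymous (n) and synonymous (s) sites. A minimal sketch of the standard estimator, with invented counts:

```python
def mk_alpha(dn, ds, pn, ps):
    """Fraction of adaptive amino acid substitutions (alpha) under the
    standard McDonald-Kreitman estimator: alpha = 1 - (ds * pn) / (dn * ps).
    dn, ds: nonsynonymous/synonymous divergence counts;
    pn, ps: nonsynonymous/synonymous polymorphism counts."""
    return 1.0 - (ds * pn) / (dn * ps)

# Illustrative (made-up) counts, chosen to give alpha near the ~0.5
# reported for the Drosophila lineage.
alpha = mk_alpha(dn=80, ds=100, pn=20, ps=50)
print(round(alpha, 2))  # 0.5
```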
Methods to Reduce Forest Residue Volume after Timber Harvesting and Produce Black Carbon
Busse, Matt D.; Archuleta, James G.; McAvoy, Darren; Roussel, Eric
2017-01-01
Forest restoration often includes thinning to reduce tree density and improve ecosystem processes and function while also reducing the risk of wildfire or insect and disease outbreaks. However, one drawback of these restoration treatments is that slash is often burned in piles that may damage the soil and require further restoration activities. Pile burning is currently used on many forest sites as the preferred method for residue disposal because piles can be burned at various times of the year and are usually more controlled than broadcast burns. In many cases, fire can be beneficial to site conditions and soil properties, but slash piles, with a large concentration of wood, needles, forest floor, and sometimes mineral soil, can cause long-term damage. We describe several alternative methods for reducing nonmerchantable forest residues that will help remove excess woody biomass, minimize detrimental soil impacts, and create charcoal for improving soil organic matter and carbon sequestration. PMID:28377830
Zhu, Yinzhou; Pirnie, Stephan P; Carmichael, Gordon G
2017-08-01
Ribose methylation (2'-O-methylation, 2'-OMe) occurs at high frequencies in rRNAs and other small RNAs and is carried out using a shared mechanism across eukaryotes and archaea. As RNA modifications are important for ribosome maturation, and alterations in these modifications are associated with cellular defects and diseases, it is important to characterize the landscape of 2'-O-methylation. Here we report the development of a highly sensitive and accurate method for ribose methylation detection using next-generation sequencing. A key feature of this method is the generation of RNA fragments with random 3'-ends, followed by periodate oxidation of all molecules terminating in 2',3'-OH groups. This allows only RNAs harboring 2'-OMe groups at their 3'-ends to be sequenced. Although currently requiring microgram amounts of starting material, this method is robust for the analysis of rRNAs even at low sequencing depth. © 2017 Zhu et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
Sequence-Based Prediction of RNA-Binding Residues in Proteins
Walia, Rasna R.; EL-Manzalawy, Yasser; Honavar, Vasant G.; Dobbs, Drena
2017-01-01
Identifying individual residues in the interfaces of protein–RNA complexes is important for understanding the molecular determinants of protein–RNA recognition and has many potential applications. Recent technical advances have led to several high-throughput experimental methods for identifying partners in protein–RNA complexes, but determining RNA-binding residues in proteins is still expensive and time-consuming. This chapter focuses on available computational methods for identifying which amino acids in an RNA-binding protein participate directly in contacting RNA. Step-by-step protocols for using three different web-based servers to predict RNA-binding residues are described. In addition, currently available web servers and software tools for predicting RNA-binding sites, as well as databases that contain valuable information about known protein–RNA complexes, RNA-binding motifs in proteins, and protein-binding recognition sites in RNA are provided. We emphasize sequence-based methods that can reliably identify interfacial residues without the requirement for structural information regarding either the RNA-binding protein or its RNA partner. PMID:27787829
2017-01-01
Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites, owing to the difficulty of detecting asbestos at low concentrations and of extrapolating soil concentrations to air concentrations. The Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or a combination of methods, would yield more reliable soil asbestos data than the others. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM), or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations where other analytical methods did not; however, this method exhibited high relative standard deviations, indicating its results may be more variable than those of other soil asbestos methods. The comparison of ISM versus discrete sampling for asbestos yielded no clear conclusion regarding a preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than in discrete samples. PMID:28759607
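The variability comparison above rests on the relative standard deviation (RSD) of replicate results. A minimal sketch, with hypothetical replicate concentrations (not the study's data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%): sample std dev / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical replicate results (invented for illustration): metal
# concentrations (mg/kg) from discrete grabs vs. incremental (ISM) samples.
discrete = [120, 45, 300, 80, 210]
ism = [150, 140, 160, 145, 155]

print(f"discrete RSD: {rsd_percent(discrete):.0f}%")
print(f"ISM RSD:      {rsd_percent(ism):.0f}%")
# ISM compositing averages out small-scale heterogeneity, so its RSD is
# far lower here, mirroring the pattern the study reports for metals.
```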
Analysis and elimination method of the effects of cables on LVRT testing for offshore wind turbines
NASA Astrophysics Data System (ADS)
Jiang, Zimin; Liu, Xiaohao; Li, Changgang; Liu, Yutian
2018-02-01
The current state, characteristics, and necessity of low voltage ride through (LVRT) on-site testing for grid-connected offshore wind turbines are introduced first. The effects of submarine cables on LVRT testing are then analysed based on the equivalent circuit of the testing system, and a scheme for eliminating these effects in the proposed LVRT testing method is presented. The specified voltage dips are kept in compliance with the testing standards by adjusting the ratio between the current-limiting impedance and the short-circuit impedance, according to the steady-state voltage relationship derived from the equivalent circuit. Finally, simulation results demonstrate that the voltage dips at the high-voltage side of the wind turbine transformer satisfy the requirements of the testing standards.
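The impedance-ratio adjustment described above follows from the voltage-divider form of an impedance-based LVRT test circuit. The sketch below is an idealized, magnitudes-only version (cable effects ignored) — an assumption of ours, not the paper's full equivalent circuit:

```python
# Voltage-divider model of an impedance-based LVRT test rig: a series
# current-limiting impedance Z_lim feeds a shunt short-circuit impedance
# Z_sc. During the dip, V_dip = V * Z_sc / (Z_sc + Z_lim), so the dip
# depth is set entirely by the ratio Z_lim / Z_sc.

def dip_pu(z_sc, z_lim, v_pre=1.0):
    """Per-unit residual voltage at the test point during the dip."""
    return v_pre * z_sc / (z_sc + z_lim)

def z_lim_for_dip(z_sc, target_pu, v_pre=1.0):
    """Current-limiting impedance needed for a specified residual voltage."""
    return z_sc * (v_pre - target_pu) / target_pu

z_sc = 2.0                                   # ohms, illustrative value
z_lim = z_lim_for_dip(z_sc, target_pu=0.2)   # aim for a 20% residual voltage
print(round(dip_pu(z_sc, z_lim), 3))  # 0.2
```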
NASA Technical Reports Server (NTRS)
James, W. P.
1971-01-01
A simplified procedure is presented for determining water current velocities and diffusion coefficients. Dye drops which form dye patches in the receiving water are made from an aircraft. The changes in position and size of the patches are recorded from two flights over the area. The simplified data processing procedure requires only that the ground coordinates about the dye patches be determined at the time of each flight. With an automatic recording coordinatograph for measuring coordinates and a computer for processing the data, this technique provides a practical method of determining circulation patterns and mixing characteristics of large aquatic systems. This information is useful in assessing the environmental impact of waste water discharges and for industrial plant siting.
A stochastic context free grammar based framework for analysis of protein sequences
Dyrka, Witold; Nebel, Jean-Christophe
2009-01-01
Background In the last decade, there have been many applications of formal language theory in bioinformatics, such as RNA structure prediction and detection of patterns in DNA. In the field of proteomics, however, the size of the protein alphabet and the complexity of the relationships between amino acids have mainly limited the application of formal language theory to grammars whose expressive power is no higher than stochastic regular grammars. These grammars, like other state-of-the-art methods, cannot capture higher-order dependencies such as the nested and crossing relationships that are common in proteins. In order to overcome some of these limitations, we propose a Stochastic Context Free Grammar based framework for the analysis of protein sequences in which grammars are induced using a genetic algorithm. Results This framework was implemented in a system aiming at the production of binding site descriptors. These descriptors not only allow detection of protein regions that are involved in these sites, but also provide insight into their structure. Grammars were induced using quantitative properties of amino acids to deal with the size of the protein alphabet. Moreover, we imposed some structural constraints on grammars to reduce the extent of the rule search space. Finally, grammars based on different properties were combined to convey as much information as possible. Evaluation was performed on sites of various sizes and complexity, described either by PROSITE patterns, domain profiles, or a set of patterns. Results show the produced binding site descriptors are human-readable and, hence, highlight biologically meaningful features. Moreover, they achieve good accuracy in both annotation and detection. In addition, findings suggest that, unlike current state-of-the-art methods, our system may be particularly suited to deal with patterns shared by non-homologous proteins.
Conclusion A new Stochastic Context Free Grammar based framework has been introduced allowing the production of binding site descriptors for analysis of protein sequences. Experiments have shown that not only is this new approach valid, but produces human-readable descriptors for binding sites which have been beyond the capability of current machine learning techniques. PMID:19814800
Mapping AmeriFlux footprints: Towards knowing the flux source area across a network of towers
NASA Astrophysics Data System (ADS)
Menzer, O.; Pastorello, G.; Metzger, S.; Poindexter, C.; Agarwal, D.; Papale, D.
2014-12-01
The AmeriFlux network collects long-term carbon, water and energy flux measurements obtained with the eddy covariance method. In order to attribute fluxes to specific areas of the land surface, flux source calculations are essential. Consequently, footprint models can support flux up-scaling exercises to larger regions, often based on remote sensing data. However, flux footprints are not currently being routinely calculated; different approaches exist but have not been standardized. In part, this is due to varying instrumentation and data processing methods at the site level. The goal of this work is to map tower footprints for a future standardized AmeriFlux product to be generated at the network level. These footprints can be estimated by analytical models, Lagrangian simulations, and large-eddy simulations. However, for many sites, the datasets currently submitted to central databases generally do not include all variables required. The AmeriFlux network is moving to collection of raw data and expansion of the variables requested from sites, giving the possibility to calculate all parameters and variables needed to run most of the available footprint models. In this pilot study, we are applying state of the art footprint models across a subset of AmeriFlux sites, to evaluate the feasibility and merit of developing standardized footprint results. In addition to comparing outcomes from several footprint models, we will attempt to verify and validate the results in two ways: (i) Verification of our footprint calculations at sites where footprints have been experimentally estimated. (ii) Validation at towers situated in heterogeneous landscapes: here, variations in the observed fluxes are expected to correlate with spatiotemporal variations of the source area composition. Once implemented, the footprint results can be used as additional information within the AmeriFlux database that can support data interpretation and data assimilation. 
Lastly, we will explore the expandability of this approach to other flux networks by collaborating with and including sites from the ICOS and NEON networks in our analyses. This can enable utilizing the footprint model output to improve network interoperability, thus further promoting synthesis analyses and understanding of system-level questions in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarvis, T.T.; Andrews, W.B.; Buck, J.W.
1998-03-01
Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research, and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that, aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies and remedies will last a shorter period of time than the waste itself, and the resulting contamination will remain hazardous.
Protein C-Terminal Labeling and Biotinylation Using Synthetic Peptide and Split-Intein
Volkmann, Gerrit; Liu, Xiang-Qin
2009-01-01
Background Site-specific protein labeling or modification can facilitate the characterization of proteins with respect to their structure, folding, and interaction with other proteins. However, current methods of site-specific protein labeling are few and have limitations; new methods are therefore needed to meet the growing need for, and sophistication of, protein labeling. Methodology A method of protein C-terminal labeling was developed using a non-canonical split-intein, through an intein-catalyzed trans-splicing reaction between a protein and a small synthetic peptide carrying the desired labeling groups. As demonstrations of this method, three different proteins were efficiently labeled at their C-termini with two different labels (fluorescein and biotin), either in solution or on a solid surface, and a transferrin receptor protein was labeled on the membrane surface of live mammalian cells. Protein biotinylation and immobilization on a streptavidin-coated surface were also achieved in a cell lysate without prior purification of the target protein. Conclusions We have developed a method of site-specific labeling or modification at the C-termini of recombinant proteins. This method compares favorably with previous protein labeling methods and has several unique advantages. It is expected to have many potential applications in protein engineering and research, including fluorescent labeling for monitoring protein folding, location, and trafficking in cells, and biotinylation for protein immobilization on streptavidin-coated surfaces such as protein microchips. The types of chemical labeling may be limited only by the ability of chemical synthesis to produce the small C-intein peptide containing the desired chemical groups. PMID:20027230
von Kalckreuth, Vera; Konings, Frank; Aaby, Peter; Adu-Sarkodie, Yaw; Ali, Mohammad; Aseffa, Abraham; Baker, Stephen; Breiman, Robert F.; Bjerregaard-Andersen, Morten; Clemens, John D.; Crump, John A.; Cruz Espinoza, Ligia Maria; Deerin, Jessica Fung; Gasmelseed, Nagla; Sow, Amy Gassama; Im, Justin; Keddy, Karen H.; Cosmas, Leonard; May, Jürgen; Meyer, Christian G.; Mintz, Eric D.; Montgomery, Joel M.; Olack, Beatrice; Pak, Gi Deok; Panzner, Ursula; Park, Se Eun; Rakotozandrindrainy, Raphaël; Schütt-Gerowitt, Heidi; Soura, Abdramane Bassiahi; Warren, Michelle R.; Wierzba, Thomas F.; Marks, Florian
2016-01-01
Background. New immunization programs are dependent on data from surveillance networks and disease burden estimates to prioritize target areas and risk groups. Data regarding invasive Salmonella disease in sub-Saharan Africa are currently limited, thus hindering the implementation of preventive measures. The Typhoid Fever Surveillance in Africa Program (TSAP) was established by the International Vaccine Institute to obtain comparable incidence data on typhoid fever and invasive nontyphoidal Salmonella (iNTS) disease in sub-Saharan Africa through standardized surveillance in multiple countries. Methods. Standardized procedures were developed and deployed across sites for study site selection, patient enrolment, laboratory procedures, quality control and quality assurance, assessment of healthcare utilization and incidence calculations. Results. Passive surveillance for bloodstream infections among febrile patients was initiated at thirteen sentinel sites in ten countries (Burkina Faso, Ethiopia, Ghana, Guinea-Bissau, Kenya, Madagascar, Senegal, South Africa, Sudan, and Tanzania). Each TSAP site conducted case detection using these standardized methods to isolate and identify aerobic bacteria from the bloodstream of febrile patients. Healthcare utilization surveys were conducted to adjust population denominators in incidence calculations for differing healthcare utilization patterns and improve comparability of incidence rates across sites. Conclusions. By providing standardized data on the incidence of typhoid fever and iNTS disease in sub-Saharan Africa, TSAP will provide vital input for targeted typhoid fever prevention programs. PMID:26933028
A Review on Microdialysis Calibration Methods: the Theory and Current Related Efforts.
Kho, Chun Min; Enche Ab Rahim, Siti Kartini; Ahmad, Zainal Arifin; Abdullah, Norazharuddin Shah
2017-07-01
Microdialysis is a sampling technique first introduced in the late 1950s. Although originally designed to study endogenous compounds in the animal brain, it was later adapted for use in other organs. Microdialysis can not only collect the unbound concentrations of compounds from tissue sites; it can also be used to deliver exogenous compounds to a designated area. Owing to its versatility, the microdialysis technique is widely employed in a number of areas, including biomedical research. However, for most in vivo studies, the concentration of a substance obtained directly from microdialysis does not accurately describe its concentration on-site. In order to relate the results collected from microdialysis to the actual in vivo condition, a calibration method is required. To date, various microdialysis calibration methods have been reported, each capable of providing valuable insight into the technique and its applications. This paper provides a critical review of the various calibration methods used in microdialysis applications, beginning with a detailed description of the microdialysis technique itself. For each calibration method, examples of related work, including clinical efforts, are presented, along with the method's advantages and disadvantages.
NASA Astrophysics Data System (ADS)
Bakhoday-Paskyabi, Mostafa; Fer, Ilker; Reuder, Joachim
2018-01-01
We report concurrent measurements of ocean currents and turbulence at two sites in the North Sea, one upwind of the FINO1 platform and the other 200 m downwind of the Alpha Ventus wind farm. At each site, mean currents, Reynolds stresses, turbulence intensity, and production of turbulent kinetic energy are obtained from two bottom-mounted 5-beam Nortek Signature1000 high-frequency Doppler current profilers at a water depth of approximately 30 m. Measurements from the two sites are compared to statistically identify the effects of the wind farm and waves on ocean current variability and the turbulent structure of the water column. Profiles of Reynolds stresses are found to be sensitive to both environmental forcing and wind-farm wake-induced distortions in the boundary layers near the surface and the seabed. Production of turbulent kinetic energy and turbulence intensity exhibit broadly similar, but less pronounced, patterns in the presence of farm wake effects.
C.R. Lane; E. Hobden; L. Laurenson; V.C. Barton; K.J.D. Hughes; H. Swan; N. Boonham; A.J. Inman
2008-01-01
Plant health regulations to prevent the introduction and spread of Phytophthora ramorum and P. kernoviae require rapid, cost effective diagnostic methods for screening large numbers of plant samples at the time of inspection. Current on-site techniques require expensive equipment, considerable expertise and are not suited for plant...
Stream Discharge Measurements From Cableways
Nolan, K. Michael; Sultz, Lucky
2000-01-01
Cableways have been used for decades as a platform for making stream discharge measurements. Use of cableways eliminates the need to expose personnel to hazards associated with working from highway bridges. In addition, cableways allow sites to be selected that offer the best possible hydraulic characteristics for measuring stream discharge. This training presentation describes methods currently used by the U.S. Geological Survey to make stream discharge measurements from cableways.
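The discharge measurements referred to above are conventionally computed with the midsection method, in which discharge is the sum of velocity × depth × subsection width over the measured verticals. A minimal sketch with invented measurements:

```python
# Midsection method for stream discharge: Q = sum(v_i * d_i * w_i), where
# each vertical's subsection width w_i spans half the distance to each
# neighboring vertical (edge verticals reach only to their one midpoint).
# The measurements below are invented for illustration.

def midsection_discharge(stations, depths, velocities):
    """stations: distances (m) across the channel; depths (m) and mean
    velocities (m/s) at each vertical. Returns discharge in m^3/s."""
    q = 0.0
    n = len(stations)
    for i in range(n):
        left = stations[i] if i == 0 else (stations[i - 1] + stations[i]) / 2
        right = stations[i] if i == n - 1 else (stations[i] + stations[i + 1]) / 2
        q += velocities[i] * depths[i] * (right - left)
    return q

stations = [0.0, 2.0, 4.0, 6.0, 8.0]      # water's edge to water's edge
depths = [0.0, 0.8, 1.2, 0.9, 0.0]
velocities = [0.0, 0.4, 0.6, 0.5, 0.0]
print(f"Q = {midsection_discharge(stations, depths, velocities):.2f} m^3/s")
```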
Human Performance-Based Measurement System
1999-12-28
is primarily achieved by increasing signal-to-noise and image resolution through interpolation. One method for spatial resolution is the...potential at an electrode to a quantity that is proportional to the current that enters and exits the scalp at that site. Deblurring is another...direct digitization of EEG signals over analog recording are several, the most important of which is the avoidance of noise patterns that resemble
Prediction of TF target sites based on atomistic models of protein-DNA complexes
Angarica, Vladimir Espinosa; Pérez, Abel González; Vasconcelos, Ana T; Collado-Vides, Julio; Contreras-Moreira, Bruno
2008-01-01
Background The specific recognition of genomic cis-regulatory elements by transcription factors (TFs) plays an essential role in the regulation of coordinated gene expression. Studying the mechanisms determining binding specificity in protein-DNA interactions is thus an important goal. Most current approaches for modeling TF specific recognition rely on the knowledge of large sets of cognate target sites and consider only the information contained in their primary sequence. Results Here we describe a structure-based methodology for predicting sequence motifs starting from the coordinates of a TF-DNA complex. Our algorithm combines information regarding the direct and indirect readout of DNA into an atomistic statistical model, which is used to estimate the interaction potential. We first measure the ability of our method to correctly estimate the binding specificities of eight prokaryotic and eukaryotic TFs that belong to different structural superfamilies. Secondly, the method is applied to two homology models, finding that sampling of interface side-chain rotamers remarkably improves the results. Thirdly, the algorithm is compared with a reference structural method based on contact counts, obtaining comparable predictions for the experimental complexes and more accurate sequence motifs for the homology models. Conclusion Our results demonstrate that atomic-detail structural information can be feasibly used to predict TF binding sites. The computational method presented here is universal and might be applied to other systems involving protein-DNA recognition. PMID:18922190
Common mechanisms of inhibition for the Na+/glucose (hSGLT1) and Na+/Cl−/GABA (hGAT1) cotransporters
Hirayama, Bruce A; Díez-Sampedro, Ana; Wright, Ernest M
2001-01-01
Electrophysiological methods were used to investigate the interaction of inhibitors with the human Na+/glucose (hSGLT1) and Na+/Cl−/GABA (hGAT1) cotransporters. Inhibitor constants were estimated from both inhibition of substrate-dependent current and inhibitor-induced changes in cotransporter conformation. The competitive, non-transported inhibitors are substrate derivatives with inhibition constants from 200 nM (phlorizin) to 17 mM (esculin) for hSGLT1, and 300 nM (SKF89976A) to 10 mM (baclofen) for hGAT1. At least for hSGLT1, values determined using either method were proportional over 5-orders of magnitude. Correlation of inhibition to structure of the inhibitors resulted in a pharmacophore for glycoside binding to hSGLT1: the aglycone is coplanar with the pyranose ring, and binds to a hydrophobic/aromatic surface of at least 7×12Å. Important hydrogen bond interactions occur at five positions bordering this surface. In both hSGLT1 and hGAT1 the data suggests that there is a large, hydrophobic inhibitor binding site ∼8Å from the substrate binding site. This suggests an architectural similarity between hSGLT1 and hGAT1. There is also structural similarity between non-competitive and competitive inhibitors, e.g., phloretin is the aglycone of phlorizin (hSGLT1) and nortriptyline resembles SKF89976A without nipecotic acid (hGAT1). Our studies establish that measurement of the effect of inhibitors on presteady state currents is a valid non-radioactive method for the determination of inhibitor binding constants. Furthermore, analysis of the presteady state currents provide novel insights into partial reactions of the transport cycle and mode of action of the inhibitors. PMID:11588102
Mapping bathymetry in an active surf zone with the WorldView2 multispectral satellite
NASA Astrophysics Data System (ADS)
Trimble, S. M.; Houser, C.; Brander, R.; Chirico, P.
2015-12-01
Rip currents are strong, narrow seaward flows of water that originate in the surf zones of many beaches worldwide. They are implicated in hundreds of drownings internationally each year, but exact numbers are difficult to calculate due to logistical difficulties in obtaining accurate incident reports. Annual average rip current fatalities are estimated to be ~100, 53, and 21 in the United States (US), Costa Rica, and Australia, respectively. Current warning systems (e.g., the National Weather Service's) do not account for fine-resolution nearshore bathymetry because it is difficult to capture. The method shown here could provide frequent, high-resolution maps of nearshore bathymetry at the scale required for improved rip prediction and warning. This study demonstrates a method for mapping bathymetry in the surf zone (depths of 20 m and less), specifically within rip channels, because rips form at topographically low spots in the bathymetry as a result of feedback among waves, substrate, and antecedent bathymetry. The methods employ the DigitalGlobe WorldView-2 (WV2) multispectral satellite and field measurements of depth to generate maps of the changing bathymetry at two embayed, rip-prone beaches: Playa Cocles, Puerto Viejo de Talamanca, Costa Rica, and Bondi Beach, Sydney, Australia. WV2 has a 1.1-day pass-over rate and 1.84 m ground-pixel resolution across 8 bands, including 'yellow' (585-625 nm) and 'coastal blue' (400-450 nm). The data are used to classify bottom type and to map depth from the return in multiple bands. The methodology is tested at each site for algorithm consistency between dates, and again for applicability between sites.
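The abstract does not name the depth-retrieval algorithm; one common choice for multispectral bathymetry is the band log-ratio approach of Stumpf et al. (2003), sketched here with illustrative tuning constants that would in practice be fitted to field depth measurements:

```python
import math

# Band log-ratio depth retrieval (after Stumpf et al., 2003) -- one common
# way to map depth from the return in multiple bands. Shallower water
# attenuates the two bands differently, so the ratio of log-reflectances
# varies nearly linearly with depth. The tuning constants m0, m1 below are
# illustrative placeholders, not values from this study; they would be
# regressed against field depth soundings.

def logratio_depth(r_blue, r_green, m0=-58.0, m1=60.0, n=1000.0):
    """Estimated depth (m) from blue/green surface reflectances."""
    return m1 * (math.log(n * r_blue) / math.log(n * r_green)) + m0

# Example reflectances are invented:
depth_m = logratio_depth(0.12, 0.10)
print(f"estimated depth ~{depth_m:.1f} m")
```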
Site-specific range uncertainties caused by dose calculation algorithms for proton therapy
NASA Astrophysics Data System (ADS)
Schuemann, J.; Dowdell, S.; Grassberger, C.; Min, C. H.; Paganetti, H.
2014-08-01
The purpose of this study was to assess the possibility of introducing site-specific range margins to replace current generic margins in proton therapy. Further, the goal was to study the potential of reducing margins with current analytical dose calculations methods. For this purpose we investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict the range of proton fields. Dose distributions predicted by an analytical pencil-beam algorithm were compared with those obtained using Monte Carlo (MC) simulations (TOPAS). A total of 508 passively scattered treatment fields were analyzed for seven disease sites (liver, prostate, breast, medulloblastoma-spine, medulloblastoma-whole brain, lung and head and neck). Voxel-by-voxel comparisons were performed on two-dimensional distal dose surfaces calculated by pencil-beam and MC algorithms to obtain the average range differences and root mean square deviation for each field for the distal position of the 90% dose level (R90) and the 50% dose level (R50). The average dose degradation of the distal falloff region, defined as the distance between the distal position of the 80% and 20% dose levels (R80-R20), was also analyzed. All ranges were calculated in water-equivalent distances. Considering total range uncertainties and uncertainties from dose calculation alone, we were able to deduce site-specific estimations. For liver, prostate and whole brain fields our results demonstrate that a reduction of currently used uncertainty margins is feasible even without introducing MC dose calculations. We recommend range margins of 2.8% + 1.2 mm for liver and prostate treatments and 3.1% + 1.2 mm for whole brain treatments, respectively. On the other hand, current margins seem to be insufficient for some breast, lung and head and neck patients, at least if used generically. 
If no case-specific adjustments are applied, a generic margin of 6.3% + 1.2 mm would be needed for breast, lung and head and neck treatments. We conclude that the currently used generic range uncertainty margins in proton therapy should be redefined on a site-specific basis and that complex geometries may require field-specific adjustments. Routine verification of treatment plans using MC simulations is recommended for patients with heterogeneous geometries.
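The recommended margin recipes above each combine a relative term (a percentage of the beam range) with a fixed absolute term. As a quick numeric illustration (a minimal sketch; the function name and the 100 mm example beam range are chosen here for illustration, not taken from the study):

```python
def range_margin_mm(beam_range_mm, relative_pct, fixed_mm):
    """Distal range margin as a relative term plus a fixed term."""
    return beam_range_mm * relative_pct / 100.0 + fixed_mm

# Liver/prostate recipe (2.8% + 1.2 mm) applied to a 100 mm beam range:
liver_margin = range_margin_mm(100.0, 2.8, 1.2)
print(liver_margin)  # → 4.0 (mm)

# Generic recipe (6.3% + 1.2 mm) needed for breast/lung/head-and-neck fields:
generic_margin = range_margin_mm(100.0, 6.3, 1.2)
print(generic_margin)  # → 7.5 (mm)
```

The comparison makes the paper's point concrete: for the same beam range, the generic margin nearly doubles the distal margin relative to the site-specific liver/prostate recipe.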
MIT Participation in the Data Analysis of the XRS and XIS Instruments on the Astro-E2 Mission
NASA Technical Reports Server (NTRS)
Bautz, Mark
2005-01-01
Since the inception of this grant six weeks ago, we have completed the initial activation of the Suzaku X-ray Imaging Spectrometer (XIS) (on 13 August) and we have supported initial calibration observations. The instrument is performing very well in all respects. We have characterized the spectral resolution and effective area of each XIS sensor. We are especially excited about the scientific opportunities provided by the XIS's back-illuminated sensor, which exhibits spectral resolution in the sub-keV band unmatched by any X-ray CCD currently in orbit. As specified in our proposal, we have established a web site (http://space.mit.edu/XIS) on which we maintain an up-to-date summary of instrument performance characteristics. Gain, spectral resolution and system noise, as well as residual background rates, are currently available on this site. Although the particle background level is low compared with Chandra and XMM, we are currently evaluating methods to reduce it still further. Techniques under study include use of 5x5 mode information and alternative grade selection methods. Although the primary responsibility for development of instrument response functions rests with our Japanese colleagues, we are incorporating our latest measurements of spectral resolution into some temporary response functions which we hope to make available to the Suzaku General Observer Facility and the Science Working Group (SWG). We are also preparing proposals for use of SWG observing time.
Inoue, Motohiro; Katsumi, Yasukazu; Itoi, Megumi; Hojo, Tatsuya; Nakajima, Miwa; Ohashi, Suzuyo; Oi, Yuki; Kitakoji, Hiroshi
2011-06-01
To examine the therapeutic effect of a novel therapeutic method based on electroacupuncture with intermittent direct current (DCEA) and associated adverse events in patients with peripheral nerve damage and a poor clinical prognosis. In seven older patients with peripheral nerve damage (neurapraxia 2, axonotmesis 4, neurotmesis 1), an acupuncture needle connected to an anode electrode was inserted proximal to the site of the injury along the route of the nerve, while the cathode electrode was inserted into the innervated muscle, and DCEA was performed (100 Hz for 20 min, weekly). Muscular paralysis was evaluated weekly with manual muscle testing, the active range of motion of joints related to the muscular paralysis and, when necessary, needle electromyography. Adverse events were also recorded during the course of the treatment. Complete functional recovery was observed in the two cases with neurapraxia and two with axonotmesis, while one axonotmesis case achieved improvement and the other showed reinnervation potential without functional recovery. No improvement was observed in the neurotmesis case. Pigmentation of the skin where the anode needle was inserted occurred in three cases. Although there was no definite causal link, one case showed excessive formation and resorption of bone in the area close to the cathode needle site. Accelerated nerve regeneration caused by DCEA may contribute to recovery. The skin pigmentation and callus formation suggest that the shape of the anode electrode, current intensity and other factors should be examined to establish a safer treatment method.
The Decision to Access Patient Information from a Social Media Site: What Would You Do?
Jent, Jason F.; Eaton, Cyd K.; Merrick, Melissa T.; Englebert, Nicole E.; Dandes, Susan K.; Chapman, Ana V.; Hershorin, Eugene R.
2011-01-01
Purpose The current study examined the prevalence with which healthcare providers use a social media site account (e.g., Facebook), the extent to which they utilize social media sites in clinical practice, and their decision-making process after accessing patient information from a social media site. Methods Pediatric faculty and trainees from a medical school campus were provided a social media site history form and seven fictional social media site adolescent profile vignettes that depicted concerning information. Participants were instructed to rate their personal use and beliefs about social media sites and to report how they would respond if they obtained concerning information about an adolescent patient from their public social media site profile. Results Healthcare providers generally believed it not to be an invasion of privacy to conduct an Internet/social media site search of someone they know. A small percentage of trainees reported a personal history of conducting an Internet search (18%) or a social media site search (14%) for a patient. However, no faculty endorsed a history of conducting searches for patients. Faculty and trainees also differed in how they would respond to concerning social media site adolescent profile information. Conclusions The findings that trainees are conducting Internet/social media site searches of patients and that faculty and trainees differ in how they would respond to concerning profile information suggest the need for specific guidelines regarding the role of social media sites in clinical practice. Practice, policy, and training implications are discussed. PMID:21939873
NASA Technical Reports Server (NTRS)
Munday, J. C., Jr.; Gordon, H. H.; Welch, C. S.; Williams, G.
1976-01-01
Projects for sewage outfall siting for pollution control in the lower Chesapeake Bay wetlands are reported. A dye-buoy/photogrammetry and remote sensing technique was employed to gather circulation data used in outfall siting. This technique is greatly favored over alternate methods because it is inexpensive, produces results quickly, and reveals Lagrangian current paths which are preferred in making siting decisions. Wetlands data were obtained by interpretation of color and color infrared photographic imagery from several altitudes. Historical sequences of photographs are shown that were used to document wetlands changes. Sequential infrared photography of inlet basins was employed to determine tidal prisms, which were input to mathematical models to be used by state agencies in pollution control. A direct and crucial link between remote sensing and management decisions was demonstrated in the various projects.
Secure Genomic Computation through Site-Wise Encryption
Zhao, Yongan; Wang, XiaoFeng; Tang, Haixu
2015-01-01
Commercial clouds provide on-demand IT services for big-data analysis, which have become an attractive option for users who have no access to comparable infrastructure. However, utilizing these services for human genome analysis is highly risky, as human genomic data contains identifiable information of human individuals and their disease susceptibility. Therefore, currently, no computation on personal human genomic data is conducted on public clouds. To address this issue, here we present a site-wise encryption approach to encrypt whole human genome sequences, which can be subject to secure searching of genomic signatures on public clouds. We implemented this method within the Hadoop framework, and tested it on the case of searching disease markers retrieved from the ClinVar database against patients’ genomic sequences. The secure search runs only one order of magnitude slower than the simple search without encryption, indicating our method is ready to be used for secure genomic computation on public clouds. PMID:26306278
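The searchable encrypted-genome idea above can be caricatured with deterministic keyed hashing: if each genomic site (position plus allele) is replaced by a keyed token, equality search on the cloud remains possible without exposing plaintext alleles. This is only a toy sketch of the concept, not the paper's actual protocol; the key, the site encoding, and the function names are invented for illustration, and a deterministic scheme like this one leaks equality patterns that a production design must address:

```python
import hmac
import hashlib

SECRET_KEY = b"example-key"  # hypothetical key, held by the data owner

def encrypt_site(position, allele, key=SECRET_KEY):
    """Deterministic token for one genomic site (position + allele)."""
    msg = f"{position}:{allele}".encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

# The owner uploads tokenized sites; a querier tokenizes a disease marker
# (e.g., from ClinVar) the same way, and the cloud matches opaque tokens.
genome_tokens = {encrypt_site(p, a) for p, a in [(101, "A"), (202, "G"), (303, "T")]}
marker_token = encrypt_site(202, "G")
print(marker_token in genome_tokens)  # → True (marker present in the genome)
```

Because matching is an exact set-membership test on tokens, it parallelizes naturally over a framework such as Hadoop, which is consistent with the abstract's reported one-order-of-magnitude slowdown over plaintext search.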
Gene and translation initiation site prediction in metagenomic sequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyatt, Philip Douglas; LoCascio, Philip F; Hauser, Loren John
2012-01-01
Gene prediction in metagenomic sequences remains a difficult problem. Current sequencing technologies do not achieve sufficient coverage to assemble the individual genomes in a typical sample; consequently, sequencing runs produce a large number of short sequences whose exact origin is unknown. Since these sequences are usually smaller than the average length of a gene, algorithms must make predictions based on very little data. We present MetaProdigal, a metagenomic version of the gene prediction program Prodigal, which can identify genes in short, anonymous coding sequences with a high degree of accuracy. The novel value of the method consists of enhanced translation initiation site identification, the ability to identify sequences that use alternate genetic codes, and confidence values for each gene call. We compare the results of MetaProdigal with other methods and conclude with a discussion of future improvements.
Lonsdale, Richard; Fort, Rachel M; Rydberg, Patrik; Harvey, Jeremy N; Mulholland, Adrian J
2016-06-20
The mechanism of cytochrome P450 (CYP)-catalyzed hydroxylation of primary amines is currently unclear and is relevant to drug metabolism; previous small model calculations have suggested two possible mechanisms: direct N-oxidation and H-abstraction/rebound. We have modeled the N-hydroxylation of (R)-mexiletine in CYP1A2 with hybrid quantum mechanics/molecular mechanics (QM/MM) methods, providing a more detailed and realistic model. Multiple reaction barriers have been calculated at the QM(B3LYP-D)/MM(CHARMM27) level for the direct N-oxidation and H-abstraction/rebound mechanisms. Our calculated barriers indicate that the direct N-oxidation mechanism is preferred and proceeds via the doublet spin state of Compound I. Molecular dynamics simulations indicate that the presence of an ordered water molecule in the active site assists in the binding of mexiletine in the active site, but this is not a prerequisite for reaction via either mechanism. Several active site residues play a role in the binding of mexiletine in the active site, including Thr124 and Phe226. This work reveals key details of the N-hydroxylation of mexiletine and further demonstrates that mechanistic studies using QM/MM methods are useful for understanding drug metabolism.
Archaeological recording and chemical stratigraphy applied to contaminated land studies.
Photos-Jones, Effie; Hall, Allan J
2011-11-15
The method used by archaeologists for excavation and recording of the stratigraphic evidence, within trenches with or without archaeological remains, can potentially be useful to contaminated land consultants (CLCs). The implementation of archaeological practice in contaminated land assessments (CLAs) is not meant to be an exercise in data overkill; neither should it increase costs. Rather, we suggest that if the excavation and recording of the stratigraphy by a trained archaeologist is followed by in-situ chemical characterisation, then much of the uncertainty associated with current field sampling practices may be removed. This is because built into the chemical stratigraphy is the temporal and spatial relationship between different parts of the site, reflecting the logic behind the distribution of contamination. An archaeological recording with chemical stratigraphy approach to sampling may provide 'one method fits all' for potentially contaminated land sites (CLSs), just as archaeological characterisation of the stratigraphic record provides 'one method fits all' for all archaeological sites irrespective of period (prehistoric to modern) or type (rural, urban or industrial). We also suggest that there may be practical and financial benefits to be gained by pulling together expertise and resources stemming from different disciplines, not simply at the assessment phase, but also in subsequent phases of contaminated land improvement. Copyright © 2011 Elsevier B.V. All rights reserved.
Bioremediation in oil-contaminated sites: bacteria and surfactant accelerated remediation
NASA Astrophysics Data System (ADS)
Strong-Gunderson, Janet M.; Guzman, Francisco
1996-11-01
In Mexico, several environmental issues are being addressed under current governmental legislation. One important issue is restoring sites belonging to Petroleos Mexicanos (PEMEX). PEMEX is a large government-owned oil company that regulates and manages the oil reserves. These sites are primarily contaminated with weathered hydrocarbons, a consequence of extracting millions of barrels of oil. Within the southern regions of Mexico there are sites that were contaminated by activities and spills over the past 30 years. PEMEX has taken the leadership in correcting environmental problems and is very concerned about cleaning up the contaminated sites as quickly as possible. The most significant contaminated sites are located to the north of Veracruz and south of Tabasco. These areas are close to refineries or locations of oil exploration. The primary contaminants are hydrocarbons, among them asphaltenes, aromatics and other compounds. The concentration of the contaminants varies depending on the location of the sites, but it can reach as high as 500,000 ppm. PEMEX has been searching for appropriate and cost-effective technologies to clean up these sites. Biologically based remediation activities are of primary interest to PEMEX. However, other treatment technologies such as chemical-physical methods, encapsulation and incineration are also being considered. The present report summarizes preliminary experiments that measured the feasibility of bioremediation for a contaminated site in southern Mexico.
Williams, Larry R.; Aroniadou-Anderjaska, Vassiliki; Qashu, Felicia; Finne, Huckelberry; Pidoplichko, Volodymyr; Bannon, Desmond I.; Braga, Maria F. M.
2011-01-01
Background Hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) is a high-energy, trinitrated cyclic compound that has been used worldwide since World War II as an explosive in both military and civilian applications. RDX can be released in the environment by way of waste streams generated during the manufacture, use, and disposal of RDX-containing munitions and can leach into groundwater from unexploded munitions found on training ranges. For > 60 years, it has been known that exposure to high doses of RDX causes generalized seizures, but the mechanism has remained unknown. Objective We investigated the mechanism by which RDX induces seizures. Methods and results By screening the affinity of RDX for a number of neurotransmitter receptors, we found that RDX binds exclusively to the picrotoxin convulsant site of the γ-aminobutyric acid type A (GABAA) ionophore. Whole-cell in vitro recordings in the rat basolateral amygdala (BLA) showed that RDX reduces the frequency and amplitude of spontaneous GABAA receptor–mediated inhibitory postsynaptic currents and the amplitude of GABA-evoked postsynaptic currents. In extracellular field recordings from the BLA, RDX induced prolonged, seizure-like neuronal discharges. Conclusions These results suggest that binding to the GABAA receptor convulsant site is the primary mechanism of seizure induction by RDX and that reduction of GABAergic inhibitory transmission in the amygdala is involved in the generation of RDX-induced seizures. Knowledge of the molecular site and the mechanism of RDX action with respect to seizure induction can guide therapeutic strategies, allow more accurate development of safe thresholds for exposures, and help prevent the development of new explosives or other munitions that could pose similar health risks. PMID:21362589
Ansbacher, Tamar; Freud, Yehoshua; Major, Dan Thomas
2018-05-23
Taxadiene synthase (TXS) catalyzes the formation of the natural product Taxa-4(5),11(12)-diene (henceforth Taxadiene). Taxadiene is the precursor in the formation of Taxol, which is an important natural anti-cancer agent. In the current study, we present a detailed mechanistic view of the biosynthesis of Taxadiene by TXS, using a hybrid quantum mechanics-molecular mechanics potential in conjunction with free energy simulation methods. The obtained free energy landscape displays initial endergonic steps followed by a step-wise downhill profile, which is an emerging free energy fingerprint for type I terpene synthases. We identify an active site Trp residue (W753) as a key feature of the TXS active site architecture and propose that this residue stabilizes intermediate cations via π-cation interactions. To validate our proposed active TXS model, we examine a previously reported W753H mutation, which leads to exclusive formation of the side product, cembrene A. The simulations of the W753H mutant show that in the mutant structure, the His side-chain is in perfect position to deprotonate the cembrenyl cation en route to cembrene formation, and that this abortive deprotonation is an energetically facile process. Based on the current model, we propose that an analogous mutation of Y841 to His could possibly lead to verticillane. The current simulations stress the importance of precise positioning of key active site residues in stabilizing intermediate carbocations. In view of the great pharmaceutical importance of Taxadiene, a detailed understanding of the TXS mechanism can provide important clues towards a synthetic strategy for Taxol manufacturing.
NASA Astrophysics Data System (ADS)
Wren, A.; Xu, K.; Ma, Y.; Sanger, D.; Van Dolah, R.
2014-12-01
Bottom-mounted instrumentation was deployed at two sites on an ebb tidal delta to measure hydrodynamics, sediment transport, and seabed elevation. One site ('borrow site') was 2 km offshore and used as a dredging site for beach nourishment of nearby Hilton Head Island in South Carolina; the other site ('reference site') was 10 km offshore and not directly impacted by the dredging. In-situ time-series data were collected during two periods after the dredging: March 15 - June 12, 2012 ('spring') and August 18 - November 18, 2012 ('fall'). At the reference site, directional wave spectra and upper water column current velocities were measured, as well as high-resolution current velocity profiles and suspended sediment concentration profiles in the Bottom Boundary Layer (BBL). Seabed elevation and small-scale seabed changes were also measured. At the borrow site, seabed elevation and near-bed wave and current velocities were collected using an Acoustic Doppler Velocimeter. Throughout both deployments bottom wave orbital velocities ranged from 0 to 110 cm/s at the reference site. Wave orbital velocities were much lower at the borrow site, ranging from 10 to 20 cm/s, as wave energy was dissipated on the extensive and rough sand banks before reaching the borrow site. Suspended sediment concentrations increased throughout the BBL when orbital velocities increased to approximately 20 cm/s. Sediment grain size and critical shear stresses were similar at both sites; therefore, re-suspension due to waves was less frequent at the borrow site. However, sediment concentrations were highly correlated with the tidal cycle at both sites. Semidiurnal tidal currents were similar at the two sites, typically ranging from 0 to 50 cm/s in the BBL. Maximum currents exceeded the critical shear stress, and measured suspended sediment concentrations increased during the first hours of the tidal cycle when the tide switched to flood.
Results indicate waves contributed more to sediment mobility at the reference site, while tidal forcing was the dominant factor at the borrow site. The seabed elevation data corroborate these results, as actively migrating ripples of 10 cm were measured at the reference site, while changes in seabed elevation at the borrow site were more gradual, with approximately 30 cm of net accretion throughout the study.
NASA Astrophysics Data System (ADS)
Wu, Lunyu; Xiong, Xuejun; Li, Xiaolong; Shi, Maochong; Guo, Yongqing; Chen, Liang
2016-12-01
Bottom currents at about 1000 m depth in and around a submarine valley on the continental slope of the northern South China Sea were studied in a 14-month-long experiment from July 2013 to September 2014. The observations reveal that bottom currents are strongly influenced by the topography, being directed along the valley axis or the isobaths. Power density spectrum analysis shows that all the currents have significant peaks at diurnal and semi-diurnal frequencies. Diurnal energy is dominant at the open slope site, which is consistent with many previous studies. However, at the site inside the valley the semi-diurnal energy dominates, although the distance between the two observation sites is quite small (11 km) compared to a typical horizontal first-mode internal tide wavelength (200 km). We found that this phenomenon is caused by the focusing of internal waves of certain frequencies in the valley. The inertial peak is found only at the open slope site in the first deployment and is missing at the inside-valley site and in the rest of the deployments. Monthly averaged residual currents reveal that the near-bottom currents on the slope flow southwestward throughout the year except in August and September 2013, from which we speculate that this is a result of the interaction between a mesoscale eddy and the canyon/sag topography. Currents inside the valley within about 10 mab (meters above the bottom) basically flow along the slope, and in the layers above 10 mab the currents are northwestward, that is, from the deep ocean to the shelf. The monthly mean current vectors manifest an Ekman-layer-like vertical structure at both sites, rotating counter-clockwise when viewed from above.
Federal health web sites: current & future roles.
Cronin, Carol
2002-09-01
An examination of the current and possible future roles of federal health Web sites, this paper provides an overview of site categories, functions, target audiences, marketing approaches, knowledge management, and evaluation strategies. It concludes with a look at future opportunities and challenges for the federal government in providing health information online.
When can time-dependent currents be reproduced by the Landauer steady-state approximation?
NASA Astrophysics Data System (ADS)
Carey, Rachel; Chen, Liping; Gu, Bing; Franco, Ignacio
2017-05-01
We establish well-defined limits in which the time-dependent electronic currents across a molecular junction subject to a fluctuating environment can be quantitatively captured via the Landauer steady-state approximation. For this, we calculate the exact time-dependent non-equilibrium Green's function (TD-NEGF) current along a model two-site molecular junction, in which the site energies are subject to correlated noise, and contrast it with that obtained from the Landauer approach. The ability of the steady-state approximation to capture the TD-NEGF behavior at each instant of time is quantified via the same-time correlation function of the currents obtained from the two methods, while their global agreement is quantified by examining differences in the average currents. The Landauer steady-state approach is found to be a useful approximation when (i) the fluctuations do not disrupt the degree of delocalization of the molecular eigenstates responsible for transport and (ii) the characteristic time for charge exchange between the molecule and leads is fast with respect to the molecular correlation time. For resonant transport, when these conditions are satisfied, the Landauer approach is found to accurately describe the current, both on average and at each instant of time. For non-resonant transport, we find that while the steady-state approach fails to capture the time-dependent transport at each instant of time, it still provides a good approximation to the average currents. These criteria can be employed to adopt effective modeling strategies for transport through molecular junctions in interaction with a fluctuating environment, as is necessary to describe experiments.
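For the static (noise-free) limit of the two-site junction described above, the Landauer picture reduces to evaluating the transmission from the retarded Green's function with wide-band leads. The sketch below is a generic illustration of that textbook calculation, not the authors' TD-NEGF code; the site energies, hopping, and lead couplings are invented values:

```python
import numpy as np

def landauer_transmission(E, eps, t, gamma):
    """T(E) = Tr[Gamma_L G Gamma_R G^dagger] for a symmetric two-site chain
    with wide-band leads attached to sites 1 and 2."""
    H = np.array([[eps, t], [t, eps]], dtype=complex)
    GammaL = np.diag([gamma, 0.0]).astype(complex)
    GammaR = np.diag([0.0, gamma]).astype(complex)
    # Retarded Green's function: G = [E - H - Sigma]^-1, Sigma = -i(GammaL+GammaR)/2
    G = np.linalg.inv(E * np.eye(2) - H + 0.5j * (GammaL + GammaR))
    return np.trace(GammaL @ G @ GammaR @ G.conj().T).real

# Resonant peaks sit near the molecular eigenenergies eps +/- t:
energies = np.linspace(-2.0, 2.0, 401)
T = [landauer_transmission(E, 0.0, 0.5, 0.1) for E in energies]
print(round(max(T), 2))  # → 1.0 (resonant peaks for symmetric coupling)
```

In the paper's language, site-energy fluctuations that leave the eigenstate delocalization intact merely shift these resonances, which is why the steady-state T(E), re-evaluated at each noise realization, can track the exact current.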
NASA Astrophysics Data System (ADS)
Molnar, S.; Cassidy, J. F.; Castellaro, S.; Cornou, C.; Crow, H.; Hunter, J. A.; Matsushima, S.; Sánchez-Sesma, F. J.; Yong, A.
2018-03-01
Nakamura (Q Rep Railway Tech Res Inst 30:25-33, 1989) popularized the application of the horizontal-to-vertical spectral ratio (HVSR) analysis of microtremor (seismic noise or ambient vibration) recordings to estimate the predominant frequency and amplification factor of earthquake shaking. During the following quarter century, popularity in the microtremor HVSR (MHVSR) method grew; studies have verified the stability of a site's MHVSR response over time and validated the MHVSR response with that of earthquake HVSR response. Today, MHVSR analysis is a popular reconnaissance tool used worldwide for seismic microzonation and earthquake site characterization in numerous regions, specifically, in the mapping of site period or fundamental frequency and inverted for shear-wave velocity depth profiles, respectively. However, the ubiquity of MHVSR analysis is predominantly a consequence of its ease in application rather than our full understanding of its theory. We present the state of the art in MHVSR analyses in terms of the development of its theoretical basis, current state of practice, and we comment on its future for applications in earthquake site characterization.
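The HVSR computation itself is easy to sketch: form the amplitude spectra of the two horizontal components and the vertical component of a microtremor recording, combine the horizontals, and take their ratio; the ratio's peak estimates the site's predominant frequency. The following is a minimal synthetic illustration only (real MHVSR processing adds window tapering, spectral smoothing, and averaging over many windows, all omitted here; the function name and test signal are invented):

```python
import numpy as np

def hvsr(north, east, vert, fs):
    """Horizontal-to-vertical spectral ratio from three-component samples."""
    freqs = np.fft.rfftfreq(len(vert), d=1.0 / fs)
    N, E, V = (np.abs(np.fft.rfft(x)) for x in (north, east, vert))
    H = np.sqrt((N ** 2 + E ** 2) / 2.0)  # quadratic mean of the horizontals
    return freqs[1:], H[1:] / V[1:]       # drop the DC bin

# Synthetic site: horizontals amplified at 2 Hz over a flat noise background
fs = 100.0
t = np.arange(0.0, 20.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = 4.0 * np.sin(2.0 * np.pi * 2.0 * t)
vert = rng.standard_normal(t.size)
north = rng.standard_normal(t.size) + signal
east = rng.standard_normal(t.size) + signal
freqs, ratio = hvsr(north, east, vert, fs)
i2 = int(np.argmin(np.abs(freqs - 2.0)))
print(ratio[i2] > 10.0 * np.median(ratio))  # → True: the 2 Hz bin dominates
```

The peak frequency recovered this way corresponds to the site's fundamental frequency discussed above; inverting the full ratio curve for a shear-wave velocity profile requires a forward model and is well beyond this sketch.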
NASA Astrophysics Data System (ADS)
Darmawan, A.; Saputra, D. K.; Wiadnya, D. G. R.; Gusmida, A. M.
2018-04-01
Turtles, among the most threatened coastal-marine fauna, are protected under both national and global regulations. However, many of their nesting sites have been degraded in recent years. Completing natal homing, adult females emerge at night to lay eggs in the upper intertidal and supratidal zone of the sandy beach from which they hatched. This study characterized the coastal topography of beaches used as nesting sites, covering 117 km of coastline at Pacitan Regency, Southern Java Sea. Shifts in beach morphology through time were mapped from Landsat 8 and Sentinel 2a satellite imagery using remote sensing (GIS) methods, combined with in-situ data on current coastline features, slope, and tide variations. Results identified a typical sandy beach, called Taman Ria Beach, long recognized as a nesting site for Lepidochelys olivacea, locally named Penyu Lekang. According to this study, approximately 3.49 ha of supratidal area was predicted at Taman Ria Beach.
CO₂ pellet decontamination technology at Westinghouse Hanford
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldridge, T.L.; Aldrich, L.K. II; Bowman, E.V.
1995-03-01
Experimentation and testing with CO₂ pellet decontamination technology is being conducted at Westinghouse Hanford Company (WHC), Richland, Washington. There are 1,100 known existing waste sites at Hanford. The sites specified by federal and state agencies are currently being studied to determine the cleanup methods best suited to each site. These sites are contaminated, and work on them is conducted in compliance with the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). There are also 63 treatment, storage, and disposal units, for example, groups of waste tanks or drums. In 1992, there were 100 planned activities scheduled to bring these units into Resource Conservation and Recovery Act (RCRA) compliance or close them after waste removal. Ninety-six of these were completed. The remaining four were delayed or are being negotiated with regulatory agencies. As a result of past defense program activities at Hanford, a tremendous volume of materials and equipment has accumulated and requires remediation.
Conductance of three-terminal molecular bridge based on tight-binding theory
NASA Astrophysics Data System (ADS)
Wang, Li-Guang; Li, Yong; Yu, Ding-Wen; Katsunori, Tagami; Masaru, Tsukada
2005-05-01
The quantum transmission characteristics of a three-benzene-ring nano-molecular bridge are investigated theoretically using a Green's function approach based on tight-binding theory, with a single π orbital per carbon atom site. The transmission probabilities for electrons transported through the molecular bridge from one terminal to the other two terminals are obtained. The electronic current distributions inside the molecular bridge are calculated and shown graphically using the current density method based on the Fisher-Lee formula at the energy points E = ±0.42, ±1.06 and ±1.5, where the transmission spectra exhibit peaks. We find that the transmission spectra depend strongly on the incident electron energy and the molecular levels, and that the current distributions agree well with the quantum analogue of Kirchhoff's current conservation law.
Agapiou, A; Zorba, E; Mikedi, K; McGregor, L; Spiliopoulou, C; Statheropoulos, M
2015-07-09
Field experiments were devised to mimic the entrapment conditions under the rubble of collapsed buildings aiming to investigate the evolution of volatile organic compounds (VOCs) during the early dead body decomposition stage. Three pig carcasses were placed inside concrete tunnels of a search and rescue (SAR) operational field terrain for simulating the entrapment environment after a building collapse. The experimental campaign employed both laboratory and on-site analytical methods running in parallel. The current work focuses only on the results of the laboratory method using thermal desorption coupled to comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (TD-GC×GC-TOF MS). The flow-modulated TD-GC×GC-TOF MS provided enhanced separation of the VOC profile and served as a reference method for the evaluation of the on-site analytical methods in the current experimental campaign. Bespoke software was used to deconvolve the VOC profile to extract as much information as possible into peak lists. In total, 288 unique VOCs were identified (i.e., not found in blank samples). The majority were aliphatics (172), aromatics (25) and nitrogen compounds (19), followed by ketones (17), esters (13), alcohols (12), aldehydes (11), sulfur (9), miscellaneous (8) and acid compounds (2). The TD-GC×GC-TOF MS proved to be a sensitive and powerful system for resolving the chemical puzzle of above-ground "scent of death". Copyright © 2015 Elsevier B.V. All rights reserved.
Kaji, Amy H; Langford, Vinette; Lewis, Roger J
2008-09-01
There is currently no validated method for assessing hospital disaster preparedness. We determine the degree of correlation between the results of 3 methods for assessing hospital disaster preparedness: administration of an on-site survey, drill observation using a structured evaluation tool, and video analysis of team performance in the hospital incident command center. This was a prospective, observational study conducted during a regional disaster drill, comparing the results from an on-site survey, a structured disaster drill evaluation tool, and a video analysis of teamwork, performed at six 911-receiving hospitals in Los Angeles County, CA. The on-site survey was conducted separately from the drill and assessed hospital disaster plan structure, vendor agreements, modes of communication, medical and surgical supplies, involvement of law enforcement, mutual aid agreements with other facilities, drills and training, surge capacity, decontamination capability, and pharmaceutical stockpiles. The drill evaluation tool, developed by Johns Hopkins University under contract from the Agency for Healthcare Research and Quality, was used to assess various aspects of drill performance, such as the availability of the hospital disaster plan, the geographic configuration of the incident command center, whether drill participants were identifiable, whether the noise level interfered with effective communication, and how often key information (eg, number of available staffed floor, intensive care, and isolation beds; number of arriving victims; expected triage level of victims; number of potential discharges) was received by the incident command center. Teamwork behaviors in the incident command center were quantitatively assessed using the MedTeams analysis of the video recordings obtained during the disaster drill. Spearman rank correlations between pair-wise groupings of the results of the 3 assessment methods were calculated.
The 3 evaluation methods demonstrated qualitatively different results with respect to each hospital's level of disaster preparedness. The Spearman rank correlation coefficient between the results of the on-site survey and the video analysis of teamwork was -0.34; between the results of the on-site survey and the structured drill evaluation tool, 0.15; and between the results of the video analysis and the drill evaluation tool, 0.82. The disparate results obtained from the 3 methods suggest that each measures distinct aspects of disaster preparedness, and perhaps no single method adequately characterizes overall hospital preparedness.
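The pair-wise comparisons reported above use the Spearman rank correlation, which is the Pearson correlation applied to the ranks of the observations. A minimal pure-Python sketch (the score lists below are hypothetical illustrations, not the study's data):

```python
def ranks(xs):
    """Return average ranks (1-based) of xs, handling ties by averaging."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend the run of tied values
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg_rank
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly concordant rankings give +1.0; reversed rankings give -1.0.
print(spearman([1, 2, 3, 4, 5], [2, 4, 6, 8, 10]))  # → 1.0
```

In practice a library routine (e.g. scipy.stats.spearmanr) would be used; the point here is only the rank-then-correlate structure of the statistic.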
West, James E; O'Neill, Sandra M; Ylitalo, Gina M
2017-08-01
We modeled temporal trends in polychlorinated biphenyls (PCBs), polybrominated diphenyl ethers (PBDEs), and dichlorodiphenyltrichloroethane and its metabolites (DDTs) in two indicator fish species representing benthic and pelagic habitats in Puget Sound, Washington, USA. English sole (Parophrys vetulus, benthic) index sites and larger-scale Pacific herring (Clupea pallasii, pelagic) foraging areas represented a wide range of possible contamination conditions, with sampling locations situated adjacent to watersheds exhibiting high, medium and low development. Consistency in analytical data throughout the study was maintained by either calculating method-bias-correction factors on paired samples as methods evolved or by analyzing older archived samples by current methods. PCBs declined moderately in two herring stocks from a low-development basin (2.3 and 4.0% annual rate of decline) and showed no change in the highly developed and moderately developed basins during a 16- to 21-year period. PCBs increased in English sole from four of ten sites (2.9-7.1%), and the remaining six exhibited no significant change. PBDEs and DDTs declined significantly in all herring stocks (4.2-8.1%), although analytical challenges warrant caution in interpreting DDT results. PBDEs declined in English sole from two high-development and one low-development site (3.7-7.2%) and remained unchanged in the remaining seven. DDTs increased in English sole from one high-development site (Tacoma City Waterway) and declined in two high-development and one low development site. As with herring, analytical challenges warrant caution in interpreting the English sole DDT results. It is likely that source controls and mitigation efforts have contributed to the declines in PBDEs and DDTs overall, whereas PCBs appear to have persisted, especially in the pelagic food web, despite bans in PCB production and use.
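Annual rates of decline such as the 2.3% and 4.0% figures above are typically obtained from a log-linear trend model, ln(C) = a + b·year, where the annual percent change is 100·(e^b − 1). A sketch under that assumption (the study's exact model specification is not given here, and the data below are synthetic):

```python
import math

def annual_percent_change(years, concentrations):
    """Fit ln(concentration) = a + b*year by ordinary least squares and
    return the implied annual percent change, 100*(exp(b) - 1).
    Negative values indicate an annual rate of decline."""
    logs = [math.log(c) for c in concentrations]
    n = len(years)
    my, ml = sum(years) / n, sum(logs) / n
    b = (sum((y - my) * (l - ml) for y, l in zip(years, logs))
         / sum((y - my) ** 2 for y in years))
    return 100.0 * (math.exp(b) - 1.0)

# Synthetic series declining exactly 4% per year over a 16-year window:
yrs = list(range(2000, 2016))
conc = [100.0 * 0.96 ** (y - 2000) for y in yrs]
print(annual_percent_change(yrs, conc))  # → -4.0 (a 4% annual decline)
```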
Carroll, Carlos; McRae, Brad H; Brookes, Allen
2012-02-01
Centrality metrics evaluate paths between all possible pairwise combinations of sites on a landscape to rank the contribution of each site to facilitating ecological flows across the network of sites. Computational advances now allow application of centrality metrics to landscapes represented as continuous gradients of habitat quality. This avoids the binary classification of landscapes into patch and matrix required by patch-based graph analyses of connectivity. It also avoids the focus on delineating paths between individual pairs of core areas characteristic of most corridor- or linkage-mapping methods of connectivity analysis. Conservation of regional habitat connectivity has the potential to facilitate recovery of the gray wolf (Canis lupus), a species currently recolonizing portions of its historic range in the western United States. We applied 3 contrasting linkage-mapping methods (shortest path, current flow, and minimum-cost-maximum-flow) to spatial data representing wolf habitat to analyze connectivity between wolf populations in central Idaho and Yellowstone National Park (Wyoming). We then applied 3 analogous betweenness centrality metrics to analyze connectivity of wolf habitat throughout the northwestern United States and southwestern Canada to determine where it might be possible to facilitate range expansion and interpopulation dispersal. We developed software to facilitate application of centrality metrics. Shortest-path betweenness centrality identified a minimal network of linkages analogous to those identified by least-cost-path corridor mapping. Current flow and minimum-cost-maximum-flow betweenness centrality identified diffuse networks that included alternative linkages, which will allow greater flexibility in planning. Minimum-cost-maximum-flow betweenness centrality, by integrating both land cost and habitat capacity, allows connectivity to be considered within planning processes that seek to maximize species protection at minimum cost. 
Centrality analysis is relevant to conservation and landscape genetics at a range of spatial extents, but it may be most broadly applicable within single- and multispecies planning efforts to conserve regional habitat connectivity. ©2011 Society for Conservation Biology.
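Shortest-path betweenness centrality, the first of the three metrics applied above, is conventionally computed with Brandes' algorithm. A minimal sketch for a small unweighted, undirected habitat graph (the toy graph is illustrative, not the wolf-habitat data):

```python
from collections import deque

def betweenness(graph):
    """Brandes' betweenness centrality for an unweighted, undirected graph
    given as {node: [neighbors]}. Each unordered pair is counted once."""
    bc = {v: 0.0 for v in graph}
    for s in graph:
        sigma = {v: 0 for v in graph}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in graph}; dist[s] = 0
        preds = {v: [] for v in graph}
        stack, q = [], deque([s])
        while q:  # BFS from s
            v = q.popleft()
            stack.append(v)
            for w in graph[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in graph}
        while stack:  # accumulate dependencies in reverse BFS order
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: bc[v] / 2 for v in bc}  # undirected: halve double-counting

# On a path a-b-c, only b lies between a pair of other nodes:
g = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
print(betweenness(g))  # → {'a': 0.0, 'b': 1.0, 'c': 0.0}
```

Current-flow and minimum-cost-maximum-flow variants replace the shortest-path counts with electrical-flow or network-flow solutions, which is why they identify more diffuse linkage networks.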
Research on Fault Characteristics and Line Protections Within a Large-scale Photovoltaic Power Plant
NASA Astrophysics Data System (ADS)
Zhang, Chi; Zeng, Jie; Zhao, Wei; Zhong, Guobin; Xu, Qi; Luo, Pandian; Gu, Chenjie; Liu, Bohan
2017-05-01
Centralized photovoltaic (PV) systems have different fault characteristics from distributed PV systems because of their different system structures and controls. This makes the fault analysis and protection methods used in distribution networks with distributed PV unsuitable for a centralized PV power plant. Therefore, a consolidated expression for the fault current within a PV power plant under different controls was derived, taking into account the fault response of the PV array. Then, supported by the fault current analysis and on-site testing data, overcurrent relay (OCR) performance was evaluated in the collection system of an 850 MW PV power plant. The evaluation reveals that OCRs at the downstream ends of overhead lines may malfunction. To address this, a new relay scheme was proposed using directional distance elements. A detailed PV system model was built in PSCAD/EMTDC and verified against the on-site testing data. Simulation results indicate that the proposed relay scheme effectively solves these problems under various fault scenarios and PV plant output levels.
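Overcurrent relays of the kind evaluated here commonly follow an IEC 60255 inverse-time characteristic; the standard-inverse curve is t = TMS · 0.14 / ((I/Is)^0.02 − 1). The sketch below uses illustrative settings, not the plant's actual relay parameters, and is included only to show why low PV fault currents (close to pickup) produce long or absent trips:

```python
def iec_standard_inverse_trip_time(current, pickup, tms):
    """IEC 60255 standard-inverse overcurrent characteristic:
    t = TMS * 0.14 / (M**0.02 - 1), with M = current / pickup.
    Returns None when current is at or below pickup (relay does not operate)."""
    m = current / pickup
    if m <= 1.0:
        return None  # inverter-limited fault current may never reach pickup
    return tms * 0.14 / (m ** 0.02 - 1.0)

# Twice pickup with TMS = 0.1 trips in roughly one second:
print(iec_standard_inverse_trip_time(800.0, 400.0, 0.1))
```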
Enhancing superconducting critical current by randomness
Wang, Y. L.; Thoutam, L. R.; Xiao, Z. L.; ...
2016-01-11
The key ingredient of high critical currents in a type-II superconductor is defect sites that pin vortices. Here, we demonstrate that a random pinscape, an overlooked pinning system in nanopatterned superconductors, can lead to a substantially larger critical current enhancement at high magnetic fields than an ordered array of vortex pin sites. We reveal that the better performance of a random pinscape is due to the variation of the local density of its pinning sites, which mitigates the motion of vortices. This is confirmed by achieving even higher enhancement of the critical current through a conformally mapped random pinscape, where the distribution of the local density of pinning sites is further enlarged. Our findings highlight the potential of random pinscapes in enhancing the superconducting critical currents of applied superconductors in which random pin sites of nanoscale defects, emerging in the materials synthesis process or through ex-situ irradiation, are the only practical choice for large-scale production. Our results may also stimulate research on the effects of a random pinscape in other complementary systems such as colloidal crystals, Bose-Einstein condensates, and Luttinger liquids.
WGE: a CRISPR database for genome engineering.
Hodgkins, Alex; Farne, Anna; Perera, Sajith; Grego, Tiago; Parry-Smith, David J; Skarnes, William C; Iyer, Vivek
2015-09-15
The rapid development of CRISPR-Cas9 mediated genome editing techniques has given rise to a number of online and stand-alone tools to find and score CRISPR sites for whole genomes. Here we describe the Wellcome Trust Sanger Institute Genome Editing database (WGE), which uses novel methods to compute, visualize and select optimal CRISPR sites in a genome browser environment. The WGE database currently stores single and paired CRISPR sites and pre-calculated off-target information for CRISPRs located in the mouse and human exomes. Scoring and display of off-target sites is simple and intuitive, and filters can be applied to identify high-quality CRISPR sites rapidly. WGE also provides a tool for the design and display of gene targeting vectors in the same genome browser, along with gene models, protein translation and variation tracks. WGE is open, extensible and can be set up to compute and present CRISPR sites for any genome. The WGE database is freely available at www.sanger.ac.uk/htgt/wge. Contact: vvi@sanger.ac.uk or skarnes@sanger.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
Health system reform and the role of field sites based upon demographic and health surveillance.
Tollman, S. M.; Zwi, A. B.
2000-01-01
Field sites for demographic and health surveillance have made well-recognized contributions to the evaluation of new or untested interventions, largely through efficacy trials involving new technologies or the delivery of selected services, e.g. vaccines, oral rehydration therapy and alternative contraceptive methods. Their role in health system reform, whether national or international, has, however, proved considerably more limited. The present article explores the characteristics and defining features of such field sites in low-income and middle-income countries and argues that many currently active sites have a largely untapped potential for contributing substantially to national and subnational health development. Since the populations covered by these sites often correspond with the boundaries of districts or subdistricts, the strategic use of information generated by demographic surveillance can inform the decentralization efforts of national and provincial health authorities. Among the areas of particular importance are the following: making population-based information available and providing an information resource; evaluating programmes and interventions; and developing applications to policy and practice. The question is posed as to whether their potential contribution to health system reform justifies arguing for adaptations to these field sites and expanded investment in them. PMID:10686747
Generalizing ecological site concepts of the Colorado Plateau for landscape-level applications
Duniway, Michael C.; Nauman, Travis; Johanson, Jamin K.; Green, Shane; Miller, Mark E.; Bestelmeyer, Brandon T.
2016-01-01
Numerous ecological site descriptions in the southern Utah portion of the Colorado Plateau can be difficult to navigate, so we held a workshop aimed at adding value and functionality to the current ecological site system. We created new groups of ecological sites and drafted state-and-transition models for these new groups. We were able to distill the current large number of ecological sites in the study area (ca. 150) into eight ecological site groups that capture important variability in ecosystem dynamics. Several inventory and monitoring programs and landscape-scale planning actions will likely benefit from more generalized ecological site group concepts.
Althabe, Fernando; Onyamboko, Marie; Kaseba-Sata, Christine; Castilla, Eduardo E.; Freire, Salvio; Garces, Ana L.; Parida, Sailajanandan; Goudar, Shivaprasad S.; Kadir, Muhammad Masood; Goco, Norman; Thornberry, Jutta; Daniels, Magdalena; Bartz, Janet; Hartwell, Tyler; Moss, Nancy; Goldenberg, Robert
2008-01-01
Objectives. We examined pregnant women's use of cigarettes and other tobacco products and the exposure of pregnant women and their young children to secondhand smoke (SHS) in 9 nations in Latin America, Asia, and Africa. Methods. Face-to-face surveys were administered to 7961 pregnant women (more than 700 per site) between October 2004 and September 2005. Results. At all Latin American sites, pregnant women commonly reported that they had ever tried cigarette smoking (range: 78.3% [Uruguay] to 35.0% [Guatemala]). The highest levels of current smoking were found in Uruguay (18.3%), Argentina (10.3%), and Brazil (6.1%). Experimentation with smokeless tobacco occurred in the Democratic Republic of the Congo and India; one third of all respondents in Orissa, India, were current smokeless tobacco users. SHS exposure was common: between 91.6% (Pakistan) and 17.1% (Democratic Republic of the Congo) of pregnant women reported that smoking was permitted in their home. Conclusions. Pregnant women's tobacco use and SHS exposure are current or emerging problems in several low- and middle-income nations, jeopardizing ongoing efforts to improve maternal and child health. PMID:18309125
Use of social networking for dental hygiene program recruitment.
Ennis, Rachel S
2011-01-01
Social networking has become a popular and effective means of communication used by students in the millennial generation. Academic admissions officers are beginning to utilize social networking methods for recruitment of students. However, the dental hygiene literature has reported little information about the use of social networking for recruitment strategies. This paper describes one institution's process of creating and implementing a social network site for prospective and current students.
Carlos A. Gonzalez-Benecke; Eric J. Jokela; Wendell P. Cropper; Rosvel Bracho; Daniel J. Leduc
2014-01-01
The forest simulation model, 3-PG, has been widely applied as a useful tool for predicting growth of forest species in many countries. The model has the capability to estimate the effects of management, climate and site characteristics on many stand attributes using easily available data. Currently, there is an increasing interest in estimating biomass and assessing...
Method for Implementing Subsurface Solid Derived Concentration Guideline Levels (DCGL) - 12331
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lively, J.W.
2012-07-01
The U.S. Nuclear Regulatory Commission (NRC) and other federal agencies currently approve the Multi-Agency Radiation Site Survey and Investigation Manual (MARSSIM) as guidance for licensees who are conducting final radiological status surveys in support of decommissioning. MARSSIM provides a method to demonstrate compliance with the applicable regulation by comparing residual radioactivity in surface soils with derived concentration guideline levels (DCGLs), but specifically discounts its applicability to subsurface soils. Many sites and facilities undergoing decommissioning contain subsurface soils that are potentially impacted by radiological constituents. In the absence of specific guidance designed to address the derivation of subsurface soil DCGLs and compliance demonstration, decommissioning facilities have attempted to apply DCGLs and final status survey techniques designed specifically for surface soils to subsurface soils. The decision to apply surface soil limits and surface soil compliance metrics to subsurface soils typically results in significant over-excavation with associated cost escalation. MACTEC, Inc. has developed the overarching concepts and principles found in recent NRC decommissioning guidance in NUREG 1757 to establish a functional method to derive dose-based subsurface soil DCGLs. The subsurface soil method developed by MACTEC also establishes a rigorous set of criterion-based data evaluation metrics (with analogs to the MARSSIM methodology) that can be used to demonstrate compliance with the developed subsurface soil DCGLs. The method establishes a continuum of volume factors that relate the size and depth of a volume of subsurface soil having elevated concentrations of residual radioactivity with its ability to produce dose.
The method integrates the subsurface soil sampling regime with the derivation of the subsurface soil DCGL such that a self-regulating optimization is naturally sought by both the responsible party and regulator. This paper describes the concepts and basis used by MACTEC to develop the dose-based subsurface soil DCGL method. The paper will show how MACTEC's method can be used to demonstrate that higher concentrations of residual radioactivity in subsurface soils (as compared with surface soils) can meet the NRC's dose-based regulations. MACTEC's method has been used successfully to obtain the NRC's radiological release at a site with known radiological impacts to subsurface soils exceeding the surface soil DCGL, saving both time and cost. Having considered the current NRC guidance for consideration of residual radioactivity in subsurface soils during decommissioning, MACTEC has developed a technically based approach to the derivation of and demonstration of compliance with subsurface soil DCGLs for radionuclides. In fact, the process uses the already accepted concepts and metrics approved for surface soils as the foundation for deriving scaling factors used to calculate subsurface soil DCGLs that are at least equally protective of the decommissioning annual dose standard. Each of the elements identified for consideration in the current NRC guidance is addressed in this proposed method. Additionally, there is considerable conservatism built into the assumptions and techniques used to arrive at subsurface soil scaling factors and DCGLs. The degree of conservatism embodied in the approach used is such that risk managers and decision makers approving and using subsurface soil DCGLs derived in accordance with this method can be confident that the future exposures will be well below permissible and safe levels. The technical basis for the method can be applied to a broad variety of sites with residual radioactivity in subsurface soils. 
Given the costly nature of soil surveys, excavation, and disposal of soils as low-level radioactive waste, MACTEC's method for deriving and demonstrating compliance with subsurface soil DCGLs offers the possibility of significant cost savings over the traditional approach of applying surface soil DCGLs to subsurface soils. Furthermore, while as yet untested, MACTEC believes that the concepts and methods embodied in this approach could readily be applied to other types of contamination found in subsurface soils. (author)
Wu, Yuhua; Wang, Yulei; Li, Jun; Li, Wei; Zhang, Li; Li, Yunjing; Li, Xiaofei; Li, Jun; Zhu, Li; Wu, Gang
2014-01-01
The Cauliflower mosaic virus (CaMV) 35S promoter (P35S) is a commonly used target for detection of genetically modified organisms (GMOs). There are currently 24 reported detection methods, targeting different regions of the P35S promoter. Initial assessment revealed that, due to the absence of primer binding sites in the P35S sequence, 19 of the 24 reported methods failed to detect P35S in MON88913 cotton, and two of the other methods could only be applied to certain GMOs. The remaining three reported methods were not suitable for measurement of P35S in some testing events, because SNPs in the binding sites of the primer/probe would result in abnormal amplification plots and poor linear regression parameters. In this study, we discovered a conserved region in the P35S sequence through sequencing of P35S promoters from multiple transgenic events, and developed new qualitative and quantitative detection systems targeting this conserved region. The qualitative PCR could detect the P35S promoter in 23 unique GMO events with high specificity and sensitivity. The quantitative method was suitable for measurement of the P35S promoter, exhibiting good agreement between the amount of template and Ct values for each testing event. This study provides a general P35S screening method with greater coverage than existing methods. PMID:25483893
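The "good agreement between the amount of template and Ct values" mentioned above is conventionally quantified with a qPCR standard curve, Ct = a + b·log10(copies), whose slope gives the amplification efficiency E = 10^(−1/b) − 1 (a slope near −3.32 corresponds to ~100% efficiency). A sketch with synthetic data, not the study's measurements:

```python
def qpcr_standard_curve(log10_copies, ct_values):
    """Fit Ct = a + b*log10(copies) by least squares and return
    (slope, efficiency), where efficiency E = 10**(-1/slope) - 1."""
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, 10.0 ** (-1.0 / slope) - 1.0

# Synthetic dilution series with an ideal slope of -3.32:
xs = [2, 3, 4, 5, 6]                      # log10 template copies
cts = [40.0 - 3.32 * x for x in xs]       # perfectly linear Ct values
slope, eff = qpcr_standard_curve(xs, cts)
print(slope, eff)                          # slope → -3.32, efficiency ≈ 1.0
```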
Watershed Planning within a Quantitative Scenario Analysis Framework.
Merriam, Eric R; Petty, J Todd; Strager, Michael P
2016-07-24
There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
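The landscape-based cumulative effects models above are built with multiple linear regression. A minimal pure-Python sketch of ordinary least squares via the normal equations (synthetic predictors and response, not the Appalachian field data):

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved by Gaussian elimination with partial pivoting.
    X: rows of predictors; the caller includes an intercept column of 1s."""
    n, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
         for i in range(k)]                       # X'X
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]  # X'y
    for col in range(k):                          # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k                              # back substitution
    for i in range(k - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef

# Exact synthetic relationship y = 1 + 2*x1 + 3*x2 is recovered:
X = [[1, 0, 1], [1, 1, 0], [1, 2, 2], [1, 3, 5], [1, 1, 4]]
y = [4, 3, 11, 22, 15]
print(ols(X, y))  # → coefficients ≈ [1.0, 2.0, 3.0]
```

In a real scenario-analysis workflow the fitted coefficients would then be applied to projected land-use predictor values to forecast aquatic condition under each development scenario.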
Hydrofluoric Acid-Based Derivatization Strategy To Profile PARP-1 ADP-Ribosylation by LC-MS/MS.
Gagné, Jean-Philippe; Langelier, Marie-France; Pascal, John M; Poirier, Guy G
2018-06-11
Despite significant advances in the development of mass spectrometry-based methods for the identification of protein ADP-ribosylation, current protocols suffer from several drawbacks that preclude their widespread applicability. Given the intrinsic heterogeneous nature of poly(ADP-ribose), a number of strategies have been developed to generate simple derivatives for effective interrogation of protein databases and site-specific localization of the modified residues. Currently, the generation of spectral signatures indicative of ADP-ribosylation rely on chemical or enzymatic conversion of the modification to a single mass increment. Still, limitations arise from the lability of the poly(ADP-ribose) remnant during tandem mass spectrometry, the varying susceptibilities of different ADP-ribose-protein bonds to chemical hydrolysis, or the context dependence of enzyme-catalyzed reactions. Here, we present a chemical-based derivatization method applicable to the confident identification of site-specific ADP-ribosylation by conventional mass spectrometry on any targeted amino acid residue. Using PARP-1 as a model protein, we report that treatment of ADP-ribosylated peptides with hydrofluoric acid generates a specific +132 Da mass signature that corresponds to the decomposition of mono- and poly(ADP-ribosylated) peptides into ribose adducts as a consequence of the cleavage of the phosphorus-oxygen bonds.
Alternatives to animal testing: information resources via the Internet and World Wide Web.
Hakkinen, P J Bert; Green, Dianne K
2002-04-25
Many countries, including the United States, Canada, European Union member states, and others, require that a comprehensive search for possible alternatives be completed before beginning some or all research involving animals. Completing comprehensive alternatives searches and keeping current with information associated with alternatives to animal testing is a challenge that will be made easier as people throughout the world gain access to the Internet and World Wide Web. Numerous Internet and World Wide Web resources are available to provide guidance and other information on in vitro and other alternatives to animal testing. A comprehensive Web site is Alternatives to Animal Testing on the Web (Altweb), which serves as an online clearinghouse for resources, information, and news about alternatives to animal testing. Examples of other important Web sites include the joint one for the (US) Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) and the National Toxicology Program (NTP) Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM) and the Norwegian Reference Centre for Laboratory Animal Science and Alternatives (The NORINA database). Internet mailing lists and online access to bulletin boards, discussion areas, newsletters, and journals are other ways to access and share information to stay current with alternatives to animal testing.
Studies on Electrical and Magnetic Properties of Mg-Substituted Nickel Ferrites
NASA Astrophysics Data System (ADS)
Chavan, Pradeep; Naik, L. R.; Belavi, P. B.; Chavan, Geeta; Ramesha, C. K.; Kotnala, R. K.
2017-01-01
The semiconducting polycrystalline ferrite materials with the general formula Ni1-xMgxFe2O4 were synthesized using the solid-state reaction method. X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, and atomic force microscopy were used to study the structural parameters. XRD confirms the formation of a single-phase cubic spinel structure of the ferrites. The crystallite sizes, determined using the Debye-Scherrer formula, range from 0.963 μm to 1.069 μm. The cation distribution shows that Mg2+ ions occupy the tetrahedral site (A-site) and Ni2+ ions occupy the octahedral site (B-site), whereas Fe3+ ions occupy both octahedral and tetrahedral sites. Elastic parameters such as the longitudinal modulus, rigidity modulus, Young's modulus, bulk modulus, and Debye temperature were estimated using the FTIR technique. The decrease of direct current (DC) resistivity with increasing temperature indicates the semiconducting nature of the ferrites. The dielectric constant as well as the loss tangent decreases with increasing frequency and is almost constant at still higher frequencies. This usual dielectric dispersion behavior is attributed to Maxwell-Wagner interfacial polarization and is in accordance with Koop's phenomenological theory. The linear increase of alternating current conductivity with frequency indicates a small-polaron hopping conduction mechanism in all the ferrites. Magnetic properties such as the saturation magnetization (Ms), magnetic moment, coercivity, remnant magnetization (Mr), and the ratio Mr/Ms were estimated from the M-H loop.
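Crystallite size from XRD peak broadening, as quoted above, follows the Debye-Scherrer equation D = Kλ/(β cos θ), where β is the peak's full width at half maximum in radians and θ is half the diffraction angle. The sketch below assumes Cu Kα radiation (λ = 0.15406 nm) and shape factor K = 0.9, with illustrative peak values rather than the paper's measurements:

```python
import math

def scherrer_size_nm(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Debye-Scherrer crystallite size D = K*lambda / (beta * cos(theta)).
    fwhm_deg: peak FWHM in degrees; two_theta_deg: diffraction angle 2-theta."""
    beta = math.radians(fwhm_deg)            # FWHM in radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle theta
    return k * wavelength_nm / (beta * math.cos(theta))

# A 0.2865-degree-wide peak at 2θ = 30° gives a size of roughly 29 nm:
print(scherrer_size_nm(0.2865, 30.0))
```

Note that instrumental broadening is normally subtracted from the measured FWHM before applying the formula; that correction is omitted here for brevity.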
Thoemmes, Stephen F; Stutzke, Crystal A; Du, Yanmei; Browning, Michael D; Buttrick, Peter M; Walker, Lori A
2014-01-31
Phosphorylation of cardiac troponin I (TnI) is a well-established mechanism by which cardiac contractility is modulated. However, a number of phosphorylation sites on TnI contribute, singly or in combination, to influence cardiac function. Accordingly, methods for accurately measuring site-specific TnI phosphorylation are needed. Currently, two strategies are employed: mass spectrometry, which is costly, difficult, and has low throughput; and Western blotting using phospho-specific antibodies, which is limited by the availability of reagents. In this report, we describe a cohort of new site-specific TnI phosphoantibodies, generated against physiologically relevant phosphorylation sites, that are superior to the current commercially available antibodies: an antibody to phospho-serine 22/23 with >5-fold phospho-specificity for phosphorylated TnI, one to phospho-serine 43 with >3-fold phospho-specificity, and one to phospho-serine 150 with >2-fold phospho-specificity. These new antibodies demonstrated greater sensitivity and specificity for phosphorylated TnI than the most widely used commercially available reagents. For example, at a protein load of 20 μg of total cardiac extract, a commercially available antibody recognized phosphorylated and dephosphorylated TnI to the same degree, whereas at the same protein load our phospho-serine 22/23 antibody exhibited no cross-reactivity with dephosphorylated TnI. These new tools should allow a more accurate assessment and a better understanding of the role of TnI phosphorylation in the response of the heart to pathologic stress. Copyright © 2013 Elsevier B.V. All rights reserved.
Belnap, J.; Reynolds, R.L.; Reheis, M.C.; Phillips, S.L.; Urban, F.E.; Goldstein, H.L.
2009-01-01
Large sediment fluxes can have significant impacts on ecosystems. We measured incoming and outgoing sediment across a gradient of soil disturbance (livestock grazing, plowing) and annual plant invasion for 9 years. Our sites included two currently ungrazed sites: one never grazed by livestock and dominated by perennial grasses/well-developed biocrusts and one not grazed since 1974 and dominated by annual weeds with little biocrusts. We used two currently grazed sites: one dominated by annual weeds and the other dominated by perennial plants, both with little biocrusts. Precipitation was highly variable, with years of average, above-average, and extremely low precipitation. During years with average and above-average precipitation, the disturbed sites consistently produced 2.8 times more sediment than the currently undisturbed sites. The never grazed site always produced the least sediment of all the sites. During the drought years, we observed a 5600-fold increase in sediment production from the most disturbed site (dominated by annual grasses, plowed about 50 years previously and currently grazed by livestock) relative to the never grazed site dominated by perennial grasses and well-developed biocrusts, indicating a non-linear, synergistic response to increasing disturbance types and levels. Comparing sediment losses among the sites, biocrusts were most important in predicting site stability, followed by perennial plant cover. Incoming sediment was similar among the sites, and while inputs were up to 9-fold higher at the most heavily disturbed site during drought years compared to average years, the change during the drought conditions was small relative to the large change seen in the sediment outputs. © 2009 Elsevier B.V. All rights reserved.
The Conversion and Sustainable Use of Alumina Refinery Residues: Global Solution Examples
NASA Astrophysics Data System (ADS)
Fergusson, Lee
This paper introduces current industry best practice for the conversion of alumina refinery residues (or "red mud") from hazardous waste to benign, inert material. The paper will examine four neutralization methods and Basecon Technology, a sustainable conversion process. The paper will consider ways through which this converted material can be combined and processed for sustainable applications in the treatment of hazardous waste streams (such as industrial wastewater and sludges, biosolids, and CCA wastes), contaminated brownfield sites, and mine site wastes. Recent discoveries and applications, such as the successful treatment of high levels of radium in drinking water in the USA, will also be discussed. Examples of global solutions and their technical merits will be assessed.
Quality improvement project to reduce infiltration and extravasation events in a pediatric hospital.
Tofani, Barbara F; Rineair, Sylvia A; Gosdin, Craig H; Pilcher, Patricia M; McGee, Susan; Varadarajan, Kartik R; Schoettker, Pamela J
2012-12-01
A safety event response team at Cincinnati Children's Hospital Medical Center developed and tested improvement strategies to reduce peripheral intravenous (PIV) infiltration and extravasation injuries. Improvement activities included development of the touch-look-compare method for hourly PIV site assessment, staff education and mandatory demonstration of PIV site assessment, and performance monitoring and sharing of compliance results. We observed a significant reduction in the injury rate immediately following implementation of the interventions that corresponded with monitoring compliance in performing hourly assessments on patients with a PIV, but this was not sustained. The team is currently examining other strategies to reduce PIV injuries. Copyright © 2012 Elsevier Inc. All rights reserved.
Estimating vegetative biomass from LANDSAT-1 imagery for range management
NASA Technical Reports Server (NTRS)
Seevers, P. M.; Drew, J. V.; Carlson, M. P.
1975-01-01
Evaluation of LANDSAT-1, band 5 data for use in estimation of vegetative biomass for range management decisions was carried out for five selected range sites in the Sandhills region of Nebraska. Analysis of sets of optical density-vegetative biomass data indicated that comparisons of biomass estimation could be made within one frame but not between frames without correction factors. There was high correlation among sites within sets of radiance value-vegetative biomass data and also between sets, indicating comparisons of biomass could be made within and between frames. Landsat-1 data are shown to be a viable alternative to currently used methods of determining vegetative biomass production and stocking rate recommendations for Sandhills rangeland.
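The within-frame biomass comparisons described above rest on a simple radiance-biomass relationship. A minimal Python sketch of that kind of fit is shown below; the paired values are invented for illustration, not the study's actual LANDSAT-1 band-5 measurements or Sandhills clip-plot data.

```python
import numpy as np

# Hypothetical paired observations for one range site: LANDSAT band-5
# radiance values and clipped vegetative biomass (kg/ha). Illustrative only.
radiance = np.array([18.2, 21.5, 25.1, 28.7, 31.0, 34.4])
biomass = np.array([420.0, 610.0, 830.0, 1050.0, 1210.0, 1450.0])

# Ordinary least-squares fit: biomass ~ slope * radiance + intercept
slope, intercept = np.polyfit(radiance, biomass, 1)

# Pearson correlation: the statistic used to judge whether biomass
# comparisons can be made within and between frames
r = np.corrcoef(radiance, biomass)[0, 1]

def estimate_biomass(radiance_value):
    """Predict biomass (kg/ha) for a new radiance reading."""
    return slope * radiance_value + intercept

print(f"r = {r:.3f}")
print(f"predicted biomass at radiance 30.0: {estimate_biomass(30.0):.0f} kg/ha")
```

With strongly correlated data like this, a single calibration per frame supports stocking-rate estimates across sites; between-frame use would require the correction factors the abstract mentions for optical-density data.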
Tritium as an indicator of venues for nuclear tests.
Lyakhova, O N; Lukashenko, S N; Mulgin, S I; Zhdanov, S V
2013-10-01
Currently, owing to the Treaty on the Non-Proliferation of Nuclear Weapons, accurate verification of nuclear explosion venues is a highly topical issue. This paper proposes a new verification method that uses tritium as an indicator. Detailed studies of the tritium content in the air were carried out at the locations of underground nuclear tests - the "Balapan" and "Degelen" testing sites located at the Semipalatinsk Test Site. The paper presents data on the levels and distribution of tritium in the air where tunnels and boreholes are located - explosion epicentres, wellheads and tunnel portals, as well as in estuarine areas of the venues for underground nuclear explosions (UNE). Copyright © 2013 Elsevier Ltd. All rights reserved.
Leuthaeuser, Janelle B; Knutson, Stacy T; Kumar, Kiran; Babbitt, Patricia C; Fetrow, Jacquelyn S
2015-09-01
The development of accurate protein function annotation methods has emerged as a major unsolved biological problem. Protein similarity networks, one approach to function annotation via annotation transfer, group proteins into similarity-based clusters. An underlying assumption is that the edge metric used to identify such clusters correlates with functional information. In this contribution, this assumption is evaluated by observing topologies in similarity networks using three different edge metrics: sequence (BLAST), structure (TM-Align), and active site similarity (active site profiling, implemented in DASP). Network topologies for four well-studied protein superfamilies (enolase, peroxiredoxin (Prx), glutathione transferase (GST), and crotonase) were compared with curated functional hierarchies and structure. As expected, network topology differs, depending on edge metric; comparison of topologies provides valuable information on structure/function relationships. Subnetworks based on active site similarity correlate with known functional hierarchies at a single edge threshold more often than sequence- or structure-based networks. Sequence- and structure-based networks are useful for identifying sequence and domain similarities and differences; therefore, it is important to consider the clustering goal before deciding appropriate edge metric. Further, conserved active site residues identified in enolase and GST active site subnetworks correspond with published functionally important residues. Extension of this analysis yields predictions of functionally determinant residues for GST subgroups. These results support the hypothesis that active site similarity-based networks reveal clusters that share functional details and lay the foundation for capturing functionally relevant hierarchies using an approach that is both automatable and can deliver greater precision in function annotation than current similarity-based methods. 
© 2015 The Authors Protein Science published by Wiley Periodicals, Inc. on behalf of The Protein Society.
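The subnetwork idea above, thresholding an edge metric and reading off clusters, can be sketched in a few lines of Python. The protein names and similarity scores below are invented for illustration; they are not BLAST, TM-Align, or DASP outputs.

```python
from collections import defaultdict, deque

# Hypothetical pairwise active-site similarity scores (higher = more similar).
similarities = {
    ("enoA", "enoB"): 0.92, ("enoA", "enoC"): 0.88, ("enoB", "enoC"): 0.90,
    ("gstA", "gstB"): 0.85, ("enoA", "gstA"): 0.35,
}

def cluster_at_threshold(pairs, threshold):
    """Keep edges at or above the threshold; return connected components."""
    graph = defaultdict(set)
    nodes = set()
    for (a, b), score in pairs.items():
        nodes.update((a, b))
        if score >= threshold:
            graph[a].add(b)
            graph[b].add(a)
    seen, components = set(), []
    for start in sorted(nodes):
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:  # breadth-first traversal of one component
            node = queue.popleft()
            if node in comp:
                continue
            comp.add(node)
            queue.extend(graph[node] - comp)
        seen |= comp
        components.append(sorted(comp))
    return components

# At a strict threshold the network splits into function-like subnetworks.
print(cluster_at_threshold(similarities, 0.80))
```

Sweeping the threshold and comparing the resulting components against a curated functional hierarchy is the kind of analysis the abstract describes for choosing an edge metric.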
Observation of sediment resuspension in Old Tampa Bay, Florida
Schoellhamer, David H.; ,
1990-01-01
Equipment and methodology have been developed to monitor sediment resuspension at two sites in Old Tampa Bay. Velocities are measured with electromagnetic current meters and suspended solids and turbidity are monitored with optical backscatterance sensors. In late November 1989, a vertical array of instrument pairs was deployed from a permanent platform at a deep-water site, and a submersible instrument package with a single pair of instruments was deployed at a shallow-water site. Wind waves caused resuspension at the shallow-water site, but not at the deeper platform site, and spring tidal currents did not cause resuspension at either site.
Iqbal, Muhammad; Hayat, Maqsood
2016-05-01
Gene splicing is a vital source of protein diversity. Precise excision of introns and joining of exons is a central task in eukaryotic gene expression, as exons are usually interrupted by introns. Identification of splicing sites through experimental techniques is a complicated and time-consuming task. With the avalanche of genome sequences generated in the post-genomic age, it remains challenging to develop an automatic, robust and reliable computational method for fast and effective identification of splicing sites. In this study, a hybrid model, "iSS-Hyb-mRMR", is proposed for quick and accurate identification of splicing sites. Two sample representation methods, pseudo trinucleotide composition (PseTNC) and pseudo tetranucleotide composition (PseTetraNC), were used to extract numerical descriptors from DNA sequences, and the hybrid model was developed by concatenating PseTNC and PseTetraNC. In order to select highly discriminative features, the minimum redundancy maximum relevance (mRMR) algorithm was applied to the hybrid feature space. The performance of these feature representation methods was tested using various classification algorithms, including K-nearest neighbor, probabilistic neural network, general regression neural network, and fitting network. The jackknife test was used to evaluate performance on two benchmark datasets, S1 and S2. The proposed predictor achieved an accuracy of 93.26%, sensitivity of 88.77%, and specificity of 97.78% for S1, and an accuracy of 94.12%, sensitivity of 87.14%, and specificity of 98.64% for S2. The performance of the proposed model is thus higher than that of existing methods in the literature, and it should prove useful in studying the mechanism of RNA splicing and in related research. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
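As a rough illustration of the feature-extraction step, the plain k-mer composition part of PseTNC and PseTetraNC can be computed as below. The full "pseudo" descriptors additionally append sequence-order correlation factors, which this sketch omits, and the toy sequence is not a real splice-site sample.

```python
from itertools import product

BASES = "ACGT"

def kmer_composition(seq, k):
    """Normalized k-mer frequency vector over all 4**k k-mers (lexicographic).
    This is the composition part of PseTNC/PseTetraNC; the 'pseudo' terms
    capturing sequence-order correlation are omitted here."""
    kmers = ["".join(p) for p in product(BASES, repeat=k)]
    counts = {km: 0 for km in kmers}
    total = len(seq) - k + 1
    for i in range(total):
        km = seq[i:i + k]
        if km in counts:  # skip windows containing ambiguous bases
            counts[km] += 1
    return [counts[km] / total for km in kmers]

def hybrid_features(seq):
    """Concatenate trinucleotide (64-dim) and tetranucleotide (256-dim)
    compositions, mirroring the hybrid feature space prior to mRMR selection."""
    return kmer_composition(seq, 3) + kmer_composition(seq, 4)

features = hybrid_features("ACGTACGTGGTAAGT")  # toy sequence
print(len(features))  # 320-dimensional hybrid vector
```

A feature selector such as mRMR would then reduce this 320-dimensional vector before it is passed to a classifier like K-nearest neighbor.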
Acute care patient portals: a qualitative study of stakeholder perspectives on current practices.
Collins, Sarah A; Rozenblum, Ronen; Leung, Wai Yin; Morrison, Constance Rc; Stade, Diana L; McNally, Kelly; Bourie, Patricia Q; Massaro, Anthony; Bokser, Seth; Dwyer, Cindy; Greysen, Ryan S; Agarwal, Priyanka; Thornton, Kevin; Dalal, Anuj K
2017-04-01
To describe current practices and stakeholder perspectives of patient portals in the acute care setting. We aimed to: (1) identify key features, (2) recognize challenges, (3) understand current practices for design, configuration, and use, and (4) propose new directions for investigation and innovation. Mixed methods including surveys, interviews, focus groups, and site visits with stakeholders at leading academic medical centers. Thematic analyses to inform development of an explanatory model and recommendations. Site surveys were administered to 5 institutions. Thirty interviews/focus groups were conducted at 4 site visits that included a total of 84 participants. Ten themes regarding content and functionality, engagement and culture, and access and security were identified, from which an explanatory model of current practices was developed. Key features included clinical data, messaging, glossary, patient education, patient personalization and family engagement tools, and tiered displays. Four actionable recommendations were identified by group consensus. Design, development, and implementation of acute care patient portals should consider: (1) providing a single integrated experience across care settings, (2) humanizing the patient-clinician relationship via personalization tools, (3) providing equitable access, and (4) creating a clear organizational mission and strategy to achieve outcomes of interest. Portals should provide a single integrated experience across the inpatient and ambulatory settings. Core functionality includes tools that facilitate communication, personalize the patient, and deliver education to advance safe, coordinated, and dignified patient-centered care. Our findings can be used to inform a "road map" for future work related to acute care patient portals. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 
Washington, Simon; Haque, Md Mazharul; Oh, Jutaek; Lee, Dongmin
2014-05-01
Hot spot identification (HSID) aims to identify potential sites (roadway segments, intersections, crosswalks, interchanges, ramps, etc.) with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology might result in either identifying a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of the available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor-injury and property damage only (PDO) crashes, the challenge of accounting for crash severity in the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and quantile regression technique to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst the concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than the population mean like most methods in practice, which corresponds more closely with how black spots are identified in practice. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression. 
Application of a quantile regression model on equivalent PDO crashes enables identification of a set of high-risk sites that reflect the true safety costs to the society, simultaneously reduces the influence of under-reported PDO and minor injury crashes, and overcomes the limitation of traditional NB model in dealing with preponderance of zeros problem or right skewed dataset. Copyright © 2014 Elsevier Ltd. All rights reserved.
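The core idea, ranking sites against a high quantile of equivalent-PDO crashes rather than the mean, can be shown with a toy example. The counts below are invented, and a constant empirical quantile stands in for the paper's covariate-dependent quantile regression.

```python
import numpy as np

# Hypothetical equivalent-PDO crash counts for 16 road segments (illustrative).
epdo = np.array([0.0, 0.0, 1.5, 2.0, 0.0, 3.5, 12.0, 0.5, 25.0, 4.0,
                 0.0, 1.0, 18.5, 2.5, 0.0, 6.0])

def pinball_loss(q_value, y, tau):
    """Check (pinball) loss underlying quantile regression: over constants,
    it is minimized at a tau-th quantile of y."""
    diff = y - q_value
    return np.mean(np.where(diff >= 0, tau * diff, (tau - 1) * diff))

tau = 0.90
threshold = np.quantile(epdo, tau)            # empirical 90th percentile
hot_spots = np.flatnonzero(epdo > threshold)  # segments flagged as hot spots

print(f"90th-percentile threshold: {threshold:.2f}")
print(f"flagged segment indices: {hot_spots.tolist()}")
```

In the full method the threshold varies with segment covariates (traffic volume, geometry, and so on), so a segment is a hot spot only if its equivalent-PDO count is high relative to comparable segments, not in absolute terms.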
Evaluation of offshore stocking of Lake Trout in Lake Ontario
Lantry, B.F.; O'Gorman, R.; Strang, T.G.; Lantry, J.R.; Connerton, M.J.; Schanger, T.
2011-01-01
Restoration stocking of hatchery-reared lake trout Salvelinus namaycush has occurred in Lake Ontario since 1973. In U.S. waters, fish stocked through 1990 survived well and built a large adult population. Survival of yearlings stocked from shore declined during 1990–1995, and adult numbers fell during 1998–2005. Offshore stocking of lake trout was initiated in the late 1990s in response to its successful mitigation of predation losses to double-crested cormorants Phalacrocorax auritus and the results of earlier studies suggesting it would enhance survival in some cases. The current study was designed to test the relative effectiveness of three stocking methods at a time when poststocking survival of lake trout was quite low and losses to fish predators were a suspected factor. The stocking methods tested during 2000–2002 were May offshore, May onshore, and June onshore. Visual observations during nearshore stockings and hydroacoustic observations of offshore stockings indicated that release methods were not a direct cause of fish mortality. Experimental stockings were replicated for 3 years at one site in the southwest and for 2 years at one site in the southeast. Offshore releases used a landing craft to transport hatchery trucks 3–6 km offshore to 55–60-m-deep water. For the southwest site, offshore stocking significantly enhanced poststocking survival; among the three methods, survival ratios were 1.74 : 1.00 : 1.02 (May offshore : May onshore : June onshore). Although not statistically significant owing to the small samples, the trends were similar for the southeast site, with survival ratios of 1.67 : 1.00 : 0.72. Consistent trends across years and sites indicated that offshore stocking of yearling lake trout during 2000–2002 provided nearly a twofold enhancement in survival; however, this increase does not appear to be great enough to achieve the 12-fold enhancement necessary to return population abundance to restoration targets.
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.
2016-12-01
Multi-criteria decision making (MCDM) is an advanced analytical method for deriving an appropriate result or decision in a multiple-criteria environment, and it has become a standard analytical process for reaching a logical decision among conflicting objectives. Geospatial approaches (remote sensing and GIS) are likewise advanced technical means of collecting, processing, and analysing spatial data. GIS and remote sensing combined with MCDM techniques therefore provide an effective platform for complex decision-making processes, and this combination is widely used for landfill site selection in urban solid waste management policy. The most popular MCDM technique is Weighted Linear Combination (WLC), while the Analytic Hierarchy Process (AHP) is another popular and consistent technique used worldwide for dependable decision making. The main objective of this study is to combine an AHP model, as the MCDM technique, with a Geographic Information System (GIS) to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste requires an appropriate landfill site chosen with regard to the environmental, geological, social, and technical aspects of the region. An MCDM model was generated from five criterion classes relating to these environmental, geological, social, and technical aspects using the AHP method, and the resulting weights were input into GIS to derive the final suitable locations for urban solid waste management. The final suitability analysis shows that 12.2% of the total study area, corresponding to 22.89 km2, is suitable. 
The study area is Keraniganj sub-district of Dhaka district, Bangladesh, a densely populated area that currently has an unmanaged waste disposal system and, in particular, lacks suitable landfill sites for waste dumping.
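The AHP weighting step can be sketched as follows. The pairwise comparison matrix and the geometric-mean approximation of the principal eigenvector are illustrative assumptions, not the values or exact procedure used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four landfill-siting criteria
# (environmental, geological, social, technical) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])
n = A.shape[0]

# Geometric-mean approximation of the principal eigenvector -> criterion weights
weights = np.prod(A, axis=1) ** (1 / n)
weights /= weights.sum()

# Consistency check: lambda_max from A @ w, then consistency index (CI)
# and consistency ratio (CR); the random index RI = 0.90 for n = 4.
lambda_max = np.mean((A @ weights) / weights)
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.90

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(cr, 3))  # CR < 0.1 is conventionally acceptable
```

The resulting weight vector is what a GIS overlay would apply to the reclassified criterion rasters to produce the suitability map.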
[Current status and prospects of portable NIR spectrometer].
Yu, Xin-Yang; Lu, Qi-Peng; Gao, Hong-Zhi; Peng, Zhong-Qi
2013-11-01
Near-infrared spectroscopy (NIRS) is a reliable, rapid, and non-destructive analytical method widely applied in a number of fields such as agriculture, food, the chemical industry, and the oil industry. To suit different applications, near-infrared spectrometers are now varied. Portable near-infrared spectrometers are needed for rapid on-site identification and analysis; instruments of this kind are rugged, compact, and easy to transport. In this paper, the current state of portable near-infrared spectrometers is reviewed. Portable near-infrared spectrometers are built around different monochromator systems: filters, gratings, Fourier-transform methods, acousto-optic tunable filters (AOTF), and a number of new methods based on micro-electro-mechanical systems (MEMS). The first part focuses on the working principles of the different monochromator systems; the advantages and disadvantages of each are briefly discussed, and typical spectrometers of each kind are introduced, with some of their parameters listed. The next part discusses sampling adapters, displays, power supplies, and other components designed to make spectrometers more portable and easier to use. Finally, the current state of portable near-infrared spectrometers is summarized, and future trends in their development in China and abroad are discussed.
Teschoviruses as Indicators of Porcine Fecal Contamination of Surface Water
Jiménez-Clavero, Miguel Angel; Fernández, Carlos; Ortiz, José Antonio; Pro, Javier; Carbonell, Gregoria; Tarazona, José Vicente; Roblas, Neftalí; Ley, Victoria
2003-01-01
Teschoviruses specifically infect pigs and are shed in pig feces. Hence, their presence in water should indicate contamination with pig fecal residues. To assess this hypothesis, we developed a real-time reverse transcriptase PCR (RT-PCR) method that allows the quantitative detection of pig teschovirus (PTV) RNA. The method is able to detect 92 fg of PTV RNA per ml of sample. Using this method, we detected the presence of PTV RNA in water and fecal samples from all pig farms examined (n = 5). Feces from other animal species (cattle, sheep, and goats) were negative in this test. To compare the PTV RNA detection method with the conventional chemical determinations currently in use for evaluating water contamination, we analyzed water samples collected downstream from a pig slurry spillage site and found a positive correlation between the two types of determinations. The sensitivity of the PTV detection assay was similar to that achieved by nonspecific organic matter determination and superior to all other conventional chemical analyses performed. Furthermore, the new method is highly specific, revealing the porcine origin of the contamination, a feature that is lacking in currently available methods for the assessment of water contamination. PMID:14532098
Cao, Yiping; Sivaganesan, Mano; Kelty, Catherine A; Wang, Dan; Boehm, Alexandria B; Griffith, John F; Weisberg, Stephen B; Shanks, Orin C
2018-01-01
Human fecal pollution of recreational waters remains a public health concern worldwide. As a result, there is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality research and management. However, there are currently no standardized approaches for field implementation and interpretation of qPCR data. In this study, a standardized HF183/BacR287 qPCR method was combined with a water sampling strategy and a novel Bayesian weighted average approach to establish a human fecal contamination score (HFS) that can be used to prioritize sampling sites for remediation based on measured human waste levels. The HFS was then used to investigate 975 study design scenarios utilizing different combinations of sites with varying sampling intensities (daily to once per week) and number of qPCR replicates per sample (2-14 replicates). Findings demonstrate that site prioritization with HFS is feasible and that both sampling intensity and number of qPCR replicates influence reliability of HFS estimates. The novel data analysis strategy presented here provides a prescribed approach for the implementation and interpretation of human-associated HF183/BacR287 qPCR data with the goal of site prioritization based on human fecal pollution levels. In addition, information is provided for future users to customize study designs for optimal HFS performance. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Suzuki, Koichi; Kusano, Yukiko; Ochi, Ryota; Nishiyama, Nariaki; Tokunaga, Tomochika; Tanaka, Kazuhiro
2017-01-01
Estimating the spatial distribution of groundwater salinity in coastal plain regions is becoming increasingly important for site characterisation and the prediction of hydrogeological environmental conditions resulting from radioactive waste disposal and underground CO2 storage. In previous studies of the freshwater-saltwater interface, electromagnetic methods were used for sites characterised by unconsolidated deposits or Neocene soft sedimentary rocks. However, investigating the freshwater-saltwater interface in hard rock sites (e.g. igneous areas) is more complex, with the permeability of the rocks greatly influenced by fractures. In this study, we investigated the distribution of high-salinity groundwater at two volcanic rock sites and one sedimentary rock site, each characterised by different hydrogeological features. Our investigations included (1) applying the controlled source audio-frequency magnetotelluric (CSAMT) method and (2) conducting laboratory tests to measure the electrical properties of rock core samples. We interpreted the 2D resistivity sections by referring to previous data on geology and geochemistry of groundwater. At the Tokusa site, an area of inland volcanic rocks, low resistivity zones were detected along a fault running through volcanic rocks and shallow sediments. The results suggest that fluids rise through the Tokusa-Jifuku Fault to penetrate shallow sediments in a direction parallel to the river, and some fluids are diluted by rainwater. At the Oki site, a volcanic island on a continental shelf, four resistivity zones (in upward succession: low, high, low and high) were detected. The results suggest that these four zones were formed during a transgression-regression cycle caused by the last glacial period. At the Saijo site, located on a coastal plain composed of thick sediments, we observed a deep low resistivity zone, indicative of fossil seawater remnant from a transgression after the last glacial period. 
The current coastal plain formed in historical times, following which fresh water penetrated the upper parts of the fossil seawater zone to form a freshwater aquifer ~200 m in thickness.
The application of micro UAV in construction project
NASA Astrophysics Data System (ADS)
Kaamin, Masiri; Razali, Siti Nooraiin Mohd; Ahmad, Nor Farah Atiqah; Bukari, Saifullizan Mohd; Ngadiman, Norhayati; Kadir, Aslila Abd; Hamid, Nor Baizura
2017-10-01
Every successful construction project depends on effective construction management, which allows the project to be implemented according to plan. Every construction project must also document the progress of works, usually through reports prepared by the site engineer, and a progress report requires visual images as evidence. The conventional method of photographing a construction site is to use a common digital camera, which has several drawbacks compared with micro unmanned aerial vehicles (UAVs). In addition, site engineers face ongoing limitations in monitoring high, hard-to-reach points and in capturing an entire view of the construction site. The purpose of this paper is to provide a concise review of micro UAV technology for monitoring progress on construction sites through a visualization approach. The aims of this study are to replace the conventional method of photographing the construction site with a micro UAV, which can portray the whole view of the building, especially high, hard-to-reach points, produce better images, videos, and 3D models, and help the site engineer monitor work in progress. The micro UAV was flown around the building construction according to ground control points (GCPs) to capture images and record videos. The images taken by the micro UAV were processed to generate a 3D model and were analysed to visualize the building construction, monitor construction progress, and provide immediate, reliable data for project estimation. It was found that the micro UAV's better images and videos give a better overview of the construction site and allow defects on high, hard-to-reach building structures to be monitored. With the micro UAV, construction site progress is also tracked more efficiently and kept on schedule.
Precursor-Based Synthesis of Porous Colloidal Particles towards Highly Efficient Catalysts.
Zheng, Yun; Geng, Hongbo; Zhang, Yufei; Chen, Libao; Li, Cheng Chao
2018-04-02
In recent years, porous colloidal particles have found promising applications in catalytic fields, such as photocatalysis, electrocatalysis, industrial and automotive byproducts removal, as well as biomass upgrading. These applications are critical for alleviating the energy crisis and environmental pollution. Porous colloidal particles have remarkable specific areas and abundant reactive sites, which can significantly improve the mass/charge transport and reaction rate in catalysis. Precursor-based synthesis is among the most facile and widely-adopted methods to achieve monodisperse and homogeneous porous colloidal particles. In the current review, we briefly introduce the general catalytic applications of porous colloidal particles. The conventional precursor-based methods are reviewed to design state-of-the-art porous colloidal particles as highly efficient catalysts. The recent development of porous colloidal particles derived from metal-organic frameworks (MOFs), glycerates, carbonate precursors, and ion exchange methods are reviewed. In the end, the current concerns and future development of porous colloidal particles are outlined. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
An Inductive Logic Programming Approach to Validate Hexose Binding Biochemical Knowledge.
Nassif, Houssam; Al-Ali, Hassan; Khuri, Sawsan; Keirouz, Walid; Page, David
2010-01-01
Hexoses are simple sugars that play a key role in many cellular pathways, and in the regulation of development and disease mechanisms. Current protein-sugar computational models are based, at least partially, on prior biochemical findings and knowledge. They incorporate different parts of these findings in predictive black-box models. We investigate the empirical support for biochemical findings by comparing Inductive Logic Programming (ILP) induced rules to actual biochemical results. We mine the Protein Data Bank for a representative data set of hexose binding sites, non-hexose binding sites and surface grooves. We build an ILP model of hexose-binding sites and evaluate our results against several baseline machine learning classifiers. Our method achieves an accuracy similar to that of other black-box classifiers while providing insight into the discriminating process. In addition, it confirms wet-lab findings and reveals a previously unreported Trp-Glu amino acids dependency.
Development of a concept for non-monetary assessment of urban ecosystem services at the site level.
Wurster, Daniel; Artmann, Martina
2014-05-01
Determining the performance of ecosystem services at the city or regional level cannot accurately take into account the fine differences between green or gray structures. The supply of regulating ecosystem services in, for instance, parks can differ as parks vary in their land cover composition. A comprehensive ecosystem service assessment approach also needs to reflect land use to consider the demands placed on ecosystem services, which are mostly neglected by current research yet important for urban planning. For instance, if a sealed surface is no longer used, it could be unsealed to improve ecosystem service supply. Because of these scientific shortcomings, this article argues for a conceptual framework for the non-monetary assessment of urban ecosystem services at the site scale. This paper introduces a standardized method for selecting representative sites and evaluating their supply of and demand on ecosystem services. The conceptual design is supplemented by examples of Salzburg, Austria.
NASA Astrophysics Data System (ADS)
Ayadi, Omar; Felfel, Houssem; Masmoudi, Faouzi
2017-07-01
The current manufacturing environment has changed from traditional single-plant to multi-site supply chain where multiple plants are serving customer demands. In this article, a tactical multi-objective, multi-period, multi-product, multi-site supply-chain planning problem is proposed. A corresponding optimization model aiming to simultaneously minimize the total cost, maximize product quality and maximize the customer satisfaction demand level is developed. The proposed solution approach yields a front of Pareto-optimal solutions that represents the trade-offs among the different objectives. Subsequently, the analytic hierarchy process method is applied to select the best Pareto-optimal solution according to the preferences of the decision maker. The robustness of the solutions and the proposed approach are discussed based on a sensitivity analysis and an application to a real case from the textile and apparel industry.
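The two-stage selection described above, Pareto filtering followed by a weighted ranking in the spirit of the analytic hierarchy process, can be sketched as follows. The candidate plans, objective values, and AHP-style weights are invented for illustration and are not taken from the paper.

```python
def pareto_front(solutions):
    """Keep solutions not weakly dominated by any other (all objectives minimized)."""
    return [s for s in solutions
            if not any(all(o[i] <= s[i] for i in range(len(s))) and o != s
                       for o in solutions)]

# Candidate plans: (total cost, -quality, -customer satisfaction),
# quality and satisfaction negated so every objective is minimized.
plans = [(100, -0.90, -0.80), (120, -0.95, -0.85), (110, -0.85, -0.90),
         (130, -0.90, -0.80)]            # last plan is dominated by the first

front = pareto_front(plans)

# AHP-style priority weights (assumed outcome of pairwise comparisons)
weights = (0.5, 0.3, 0.2)
best = min(front, key=lambda s: sum(w * v for w, v in zip(weights, s)))
```

With these illustrative weights the cheapest non-dominated plan wins; different decision-maker preferences shift the choice along the front.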
The Restoration Rapid Assessment Tool: An Access/Visual Basic application
Hiebert, Ron; Larson, D.L.; Thomas, K.; Tancreto, N.; Haines, D.; Richey, A.; Dow, T.; Drees, L.
2009-01-01
Managers of parks and natural areas are increasingly faced with difficult decisions concerning restoration of disturbed lands. Financial and workforce resources often limit these restoration efforts, and rarely can a manager afford to address all concerns within the region of interest. With limited resources, managers and scientists have to decide which areas will be targeted for restoration and the restoration treatments to use in these areas. A broad range of approaches is used to make such decisions, from well-researched expert opinions (Cipollini et al. 2005) to gut feeling, with variable degrees of input from site visits, data collection, and data analysis used to support the decision. A standardized approach including an analytical assessment of site characteristics based on the best information available, with a written or electronic record of all the steps taken along the way, would make comparisons among a group of sites easier and lend credibility through use of common, documented criteria at all sites. In response to these concerns, we have developed the Restoration Rapid Assessment Tool (RRAT). RRAT is based on field observations of key indicators of site degradation, stressors influencing the site, value of the site with respect to larger management objectives, likelihood of achieving the management goals, and logistical constraints to restoration. The purpose of RRAT is not to make restoration decisions or prescribe methods, but rather to ensure that a basic set of pertinent issues is considered for each site and to facilitate comparisons among sites. Several concepts have been central to the development of RRAT. First, the management goal (also known as desired future condition) of any site under evaluation should be defined before the field evaluation begins. Second, the evaluation should be based upon readily observable indicators so as to avoid cumbersome field methods.
Third, the ease with which site stressors can be ameliorated must be factored into the evaluation. Fourth, intrinsic site value must be assessed independently of current condition. Finally, logistical considerations must also be addressed. Our initial focus has been on riparian areas because they are among the most heavily impacted habitat types, and RRAT indicators reflect this focus.
A log-linear model approach to estimation of population size using the line-transect sampling method
Anderson, D.R.; Burnham, K.P.; Crain, B.R.
1978-01-01
The technique of estimating wildlife population size and density using the belt or line-transect sampling method has been used in many past projects, such as the estimation of the density of waterfowl nesting sites in marshes, and is being used currently in such areas as the assessment of Pacific porpoise stocks in regions of tuna fishing activity. A mathematical framework for line-transect methodology has only emerged in the last 5 yr. In the present article, we extend this mathematical framework to a line-transect estimator based upon a log-linear model approach.
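A minimal sketch of the basic line-transect density estimator the framework builds on, D = n·f(0)/(2L), here with an assumed half-normal detection function. The paper's log-linear estimator generalizes this; the sighting distances below are illustrative.

```python
import math

def halfnormal_density_estimate(distances, transect_length):
    """D-hat = n * f(0) / (2L): line-transect density with a half-normal
    detection function g(x) = exp(-x^2 / (2 sigma^2)), for which
    f(0) = 1 / (sigma * sqrt(pi / 2)) and the MLE of sigma^2 is the mean
    squared perpendicular distance."""
    n = len(distances)
    sigma2 = sum(x * x for x in distances) / n   # MLE of sigma^2
    f0 = 1.0 / math.sqrt(sigma2 * math.pi / 2.0)
    return n * f0 / (2.0 * transect_length)

# Illustrative perpendicular sighting distances along a transect of length 10
d_hat = halfnormal_density_estimate([1.0, 2.0, 3.0], transect_length=10.0)
```

Doubling the transect length while keeping the same sightings halves the estimate, as the formula requires.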
Mao, Xiangju; Hu, Bin; He, Man; Fan, Wenying
2012-10-19
In this study, novel off/on-site stir bar sorptive extraction (SBSE) approaches with a home-made portable electric stirrer have been developed for the analysis of polycyclic aromatic hydrocarbon compounds (PAHs). In these approaches, a miniature battery-operated electric stirrer was employed to provide agitation of sample solutions instead of the large magnetic stirrer powered by alternating current that is commonly used in the conventional SBSE process, which extends the SBSE technique from conventional off-site analysis to on-site sampling. The applicability of the designed off/on-site SBSE sampling approaches was evaluated by polydimethylsiloxane (PDMS) coating SBSE-high performance liquid chromatography-fluorescence detection (HPLC-FLD) analysis of six target PAHs in environmental water. The home-made portable electric stirrer is simple, easy to operate, user friendly, low cost, and easy to commercialize, and can be used in direct immersion SBSE, headspace sorptive extraction (HSSE) and continuous flow (CF)-SBSE modes. Since the stir bar is fixed onto the portable device by magnetic force, it is very convenient to install, remove and replace the stir bar, and the coating friction loss which occurs frequently in the conventional SBSE process can be avoided. The parameters affecting the extraction of the six target PAHs by the home-made portable SBSE sampling device in the different sampling modes were studied. Under the optimum extraction conditions, good linearity was obtained in all three SBSE extraction modes, with correlation coefficients (R) higher than 0.9971. The limits of detection (LODs, S/N=3) were 0.05-3.41 ng L(-1) for direct immersion SBSE, 0.03-2.23 ng L(-1) for HSSE and 0.09-3.75 ng L(-1) for CF-SBSE.
The proposed portable PDMS-SBSE-HPLC-FLD method was applied for the analysis of six target PAHs in East Lake water, and the analytical results obtained by on-site SBSE sampling were in good agreement with those obtained by off-site SBSE sampling. The accuracy of the developed method was evaluated by a recovery test, and the recoveries for the spiked samples were found to be in the range of 87.1-122.8% for off-site CF-SBSE, 88.8-114.3% for on-site sampling, and 87.7-123.6% for off-site SBSE. The developed method is one of the most sensitive methods for PAH determination, and the home-designed SBSE system is feasible for field sampling. Copyright © 2012 Elsevier B.V. All rights reserved.
Measurement of Gaseous Oxidized Mercury at a SEARCH Network Site in Florida, USA
NASA Astrophysics Data System (ADS)
Huang, J.; Miller, M. B.; Gustin, M. S.
2013-12-01
There are three operationally defined forms of mercury (Hg) that have been measured in the atmosphere. These include gaseous elemental Hg (GEM), gaseous oxidized Hg (GOM), and particle-bound Hg (PBM). The chemical compounds that make up GOM are currently not well understood, and because of this we do not understand its transport and fate. Additionally, there are limitations associated with the current measurement method, the Tekran 2537/1130/1135 system. Recent work has shown that this system underestimates GOM concentrations, and may not measure all forms. Here we describe work building on ongoing research that focuses on understanding the limitations associated with the instrument, and the chemical forms of GOM. Mercury data have been collected at a Southeastern Aerosol Research and Characterization (SEARCH) network site, Outlying Landing Field (OLF), by the University of Nevada-Reno since 2006. This site is located near the Gulf of Mexico in western Florida. This site is potentially influenced by multiple Hg sources including marine air, electricity generating facilities, mobile sources, and long range transport from high elevation and inland regions. Recent work using data from this location and two others in Florida indicated that on top of background deposition, Hg input to OLF is due to local mobile sources, and long range transport in the spring. Air masses with different chemistry have been hypothesized to carry different GOM compounds. To test this hypothesis, an active Hg sampling system that collects GOM on nylon and cation-exchange membranes is being deployed at OLF. Measurements started in March 2013. Here we will present data collected so far, and compare concentrations measured to those obtained using a Tekran system. Ancillary data including meteorology, criteria air pollutants, and those collected using surrogate surfaces for dry Hg deposition and Hg passive samplers will be applied to help understand the sources of GOM.
Back trajectory analyses will also be applied. This new method shows that different forms of GOM are present at OLF.
Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.
2016-01-01
Background: Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods: This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results: Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). Conclusion: This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format.
Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832
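The reported effect size can be approximated from the summary statistics in the abstract. The sketch below uses the equal-n pooled-SD form of Cohen's d, which is an assumption, since the per-group sample sizes are not given here.

```python
import math

def cohens_d(mean1, sd1, mean2, sd2):
    """Standardized mean difference; equal-n pooled-SD variant (assumed)."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2.0)
    return (mean1 - mean2) / pooled_sd

# Final statistics scores from the abstract:
# blended 89.36 +/- 6.60, on-site 86.06 +/- 8.48 (mean +/- SD)
d = cohens_d(89.36, 6.60, 86.06, 8.48)
```

The result, roughly 0.43, sits near the conventional small-to-medium boundary, consistent with the "medium effect size" wording given the unequal group sizes the exact computation would use.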
Alternative Chemical Cleaning Methods for High Level Waste Tanks: Simulant Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudisill, T.; King, W.; Hay, M.
Solubility testing with simulated High Level Waste tank heel solids has been conducted in order to evaluate two alternative chemical cleaning technologies for the dissolution of sludge residuals remaining in the tanks after the exhaustion of mechanical cleaning and sludge washing efforts. Tests were conducted with non-radioactive pure phase metal reagents, binary mixtures of reagents, and a Savannah River Site PUREX heel simulant to determine the effectiveness of an optimized, dilute oxalic/nitric acid cleaning reagent and pure, dilute nitric acid toward dissolving the bulk non-radioactive waste components. A focus of this testing was on minimization of oxalic acid additions during tank cleaning. For comparison purposes, separate samples were also contacted with pure, concentrated oxalic acid, which is the current baseline chemical cleaning reagent. In a separate study, solubility tests were conducted with radioactive tank heel simulants using acidic and caustic permanganate-based methods focused on the “targeted” dissolution of actinide species known to be drivers for Savannah River Site tank closure Performance Assessments. Permanganate-based cleaning methods were evaluated prior to and after oxalic acid contact.
NASA Astrophysics Data System (ADS)
Ankri, Rinat; Fixler, Dror
2017-07-01
Optical imaging is a powerful tool for investigating the structure and function of tissues. Tissue optical imaging technologies are generally discussed under two broad regimes: microscopic and macroscopic, while the latter is widely investigated in the field of light-tissue interaction. Among the developed optical technologies for tissue investigation, the diffusion reflectance (DR) method is a simple and safe technology. However, this method suffers from low specificity and low signal-to-noise ratio, so the extraction of the tissue properties is not an easy task. In this review, we describe the use of gold nanorods (GNRs) in DR spectroscopy. The GNRs present unique optical properties which enhance the scattering and absorption properties of a tissue. The GNRs can be easily targeted toward abnormal sites in order to improve the DR signal and to distinguish between the healthy and the abnormal sites in the tissue, with high specificity. This article describes the use of the DR-GNRs method for the detection of cancer and atherosclerosis, from light transfer theory, through the extraction of the tissue properties using the diffusion theory and up to DR in vivo measurements.
Prediction and Dissection of Protein-RNA Interactions by Molecular Descriptors.
Liu, Zhi-Ping; Chen, Luonan
2016-01-01
Protein-RNA interactions play crucial roles in numerous biological processes. However, detecting the interactions and binding sites between protein and RNA by traditional experiments is still time consuming and labor intensive. Thus, it is important to develop bioinformatics methods for predicting protein-RNA interactions and binding sites. Accurate prediction of protein-RNA interactions and recognitions will greatly help decipher the interaction mechanisms between protein and RNA, as well as improve RNA-related protein engineering and drug design. In this work, we summarize the current bioinformatics strategies of predicting protein-RNA interactions and dissecting protein-RNA interaction mechanisms from local structure binding motifs. In particular, we focus on the feature-based machine learning methods, in which the molecular descriptors of protein and RNA are extracted and integrated as feature vectors representing the interaction events and recognition residues. In addition, the available methods are classified and compared comprehensively. The molecular descriptors are expected to elucidate the binding mechanisms of protein-RNA interaction and reveal the functional implications from a structural complementarity perspective.
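The feature-vector idea the review surveys can be sketched with a toy pipeline: residues are encoded as molecular-descriptor vectors and assigned the label of the nearest class centroid. The descriptor values and the nearest-centroid classifier are illustrative stand-ins, not a specific method from the reviewed literature.

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid_predict(x, centroids):
    """Return the label whose centroid is closest (squared Euclidean)."""
    return min(centroids, key=lambda lbl: sum(
        (a - b) ** 2 for a, b in zip(x, centroids[lbl])))

# Toy descriptors per residue: (hydrophobicity, charge, solvent accessibility)
binding = [[0.2, 1.0, 0.8], [0.3, 0.8, 0.9]]
non_binding = [[0.9, 0.0, 0.2], [0.8, -0.1, 0.3]]
centroids = {"binding": centroid(binding),
             "non_binding": centroid(non_binding)}

label = nearest_centroid_predict([0.25, 0.9, 0.85], centroids)
```

Real methods replace this toy classifier with SVMs, random forests, or neural networks over much richer descriptor sets, but the vector-in, label-out structure is the same.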
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hattem, M V; Paterson, L; Woollett, J
2008-08-20
Sixty-five surveys were completed in 2002 to assess the current distribution of special status amphibians at the Lawrence Livermore National Laboratory's (LLNL) Livermore Site and Site 300. Combined with historical information from previous years, the information presented herein illustrates the dynamic conditions and probable risks that amphibian populations face at both sites. The Livermore Site is developed and in stark contrast to the mostly undeveloped Site 300. Yet both sites have significant issues threatening the long-term sustainability of their respective amphibian populations. Livermore Site amphibians are presented with a suite of challenges inherent to urban interfaces, most predictably the bullfrog (Rana catesbeiana), while Site 300's erosion issues and periodic feral pig (Sus scrofa) infestations reduce and threaten populations. The long-term sustainability of LLNL's special status amphibians will require active management and resource commitment to maintain and restore amphibian habitat at both sites.
Hestand, Matthew S; van Galen, Michiel; Villerius, Michel P; van Ommen, Gert-Jan B; den Dunnen, Johan T; 't Hoen, Peter AC
2008-01-01
Background: The identification of transcription factor binding sites is difficult since they are only a small number of nucleotides in size, resulting in large numbers of false positives and false negatives in current approaches. Computational methods to reduce false positives look for over-representation of transcription factor binding sites in a set of similarly regulated promoters or for conservation in orthologous promoter alignments. Results: We have developed a novel tool, "CORE_TF" (Conserved and Over-REpresented Transcription Factor binding sites) that identifies common transcription factor binding sites in promoters of co-regulated genes. To improve upon existing binding site predictions, the tool searches for position weight matrices from the TRANSFAC® database that are over-represented in an experimental set compared to a random set of promoters and identifies cross-species conservation of the predicted transcription factor binding sites. The algorithm has been evaluated with expression and chromatin-immunoprecipitation on microarray data. We also implement and demonstrate the importance of matching the random set of promoters to the experimental promoters by GC content, which is a unique feature of our tool. Conclusion: The program CORE_TF is accessible in a user-friendly web interface at . It provides a table of over-represented transcription factor binding sites in the user's input genes' promoters and a graphical view of evolutionarily conserved transcription factor binding sites. In our test data sets it successfully predicts target transcription factors and their binding sites. PMID:19036135
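Over-representation of a binding site in an experimental promoter set versus a random background set is commonly scored with a hypergeometric test; a minimal sketch follows. The counts are invented for illustration, and CORE_TF's exact statistic may differ.

```python
from math import comb

def hypergeom_enrichment_p(hits_exp, n_exp, hits_bg, n_bg):
    """P(X >= hits_exp) when drawing n_exp promoters without replacement
    from a pool of n_bg promoters, hits_bg of which carry the predicted site."""
    total = comb(n_bg, n_exp)
    tail = 0
    for k in range(hits_exp, min(n_exp, hits_bg) + 1):
        tail += comb(hits_bg, k) * comb(n_bg - hits_bg, n_exp - k)
    return tail / total

# Illustrative counts: 8 of 10 experimental promoters carry the site,
# versus 10 carriers among all 40 promoters (experimental + random pool).
p_enriched = hypergeom_enrichment_p(hits_exp=8, n_exp=10, hits_bg=10, n_bg=40)
```

A small tail probability indicates the site occurs in the experimental promoters far more often than a random draw would explain, the signal CORE_TF combines with cross-species conservation.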
NASA Astrophysics Data System (ADS)
Poggi, V.; Burjanek, J.; Michel, C.; Fäh, D.
2017-08-01
The Swiss Seismological Service (SED) has recently finalised the installation of ten new seismological broadband stations in northern Switzerland. The project was led in cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra) and Swissnuclear to monitor micro seismicity at potential locations of nuclear-waste repositories. To further improve the quality and usability of the seismic recordings, an extensive characterization of the sites surrounding the installation area was performed following a standardised investigation protocol. State-of-the-art geophysical techniques have been used, including advanced active and passive seismic methods. The results of all analyses converged to the definition of a set of best-representative 1-D velocity profiles for each site, which are the input for the computation of engineering soil proxies (traveltime averaged velocity and quarter-wavelength parameters) and numerical amplification models. Computed site response is then validated through comparison with empirical site amplification, which is currently available for any station connected to the Swiss seismic networks. With the goal of a high-sensitivity network, most of the NAGRA stations have been installed on stiff-soil sites of rather high seismic velocity. Seismic characterization of such sites has always been considered challenging, due to lack of relevant velocity contrast and the large wavelengths required to investigate the frequency range of engineering interest. We describe how ambient vibration techniques can successfully be applied in these particular conditions, providing practical recommendations for best practice in seismic site characterization of high-velocity sites.
Ultra-portable field transfer radiometer for vicarious calibration of earth imaging sensors
NASA Astrophysics Data System (ADS)
Thome, Kurtis; Wenny, Brian; Anderson, Nikolaus; McCorkel, Joel; Czapla-Myers, Jeffrey; Biggar, Stuart
2018-06-01
A small portable transfer radiometer has been developed as part of an effort to ensure the quality of upwelling radiance from test sites used for vicarious calibration in the solar reflective spectral region. The test sites are used to predict top-of-atmosphere reflectance relying on ground-based measurements of the atmosphere and surface. The portable transfer radiometer is designed for one-person operation for on-site field calibration of instrumentation used to determine ground-leaving radiance. The current work describes the detector- and source-based radiometric calibration of the transfer radiometer, highlighting the expected accuracy and SI-traceability. The results indicate differences between the detector-based and source-based results greater than the combined uncertainties of the approaches. Results from recent field deployments of the transfer radiometer using a solar-radiation-based calibration agree with the source-based laboratory calibration within the combined uncertainties of the methods. The detector-based results show a significant difference to the solar-based calibration. The source-based calibration is used as the basis for a radiance-based calibration of the Landsat-8 Operational Land Imager (OLI) that agrees with the OLI calibration to within the uncertainties of the methods.
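The agreement checks described, whether two calibrations differ by more than their combined uncertainties, reduce to comparing the difference against the root-sum-square of the individual uncertainties. The responsivity numbers below are illustrative, not the paper's measurements.

```python
import math

def consistent(value_a, unc_a, value_b, unc_b, coverage=1.0):
    """Two measurements agree when their difference is within the combined
    root-sum-square uncertainty (times an optional coverage factor)."""
    return abs(value_a - value_b) <= coverage * math.hypot(unc_a, unc_b)

# Illustrative relative responsivities (made-up numbers):
detector_vs_source = consistent(1.000, 0.010, 1.035, 0.012)  # disagree
solar_vs_source = consistent(1.000, 0.020, 1.010, 0.020)     # agree
```

This mirrors the paper's finding pattern: one calibration pair differing beyond the combined uncertainties, another agreeing within them.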
Telepathology in cytopathology: challenges and opportunities.
Collins, Brian T
2013-01-01
Telepathology in cytopathology is becoming more commonly utilized, and newer technologic infrastructures afford the laboratory a variety of options. The options and design of a telepathology system are driven by the clinical needs, primarily the provision of rapid on-site evaluation service for fine needle aspiration. The clinical requirements and needs of a system are described. Available tools to design and implement a telepathology system are covered, including methods of image capture, network connectivity and remote viewing options. The primary telepathology method currently used and described here delivers a live video image over a network connection to a remote site, where it is passively viewed in an internet web browser. By utilizing live video information and a voice connection to the on-site location, the remote viewer can collect clinical information and direct their view of the slides. Telepathology systems for use in cytopathology can be designed and implemented with commercially available infrastructure. It is necessary for the laboratory to validate the designed system and adhere to the applicable regulatory requirements. Telepathology for cytopathology can be reliably utilized by adapting existing technology, and newer advances hold great promise for further applications in the cytopathology laboratory. Copyright © 2013 S. Karger AG, Basel.
Chinna Reddy, P; Chaitanya, K.S.C.; Madhusudan Rao, Y.
2011-01-01
Owing to the ease of administration, the oral cavity is an attractive site for the delivery of drugs. Through this route it is possible to realize mucosal (local effect) and transmucosal (systemic effect) drug administration. In the first case, the aim is to achieve a site-specific release of the drug on the mucosa, whereas the second case involves drug absorption through the mucosal barrier to reach the systemic circulation. The main obstacles that drugs meet when administered via the buccal route derive from the limited absorption area and the barrier properties of the mucosa. The effective physiological removal mechanisms of the oral cavity that take the formulation away from the absorption site are the other obstacles that have to be considered. The strategies studied to overcome such obstacles include the employment of new materials that, possibly, combine mucoadhesive, enzyme inhibitory and penetration enhancer properties and the design of innovative drug delivery systems which, besides improving patient compliance, favor a more intimate contact of the drug with the absorption mucosa. This review presents a brief description of the advantages and limitations of buccal drug delivery and the anatomical structure of the oral mucosa, and the mechanisms of drug permeation, followed by current formulation design in line with developments in buccal delivery systems and methodology for evaluating buccal formulations. PMID:23008684
Evaluating the Effectiveness of Natura 2000 Network for Wolf Conservation: A Case-Study in Greece
NASA Astrophysics Data System (ADS)
Votsi, Nefta-Eleftheria P.; Zomeni, Maria S.; Pantis, J. D.
2016-02-01
The wolf (Canis lupus) is used as a case study to rate Natura 2000 sites in Greece based on preferred wolf habitat characteristics and test whether the network is suitable for their conservation. Road density, agricultural area, site area, connectivity, food availability (i.e., presence of natural prey), and elevation in 237 sites are combined in a logistic regression model. The occurrence of the wolf's natural prey was the most prevalent factor determining wolf presence, followed by agricultural cover. Considering the current status of these features at N2K site level, most sites currently hosting wolves (85.7%) have good or excellent prospects for the long-term presence of the wolf. On the contrary, 11 sites which now have wolves are predicted to be ineffective in keeping them in the future due to the absence of wild ungulates and their high agricultural coverage. Four sites with no wolf presence currently have excellent prospects to host wolves in the future. Roadless sites are a priority for protection and retaining their current condition is strongly suggested. The proposed approach aims to detect gaps in protection for the wolf and identify priority sites in need of mitigation actions. It can also assist the assessment of conservation policies in Greece and elsewhere toward accomplishing set goals in protected areas. By focusing on wolf protection, we hope to increase agencies' attention to deal with conservation effectiveness, especially in cases like Greece, where a number of sites are insufficiently known and protected and management measures are not properly implemented.
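The habitat model described is a logistic regression of wolf presence on site covariates. A minimal sketch with invented coefficients (not the paper's fitted values) shows the qualitative pattern reported: prey presence raises, and agricultural cover lowers, predicted presence probability.

```python
import math

def wolf_presence_prob(prey_present, agri_cover, road_density):
    """Logistic model: logit(p) = b0 + b1*prey + b2*agri + b3*roads.
    Coefficients below are illustrative assumptions, not fitted values."""
    b0, b1, b2, b3 = -1.0, 2.5, -3.0, -1.5
    z = b0 + b1 * prey_present + b2 * agri_cover + b3 * road_density
    return 1.0 / (1.0 + math.exp(-z))

p_good = wolf_presence_prob(1, 0.1, 0.2)   # prey present, little agriculture
p_poor = wolf_presence_prob(0, 0.6, 0.2)   # no prey, high agricultural cover
```

Scoring each Natura 2000 site this way is what allows the sites to be ranked and gaps in protection to be flagged.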
Su, Hongsheng
2017-12-18
Distributed power grids generally contain multiple diverse types of distributed generators (DGs). Traditional particle swarm optimization (PSO) and simulated annealing PSO (SA-PSO) algorithms have some deficiencies in the site selection and capacity determination of DGs, such as slow convergence and a tendency to become trapped in local optima. In this paper, an improved SA-PSO (ISA-PSO) algorithm is proposed by introducing the crossover and mutation operators of the genetic algorithm (GA) into SA-PSO, strengthening the algorithm's capabilities in global searching and local exploration. In addition, diverse types of DGs are made equivalent to four types of nodes in the power flow calculation by the backward/forward sweep method, and reactive power sharing principles and allocation theory are applied to determine initial reactive power values and execute subsequent corrections, giving the algorithm a better starting point and faster convergence. Finally, a mathematical model of the minimum economic cost is established for the siting and sizing of DGs under the location and capacity uncertainties of each single DG. Its objective function considers the investment and operation cost of DGs, grid loss cost, annual electricity purchase cost, and environmental pollution cost, and the constraints include power flow, bus voltage, conductor current, and DG capacity. Through application to an IEEE 33-node distribution system, it is found that the proposed method achieves better economic efficiency and safer voltage levels than traditional PSO and SA-PSO algorithms, and is a more effective planning method for the siting and sizing of DGs in distributed power grids.
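A compact sketch of the ISA-PSO idea: standard PSO velocity updates, a simulated-annealing acceptance step for personal bests, and a GA-style mutation operator, applied here to a toy sphere objective. All parameters and the objective are illustrative assumptions; the paper's DG cost model, power flow, and constraints are not reproduced.

```python
import math
import random

def isa_pso(f, dim, n_particles=20, iters=200, seed=1):
    """PSO with SA-style acceptance of personal bests and GA-style mutation."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    temp = 1.0                                   # SA temperature
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):                 # standard PSO velocity update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if rng.random() < 0.05:              # GA-style mutation operator
                pos[i][rng.randrange(dim)] += rng.gauss(0.0, 0.5)
            delta = f(pos[i]) - f(pbest[i])      # SA acceptance for pbest:
            if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-9)):
                pbest[i] = pos[i][:]             # worse moves accepted early on
            if f(pbest[i]) < f(gbest):
                gbest = pbest[i][:]
        temp *= 0.98                             # cooling schedule
    return gbest, f(gbest)

best, val = isa_pso(lambda x: sum(xi * xi for xi in x), dim=2)
```

The SA acceptance keeps personal bests from freezing early (helping escape local optima), while the mutation operator injects the GA-style diversity the paper credits for improved global search.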
Systematic Review of Quality of Patient Information on Liposuction in the Internet
Zuk, Grzegorz; Eylert, Gertraud; Raptis, Dimitri Aristotle; Guggenheim, Merlin; Shafighi, Maziar
2016-01-01
Background: A large number of patients who are interested in esthetic surgery actively search the Internet, which nowadays represents the first source of information. However, the quality of information available in the Internet on liposuction is currently unknown. The aim of this study was to assess the quality of patient information on liposuction available in the Internet. Methods: The quantitative and qualitative assessment of Web sites was based on a modified Ensuring Quality Information for Patients tool (36 items). Five hundred Web sites were identified by the most popular web search engines. Results: Two hundred forty-five Web sites were assessed after duplicates and irrelevant sources were excluded. Only 72 (29%) Web sites addressed >16 items, and scores tended to be higher for professional societies, portals, patient groups, health departments, and academic centers than for Web sites developed by physicians. The Ensuring Quality Information for Patients score achieved by Web sites ranged between 8 and 29 of a total of 36 points, with a median value of 16 points (interquartile range, 14–18). The top 10 Web sites with the highest scores were identified. Conclusions: The quality of patient information on liposuction available in the Internet is poor, and existing Web sites show substantial shortcomings. There is an urgent need for improvement in offering superior quality information on liposuction for patients intending to undergo this procedure. PMID:27482498
Automatic identification of alpine mass movements based on seismic and infrasound signals
NASA Astrophysics Data System (ADS)
Schimmel, Andreas; Hübl, Johannes
2017-04-01
The automatic detection and identification of alpine mass movements like debris flows, debris floods or landslides is of increasing importance for mitigation measures in the densely populated and intensively used alpine regions. Since these mass movement processes emit characteristic seismic and acoustic waves in the low-frequency range, such events can be detected and identified from these signals. Several approaches for detection and warning systems based on seismic or infrasound signals have already been developed, but a combination of both methods, which can increase detection probability and reduce false alarms, is currently used very rarely and is a promising basis for an automatic detection and identification system. This work therefore presents an approach for a detection and identification system based on a combination of seismic and infrasound sensors, which can detect sediment-related mass movements from a remote location unaffected by the process. The system consists of one infrasound sensor and one geophone, installed co-located, and a microcontroller on which a specially designed detection algorithm runs, detecting mass movements in real time directly at the sensor site. Further, this work aims to extract more information from the seismic and infrasound spectra produced by different sediment-related mass movements in order to identify the process type and estimate the magnitude of the event. The system is currently installed and tested at five test sites in Austria, two in Italy, one in Switzerland, and one in Germany. This large number of test sites is used to build a database of very different events, which will be the basis for a new identification method for alpine mass movements. The tests show promising results, and the system provides an easy-to-install and inexpensive approach for a detection and warning system.
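A classic trigger of the kind such detection systems build on is the STA/LTA (short-term average / long-term average) ratio; the sketch below flags a synthetic amplitude burst. The paper's actual algorithm fuses seismic and infrasound criteria and is not reproduced here, and the trace and thresholds are illustrative.

```python
def sta_lta(signal, n_sta, n_lta):
    """Short-term / long-term average ratio of signal energy at each sample."""
    ratios = []
    for i in range(n_lta, len(signal)):
        sta = sum(x * x for x in signal[i - n_sta:i]) / n_sta
        lta = sum(x * x for x in signal[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

quiet = [0.1, -0.1] * 50        # synthetic background noise
burst = [2.0, -2.0] * 10        # synthetic event: 20-sample amplitude burst
trace = quiet + burst + quiet
ratios = sta_lta(trace, n_sta=5, n_lta=40)
triggered = max(ratios) > 4.0   # simple threshold trigger
```

Running such a trigger on both the geophone and the infrasound channel, and declaring an event only when both fire, is the kind of combination that raises detection probability while suppressing false alarms.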
Mapping Sites of O-Glycosylation and Fringe Elongation on Drosophila Notch*
Harvey, Beth M.; Rana, Nadia A.; Moss, Hillary; Leonardi, Jessica; Jafar-Nejad, Hamed; Haltiwanger, Robert S.
2016-01-01
Glycosylation of the Notch receptor is essential for its activity and serves as an important modulator of signaling. Three major forms of O-glycosylation are predicted to occur at consensus sites within the epidermal growth factor-like repeats in the extracellular domain of the receptor: O-fucosylation, O-glucosylation, and O-GlcNAcylation. We have performed comprehensive mass spectral analyses of these three types of O-glycosylation on Drosophila Notch produced in S2 cells and identified peptides containing all 22 predicted O-fucose sites, all 18 predicted O-glucose sites, and all 18 putative O-GlcNAc sites. Using semiquantitative mass spectral methods, we have evaluated the occupancy and relative amounts of glycans at each site. The majority of the O-fucose sites were modified to high stoichiometries. Upon expression of the β3-N-acetylglucosaminyltransferase Fringe with Notch, we observed varying degrees of elongation beyond O-fucose monosaccharide, indicating that Fringe preferentially modifies certain sites more than others. Rumi modified O-glucose sites to high stoichiometries, although elongation of the O-glucose was site-specific. Although the current putative consensus sequence for O-GlcNAcylation predicts 18 O-GlcNAc sites on Notch, we only observed apparent O-GlcNAc modification at five sites. In addition, we performed mass spectral analysis on endogenous Notch purified from Drosophila embryos and found that the glycosylation states were similar to those found on Notch from S2 cells. These data provide foundational information for future studies investigating the mechanisms of how O-glycosylation regulates Notch activity. PMID:27268051
Restoration of the Donor Face After Facial Allotransplantation
Grant, Gerald T.; Liacouras, Peter; Santiago, Gabriel F.; Garcia, Juan R.; Al Rakan, Mohammed; Murphy, Ryan; Armand, Mehran; Gordon, Chad R.
2014-01-01
Introduction Current protocols for facial transplantation include the mandatory fabrication of an alloplastic “mask” to restore the congruency of the donor site in the setting of “open casket” burial. However, there is currently a paucity of literature describing the current state-of-the-art and available options. Methods During this study, we found that most donor masks are fabricated using conventional methods of impression, molds, silicone, and/or acrylic application by an experienced anaplastologist or maxillofacial prosthetics technician. However, with the recent introduction of several enhanced computer-assisted technologies, our facial transplant team hypothesized that there were areas for improvement with respect to cost and preparation time. Results The use of digital imaging for virtual surgical manipulation, computer-assisted planning, and prefabricated surgical cutting guides in the setting of facial transplantation provided us with a novel opportunity for digital design and fabrication of a donor mask. The results shown here demonstrate an acceptable appearance for “open-casket” burial while maintaining donor identity after facial organ recovery. Conclusions Several newer techniques for fabrication of facial transplant donor masks currently exist and are described within the article. These encompass digital impression, digital design, and additive manufacturing technology. PMID:24835867
Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition
NASA Astrophysics Data System (ADS)
LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.
2013-12-01
Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data acquisition is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains, which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery-powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage in that multiple units can inject current simultaneously. The original purpose of using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires applying twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery-powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, multi-source excitation adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity.
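The 400 W vs. 6400 W figure follows from the received signal scaling linearly with injected current while transmitter power scales with its square (for a fixed load); a quick arithmetic check:

```python
# Check the power-scaling claim: signal ~ I, power ~ I^2 at fixed load.
# Four 400 W transmitters on parallel dipoles sum their signals, so a
# SINGLE transmitter matching that signal needs 4x the current and
# hence 16x the power of one unit. Values follow the text above.

def power_for_signal_gain(base_power_w, gain):
    """Power a single transmitter needs to multiply its signal by `gain`."""
    return base_power_w * gain ** 2

n_units = 4
unit_power = 400.0  # watts per transmitter

multi_source_total = n_units * unit_power                        # 1600 W deployed
single_equivalent = power_for_signal_gain(unit_power, n_units)   # 6400 W
print(multi_source_total, single_equivalent)
```

Four distributed 400 W units thus deliver the signal of a 6400 W source while deploying only 1600 W in total, which is the attraction for small battery-powered transceivers.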
Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno, Nevada, Pomarance, Italy, and Volterra, Italy; a mineral exploration site near Timmins, Quebec; and a landslide investigation near Vajont Dam in northern Italy. These sites presented a series of challenges in survey design and deployment, including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
Distributive, Non-destructive Real-time System and Method for Snowpack Monitoring
NASA Technical Reports Server (NTRS)
Frolik, Jeff (Inventor); Skalka, Christian (Inventor)
2013-01-01
A ground-based system that provides quasi real-time measurement and collection of snow-water equivalent (SWE) data in remote settings is provided. The disclosed invention is significantly less expensive and easier to deploy than current methods and less susceptible to terrain and snow bridging effects. Embodiments of the invention include remote data recovery solutions. Compared to current infrastructure using existing SWE technology, the disclosed invention allows more SWE sites to be installed for similar cost and effort, in a greater variety of terrain; thus, enabling data collection at improved spatial resolutions. The invention integrates a novel computational architecture with new sensor technologies. The invention's computational architecture is based on wireless sensor networks, comprised of programmable, low-cost, low-powered nodes capable of sophisticated sensor control and remote data communication. The invention also includes measuring attenuation of electromagnetic radiation, an approach that is immune to snow bridging and significantly reduces sensor footprints.
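The attenuation measurement mentioned above can be illustrated with a simple Beer-Lambert model, in which SWE is recovered from the ratio of attenuated to reference intensity; the attenuation coefficient and intensity values below are invented for illustration, since the text gives no constants:

```python
import math

# Hedged sketch: recovering snow-water equivalent (SWE) from attenuation
# of electromagnetic radiation through the snowpack, assuming a simple
# Beer-Lambert law I = I0 * exp(-mu * SWE). The coefficient `mu` and
# the signal values are illustrative, not from the patent.

def swe_from_attenuation(i_measured, i_reference, mu_per_cm=0.05):
    """Invert Beer-Lambert: SWE (cm of water) from measured intensity."""
    if i_measured <= 0 or i_measured > i_reference:
        raise ValueError("intensity must be in (0, i_reference]")
    return math.log(i_reference / i_measured) / mu_per_cm

# A sensor under 20 cm of water-equivalent snow would read:
i0 = 1000.0
i = i0 * math.exp(-0.05 * 20.0)
print(round(swe_from_attenuation(i, i0), 6))  # recovers 20.0
```

Because the path-integrated attenuation depends only on the water mass above the sensor, such a measurement is insensitive to snow bridging, which is the advantage claimed over load-based SWE sensors.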
Physically facilitating drug-delivery systems
Rodriguez-Devora, Jorge I; Ambure, Sunny; Shi, Zhi-Dong; Yuan, Yuyu; Sun, Wei; Xu, Tao
2012-01-01
Facilitated/modulated drug-delivery systems have emerged as a possible solution for delivering drugs of interest to pre-allocated sites at predetermined doses for predefined periods of time. Over the past decade, the use of different physical methods and mechanisms to mediate drug release and delivery has grown significantly. This emerging area of research has important implications for the development of new therapeutic drugs for efficient treatments. This review aims to introduce and describe the different modalities of physically facilitated drug-delivery systems that are currently in use for the therapy of cancer and other diseases. In particular, delivery methods based on ultrasound, electrical, magnetic and photo modulation are highlighted. Current uses of and areas of improvement for these different physically facilitated drug-delivery systems are discussed. Furthermore, the main advantages and drawbacks of the technologies reviewed are compared. The review ends with a speculative viewpoint on how research is expected to evolve in the upcoming years. PMID:22485192
EPA Begins Reviews of 24 New England Site Cleanups during Current Fiscal Year
EPA plans to conduct comprehensive reviews of site cleanups at 24 National Priorities List Sites (Superfund Sites), including two Federal Facilities, across New England by performing the required Five-Year Reviews.
Vrabel, Joseph; Teeple, Andrew; Kress, Wade H.
2009-01-01
With increasing demands for reliable water supplies and availability estimates, groundwater-flow models often are developed to enhance understanding of surface-water and groundwater systems. Specific hydraulic variables must be known or calibrated for the groundwater-flow model to accurately simulate current or future conditions. Surface geophysical surveys, along with selected test-hole information, can provide an integrated framework for quantifying hydrogeologic conditions within a defined area. In 2004, the U.S. Geological Survey, in cooperation with the North Platte Natural Resources District, performed a surface geophysical survey using a capacitively coupled resistivity technique to map the lithology within the top 8 meters of the near-surface for 110 kilometers of the Interstate and Tri-State Canals in western Nebraska and eastern Wyoming. Assuming that leakage between the surface-water and groundwater systems is affected primarily by the sediment directly underlying the canal bed, leakage potential was estimated from the simple vertical mean of inverse-model resistivity values over depth levels whose layer thickness increased geometrically with depth, which biased the mean-resistivity values toward the surface. This method generally produced reliable results, but an improved analysis method was needed to account for situations where confining units, composed of less permeable material, underlie units with greater permeability. In this report, prepared by the U.S. Geological Survey in cooperation with the North Platte Natural Resources District, the authors use geostatistical analysis to develop the minimum-unadjusted method to compute a relative leakage potential based on the minimum resistivity value in a vertical column of the resistivity model. The minimum-unadjusted method considers the effects of homogeneous confining units.
The minimum-adjusted method also is developed to incorporate the effect of local lithologic heterogeneity on water transmission. Seven sites with differing geologic contexts were selected following review of the capacitively coupled resistivity data collected in 2004. A reevaluation of these sites using the mean, minimum-unadjusted, and minimum-adjusted methods was performed to compare the different approaches for estimating leakage potential. Five of the seven sites contained underlying confining units, for which the minimum-unadjusted and minimum-adjusted methods accounted for the confining-unit effect. Estimates of overall leakage potential were lower for the minimum-unadjusted and minimum-adjusted methods than those estimated by the mean method. For most sites, the local heterogeneity adjustment procedure of the minimum-adjusted method resulted in slightly larger overall leakage-potential estimates. In contrast to the mean method, the two minimum-based methods allowed the least permeable areas to control the overall vertical permeability of the subsurface. The minimum-adjusted method refined leakage-potential estimation by additionally including local lithologic heterogeneity effects.
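The contrast between the mean and minimum-unadjusted estimators can be sketched on a single resistivity column. The layer values below are invented; in this setting, lower resistivity indicates finer-grained, less permeable sediment, so a conductive confining unit at depth should control the leakage estimate:

```python
# Sketch of the two leakage-potential estimators described above, applied
# to one vertical column of inverse-model resistivity values (ohm-m).
# Values are illustrative, not from the survey.

def mean_method(column):
    """Vertical mean resistivity: a conductive confining unit at depth is
    diluted by permeable (high-resistivity) layers above it."""
    return sum(column) / len(column)

def minimum_unadjusted(column):
    """Minimum resistivity in the column: the least permeable
    (lowest-resistivity, fine-grained) layer controls vertical leakage."""
    return min(column)

# Permeable upper units over a confining, clay-rich unit at depth:
column = [60.0, 55.0, 40.0, 8.0, 10.0]
print(mean_method(column), minimum_unadjusted(column))  # 34.6 8.0
```

As in the report's comparison, the minimum-based estimate is lower than the mean-based one wherever a confining unit underlies more permeable material, letting the least permeable layer govern the result.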
NASA Astrophysics Data System (ADS)
Frigeri, A.; Cardellini, C.; Chiodini, G.; Frondini, F.; Bagnato, E.; Aiuppa, A.; Fischer, T. P.; Lehnert, K. A.
2014-12-01
The study of the main pathways of carbon flux from the deep Earth requires the analysis of a large quantity and variety of data on volcanic and non-volcanic gas emissions. Hence, there is need for common frameworks to aggregate available data and insert new observations. Since 2010 we have been developing the Mapping Gas emissions (MaGa) web-based database to collect data on carbon degassing form volcanic and non-volcanic environments. MaGa uses an Object-relational model, translating the experience of field surveyors into the database schema. The current web interface of MaGa allows users to browse the data in tabular format or by browsing an interactive web-map. Enabled users can insert information as measurement methods, instrument details as well as the actual values collected in the field. Measurements found in the literature can be inserted as well as direct field observations made by human-operated instruments. Currently the database includes fluxes and gas compositions from active craters degassing, diffuse soil degassing and fumaroles both from dormant volcanoes and open-vent volcanoes from literature survey and data about non-volcanic emission of the Italian territory. Currently, MaGa holds more than 1000 volcanic plume degassing fluxes, data from 30 sites of diffuse soil degassing from italian volcanoes, and about 60 measurements from fumarolic and non volcanic emission sites. For each gas emission site, the MaGa holds data, pictures, descriptions on gas sampling, analysis and measurement methods, together with bibliographic references and contacts to researchers having experience on each site. From 2012, MaGa developments started to be focused towards the framework of the Deep Earth Carbon Degassing research initiative of the Deep Carbon Observatory. Whithin the DECADE initiative, there are others data systems, as EarthChem and the Smithsonian Institution's Global Volcanism Program. 
An interoperable interaction between the DECADE data systems is being planned. MaGa is showing good potentials to improve the knowledge on Earth degassing firstly by making data more accessible and encouraging participation among researchers, and secondly by allowing to observe and explore, for the first time, a gas emission dataset with spatial and temporal extents never analyzed before.
The Savannah River Site's Groundwater Monitoring Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-01-10
The Environmental Protection Department/Environmental Monitoring Section (EPD/EMS) administers the Savannah River Site's (SRS) Groundwater Monitoring Program. During second quarter 1991, EPD/EMS conducted extensive sampling of monitoring wells. EPD/EMS established two sets of flagging criteria in 1986 to assist in the management of sample results. The flagging criteria do not define contamination levels; instead, they aid personnel in sample scheduling, interpretation of data, and trend identification. Beginning in 1991, the flagging criteria are based on EPA drinking water standards and method detection limits. A detailed explanation of the current flagging criteria is presented in the Flagging Criteria section of this document. Analytical results from second quarter 1991 are listed in this report.
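The role of such flagging criteria (aiding scheduling and trend identification rather than defining contamination) can be sketched as a simple two-level check against a detection limit and a drinking-water standard; the numeric limits below are illustrative, not the actual SRS criteria:

```python
# Hedged sketch of flagging one groundwater analytical result against a
# method detection limit and a drinking water standard, in the spirit of
# the two flag sets described above. Limits are invented examples.

def flag_result(value, detection_limit, drinking_water_std):
    """Return a flag label for one analytical result (same units)."""
    if value < detection_limit:
        return "below detection"
    if value >= drinking_water_std:
        return "exceeds standard"   # schedule for closer review / trending
    return "detected"

print(flag_result(0.002, 0.005, 0.015))  # below detection
print(flag_result(0.020, 0.005, 0.015))  # exceeds standard
```

A flag of "exceeds standard" would prompt follow-up sampling and trend review, not a contamination determination, consistent with the program description.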
The prospect of gene therapy for prostate cancer: update on theory and status.
Koeneman, K S; Hsieh, J T
2001-09-01
Molecularly based novel therapeutic agents are needed to address the problem of locally recurrent, or metastatic, advanced hormone-refractory prostate cancer. Recent basic science advances in mechanisms of gene expression, vector delivery, and targeting have rendered clinically relevant gene therapy to the prostatic fossa and distant sites feasible in the near future. Current research and clinical investigative efforts involving methods for more effective vector delivery and targeting, with enhanced gene expression to selected (specific) sites, are reviewed. These areas of research involve tissue-specific promoters, transgene exploration, vector design and delivery, and selective vector targeting. The 'vectorology' involved mainly addresses selective tissue homing with ligands, mechanisms of innate immune system evasion for durable transgene expression, and the possibility of repeat administration.
The Savannah River Site's Groundwater Monitoring Program: Second quarter 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, C.D.
1992-10-07
The Environmental Protection Department/Environmental Monitoring Section (EPD/EMS) administers the Savannah River Site's (SRS) Groundwater Monitoring Program. During second quarter 1992, EPD/EMS conducted extensive sampling of monitoring wells. EPD/EMS established two sets of criteria to assist in the management of sample results. The flagging criteria do not define contamination levels; instead, they aid personnel in sample scheduling, interpretation of data, and trend identification. Since 1991, the flagging criteria have been based on the federal Environmental Protection Agency (EPA) drinking water standards and on method detection limits. A detailed explanation of the current flagging criteria is presented in the Flagging Criteria section of this document. Analytical results from second quarter 1992 are listed in this report.
Remediation strategies for historical mining and smelting sites.
Dybowska, Agnieszka; Farago, Margaret; Valsami-Jones, Eugenia; Thornton, Iain
2006-01-01
The environmental, social and economic problems associated with abandoned mine sites are serious and global. Environmental damage arising from polluted waters and dispersal of contaminated waste is a feature characteristic of many old mines in North America, Australia, Europe and elsewhere. Today, because of the efficiency of mining operations and legal requirements in many countries for prevention of environmental damage from mining operations, the release of metals to the environment from modern mining is low. However, many mineralized areas that were extensively worked in the 18th and 19th centuries and left abandoned after mining had ceased, have left a legacy of metal contaminated land. Unlike organic chemicals and plastics, metals cannot be degraded chemically or biologically into non-toxic and environmentally neutral constituents. Thus sites contaminated with toxic metals present a particular challenge for remediation. Soil remediation has been the subject of a significant amount of research work in the past decade; this has resulted in a number of remediation options currently available or being developed. Remediation strategies for metal/metalloid contaminated historical mining sites are reviewed and summarized in this article. It focuses on the current applications of in situ remediation with the use of soil amendments (adsorption and precipitation based methods are discussed) and phytoremediation (in situ plant based technology for environmental clean up and restoration). These are promising alternative technologies to traditional options of excavation and ex situ treatment, offering an advantage of being non-invasive and low cost. In particular, they have been shown to be effective in remediation of mining and smelting contaminated sites, although the long-term durability of these treatments cannot be predicted.
Restoring and rehabilitating sagebrush habitats
Pyke, David A.; Knick, S.T.; Connelly, J.W.
2011-01-01
Less than half of the original habitat of the Greater Sage-Grouse (Centrocercus urophasianus) currently exists. Some has been permanently lost to farms and urban areas, but the remaining varies in condition from high quality to no longer adequate. Restoration of sagebrush (Artemisia spp.) grassland ecosystems may be possible for resilient lands. However, Greater Sage-Grouse require a wide variety of habitats over large areas to complete their life cycle. Effective restoration will require a regional approach for prioritizing and identifying appropriate options across the landscape. A landscape triage method is recommended for prioritizing lands for restoration. Spatial models can indicate where to protect and connect intact quality habitat with other similar habitat via restoration. The ecological site concept of land classification is recommended for characterizing potential habitat across the region along with their accompanying state and transition models of plant community dynamics. These models assist in identifying if passive, management-based or active, vegetation manipulation-based restoration might accomplish the goals of improved Greater Sage-Grouse habitat. A series of guidelines help formulate questions that managers might consider when developing restoration plans: (1) site prioritization through a landscape triage; (2) soil verification and the implications of soil features on plant establishment success; (3) a comparison of the existing plant community to the potential for the site using ecological site descriptions; (4) a determination of the current successional status of the site using state and transition models to aid in predicting if passive or active restoration is necessary; and (5) implementation of post-treatment monitoring to evaluate restoration effectiveness and post-treatment management implications to restoration success.
Monitoring of heavy metal particle emission in the exhaust duct of a foundry using LIBS.
Dutouquet, C; Gallou, G; Le Bihan, O; Sirven, J B; Dermigny, A; Torralba, B; Frejafon, E
2014-09-01
Heavy metals have long been known to be detrimental to human health and the environment. Their emission is mainly considered to occur via the atmospheric route. Most airborne heavy metals are of anthropogenic origin, produced through combustion processes at industrial sites such as incinerators and foundries. Current regulations impose threshold limits on heavy metal emissions. The reference method currently implemented for quantitative measurements at exhaust stacks consists of on-site sampling of heavy metals on filters for the particulate phase (the most prominent and the only fraction considered in this study) prior to subsequent laboratory analysis. Results are therefore known only a few days after sampling. Stricter regulations require the development of adapted tools allowing automatic, on-site or even in-situ measurements with temporal resolution. The Laser-Induced Breakdown Spectroscopy (LIBS) technique was deemed a potential candidate to meet these requirements. On-site experiments were run by melting copper bars and monitoring the emission of this element in an exhaust duct at a pilot-scale furnace in a French research center dedicated to metal casting. Two approaches, designated as indirect and direct analysis, were pursued in these experiments. The former corresponds to filter enrichment prior to subsequent LIBS interrogation, whereas the latter entails focusing the laser directly into the aerosol for detection. On-site calibration curves were built and compared with those obtained at laboratory scale in order to investigate possible matrix and analyte effects. Overall, the results obtained in terms of detection limits and quantitative temporal monitoring of copper emission clearly demonstrate the potential of direct LIBS measurements. Copyright © 2014 Elsevier B.V. All rights reserved.
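A calibration curve of the kind described, together with the common 3-sigma detection-limit estimate (LOD = 3 × blank standard deviation / slope), can be sketched as follows; the data are synthetic and do not reproduce the paper's copper measurements:

```python
# Sketch of a LIBS calibration curve: emission-line intensity vs. analyte
# concentration fitted by ordinary least squares, with the conventional
# 3-sigma detection limit LOD = 3 * sigma_blank / slope.
# All numbers are invented for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [0.0, 10.0, 20.0, 40.0]          # e.g. mg/m3 of Cu in the duct
intensity = [5.0, 105.0, 205.0, 405.0]  # synthetic, perfectly linear

slope, intercept = fit_line(conc, intensity)
sigma_blank = 3.0                        # std dev of repeated blank readings
lod = 3 * sigma_blank / slope
print(slope, intercept, lod)
```

Comparing such curves built on-site against laboratory-scale ones is how matrix and analyte effects show up: a changed slope or intercept between the two settings indicates the matrix is influencing the signal.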
Characteristics of HIV Care and Treatment in PEPFAR-Supported Sites
Filler, Scott; Berruti, Andres A.; Menzies, Nick; Berzon, Rick; Ellerbrock, Tedd V.; Ferris, Robert; Blandford, John M.
2011-01-01
Background The U.S. President’s Emergency Plan for AIDS Relief (PEPFAR) has supported the extension of HIV care and treatment to 2.4 million individuals by September 2009. With increasing resources targeted toward scale-up, it is important to understand the characteristics of current PEPFAR-supported HIV care and treatment sites. Methods Forty-five sites in Botswana, Ethiopia, Nigeria, Uganda, and Vietnam were sampled. Data were collected retrospectively from successive 6-month periods of site operations, through reviews of facility records and interviews with site personnel between April 2006 and March 2007. Facility size and scale-up rate, patient characteristics, staffing models, clinical and laboratory monitoring, and intervention mix were compared. Results Sites added a median of 293 patients per quarter. By the evaluation’s end, sites supported a median of 1,649 HIV patients, 922 of them receiving antiretroviral therapy (ART). Patients were predominantly adult (97.4%) and the majority (96.5%) were receiving regimens based on nonnucleoside reverse transcriptase inhibitors (NNRTIs). The ratios of physicians to patients dropped substantially as sites matured. ART patients were commonly seen monthly or quarterly for clinical and laboratory monitoring, with CD4 counts being taken at 6-month intervals. One-third of sites provided viral load testing. Cotrimoxazole prophylaxis was the most prevalent supportive service. Conclusions HIV treatment sites scaled up rapidly with the influx of resources and technical support through PEPFAR, providing complex health services to progressively expanding patient cohorts. Human resources are stretched thin, and delivery models and intervention mix differ widely between sites. Ongoing research is needed to identify best-practice service delivery models. PMID:21346585
NASA Astrophysics Data System (ADS)
Lee, G.; Ahn, J. Y.; Chang, L. S.; Kim, J.; Park, R.
2017-12-01
During KORUS-AQ, extensive sets of chemical measurements of reactive gases and aerosol species were made at three major sites: an upwind island site (Baengyeong Island), an urban site (Olympic Park in Seoul), and a downwind rural forest site (Taewha Forest). Intensive aerosol size and composition observations from 5 NIER super sites, 3 NIMR monitoring sites, and 5 other university sites are also currently included in the KORUS-AQ data set. In addition, air-quality criteria species data from 264 nationwide ground monitoring sites, at 5-minute temporal resolution over the whole campaign period, were added; these cover densely populated urban areas well but rural areas only sparsely. The specific objective of these ground sites was to provide a highly comprehensive data set supporting close collaboration among the other research platforms, including airborne measurements, remote sensing, and model studies. The continuous measurements at the ground sites compared well with repeated low-level aircraft observations by NASA's DC-8 over the Olympic Park and Taewha Forest sites. Similarly, many ground measurements enabled validation of chemical transport models and of the remote sensing observations from the ground and from NASA's King Air. Results of inter-comparison studies of many reactive gases and aerosol compositions between different measurement methods and platforms will be presented. Based on the compiled ground-site data sets, source-wise analysis of ozone and aerosol, their in-situ formation, and transport characteristics driven by local and regional circulation will also be discussed.
Identification of Candidate Transcription Factor Binding Sites in the Cattle Genome
Bickhart, Derek M.; Liu, George E.
2013-01-01
A resource that provides candidate transcription factor binding sites (TFBSs) does not currently exist for cattle. Such data is necessary, as predicted sites may serve as excellent starting locations for future omics studies to develop transcriptional regulation hypotheses. In order to generate this resource, we employed a phylogenetic footprinting approach—using sequence conservation across cattle, human and dog—and position-specific scoring matrices to identify 379,333 putative TFBSs upstream of nearly 8000 Mammalian Gene Collection (MGC) annotated genes within the cattle genome. Comparisons of our predictions to known binding site loci within the PCK1, ACTA1 and G6PC promoter regions revealed 75% sensitivity for our method of discovery. Additionally, we intersected our predictions with known cattle SNP variants in dbSNP and on the Illumina BovineHD 770k and Bos 1 SNP chips, finding 7534, 444 and 346 overlaps, respectively. Due to our stringent filtering criteria, these results represent high quality predictions of putative TFBSs within the cattle genome. All binding site predictions are freely available at http://bfgl.anri.barc.usda.gov/BovineTFBS/ or http://199.133.54.77/BovineTFBS. PMID:23433959
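The position-specific scoring matrix step underlying such TFBS predictions can be sketched with a toy matrix: each candidate window is scored position by position and reported when it exceeds a threshold. The matrix, threshold, and sequence below are invented, not those used for the cattle resource:

```python
# Illustrative position-specific scoring matrix (PSSM) scan, the scoring
# step behind phylogenetic-footprint-filtered TFBS prediction described
# above. The 4-position matrix and threshold are toy values.

PSSM = [  # log-odds-like scores for A, C, G, T at 4 positions
    {"A": 2.0, "C": -1.0, "G": -1.0, "T": -1.0},
    {"A": -1.0, "C": 2.0, "G": -1.0, "T": -1.0},
    {"A": -1.0, "C": -1.0, "G": 2.0, "T": -1.0},
    {"A": -1.0, "C": -1.0, "G": -1.0, "T": 2.0},
]

def scan(seq, pssm, threshold=6.0):
    """Return (offset, score) for windows scoring at or above threshold."""
    w = len(pssm)
    hits = []
    for i in range(len(seq) - w + 1):
        score = sum(pssm[j][seq[i + j]] for j in range(w))
        if score >= threshold:
            hits.append((i, score))
    return hits

print(scan("TTACGTAA", PSSM))  # the embedded ACGT motif scores 8.0
```

In the actual pipeline such raw matches are then filtered by sequence conservation across cattle, human, and dog, which is what keeps the false-positive rate of matrix scanning manageable.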
Demirarslan, K Onur; Korucu, M Kemal; Karademir, Aykan
2016-08-01
Ecological problems arising after the construction and operation of a waste incineration plant generally originate from incorrect decisions made during the selection of the plant's location. The main objective of this study is to investigate how the location-selection method for a new municipal waste incineration plant can be improved by using a dispersion modelling approach supported by geographical information systems and multi-criteria decision analysis. With this aim, the appropriateness of the current location of an existing plant was assessed by applying a pollution dispersion model. Using this procedure, the site rankings for a total of 90 candidate locations and the site of the existing incinerator were determined by a new location-selection practice, and the current place of the plant was evaluated by ANOVA and Tukey tests. This initial ranking, made without modelling, was then re-evaluated by modelling various variables with CALPUFF, including pollutant concentration, population and population density, demography, temporal variation of the meteorological data, pollutant type, and risk formation type, and by re-ranking the results. The findings clearly indicate the unsuitability of the current plant's location, as the pollution dispersion model showed it to be the fourth-worst choice among the 91 possibilities. It was concluded that location-selection procedures for waste incinerators should benefit from the improvements obtained by incorporating pollution dispersion studies combined with population density data to identify the most suitable location. © The Author(s) 2016.
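The core of such a re-ranking is scoring each candidate site by modelled exposure and sorting least-impact first; the sketch below uses a single population-weighted concentration proxy with invented numbers, whereas the study's actual CALPUFF/GIS/MCDA workflow combines many more criteria:

```python
# Hedged sketch of ranking candidate incinerator locations by a
# population-weighted exposure proxy. Weights and data are invented;
# the real analysis uses CALPUFF dispersion fields and MCDA criteria.

def impact_score(mean_concentration, exposed_population):
    """Simple exposure proxy for one candidate site (lower is better)."""
    return mean_concentration * exposed_population

candidates = {
    "site_A": impact_score(0.8, 120_000),
    "site_B": impact_score(1.5, 40_000),
    "site_C": impact_score(0.3, 90_000),
}
ranking = sorted(candidates, key=candidates.get)
print(ranking)  # least-impact site first
```

Note that the highest-concentration site (site_B) is not the worst choice here: weighting by exposed population, as the study advocates, can reorder a ranking based on dispersion alone.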
Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise
2010-01-01
A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware. PMID:21344013
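The interpolation scheme NeuroMap uses can be illustrated with a minimal 2-D thin plate spline: radial terms φ(r) = r² log r plus an affine part, solved as one linear system. This is an independent textbook sketch, not NeuroMap's Matlab code; the "electrode" layout and activity values are invented:

```python
import math

# Minimal 2-D thin plate spline (TPS) interpolation. Any electrode
# arrangement works (regular or irregular), and interpolated extrema
# need not coincide with recording sites -- the two properties noted
# in the text above.

def tps_phi(r):
    """TPS radial basis r^2 * log(r), with phi(0) = 0."""
    return r * r * math.log(r) if r > 0.0 else 0.0

def solve(A, b):
    """Gaussian elimination with partial pivoting (small dense systems)."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def tps_fit(points, values):
    """Solve for radial weights w and affine part a = (a0, ax, ay)."""
    n = len(points)
    A = [[0.0] * (n + 3) for _ in range(n + 3)]
    b = list(values) + [0.0, 0.0, 0.0]
    for i, (xi, yi) in enumerate(points):
        for j, (xj, yj) in enumerate(points):
            A[i][j] = tps_phi(math.hypot(xi - xj, yi - yj))
        A[i][n], A[i][n + 1], A[i][n + 2] = 1.0, xi, yi   # affine columns
        A[n][i], A[n + 1][i], A[n + 2][i] = 1.0, xi, yi   # orthogonality rows
    coeffs = solve(A, b)
    return coeffs[:n], coeffs[n:]

def tps_eval(x, y, points, w, a):
    """Interpolated activity at (x, y)."""
    s = a[0] + a[1] * x + a[2] * y
    for (xi, yi), wi in zip(points, w):
        s += wi * tps_phi(math.hypot(x - xi, y - yi))
    return s

# Four "electrodes" at the unit-square corners with activity = x + y:
pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.0, 1.0, 1.0, 2.0]
w, a = tps_fit(pts, vals)
print(round(tps_eval(0.5, 0.5, pts, w, a), 6))  # → 1.0
```

Because the fitted surface is an analytic expression, its spatial Laplacian can likewise be written in closed form, which is the basis for the current-source localization feature mentioned above.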
Systematic Analysis and Prediction of In Situ Cross Talk of O-GlcNAcylation and Phosphorylation
Yao, Heming; Li, Ao; Wang, Minghui
2015-01-01
Reversible posttranslational modification (PTM) plays a very important role in biological processes by changing the properties of proteins. As many proteins are multiply modified by PTMs, cross talk between PTMs has become an intriguing topic and draws much attention. Currently, much evidence suggests that PTMs work together to accomplish specific biological functions. However, both the general principles and the underlying mechanisms of PTM cross talk remain elusive. In this study, using large-scale datasets, we performed evolutionary conservation analysis, gene ontology enrichment, and motif extraction on proteins with cross talk between O-GlcNAcylation and phosphorylation co-occurring at the same residue. We found that proteins with in situ O-GlcNAc/Phos cross talk were significantly enriched in some specific gene ontology terms, and no obvious evolutionary pressure was observed. Moreover, three functional motifs associated with O-GlcNAc/Phos sites were extracted. We further used sequence features and GO features to predict O-GlcNAc/Phos cross talk sites, based on phosphorylated sites and O-GlcNAcylated sites separately, with an SVM model. The AUC of the classifier based on phosphorylated sites is 0.896, and that of the classifier based on O-GlcNAcylated sites is 0.843. Both classifiers achieved better performance than other existing methods. PMID:26601103
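As an illustration of the classifier-plus-AUC evaluation described in this abstract, the sketch below trains an SVM and scores it with ROC AUC using scikit-learn. The features and labels are synthetic stand-ins, not the paper's O-GlcNAc/Phos data.

```python
# Sketch: SVM classification scored by AUC, analogous to the paper's
# cross-talk-site predictors (synthetic features, not the real dataset).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# 200 "sites" x 10 features; labels depend on the first 3 features,
# so the classes are learnably separable.
X = rng.normal(size=(200, 10))
y = (X[:, :3].sum(axis=1) + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
```

An AUC near 1.0 means the classifier ranks positive sites above negative ones almost perfectly; 0.5 is chance level.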
Grogl, Max; Boni, Marina; Carvalho, Edgar M.; Chebli, Houda; Cisse, Mamoudou; Diro, Ermias; Fernandes Cota, Gláucia; Erber, Astrid C.; Gadisa, Endalamaw; Handjani, Farhad; Khamesipour, Ali; Llanos-Cuentas, Alejandro; López Carvajal, Liliana; Grout, Lise; Lmimouni, Badre Eddine; Mokni, Mourad; Nahzat, Mohammad Sami; Ben Salah, Afif; Ozbel, Yusuf; Pascale, Juan Miguel; Rizzo Molina, Nidia; Rode, Joelle; Romero, Gustavo; Ruiz-Postigo, José Antonio; Gore Saravia, Nancy; Soto, Jaime; Uzun, Soner; Mashayekhi, Vahid; Vélez, Ivan Dario; Vogt, Florian; Zerpa, Olga; Arana, Byron
2018-01-01
Introduction Progress with the treatment of cutaneous leishmaniasis (CL) has been hampered by inconsistent methodologies used to assess treatment effects. A sizable number of trials conducted over the years have generated only weak evidence backing current treatment recommendations, as shown by systematic reviews on old-world and new-world CL (OWCL and NWCL). Materials and methods Using a previously published guidance paper on CL treatment trial methodology as the reference, consensus was sought on key parameters, including core eligibility and outcome measures, among OWCL (7 countries, 10 trial sites) and NWCL (7 countries, 11 trial sites) during two separate meetings. Results Findings and the level of consensus within and between OWCL and NWCL sites are presented and discussed. In addition, CL trial site characteristics and capacities are summarized. Conclusions The consensus reached allows standardization of future clinical research across OWCL and NWCL sites. We encourage CL researchers to adopt, and adapt as required, the proposed parameters and outcomes in their future trials and to provide feedback on their experience. The expertise afforded between the two sets of clinical sites provides the basis for a powerful consortium with potential for extensive, standardized assessment of interventions for CL and faster approval of candidate treatments. PMID:29329311
Ecological carrying capacity assessment of diving site: A case study of Mabul Island, Malaysia.
Zhang, Li-Ye; Chung, Shan-Shan; Qiu, Jian-Wen
2016-12-01
Despite being considered a non-consumptive use of the marine environment, diving-related activities can damage coral reefs. It is imperative to assess the maximum number of divers that a diving site can accommodate before it suffers irreversible deterioration. This study aimed to assess the ecological carrying capacity of a diving site in Mabul Island, Malaysia. The photo-quadrat line transect method was used in the benthic survey. The ecological carrying capacity was assessed based on the relationship between the number of divers and the proportion of diver-damaged hard corals in Mabul Island. The results indicated that the proportion of diver-damaged hard corals increased exponentially with increasing use. The ecological carrying capacity of Mabul Island is 15,600-16,800 divers per diving site per year at current levels of diver education and training, with a quarterly threshold of 3900-4200 per site. Our calculation shows that management intervention (e.g. limiting diving) is justified at 8-14% hard coral damage. In addition, the use of coral-reef-dominated diving sites should be managed according to their sensitivity to diver damage and the depth of the reefs. Copyright © 2016 Elsevier Ltd. All rights reserved.
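The carrying-capacity logic in this abstract (fit an exponential use-damage curve, then invert it at a damage threshold) can be sketched as follows. The diver counts, damage fractions, and the 10% threshold below are invented for illustration, not the study's data.

```python
# Sketch: fit damage = a * exp(b * divers) and invert at a damage threshold
# to get a carrying-capacity estimate (illustrative numbers only).
import numpy as np
from scipy.optimize import curve_fit

divers = np.array([2000, 5000, 8000, 12000, 16000, 20000])   # divers per year
damage = np.array([0.02, 0.03, 0.05, 0.08, 0.13, 0.22])      # fraction of corals damaged

def model(x, a, b):
    return a * np.exp(b * x)

(a, b), _ = curve_fit(model, divers, damage, p0=(0.01, 1e-4))

# Divers per year at which predicted damage reaches a hypothetical 10% threshold.
capacity = np.log(0.10 / a) / b
```

Inverting the fitted curve at the management threshold gives the maximum sustainable use; the study reports the analogous figure as 15,600-16,800 divers per site per year.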
Clean Water State Revolving Fund (CWSRF): Contaminated Sites
Communities can use the CWSRF to address the water quality aspects of site assessment and cleanup of brownfields, Superfund sites, and sites of current or former aboveground or underground storage tanks.
The application of quantum mechanics in structure-based drug design.
Mucs, Daniel; Bryce, Richard A
2013-03-01
Computational chemistry has become an established and valuable component of structure-based drug design. However, the chemical complexity of many ligands and active sites challenges the accuracy of the empirical potentials commonly used to describe these systems. Consequently, there is growing interest in utilizing electronic structure methods to address problems in protein-ligand recognition. In this review, the authors discuss recent progress in the development and application of quantum chemical approaches to modeling protein-ligand interactions. The authors specifically consider the development of quantum mechanics (QM) approaches for studying large molecular systems pertinent to biology, focusing on protein-ligand docking, protein-ligand binding affinities, and ligand strain on binding. Although the computation of binding energies remains a challenging and evolving area, current QM methods can underpin improved docking approaches and offer detailed insights into ligand strain and into the nature and relative strengths of complex active site interactions. The authors envisage that QM will become an increasingly routine and valued tool of the computational medicinal chemist.
Clinical effectiveness of late maxillary protraction in cleft lip and palate: a methods paper
Lee, MK; Lane, C; Azeredo, F; Landsberger, M; Kapadia, H; Sheller, B; Yen, SL
2017-01-01
Objectives A prospective parallel cohort trial was conducted to compare outcomes of patients treated with maxillary protraction vs. LeFort 1 maxillary advancement surgery. Setting and Sample Population The primary site for the clinical trial is Children’s Hospital Los Angeles; the satellite test site is Seattle Children’s Hospital. All patients have isolated cleft lip and palate and a skeletal Class III malocclusion. Material & Methods A total of 50 patients, ages 11–14 will be recruited for the maxillary protraction cohort. The maxillary surgery cohort consists of 50 patients, ages 16–21, who will undergo LeFort 1 maxillary advancement surgery. Patients with additional medical or cognitive handicaps were excluded from the study. Results Current recruitment of patients is on track to complete the study within the proposed recruitment period. Conclusion This observational trial is collecting information that will examine dental, skeletal, financial, and quality of life issues from both research cohorts. PMID:28643931
Sequence signatures of allosteric proteins towards rational design.
Namboodiri, Saritha; Verma, Chandra; Dhar, Pawan K; Giuliani, Alessandro; Nair, Achuthsankar S
2010-12-01
Allostery is the phenomenon of changes in the structure and activity of proteins that appear as a consequence of ligand binding at sites other than the active site. Studying the mechanistic basis of allostery, leading to protein design with predetermined functional endpoints, is an important unmet need of synthetic biology. Here, we screened the amino acid sequence landscape in search of sequence signatures of allostery using the Recurrence Quantification Analysis (RQA) method. A characteristic vector, comprising 10 features extracted from RQA, was defined for amino acid sequences. Using Principal Component Analysis, four factors were found to be important determinants of allosteric behavior. Our sequence-based predictor shows 82.6% accuracy, 85.7% sensitivity, and 77.9% specificity on the current dataset. Further, we show that Laminarity-Mean-hydrophobicity, representing repeated hydrophobic patches, is the most crucial indicator of allostery. To the best of our knowledge, this is the first report describing sequence determinants of allostery based on hydrophobicity. Building on these findings, we plan to explore the possibility of inducing allostery in proteins.
Outfall siting with dye-buoy remote sensing of coastal circulation
NASA Technical Reports Server (NTRS)
Munday, J. C., Jr.; Welch, C. S.; Gordon, H. H.
1978-01-01
A dye-buoy remote sensing technique has been applied to estuarine siting problems that involve fine-scale circulation. Small hard cakes of sodium fluorescein and polyvinyl alcohol, in anchored buoys and low-windage current followers, dissolve to produce dye marks resolvable in 1:60,000 scale color and color infrared imagery. Lagrangian current vectors are determined from sequential photo coverage. Careful buoy placement reveals surface currents and submergence near fronts and convergence zones. The technique has been used in siting two sewage outfalls in Hampton Roads, Virginia: In case one, the outfall region during flood tide gathered floating materials in a convergence zone, which then acted as a secondary source during ebb; for better dispersion during ebb, the proposed outfall site was moved further offshore. In case two, flow during late flood was found to divide, with one half passing over shellfish beds; the proposed outfall site was consequently moved to keep effluent in the other half.
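The core computation behind this abstract's "Lagrangian current vectors" is simply displacement between sequential buoy fixes divided by the elapsed time. A minimal sketch, with invented positions and overpass interval:

```python
# Sketch: a Lagrangian current vector from two sequential dye-buoy fixes,
# as recovered from sequential photo coverage (all values illustrative).
import numpy as np

p1 = np.array([1250.0, 3400.0])   # buoy position at overpass t1, metres (east, north)
p2 = np.array([1490.0, 3280.0])   # buoy position at overpass t2
dt = 600.0                        # seconds between overpasses

velocity = (p2 - p1) / dt         # current vector, m/s (east, north)
speed = float(np.linalg.norm(velocity))
```

Here the buoy drifted 240 m east and 120 m south in 10 minutes, giving a 0.45 m/s current vector; repeating this per buoy per image pair yields the vector field used to evaluate outfall sites.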
2012-01-01
Human Immunodeficiency Virus Type 1 (HIV-1) protease inhibitors (PIs) are the most potent class of drugs in antiretroviral therapies. However, viral drug resistance to PIs can emerge rapidly, reducing the effectiveness of these drugs. Of note, all current FDA-approved PIs are competitive inhibitors, i.e., inhibitors that compete with substrates for the active enzymatic site. This common inhibitory approach increases the likelihood of developing drug-resistant HIV-1 strains that are resistant to many or all current PIs. Hence, new PIs that move away from the current target of the active enzymatic site are needed. Specifically, allosteric inhibitors, which prohibit PR enzymatic activities through non-competitive binding to PR, should be sought. Another common feature of current PIs is that they were all developed by structure-based design. Drugs derived from a structure-based strategy may be target-specific and potent. However, this type of drug design can only target one site at a time, and drugs discovered by this method are often associated with strong side effects such as cellular toxicity, limiting the number of target choices, efficacy, and applicability. In contrast, a cell-based system may provide a useful alternative strategy that can overcome many of the inherent shortcomings of structure-based drug design. For example, allosteric PIs can be sought using a cell-based system without considering the site or mechanism of inhibition. In addition, a cell-based system can eliminate those PIs that have strong cytotoxic effects. Most importantly, a simple, economical, and easy-to-maintain eukaryotic cellular system such as yeast allows the search for potential PIs in a large-scale high-throughput screening (HTS) system, increasing the chances of success.
Based on our many years of experience using fission yeast as a model system to study HIV-1 Vpr, we propose the use of fission yeast as a possible surrogate system to study the effects of HIV-1 protease on cellular functions and to explore its utility as an HTS system to search for new PIs to battle HIV-1 resistant strains. PMID:22971934
Molnar, S.; Cassidy, J. F.; Castellaro, S.; Cornou, C.; Crow, H.; Hunter, J. A.; Matsushima, S.; Sanchez-Sesma, F. J.; Yong, Alan
2018-01-01
Nakamura (Q Rep Railway Tech Res Inst 30:25–33, 1989) popularized the application of the horizontal-to-vertical spectral ratio (HVSR) analysis of microtremor (seismic noise or ambient vibration) recordings to estimate the predominant frequency and amplification factor of earthquake shaking. During the following quarter century, popularity in the microtremor HVSR (MHVSR) method grew; studies have verified the stability of a site’s MHVSR response over time and validated the MHVSR response with that of earthquake HVSR response. Today, MHVSR analysis is a popular reconnaissance tool used worldwide for seismic microzonation and earthquake site characterization in numerous regions, specifically, in the mapping of site period or fundamental frequency and inverted for shear-wave velocity depth profiles, respectively. However, the ubiquity of MHVSR analysis is predominantly a consequence of its ease in application rather than our full understanding of its theory. We present the state of the art in MHVSR analyses in terms of the development of its theoretical basis, current state of practice, and we comment on its future for applications in earthquake site characterization.
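To make the MHVSR quantity in this abstract concrete, the sketch below computes a bare-bones horizontal-to-vertical spectral ratio from synthetic three-component noise. Real MHVSR practice adds window selection, spectral smoothing, and averaging over many windows; the signals and the injected 2 Hz resonance here are invented.

```python
# Sketch: a minimal microtremor HVSR estimate from synthetic 3-component noise.
import numpy as np

fs = 100.0                          # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)        # 60 s record
rng = np.random.default_rng(2)

# Horizontal components carry an injected 2 Hz site resonance; vertical is flat noise.
res = 20 * np.sin(2 * np.pi * 2.0 * t)
north = rng.normal(size=t.size) + res
east = rng.normal(size=t.size) + res
vert = rng.normal(size=t.size)

freqs = np.fft.rfftfreq(t.size, d=1 / fs)

def amp(x):
    """Amplitude spectrum of a Hann-windowed trace."""
    return np.abs(np.fft.rfft(x * np.hanning(x.size)))

h = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2)   # quadratic-mean horizontal
hvsr = h / np.maximum(amp(vert), 1e-12)

band = (freqs > 0.5) & (freqs < 20)
f_peak = freqs[band][np.argmax(hvsr[band])]           # predominant frequency estimate
```

The HVSR peak recovers the resonance frequency injected into the horizontals, which is the site's predominant frequency in the Nakamura interpretation.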
Alquezar, Ralph; Glendenning, Lionel; Costanzo, Simon
2013-12-15
Nutrient bioindicators are increasingly being recognised as a diagnostic tool for nutrient enrichment of estuarine and marine ecosystems. Few studies, however, have focused on field translocation of bioindicator organisms to detect nutrient discharge from industrial waste. The brown macroalga Sargassum flavicans was investigated as a potential bioindicator of nutrient-enriched industrial effluent originating from a nickel refinery in tropical north-eastern Australia. S. flavicans was translocated to a number of nutrient-enriched creek and oceanic sites over two seasons and assessed for changes in stable isotope ratios of (15)N and (13)C within the plant tissue in comparison to reference sites. In macroalgae translocated to the nutrient-enriched sites adjacent to the refinery, δ(15)N increased 3-4-fold compared to reference sites. Using the δ(15)N of translocated S. flavicans proved to be a successful method for monitoring time-integrated uptake of nitrogen, given the current lack of passive sampler technology for nutrient monitoring. Copyright © 2013 Elsevier Ltd. All rights reserved.
Spatial correlation of auroral zone geomagnetic variations
NASA Astrophysics Data System (ADS)
Jackel, B. J.; Davalos, A.
2016-12-01
Magnetic field perturbations in the auroral zone are produced by a combination of distant ionospheric and local ground-induced currents. The spatial and temporal structure of these currents is scientifically interesting and can also have a significant influence on critical infrastructure. Ground-based magnetometer networks are an essential tool for studying these phenomena, with the existing complement of instruments in Canada providing extended local time coverage. In this study we examine the spatial correlation between magnetic field observations over a range of scale lengths. Principal component and canonical correlation analysis are used to quantify relationships between multiple sites. The results could be used to optimize network configurations, validate computational models, and improve methods for empirical interpolation.
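As an illustration of the principal component analysis this abstract mentions, the sketch below decomposes synthetic multi-station records sharing one common disturbance. Station data are invented; the point is that one principal component captures the variance shared across sites.

```python
# Sketch: PCA (via SVD) of multi-station magnetometer-like records to
# quantify shared structure across sites (synthetic station data).
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 6 * np.pi, 500)
common = np.sin(t)                          # disturbance seen by every station

# 5 stations: differently scaled copies of the common signal plus local noise.
X = np.outer(common, rng.uniform(0.5, 2.0, size=5))
X += 0.1 * rng.normal(size=X.shape)

Xc = X - X.mean(axis=0)                     # column-centre before the SVD
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)             # variance fraction per component
```

A dominant first component (here capturing most of the variance) indicates strongly correlated stations; how quickly that dominance decays with station separation is what spatial correlation studies quantify.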
Effects of space weather on high-latitude ground systems
NASA Astrophysics Data System (ADS)
Pirjola, Risto
Geomagnetically induced currents (GIC) in technological systems, such as power grids, pipelines, cables and railways, are a ground manifestation of space weather. The first GIC observations were made in early telegraph equipment more than 150 years ago. In power networks, GIC may saturate transformers, with possible harmful consequences extending to a collapse of the whole system or permanent damage to transformers. In pipelines, GIC and the associated pipe-to-soil voltages may enhance corrosion or disturb surveys associated with corrosion control. GIC are driven by the geoelectric field induced by a geomagnetic variation at the Earth's surface. The electric and magnetic fields are primarily produced by ionospheric currents and secondarily affected by the ground conductivity. Of great importance is the auroral electrojet, along with other rapidly varying currents, indicating that GIC are a particular high-latitude problem. In this paper, we summarize the GIC research done in Finland over about 25 years and discuss the calculation of GIC in a given network. Special attention is paid to modelling a power system. It is shown that, when considering GIC at a site, it is usually sufficient to take into account only a smaller grid in the vicinity of that site. Modelling GIC also provides a basis for developing GIC forecasting and warning methods.
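The first-order driving mechanism described above (a geoelectric field producing a quasi-DC current in a grounded conductor) can be sketched in a few lines. Real GIC studies of the kind summarized here solve a full network model; the field, line geometry, and resistance below are illustrative values only.

```python
# Sketch: first-order GIC in a single grounded transmission line under a
# uniform geoelectric field (illustrative values, not a network model).
import numpy as np

E = np.array([1.0, 0.5])         # geoelectric field, V/km (east, north)
line = np.array([100.0, 50.0])   # line displacement vector, km (east, north)

# For a uniform field, the induced EMF is the line integral of E, i.e. E . L.
emf = float(E @ line)            # volts
R_total = 3.0                    # assumed line + grounding resistance, ohms
gic = emf / R_total              # quasi-DC geomagnetically induced current, amperes
```

This shows why long east-west lines at high latitudes, where the electrojet drives large horizontal fields, carry the largest GIC.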
Early Warning and Outbreak Detection Using Social Networking Websites: The Potential of Twitter
NASA Astrophysics Data System (ADS)
de Quincey, Ed; Kostkova, Patty
Epidemic Intelligence is being used to gather information about potential disease outbreaks from both formal and, increasingly, informal sources. A potential addition to these informal sources is social networking sites such as Facebook and Twitter. In this paper we describe a method for extracting messages, called "tweets", from the Twitter website, and the results of a pilot study that collected over 135,000 tweets in one week during the current Swine Flu pandemic.
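The filtering step behind such a collection pipeline reduces to keyword matching over incoming messages. The sketch below is a stand-in for that step; the Twitter API itself is not used, and the messages and keyword list are invented.

```python
# Sketch: keyword filtering of short messages, standing in for the
# tweet-collection step (no real Twitter API calls; data invented).
KEYWORDS = {"swine flu", "h1n1", "influenza"}

def matches(message: str) -> bool:
    """True if any tracked keyword occurs in the message (case-insensitive)."""
    text = message.lower()
    return any(kw in text for kw in KEYWORDS)

stream = [
    "Feeling awful, think I have swine flu :(",
    "Great weather in London today",
    "H1N1 cases reported at my school",
]
hits = [m for m in stream if matches(m)]
```

Counting matches per time window then yields the signal that early-warning systems monitor for anomalous rises.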
Radiometric ages for basement rocks from the Emperor Seamounts, ODP Leg 197
NASA Astrophysics Data System (ADS)
Duncan, Robert A.; Keller, Randall A.
2004-08-01
The Hawaiian-Emperor seamount chain is the "type" example of an age-progressive, hot spot-generated intraplate volcanic lineament. However, our current knowledge of the age distribution within this province is based largely on radiometric ages determined several decades ago. Improvements in instrumentation, sample preparation methods, and new material obtained by recent drilling warrant a reexamination of the age relations among the older Hawaiian volcanoes. We report new age determinations (40Ar-39Ar incremental heating method) on whole rocks and feldspar separates from Detroit (Sites 1203 and 1204), Nintoku (Site 1205), and Koko (Site 1206) Seamounts (Ocean Drilling Program (ODP) Leg 197) and Meiji Seamount (Deep Sea Drilling Project (DSDP) Leg 19, Site 192). Plateaus in incremental heating age spectra for Site 1203 lava flows give a mean age of 75.8 ± 0.6 (2σ) Ma, which is consistent with the normal magnetic polarity directions observed and biostratigraphic age assignments. Site 1204 lavas produced discordant spectra, indicating Ar loss by reheating and K mobilization. Six plateau ages from lava flows at Site 1205 give a mean age of 55.6 ± 0.2 Ma, corresponding to Chron 24r. Drilling at Site 1206 intersected a N-R-N magnetic polarity sequence of lava flows, from which six plateau ages give a mean age of 49.1 ± 0.2 Ma, corresponding to the Chron 21n-22r-22n sequence. Plateau ages from two feldspar separates and one lava from DSDP Site 192 range from 34 to 41 Ma, significantly younger than the Cretaceous age of overlying sediments, which we relate to postcrystallization K mobilization. Combined with new dating results from Suiko Seamount (DSDP Site 433) and volcanoes near the prominent bend in the lineament [, 2002], the overall trend is increasing volcano age from south to north along the Emperor Seamounts, consistent with the hot spot model. 
However, there appear to be important departures from the earlier modeled simple linear age progression, which we relate to changes in Pacific plate motion and the rate of southward motion of the Hawaiian hot spot.
van der Borden, Arnout J; Maathuis, Patrick G M; Engels, Eefje; Rakhorst, Gerhard; van der Mei, Henny C; Busscher, Henk J; Sharma, Prashant Kumar
2007-04-01
Pin tract infections of external fixators used in orthopaedic reconstructive bone surgery are serious complications that can eventually lead to periostitis and osteomyelitis. In vitro experiments have demonstrated that bacteria adhering to stainless steel in a biofilm mode of growth detach under the influence of small electric currents, while the remaining bacteria become less viable upon current application. Therefore, we investigated whether a 100 μA electric current can prevent signs of clinical infection around percutaneous pins implanted in the tibia of goats. Three pins were inserted into the lateral right tibia of nine goats, of which one served for additional frame support. Two pins were infected with a Staphylococcus epidermidis strain; one of these pins was subjected to electric current, while the other was used as a control. Pin sites were examined daily. The wound electrical resistance decreased as the infection worsened from a dry condition to a purulent stage. After 21 days, the animals were sacrificed and the pins taken out. Infection developed at 89% of the control pin sites, whereas only 11% of the pin sites in the current group showed infection. These results show that infection of percutaneous pin sites of external fixators in reconstructive bone surgery can be prevented by the application of a small DC electric current.
Diving down the reefs? Intensive diving tourism threatens the reefs of the northern Red Sea.
Hasler, Harald; Ott, Jörg A
2008-10-01
Intensive recreational SCUBA diving threatens coral reef ecosystems. The reefs at Dahab, South Sinai, Egypt, are among the world's most dived (>30,000 dives y⁻¹). We compared frequently dived sites to sites with no or little diving. Benthic communities and the condition of corals were examined by the point intercept sampling method in the reef crest zone (3 m) and reef slope zone (12 m). Additionally, the abundance of corallivorous and herbivorous fish was estimated based on the visual census method. Sediment traps recorded the sedimentation rates caused by SCUBA divers. Zones subject to intensive SCUBA diving showed a significantly higher number of broken and damaged corals and significantly lower coral cover. Reef crest coral communities were significantly more affected than those of the reef slope: 95% of the broken colonies were branching ones. No effect of diving on the abundance of corallivorous and herbivorous fish was evident. At heavily used dive sites, diver-related sedimentation rates significantly decreased with increasing distance from the entrance, indicating poor buoyancy regulation in the initial phase of the dive. The results show a high negative impact of current SCUBA diving intensities on coral communities and coral condition. Corallivorous and herbivorous fishes are apparently not yet affected, but are endangered if the decline in coral cover continues. Reducing the number of dives per year, ecologically sustainable dive plans for individual sites, and reinforcing the environmental education of both dive guides and recreational divers are essential to conserve the ecological and aesthetic qualities of these dive sites.
Characterization Approaches to Place Invariant Sites on SI-Traceable Scales
NASA Technical Reports Server (NTRS)
Thome, Kurtis
2012-01-01
The effort to understand the Earth's climate system requires a complete integration of remote sensing imager data across time and multiple countries. Such an integration necessarily requires ensuring inter-consistency between multiple sensors to create the data sets needed to understand the climate system. Past efforts at inter-consistency have forced agreement between two sensors using sources that are viewed by both sensors at nearly the same time, and thus tend to be near polar regions over snow and ice. The current work describes a method that would provide an absolute radiometric calibration of a sensor rather than an inter-consistency of a sensor relative to another. The approach also relies on defensible error budgets that eventually provide a cross comparison of sensors without systematic errors. The basis of the technique is a model-based, SI-traceable prediction of at-sensor radiance over selected sites. The predicted radiance would be valid for arbitrary view and illumination angles and for any date of interest that is dominated by clear-sky conditions. The effort effectively works to characterize the sites as sources with known top-of-atmosphere radiance, allowing accurate intercomparison of sensor data without the need for coincident views. Data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Enhanced Thematic Mapper Plus (ETM+), and Moderate Resolution Imaging Spectroradiometer (MODIS) are used to demonstrate the difficulties of cross calibration as applied to current sensors. Special attention is given to the differences caused in the cross-comparison of sensors in radiance space as opposed to reflectance space. The radiance comparisons lead to significant differences created by the specific solar model used for each sensor. The paper also proposes methods to mitigate the largest error sources in future systems.
The results from these historical intercomparisons provide the basis for a set of recommendations to ensure SI-traceable cross calibration using future missions such as CLARREO and TRUTHS. The basis of the method is highly accurate measurements of at-sensor radiance of sufficient quality to understand the spectral and BRDF characteristics of the site, together with sufficient historical data to develop an understanding of temporal effects from changing surface and atmospheric conditions.
Balls, Michael; Clothier, Richard
2010-10-01
This response on behalf of FRAME to the European Commission's consultation on the five chapters of the Draft Report on Alternative (Non-animal) Methods for Cosmetics Testing: Current Status and Future Prospects--2010, is via a Comment in ATLA, rather than via the template supplied by the Commission. This is principally so that a number of general points about cosmetic ingredient testing can be made. It is concluded that the five draft chapters do not provide a credible basis for the Commission's forthcoming report to the European Parliament and the European Council on the five cosmetic ingredient safety issues for which the 7th Amendment to the Cosmetic Directive's ban on animal testing was postponed until 2013. This is mainly because there is insufficient focus in the draft chapters on the specific nature of cosmetic ingredients, their uses, their local effects and metabolism at their sites of application, and, in particular, on whether their possible absorption into the body would be likely to lead to their accumulation in target sites at levels approaching Thresholds of Toxicological Concern. Meanwhile, there continues to be uncertainty about how the provisions of the Cosmetics Directive should be applied, given the requirements of the REACH system and directives concerned with the safety of other chemicals and products. © 2010 FRAME.
Downscaling NASA Climatological Data to Produce Detailed Climate Zone Maps
NASA Technical Reports Server (NTRS)
Chandler, William S.; Hoell, James M.; Westberg, David J.; Whitlock, Charles H.; Zhang, Taiping; Stackhouse, P. W.
2011-01-01
The design of energy efficient sustainable buildings is heavily dependent on accurate long-term and near real-time local weather data. To varying degrees, the current meteorological networks over the globe have been used to provide these data, albeit often from sites far removed from the desired location. The national need is for access to weather and solar resource data accurate enough to develop preliminary building designs within a short proposal time limit, usually within 60 days. The NASA Prediction Of Worldwide Energy Resource (POWER) project was established by NASA to provide industry-friendly access to globally distributed solar and meteorological data. As a result, the POWER web site (power.larc.nasa.gov) now provides global information on many renewable energy parameters and several buildings-related items, but at a relatively coarse resolution. This paper describes a method of downscaling NASA atmospheric assimilation model results to higher resolution and mapping those parameters to produce building climate zone maps using estimates of temperature and precipitation. The distribution of climate zones for North America, with an emphasis on the Pacific Northwest, for just one year shows very good correspondence to the currently defined distribution. The method has the potential to provide a consistent procedure for deriving climate zone information on a global basis that can be assessed for variability and updated more regularly.
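The regridding step at the heart of downscaling can be sketched in its simplest form as bilinear interpolation of a coarse field onto a finer grid. The coarse "temperature" field and grid spacings below are invented; POWER's actual downscaling is more involved than this.

```python
# Sketch: downscaling a coarse gridded field to a finer grid by bilinear
# interpolation (illustrative 1-degree temperature field, not POWER data).
import numpy as np
from scipy.interpolate import RegularGridInterpolator

lat = np.array([30.0, 31.0, 32.0])
lon = np.array([-120.0, -119.0, -118.0])
coarse = np.array([[15.0, 16.0, 17.0],
                   [14.0, 15.0, 16.0],
                   [13.0, 14.0, 15.0]])   # deg C on a 1-degree grid

interp = RegularGridInterpolator((lat, lon), coarse, method="linear")

fine_lat = np.linspace(30.0, 32.0, 9)     # 0.25-degree spacing
fine_lon = np.linspace(-120.0, -118.0, 9)
pts = np.array([[la, lo] for la in fine_lat for lo in fine_lon])
fine = interp(pts).reshape(9, 9)          # downscaled field
```

The downscaled temperatures (together with precipitation) can then be thresholded into degree-day-based climate zone categories for mapping.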
Bone mineral density at different sites and vertebral fractures in Serbian postmenopausal women.
Ilic Stojanovic, O; Vuceljic, M; Lazovic, M; Gajic, M; Radosavljevic, N; Nikolic, D; Andjic, M; Spiroski, D; Vujovic, S
2017-02-01
This randomized study aimed to evaluate the correlation between bone mineral density (BMD) measured at different sites and the frequency of vertebral fractures in a group of Serbian postmenopausal women. BMD was measured in 130 treatment-naïve postmenopausal women by dual-energy X-ray absorptiometry (DXA) at the ultra-distal forearm, the hip, and the lumbar spine. At each measurement site, patients were categorized as osteoporotic, osteopenic, or within the reference range. Vertebral fractures were examined using thoracic and lumbar spine radiography. T-scores at the different skeletal sites showed site-specific discordance. Vertebral fractures were found in 58.82% of patients with hip osteopenia, in 45% with forearm osteopenia, and in 54.54% with lumbar spine osteoporosis. The study confirmed that the reduction of BMD depends on age and on the choice of measurement site. The best correlation was obtained in women with osteopenia at all measurement sites. Detection of vertebral fractures by lateral thoracic and lumbar spine radiography enables prompt treatment. BMD values within the reference range do not exclude vertebral fractures. Of the vertebral fractures, 72.5% were asymptomatic, and spine radiographs are therefore obligatory. The position of DXA-measured BMD as a method for detecting patients at risk of fracture is currently under discussion.
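The densitometric categories used above follow the standard WHO T-score convention: the T-score is the number of young-adult standard deviations by which a patient's BMD deviates from the young-adult mean, with T ≤ -2.5 defining osteoporosis and -2.5 < T < -1.0 defining osteopenia. A minimal sketch (the reference mean and SD values are hypothetical):

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score: deviation of measured BMD from the young-adult reference."""
    return (bmd - young_adult_mean) / young_adult_sd

def who_category(t):
    """WHO densitometric categories by T-score."""
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "reference range"

# Hypothetical lumbar-spine reference values (g/cm^2), for illustration only
t = t_score(bmd=0.85, young_adult_mean=1.05, young_adult_sd=0.11)
print(who_category(t))  # → osteopenia
```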
Rosenberger, Kurt J.; Noble, Marlene A.; Norris, Benjamin
2014-01-01
An array of seven moorings housing current meters and oceanographic sensors was deployed for six months at five sites on the continental shelf and slope off Newport Beach, California, from July 2011 to January 2012. Full water-column profiles of currents were acquired at all five sites, and a profile of water-column temperature was also acquired at two of the five sites for the duration of the deployment. In conjunction with this deployment, the Orange County Sanitation District deployed four bottom platforms with current meters on the San Pedro Shelf, and these meters provided water-column profiles of currents. The data from this program will provide the basis for an investigation of the interaction between the deep-water flow over the slope and the internal tide on the continental shelf.
The Focusing Optics X-ray Solar Imager (FOXSI): Instrument and First Flight
NASA Astrophysics Data System (ADS)
Glesener, Lindsay; Christe, S.; Ishikawa, S.; Ramsey, B.; Takahashi, T.; Saito, S.; Lin, R. P.; Krucker, S.; FOXSI Team
2013-04-01
Understanding electron acceleration in solar flares requires hard X-ray studies with greater sensitivity and dynamic range than are available with current solar hard X-ray observers (i.e. the RHESSI spacecraft). Both these capabilities can be advanced by the use of direct focusing optics instead of the indirect Fourier methods of current and previous generations. The Focusing Optics X-ray Solar Imager (FOXSI) sounding rocket payload demonstrates the feasibility and usefulness of hard X-ray focusing optics for solar observation. FOXSI flew for the first time on 2012 November 2, producing images and spectra of a microflare and performing a search for nonthermal X-rays from the quiet Sun. Such measurements are important for characterizing the impact of small "nanoflares" on the solar coronal heating problem. A spaceborne solar observer featuring similar optics could make detailed observations of hard X-rays from flare-accelerated electrons, identifying and characterizing particle acceleration sites and mapping out paths of energetic electrons as they leave these sites and propagate throughout the solar corona. Solar observations from NuSTAR are also expected to be an important step in this direction.
NASA Astrophysics Data System (ADS)
Wang, Zicheng; Wei, Renbo; Liu, Xiaobo
2017-01-01
Reduced graphene oxide/copper phthalocyanine nanocomposites are successfully prepared through a simple and effective two-step method involving preferential reduction of graphene oxide followed by self-assembly with copper phthalocyanine. Photographs, together with ultraviolet-visible spectroscopy, X-ray diffraction, X-ray photoelectron spectroscopy, and scanning electron microscopy, show that the in situ blending method effectively facilitates homogeneous dispersion of graphene sheets in the copper phthalocyanine matrix through π-π interactions. As a result, the reduction of graphene oxide and the restoration of the sp2 carbon sites in graphene effectively enhance the dielectric properties and alternating-current conductivity of copper phthalocyanine.
Docking and scoring in virtual screening for drug discovery: methods and applications.
Kitchen, Douglas B; Decornez, Hélène; Furr, John R; Bajorath, Jürgen
2004-11-01
Computational approaches that 'dock' small molecules into the structures of macromolecular targets and 'score' their potential complementarity to binding sites are widely used in hit identification and lead optimization. Indeed, there are now a number of drugs whose development was heavily influenced by or based on structure-based design and screening strategies, such as HIV protease inhibitors. Nevertheless, there remain significant challenges in the application of these approaches, in particular in relation to current scoring schemes. Here, we review key concepts and specific features of small-molecule-protein docking methods, highlight selected applications and discuss recent advances that aim to address the acknowledged limitations of established approaches.
Automated location detection of injection site for preclinical stereotactic neurosurgery procedure
NASA Astrophysics Data System (ADS)
Abbaszadeh, Shiva; Wu, Hemmings C. H.
2017-03-01
Currently, during stereotactic neurosurgery procedures, the manual task of locating the proper area for needle insertion or for implantation of an electrode, cannula, or optic fiber can be time consuming. The task requires that the insertion location be found quickly and accurately. In this study we investigate an automated method to locate the entry point of the region of interest. The method leverages a digital image capture system, pattern recognition, and motorized stages. Template matching of known, anatomically identifiable regions is used to find regions of interest (e.g. Bregma) in rodents. For our initial study, we tackle the problem of automatically detecting the entry point.
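The template-matching step described above can be sketched in a few lines: slide a small template over an image and return the offset that minimizes a match score. This toy version uses sum of squared differences on plain 2-D lists; a real pipeline would use normalized cross-correlation on camera frames, and the synthetic "landmark" below is invented for illustration.

```python
# Sketch of exhaustive template matching: find the offset where a small
# template best matches a larger image, using sum of squared differences.

def match_template(image, template):
    """Return (row, col) of the top-left corner of the best match."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic frame with a bright 2x2 landmark at row 2, column 3
frame = [[0] * 6 for _ in range(5)]
for i in range(2):
    for j in range(2):
        frame[2 + i][3 + j] = 9
print(match_template(frame, [[9, 9], [9, 9]]))  # → (2, 3)
```

Once the landmark (e.g. Bregma) is located in image coordinates, the motorized stages would translate that offset into a physical entry point.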
Fast generation of spin-squeezed states in bosonic Josephson junctions
NASA Astrophysics Data System (ADS)
Juliá-Díaz, B.; Torrontegui, E.; Martorell, J.; Muga, J. G.; Polls, A.
2012-12-01
We describe methods for the fast production of highly coherent-spin-squeezed many-body states in bosonic Josephson junctions. We start from the known mapping of the two-site Bose-Hubbard (BH) Hamiltonian to that of a single effective particle evolving according to a Schrödinger-like equation in Fock space. Since, for repulsive interactions, the effective potential in Fock space is nearly parabolic, we extend recently derived protocols for shortcuts to adiabatic evolution in harmonic potentials to the many-body BH Hamiltonian. A comparison with current experiments shows that our methods allow for an important reduction in the preparation times of highly squeezed spin states.
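The two-site Bose-Hubbard mapping referred to above can be written compactly. The form below is the standard textbook version (sign and normalization conventions vary between references), given here as a sketch of the structure rather than the authors' exact notation:

```latex
% Two-site Bose-Hubbard Hamiltonian (hopping J, on-site interaction U)
H = -J\left(\hat{a}_1^{\dagger}\hat{a}_2 + \hat{a}_2^{\dagger}\hat{a}_1\right)
    + \frac{U}{2}\sum_{i=1,2}\hat{n}_i\left(\hat{n}_i - 1\right)
% In the Fock basis |k\rangle, with population imbalance k = (n_1 - n_2)/2
% at fixed total atom number N, the amplitudes c_k obey a discrete
% Schr\"odinger-like equation whose diagonal (interaction) part is,
% up to a constant,
V(k) = U k^{2},
% i.e. nearly parabolic for repulsive interactions (U > 0), which is what
% permits the transfer of harmonic-trap shortcut protocols to this system.
```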
What is the story that soil tells us? Environmental and anthropogenic change
NASA Astrophysics Data System (ADS)
Shanskiy, Merrit; Kriiska, Aivar; Oras, Ester
2015-04-01
Archaeological studies have shown evidence of human impact on soil functioning. Conversely, changes to normal soil functioning influence human settlement in a given area. This study is part of a wider archaeological project on the environmental context of the Kohtla Iron Age sacrificial site. To obtain data about the soil cover around a historical find dating back some 1,500 years, dedicated sampling and research were carried out at the study site in Kohtla Vanaküla, northeastern Estonia, where a valuable collection of metal weapons and tools was discovered. The aim of the current study was to analyze the site-specific soils in order to establish connections between the soil record and human-mediated historical land degradation. The site-specific conditions were also studied in order to understand their impact on the archaeological artefacts and their preservation. Soil sampling was carried out in July 2014. The soils were described based on 20 soil pits. The site-specific soil morphological description was completed, and chemical analyses were performed at the laboratory of Soil Science and Agrochemistry, Estonian University of Life Sciences. Soil was air dried and passed through a 2-mm sieve. The chemical elements (P, K, Ca, Mg, Fe) were analyzed using Mehlich 3 extraction with MP-AES. Soil pH was measured from soil suspension in 1M KCl. Total C was analysed by the dry combustion method in a vario MAX CNS elemental analyser (ELEMENTAR, Germany), and organic C was determined with the same elemental analyzer. According to the World Reference Base for Soil Resources (WRB) classification system (FAO, 1998), the soils in the study area belong to the subgroup of Gley soils on yellowish-grey calcareous till. The soil cover of the study area shows strong anthropogenic influence due to different human activities. First, there are agricultural activities in the area: although the region is currently used as grassland for animal grazing, it is known to have been ploughed in the past, resulting in amelioration of this soil type. Second, as a result of surrounding oil-shale mining, the groundwater status has changed as well.
Wind-Driven Erosion and Exposure Potential at Mars 2020 Rover Candidate-Landing Sites.
Chojnacki, Matthew; Banks, Maria; Urso, Anna
2018-02-01
Aeolian processes have likely been the predominant geomorphic agent for most of Mars' history and have the potential to produce relatively young exposure ages for geologic units. Thus, identifying local evidence for aeolian erosion is highly relevant to the selection of landing sites for future missions, such as the Mars 2020 Rover mission that aims to explore astrobiologically relevant ancient environments. Here we investigate wind-driven activity at eight Mars 2020 candidate-landing sites to constrain erosion potential at these locations. To demonstrate our methods, we found that contemporary dune-derived abrasion rates were in agreement with rover-derived exhumation rates at Gale crater and could be employed elsewhere. The Holden crater candidate site was interpreted to have low contemporary erosion rates, based on the presence of a thick sand coverage of static ripples. Active ripples at the Eberswalde and southwest Melas sites may account for local erosion and the dearth of small craters. Moderate-flux regional dunes near Mawrth Vallis were deemed unrepresentative of the candidate site, which is interpreted to currently be experiencing low levels of erosion. The Nili Fossae site displayed the most unambiguous evidence for local sand transport and erosion, likely yielding relatively young exposure ages. The downselected Jezero crater and northeast Syrtis sites had high-flux neighboring dunes and exhibited substantial evidence for sediment pathways across their ellipses. Both sites had relatively high estimated abrasion rates, which would yield young exposure ages. The downselected Columbia Hills site lacked evidence for sand movement, and contemporary local erosion rates are estimated to be relatively low.
Development of a cyclic voltammetry method for the detection of Clostridium novyi in black disease.
Liu, L L; Jiang, D N; Xiang, G M; Liu, C; Yu, J C; Pu, X Y
2014-03-17
Black disease is an acute disease of sheep and cattle caused by the obligate anaerobe Clostridium novyi. Because anaerobic culturing is difficult in the field or at disaster sites, a simple, rapid, and sensitive detection method is required. In this study, an electrochemical method, cyclic voltammetry, based on loop-mediated isothermal amplification (LAMP) and electrochemical ion bonding with the positive dye methylene blue, was introduced. DNA extracted from C. novyi specimens was amplified by the LAMP reaction. The products were then combined with methylene blue, which led to a reduction in the oxidation peak current (ipA) and the reduction peak current (ipC) of the cyclic voltammogram. The changes in ipA/ipC were measured in real time by a specially designed electrode, allowing the DNA to be detected quantitatively. The results showed that this electrochemical detection of C. novyi could be completed in 1-2 h at bacterial concentrations as low as 10(2) colony-forming units/mL, with high accuracy (96.5%), sensitivity (96%), and specificity (97%) compared to the polymerase chain reaction. The cyclic voltammetry method is simple and fast, with high sensitivity and specificity, and has great potential as a molecular tool for rapid diagnosis of black disease.
Gentry, Amanda Elswick; Jackson-Cook, Colleen K; Lyon, Debra E; Archer, Kellie J
2015-01-01
The pathological description of the stage of a tumor is an important clinical designation and is considered, like many other forms of biomedical data, an ordinal outcome. Currently, statistical methods for predicting an ordinal outcome using clinical, demographic, and high-dimensional correlated features are lacking. In this paper, we propose a method that fits an ordinal response model to predict an ordinal outcome for high-dimensional covariate spaces. Our method penalizes some covariates (high-throughput genomic features) without penalizing others (such as demographic and/or clinical covariates). We demonstrate the application of our method to predict the stage of breast cancer. In our model, breast cancer subtype is a nonpenalized predictor, and CpG site methylation values from the Illumina Human Methylation 450K assay are penalized predictors. The method has been made available in the ordinalgmifs package in the R programming environment.
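The ordinal response model underlying approaches like the one above is typically the cumulative-logit (proportional-odds) form, P(Y ≤ k | x) = sigmoid(θ_k − xβ) with ordered cutpoints θ. The sketch below shows only this forward model, not the authors' penalized GMIFS fitting procedure; all numbers are invented for illustration.

```python
import math

# Cumulative-logit (proportional-odds) forward model for an ordinal
# outcome with K = len(cutpoints) + 1 levels (e.g. tumor stages).
# P(Y <= k | x) = sigmoid(theta_k - x.beta); category probabilities are
# successive differences of these cumulative probabilities.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def ordinal_probs(x, beta, cutpoints):
    """Category probabilities for an ordinal outcome."""
    eta = sum(xi * bi for xi, bi in zip(x, beta))      # linear predictor
    cum = [sigmoid(t - eta) for t in cutpoints] + [1.0]  # P(Y <= k)
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, len(cum))]

# Two covariates (say, one clinical, one methylation feature), 3 stages;
# beta and cutpoints are hypothetical fitted values
p = ordinal_probs(x=[1.0, 0.3], beta=[0.8, -1.2], cutpoints=[-0.5, 1.0])
print(p, sum(p))
```

In the penalized setting described above, the β entries for genomic features would be shrunk toward zero while clinical covariates remain unpenalized.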
2015-01-01
Molecular docking is a powerful tool used in drug discovery and structural biology for predicting the structures of ligand–receptor complexes. However, the accuracy of docking calculations can be limited by factors such as the neglect of protein reorganization in the scoring function; as a result, ligand screening can produce a high rate of false positive hits. Although absolute binding free energy methods still have difficulty in accurately rank-ordering binders, we believe that they can be fruitfully employed to distinguish binders from nonbinders and reduce the false positive rate. Here we study a set of ligands that dock favorably to a newly discovered, potentially allosteric site on the flap of HIV-1 protease. Fragment binding to this site stabilizes a closed form of protease, which could be exploited for the design of allosteric inhibitors. Twenty-three top-ranked protein–ligand complexes from AutoDock were subject to the free energy screening using two methods, the recently developed binding energy analysis method (BEDAM) and the standard double decoupling method (DDM). Free energy calculations correctly identified most of the false positives (≥83%) and recovered all the confirmed binders. The results show a gap averaging ≥3.7 kcal/mol, separating the binders and the false positives. We present a formula that decomposes the binding free energy into contributions from the receptor conformational macrostates, which provides insights into the roles of different binding modes. Our binding free energy component analysis further suggests that improving the treatment for the desolvation penalty associated with the unfulfilled polar groups could reduce the rate of false positive hits in docking. The current study demonstrates that the combination of docking with free energy methods can be very useful for more accurate ligand screening against valuable drug targets. PMID:25189630
NASA Astrophysics Data System (ADS)
Yu, Z.; Bedig, A.; Quigley, M.; Montalto, F. A.
2017-12-01
In-situ field monitoring can help to improve the design and management of decentralized Green Infrastructure (GI) systems in urban areas. Because of the vast quantity of continuous data generated by multi-site sensor systems, cost-effective post-construction opportunities for real-time control are limited, and the physical processes that influence the observed phenomena (e.g. soil moisture) are hard to track and control. To derive knowledge efficiently from real-time monitoring data, more efficient approaches to data quality control are needed. In this paper, we employ the dynamic time warping method to compare the similarity of two soil moisture patterns without ignoring the inherent autocorrelation. We also use a rule-based machine learning method to investigate the feasibility of detecting anomalous responses from soil moisture probes. The data were generated from both individual probes and clusters of probes deployed at a GI site in Milwaukee, WI. In contrast to traditional QAQC methods, which seek to detect outliers at individual time steps, the new method presented here converts the continuous time series into event-based symbolic sequences from which unusual response patterns can be detected. Different matching rules are developed for different seasons based on different physical characteristics. The results suggest that this method can be used to detect sensor failure, to identify extreme events, and to flag abnormal change patterns relative to intra-probe and inter-probe historical observations. Though this algorithm was developed for soil moisture probes, the same approach could easily be extended to improve QAQC efficiency for any continuous environmental dataset.
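The dynamic time warping comparison mentioned above can be sketched with the classic dynamic-programming recurrence; this is the textbook algorithm, not the authors' event-based variant, and the soil-moisture traces are invented for illustration.

```python
# Minimal dynamic time warping (DTW): align two sequences and return the
# cumulative alignment cost (0 means identical shape). D[i][j] holds the
# best cost of aligning a[:i+1] with b[:j+1].

def dtw_distance(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            cost = abs(a[i] - b[j])  # local cost: absolute difference
            if i == 0 and j == 0:
                D[i][j] = cost
            else:
                D[i][j] = cost + min(
                    D[i - 1][j] if i > 0 else inf,        # stretch a
                    D[i][j - 1] if j > 0 else inf,        # stretch b
                    D[i - 1][j - 1] if i and j else inf,  # step both
                )
    return D[n - 1][m - 1]

# A time-shifted copy of a wetting-event pattern still aligns cheaply,
# whereas a flat (stuck-sensor) trace does not.
event = [0.20, 0.35, 0.50, 0.40, 0.30]
shifted = [0.20, 0.20, 0.35, 0.50, 0.40]
stuck = [0.20] * 5
print(dtw_distance(event, shifted), dtw_distance(event, stuck))
```

This is what lets DTW judge two responses as similar even when one lags the other, instead of penalizing the lag point-by-point as a Euclidean comparison would.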
Likelihood of Bone Recurrence in Prior Sites of Metastasis in Patients With High-Risk Neuroblastoma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polishchuk, Alexei L.; Li, Richard; Hill-Kayser, Christine
2014-07-15
Purpose/Objectives: Despite recent improvements in outcomes, 40% of children with high-risk neuroblastoma will experience relapse, facing a guarded prognosis for long-term cure. Whether recurrences are at new sites or sites of original disease may guide decision making during initial therapy. Methods and Materials: Eligible patients were retrospectively identified from institutional databases at first metastatic relapse of high-risk neuroblastoma. Included patients had disease involving metaiodobenzylguanidine (MIBG)-avid metastatic sites at diagnosis and first relapse, achieved a complete or partial response with no more than one residual MIBG-avid site before first relapse, and received no total body irradiation or therapy with {sup 131}I-MIBG before first relapse. Anatomically defined metastatic sites were tracked from diagnosis through first relapse to determine the tendency of disease to recur at previously involved versus uninvolved sites and to assess whether this pattern was influenced by site irradiation. Results: Of 159 MIBG-avid metastatic sites identified among 43 patients at first relapse, 131 (82.4%) overlapped anatomically with the set of 525 sites present at diagnosis. This distribution was similar for bone sites, but patterns of relapse were more varied for the smaller subset of soft tissue metastases. Among all metastatic sites at diagnosis in our subsequently relapsed patient cohort, only 3 of 19 irradiated sites (15.8%) recurred, as compared with 128 of 506 (25.3%) unirradiated sites. Conclusions: Metastatic bone relapse in neuroblastoma usually occurs at anatomic sites of previous disease. Metastatic sites identified at diagnosis that did not receive radiation during frontline therapy appeared to have a higher risk of involvement at first relapse relative to previously irradiated metastatic sites. These observations support the current paradigm of irradiating metastases that persist after induction chemotherapy in high-risk patients. Furthermore, they raise the hypothesis that metastatic sites appearing to clear with induction chemotherapy may also benefit from radiotherapeutic treatment modalities (external beam radiation or {sup 131}I-MIBG).
TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuemann, J; Grassberger, C; Paganetti, H
2014-06-15
Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated, and the root mean square differences (RMSD), average range difference (ARD), and average distal dose degradation (ADD), the distance between the distal positions of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to generally be around 1-2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations, considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate, and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain). However, we recommend treatment plan verification using Monte Carlo simulations for patients with complex geometries.
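The range metrics compared above (R90, R50, and the R80-R20 falloff width) are extracted from a central-axis depth-dose curve by scanning from the distal end for the depth at which the dose falls through a given fraction of the maximum. A minimal sketch with linear interpolation; the function name and the toy curve are illustrative, not from the study.

```python
# Distal range of a depth-dose curve: depth at which dose falls through
# level_pct of the maximum, scanning from the distal end and
# interpolating linearly between samples.

def distal_range(depths, doses, level_pct):
    level = max(doses) * level_pct / 100.0
    for i in range(len(doses) - 2, -1, -1):
        if doses[i] >= level > doses[i + 1]:
            frac = (doses[i] - level) / (doses[i] - doses[i + 1])
            return depths[i] + frac * (depths[i + 1] - depths[i])
    raise ValueError("dose never crosses the requested level")

# Toy depth-dose curve (depth in cm, dose in %): plateau then linear falloff
depths = [0.0, 1.0, 2.0, 3.0, 4.0]
doses = [100.0, 100.0, 100.0, 50.0, 0.0]
r90 = distal_range(depths, doses, 90)
# distal falloff width R80-R20 characterizes dose degradation
r80_r20 = distal_range(depths, doses, 20) - distal_range(depths, doses, 80)
print(r90, r80_r20)
```

Comparing such metrics between the pencil-beam and Monte Carlo curves yields the range differences and ADD values reported above.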
Tsachaki, Maria; Birk, Julia; Egert, Aurélie; Odermatt, Alex
2015-07-01
Membrane proteins of the endoplasmic reticulum (ER) are involved in a wide array of essential cellular functions. Identification of the topology of membrane proteins can provide significant insight into their mechanisms of action and biological roles. This is particularly important for membrane enzymes, since their topology determines the subcellular site where a biochemical reaction takes place and the dependence on luminal or cytosolic co-factor pools and substrates. The methods currently available for the determination of topology of proteins are rather laborious and require post-lysis or post-fixation manipulation of cells. In this work, we have developed a simple method for defining intracellular localization and topology of ER membrane proteins in living cells, based on the fusion of the respective protein with redox-sensitive green-fluorescent protein (roGFP). We validated the method and demonstrated that roGFP fusion proteins constitute a reliable tool for the study of ER membrane protein topology, using as control microsomal 11β-hydroxysteroid dehydrogenase (11β-HSD) proteins whose topology has been resolved, and comparing with an independent approach. We then implemented this method to determine the membrane topology of six microsomal members of the 17β-hydroxysteroid dehydrogenase (17β-HSD) family. The results revealed a luminal orientation of the catalytic site for three enzymes, i.e. 17β-HSD6, 7 and 12. Knowledge of the intracellular location of the catalytic site of these enzymes will enable future studies on their biological functions and on the role of the luminal co-factor pool. Copyright © 2015 Elsevier B.V. All rights reserved.
Burton, Bethany L.; Ball, Lyndsay B.
2011-01-01
Red Devil Mine, located in southwestern Alaska near the Village of Red Devil, was the state's largest producer of mercury and operated from 1933 to 1971. Throughout the lifespan of the mine, various generations of mills and retort buildings existed on both sides of Red Devil Creek, and the tailings and waste rock were deposited across the site. The mine was located on public Bureau of Land Management property, and the Bureau has begun site remediation by addressing mercury, arsenic, and antimony contamination caused by the minerals associated with the ore deposit (cinnabar, stibnite, realgar, and orpiment). In August 2010, the U.S. Geological Survey completed a geophysical survey at the site using direct-current resistivity and electromagnetic induction surface methods. Eight two-dimensional profiles and one three-dimensional grid of direct-current resistivity data as well as about 5.7 kilometers of electromagnetic induction profile data were acquired across the site. On the basis of the geophysical data and few available soil borings, there is not sufficient electrical or electromagnetic contrast to confidently distinguish between tailings, waste rock, and weathered bedrock. A water table is interpreted along the two-dimensional direct-current resistivity profiles based on correlation with monitoring well water levels and a relatively consistent decrease in resistivity typically at 2-6 meters depth. Three settling ponds used in the last few years of mine operation to capture silt and sand from a flotation ore processing technique possessed conductive values above the interpreted water level but more resistive values below the water level. The cause of the increased resistivity below the water table is unknown, but the increased resistivity may indicate that a secondary mechanism is affecting the resistivity structure under these ponds if the depth of the ponds is expected to extend below the water level. 
The electromagnetic induction data clearly identified the three monofills and indicate, in conjunction with the three-dimensional resistivity data, additional possible landfill features on the north side of Red Devil Creek. No obvious shallow feature was identified as a possible source for a spring that is feeding into Red Devil Creek from the north bank. However, a discrete, nearly vertical conductive feature observed on the direct-current resistivity line that passes within 5 meters of the spring may be worth investigating. Additional deep soil borings that better differentiate between tailings, waste rock, and weathered bedrock may be very useful in more confidently identifying these rock types in the direct-current resistivity data.
Binding Site and Affinity Prediction of General Anesthetics to Protein Targets Using Docking
Liu, Renyu; Perez-Aguilar, Jose Manuel; Liang, David; Saven, Jeffery G.
2012-01-01
Background: The protein targets for general anesthetics remain unclear. A tool to predict anesthetic binding for potential binding targets is needed. In this study, we explore whether a computational method, AutoDock, could serve as such a tool. Methods: High-resolution crystal data of water-soluble proteins (cytochrome C, apoferritin, and human serum albumin) and a membrane protein (a pentameric ligand-gated ion channel from Gloeobacter violaceus, GLIC) were used. Isothermal titration calorimetry (ITC) experiments were performed to determine anesthetic affinity in solution conditions for apoferritin. Docking calculations were performed using DockingServer with the Lamarckian genetic algorithm and the Solis and Wets local search method (https://www.dockingserver.com/web). Twenty general anesthetics were docked into apoferritin. The predicted binding constants were compared with those obtained from ITC experiments for potential correlations. In the case of apoferritin, details of the binding sites and their interactions were compared with recent co-crystallization data. Docking calculations for six general anesthetics currently used in clinical settings (isoflurane, sevoflurane, desflurane, halothane, propofol, and etomidate) with known EC50 were also performed in all tested proteins. The binding constants derived from docking experiments were compared with the known EC50s and octanol/water partition coefficients of the six general anesthetics. Results: All 20 general anesthetics docked unambiguously into the anesthetic binding site identified in the crystal structure of apoferritin. The binding constants for the 20 anesthetics obtained from the docking calculations correlate significantly with those obtained from ITC experiments (p=0.04). In the case of GLIC, the anesthetic binding sites identified in the crystal structure are among the docking-predicted binding sites, but not the top-ranked site.
Docking calculations suggest a most probable binding site located in the extracellular domain of GLIC. The predicted affinities correlated significantly with the known EC50s for the six commonly used anesthetics in GLIC for the site identified in the experimental crystal data (p=0.006). However, predicted affinities in apoferritin, human serum albumin, and cytochrome C did not correlate with these six anesthetics’ known experimental EC50s. A weak correlation between the predicted affinities and the octanol/water partition coefficients was observed for the sites in GLIC. Conclusion We demonstrated that anesthetic binding sites and relative affinities can be predicted using docking calculations in an automatic docking server (Autodock) for both water soluble and membrane proteins. Correlation of predicted affinity and EC50 for six commonly used general anesthetics was only observed in GLIC, a member of a protein family relevant to anesthetic mechanism. PMID:22392968
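The abstract's central statistic is a correlation between docking-predicted and ITC-measured binding constants. A minimal sketch of that comparison: the snippet converts dissociation constants to binding free energies via ΔG = RT·ln(Kd) and computes their Pearson correlation. The per-anesthetic Kd values below are invented for illustration; the study's actual values are not given in the abstract.

```python
import math

R, T = 1.987e-3, 298.15  # gas constant (kcal/mol/K), temperature (K)

# Hypothetical dissociation constants (molar); illustrative only.
kd_docking = [1.2e-3, 4.5e-4, 8.0e-4, 2.0e-3, 6.0e-4]
kd_itc     = [1.5e-3, 6.0e-4, 7.0e-4, 2.5e-3, 5.0e-4]

# Convert each Kd to a binding free energy: dG = R*T*ln(Kd), in kcal/mol.
dg_dock = [R * T * math.log(k) for k in kd_docking]
dg_itc  = [R * T * math.log(k) for k in kd_itc]

def pearson(x, y):
    """Sample Pearson correlation coefficient (stdlib-only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

r = pearson(dg_dock, dg_itc)
```

A significance test (the abstract's p=0.04) would additionally require the t-statistic for r with n−2 degrees of freedom.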
MO-E-18C-01: Open Access Web-Based Peer-To-Peer Training and Education in Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pawlicki, T; Brown, D; Dunscombe, P
Purpose: Current training and education delivery models have limitations which result in gaps in clinical proficiency with equipment, procedures, and techniques. Educational and training opportunities offered by vendors and professional societies are by their nature not available at point of need or for the life of clinical systems. The objective of this work is to leverage modern communications technology to provide peer-to-peer training and education for radiotherapy professionals, in the clinic and on demand, as they undertake their clinical duties. Methods: We have developed a free-of-charge web site ( https://i.treatsafely.org ) using the Google App Engine and datastore (NDB, GQL), Python with AJAX-RPC, and Javascript. The site is a radiotherapy-specific hosting service to which user-created videos illustrating clinical or physics processes and other relevant educational material can be uploaded. Efficient navigation to the material of interest is provided through several RT-specific search tools, and videos can be scored by users, thus providing comprehensive peer review of the site content. The site also supports multilingual narration/translation of videos, a quiz function for competence assessment, and a library function allowing groups or institutions to define their standard operating procedures based on the video content. Results: The website went live in August 2013 and currently has over 680 registered users from 55 countries: 27.2% from the United States, 9.8% from India, 8.3% from the United Kingdom, 7.3% from Brazil, and 47.5% from other countries. The users include physicists (57.4%), oncologists (12.5%), therapists (8.2%), and dosimetrists (4.8%). There are 75 videos to date, in languages including English, Portuguese, Mandarin, and Thai. Conclusion: Based on the initial acceptance of the site, we conclude that this open access web-based peer-to-peer tool is fulfilling an important need in radiotherapy training and education. Site functionality should expand in the future to include document sharing and continuing education credits.
Feasibility of using backscattered muons for archeological imaging
NASA Astrophysics Data System (ADS)
Bonal, N.; Preston, L. A.
2013-12-01
Use of nondestructive methods to accurately locate and characterize underground objects such as rooms and tools found at archeological sites is ideal for preserving these historic sites. High-energy cosmic ray muons are very sensitive to density variation and have been used to image volcanoes and archeological sites such as the Egyptian and Mayan pyramids. Muons are subatomic particles produced in the upper atmosphere that penetrate the earth's crust up to a few kilometers. Their absorption rate depends on the density of the materials through which they pass. Measurements of muon flux rate in differing directions provide density variations of the materials between the muon source (cosmic rays and neutrino interactions) and the detector, much like a CAT scan. Currently, muon tomography can resolve features to the sub-meter scale, making it useful for this type of work. However, the muon detector must be placed below the target of interest. For imaging volcanoes, the upper portion is imaged when the detector is placed on the earth's surface at the volcano's base. For sites of interest beneath the ground surface, the muon detector would need to be placed below the site in a tunnel or borehole. Placing the detector underground can be costly and may disturb the historical site. We will assess the feasibility of imaging the subsurface using upward-traveling muons, to eliminate the current constraint of positioning the detector below the target. This work consists of three parts: 1) determine the backscattered flux rate from theory, 2) distinguish backscattered from forward-scattered muons at the detector, and 3) validate the theoretical results with field experimentation. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
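The inversion underlying muon radiography, inferring density from the attenuation of a known incident flux, can be sketched with a toy single-attenuation-length model. A real analysis works with the muon energy spectrum and range tables; the attenuation length below is an assumed, purely illustrative value.

```python
import math

# Toy attenuation model: transmitted flux decays with opacity X (g/cm^2)
# as I = I0 * exp(-X / L), where L is an assumed effective attenuation
# length. Illustrative only; real muon flux is not a single exponential.
L = 2.5e5  # g/cm^2 (assumed)

def column_density(i0, i):
    """Infer opacity (areal density, g/cm^2) from open-sky flux i0 and
    measured flux i through the target."""
    return L * math.log(i0 / i)

def mean_density(i0, i, path_cm):
    """Mean density (g/cm^3) along a ray of known path length."""
    return column_density(i0, i) / path_cm
```

For a 10 m path of rock at 2 g/cm³, the model recovers the density exactly from the simulated flux ratio, which is the consistency check a detector-geometry study would start from.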
Barth, Nancy A.; Veilleux, Andrea G.
2012-01-01
The U.S. Geological Survey (USGS) is currently updating at-site flood frequency estimates for USGS streamflow-gaging stations in the desert region of California. The at-site flood-frequency analysis is complicated by short record lengths (less than 20 years is common) and numerous zero flows/low outliers at many sites. Estimates of the three parameters (mean, standard deviation, and skew) required for fitting the log-Pearson Type 3 (LP3) distribution are likely to be highly unreliable based on the limited and heavily censored at-site data. In a generalization of the recommendations in Bulletin 17B, a regional analysis was used to develop regional estimates of all three parameters (mean, standard deviation, and skew) of the LP3 distribution. A regional skew value of zero from a previously published report was used with a new estimated mean squared error (MSE) of 0.20. A weighted least squares (WLS) regression method was used to develop both a regional standard deviation model and a regional mean model based on annual peak-discharge data for 33 USGS stations throughout California’s desert region. At-site standard deviation and mean values were determined by using an expected moments algorithm (EMA) method for fitting the LP3 distribution to the logarithms of annual peak-discharge data. Additionally, a multiple Grubbs-Beck (MGB) test, a generalization of the test recommended in Bulletin 17B, was used for detecting multiple potentially influential low outliers in a flood series. The WLS regression found that no basin characteristics could explain the variability of standard deviation. Consequently, a constant regional standard deviation model was selected, resulting in a log-space value of 0.91 with an MSE of 0.03 log units. However, drainage area was found to be statistically significant in explaining the site-to-site variability in mean. The linear WLS regional mean model based on drainage area had a pseudo-R² of 51 percent and an MSE of 0.32 log units.
The regional parameter estimates were then used to develop a set of equations for estimating flows with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for ungaged basins. The final equations are functions of drainage area. Average standard errors of prediction for these regression equations range from 214.2 to 856.2 percent.
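With a regional skew of zero, the LP3 frequency factor reduces to the standard normal quantile, so a quantile estimate follows directly from the abstract's regional parameters (log-space standard deviation 0.91). The log-mean below is a hypothetical site value; in the study it would come from the drainage-area regression.

```python
from statistics import NormalDist

# Regional LP3 parameters from the abstract: skew = 0 and a constant
# log10-space standard deviation of 0.91. With zero skew the LP3
# frequency factor K_p is just the standard normal quantile.
REG_STD = 0.91

def lp3_quantile(log_mean, aep, std=REG_STD, skew=0.0):
    """Flow with annual exceedance probability `aep` (same units as the
    peak-discharge data), for the zero-regional-skew case only."""
    if skew != 0.0:
        raise NotImplementedError("nonzero-skew frequency factor not sketched")
    k = NormalDist().inv_cdf(1.0 - aep)   # frequency factor for zero skew
    return 10 ** (log_mean + k * std)

# Hypothetical site with log10-mean 2.0 (median annual peak ~100 units):
q100 = lp3_quantile(2.0, 0.01)  # 1-percent AEP ("100-year") flow
```

The nonzero-skew case would replace the normal quantile with the Pearson Type 3 frequency factor (e.g., via the Wilson–Hilferty approximation), which is why the sketch guards against it.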
NASA Astrophysics Data System (ADS)
Bowles, Frederick A.; Vogt, Peter R.; Jung, Woo-Yeol
1998-05-01
Placing waste on the seafloor, with the intention that it remain in place and isolated from mankind, requires a knowledge of the environmental factors that may be applicable to a specific seafloor area. DBDB5 (Digital Bathymetric Database gridded at 5' latitude by 5' longitude cell dimension) is used here for regional assessments of seafloor depth, slope, and relief at five surrogate abyssal waste sites: two each in the western Atlantic and eastern Pacific, and one in the Gulf of Mexico. Only Pacific-1 exhibits a `high' slope (2°) by DBDB5 standards, whereas the remaining sites are located on almost level seafloor. Detailed examination of the sites using multibeam-based contour sheets shows the area around Atlantic-1 to be a featureless plain. Atlantic-2 and both Pacific sites are surrounded by abyssal hill topography, with local slopes ranging from greater than 6° at all sites to above 15° at Pacific-2. Neither Pacific site features a seafloor as `flat' as at Atlantic-1 or at the Gulf of Mexico site. Locating waste sites on sedimented slopes could have serious consequences due to catastrophic slope failure and downslope displacement of waste by mass sediment-transport processes. Neither slumping nor sliding is perceived as a critical process affecting the surrogate sites because of their locations on negligibly sloping seafloors. However, debris flows and turbidity currents are capable of transporting large volumes of sediment for long distances over low gradients and, in the case of turbidity currents, at great speed. Dispersal of loose waste material by these processes is virtually assured, but less likely if the waste is bagged. The turbidity current problem is alleviated (but not eliminated) by locating waste sites on distal portions of abyssal plains. Both Pacific sites are surrounded by abyssal hills and, in the case of Pacific-2, far beyond the reach of land-derived turbidity currents.
Thin sediment cover and low rates of sedimentation have also resulted in highly stable slope (abyssal hill) deposits. Hence, the probability of locally derived, small-volume flows is low at these sites. Existing high sea levels have also resulted in a worldwide decrease in turbidity current activity relative to glacial times when sea levels were much lower.
Content and Accessibility of Shoulder and Elbow Fellowship Web Sites in the United States.
Young, Bradley L; Oladeji, Lasun O; Cichos, Kyle; Ponce, Brent
2016-01-01
Increasing numbers of training physicians are using the Internet to gather information about graduate medical education programs. The content and accessibility of web sites that provide this information have been demonstrated to influence applicants' decisions. Assessments of orthopedic fellowship web sites including sports medicine, pediatrics, hand and spine have found varying degrees of accessibility and material. The purpose of this study was to evaluate the accessibility and content of the American Shoulder and Elbow Surgeons (ASES) fellowship web sites (SEFWs). A complete list of ASES programs was obtained from a database on the ASES web site. The accessibility of each SEFWs was assessed by the existence of a functioning link found in the database and through Google®. Then, the following content areas of each SEFWs were evaluated: fellow education, faculty/previous fellow information, and recruitment. At the time of the study, 17 of the 28 (60.7%) ASES programs had web sites accessible through Google®, and only five (17.9%) had functioning links in the ASES database. Nine programs lacked a web site. Concerning web site content, the majority of SEFWs contained information regarding research opportunities, research requirements, case descriptions, meetings and conferences, teaching responsibilities, attending faculty, the application process, and a program description. Fewer than half of the SEFWs provided information regarding rotation schedules, current fellows, previous fellows, on-call expectations, journal clubs, medical school of current fellows, residency of current fellows, employment of previous fellows, current research, and previous research. A large portion of ASES fellowship programs lacked functioning web sites, and even fewer provided functioning links through the ASES database. Valuable information for potential applicants was largely inadequate across present SEFWs.
Prostate cancer invasion and metastasis: Insights from mining genomic data
Hudson, Bryan D.; Kulp, Kristen S.; Loots, Gabriela G.
2013-07-22
Prostate cancer (PCa) is the second most commonly diagnosed malignancy in men in the Western world and the second leading cause of cancer-related deaths among men worldwide. Although most cancers have the potential to metastasize under appropriate conditions, PCa favors the skeleton as a primary site of metastasis, suggesting that the bone microenvironment is conducive to its growth. PCa metastasis proceeds through a complex series of molecular events that include angiogenesis at the site of the original tumor, local migration within the primary site, intravasation into the blood stream, survival within the circulation, extravasation of the tumor cells to the target organ and colonization of those cells within the new site. In turn, each one of these steps involves a complicated chain of events that utilize multiple protein–protein interactions, protein signaling cascades and transcriptional changes. Despite the urgent need to improve current biomarkers for diagnosis, prognosis and drug resistance, advances have been slow. Global gene expression methods such as gene microarrays and RNA sequencing enable the study of thousands of genes simultaneously and allow scientists to examine molecular pathways of cancer pathogenesis. In this review, we summarize the current literature that explored high-throughput transcriptome analysis toward the advancement of biomarker discovery for PCa. Novel biomarkers are strongly needed to enable more accurate detection of PCa, improve prediction of tumor aggressiveness and facilitate the discovery of new therapeutic targets for tailored medicine. Promising molecular markers identified from gene expression profiling studies include HPN, CLU1, WT1, WNT5A, AURKA and SPARC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewy, Ann; Heim, Kenneth J.; McGonigal, Sean T.
A comparative groundwater hydrogeologic modeling analysis is presented herein to simulate potential contaminant migration pathways in a sole source aquifer in Nassau County, Long Island, New York. The source of contamination is related to historical operations at the Sylvania Corning Plant ('Site'), a 9.49-acre facility located at 70, 100 and 140 Cantiague Rock Road, Town of Oyster Bay, in the westernmost portion of Hicksville, Long Island. The Site had historically been utilized as a nuclear materials manufacturing facility (e.g., cores, slugs, and fuel elements) for reactors used in both research and electric power generation from the early 1950s until the late 1960s. The Site is contaminated with various volatile organic and inorganic compounds, as well as radionuclides. The major contaminants of concern at the Site are tetrachloroethene (PCE), trichloroethene (TCE), nickel, uranium, and thorium. These compounds are present in soil and groundwater underlying the Site and have migrated off-site. The Site is currently being investigated as part of the Formerly Utilized Sites Remedial Action Program (FUSRAP). The main objective of the current study is to simulate the complex hydrogeologic features in the region, such as numerous current and historic production well fields; large, localized recharge basins; and multiple aquifers, and to assess potential contaminant migration pathways originating from the Site. For this purpose, the focus of attention was given to the underlying Magothy formation, which has been impacted by the contaminants of concern. This aquifer provides more than 90% of the potable water supply in the region.
Nassau and Suffolk Counties jointly developed a three-dimensional regional groundwater flow model to help understand the factors affecting groundwater flow regime in the region, to determine adequate water supply for public consumption, to investigate salt water intrusion in localized areas, to evaluate the impacts of regional pumping activity, and to better understand the contaminant transport and fate mechanisms through the underlying aquifers. This regional model, developed for the N.Y. State Department of Environmental Conservation (NYSDEC) by Camp Dresser and McKee (CDM), uses the finite element model DYNFLOW developed by CDM, Cambridge, Massachusetts. The coarseness of the regional model, however, could not adequately capture the hydrogeologic heterogeneity of the aquifer. Specifically, the regional model did not adequately capture the interbedded nature of the Magothy aquifer and, as such, simulated particles tended to track down-gradient from the Site in relatively straight lines while the movement of groundwater in such a heterogeneous aquifer is expected to proceed along a more tortuous path. This paper presents a qualitative comparison of site-specific groundwater flow modeling results with results obtained from the regional model. In order to assess the potential contaminant migration pathways, a particle tracking method was employed. Available site-specific and regional hydraulic conductivity data measured in-situ with respect to depth and location were incorporated into the T-PROG module in GMS model to define statistical variation to better represent the actual stratigraphy and layer heterogeneity. The groundwater flow characteristics in the Magothy aquifer were simulated with the stochastic hydraulic conductivity variation as opposed to constant values as employed in the regional model. Contaminant sources and their exact locations have been fully delineated at the Site during the Remedial Investigation (RI) phase of the project. 
Contaminant migration pathways originating from these source locations at the Site are qualitatively traced within the sole source aquifer using particles introduced at the source locations. The contaminant transport mechanism modeled in the current study is based on pure advection (i.e., plug flow) and mechanical dispersion, while molecular diffusion effects are neglected due to the relatively high groundwater velocities encountered in the aquifer. In addition, the fate of contaminants is ignored in order to simulate the worst-case scenario, which treats the contaminants of concern as tracer-like compounds for modeling purposes. The results of the modeling analysis are qualitatively compared with the County's regional model, and patterns of contaminant migration in the region are presented.
Peavey, Mary C; Reynolds, Corey L; Szwarc, Maria M; Gibbons, William E; Valdes, Cecilia T; DeMayo, Francesco J; Lydon, John P
2017-10-24
High-frequency ultrasonography (HFUS) is a common method to non-invasively monitor the real-time development of the human fetus in utero. The mouse is routinely used as an in vivo model to study embryo implantation and pregnancy progression. Unfortunately, such murine studies require pregnancy interruption to enable follow-up phenotypic analysis. To address this issue, we used three-dimensional (3-D) reconstruction of HFUS imaging data for early detection and characterization of murine embryo implantation sites and their individual developmental progression in utero. Combining HFUS imaging with 3-D reconstruction and modeling, we were able to accurately quantify embryo implantation site number as well as monitor developmental progression in pregnant C57BL6J/129S mice from 5.5 days post coitus (d.p.c.) through to 9.5 d.p.c. with the use of a transducer. Measurements included: number, location, and volume of implantation sites as well as inter-implantation site spacing; embryo viability was assessed by cardiac activity monitoring. In the immediate post-implantation period (5.5 to 8.5 d.p.c.), 3-D reconstruction of the gravid uterus in both mesh and solid overlay format enabled visual representation of the developing pregnancies within each uterine horn. As genetically engineered mice continue to be used to characterize female reproductive phenotypes derived from uterine dysfunction, this method offers a new approach to detect, quantify, and characterize early implantation events in vivo. This novel use of 3-D HFUS imaging demonstrates the ability to successfully detect, visualize, and characterize embryo-implantation sites during early murine pregnancy in a non-invasive manner. The technology offers a significant improvement over current methods, which rely on the interruption of pregnancies for gross tissue and histopathologic characterization. 
Here we use a video and text format to describe how to successfully perform ultrasounds of early murine pregnancy to generate reliable and reproducible data with reconstruction of the uterine form in mesh and solid 3-D images.
The Global Space Geodesy Network and the Essential Role of Latin America Sites
NASA Astrophysics Data System (ADS)
Pearlman, M. R.; Ma, C.; Neilan, R.; Noll, C. E.; Pavlis, E. C.; Wetzel, S.
2013-05-01
The improvements in the reference frame and other space geodesy data products spelled out in the GGOS 2020 plan will evolve over time as new space geodesy sites enhance the global distribution of the network and new technologies are implemented at current and new sites, thus enabling improved data processing and analysis. The goal of 30 globally distributed core sites with VLBI, SLR, GNSS and DORIS (where available) will take time to materialize. Co-location sites with less than the full core complement will continue to play a very important role in filling out the network while it is evolving and even after full implementation. GGOS, through its Call for Participation, bi-lateral and multi-lateral discussions, and work through the scientific Services, has been encouraging current groups to upgrade and new groups to join the activity. This talk will give an update on the current expansion of the global network and the projection for the network configuration that we forecast over the next 10 years based on discussions and planning that have already occurred. We will also discuss some of the historical contributions to the reference frame from sites in Latin America and the need for new sites in the future.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-16
..., Automotive Holding Group, Instrument Cluster Plant, Currently Known as General Motors Corporation, Including... Corporation, Automotive Holding Group, Instrument Cluster Plant, including on-site leased workers from... Material Management working on-site at Delphi Corporation, Automotive Holding Group, Instrument Cluster...
Direct push injection logging for high resolution characterization of low permeability zones
NASA Astrophysics Data System (ADS)
Liu, G.; Knobbe, S.; Butler, J. J., Jr.; Reboulet, E. C.; Borden, R. C.; Bohling, G.
2017-12-01
One of the grand challenges for groundwater protection and contaminated site remediation efforts is dealing with the slow, yet persistent, release of contaminants from low permeability zones. In zones of higher permeability, groundwater flow is relatively fast and contaminant transport can be influenced more effectively by treatment activities. In the low permeability zones, however, groundwater flow and contaminant transport are slow and thus become largely insensitive to many in-situ treatment efforts. Clearly, for sites with low permeability zones, accurate depiction of the mass exchange between the low and higher permeability zones is critical for designing successful groundwater protection and remediation systems, which requires information such as the hydraulic conductivity (K) and porosity of the subsurface. The current generation of field methods was developed primarily for relatively permeable zones, and little work has been undertaken on characterizing zones of low permeability. For example, the direct push injection logging (DPIL) approach (e.g., the Hydraulic Profiling Tool by Geoprobe) is commonly used for high resolution estimation of K over a range of 0.03 to 23 m/d. When K is below 0.03 m/d, the pressure responses from the current DPIL are generally too high for both the formation (potential formation alteration at high pressure) and the measuring device (pressure exceeding the upper sensor limit). In this work, we modified the current DPIL tool by adding a low-flow pump and flowmeter so that injection logging can be performed at much reduced flow rates when K is low. Numerical simulations showed that the reduction in injection rates (from 250 to 1 mL/min) allowed pressures to remain measurable even when K was as low as 0.001 m/d. They also indicated that as K decreased, the pore water pressure increase induced by probe advancement had a more significant impact on DPIL results.
A new field DPIL profiling procedure was developed for reducing that impact. Our preliminary test results in both the lab and at a field site have demonstrated the promise of the modified DPIL approach as a practical method for characterizing low permeability zones.
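The logic of the modification can be sketched from the Darcy-type proportionality that injection logging exploits: at a fixed tool geometry, apparent K scales roughly with injection rate over induced pressure, so cutting the rate keeps the pressure response within sensor range at low K. The calibration constant below is invented for illustration and is not Geoprobe's.

```python
# Simplified Darcy-type interpretation of injection logging: for a given
# tool geometry, apparent K is roughly proportional to injection rate
# over the induced pressure. C is a hypothetical calibration factor
# (units chosen so the result is in m/d); real DPIL interpretation uses
# an empirically calibrated relation, not this constant.
C = 0.05  # assumed tool/geometry calibration factor

def estimate_k(q_ml_per_min, dp_kpa):
    """Apparent hydraulic conductivity (m/d) from injection rate (mL/min)
    and induced pressure response (kPa)."""
    return C * q_ml_per_min / dp_kpa
```

With this proportionality, dropping the rate from 250 to 1 mL/min lowers the pressure needed to register a given low K by the same factor of 250, which is the point of adding the low-flow pump.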
Pyne, Jeffrey M.; Fortney, John C.; Mouden, Sip; Lu, Liya; Hudson, Teresa J; Mittal, Dinesh
2018-01-01
Objective Collaborative care for depression is effective and cost-effective in primary care settings. However, there is minimal evidence to inform the choice of on-site versus off-site models. This study examined the cost-effectiveness of on-site practice-based collaborative care (PBCC) versus off-site telemedicine-based collaborative care (TBCC) for depression in Federally Qualified Health Centers (FQHCs). Methods Multi-site randomized pragmatic comparative cost-effectiveness trial. 19,285 patients were screened for depression, 14.8% (n=2,863) screened positive (PHQ9 ≥10) and 364 were enrolled. Telephone interview data were collected at baseline, 6-, 12-, and 18-months. Base case analysis used Arkansas FQHC healthcare costs and secondary analysis used national cost estimates. Effectiveness measures were depression-free days and quality-adjusted life years (QALYs) derived from depression-free days, Medical Outcomes Study SF-12, and Quality of Well Being scale (QWB). Nonparametric bootstrap with replacement methods were used to generate an empirical joint distribution of incremental costs and QALYs and acceptability curves. Results Mean base case FQHC incremental cost-effectiveness ratio (ICER) using depression-free days was $10.78/depression-free day. Mean base case ICERs using QALYs ranged from $14,754/QALY (depression-free day QALY) to $37,261/QALY (QWB QALY). Mean secondary national ICER using depression-free days was $8.43/depression-free day and using QALYs ranged from $11,532/QALY (depression-free day QALY) to $29,234/QALY (QWB QALY). Conclusions These results support the cost-effectiveness of the TBCC intervention in medically underserved primary care settings. Results can inform the decision about whether to insource (make) or outsource (buy) depression care management in the FQHC setting within the current context of Patient-Centered Medical Home, value-based purchasing, and potential bundled payments for depression care. 
The www.clinicaltrials.gov # for this study is NCT00439452. PMID:25686811
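The abstract's headline numbers are incremental cost-effectiveness ratios (ICERs) with nonparametric bootstrap uncertainty. A minimal sketch of both calculations, using stdlib Python and invented (cost, effect) data rather than the trial's:

```python
import random

def icer(cost_a, eff_a, cost_b, eff_b):
    """Incremental cost-effectiveness ratio of intervention B vs. A:
    (extra cost) / (extra effectiveness)."""
    return (cost_b - cost_a) / (eff_b - eff_a)

def bootstrap_icers(pairs_a, pairs_b, n=1000, seed=0):
    """Nonparametric bootstrap with replacement over per-patient
    (cost, effect) pairs, yielding an empirical ICER distribution."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        sa = [rng.choice(pairs_a) for _ in pairs_a]
        sb = [rng.choice(pairs_b) for _ in pairs_b]
        ca = sum(c for c, _ in sa) / len(sa)
        ea = sum(e for _, e in sa) / len(sa)
        cb = sum(c for c, _ in sb) / len(sb)
        eb = sum(e for _, e in sb) / len(sb)
        out.append(icer(ca, ea, cb, eb))
    return out
```

An ICER of $10.78 per depression-free day means the TBCC arm cost $10.78 more per additional depression-free day gained; acceptability curves come from the same bootstrap distribution by counting the fraction of resamples cost-effective at each willingness-to-pay threshold. (The sketch assumes the arms' mean effects differ in every resample; degenerate resamples would need guarding in practice.)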
DDT Analysis of Wetland Sediments in Upper Escambia Bay, Florida
NASA Astrophysics Data System (ADS)
Hopko, M. N.; Wright, J.; Liebens, J.; Vaughan, P.
2017-12-01
Dichlorodiphenyltrichloroethane (DDT) was a commonly used pesticide from World War II through the 1960s, applied to control mosquito populations and as an agricultural insecticide. The pesticide and its degradation products (DDD and DDE) can bioaccumulate within ecosystems, with negative implications for animal and human health. Consequently, DDT usage was banned in the United States in 1973. In a contaminant study performed in Escambia Bay, Florida, in 2009, DDT was present in 25% of study sites, most of which were located in the upper bay wetlands. Concentrations were well above the Florida Department of Environmental Protection's (FDEP) Probable Effect Level (PEL), and ratios of DDT and its metabolites indicated a recent introduction to the system. A follow-up study performed in 2016 found no DDT, but did detect DDE at several sites. The current study repeated sampling in May 2017 at sites from the 2009 and 2016 studies. Sediment samples were collected in triplicate using a ponar sampler, and DDT, DDD and DDE were extracted using EPA methods 3540c and 3620c. Extracts were analyzed using a gas chromatograph with electron capture detection (GC-ECD) as per EPA method 8081c. Sediment was also analyzed for organic carbon and particle size using an elemental NC analyzer and a laser diffraction particle sizer. Results show the presence of the breakdown products DDE and DDD at multiple sites, but no detectable levels of DDT at any site. Sampling sites with high levels of DDT contamination in 2009 show only breakdown products in both 2016 and 2017. Particle size has little influence on DDD or DDE concentrations, but OC is a controlling factor, as indicated for contaminated sites by Pearson correlations between OC and DDE and DDD of 0.82 and 0.92, respectively. The presence of only DDD and/or DDE in the 2016 and 2017 studies indicates that the parent compound, DDT, has not been re-introduced into the watershed since 2009 but is degrading in the environment.
NASA Astrophysics Data System (ADS)
Sellers, Michael; Lisal, Martin; Schweigert, Igor; Larentzos, James; Brennan, John
2015-06-01
In discrete particle simulations, when an atomistic model is coarse-grained, a trade-off is made: a boost in computational speed for a reduction in accuracy. Dissipative Particle Dynamics (DPD) methods help to recover accuracy in viscous and thermal properties, while giving back a small amount of computational speed. One of the most notable extensions of DPD has been the introduction of chemical reactivity, called DPD-RX. Today, pairing the current evolution of DPD-RX with a coarse-grained potential and its chemical decomposition reactions allows for the simulation of the shock behavior of energetic materials at timescales faster than an atomistic counterpart. In 2007, Maillet et al. introduced implicit chemical reactivity in DPD through the concept of particle reactors and simulated the decomposition of liquid nitromethane. We have recently extended the DPD-RX method and have applied it to solid hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) under shock conditions using a recently developed single-site coarse-grain model and a reduced RDX decomposition mechanism. A description of the methods used to simulate RDX and its transition to hot product gases within DPD-RX will be presented. Additionally, examples of the effect of microstructure on shock behavior will be shown. Approved for public release. Distribution is unlimited.
A review of combinations of electrokinetic applications.
Moghadam, Mohamad Jamali; Moayedi, Hossein; Sadeghi, Masoud Mirmohamad; Hajiannia, Alborz
2016-12-01
Anthropogenic activities contaminate many lands and underground waters with dangerous materials. Although polluted soils occupy small parts of the land, the risk they pose to plants, animals, humans, and groundwater is high. Remediation technologies have been used for many years to mitigate pollution or remove pollutants from soils. However, remediation performance suffers in complex site conditions, such as the low permeability and complex composition of some clays or heterogeneous subsurface conditions. Electrokinetic remediation is an effective method in which electrodes are embedded in polluted soil, usually vertically but in some cases horizontally, and a low direct-current voltage gradient is applied between the electrodes. The electric gradient initiates movement of contaminants by electromigration (movement of charged chemicals), electro-osmosis (movement of fluid), electrolysis (chemical reactions due to the electric field), and diffusion. However, sites that are contaminated with heavy metals or mixed contaminants (e.g. a combination of organic compounds with heavy metals and/or radionuclides) are difficult to remediate. No single technology achieves the best results, but combining electrokinetics with other remediation methods, such as bioremediation and geosynthetics, promises to be the most effective approach so far. This review focuses on the factors that affect electrokinetic remediation and on the state-of-the-art methods that can be combined with it.
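For a rough sense of the transport mechanisms listed above, first-order migration rates can be estimated as mobility times voltage gradient; the parameter values below are assumed order-of-magnitude figures for illustration, not taken from the review:

```python
# Illustrative first-order estimate of electrokinetic transport rates.
# All parameter values are assumed order-of-magnitude figures.

def transport_velocity(mobility_m2_per_Vs, field_V_per_m):
    """Migration velocity = mobility x electric field (plug-flow idealization)."""
    return mobility_m2_per_Vs * field_V_per_m

E = 100.0      # applied voltage gradient, V/m (i.e. 1 V/cm, a common lab value)
u_ion = 5e-8   # assumed effective ionic mobility in soil, m^2/(V s)
k_eo = 1e-9    # assumed electro-osmotic permeability, m^2/(V s)

v_em = transport_velocity(u_ion, E)   # electromigration of a charged ion
v_eo = transport_velocity(k_eo, E)    # electro-osmotic pore-fluid advection

# Time for an ion to traverse 1 m of soil by electromigration alone
days_per_metre = 1.0 / v_em / 86400.0
```

Under these assumed values electromigration dominates electro-osmotic advection, which is consistent with the general observation that charged species respond most strongly to the applied field.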
Aquifer Characterization from Surface Geo-electrical Method, western coast of Maharashtra, India
NASA Astrophysics Data System (ADS)
DAS, A.; Maiti, D. S.
2017-12-01
Knowledge of aquifer parameters is necessary for managing groundwater resources. These parameters are conventionally evaluated through pumping tests carried out on bore wells. However, it is expensive and time consuming to conduct pumping tests at many sites, and it is sometimes difficult to find a bore hole at every required site. Therefore, an alternative approach is put forward in which the aquifer parameters are evaluated from a surface geophysical method. In this approach, vertical electrical soundings (VES) with the Schlumberger configuration were carried out at 85 stations over Sindhudurg district. Sindhudurg district is located in the Konkan region of Maharashtra state, India, between north latitudes 15°37' and 16°40' and east longitudes 73°19' and 74°13'. The area consists of hard rock and faces acute groundwater problems. In this configuration, a maximum current-electrode spacing of 200 m was used for every sounding. The geo-electrical sounding data (true resistivity and thickness) were interpreted through a resistivity inversion approach, from which the aquifer variables (Dar-Zarrouk (D-Z) parameters, mean resistivity, hydraulic conductivity, transmissivity, and coefficient of anisotropy) were calculated using empirical formulae. Cross-correlation analysis was performed between these parameters, which was ultimately used to characterize the aquifer over the study area. Finally, contour plots of these aquifer parameters were generated, revealing their detailed distribution throughout the study area. From the contour plots, high values of longitudinal conductance, hydraulic conductivity, and transmissivity are demarcated over the Kelus, Vengurle, Mochemar, and Shiroda villages. This may be due to intrusion of saline water from the Arabian Sea.
From the contour trends, the aquifers are characterized so that the groundwater resources of western Maharashtra can be assessed and managed properly. The method presented here, which includes DC resistivity inversion, could be applied further to hydrological characterization in complex coastal parts of India.
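The Dar-Zarrouk parameters and the coefficient of anisotropy follow directly from the inverted layer thicknesses and resistivities via standard formulae; a minimal sketch (the three-layer model below is hypothetical, not from the survey):

```python
import math

def dar_zarrouk(thicknesses_m, resistivities_ohm_m):
    """Dar-Zarrouk parameters for a layered earth from VES interpretation.

    S   = sum(h_i / rho_i)    longitudinal conductance (siemens)
    T   = sum(h_i * rho_i)    transverse resistance (ohm m^2)
    lam = sqrt(T * S) / H     coefficient of anisotropy (dimensionless, >= 1)
    """
    H = sum(thicknesses_m)
    S = sum(h / r for h, r in zip(thicknesses_m, resistivities_ohm_m))
    T = sum(h * r for h, r in zip(thicknesses_m, resistivities_ohm_m))
    lam = math.sqrt(T * S) / H
    rho_mean = math.sqrt(T / S)   # root-mean resistivity of the section
    return {"S": S, "T": T, "anisotropy": lam, "rho_mean": rho_mean}

# Hypothetical 3-layer model (thicknesses in m, resistivities in ohm m)
params = dar_zarrouk([2.0, 8.0, 20.0], [120.0, 40.0, 300.0])
```

High longitudinal conductance S, as mapped over the coastal villages, typically flags conductive (e.g. saline-water-intruded) sections, which is the interpretation the abstract offers.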
Jimmy Carter National Historic Site : transportation assistance group report
DOT National Transportation Integrated Search
2016-10-16
The Jimmy Carter National Historic Site (NHS), a National Park Service (NPS) site, in Plains, Georgia currently comprises four distinct sites associated with former President Jimmy Carter: the Boyhood Farm, where he was raised; the Plains Depot, whic...
Features of traffic and transit internet sites
DOT National Transportation Integrated Search
2000-02-01
This paper summarizes the current state of internet sites with respect to these features, first : considering whether sites with the features are available in metro areas, then comparing sites : developed by public and private sectors. In order to de...
msgbsR: An R package for analysing methylation-sensitive restriction enzyme sequencing data.
Mayne, Benjamin T; Leemaqz, Shalem Y; Buckberry, Sam; Rodriguez Lopez, Carlos M; Roberts, Claire T; Bianco-Miotto, Tina; Breen, James
2018-02-01
Genotyping-by-sequencing (GBS) or restriction-site associated DNA marker sequencing (RAD-seq) is a practical and cost-effective method for analysing large genomes from high-diversity species. This method of sequencing, coupled with methylation-sensitive enzymes (often referred to as methylation-sensitive restriction enzyme sequencing or MRE-seq), is an effective tool to study DNA methylation in parts of the genome that are inaccessible to other sequencing techniques or are not annotated in microarray technologies. Current software tools do not support all methylation-sensitive restriction sequencing assays for determining differences in DNA methylation between samples. To fill this computational need, we present msgbsR, an R package that contains tools for the analysis of methylation-sensitive restriction enzyme sequencing experiments. msgbsR can be used to identify and quantify read counts at methylated sites directly from alignment files (BAM files) and enables verification of restriction enzyme cut sites against the correct recognition sequence of the individual enzyme. In addition, msgbsR assesses DNA methylation based on read coverage, similar to RNA sequencing experiments, rather than methylation proportion, and is a useful tool for analysing differential methylation in large populations. The package is fully documented and freely available online as a Bioconductor package ( https://bioconductor.org/packages/release/bioc/html/msgbsR.html ).
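The cut-site verification step, checking that aligned reads actually begin at a genuine recognition sequence, can be illustrated in a few lines. This Python sketch mirrors the idea only; msgbsR itself is an R/Bioconductor package, and the enzyme, toy genome, and read positions below are invented:

```python
import re

def find_cut_sites(sequence, recognition="CCGG"):
    """Return 0-based start positions of a restriction enzyme's recognition
    sequence (CCGG here, as for the methylation-sensitive enzyme HpaII)."""
    return [m.start() for m in re.finditer(recognition, sequence.upper())]

def verify_read_start(read_start, cut_sites, tolerance=0):
    """Check that an aligned read begins at (or within `tolerance` of) a
    genuine cut site, analogous to the sanity check msgbsR performs on
    BAM alignments before counting reads."""
    return any(abs(read_start - s) <= tolerance for s in cut_sites)

genome = "ATCCGGTTACGCCGGATTTCCGGA"   # toy reference sequence
sites = find_cut_sites(genome)        # positions of CCGG in the toy genome
ok = verify_read_start(2, sites)      # a read starting at position 2 is valid
```

Reads that pass this check are then counted per site, giving the coverage-based methylation signal the package works with.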
SOIL AND SEDIMENT SAMPLING METHODS
The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER, NERL conducts research to improve its capability to more accurately, precisely, and efficiently characterize Superfund, RCRA, LUST, oil spill, and brownfield sites and to improve its risk-based decision-making capabilities. Among the many research programs and tasks being performed at ESD-LV, research is being conducted on improving soil and sediment sampling techniques and on improving the sampling and handling of volatile organic compound (VOC)-contaminated soils. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Current approaches and devices can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the device and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate the soil VOC concentrations and, therefore, to greatly underestimate the ecological
Hens, Bart; Hens, Luc
2017-12-21
Polychlorinated biphenyl (PCB)-contaminated sites around the world affect human health for many years, showing long latency periods of health effects. The impact of the different PCB congeners on human health should not be underestimated, as they are ubiquitous, stable molecules and reactive in biological tissues, leading to neurological, endocrine, genetic, and systemic adverse effects in the human body. Moreover, bioaccumulation of these compounds in fatty tissues of animals (e.g., fish and mammals) and in soils/sediments, results in chronic exposure to these substances. Efficient destruction methods are important to decontaminate polluted sites worldwide. This paper provides an in-depth overview of (i) the history and accidents with PCBs in the 20th century, (ii) the mechanisms that are responsible for the hazardous effects of PCBs, and (iii) the current policy regarding PCB control and decontamination. Contemporary impacts on human health of historical incidents are discussed next to an up to date overview of the health effects caused by PCBs and their mechanisms. Methods to decontaminate sites are reviewed. Steps which lead to a policy of banning the production and distribution of PCBs are overviewed in a context of preventing future accidents and harm to the environment and human health.
Crysalis: an integrated server for computational analysis and design of protein crystallization.
Wang, Huilin; Feng, Liubin; Zhang, Ziding; Webb, Geoffrey I; Lin, Donghai; Song, Jiangning
2016-02-24
The failure of multi-step experimental procedures to yield diffraction-quality crystals is a major bottleneck in protein structure determination. Accordingly, several bioinformatics methods have been successfully developed and employed to select crystallizable proteins. Unfortunately, the majority of existing in silico methods only allow the prediction of crystallization propensity, seldom enabling computational design of protein mutants that can be targeted for enhancing protein crystallizability. Here, we present Crysalis, an integrated crystallization analysis tool that builds on support-vector regression (SVR) models to facilitate computational protein crystallization prediction, analysis, and design. More specifically, the functionality of this new tool includes: (1) rapid selection of target crystallizable proteins at the proteome level, (2) identification of site non-optimality for protein crystallization and systematic analysis of all potential single-point mutations that might enhance protein crystallization propensity, and (3) annotation of target protein based on predicted structural properties. We applied the design mode of Crysalis to identify site non-optimality for protein crystallization on a proteome-scale, focusing on proteins currently classified as non-crystallizable. Our results revealed that site non-optimality is based on biases related to residues, predicted structures, physicochemical properties, and sequence loci, which provides in-depth understanding of the features influencing protein crystallization. Crysalis is freely available at http://nmrcen.xmu.edu.cn/crysalis/.
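The mutation-scanning idea in Crysalis's design mode (enumerating every single-point mutant and scoring each) can be sketched as follows. The scorer here is a toy surrogate built on the published Kyte-Doolittle hydropathy scale, not Crysalis's actual SVR models, and the sequence is invented:

```python
# Sketch of systematic single-point mutation scanning with a stand-in scorer.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Kyte-Doolittle hydropathy values (a standard published scale)
KD = {"A": 1.8, "C": 2.5, "D": -3.5, "E": -3.5, "F": 2.8, "G": -0.4,
      "H": -3.2, "I": 4.5, "K": -3.9, "L": 3.8, "M": 1.9, "N": -3.5,
      "P": -1.6, "Q": -3.5, "R": -4.5, "S": -0.8, "T": -0.7, "V": 4.2,
      "W": -0.9, "Y": -1.3}

def toy_propensity(seq):
    """Toy surrogate: score = negative mean hydropathy, so less hydrophobic
    sequences score higher (NOT the SVR model Crysalis actually uses)."""
    return -sum(KD[a] for a in seq) / len(seq)

def scan_single_mutants(seq):
    """Yield (position, original, mutant_residue, score) for every
    single-point mutant of `seq`."""
    for i, orig in enumerate(seq):
        for aa in AMINO_ACIDS:
            if aa != orig:
                mutant = seq[:i] + aa + seq[i + 1:]
                yield (i, orig, aa, toy_propensity(mutant))

seq = "MKVLIG"   # invented toy sequence
best = max(scan_single_mutants(seq), key=lambda t: t[3])
```

Replacing the scorer with a trained regression model turns this brute-force scan into exactly the kind of site non-optimality analysis the abstract describes.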
Xu, Shuman; Tang, Peng; Chen, Xianchun; Yang, Xi; Pan, Qinwen; Gui, Yu; Chen, Li
2016-01-01
Background: An important drawback of the traditional technique for harvesting the latissimus dorsi (LD) myocutaneous flap is a long, posterior donor-site incision. Current techniques involve endoscopic or robotic harvesting via a combined approach of open and closed surgery, which necessitates an open axillary incision and the use of special retractors. In this paper, we introduce a fully enclosed laparoscopic technique for harvesting the LD flap (LDF) using only 3 small trocar ports. This technique eliminates the need for axillary and donor-site incisions and specialized retractors and considerably reduces the incision size. Methods: We performed laparoscopic harvesting of the LDF with prosthesis implantation for immediate breast reconstruction (IBR) after nipple-sparing mastectomy in 2 patients with malignant breast neoplasms who wished to avoid a long scar on the back. Results: IBR using this technique was uneventful in both cases, without any donor-site complications or flap failure. Both patients were satisfied with the esthetic results of the procedure, especially the absence of a visible scar on the back. Conclusion: Enclosed laparoscopic harvesting of the LDF is simpler and less invasive than the traditional methods. These preliminary results warrant further evaluation in a larger population to validate the benefits of this technique. PMID:27861385
CATE 2016 Indonesia: Optics and Focus Strategy
NASA Astrophysics Data System (ADS)
McKay, M. A.; Jenson, L.; Kovac, S. A.; Bosh, R.; Mitchell, A. M.; Hare, H. S.; Watson, Z.; Penn, M. J.
2016-12-01
The 2017 solar eclipse, a natural phenomenon sweeping across the United States, will provide an excellent opportunity to observe and study the solar corona. The Citizen Continental-America Telescopic Eclipse (CATE) Experiment, directed by Matt Penn, intends to take advantage of this scientific opportunity by organizing 60 sites along the path of totality from Oregon to South Carolina to observe the eclipse and make a 90 min continuous video of the solar corona. A preliminary observation was made during the 2016 eclipse in Indonesia, with 5 sites along the path of totality. The sites were provided with an 80 mm diameter telescope with a 480 mm focal length and an extension tube, a Celestron equatorial mount, a CMOS camera, a Dell dual-processor computer running Windows, a GPS receiver, and an Arduino box; more details will be provided. I observed at the furthest east site, in Ternate, Indonesia, with Dr. Donald Walter. On the day of the eclipse we had clouds but still had a successful observation: 4 out of the 5 sites collected eclipse data, while the remaining site was unable to observe due to weather. The data were then collected and processed over the summer. To prepare for the 2017 observation, the 60 sites will be provided with the equipment, software, and training. The groups will then practice by doing solar and lunar observations, following an almost identical procedure to the one they will use for the eclipse. These tests will increase our chances of a successful observation at all sites. Focus will play a crucial role in obtaining high-quality images. Currently, a new focusing method using an image-derivative metric to provide quantitative feedback to the user is being developed. Finally, a graphical user interface is being developed, using the code produced from the summer 2016 data analysis, to process the images from each site with minimal effort and produce quality scientific images.
This work was made possible through the NSO Training for the 2017 Citizen CATE Experiment funded by NASA (NASA NNX16AB92A).
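The image-derivative focusing method presumably reduces each frame to a scalar sharpness score that the observer can maximize. A minimal sketch of such a gradient-energy metric (the implementation details are assumptions, not the CATE team's code):

```python
import numpy as np

def focus_metric(image):
    """Gradient-energy focus measure: mean squared first difference of
    pixel intensity along both axes. Sharper focus -> larger metric."""
    img = np.asarray(image, dtype=float)
    gx = np.diff(img, axis=1)   # horizontal intensity derivative
    gy = np.diff(img, axis=0)   # vertical intensity derivative
    return float((gx ** 2).mean() + (gy ** 2).mean())

# A sharp checkerboard scores higher than the same pattern blurred
sharp = np.indices((64, 64)).sum(axis=0) % 2 * 255.0
blurred = (sharp + np.roll(sharp, 1, axis=1)) / 2.0   # crude 2-pixel blur
```

In practice the observer would adjust the focuser while watching this number, stopping at its maximum; that is the "quantitative feedback" the abstract refers to.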
Sylvester, B.D.; Zammit, K.; Fong, A.J.; Sabiston, C.M.
2017-01-01
Background Cancer centre Web sites can be a useful tool for distributing information about the benefits of physical activity for breast cancer (bca) survivors, and they hold potential for supporting health behaviour change. However, the extent to which cancer centre Web sites use evidence-based behaviour change techniques to foster physical activity behaviour among bca survivors is currently unknown. The aim of our study was to evaluate the presentation of behaviour-change techniques on Canadian cancer centre Web sites to promote physical activity behaviour for bca survivors. Methods All Canadian cancer centre Web sites (n = 39) were evaluated by two raters using the Coventry, Aberdeen, and London–Refined (calo-re) taxonomy of behaviour change techniques and the eEurope 2002 Quality Criteria for Health Related Websites. Descriptive statistics were calculated. Results The most common behaviour change techniques used on Web sites were providing information about consequences in general (80%), suggesting goal-setting behaviour (56%), and planning social support or social change (46%). Overall, Canadian cancer centre Web sites presented an average of M = 6.31 behaviour change techniques (of 40 that were coded) to help bca survivors increase their physical activity behaviour. Evidence of quality factors ranged from 90% (sites that provided evidence of readability) to 0% (sites that provided an editorial policy). Conclusions Our results provide preliminary evidence that, of 40 behaviour-change techniques that were coded, fewer than 20% were used to promote physical activity behaviour to bca survivors on cancer centre Web sites, and that the most effective techniques were inconsistently used. On cancer centre Web sites, health promotion specialists could focus on emphasizing knowledge mobilization efforts using available research into behaviour-change techniques to help bca survivors increase their physical activity. PMID:29270056
NASA Astrophysics Data System (ADS)
Smeekens, Johanna M.; Chen, Weixuan; Wu, Ronghu
2015-04-01
Cell surface N-glycoproteins play extraordinarily important roles in cell-cell communication, cell-matrix interactions, and cellular response to environmental cues. Global analysis is exceptionally challenging because many N-glycoproteins are present at low abundances and effective separation is difficult to achieve. Here, we have developed a novel strategy integrating metabolic labeling, copper-free click chemistry, and mass spectrometry (MS)-based proteomics methods to analyze cell surface N-glycoproteins comprehensively and site-specifically. A sugar analog containing an azido group, N-azidoacetylgalactosamine, was fed to cells to label glycoproteins. Glycoproteins with the functional group on the cell surface were then bound to dibenzocyclooctyne-sulfo-biotin via copper-free click chemistry under physiological conditions. After protein extraction and digestion, glycopeptides with the biotin tag were enriched by NeutrAvidin conjugated beads. Enriched glycopeptides were deglycosylated with peptide-N-glycosidase F in heavy-oxygen water, and in the process of glycan removal, asparagine was converted to aspartic acid and tagged with 18O for MS analysis. With this strategy, 144 unique N-glycopeptides containing 152 N-glycosylation sites were identified in 110 proteins in HEK293T cells. As expected, 95% of identified glycoproteins were membrane proteins, which were highly enriched. Many sites were located on important receptors, transporters, and cluster of differentiation proteins. The experimental results demonstrated that the current method is very effective for the comprehensive and site-specific identification of the cell surface N-glycoproteome and can be extensively applied to other cell surface protein studies.
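N-glycosylation sites of the kind identified here occur at the consensus sequon N-X-S/T, where X is any residue except proline. A minimal sketch of scanning a sequence for candidate sequons (the toy sequence is invented; in the study, occupied sites were assigned via the 18O/deamidation mass tag, not sequence alone):

```python
import re

def candidate_glycosites(protein_seq):
    """Return 0-based positions of Asn residues in N-X(not P)-[S/T] sequons."""
    # a zero-width lookahead lets overlapping sequons (e.g. "NNSS") be caught
    return [m.start() for m in re.finditer(r"(?=N[^P][ST])", protein_seq)]

sites = candidate_glycosites("MANFTKNPSGNGSA")   # toy protein sequence
# the Asn at position 6 is skipped because the following residue is proline
```

Matching identified deamidated Asp positions back against such sequons is a common consistency check in site-specific glycoproteomics.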
Kanje, Sara; Hober, Sophia
2015-04-01
Antibodies are important molecules in many research fields, where they play a key role in various assays. Antibody labeling is therefore of great importance. Currently, most labeling techniques take advantage of certain amino acid side chains that commonly appear throughout proteins. This makes it hard to control the position and exact degree of labeling of each antibody. Hence, labeling of the antibody may affect the antibody-binding site. This paper presents a novel protein domain based on the IgG-binding domain C2 of streptococcal protein G, containing the unnatural amino acid BPA, that can cross-link other molecules. This novel domain can, with improved efficiency compared to previously reported similar domains, site-specifically cross-link to IgG at the Fc region. An efficient method for simultaneous in vivo incorporation of BPA and specific biotinylation in a flask cultivation of Escherichia coli is described. In comparison to a traditionally labeled antibody sample, the C2-labeled counterpart proved to have a higher proportion of functional antibodies when immobilized on a solid surface and the same limit of detection in an ELISA. This method of labeling is, due to its efficiency and simplicity, of high interest for all antibody-based assays where it is important that labeling does not interfere with the antibody-binding site. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Immediate Partial Breast Reconstruction with Endoscopic Latissimus Dorsi Muscle Flap Harvest
Yang, Chae Eun; Roh, Tai Suk; Yun, In Sik; Lew, Dae Hyun
2014-01-01
Background Currently, breast conservation therapy is commonly performed for the treatment of early breast cancer. Depending on the volume excised, patients may require volume replacement, even in cases of partial mastectomy. The use of the latissimus dorsi muscle is the standard method, but this procedure leaves an unfavorable scar on the donor site. We used an endoscope for latissimus dorsi harvesting to minimize the incision, thus reducing postoperative scarring. Methods Ten patients who underwent partial mastectomy and immediate partial breast reconstruction with endoscopic latissimus dorsi muscle flap harvest were reviewed retrospectively. The total operation time, hospital stay, and complications were reviewed. Postoperative scarring, overall shape of the reconstructed breast, and donor-site deformity were assessed using a 10-point scale. Results Over a mean follow-up of 11 weeks, no tumor recurrence was reported. The mean operation time was 294.5 (±38.2) minutes. The postoperative hospital stay was 11.4 days. Donor-site seroma was reported in four cases and managed by office aspiration and compressive dressing. Postoperative scarring, donor-site deformity, and the overall shape of the neobreast were acceptable, all scoring above 7. Conclusions Replacement of 20% to 40% of breast volume in the upper and lower outer quadrants with a latissimus dorsi muscle flap by using endoscopic harvesting is a good alternative reconstruction technique after partial mastectomy. The short incision yields a very acceptable postoperative scar, less pain, and early upper-extremity movement. PMID:25276643
Methods for accurate estimation of net discharge in a tidal channel
Simpson, M.R.; Bland, R.
2000-01-01
Accurate estimates of net residual discharge in tidally affected rivers and estuaries are possible because of recently developed ultrasonic discharge measurement techniques. Previous discharge estimates using conventional mechanical current meters and methods based on stage/discharge relations or water slope measurements often yielded errors that were as great as or greater than the computed residual discharge. Ultrasonic measurement methods consist of: 1) the use of ultrasonic instruments for the measurement of a representative 'index' velocity used for in situ estimation of mean water velocity and 2) the use of the acoustic Doppler current discharge measurement system to calibrate the index velocity measurement data. The methods used to calibrate (rate) the index velocity to the channel velocity measured using the Acoustic Doppler Current Profiler are the most critical factors affecting the accuracy of net discharge estimation. The index velocity first must be related to mean channel velocity and then used to calculate instantaneous channel discharge. Finally, discharge is low-pass filtered to remove the effects of the tides. An ultrasonic velocity meter discharge-measurement site in a tidally affected region of the Sacramento-San Joaquin Rivers was used to study the accuracy of the index velocity calibration procedure. Calibration data consisting of ultrasonic velocity meter index velocity and concurrent acoustic Doppler discharge measurement data were collected during three time periods. Two sets of data were collected during a spring tide (monthly maximum tidal current) and one set during a neap tide (monthly minimum tidal current). The relative magnitudes of instrumental errors, acoustic Doppler discharge measurement errors, and calibration errors were evaluated. Calibration error was found to be the most significant source of error in estimating net discharge.
Using a comprehensive calibration method, net discharge estimates developed from the three sets of calibration data differed by less than an average of 4 cubic meters per second, or less than 0.5% of a typical peak tidal discharge rate of 750 cubic meters per second.
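The two-step procedure described (index-velocity rating, then tidal filtering) can be sketched with synthetic data. The rating coefficients, cross-section area, noise level, and the simple 25-hour moving average standing in for a proper tidal low-pass filter are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 30 * 24, 0.25)                 # 30 days of 15-min samples (hours)
tide = 1.2 * np.sin(2 * np.pi * t / 12.42)      # M2 tidal velocity (m/s)
net = 0.15                                      # residual (net) velocity (m/s)
index_v = (net + tide) * 0.9 + rng.normal(0, 0.02, t.size)  # index-meter record

# (1) linear rating: mean channel velocity = a * index velocity + b,
# fit against concurrent "ADCP" measurements (here, the known truth)
channel_v_truth = net + tide
a, b = np.polyfit(index_v, channel_v_truth, 1)
channel_v = a * index_v + b

# (2) instantaneous discharge through an assumed fixed 500 m^2 section,
# then a 25-hour moving average to suppress the tidal signal
area = 500.0
q = channel_v * area
window = int(25 / 0.25)                          # 25 h of 15-min samples
net_q = np.convolve(q, np.ones(window) / window, mode="valid")
```

With these synthetic numbers the filtered series settles near the true net discharge (0.15 m/s x 500 m^2 = 75 m^3/s), while the raw discharge swings by hundreds of m^3/s over each tidal cycle, illustrating why calibration error, not tidal amplitude, dominates the net estimate.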
Lu, Yu Yu; Wang, Hsin Yi; Lin, Ying; Lin, Wan Yu
2012-09-01
Radionuclide Cisternography (RNC) is of potential value in pointing out the sites of cerebrospinal fluid (CSF) leakage in patients with spontaneous intracranial hypotension (SIH). In the current report, we present two patients who underwent RNC for suspected CSF leakage. Both patients underwent magnetic resonance imaging (MRI) and RNC for evaluation. We describe a simple method to increase the detection ability of RNC for CSF leakage in patients with SIH.
Methods for Intelligent Mapping of the IPV6 Address Space
2015-03-01
Distribution is unlimited. Due to the rapid growth of the Internet, the available pool of unique...
de la Cruz, Norberto B.; Spiece, Leslie J.
2000-01-01
Understanding and communicating the who, what, where, when, why, and how of the clinics and services for which the computerized patient record (CPR) will be built is an integral part of the implementation process. Formal methodologies have been developed to diagram information flow -- flow charts, state-transition diagrams (STDs), and data flow diagrams (DFDs). For documentation of the processes at our ambulatory CPR pilot site, flowcharting was selected as the preferred method based upon its versatility and understandability.
Özkan, Aysun
2013-02-01
Healthcare waste should be managed carefully because of its infectious, pathological, and other hazardous content, especially in developing countries. The management system applied must be the most appropriate solution from technical, environmental, economic, and social points of view. The main objective of this study was to analyse the current status of healthcare waste management in Turkey and to investigate the most appropriate treatment/disposal option by using different decision-making techniques. For this purpose, five healthcare waste treatment/disposal alternatives, including incineration, microwaving, on-site sterilization, off-site sterilization, and landfill, were evaluated according to two multi-criteria decision-making techniques: the analytic network process (ANP) and ELECTRE. In this context, benefits, costs, and risks of the alternatives were taken into consideration. Furthermore, the prioritization and ranking of the alternatives were determined and compared for both methods. According to the comparisons, off-site sterilization was found to be the most appropriate solution in both cases.
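For intuition only, the benefit/cost/risk trade-off can be illustrated with a plain weighted-sum ranking. This is a deliberate simplification, not the ANP or ELECTRE procedures the study actually used, and all scores and weights below are invented:

```python
# Toy weighted-sum ranking of the five alternatives; values are invented.
alternatives = ["incineration", "microwaving", "on-site sterilization",
                "off-site sterilization", "landfill"]

# per-alternative scores on [benefit, cost, risk], each normalized to 0-1
# with cost and risk already inverted so that higher is always better
scores = {
    "incineration":           [0.9, 0.3, 0.4],
    "microwaving":            [0.7, 0.6, 0.7],
    "on-site sterilization":  [0.6, 0.7, 0.6],
    "off-site sterilization": [0.8, 0.8, 0.8],
    "landfill":               [0.4, 0.9, 0.3],
}
weights = [0.4, 0.3, 0.3]   # assumed importance of benefit, cost, risk

def weighted_score(vals, w):
    """Simple additive aggregation of criterion scores."""
    return sum(v * wi for v, wi in zip(vals, w))

ranking = sorted(alternatives,
                 key=lambda alt: weighted_score(scores[alt], weights),
                 reverse=True)
```

ANP additionally models interdependence between criteria and ELECTRE uses pairwise outranking relations, but both ultimately produce a ranking of this same shape.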
Brigham, Mark E.; Payne, Gregory A.; Andrews, William J.; Abbott, Marvin M.
2002-01-01
The sampling network was evaluated with respect to areal coverage, sampling frequency, and analytical schedules. Areal coverage could be expanded to include one additional watershed that is not part of the current network. A new sampling site on the North Canadian River might be useful because of expanding urbanization west of the city, but sampling at some other sites could be discontinued or reduced based on comparisons of data between the sites. Additional real-time or periodic monitoring for dissolved oxygen may be useful to prevent anoxic conditions in pools behind new low-water dams. The sampling schedules, both monthly and quarterly, are adequate to evaluate trends, but additional sampling during flow extremes may be needed to quantify loads and evaluate water quality during flow extremes. Emerging water-quality issues may require sampling for volatile organic compounds, sulfide, total phosphorus, chlorophyll-a, Escherichia coli, and enterococci, as well as use of more sensitive laboratory analytical methods for determination of cadmium, mercury, lead, and silver.
Taiwan's underwater cultural heritage documentation management
NASA Astrophysics Data System (ADS)
Tung, Y.-Y.
2015-09-01
Taiwan has been an important maritime trading channel for many countries since ancient times. Numerous relics lie underwater due to weather, wars, and other factors. In 2006, the Bureau of Cultural Heritage (BOCH) entrusted the Underwater Archaeological Team of Academia Sinica with executing underwater archaeological investigation projects. To date, we have verified 78 underwater targets, all of which have been recognized as shipwreck sites, and a collection of 638 objects has been recovered from the various underwater archaeological sites. Those artefacts are distributed among different institutions and museums. Because very diverse management methods and systems are applied by each institution, underwater cultural heritage data such as surveys, excavation reports, and research are poorly organized and disseminated for use. For better communication regarding Taiwan's underwater cultural heritage at every level, a universal documentation format should be established. By comparing the existing checklist used in Taiwan with guidelines followed in other countries, a more complete and appropriate underwater cultural heritage condition documentation system can be established and adopted in Taiwan.
Combined Delivery of Consolidating Pulps to the Remote Sites of Deposits
NASA Astrophysics Data System (ADS)
Golik, V. I.; Efremenkov, A. B.
2017-07-01
The problems of modern mining production include the limited scope of application of environmental and resource-saving technologies that use consolidating pulps when developing sites of an ore field remote from the stowing complexes, which leads to a significant reduction in the performance indicators of underground mining of metallic ores. The experimental approach to this problem demonstrates the technological capability and efficiency of a combined vibration-pneumatic-gravity-flow method of pulp delivery over distances exceeding the capacity of current delivery methods, by studying the vibration phenomenon in a special-structure industrial pipeline. The results of the full-scale experiment confirm the theoretical calculations showing that consolidating stowing of common composition can be delivered over distances exceeding the capacity of the usual pneumatic-gravity-flow delivery method, owing to reduced friction-induced resistance of the consolidating stowing to movement along the pipeline. The interaction parameters of the consolidating stowing components improve during delivery via the pipeline, resulting in increased stowing strength; the completeness of subsurface use improves, land is saved for agricultural application, and environmental stress is relieved.
The Current Status and Tendency of China Millimeter Coordinate Frame Implementation and Maintenance
NASA Astrophysics Data System (ADS)
Cheng, P.; Cheng, Y.; Bei, J.
2017-12-01
China Geodetic Coordinate System 2000 (CGCS2000) was first officially declared the national standard coordinate system on July 1, 2008. This reference frame was defined in the ITRF97 frame at epoch 2000.0 and included 2600 GPS geodetic control points. The paper discusses differences between CGCS2000 and later ITRF versions, such as ITRF2014, in terms of technical implementation and maintenance. With the development of the BeiDou navigation satellite system (BDS), especially its third generation with future global signal coverage, and with progress in space geodetic technology, it is becoming possible to establish a global millimeter-level reference frame based on space geodetic techniques including BDS. Millimeter-level reference frame implementation concerns two factors: 1) estimation of geocenter motion variation, and 2) modeling of nonlinear site motion. In this paper, geocenter inversion methods are discussed and results derived from various techniques are compared. Our nonlinear site-motion modeling focuses on the singular spectrum analysis method, which has clear advantages over modeling of physical Earth effects. The work presented in this paper is expected to serve as a reference for future CGCS2000 maintenance.
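The singular spectrum analysis mentioned above can be sketched as a minimal, model-free trend extraction on a synthetic station coordinate time series. The window length, component count, and signal parameters below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def ssa_trend(series, window, n_components):
    """Singular spectrum analysis: extract a smooth (possibly nonlinear)
    signal from a 1-D time series via the trajectory-matrix SVD."""
    N = len(series)
    K = N - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns
    X = np.column_stack([series[i:i + window] for i in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Reconstruct from the leading components (low-frequency trend + annual term)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Diagonal (Hankel) averaging back to a 1-D series
    trend = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        trend[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return trend / counts

# Synthetic daily "site coordinate" (mm): nonlinear drift + annual signal + noise
t = np.arange(1000)
y = 0.01 * t + 2.0 * np.sin(2 * np.pi * t / 365.25) \
    + np.random.default_rng(0).normal(0.0, 0.5, t.size)
smooth = ssa_trend(y, window=365, n_components=4)
```

The appeal over physical-effect modeling is that no functional form (linear rate plus fixed annual/semiannual sinusoids) is imposed; the dominant modes are taken from the data itself.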
Combined Dust Detection Algorithm by Using MODIS Infrared Channels over East Asia
NASA Technical Reports Server (NTRS)
Park, Sang Seo; Kim, Jhoon; Lee, Jaehwa; Lee, Sukjo; Kim, Jeong Soo; Chang, Lim Seok; Ou, Steve
2014-01-01
A new dust detection algorithm is developed by combining the results of multiple dust detection methods using IR channels onboard the MODerate resolution Imaging Spectroradiometer (MODIS). The Brightness Temperature Difference (BTD) between two wavelength channels has been used widely in previous dust detection methods. However, BTD methods have limitations in identifying the offset values of the BTD needed to discriminate clear-sky areas. The current algorithm overcomes the disadvantages of previous dust detection methods by considering the Brightness Temperature Ratio (BTR) values of the dual wavelength channels with a 30-day composite, the optical properties of the dust particles, the variability of surface properties, and cloud contamination. As a result, the current algorithm shows improvements in detecting dust-loaded regions over land during daytime. Finally, the confidence index of the current dust algorithm is given for 10 × 10 pixel blocks of the MODIS observations. From January to June 2006, the results of the current algorithm agree within 64 to 81% with those found using the fine mode fraction (FMF) and aerosol index (AI) from MODIS and the Ozone Monitoring Instrument (OMI). The agreement between the results of the current algorithm and the OMI AI over non-polluted land, where errors due to anthropogenic aerosol are avoided, ranges from 60 to 67%. In addition, the developed algorithm shows statistically significant results at four AErosol RObotic NETwork (AERONET) sites in East Asia.
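The split-window BTD test that the combined algorithm builds on can be sketched on synthetic brightness temperatures. The channel values, clear-sky offset, and thresholds below are illustrative assumptions; the real algorithm operates on MODIS radiances and uses an actual 30-day composite:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic 11-um and 12-um brightness temperatures (K) for a 10 x 10 scene.
bt11 = 285.0 + rng.normal(0.0, 2.0, (10, 10))
clear_offset = 1.5                      # assumed clear-sky BT11 - BT12 split
bt12 = bt11 - clear_offset
bt12[2:5, 2:5] = bt11[2:5, 2:5] + 1.0   # dust reverses the split-window sign

# Classic BTD test: mineral dust makes BT(11 um) - BT(12 um) negative.
btd = bt11 - bt12
dust_mask = btd < 0.0                   # fixed offset -- the noted limitation

# BTR-style refinement: compare the channel ratio against a clear-sky
# reference (a stand-in here for the paper's 30-day composite).
btr = bt11 / bt12
btr_clear = (bt11 / (bt11 - clear_offset)).mean()
dust_mask_btr = btr < btr_clear - 0.003
```

The fixed-offset BTD threshold is what varies with surface emissivity and water vapor; normalizing by a per-pixel clear-sky reference is the kind of refinement the abstract describes.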
Ocampo, Wrechelle; Geransar, Rose; Clayden, Nancy; Jones, Jessica; de Grood, Jill; Joffe, Mark; Taylor, Geoffrey; Missaghi, Bayan; Pearce, Craig; Ghali, William; Conly, John
2017-10-01
Ward closure is a method of controlling hospital-acquired infectious disease outbreaks and is often coupled with other practices. However, the value and efficacy of ward closures remain uncertain. Our objective was to understand current practices and perceptions with respect to ward closure for hospital-acquired infectious disease outbreaks in acute care hospital settings across Canada. A Web-based environmental scan survey was developed by a team of infection prevention and control (IPC) experts and distributed to 235 IPC professionals at acute care sites across Canada. Data were analyzed using a mixed-methods approach of descriptive statistics and thematic analysis. A total of 110 completed responses showed that 70% of sites reported at least 1 outbreak during 2013; 44% of these sites reported the use of ward closure. Ward closure was considered an "appropriate," "sometimes appropriate," or "not appropriate" strategy to control outbreaks by 50%, 45%, and 5% of participants, respectively. System capacity issues and overall risk assessment were the main factors influencing the decision to close hospital wards following an outbreak. Results suggest the use of ward closure for containment of hospital-acquired infectious disease outbreaks in Canadian acute care settings is mixed, with outbreak control methods varying. The successful implementation of ward closure was dependent on overall support for the IPC team within hospital administration. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Szunerits, Sabine; Walt, David R
2002-02-15
The localized corrosion behavior of a galvanic aluminum copper couple was investigated by in situ fluorescence imaging with a fiber-optic imaging sensor. Three different, but complementary methods were used for visualizing remote corrosion sites, mapping the topography of the metal surface, and measuring local chemical concentrations of H+, OH-, and Al3+. The first method is based on a pH-sensitive imaging fiber, where the fluorescent dye SNAFL was covalently attached to the fiber's distal end. Fluorescence images were acquired as a function of time at different areas of the galvanic couple. In a second method, the fluorescent dye morin was immobilized on the fiber-optic imaging sensor, which allowed the in situ localization of corrosion processes on pure aluminum to be visualized over time by monitoring the release of Al3+. The development of fluorescence on the aluminum surface defined the areas associated with the anodic dissolution of aluminum. We also investigated the inhibition of corrosion of pure aluminum by CeCl3 and 8-hydroxyquinoline. The decrease in current, the decrease in the number of active sites on the aluminum surface, and the faster surface passivation are all consistent indications that cerium chloride and 8-hydroxyquinoline inhibit corrosion effectively. From the number and extent of corrosion sites and the release of aluminum ions monitored with the fiber, it was shown that 8-hydroxyquinoline is a more effective inhibitor than cerium chloride.
Donor site reconstitution for ear reconstruction.
Fattah, Adel; Sebire, Neil J; Bulstrode, Neil W
2010-09-01
Current techniques of autologous ear reconstruction involve the soft tissue coverage of a carved costal cartilage framework. However, assessment of the morbidity associated with this donor site has been little documented. This study describes a method to reconstruct the defect and analyses the outcomes with or without donor site reconstitution. The donor site was reconstituted by wrapping morcelised cartilage in a vicryl mesh. Twenty-one patients with reconstitution and nine without were recruited to the study. Scar quality and length, dimensions of donor defect and visible deformity were recorded according to a modified Vancouver scar scale. Patients were also assessed by the SF36 questionnaire, a well-validated health survey. In a subset of our study group, we assessed the fate of the donor site reconstitution by direct visualisation in situ and histological analysis. Fifteen donor sites of patients without donor site reconstitution were compared to 23 reconstructed donor sites. In those without, all had a palpable defect with nearly half exhibiting visible chest deformity. In contrast, those that had rib reconstitution did not demonstrate significant chest wall deformity. Intraoperative examination demonstrated formation of a neo-rib, histologically proven to comprise hyaline cartilage admixed with fibrous tissue. Analysis of SF36 results showed a higher satisfaction in the reconstituted group, but in both groups, the donor site was of little overall morbidity. Although there is little difference between the groups in terms of subjectively perceived benefit, rib reconstitution is objectively associated with better costal margin contour and less chest wall deformity. Copyright 2009 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
..., Currently Known as Henkel Electronic Materials, LLC, Electronic Adhesives Division, Including On-Site Leased..., Electronic Adhesives Division, including on-site leased workers from Aerotek Professional Services, Billerica..., Electronic Adhesives Division had their wages reported under a separate unemployment insurance (UI) tax...
Site-Selection in Single-Molecule Junction for Highly Reproducible Molecular Electronics.
Kaneko, Satoshi; Murai, Daigo; Marqués-González, Santiago; Nakamura, Hisao; Komoto, Yuki; Fujii, Shintaro; Nishino, Tomoaki; Ikeda, Katsuyoshi; Tsukagoshi, Kazuhito; Kiguchi, Manabu
2016-02-03
Adsorption sites of molecules critically determine the electric/photonic properties and the stability of heterogeneous molecule-metal interfaces. Selectivity of the adsorption site is therefore essential for the development of fields including organic electronics, catalysis, and biology. However, due to current technical limitations, site-selectivity, i.e., precise determination of the molecular adsorption site, remains a major challenge because of the difficulty of precisely selecting a meaningful site among the possibilities. We have achieved single-site selection at a single-molecule junction by performing a newly developed hybrid technique: simultaneous characterization by surface-enhanced Raman scattering (SERS) and current-voltage (I-V) measurements. The I-V response of 1,4-benzenedithiol junctions reveals the existence of three metastable states arising from different adsorption sites. Notably, correlated SERS measurements show selectivity toward one of the adsorption sites: "bridge sites". This site-selectivity represents an essential step toward the reliable integration of individual molecules on metallic surfaces. Furthermore, the hybrid spectro-electric technique reveals the dependence of the SERS intensity on the strength of the molecule-metal interaction, showing the interdependence between the optical and electronic properties in single-molecule junctions.
2001 Evaluation of Tritium Removal & Mitigation Technologies for Waste Water Treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
PENWELL, D.L.
2001-06-01
This report contains the 2001 biennial update evaluation of separation technologies and other mitigation techniques to control tritium in liquid effluents and groundwater at the Hanford Site. A thorough literature review was completed, and national and international experts in the field of tritium separation and mitigation techniques were consulted. Current state-of-the-art technologies to address the control of tritium in wastewaters were identified and are described. This report was prepared to satisfy the Hanford Federal Facility Agreement and Consent Order (Tri-Party Agreement) Milestone M-29-O5H (Ecology, EPA, and DOE 1996). Tritium separation and isolation technologies are evaluated on a biennial basis to determine their feasibility for implementation for the control of Hanford Site liquid effluents and groundwater, to meet the U.S. Code of Federal Regulations (CFR), Title 40 CFR 141.16, drinking water maximum contaminant level (MCL) for tritium of 0.02 µCi/L (approximately 2 parts per quadrillion [10^-15]) and/or the DOE Order 5400.5 as-low-as-reasonably-achievable (ALARA) policy. The objectives of this evaluation were to (1) assess the status of development of potentially viable tritium separation technologies with regard to reducing tritium concentrations in current Hanford Site process waters and existing groundwater to MCL levels and (2) assess the status of control methods to prevent the flow of tritiated water at concentrations greater than the MCL to the environment. Current tritium releases are in compliance with applicable U.S. Environmental Protection Agency, Washington State Department of Ecology, and U.S. Department of Energy requirements under the Tri-Party Agreement. Advances in technologies for the separation of tritium from wastewater since the 1999 Hanford Site evaluation report include: (1) construction and testing of the Combined Industrial Reforming and Catalytic Exchange (CIRCE) Prototype Plant by Atomic Energy of Canada Limited (AECL).
The plant has a stage that uses combined electrolysis catalytic exchange (CECE) and a stage that uses the bithermal hydrogen-water process. The testing was still ongoing at the time this evaluation report was developed; therefore, final results of the testing are not available; (2) further testing and a DOE-sponsored American Society of Mechanical Engineers (ASME) peer review of a tritium resin separations process to remove tritium from wastewaters; and (3) completion of the design of the water detritiation system for the International Thermonuclear Experimental Reactor (ITER). The system uses a variation of the CECE process and is designed to process 20 Whr of feed. The primary advances in technologies to control tritium migration in groundwater are the implementation of phytoremediation as a method of reducing the amount of tritium-contaminated groundwater reaching surface waters at Argonne National Laboratory, and the initiation of a project for phytoremediation at the Savannah River Site.
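The MCL quoted above (0.02 µCi/L, roughly 2 parts per quadrillion by mass) can be checked with a back-of-the-envelope conversion. The specific activity used below is an approximate literature value for tritium, not a figure from the report:

```python
# Convert the tritium drinking-water MCL from activity to mass fraction.
SPECIFIC_ACTIVITY_T = 9.65e3      # Ci per gram of tritium (approximate)
mcl_ci_per_liter = 0.02e-6        # 0.02 uCi/L expressed in Ci/L
grams_tritium_per_liter = mcl_ci_per_liter / SPECIFIC_ACTIVITY_T
mass_fraction = grams_tritium_per_liter / 1000.0   # 1 L of water ~ 1000 g
# mass_fraction comes out near 2e-15, i.e. about 2 parts per quadrillion,
# consistent with the parenthetical in the report.
```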
Hassanpour, Saeid; Saboonchi, Ahmad
2016-12-01
This paper aims to evaluate the role of small vessels in the heat transfer mechanisms of a tissue-like medium during local intensive heating processes, for example, an interstitial hyperthermia treatment. To this purpose, a cylindrical tissue with two co- and counter-current vascular networks and a central heat source is introduced. Next, the energy equations of tissue, supply fluid (arterial blood), and return fluid (venous blood) are derived using a porous media approach. Then, a 2D computer code is developed to predict the temperature of blood (fluid phase) and tissue (solid phase) by the conventional volume averaging method and by a more realistic solution method. In the latter method, despite the volume averaging, the blood of interconnecting capillaries is treated separately from the arterial and venous blood phases. It is found that, in addition to the blood perfusion rate, the arrangement of the vascular network has a considerable effect on the pattern and magnitude of the achieved temperature. In contrast to the counter-current network, the co-current network of vessels leads to a considerably asymmetric pattern of temperature contours and relocation of the heat-affected zone along the blood flow direction. However, this relocation can be prevented by changing the site of the hyperthermia heat source. The results show that the cooling effect of co-current blood vessels during interstitial heating is more efficient. Despite many anatomical dissimilarities, these findings can be useful in designing protocols for hyperthermia cancer treatment of living tissue. Copyright © 2016 Elsevier Ltd. All rights reserved.
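The perfusion effect discussed above can be illustrated with a minimal 1-D Pennes-type bioheat model: conduction plus a perfusion sink that pulls tissue back toward the arterial temperature. All parameters are generic assumed values, and this sketch deliberately ignores the discrete co-/counter-current vessel networks that are the paper's actual subject:

```python
import numpy as np

# Tissue and blood properties (assumed, generic soft tissue)
k, rho, c = 0.5, 1050.0, 3600.0         # W/m/K, kg/m^3, J/kg/K
w_b = 0.5e-3                             # blood perfusion rate (1/s), assumed
rho_b, c_b, T_a = 1050.0, 3600.0, 37.0   # blood density, heat cap., arterial T

L, n = 0.05, 101                         # 5 cm domain, grid points
dx, dt = L / (n - 1), 0.05               # explicit scheme; dt under CFL limit

T = np.full(n, 37.0)
q = np.zeros(n)
q[n // 2] = 5e5                          # central hyperthermia source, W/m^3

for _ in range(40000):                   # ~2000 s of simulated heating
    lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
    # Pennes bioheat: conduction + perfusion sink + metabolic/applied source
    T = T + dt / (rho * c) * (k * lap + w_b * rho_b * c_b * (T_a - T) + q)
    T[0] = T[-1] = 37.0                  # far-field tissue held at 37 deg C
```

With perfusion on, the heated zone stays tightly confined around the source; setting `w_b = 0` broadens and raises the temperature peak, which is the qualitative effect the paper resolves in more anatomical detail.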
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-04-01
This Closure Report summarizes the corrective actions which were completed at the Corrective Action Sites within Corrective Action Unit 211, Area 15 Farm Waste Sites, at the Nevada Test Site. Current site descriptions, observations, and identification of wastes removed are included on FFACO Corrective Action Site housekeeping closure verification forms.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-19
... Maricopa County, AZ; Application for Expansion; (New Magnet Site) Under Alternative Site Framework An... additional new magnet sites in western Maricopa County, Arizona and request usage-driven designation for an..., Arizona. The current zone project includes the following magnet sites: Site 1 (230.25 acres)--within the...
Prestressing force monitoring method for a box girder through distributed long-gauge FBG sensors
NASA Astrophysics Data System (ADS)
Chen, Shi-Zhi; Wu, Gang; Xing, Tuo; Feng, De-Cheng
2018-01-01
Monitoring prestressing forces is essential for prestressed concrete box girder bridges. However, current monitoring methods for prestressing force are not applicable to a box girder, either because the sensor setup is constrained or because the shear lag effect is not properly considered. Building on a previous analysis model of the shear lag effect in box girders, this paper proposes an indirect monitoring method for on-site determination of the prestressing force in a concrete box girder utilizing distributed long-gauge fiber Bragg grating (FBG) sensors. The performance of this method was initially verified using numerical simulation for three different distribution forms of prestressing tendons. Then, an experiment involving two concrete box girders was conducted to study the feasibility of this method under different prestressing levels. The results of both the numerical simulation and the lab experiment validated the method's practicability in a box girder.
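The core sensing step, converting an FBG wavelength shift into average strain over the gauge length and then into a force change, can be sketched as below. The photo-elastic coefficient, tendon modulus, and cross-section are typical assumed values, not the paper's, and the real method additionally corrects for shear lag across the box section:

```python
# Hedged sketch: long-gauge FBG wavelength shift -> average strain -> force.
P_E = 0.22                 # effective photo-elastic coefficient of silica fiber
lam0 = 1550.0e-9           # Bragg wavelength at zero strain (m)
dlam = 0.31e-9             # measured wavelength shift (m), illustrative

# Strain-wavelength relation for an FBG: dlam / lam0 = (1 - P_E) * strain
strain = dlam / (lam0 * (1.0 - P_E))   # average strain over the gauge length

E_steel = 195e9            # tendon elastic modulus (Pa), assumed
A_tendon = 140e-6          # tendon cross-sectional area (m^2), assumed
dF = E_steel * A_tendon * strain       # inferred tendon force change (N)
```

The "long-gauge" aspect matters because a conventional short FBG reads a point strain, while averaging over a long gauge makes the reading representative of the section despite local cracking and shear lag.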
In vivo generation of DNA sequence diversity for cellular barcoding
Peikon, Ian D.; Gizatullina, Diana I.; Zador, Anthony M.
2014-01-01
Heterogeneity is a ubiquitous feature of biological systems. A complete understanding of such systems requires a method for uniquely identifying and tracking individual components and their interactions with each other. We have developed a novel method of uniquely tagging individual cells in vivo with a genetic ‘barcode’ that can be recovered by DNA sequencing. Our method is a two-component system composed of a genetic barcode cassette whose fragments are shuffled by Rci, a site-specific DNA invertase. The system is highly scalable, with the potential to generate theoretical diversities in the billions. We demonstrate the feasibility of this technique in Escherichia coli. Currently, this method could be employed to track the dynamics of populations of microbes through various bottlenecks. Advances in this method should prove useful in tracking interactions of cells within a network, and/or heterogeneity within complex biological samples. PMID:25013177
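A toy simulation (not the actual Rci biochemistry) shows how repeated inversion events on a small cassette generate large barcode diversity: each event reverses a random contiguous span of fragments and flips their orientations. Fragment names, cassette size, and event counts are arbitrary:

```python
import random

def recombine(cassette, rng):
    """One inversion event: reverse a random contiguous span of fragments,
    flipping the orientation flag of each fragment in the span."""
    i, j = sorted(rng.sample(range(len(cassette) + 1), 2))
    flipped = [(frag, not orient) for frag, orient in reversed(cassette[i:j])]
    return cassette[:i] + flipped + cassette[j:]

rng = random.Random(0)
start = [(frag, True) for frag in "ABCDEF"]    # 6-fragment cassette
seen = set()
for _ in range(5000):                          # 5000 simulated cells
    cassette = start
    for _ in range(10):                        # several inversion events each
        cassette = recombine(cassette, rng)
    seen.add(tuple(cassette))
n_distinct = len(seen)                         # thousands of distinct barcodes
```

Even this 6-fragment toy has up to 6! × 2^6 = 46,080 orderings, which hints at how a modestly larger cassette reaches the theoretical diversities in the billions mentioned above.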
Farage, Miranda A; Meyer, Sandy; Walter, Dave
2004-05-01
The first main objective of the work presented in this paper was to investigate ways of optimizing the current arm patch test protocol by (1) increasing the sensitivity of the test in order to evaluate more effectively products that are inherently non-irritating, and/or (2) reducing the costs of these types of studies by shortening the protocol. The second main objective was to use the results of these studies and the results of the parallel studies conducted using the behind-the-knee method to better understand the contribution of mechanical irritation to the skin effects produced by these types of products. In addition, we were interested in continuing the evaluation of sensory effects and their relationship to objective measures of irritation. Test materials were prepared from three currently marketed feminine protection pads. Wet and dry samples were applied to the upper arm using the standard 24-h patch test. Applications were repeated daily for 4 consecutive days. The test sites were scored for irritation prior to the first patch application, and 30-60 min after removal of each patch. Some test sites were treated by tape stripping the skin prior to the initial patch application. In addition, in one experiment, panelists were asked to keep a daily diary describing any sensory skin effects they noticed at each test site. All protocol variations ([intact skin/dry samples], [compromised skin/dry samples], [intact skin/wet samples], and [compromised skin/wet samples]) gave similar results for the products tested. When compared to the behind-the-knee test method, the standard upper arm patch test gave consistently lower levels of irritation when the test sites were scored shortly after patch removal, even though the sample application was longer (24 vs. 6 h) in the standard patch test. The higher level of irritation in the behind-the-knee method was likely due to mechanical irritation.
The sensory skin effects did not appear to be related to a particular test product or a particular protocol variation. However, the mean irritation scores at those sites where a sensory effect was reported were higher than the mean irritation scores at those sites where no sensory effects were reported. All four protocol variations of the standard upper arm patch test can be used to assess the inherent chemical irritant properties of feminine protection products. For these products, which are inherently non-irritating, tape stripping and/or applying wet samples does not increase the sensitivity of the patch test method. Differences in irritation potential were apparent after one to three 24-h applications. Therefore, the standard patch test protocol can be shortened to three applications without compromising our ability to detect differences in the chemical irritation produced by the test materials. The patch test can be used to evaluate effectively the inherent chemical irritation potential of these types of products. However, this method is not suitable for testing the mechanical irritation due to friction that occurs during product use. There is no relationship between specific test conditions, i.e., compromised skin and/or testing wet samples, and reports of perceived sensory reactions. However, there seems to be a clear relationship between sensory reactions and objective irritation scores.
State of the art in PEGylation: the great versatility achieved after forty years of research.
Pasut, Gianfranco; Veronese, Francesco M
2012-07-20
In recent years, protein PEGylation has become an established and highly refined technology, moving forward from initial simple random coupling approaches based on conjugation at the level of the lysine ε-amino group. Nevertheless, amino PEGylation is still yielding important conjugates, currently in clinical practice, where the degree of homogeneity was improved by optimizing the reaction conditions and implementing the purification processes. However, current research is mainly focused on methods of site-selective PEGylation that yield a single isomer, thus greatly increasing the degree of homogeneity and preserving bioactivity. The protein N-terminus and free cysteines were the first sites exploited for selective PEGylation, but further positions can now be addressed thanks to approaches like bridging PEGylation (disulphide bridges), enzymatic PEGylation (glutamines and the C-terminus) and glycoPEGylation (sites of O- and N-glycosylation or the glycans of a glycoprotein). Furthermore, by combining the tools of genetic engineering with specific PEGylation approaches, the polymer can be coupled at essentially any position on the protein surface, owing to the substitution of a properly chosen amino acid in the sequence with a natural or unnatural amino acid bearing an orthogonal reactive group. On the other hand, PEGylation has not achieved the same success in the delivery of small drugs, despite the large interest and several studies in this field. Targeted conjugates and PEGs for combination therapy might represent promising answers to the so far unmet needs of PEG as a carrier of small drugs. This review presents a thorough panorama of recent advances in the field of PEGylation. Copyright © 2011 Elsevier B.V. All rights reserved.
AEIS Policy vs. Site-Based Management: Research Agenda Implications.
ERIC Educational Resources Information Center
Nash, John B.
This paper examines the problems of centralized academic-indicator systems in light of the move toward site-based management. Problems with current practice are examined in the framework of critical inquiry. Alternatives to current accountability guidelines are presented that harmonize positivism with critical inquiry, while respecting both local…
There is currently a dearth of data characterizing best management practice impacts on runoff production at the parcel-level. This data is of critical importance insofar as judging the effectiveness and reliability of on-site stormwater BMPs, with significant implications for bot...
NASA Astrophysics Data System (ADS)
Kiaalhosseini, Saeed
In modern contaminant hydrology, management of contaminated sites requires a holistic characterization of subsurface conditions. Delineation of contaminant distribution in all phases (i.e., aqueous, non-aqueous liquid, sorbed, and gas), as well as associated biogeochemical processes in a complex heterogeneous subsurface, is central to selecting effective remedies. Arguably, a factor contributing to the lack of success in managing contaminated sites effectively has been the limitations of site characterization methods that rely on monitoring wells and grab sediment samples. The overarching objective of this research is to advance a set of third-generation (3G) site characterization methods to overcome shortcomings of current site characterization techniques. 3G methods include 1) cryogenic core collection (C3) from the unconsolidated geological subsurface to improve recovery of sediments and preserve key attributes, 2) high-throughput analysis (HTA) of frozen core in the laboratory to provide high-resolution, depth-discrete data on subsurface conditions and processes, 3) resolution of non-aqueous phase liquid (NAPL) distribution within the porous media using a nuclear magnetic resonance (NMR) method, and 4) application of a complex resistivity method to track NAPL depletion in shallow geological formations over time. A series of controlled experiments were conducted to develop the C3 tools and methods. The critical aspects of C3 are downhole circulation of liquid nitrogen via a cooling system, the strategic use of thermal insulation to focus cooling into the core, and the use of back pressure to optimize cooling. The C3 methods were applied at two contaminated sites: 1) F.E. Warren (FEW) Air Force Base near Cheyenne, WY and 2) a former refinery in the western U.S. The results indicated that the rate of core collection using the C3 methods is on the order of 30 feet per day.
The C3 methods also improve core recovery and limit potential biases associated with flowing sands. HTA of frozen core was employed at the former refinery and FEW. Porosity and fluid saturations (i.e., aqueous, non-aqueous liquid, and gas) from the former refinery indicate that, given in situ freezing, the results are not biased by drainage of pore fluids from the core during sample collection. At FEW, a comparison between the results of HTA of the frozen core collected in 2014 and the results of site characterization using unfrozen core (second-generation (2G) methods) at the same locations (performed in 2010) indicates consistently higher contaminant concentrations using C3. Many factors contribute to the higher quantification of contaminant concentrations using C3. The most significant factor is the preservation of the sediment attributes, in particular pore fluids and volatile organic compounds (VOCs), in comparison to unfrozen conventional sediment core. The NMR study was performed on laboratory-fabricated sediment cores to resolve NAPL distribution within the porous media qualitatively and quantitatively. The fabricated cores consisted of Colorado silica sand saturated with deionized water and trichloroethylene (TCE). The cores were scanned with a BRUKER small-animal scanner (2.3 Tesla, 100 MHz) at 20 °C and again while frozen at -25 °C. The acquired images indicated that freezing the water within the core suppressed the NMR signals of water-bound hydrogen. The hydrogen associated with TCE was still detectable since the TCE was in its liquid state (the melting point of TCE is -73 °C). Therefore, qualitative detection of TCE within the sediment core was accomplished via NMR scanning by freezing the water. A one-dimensional NMR scanning method was used for quantification of TCE mass distribution within the frozen core. However, the results indicated inconsistency in estimating the total TCE mass within the porous media.
Downhole NMR logging was performed at the former refinery in the western U.S. to detect NAPL and to discriminate NAPL from water in the formation. The results indicated that detection of NMR signals to discriminate NAPL from water is compromised by noise stemming from the active facilities and/or power lines passing over the site. A laboratory experiment was performed to evaluate the electrical response of unconsolidated porous media over time (30 days) while NAPL was being depleted. Sand columns (Colorado silica sand) contaminated with methyl tert-butyl ether (MTBE, a light non-aqueous phase liquid (LNAPL)) were studied. A multilevel electrode system was used to measure the electrical resistivity of the impacted sand by imposing an alternating current. The trend of reduction in resistivity through the depth of the columns over time followed the depletion of LNAPL by volatilization. Finally, a field experiment was performed at the former refinery in the western U.S. to track natural losses of LNAPL over time. Multilevel systems consisting of water samplers, thermocouples, and electrodes were installed in a clean (background) zone and an LNAPL-impacted zone. In situ measurements of complex resistivity and temperature were taken and water sampling was performed at each depth (from 3 to 14 feet below the ground surface at one-foot spacing) over almost a year. At both locations, the results indicated decreases in apparent resistivity below the water table over time. This trend was supported by the geochemistry of the pore fluids. Overall, the results indicate that application of the electrical resistivity method to track LNAPL depletion at field sites is difficult due to multiple conflicting factors affecting the geoelectrical response of LNAPL-impacted zones over time.
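The link between LNAPL depletion and falling resistivity can be rationalized with Archie's law: as resistive LNAPL volatilizes, water saturation rises and bulk resistivity drops. The exponents and pore-water resistivity below are assumed textbook values, not measurements from this work:

```python
def bulk_resistivity(rho_w, phi, s_w, a=1.0, m=2.0, n=2.0):
    """Archie's law: bulk resistivity (ohm-m) of a partially
    water-saturated clean sand; a, m, n are empirical constants."""
    return a * rho_w * phi ** (-m) * s_w ** (-n)

rho_w, phi = 20.0, 0.35      # pore-water resistivity (ohm-m) and porosity
early = bulk_resistivity(rho_w, phi, s_w=0.6)  # LNAPL fills 40% of the pores
late = bulk_resistivity(rho_w, phi, s_w=0.9)   # most LNAPL volatilized
# late < early: conductive water replacing LNAPL lowers the bulk resistivity
```

This single-mechanism picture is exactly what the field data complicate: biodegradation products, temperature, and pore-water chemistry all shift resistivity at the same time, which is why the abstract concludes the method is difficult to apply in isolation.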
Hui, Yiang; Manna, Pradip; Ou, Joyce J; Kerley, Spencer; Zhang, Cunxian; Sung, C James; Lawrence, W Dwayne; Quddus, M Ruhul
2015-09-01
High-risk human papillomavirus infection usually is seen at one anatomic site in an individual. Rarely, infection at multiple anatomic sites of the female lower genital tract in the same individual is encountered, either simultaneously or at a later date. The current study identifies the various subtypes of high-risk human papillomavirus infection in these scenarios and analyzes the potential significance of these findings. High-risk human papillomavirus infection involving 22 anatomic sites from 7 individuals was identified after institutional review board approval. Residual paraffin-embedded tissue samples were retrieved, and all 15 high-risk human papillomavirus subtypes were identified and viral load quantified using a multiplex real-time polymerase chain reaction-based method. Multiple high-risk human papillomavirus subtypes were identified in 32% of the samples, with as many as 5 different subtypes of high-risk human papillomavirus infection in a single anatomic site. In general, each anatomic site had a unique combination of viral subtypes, although one individual showed overlapping subtypes in the vaginal, cervical, and vulvar samples. Higher viral loads and rare subtypes were more frequent in younger patients and in dysplasia compared with carcinoma. Follow-up ranging from 3 to 84 months revealed persistent high-risk human papillomavirus infection in 60% of cases. Copyright © 2015 Elsevier Inc. All rights reserved.
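Viral load quantification by real-time PCR is typically done by inverting a standard curve relating threshold cycle (Ct) to template copies. The slope and intercept below are illustrative calibration values (a slope of -3.32 corresponds to 100% amplification efficiency), not those of the cited assay:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the qPCR standard curve Ct = intercept + slope * log10(copies)
    to estimate template copies per reaction from a measured Ct."""
    return 10 ** ((ct - intercept) / slope)

load = copies_from_ct(28.0)   # earlier crossing -> more starting template
```

With these assumed calibration values, a Ct of 28 corresponds to roughly a thousand copies per reaction; each ~3.32-cycle decrease in Ct implies a tenfold higher viral load.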
Forced-Air Warmers and Surgical Site Infections in Patients Undergoing Knee or Hip Arthroplasty.
Austin, Paul N
2017-01-01
The majority of the evidence indicates that preventing inadvertent perioperative hypothermia reduces the incidence of many perioperative complications, including increased bleeding, myocardial events, impaired wound healing, and diminished renal function. Most researchers agree there is an increased incidence of surgical site infections in patients who experience inadvertent perioperative hypothermia. Forced-air warming is effective in preventing inadvertent perioperative hypothermia. Paradoxically, forced-air warmers have been implicated in causing surgical site infections in patients undergoing total knee or hip arthroplasty. The results of investigations suggest these devices can harbor pathogens and cause unwanted airflow disturbances. However, no significant increases in bacterial counts were found when forced-air warmers were used according to the manufacturer's directions. The results of one study suggested that the incidence of surgical site infections in patients undergoing total joint arthroplasty was increased when a forced-air warmer was used. However, these researchers did not control for other factors affecting the incidence of surgical site infections in these patients. Current evidence does not support the claim that forced-air warmers cause surgical site infections in patients undergoing total knee or hip arthroplasty. Clinicians must use and maintain these devices per the manufacturer's directions, and they may consider using alternative warming methods. Well-conducted studies are needed to determine what role, if any, forced-air warmers play in causing infections in these patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Serne, R.J.; Wood, M.I.
1990-05-01
This report documents the currently available geochemical data base for release and retardation for actual Hanford Site materials (wastes and/or sediments). The report also recommends specific laboratory tests and presents the rationale for the recommendations. The purpose of this document is threefold: to summarize currently available information, to provide a strategy for generating additional data, and to provide recommendations on specific data collection methods and test matrices. This report outlines a data collection approach that relies on feedback from performance analyses to ascertain when adequate data have been collected. The data collection scheme emphasizes laboratory testing based on empiricism. 196 refs., 4 figs., 36 tabs.
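A central use of the release/retardation data the report compiles is the linear-sorption retardation factor, which converts a measured distribution coefficient (Kd) into a contaminant velocity relative to groundwater. A minimal sketch (the sediment properties below are illustrative, not Hanford-specific values from the report):

```python
def retardation_factor(bulk_density, water_content, kd):
    """Linear-sorption retardation factor R = 1 + (rho_b / theta) * Kd.

    bulk_density in g/cm^3, water_content as a volumetric fraction
    (dimensionless), kd in mL/g. R is the ratio of groundwater velocity
    to contaminant velocity; R = 1 means no retardation.
    """
    return 1.0 + (bulk_density / water_content) * kd


# Illustrative sediment: rho_b = 1.5 g/cm^3, theta = 0.3, Kd = 1 mL/g.
r = retardation_factor(1.5, 0.3, 1.0)  # contaminant moves ~6x slower than water
```

Performance-analysis feedback of the kind the report recommends would then flag which Kd values dominate dose estimates and so warrant further laboratory testing.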
Electronic voltage and current transformers testing device.
Pan, Feng; Chen, Ruimin; Xiao, Yong; Sun, Weiming
2012-01-01
A method for testing electronic instrument transformers is described, including electronic voltage and current transformers (EVTs, ECTs) with both analog and digital outputs. A testing device prototype was developed. It is based on digital signal processing of the signals measured at the secondary outputs of the tested transformer and the reference transformer when the same excitation signal is fed to their primaries. A test estimating the performance of the prototype was carried out at the National Centre for High Voltage Measurement, and the prototype is approved for testing transformers with accuracy class up to 0.2 at the industrial frequency (50 Hz or 60 Hz). The device is suitable for on-site testing owing to its high accuracy, simple structure, and low-cost hardware.
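The core DSP step in such a comparison is extracting the fundamental phasor from each secondary record and comparing magnitudes and angles. A minimal sketch of that computation (function names, sampling parameters, and the integer-cycle assumption are mine, not the paper's):

```python
import numpy as np


def fundamental_phasor(samples, fs, f0=50.0):
    """Complex amplitude of the f0 component of a sampled signal.
    Assumes the record spans an integer number of cycles of f0."""
    t = np.arange(len(samples)) / fs
    return 2.0 * np.mean(samples * np.exp(-2j * np.pi * f0 * t))


def transformer_errors(tested, reference, fs, f0=50.0):
    """Ratio error (%) and phase displacement (rad) of the tested
    transformer's secondary relative to the reference transformer's,
    given simultaneous sample records of both secondaries."""
    pt = fundamental_phasor(tested, fs, f0)
    pr = fundamental_phasor(reference, fs, f0)
    ratio_error = (abs(pt) - abs(pr)) / abs(pr) * 100.0
    phase_error = np.angle(pt / pr)
    return ratio_error, phase_error
```

For an accuracy class 0.2 device, the ratio error reported this way would need to stay within roughly ±0.2% at rated conditions; the exact limits and test points are set by the applicable instrument-transformer standard, not by this sketch.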
ProMateus—an open research approach to protein-binding sites analysis
Neuvirth, Hani; Heinemann, Uri; Birnbaum, David; Tishby, Naftali; Schreiber, Gideon
2007-01-01
The development of bioinformatic tools by individual labs results in an abundance of parallel programs for the same task. For example, identification of binding-site regions between interacting proteins is done using ProMate, WHISCY, PPI-Pred, PINUP, and others. All of these servers first identify unique properties of binding sites and then incorporate them into a predictor. Obviously, the resulting prediction would improve if the most suitable parameters from each of those predictors were incorporated into one server; however, because of the variation in methods and databases, this is currently not feasible. Here, the protein-binding site prediction server is extended into a general protein-binding sites research tool, ProMateus. This web tool, based on ProMate's infrastructure, enables the easy exploration and incorporation of new features and databases by the user, providing an evaluation of the benefit of individual features and of their combination within a set framework. This transforms individual research into a community exercise, bringing out the best from all users for optimized predictions. The analysis is demonstrated on a database of protein-protein and protein-DNA interactions. This approach is fundamentally different from that used in generating meta-servers. The implications of the open-research approach are discussed. ProMateus is available at http://bip.weizmann.ac.il/promate. PMID:17488838
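The "identify properties, then combine them into a predictor" pattern the abstract describes can be sketched as a weighted logistic combination of per-residue features. Everything below (feature names, weights, the logistic form) is illustrative, not ProMateus's actual model or parameters:

```python
import math


def combined_site_score(features, weights, bias=0.0):
    """Combine per-residue interface propensities into a single
    binding-site probability via a logistic function. Feature names
    and weights are hypothetical, for illustration only."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))


# Hypothetical features for one surface residue:
residue = {"hydrophobicity": 0.8, "conservation": 0.6, "crystal_contacts": 0.1}
weights = {"hydrophobicity": 1.2, "conservation": 2.0, "crystal_contacts": -0.5}
score = combined_site_score(residue, weights)  # value in (0, 1)
```

A framework like ProMateus would then let users add or drop features and re-evaluate the combined predictor against a fixed benchmark, which is what makes per-feature benefit measurable.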