Sample records for field sampling protocols

  1. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--HANDLING QUALITY CONTROL SAMPLES IN THE FIELD (RTI/ACS-AP-209-090)

    EPA Science Inventory

    This protocol describes how quality control samples should be handled in the field, and was designed as a quick reference source for the field staff. The protocol describes quality control samples for air-VOCs, air-particles, water samples, house dust, soil, urine, blood, hair, a...

  2. Sampling protocol for post-landfall Deepwater Horizon oil release, Gulf of Mexico, 2010

    USGS Publications Warehouse

    Wilde, F.D.; Skrobialowski, S.C.; Hart, J.S.

    2010-01-01

    The protocols and procedures described in this report are designed to be used by U.S. Geological Survey (USGS) field teams for the collection of environmental data and samples in coastal areas affected by the 2010 Deepwater Horizon oil spill in the Gulf of Mexico. This sampling protocol focuses specifically on sampling for water, sediments, benthic invertebrates, and microorganisms (ambient bacterial populations) after shoreline arrival of petroleum-associated product on beach, barrier island, and wetland environments of the Gulf of Mexico coastal states. Deployment to sampling sites, site setup, and sample collection in these environments necessitates modifications to standard USGS sampling procedures in order to address the regulatory, logistical, and legal requirements associated with samples collected in oil-impacted coastal areas. This document, therefore, has been written as an addendum to the USGS National Field Manual for the Collection of Water-Quality Data (NFM) (http://pubs.water.usgs.gov/twri9A/), which provides the basis for training personnel in the use of standard USGS sampling protocols. The topics covered in this Gulf of Mexico oil-spill sampling protocol augment NFM protocols for field-deployment preparations, health and safety precautions, sampling and quality-assurance procedures, and decontamination requirements under potentially hazardous environmental conditions. Documentation procedures and maintenance of sample integrity by use of chain-of-custody procedures also are described in this protocol.

  3. FIELD SAMPLING PROTOCOLS AND ANALYSIS

    EPA Science Inventory

    I have been asked to speak again to the environmental science class regarding actual research scenarios related to my work at Kerr Lab. I plan to discuss sampling protocols along with various field analyses performed during sampling activities. Many of the students have never see...

  4. A simplified protocol for molecular identification of Eimeria species in field samples.

    PubMed

    Haug, Anita; Thebo, Per; Mattsson, Jens G

    2007-05-15

    This study aimed to find a fast, sensitive and efficient protocol for molecular identification of chicken Eimeria spp. in field samples. Various methods for each of the three steps of the protocol were evaluated: oocyst wall rupturing methods, DNA extraction methods, and identification of species-specific DNA sequences by PCR. We then compared and evaluated five complete protocols. Three series of oocyst suspensions of known number of oocysts from Eimeria mitis, Eimeria praecox, Eimeria maxima and Eimeria tenella were prepared and ground using glass beads or mini-pestle. DNA was extracted from ruptured oocysts using commercial systems (GeneReleaser, Qiagen Stoolkit and Prepman) or phenol-chloroform DNA extraction, followed by identification of species-specific ITS-1 sequences by optimised single species PCR assays. The Stoolkit and Prepman protocols showed insufficient repeatability, and the former was also expensive and relatively time-consuming. In contrast, both the GeneReleaser protocol and phenol-chloroform protocols were robust and sensitive, detecting less than 0.4 oocysts of each species per PCR. Finally, we evaluated our new protocol on 68 coccidia positive field samples. Our data suggests that rupturing the oocysts by mini-pestle grinding, preparing the DNA with GeneReleaser, followed by optimised single species PCR assays, makes a robust and sensitive procedure for identifying chicken Eimeria species in field samples. Importantly, it also provides minimal hands-on-time in the pre-PCR process, lower contamination risk and no handling of toxic chemicals.

  5. Field Geologic Observation and Sample Collection Strategies for Planetary Surface Exploration: Insights from the 2010 Desert RATS Geologist Crewmembers

    NASA Technical Reports Server (NTRS)

    Hurtado, Jose M., Jr.; Young, Kelsey; Bleacher, Jacob E.; Garry, W. Brent; Rice, James W., Jr.

    2012-01-01

    Observation is the primary role of all field geologists, and geologic observations put into an evolving conceptual context will be the most important data stream that will be relayed to Earth during a planetary exploration mission. Sample collection is also an important planetary field activity, and its success is closely tied to the quality of contextual observations. To test protocols for doing effective planetary geologic fieldwork, the Desert RATS (Research and Technology Studies) project deployed two prototype rovers for two weeks of simulated exploratory traverses in the San Francisco volcanic field of northern Arizona. The authors of this paper represent the geologist crew members who participated in the 2010 field test. We document the procedures adopted for Desert RATS 2010 and report on our experiences regarding these protocols. Careful consideration must be made of various issues that impact the interplay between field geologic observations and sample collection, including time management; strategies related to duplication of samples and observations; logistical constraints on the volume and mass of samples and the volume/transfer of data collected; and paradigms for evaluation of mission success. We find that the 2010 field protocols brought to light important aspects of each of these issues, and we recommend best practices and modifications to training and operational protocols to address them. Underlying our recommendations is the recognition that the capacity of the crew to flexibly execute their activities is paramount. Careful design of mission parameters, especially field geologic protocols, is critical for enabling the crews to successfully meet their science objectives.

  6. Tools and Technologies Needed for Conducting Planetary Field Geology While On EVA: Insights from the 2010 Desert RATS Geologist Crewmembers

    NASA Technical Reports Server (NTRS)

    Young, Kelsey; Hurtado, Jose M., Jr.; Bleacher, Jacob E.; Garry, W. Brent; Bleisath, Scott; Buffington, Jesse; Rice, James W., Jr.

    2011-01-01

    Observation is the primary role of all field geologists, and geologic observations put into an evolving conceptual context will be the most important data stream that will be relayed to Earth during a planetary exploration mission. Sample collection is also an important planetary field activity, and its success is closely tied to the quality of contextual observations. To test protocols for doing effective planetary geologic fieldwork, the Desert RATS (Research and Technology Studies) project deployed two prototype rovers for two weeks of simulated exploratory traverses in the San Francisco volcanic field of northern Arizona. The authors of this paper represent the geologist crewmembers who participated in the 2010 field test. We document the procedures adopted for Desert RATS 2010 and report on our experiences regarding these protocols. Careful consideration must be made of various issues that impact the interplay between field geologic observations and sample collection, including time management; strategies related to duplication of samples and observations; logistical constraints on the volume and mass of samples and the volume/transfer of data collected; and paradigms for evaluation of mission success. We find that the 2010 field protocols brought to light important aspects of each of these issues, and we recommend best practices and modifications to training and operational protocols to address them. Underlying our recommendations is the recognition that the capacity of the crew to "flexibly execute" their activities is paramount. Careful design of mission parameters, especially field geologic protocols, is critical for enabling the crews to successfully meet their science objectives.

  7. NHEXAS PHASE I MARYLAND STUDY--LIST OF AVAILABLE DOCUMENTS: PROTOCOLS AND SOPS

    EPA Science Inventory

    This document lists available protocols and SOPs for the NHEXAS Phase I Maryland study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis and general laboratory procedures, (3) Data Analysis Proced...

  8. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  9. Investigation of differences between field and laboratory pH measurements of national atmospheric deposition program/national trends network precipitation samples

    USGS Publications Warehouse

    Latysh, N.; Gordon, J.

    2004-01-01

    A study was undertaken to investigate differences between laboratory and field pH measurements for precipitation samples collected from 135 weekly precipitation-monitoring sites in the National Trends Network from 12/30/1986 to 12/28/1999. Differences in pH between field and laboratory measurements occurred for 96% of samples collected during this time period. Differences between the two measurements were evaluated for precipitation samples collected before and after January 1994, when modifications to sample-handling protocol and elimination of the contaminating bucket o-ring used in sample shipment occurred. Median hydrogen-ion and pH differences between field and laboratory measurements declined from 3.9 µeq L-1 or 0.10 pH units before the 1994 protocol change to 1.4 µeq L-1 or 0.04 pH units after the 1994 protocol change. Hydrogen-ion differences between field and laboratory measurements had a high correlation with the sample pH determined in the field. The largest pH differences between the two measurements occurred for high-pH samples (>5.6), typical of precipitation collected in the Western United States; however, low-pH samples (<5.0) displayed the highest variability in hydrogen-ion differences between field and laboratory analyses. Properly screened field pH measurements are a useful alternative to laboratory pH values for trend analysis, particularly before 1994 when laboratory pH values were influenced by sample-collection equipment.
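
    The abstract above reports median field-laboratory differences on two scales, hydrogen-ion concentration (µeq L-1) and pH, which are related by [H+] = 10^(-pH). The short Python sketch below (pH values chosen purely for illustration, not taken from the study) shows why a fixed pH offset corresponds to a larger hydrogen-ion difference in more acidic samples:

      # Convert pH to hydrogen-ion concentration in microequivalents per liter:
      # [H+] in mol/L = 10**(-pH), and 1 mol/L of H+ is 1e6 microequivalents/L.
      def h_ion_ueq_per_l(ph: float) -> float:
          return 10.0 ** (-ph) * 1.0e6

      # Illustrative values only: the same 0.10 pH-unit offset maps to a larger
      # hydrogen-ion difference at lower pH, so the two median differences quoted
      # above need not correspond exactly to one another.
      for ph in (4.5, 5.0, 5.6):
          diff = h_ion_ueq_per_l(ph) - h_ion_ueq_per_l(ph + 0.10)
          print(f"pH {ph:.1f} vs {ph + 0.1:.1f}: {diff:.2f} ueq/L difference")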

  10. A simplified field protocol for genetic sampling of birds using buccal swabs

    USGS Publications Warehouse

    Vilstrup, Julia T.; Mullins, Thomas D.; Miller, Mark P.; McDearman, Will; Walters, Jeffrey R.; Haig, Susan M.

    2018-01-01

    DNA sampling is an essential prerequisite for conducting population genetic studies. For many years, blood sampling has been the preferred method for obtaining DNA in birds because of their nucleated red blood cells. Nonetheless, use of buccal swabs has been gaining favor because they are less invasive yet still yield adequate amounts of DNA for amplifying mitochondrial and nuclear markers; however, buccal swab protocols often include steps (e.g., extended air-drying and storage under frozen conditions) not easily adapted to field settings. Furthermore, commercial extraction kits and swabs for buccal sampling can be expensive for large population studies. We therefore developed an efficient, cost-effective, and field-friendly protocol for sampling wild birds after comparing DNA yield among 3 inexpensive buccal swab types (2 with foam tips and 1 with a cotton tip). Extraction and amplification success was high (100% and 97.2% respectively) using inexpensive generic swabs. We found foam-tipped swabs provided higher DNA yields than cotton-tipped swabs. We further determined that omitting a drying step and storing swabs in Longmire buffer increased efficiency in the field while still yielding sufficient amounts of DNA for detailed population genetic studies using mitochondrial and nuclear markers. This new field protocol allows time- and cost-effective DNA sampling of juveniles or small-bodied birds for which drawing blood may cause excessive stress to birds and technicians alike.

  11. COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES

    EPA Science Inventory

    Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...

  12. A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies

    NASA Astrophysics Data System (ADS)

    Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  13. A field-based cleaning protocol for sampling devices used in life-detection studies.

    PubMed

    Eigenbrode, Jennifer; Benning, Liane G; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E F

    2009-06-01

    Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.

  14. A Field Comparison of Sampling Protocols for Measuring Lead in Drinking Water

    EPA Science Inventory

    US EPA Region 5 conducted a sampling study that demonstrates existing sampling protocols used for the Lead and Copper Rule (LCR) underestimate peak and probable mass of lead released in a system with lead service lines (LSLs). This comparative stagnation sampling was conducted i...

  15. Rapid Waterborne Pathogen Detection with Mobile Electronics.

    PubMed

    Wu, Tsung-Feng; Chen, Yu-Chen; Wang, Wei-Chung; Kucknoor, Ashwini S; Lin, Che-Jen; Lo, Yu-Hwa; Yao, Chun-Wei; Lian, Ian

    2017-06-09

    Pathogen detection in water samples, without complex and time consuming procedures such as fluorescent-labeling or culture-based incubation, is essential to public safety. We propose an immunoagglutination-based protocol together with the microfluidic device to quantify pathogen levels directly from water samples. Utilizing ubiquitous complementary metal-oxide-semiconductor (CMOS) imagers from mobile electronics, a low-cost and one-step reaction detection protocol is developed to enable field detection for waterborne pathogens. 10 mL of pathogen-containing water samples was processed using the developed protocol including filtration enrichment, immune-reaction detection and imaging processing. The limit of detection of 10 E. coli O157:H7 cells/10 mL has been demonstrated within 10 min of turnaround time. The protocol can readily be integrated into a mobile electronics such as smartphones for rapid and reproducible field detection of waterborne pathogens.

  16. NHEXAS PHASE I ARIZONA STUDY--LIST OF STANDARD OPERATING PROCEDURES

    EPA Science Inventory

    This document lists available protocols and SOPs for the NHEXAS Phase I Arizona study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assurance, (...

  17. Estimating occupancy and abundance of stream amphibians using environmental DNA from filtered water samples

    USGS Publications Warehouse

    Pilliod, David S.; Goldberg, Caren S.; Arkle, Robert S.; Waits, Lisette P.

    2013-01-01

    Environmental DNA (eDNA) methods for detecting aquatic species are advancing rapidly, but with little evaluation of field protocols or precision of resulting estimates. We compared sampling results from traditional field methods with eDNA methods for two amphibians in 13 streams in central Idaho, USA. We also evaluated three water collection protocols and the influence of sampling location, time of day, and distance from animals on eDNA concentration in the water. We found no difference in detection or amount of eDNA among water collection protocols. eDNA methods had slightly higher detection rates than traditional field methods, particularly when species occurred at low densities. eDNA concentration was positively related to field-measured density, biomass, and proportion of transects occupied. Precision of eDNA-based abundance estimates increased with the amount of eDNA in the water and the number of replicate subsamples collected. eDNA concentration did not vary significantly with sample location in the stream, time of day, or distance downstream from animals. Our results further advance the implementation of eDNA methods for monitoring aquatic vertebrates in stream habitats.

  18. Microbial Groundwater Sampling Protocol for Fecal-Rich Environments

    PubMed Central

    Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William

    2014-01-01

    Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186

  19. STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM

    EPA Science Inventory

    The manual, in support of the Florida Radon Research Program, contains standard protocols for key measurements where data quality is vital to the program. It contains two sections. The first section, soil measurements, contains field sampling protocols for soil gas permeability and...

  20. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--LIST OF STANDARD OPERATING PROCEDURES

    EPA Science Inventory

    This document lists available protocols and SOPs for the U.S.-Mexico Border Program study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assuranc...

  1. Protocol for collecting eDNA samples from streams [Version 2.3]

    Treesearch

    K. J. Carim; T. Wilcox; M. K. Young; K. S. McKelvey; M. K. Schwartz

    2015-01-01

    Throughout the 2014 field season, we had over two dozen biologists throughout the western US collect over 300 samples for eDNA analysis with paired controls. Control samples were collected by filtering 0.5 L of distilled water. No samples had any evidence of field contamination. This method of sampling verifies the cleanliness of the field equipment, as well as the...

  2. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--SAMPLE SHIPPING PROCEDURES (RTI/ACS-AP-209-083)

    EPA Science Inventory

    This procedure summarizes the sample shipping procedures that have been described in the individual NHEXAS sample collection protocols. This procedure serves as a quick reference tool for the field staff when samples are prepared for shipment at the field lab/staging area. For ea...

  3. Immune system changes during simulated planetary exploration on Devon Island, high arctic

    PubMed Central

    Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence

    2007-01-01

    Background Dysregulation of the immune system has been shown to occur during spaceflight, although the detailed nature of the phenomenon and the clinical risks for exploration class missions have yet to be established. Also, the growing clinical significance of immune system evaluation combined with epidemic infectious disease rates in third world countries provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002 NASA performed a comprehensive immune assessment on field team members participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate the effect of mission-associated stressors on the human immune system. To perform the study, the development of techniques for processing immune samples in remote field locations was required. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, whole-blood mitogenic culture for functional assessments and cell-sample preservation on-location at Devon Island. Specific assays included peripheral leukocyte distribution; constitutively activated T cells, intracellular cytokine profiles, plasma cortisol and EBV viral antibody levels. Study timepoints were 30 days prior to mission start, mid-mission and 60 days after mission completion. Results The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed on Devon Island, and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in astronauts following spaceflight. Conclusion The immune system changes described during the HMP field deployment validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology. The sample processing protocol developed for this study may have applications for immune studies in remote terrestrial field locations. Elements of this protocol could possibly be adapted for future in-flight immunology studies conducted during space missions. PMID:17521440

  4. Field Immune Assessment during Simulated Planetary Exploration in the Canadian Arctic

    NASA Technical Reports Server (NTRS)

    Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence

    2006-01-01

    Dysregulation of the immune system has been shown to occur during space flight, although the detailed nature of the phenomenon and the clinical risks for exploration class missions has yet to be established. In addition, the growing clinical significance of immune system evaluation combined with epidemic infectious disease rates in third world countries provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002 NASA performed a comprehensive field immunology assessment on crewmembers participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate mission-associated effects on the human immune system, as well as to evaluate techniques developed for processing immune samples in remote field locations. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, wholeblood mitogenic culture for functional assessments and cell-sample preservation on-location at Devon Island. Specific assays included peripheral leukocyte distribution; constitutively activated T cells, intracellular cytokine profiles and plasma EBV viral antibody levels. Study timepoints were L-30, midmission and R+60. The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed in the field location, and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in Astronauts following spaceflight. The sample processing protocol developed for this study may have applications for immune assessment during exploration-class space missions or in remote terrestrial field locations. The data validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology.

  5. General introduction for the “National Field Manual for the Collection of Water-Quality Data”

    USGS Publications Warehouse

    2018-02-28

    BackgroundAs part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation’s water resources. A high degree of reliability and standardization of these data are paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. “The National Field Manual for the Collection of Water-Quality Data” (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class, “Field Water-Quality Methods for Groundwater and Surface Water,” and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.

  6. A comparison of single and multiple stressor protocols to assess acute stress in a coastal shark species, Rhizoprionodon terraenovae.

    PubMed

    Hoffmayer, Eric R; Hendon, Jill M; Parsons, Glenn R; Driggers, William B; Campbell, Matthew D

    2015-10-01

    Elasmobranch stress responses are traditionally measured in the field by either singly or serially sampling an animal after a physiologically stressful event. Although capture and handling techniques are effective at inducing a stress response, differences in protocols could affect the degree of stress experienced by an individual, making meaningful comparisons between the protocols difficult, if not impossible. This study acutely stressed Atlantic sharpnose sharks, Rhizoprionodon terraenovae, by standardized capture (rod and reel) and handling methods and implemented either a single or serial blood sampling protocol to monitor four indicators of the secondary stress response. Single-sampled sharks were hooked and allowed to swim around the boat until retrieved for a blood sample at either 0, 15, 30, 45, or 60 min post-hooking. Serially sampled sharks were retrieved, phlebotomized, released while still hooked, and subsequently resampled at 15, 30, 45, and 60 min intervals post-hooking. Blood was analyzed for hematocrit, and plasma glucose, lactate, and osmolality levels. Although both single and serial sampling protocols resulted in an increase in glucose, no significant difference in glucose level was found between protocols. Serially sampled sharks exhibited cumulatively heightened levels for lactate and osmolality at all time intervals when compared to single-sampled animals at the same time. Maximal concentration differences of 217.5, 9.8, and 41.6 % were reported for lactate, osmolality, and glucose levels, respectively. Hematocrit increased significantly over time for the single sampling protocol but did not change significantly during the serial sampling protocol. The differences in resultant blood chemistry levels between implemented stress protocols and durations are significant and need to be considered when assessing stress in elasmobranchs.

  7. An Organic Decontamination Method for Sampling Devices used in Life-detection Studies

    NASA Technical Reports Server (NTRS)

    Eigenbrode, Jennifer; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E.F.

    2008-01-01

    Organic decontamination of sampling and storage devices is a crucial step for life-detection, habitability, and ecological investigations of extremophiles living in the most inhospitable niches of Earth, Mars and elsewhere. However, one of the main stumbling blocks for Mars-analogue life-detection studies in terrestrial remote field-sites is the capability to clean instruments and sampling devices to organic levels consistent with null values. Here we present a new seven-step, multi-reagent cleaning and decontamination protocol that was adapted and tested on a glacial ice-coring device and on a rover-guided scoop used for sediment sampling, both deployed multiple times during two field seasons of the Arctic Mars Analog Svalbard Expedition (AMASE). The effectiveness of the protocols for both devices was tested by (1) in situ metabolic measurements via ATP, (2) in situ lipopolysaccharide (LPS) quantifications via low-level endotoxin assays, and (3) laboratory-based molecular detection via gas chromatography-mass spectrometry. Our results show that the combination and step-wise application of disinfectants with oxidative and solvation properties for sterilization are effective at removing cellular remnants and other organic traces to levels necessary for molecular organic- and life-detection studies. The validation of this seven-step protocol - specifically for ice sampling - allows us to proceed with confidence in Mars-analogue investigations of icy environments. However, results from a rover scoop test showed that this protocol is also suitable for null-level decontamination of sample acquisition devices. Thus, this protocol may be applicable to a variety of sampling devices and analytical instrumentation used for future astrobiology missions to Enceladus and Europa, as well as for sample-return missions.

  8. An optimised protocol for molecular identification of Eimeria from chickens

    PubMed Central

    Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L.; Macdonald, Sarah E.; Chaudhry, Abdul S.; Sparagano, Olivier; Banerjee, Partha S.; Kundu, Krishnendu; Tomley, Fiona M.; Blake, Damer P.

    2014-01-01

    Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. PMID:24138724

  9. Protocol to obtain targeted transcript sequence data from snake venom samples collected in the Colombian field.

    PubMed

    Fonseca, Alejandra; Renjifo-Ibáñez, Camila; Renjifo, Juan Manuel; Cabrera, Rodrigo

    2018-03-21

    Snake venoms are a mixture of different molecules that can be used in the design of drugs for various diseases. The study of these venoms has relied on strategies that use complete venom extracted from animals in captivity or from venom glands that require the sacrifice of the animals. Colombia, a country with political and geographical conflicts, has difficult access to certain regions. A strategy that can prevent the sacrifice of animals and could allow the study of samples collected in the field is necessary. We report the use of lyophilized venom from Crotalus durissus cumanensis as a model to test, for the first time, a protocol for the amplification of complete toxins from Colombian venom samples collected in the field. In this protocol, primers were designed from conserved regions of Crotalus sp. mRNA and EST regions to maximize the likelihood of coding sequence amplification. We obtained the sequences of Metalloproteinases II, Disintegrins, Disintegrin-Like, Phospholipases A2, C-type Lectins and Serine proteinases from Crotalus durissus cumanensis and compared them to different Crotalus sp. sequences available on databases, obtaining concordance between the toxins amplified and those reported. Our strategy allows the use of lyophilized venom to obtain complete toxin sequences from samples collected in the field and the study of poorly characterized venoms in challenging environments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Protocol for Cohesionless Sample Preparation for Physical Experimentation

    DTIC Science & Technology

    2016-05-01

    protocol for specimen preparation that will enable the use of soil strength curves based on expedient field classification testing (e.g., grain-size...void ratio and relative compaction, which compares field compaction to a laboratory maximum density. Gradation charts for the two materials used in...the failure stress. Ring shear testing was performed using the GCTS Residual-Ring Shear System SRS-150 in order to measure the peak torsional

  11. Manuals Used in the National Aquatic Resource Surveys

    EPA Pesticide Factsheets

    Various manuals are used to communicate the methods and guidelines for the National Aquatic Resource Surveys. The Field Operations Manual outlines the field protocols that crews will utilize to sample sites.

  12. Comparison of PCR methods for the detection of genetic variants of carp edema virus.

    PubMed

    Adamek, Mikolaj; Matras, Marek; Jung-Schroers, Verena; Teitge, Felix; Heling, Max; Bergmann, Sven M; Reichert, Michal; Way, Keith; Stone, David M; Steinhagen, Dieter

    2017-09-20

    The infection of common carp and its ornamental variety, koi, with the carp edema virus (CEV) is often associated with the occurrence of a clinical disease called 'koi sleepy disease'. The disease may lead to high mortality in both koi and common carp populations. To prevent further spread of the infection and the disease, a reliable detection method for this virus is required. However, the high genetic variability of the CEV p4a gene used for PCR-based diagnostics could be a serious obstacle for successful and reliable detection of virus infection in field samples. By analysing 39 field samples from different geographical origins obtained from koi and farmed carp and from all 3 genogroups of CEV, using several recently available PCR protocols, we investigated which of the protocols would allow the detection of CEV from all known genogroups present in samples from Central European carp or koi populations. The comparison of 5 different PCR protocols showed that the PCR assays (both end-point and quantitative) developed in the Centre for Environment, Fisheries and Aquaculture Science exhibited the highest analytical inclusivity and diagnostic sensitivity. Currently, this makes them the most suitable protocols for detecting viruses from all known CEV genogroups.

  13. Further studies on the problems of geomagnetic field intensity determination from archaeological baked clay materials

    NASA Astrophysics Data System (ADS)

    Kostadinova-Avramova, M.; Kovacheva, M.

    2015-10-01

    Archaeological baked clay remains provide valuable information about the geomagnetic field in the historical past, but determination of the geomagnetic field characteristics, especially intensity, is often a difficult task. This study was undertaken to elucidate the reasons for unsuccessful intensity determination experiments obtained from two different Bulgarian archaeological sites (Nessebar - Early Byzantine period and Malenovo - Early Iron Age). With this aim, artificial clay samples were formed in the laboratory and investigated. The clay used for the artificial samples preparation differs according to its initial state. Nessebar clay was baked in antiquity, but Malenovo clay was raw, taken from the clay deposit near the site. The obtained artificial samples were repeatedly heated eight times in a known magnetic field to 700 °C. X-ray diffraction analyses and rock-magnetic experiments were performed to obtain information about the mineralogical content and magnetic properties of the initial and laboratory heated clays. Two different protocols were applied for the intensity determination: the Coe version of the Thellier and Thellier method and the multispecimen parallel differential pTRM protocol. Various combinations of laboratory fields and mutual positions of the directions of laboratory field and carried thermoremanence were used in the performed Coe experiment. The obtained results indicate that the failure of this experiment is probably related to unfavourable grain sizes of the prevailing magnetic carriers combined with the chosen experimental conditions. The multispecimen parallel differential pTRM protocol in its original form gives excellent results for the artificial samples, but failed for the real samples (samples coming from previously studied kilns of Nessebar and Malenovo sites). Obviously the strong dependence of this method on the homogeneity of the used subsamples hinders its implementation in its original form for archaeomaterials. The latter are often heterogeneous due to variable heating conditions in the different parts of the archaeological structures. The study draws attention to the importance of multiple heating for the stabilization of grain size distribution in baked clay materials and the need to elucidate this question.

  14. An optimised protocol for molecular identification of Eimeria from chickens.

    PubMed

    Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L; Macdonald, Sarah E; Chaudhry, Abdul S; Sparagano, Olivier; Banerjee, Partha S; Kundu, Krishnendu; Tomley, Fiona M; Blake, Damer P

    2014-01-17

    Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Utility of a fecal real-time PCR protocol for detection of Mycobacterium bovis infection in African buffalo (Syncerus caffer).

    PubMed

    Roug, Annette; Geoghegan, Claire; Wellington, Elizabeth; Miller, Woutrina A; Travis, Emma; Porter, David; Cooper, David; Clifford, Deana L; Mazet, Jonna A K; Parsons, Sven

    2014-01-01

    A real-time PCR protocol for detecting Mycobacterium bovis in feces was evaluated in bovine tuberculosis-infected African buffalo (Syncerus caffer). Fecal samples spiked with 1.42 × 10³ cells of M. bovis culture/g and Bacille Calmette-Guérin standards with 1.58 × 10¹ genome copies/well were positive by real-time PCR, but all field samples were negative.

  16. System and method for non-destructive evaluation of surface characteristics of a magnetic material

    DOEpatents

    Jiles, David C.; Sipahi, Levent B.

    1994-05-17

    A system and a related method for non-destructive evaluation of the surface characteristics of a magnetic material. The sample is excited by an alternating magnetic field. The field frequency, amplitude and offset are controlled according to a predetermined protocol. The Barkhausen response of the sample is detected for the various fields and offsets and is analyzed. The system produces information relating to the frequency content, the amplitude content, the average or RMS energy content, as well as count rate information, for each of the Barkhausen responses at each of the excitation levels applied during the protocol. That information provides a contiguous body of data, heretofore unavailable, which can be analyzed to deduce information about the surface characteristics of the material at various depths below the surface.

  17. Effect of variable rates of daily sampling of fly larvae on decomposition and carrion insect community assembly: implications for forensic entomology field study protocols.

    PubMed

    Michaud, Jean-Philippe; Moreau, Gaétan

    2013-07-01

    Experimental protocols in forensic entomology successional field studies generally involve daily sampling of insects to document temporal changes in species composition on animal carcasses. One challenge with that method has been to adjust the sampling intensity to obtain the best representation of the community present without affecting the said community. To this date, little is known about how such investigator perturbations affect decomposition-related processes. Here, we investigated how different levels of daily sampling of fly eggs and fly larvae affected, over time, carcass decomposition rate and the carrion insect community. Results indicated that a daily sampling of <5% of the egg and larvae volumes present on a carcass, a sampling intensity believed to be consistent with current accepted practices in successional field studies, had little effect overall. Higher sampling intensities, however, slowed down carcass decomposition, affected the abundance of certain carrion insects, and caused an increase in the volume of eggs laid by dipterans. This study suggests that the carrion insect community not only has a limited resilience to recurrent perturbations but that a daily sampling intensity equal to or <5% of the egg and larvae volumes appears adequate to ensure that the system is representative of unsampled conditions. Hence we propose that this threshold be accepted as best practice in future forensic entomology successional field studies.

  18. Towards robust and repeatable sampling methods in eDNA based studies.

    PubMed

    Dickie, Ian A; Boyer, Stephane; Buckley, Hannah; Duncan, Richard P; Gardner, Paul; Hogg, Ian D; Holdaway, Robert J; Lear, Gavin; Makiola, Andreas; Morales, Sergio E; Powell, Jeff R; Weaver, Louise

    2018-05-26

    DNA based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols developed that minimise subjectivity. Standardisation of sampling protocols would be one way to help to improve reproducibility, and have additional benefits in allowing compilation and comparison of data from across studies. This article is protected by copyright. All rights reserved.

  19. Methods for Monitoring Fish Communities of Buffalo National River and Ozark National Scenic Riverways in the Ozark Plateaus of Arkansas and Missouri: Version 1.0

    USGS Publications Warehouse

    Petersen, James C.; Justus, B.G.; Dodd, H.R.; Bowles, D.E.; Morrison, L.W.; Williams, M.H.; Rowell, G.A.

    2008-01-01

    Buffalo National River located in north-central Arkansas, and Ozark National Scenic Riverways, located in southeastern Missouri, are the two largest units of the National Park Service in the Ozark Plateaus physiographic province. The purpose of this report is to provide a protocol that will be used by the National Park Service to sample fish communities and collect related water-quality, habitat, and stream discharge data of Buffalo National River and Ozark National Scenic Riverways to meet inventory and long-term monitoring objectives. The protocol includes (1) a protocol narrative, (2) several standard operating procedures, and (3) supplemental information helpful for implementation of the protocol. The protocol narrative provides background information about the protocol such as the rationale of why a particular resource or resource issue was selected for monitoring, information concerning the resource or resource issue of interest, a description of how monitoring results will inform management decisions, and a discussion of the linkages between this and other monitoring projects. The standard operating procedures cover preparation, training, reach selection, water-quality sampling, fish community sampling, physical habitat collection, measuring stream discharge, equipment maintenance and storage, data management and analysis, reporting, and protocol revision procedures. Much of the information in the standard operating procedures was gathered from existing protocols of the U.S. Geological Survey National Water Quality Assessment program or other sources. Supplemental information that would be helpful for implementing the protocol is included. This information includes information on fish species known or suspected to occur in the parks, sample sites, sample design, fish species traits, index of biotic integrity metrics, sampling equipment, and field forms.

  20. A novel method of genomic DNA extraction for Cactaceae

    PubMed Central

    Fehlberg, Shannon D.; Allen, Jessica M.; Church, Kathleen

    2013-01-01

    • Premise of the study: Genetic studies of Cactaceae can at times be impeded by difficult sampling logistics and/or high mucilage content in tissues. Simplifying sampling and DNA isolation through the use of cactus spines has not previously been investigated. • Methods and Results: Several protocols for extracting DNA from spines were tested and modified to maximize yield, amplification, and sequencing. Sampling of and extraction from spines resulted in a simplified protocol overall and complete avoidance of mucilage as compared to typical tissue extractions. Sequences from one nuclear and three plastid regions were obtained across eight genera and 20 species of cacti using DNA extracted from spines. • Conclusions: Genomic DNA useful for amplification and sequencing can be obtained from cactus spines. The protocols described here are valuable for any cactus species, but are particularly useful for investigators interested in sampling living collections, extensive field sampling, and/or conservation genetic studies. PMID:25202521

  1. USGS/EPA collection protocol for bacterial pathogens in soil

    USGS Publications Warehouse

    Griffin, Dale W.; Shaefer, F.L.; Bowling, Charlena; Mattorano, Dino; Nichols, Tonya; Silvestri, Erin

    2014-01-01

    This Sample Collection Procedure (SCP) describes the activities and considerations for the collection of bacterial pathogens from representative surface soil samples (0-5 cm). This sampling depth can be reached without the use of a drill rig, direct-push technology, or other mechanized equipment. This procedure can be used in most soil types but is limited to sampling at or near the ground surface. This protocol has components for two different types of sampling applications: (1) typical sampling, when there is no suspicion of contamination (e.g., surveillance or background studies); and (2) in response to known or suspected accidental contamination (e.g., the presence of animal carcasses). This protocol does not cover sampling in response to a suspected bioterrorist or intentional release event. Surface material is removed to the required depth (0-5 cm) and a clean trowel or 50-ml sample tube is used to collect the sample. Sample containers are sealed, bagged, and shipped to the laboratory for analysis. Associated documentation, including a Field Data Log and Chain-of-Custody, is also included in this document.

  2. Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling

    PubMed Central

    Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David

    2016-01-01

    Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
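
    The localized random sampling scheme summarized in the abstract above lends itself to a compact sketch. The Python/NumPy fragment below is an illustrative reading of that description only; the Gaussian distance kernel and its width sigma are assumptions, not parameters reported by the authors.

      import numpy as np

      def localized_random_samples(image, n_centers, sigma=3.0, seed=None):
          """Draw localized random measurement sets from a 2-D image.

          For each set, pick one pixel uniformly at random, then include each
          other pixel with a probability that decays with its distance from
          that center (a Gaussian kernel is assumed here for illustration).
          Returns a list of (row_indices, col_indices) tuples, one per center.
          """
          rng = np.random.default_rng(seed)
          rows, cols = image.shape
          yy, xx = np.mgrid[0:rows, 0:cols]
          samples = []
          for _ in range(n_centers):
              r0, c0 = rng.integers(rows), rng.integers(cols)
              dist2 = (yy - r0) ** 2 + (xx - c0) ** 2
              prob = np.exp(-dist2 / (2.0 * sigma ** 2))  # assumed kernel shape
              mask = rng.random(image.shape) < prob
              mask[r0, c0] = True                         # always keep the center
              samples.append(np.nonzero(mask))
          return samples

      # Example: 20 localized measurement sets from a random 64x64 test image.
      sets = localized_random_samples(np.random.rand(64, 64), n_centers=20)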

  3. Chapter A10. Lakes and reservoirs: Guidelines for study design and sampling

    USGS Publications Warehouse

    Green, William R.; Robertson, Dale M.; Wilde, Franceska D.

    2015-09-29

    Within this chapter are references to other chapters of the NFM that provide more detailed guidelines related to specific topics and more detailed protocols for the quality assurance and assessment of the lake and reservoir data. Protocols and procedures to address and document the quality of lake and reservoir investigations are adapted from, or referenced to, the protocols and standard operating procedures contained in related chapters of this National Field Manual.

  4. Spin ensemble-based AC magnetometry using concatenated dynamical decoupling at low temperatures

    NASA Astrophysics Data System (ADS)

    Farfurnik, D.; Jarmola, A.; Budker, D.; Bar-Gill, N.

    2018-01-01

    Ensembles of nitrogen-vacancy centers in diamond are widely used as AC magnetometers. While such measurements are usually performed using standard (XY) dynamical decoupling (DD) protocols at room temperature, we study the sensitivities achieved by utilizing various DD protocols, for measuring magnetic AC fields at frequencies in the 10-250 kHz range, at room temperature and 77 K. By performing measurements on an isotopically pure 12C sample, we find that the Carr-Purcell-Meiboom-Gill protocol, which is not robust against pulse imperfections, is less efficient for magnetometry than robust XY-based sequences. The concatenation of a standard XY-based protocol may enhance the sensitivities only for measuring high-frequency fields, for which many (> 500) DD pulses are necessary and the robustness against pulse imperfections is critical. Moreover, we show that cooling is effective only for measuring low-frequency fields (~10 kHz), for which the experiment time approaches T1 at a small number of applied DD pulses.

  5. Assessing five field sampling methods to monitor Yellowstone National Park's northern ungulate winter range: the advantages and disadvantages of implementing a new sampling protocol

    Treesearch

    Pamela G. Sikkink; Roy Renkin; Geneva Chong; Art Sikkink

    2013-01-01

    The five field sampling methods tested for this study differed in richness and Simpson's Index values calculated from the raw data. How much the methods differed, and which ones were most similar to each other, depended on which diversity measure and which type of data were used for comparisons. When the number of species (richness) was used as a measure of...

  6. Environmental DNA as a Tool for Inventory and Monitoring of Aquatic Vertebrates

    DTIC Science & Technology

    2017-07-01

    geomorphic calculations and description of each reach. Methods: Channel Surveys. We initially selected reaches based on access and visual indicators... Environmental DNA lab protocol: designing species-specific qPCR assays. Species-specific surveys should use quantitative polymerase... to traditional field sampling with respect to sensitivity, detection probabilities, and cost efficiency. Compared to field surveys, eDNA sampling...

  7. Lichen elements as pollution indicators: evaluation of methods for large monitoring programmes

    Treesearch

    Susan Will-Wolf; Sarah Jovan; Michael C. Amacher

    2017-01-01

    Lichen element content is a reliable indicator for relative air pollution load in research and monitoring programmes requiring both efficiency and representation of many sites. We tested the value of costly rigorous field and handling protocols for sample element analysis using five lichen species. No relaxation of rigour was supported; four relaxed protocols generated...

  8. Multiplex PCR method for MinION and Illumina sequencing of Zika and other virus genomes directly from clinical samples.

    PubMed

    Quick, Joshua; Grubaugh, Nathan D; Pullan, Steven T; Claro, Ingra M; Smith, Andrew D; Gangavarapu, Karthik; Oliveira, Glenn; Robles-Sikisaka, Refugio; Rogers, Thomas F; Beutler, Nathan A; Burton, Dennis R; Lewis-Ximenez, Lia Laura; de Jesus, Jaqueline Goes; Giovanetti, Marta; Hill, Sarah C; Black, Allison; Bedford, Trevor; Carroll, Miles W; Nunes, Marcio; Alcantara, Luiz Carlos; Sabino, Ester C; Baylis, Sally A; Faria, Nuno R; Loose, Matthew; Simpson, Jared T; Pybus, Oliver G; Andersen, Kristian G; Loman, Nicholas J

    2017-06-01

    Genome sequencing has become a powerful tool for studying emerging infectious diseases; however, genome sequencing directly from clinical samples (i.e., without isolation and culture) remains challenging for viruses such as Zika, for which metagenomic sequencing methods may generate insufficient numbers of viral reads. Here we present a protocol for generating coding-sequence-complete genomes, comprising an online primer design tool, a novel multiplex PCR enrichment protocol, optimized library preparation methods for the portable MinION sequencer (Oxford Nanopore Technologies) and the Illumina range of instruments, and a bioinformatics pipeline for generating consensus sequences. The MinION protocol does not require an Internet connection for analysis, making it suitable for field applications with limited connectivity. Our method relies on multiplex PCR for targeted enrichment of viral genomes from samples containing as few as 50 genome copies per reaction. Viral consensus sequences can be achieved in 1-2 d by starting with clinical samples and following a simple laboratory workflow. This method has been successfully used by several groups studying Zika virus evolution and is facilitating an understanding of the spread of the virus in the Americas. The protocol can be used to sequence other viral genomes using the online Primal Scheme primer designer software. It is suitable for sequencing either RNA or DNA viruses in the field during outbreaks or as an inexpensive, convenient method for use in the lab.
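    The multiplex enrichment approach rests on tiling the target genome with overlapping amplicons which, in published implementations of this approach, are typically split between two PCR pools so that overlapping neighbours do not compete in the same reaction. The sketch below lays out such a tiling; the amplicon length, overlap, and two-pool assignment are illustrative assumptions rather than the exact output of the Primal Scheme designer.

    ```python
    def tile_amplicons(genome_length, amplicon_len=400, overlap=100):
        """Sketch of a tiling-amplicon layout: amplicons of ~amplicon_len bases
        step along the genome so that neighbours overlap by ~overlap bases, and
        alternate amplicons are assigned to two separate PCR pools so overlapping
        products never compete in one reaction. Numbers are illustrative only.
        """
        step = amplicon_len - overlap
        amplicons = []
        start, idx = 0, 1
        while start + amplicon_len < genome_length + step:
            end = min(start + amplicon_len, genome_length)
            pool = 1 if idx % 2 else 2          # odd amplicons -> pool 1, even -> pool 2
            amplicons.append({"id": idx, "start": start, "end": end, "pool": pool})
            start += step
            idx += 1
        return amplicons

    # ~10.8 kb Zika genome with hypothetical 400 bp amplicons and 100 bp overlaps
    for a in tile_amplicons(10_800)[:4]:
        print(a)
    ```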

  9. AEROBIC SOIL MICROCOSMS FOR LONG-TERM BIODEGRADATION OF HYDROCARBON VAPORS

    EPA Science Inventory

    The aims of this research project included the development of laboratory protocols for the preparation of aerobic soil microcosms using aseptic field soil samples, and for the gas chromatographic analysis of hydrocarbon vapor biodegradation based on vapor samples obtained from th...

  10. CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS

    EPA Science Inventory

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...

  11. DIETARY EXPOSURES OF YOUNG CHILDREN, PART II: FIELD STUDY

    EPA Science Inventory

    A small, pilot field study was conducted to determine the adequacy of protocols for dietary exposure measurements. Samples were collected to estimate the amount of pesticides transferred from contaminated surfaces or hands to foods of young children and to validate a dietary mod...

  12. A Data Scheduling and Management Infrastructure for the TEAM Network

    NASA Astrophysics Data System (ADS)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.; Unwin, R.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites, with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010.
    Climate Protocol: The Climate Protocol entails the collection of climate data via meteorological stations located at the TEAM Sites. This includes information such as precipitation, temperature, wind direction and strength, and various solar radiation measurements.
    Vegetation Protocol: The Vegetation Protocol collects standardized information on tropical forest trees and lianas. A TEAM Site will have between 6 and 9 1-ha plots where trees and lianas larger than a pre-specified size are mapped, identified and measured. This results in each TEAM Site repeatedly measuring between 3000 and 5000 trees annually.
    Terrestrial Vertebrate Protocol: The Terrestrial Vertebrate Protocol collects standardized information on mid-sized tropical forest fauna (i.e. birds and mammals). This information is collected via camera traps (i.e. digital cameras with motion sensors housed in weather-proof casings). The images taken by the camera traps are reviewed to identify the species captured in each image. The image and the interpretation of what is in the image are the data for the Terrestrial Vertebrate Protocol.
    The amount of data collected through the TEAM protocols presents a significant yet exciting IT challenge. The TEAM Network is currently partnering with the San Diego Supercomputer Center to build the data management infrastructure. Data collected from the three core protocols as well as others are currently made available through the TEAM Network portal, which provides the content management framework, the data scheduling and management framework, an administrative framework to implement and manage TEAM sites, collaborative tools, and a number of tools and applications utilizing Google Map and Google Earth products. A critical element of the TEAM Network data management infrastructure is to make the data publicly available in as close to real time as possible (the TEAM Network Data Use Policy: http://www.teamnetwork.org/en/data/policy). This requires two essential tasks to be accomplished: (1) a data collection schedule has to be planned, proposed and approved for a given TEAM site. This is a challenging process because TEAM sites are geographically distributed across the tropics and therefore schedule field sampling for the different TEAM protocols in different seasons. Capturing this information and ensuring that TEAM sites follow the outlined legal contract is key to the data collection process; and (2) a streamlined and efficient information management system is needed to ensure that data collected in the field meet the minimum data standards (i.e. are of the highest scientific quality) and are securely transferred, archived, processed, and rapidly made publicly available as a finished, consumable product via the TEAM Network portal.
    The TEAM Network is achieving these goals by implementing an end-to-end framework consisting of the Sampling Scheduler application and the Data Management Framework.
    Sampling Scheduler: The Sampling Scheduler is a project-management, calendar-based portal application that will allow scientists at a TEAM site to schedule field sampling for each of the TEAM protocols implemented at that site. The Sampling Scheduler reconciles the specific requirements established in the TEAM protocols with the logistical scheduling needs of each TEAM Site. For example, each TEAM protocol defines when data must be collected (e.g. time of day, number of times per year, during which seasons, etc.) as well as where data must be collected (from which sampling units, which trees, etc.). Each TEAM Site has a limited number of resources and must create plans that both satisfy the requirements of the protocols and are logistically feasible for that TEAM Site. With 15 TEAM Sites (and many more coming soon), the schedule of each TEAM Site must be communicated to the Network Office to ensure data are being collected as scheduled and to address the many problems of working in difficult environments like tropical forests. The Sampling Scheduler provides built-in proposal and approval functionality to ensure that the TEAM Sites and the Network Office stay in sync, as well as the capability to modify schedules when needed.
    The Data Management Framework: The Data Management Framework is a three-tier data ingestion, edit and review application for protocols defined in the TEAM Network. The data ingestion framework provides online web forms for field personnel to submit and edit data collected at TEAM Sites. These web forms will be accessible from the TEAM content management site. Once the data are securely uploaded, curated, processed and approved, they will be made publicly available for consumption by the scientific community. The Data Management Framework, when combined with the Sampling Scheduler, provides a closed-loop data scheduling and management infrastructure. All information is included, from the data collection plan, through tools to input, modify and curate data and to review and run QA/QC tests, to verification that data are collected as planned. Finally, TEAM Network data are available for download via the Data Query and Download Application. This application utilizes a Google Maps custom interface to search, visualize, and download TEAM Network data.
    References:
    • TEAM Network, http://www.teamnetwork.org
    • Center for Applied Biodiversity Science, Conservation International, http://science.conservation.org/portal/server.pt
    • TEAM Data Query and Download Application, http://www.teamnetwork.org/en/data/query
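    To make the proposal/approval workflow described above concrete, the sketch below models a site's annual schedule with a minimal data structure. The class and field names (SiteSchedule, SamplingEvent, the Status states) are hypothetical illustrations, not the TEAM portal's actual schema or API.

    ```python
    from dataclasses import dataclass, field
    from datetime import date
    from enum import Enum

    class Status(Enum):
        DRAFT = "draft"
        PROPOSED = "proposed"
        APPROVED = "approved"

    @dataclass
    class SamplingEvent:
        protocol: str          # e.g. "Vegetation", "Climate", "Terrestrial Vertebrate"
        sampling_unit: str     # e.g. plot or camera-trap identifier
        window_start: date
        window_end: date

    @dataclass
    class SiteSchedule:
        site: str
        year: int
        events: list[SamplingEvent] = field(default_factory=list)
        status: Status = Status.DRAFT

        def propose(self):
            """Site staff submit the schedule to the Network Office."""
            self.status = Status.PROPOSED

        def approve(self):
            """Network Office confirms the schedule meets protocol requirements."""
            if self.status is Status.PROPOSED:
                self.status = Status.APPROVED

    schedule = SiteSchedule(site="Example TEAM Site", year=2009)
    schedule.events.append(SamplingEvent("Vegetation", "Plot-1", date(2009, 6, 1), date(2009, 7, 15)))
    schedule.propose()
    schedule.approve()
    ```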

  13. Optimization and validation of sample preparation for metagenomic sequencing of viruses in clinical samples.

    PubMed

    Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael

    2017-08-08

    Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing allow for virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers and compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol does work not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.

  14. FIELD AND LABORATORY PERFORMANCE CHARACTERISTICS OF A NEW SAMPLING PROTOCOL FOR RIVERINE MACROINVERTEBRATE ASSEMBLAGES

    EPA Science Inventory

    Measurement and estimation of performance characteristics (i.e., precision, bias, performance range, interferences and sensitivity) are often neglected in the development and use of new biological sampling methods. However, knowledge of this information is critical in enabling p...

  15. Detection and Evaluation of Elevated Lead Release from Service Lines: A Field Study

    EPA Science Inventory

    Comparative stagnation sampling conducted in 32 homes in Chicago, Illinois, with lead service lines (LSLs) demonstrated that the existing regulatory sampling protocol under the U.S. Lead and Copper Rule (LCR) systematically underestimated lead corrosion. Lead levels were highest within...

  16. A new real-time PCR protocol for detection of avian haemosporidians.

    PubMed

    Bell, Jeffrey A; Weckstein, Jason D; Fecchio, Alan; Tkach, Vasyl V

    2015-07-19

    Birds possess the most diverse assemblage of haemosporidian parasites, including three genera: Plasmodium, Haemoproteus, and Leucocytozoon. Currently there are over 200 morphologically identified avian haemosporidian species, although true species richness is unknown due to great genetic diversity and insufficient sampling in highly diverse regions. Studies aimed at surveying haemosporidian diversity involve collecting and screening samples from hundreds to thousands of individuals. Currently, screening relies on microscopy and/or single or nested standard PCR. Although effective, these methods are time and resource consuming, and in the case of microscopy require substantial expertise. Here we report a newly developed real-time PCR protocol designed to quickly and reliably detect all three genera of avian haemosporidians in a single biochemical reaction. Using available DNA sequences from avian haemosporidians we designed primers R330F and R480RL, which flank a 182 base pair fragment of conserved mitochondrial rDNA. These primers were initially tested using real-time PCR on samples from Malawi, Africa, previously screened for avian haemosporidians using traditional nested PCR. Our real-time protocol was further tested on 94 samples from the Cerrado biome of Brazil, previously screened using a single PCR assay for haemosporidian parasites. These samples were also amplified using modified nested PCR protocols, allowing for comparisons between the three different screening methods (single PCR, nested PCR, real-time PCR). The real-time PCR protocol successfully identified all three genera of avian haemosporidians from both single and mixed infections previously detected from Malawi. There was no significant difference between the three different screening protocols used for the 94 samples from the Brazilian Cerrado (χ² = 0.3429, df = 2, P = 0.842). After proving effective, the real-time protocol was used to screen 2113 Brazilian samples, identifying 693 positive samples. Our real-time PCR assay proved as effective as two widely used molecular screening techniques, single PCR and nested PCR. However, the real-time protocol has the distinct advantage of detecting all three genera in a single reaction, which significantly increases efficiency by greatly decreasing screening time and cost. Our real-time PCR protocol is therefore a valuable tool in the quickly expanding field of avian haemosporidian research.
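    The comparison of the three screening methods is a chi-square test of homogeneity on positive/negative counts. A minimal sketch is shown below; the per-method counts are hypothetical placeholders, since the abstract reports only the test statistic, not the underlying contingency table.

    ```python
    from scipy.stats import chi2_contingency

    # Hypothetical positive/negative counts for the 94 Cerrado samples under each
    # screening method; the real per-method counts are not given in the abstract.
    counts = {
        "single PCR":    (61, 33),   # (positive, negative)
        "nested PCR":    (63, 31),
        "real-time PCR": (64, 30),
    }
    table = [list(v) for v in counts.values()]
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.4f}, df = {dof}, P = {p:.3f}")
    ```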

  17. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.

    2014-04-15

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same precision and confidence.

  18. Sample size requirements for estimating effective dose from computed tomography using solid-state metal-oxide-semiconductor field-effect transistor dosimetry

    PubMed Central

    Trattner, Sigal; Cheng, Bin; Pieniazek, Radoslaw L.; Hoffmann, Udo; Douglas, Pamela S.; Einstein, Andrew J.

    2014-01-01

    Purpose: Effective dose (ED) is a widely used metric for comparing ionizing radiation burden between different imaging modalities, scanners, and scan protocols. In computed tomography (CT), ED can be estimated by performing scans on an anthropomorphic phantom in which metal-oxide-semiconductor field-effect transistor (MOSFET) solid-state dosimeters have been placed to enable organ dose measurements. Here a statistical framework is established to determine the sample size (number of scans) needed for estimating ED to a desired precision and confidence, for a particular scanner and scan protocol, subject to practical limitations. Methods: The statistical scheme involves solving equations which minimize the sample size required for estimating ED to desired precision and confidence. It is subject to a constrained variation of the estimated ED and solved using the Lagrange multiplier method. The scheme incorporates measurement variation introduced both by MOSFET calibration, and by variation in MOSFET readings between repeated CT scans. Sample size requirements are illustrated on cardiac, chest, and abdomen–pelvis CT scans performed on a 320-row scanner and chest CT performed on a 16-row scanner. Results: Sample sizes for estimating ED vary considerably between scanners and protocols. Sample size increases as the required precision or confidence is higher and also as the anticipated ED is lower. For example, for a helical chest protocol, for 95% confidence and 5% precision for the ED, 30 measurements are required on the 320-row scanner and 11 on the 16-row scanner when the anticipated ED is 4 mSv; these sample sizes are 5 and 2, respectively, when the anticipated ED is 10 mSv. Conclusions: Applying the suggested scheme, it was found that even at modest sample sizes, it is feasible to estimate ED with high precision and a high degree of confidence. As CT technology develops enabling ED to be lowered, more MOSFET measurements are needed to estimate ED with the same precision and confidence. PMID:24694150
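    A simplified way to see how the required number of scans scales with precision, confidence, and measurement variability is the standard large-sample relation n ≈ (z·CV/δ)², where δ is the relative precision and CV the coefficient of variation of a single ED measurement. The sketch below implements that relation; it is a stand-in under stated assumptions, not the constrained Lagrange-multiplier scheme developed in the paper, and the example CV is hypothetical.

    ```python
    import math
    from scipy.stats import norm

    def required_scans(cv, precision=0.05, confidence=0.95):
        """Simplified large-sample estimate of the number of repeated CT scans
        needed so that the confidence-interval half-width for mean ED equals
        `precision` (as a fraction of ED). `cv` is the coefficient of variation
        of a single ED measurement (MOSFET calibration plus scan-to-scan
        variation), which would come from pilot data. This is a stand-in for
        the paper's constrained optimization, not a reimplementation of it.
        """
        z = norm.ppf(0.5 + confidence / 2.0)
        return math.ceil((z * cv / precision) ** 2)

    # Example: a hypothetical 14% per-scan CV at 95% confidence and 5% precision
    print(required_scans(cv=0.14))   # -> about 31 scans
    ```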

  19. NHEXAS PHASE I REGION 5 STUDY--STANDARD OPERATING PROCEDURE--NHEXAS FILTER HANDLING, WEIGHING AND ARCHIVING PROCEDURES FOR AEROSOL SAMPLES (RTI/ACS-AP-209-011)

    EPA Science Inventory

    This protocol describes the procedures for weighing, handling, and archiving aerosol filters and for managing the associated analytical and quality assurance data. Filter samples were weighed for aerosol mass at RTI laboratory, with only the automated field sampling data transfer...

  20. Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper

    PubMed Central

    Smith, LM; Burgoyne, LA

    2004-01-01

    Background: Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology; however, storing and transporting samples collected in the field to the laboratory in a manner that allows purification of intact nucleic acids can prove problematic. Results: FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of such samples. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long-term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols, and the rationale behind the processing procedures is explained so that the end user can adjust the protocols as required. Conclusions: FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler. PMID:15072582

  1. Modeling the Sensitivity of Field Surveys for Detection of Environmental DNA (eDNA)

    PubMed Central

    Schultz, Martin T.; Lance, Richard F.

    2015-01-01

    The environmental DNA (eDNA) method is the practice of collecting environmental samples and analyzing them for the presence of a genetic marker specific to a target species. Little is known about the sensitivity of the eDNA method. Sensitivity is the probability that the target marker will be detected if it is present in the water body. Methods and tools are needed to assess the sensitivity of sampling protocols, design eDNA surveys, and interpret survey results. In this study, the sensitivity of the eDNA method is modeled as a function of ambient target marker concentration. The model accounts for five steps of sample collection and analysis, including: 1) collection of a filtered water sample from the source; 2) extraction of DNA from the filter and isolation in a purified elution; 3) removal of aliquots from the elution for use in the polymerase chain reaction (PCR) assay; 4) PCR; and 5) genetic sequencing. The model is applicable to any target species. For demonstration purposes, the model is parameterized for bighead carp (Hypophthalmichthys nobilis) and silver carp (H. molitrix) assuming sampling protocols used in the Chicago Area Waterway System (CAWS). Simulation results show that eDNA surveys have a high false negative rate at low concentrations of the genetic marker. This is attributed to processing of water samples and division of the extraction elution in preparation for the PCR assay. Increases in field survey sensitivity can be achieved by increasing sample volume, sample number, and PCR replicates. Increasing sample volume yields the greatest increase in sensitivity. It is recommended that investigators estimate and communicate the sensitivity of eDNA surveys to help facilitate interpretation of eDNA survey results. In the absence of such information, it is difficult to evaluate the results of surveys in which no water samples test positive for the target marker. It is also recommended that invasive species managers articulate concentration-based sensitivity objectives for eDNA surveys. In the absence of such information, it is difficult to design appropriate sampling protocols. The model provides insights into how sampling protocols can be designed or modified to achieve these sensitivity objectives. PMID:26509674
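    A minimal sketch of this kind of sensitivity model is given below: marker copies captured by a sample are treated as Poisson-distributed, fixed fractions survive extraction and aliquoting, and survey sensitivity is built up over PCR replicates and samples. The efficiency parameters and functional form are illustrative assumptions, not the calibrated model used for the CAWS surveys, but the sketch reproduces the qualitative result that sample volume has the largest effect.

    ```python
    import math

    def edna_survey_sensitivity(conc_per_L, vol_L, n_samples, pcr_reps,
                                extraction_eff=0.5, aliquot_frac=0.1,
                                p_detect_per_copy=0.9):
        """Simplified sketch of an eDNA survey sensitivity model: marker copies
        captured on a filter are Poisson(conc * volume), a fraction survives
        extraction, a fraction of the elution goes into each PCR, and a replicate
        is positive if it receives at least one detectable copy. All efficiency
        parameters are illustrative placeholders.
        """
        # expected detectable copies delivered to one PCR replicate
        lam = conc_per_L * vol_L * extraction_eff * aliquot_frac * p_detect_per_copy
        p_rep = 1.0 - math.exp(-lam)                  # P(one replicate is positive)
        p_sample = 1.0 - (1.0 - p_rep) ** pcr_reps    # P(a sample yields any positive)
        return 1.0 - (1.0 - p_sample) ** n_samples    # P(survey detects the marker)

    # Sensitivity rises fastest with sample volume, consistent with the abstract
    for vol in (0.25, 1.0, 2.0):
        print(vol, round(edna_survey_sensitivity(2.0, vol, n_samples=10, pcr_reps=8), 3))
    ```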

  2. Modeling the Sensitivity of Field Surveys for Detection of Environmental DNA (eDNA).

    PubMed

    Schultz, Martin T; Lance, Richard F

    2015-01-01

    The environmental DNA (eDNA) method is the practice of collecting environmental samples and analyzing them for the presence of a genetic marker specific to a target species. Little is known about the sensitivity of the eDNA method. Sensitivity is the probability that the target marker will be detected if it is present in the water body. Methods and tools are needed to assess the sensitivity of sampling protocols, design eDNA surveys, and interpret survey results. In this study, the sensitivity of the eDNA method is modeled as a function of ambient target marker concentration. The model accounts for five steps of sample collection and analysis, including: 1) collection of a filtered water sample from the source; 2) extraction of DNA from the filter and isolation in a purified elution; 3) removal of aliquots from the elution for use in the polymerase chain reaction (PCR) assay; 4) PCR; and 5) genetic sequencing. The model is applicable to any target species. For demonstration purposes, the model is parameterized for bighead carp (Hypophthalmichthys nobilis) and silver carp (H. molitrix) assuming sampling protocols used in the Chicago Area Waterway System (CAWS). Simulation results show that eDNA surveys have a high false negative rate at low concentrations of the genetic marker. This is attributed to processing of water samples and division of the extraction elution in preparation for the PCR assay. Increases in field survey sensitivity can be achieved by increasing sample volume, sample number, and PCR replicates. Increasing sample volume yields the greatest increase in sensitivity. It is recommended that investigators estimate and communicate the sensitivity of eDNA surveys to help facilitate interpretation of eDNA survey results. In the absence of such information, it is difficult to evaluate the results of surveys in which no water samples test positive for the target marker. It is also recommended that invasive species managers articulate concentration-based sensitivity objectives for eDNA surveys. In the absence of such information, it is difficult to design appropriate sampling protocols. The model provides insights into how sampling protocols can be designed or modified to achieve these sensitivity objectives.

  3. Comparison of two cooling protocols for llama semen: with and without collagenase and seminal plasma in the medium.

    PubMed

    Carretero, M I; Giuliano, S M; Arraztoa, C C; Santa Cruz, R C; Fumuso, F G; Neild, D M

    2017-08-01

    Seminal plasma (SP) of South American Camelids could interfere with the interaction of spermatozoa with the extenders; therefore it becomes necessary to improve semen management using enzymatic treatment. Our objective was to compare two cooling protocols for llama semen. Twelve ejaculates were incubated in 0.1% collagenase and then were divided into two aliquots. One was extended in lactose and egg yolk (LEY) (Protocol A: collagenase and SP present). The other aliquot was centrifuged, and the pellet was resuspended in LEY (Protocol B: collagenase and SP absent). Both samples were maintained at 5°C during 24 hr. Routine and DNA evaluations were carried out in raw and cooled semen. Both cooling protocols maintained sperm viability, membrane function and DNA fragmentation, with Protocol A showing a significantly lowered total and progressive motility (p < .05) and Protocol B showing a significant increase in chromatin decondensation (p < .05). Protocol A avoids centrifugation, reducing processing times and making application in the field simpler. However, as neither protocol showed a significant superiority over the other, studies should be carried out in vivo to evaluate the effect on pregnancy rates of the presence of collagenase and SP in semen samples prior to either cooling or freeze-thawing. © 2016 Blackwell Verlag GmbH.

  4. Comparison of seven protocols to identify fecal contamination sources using Escherichia coli

    USGS Publications Warehouse

    Stoeckel, D.M.; Mathes, M.V.; Hyer, K.E.; Hagedorn, C.; Kator, H.; Lukasik, J.; O'Brien, T. L.; Fenger, T.W.; Samadpour, M.; Strickler, K.M.; Wiggins, B.A.

    2004-01-01

    Microbial source tracking (MST) uses various approaches to classify fecal-indicator microorganisms to source hosts. Reproducibility, accuracy, and robustness of seven phenotypic and genotypic MST protocols were evaluated by use of Escherichia coli from an eight-host library of known-source isolates and a separate, blinded challenge library. In reproducibility tests, measuring each protocol's ability to reclassify blinded replicates, only one (pulsed-field gel electrophoresis; PFGE) correctly classified all test replicates to host species; three protocols classified 48-62% correctly, and the remaining three classified fewer than 25% correctly. In accuracy tests, measuring each protocol's ability to correctly classify new isolates, ribotyping with EcoRI and PvuII approached 100% correct classification but only 6% of isolates were classified; four of the other six protocols (antibiotic resistance analysis, PFGE, and two repetitive-element PCR protocols) achieved better than random accuracy rates when 30-100% of challenge isolates were classified. In robustness tests, measuring each protocol's ability to recognize isolates from nonlibrary hosts, three protocols correctly classified 33-100% of isolates as "unknown origin," whereas four protocols classified all isolates to a source category. A relevance test, summarizing interpretations for a hypothetical water sample containing 30 challenge isolates, indicated that false-positive classifications would hinder interpretations for most protocols. Study results indicate that more representation in known-source libraries and better classification accuracy would be needed before field application. Thorough reliability assessment of classification results is crucial before and during application of MST protocols.

  5. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    USGS Publications Warehouse

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality to the extent that no more than 10 samples are analyzed between a QC sample. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography. Only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add a great degree of complication to the sampling and preservation protocols and a significant increase to the cost for a continental-scale survey. Both these issues must be considered carefully prior to adopting these parameters as part of the soil geochemical survey of North America.
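    The within-batch QC recommendation above amounts to a simple sequencing rule for the analytical run. The sketch below interleaves generic QC placeholders so that no more than ten field samples are analyzed between QC samples; the sample and QC labels are hypothetical.

    ```python
    def build_analysis_sequence(field_sample_ids, max_between_qc=10):
        """Interleave QC samples so that no more than `max_between_qc` field
        samples are analyzed between successive QC samples. QC labels are
        generic placeholders.
        """
        sequence, qc_count = [], 0
        for i, sample_id in enumerate(field_sample_ids, start=1):
            sequence.append(sample_id)
            if i % max_between_qc == 0:
                qc_count += 1
                sequence.append(f"QC-{qc_count}")
        if not sequence or not sequence[-1].startswith("QC-"):
            qc_count += 1
            sequence.append(f"QC-{qc_count}")   # close the batch with a final QC sample
        return sequence

    batch = build_analysis_sequence([f"FS-{n:03d}" for n in range(1, 26)])
    # -> 25 field samples with QC samples after positions 10, 20, and at the end
    ```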

  6. A paleointensity technique for multidomain igneous rocks

    NASA Astrophysics Data System (ADS)

    Wang, Huapei; Kent, Dennis V.

    2013-10-01

    We developed a paleointensity technique to account for concave-up Arai diagrams due to multidomain (MD) contributions to determine unbiased paleointensities for 24 trial samples from site GA-X in Pleistocene lavas from Floreana Island, Galapagos Archipelago. The main magnetization carrier is fine-grained low-titanium magnetite of variable grain size. We used a comprehensive back-zero-forth (BZF) heating technique by adding an additional zero-field heating between the Thellier two opposite in-field heating steps in order to estimate paleointensities in various standard protocols and provide internal self-consistency checks. After the first BZF experiment, we gave each sample a total thermal remanent magnetization (tTRM) by cooling from the Curie point in the presence of a low (15 µT) laboratory-applied field. Then we repeated the BZF protocol, with the laboratory-applied tTRM as a synthetic natural remanent magnetization (NRM), using the same laboratory-applied field and temperature steps to obtain the synthetic Arai signatures, which should only represent the domain-state dependent properties of the samples. We corrected the original Arai diagrams from the first BZF experiment by using the Arai signatures from the repeated BZF experiment, which neutralizes the typical MD concave-up effect. Eleven samples meet the Arai diagram post-selection criteria and provide qualified paleointensity estimates with a mean value for site GA-X of 4.23 ± 1.29 µT, consistent with an excursional geomagnetic field direction reported for this site.
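    One plausible way to apply the kind of point-wise multidomain correction described above is sketched below: the repeated experiment on a laboratory tTRM in a known field would ideally yield pTRM equal to the TRM lost at each step, so the ratio of observed to ideal synthetic pTRM can be divided out of the original pTRM values before fitting the Arai slope. This is a hedged illustration of the idea, not the authors' published algorithm or selection criteria.

    ```python
    import numpy as np

    def corrected_paleointensity(nrm_remaining, ptrm_gained,
                                 syn_trm_remaining, syn_ptrm_gained, b_lab):
        """Illustrative point-wise correction: the synthetic (repeated) experiment
        on a lab tTRM acquired in field b_lab should ideally give pTRM equal to
        the TRM lost at each step; the ratio of observed to ideal synthetic pTRM
        is treated as a multidomain distortion factor and divided out of the
        original pTRM values before fitting the Arai slope.
        """
        nrm = np.asarray(nrm_remaining, float)
        ptrm = np.asarray(ptrm_gained, float)
        syn_nrm = np.asarray(syn_trm_remaining, float)
        syn_ptrm = np.asarray(syn_ptrm_gained, float)

        ideal_syn_ptrm = syn_nrm[0] - syn_nrm            # what the synthetic pTRM "should" be
        with np.errstate(divide="ignore", invalid="ignore"):
            distortion = np.where(ideal_syn_ptrm > 0, syn_ptrm / ideal_syn_ptrm, 1.0)
        ptrm_corr = np.where(distortion > 0, ptrm / distortion, ptrm)

        slope = np.polyfit(ptrm_corr, nrm, 1)[0]          # Arai slope after correction
        return abs(slope) * b_lab                         # paleointensity in the units of b_lab
    ```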

  7. A COMPREHENSIVE NONPOINT SOURCE FIELD STUDY FOR SEDIMENT, NUTRIENTS, AND PATHOGENS IN THE SOUTH FORK BROAD RIVER WATERSHED IN NORTHEAST GEORGIA

    EPA Science Inventory

    This technical report provides a description of the field project design, quality control, the sampling protocols and analysis methodology used, and standard operating procedures for the South Fork Broad River Watershed (SFBR) Total Maximum Daily Load (TMDL) project. This watersh...

  8. Evaluating Parametrization Protocols for Hydration Free Energy Calculations with the AMOEBA Polarizable Force Field.

    PubMed

    Bradshaw, Richard T; Essex, Jonathan W

    2016-08-09

    Hydration free energy (HFE) calculations are often used to assess the performance of biomolecular force fields and the quality of assigned parameters. The AMOEBA polarizable force field moves beyond traditional pairwise additive models of electrostatics and may be expected to improve upon predictions of thermodynamic quantities such as HFEs over and above fixed-point-charge models. The recent SAMPL4 challenge evaluated the AMOEBA polarizable force field in this regard but showed substantially worse results than those using the fixed-point-charge GAFF model. Starting with a set of automatically generated AMOEBA parameters for the SAMPL4 data set, we evaluate the cumulative effects of a series of incremental improvements in parametrization protocol, including both solute and solvent model changes. Ultimately, the optimized AMOEBA parameters give a set of results that are not statistically significantly different from those of GAFF in terms of signed and unsigned error metrics. This allows us to propose a number of guidelines for new molecule parameter derivation with AMOEBA, which we expect to have benefits for a range of biomolecular simulation applications such as protein-ligand binding studies.
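    The signed and unsigned error metrics used to compare parametrization protocols against experiment are straightforward to compute; a minimal sketch follows, with made-up hydration free energies (kcal/mol) standing in for the SAMPL4 data.

    ```python
    import numpy as np

    def hfe_error_metrics(predicted, experimental):
        """Signed and unsigned error metrics of the kind used to compare force-field
        parametrization protocols against experimental hydration free energies.
        The values passed below are made-up placeholders, not SAMPL4 results.
        """
        diff = np.asarray(predicted, float) - np.asarray(experimental, float)
        return {
            "mean signed error": diff.mean(),
            "mean unsigned error": np.abs(diff).mean(),
            "rmse": np.sqrt((diff ** 2).mean()),
        }

    print(hfe_error_metrics(predicted=[-3.1, -5.4, -9.8, -2.0],
                            experimental=[-2.8, -6.1, -9.2, -2.5]))
    ```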

  9. Crowdsourcing Science to Promote Human Health: New Tools to Promote Sampling of Mosquito Populations by Citizen Scientists

    NASA Astrophysics Data System (ADS)

    Boger, R. A.; Low, R.; Jaroensutasinee, M.; Jaroensutasinee, K.; Sparrow, E. B.; Costosa, J. I.; Medina, J.; Randolph, G.

    2015-12-01

    GLOBE in Thailand and GLOBE in Africa independently developed citizen science protocols for collecting and analyzing mosquito larvae. These protocols have been piloted in several workshops and implemented in schools. Data collected have been used for several secondary, undergraduate and graduate student research studies. Over this past year, 2015, these protocols have been synthesized into one protocol that will be made available to the world-wide community through the GLOBE website (www.globe.gov). This new protocol is designed to be flexible in the mosquito species that can be collected and the types of environments sampled (e.g., containers in and around the house, ponds, irrigation ditches in a rice paddy field). Plans are underway to enable web-based data entry and mobile apps for data collection and submission. Once everything is finalized, a GLOBE field campaign will be initiated for citizen scientists to collect meaningful data on where different types of mosquito larvae are found and how their abundance and distribution are changing seasonally. To assist in the standardization of data collection and quality control, training slides are being developed and will be made available in early 2016. This will make it easier to become part of the GLOBE community and thereby enable wider participation of citizen scientists in this effort to collect mosquito data. As with mosquito larvae, training slides are being created for hydrosphere, biosphere, atmosphere, and pedosphere GLOBE measurement protocols. The development of the mosquito protocol and the training slides is in direct response to the GLOBE community's desire to increase citizen science participation beyond primary and secondary schools, in observing and measuring environmental change.

  10. Strategies for Achieving High Sequencing Accuracy for Low Diversity Samples and Avoiding Sample Bleeding Using Illumina Platform

    PubMed Central

    Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga

    2015-01-01

    Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer’s, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how analysis can be repeated from saved sequencing images using the Long Template Protocol to increase accuracy. PMID:25860802
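    Avoiding the low-diversity problem at the barcode design stage comes down to checking nucleotide balance at each sequencing cycle across the barcode set. The sketch below computes that per-cycle balance; the example barcodes are hypothetical and deliberately unbalanced at the first position.

    ```python
    from collections import Counter

    def per_cycle_base_balance(barcodes):
        """Check nucleotide balance at each barcode position (sequencing cycle).
        Barcode sets that all share the same base at a cycle produce a
        low-diversity signal there, which is what degrades cluster calling;
        balanced sets avoid the problem at the design stage.
        """
        length = len(barcodes[0])
        balance = []
        for pos in range(length):
            counts = Counter(bc[pos] for bc in barcodes)
            frac = {base: counts.get(base, 0) / len(barcodes) for base in "ACGT"}
            balance.append(frac)
        return balance

    # A poorly balanced (hypothetical) barcode set: every barcode starts with 'A'
    for pos, frac in enumerate(per_cycle_base_balance(["ACGT", "AGTC", "ATCG", "ACTG"]), 1):
        print(pos, frac)
    ```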

  11. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    PubMed

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceutics are usually performed at intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide, subunit and glycan level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Introduction to Field Water-Quality Methods for the Collection of Metals - 2007 Project Summary

    USGS Publications Warehouse

    Allen, Monica L.

    2008-01-01

    The U.S. Geological Survey (USGS), Region VI of the U.S. Environmental Protection Agency (USEPA), and the Osage Nation presented three 3-day workshops, in June-August 2007, entitled "Introduction to Field Water-Quality Methods for the Collection of Metals." The purpose of the workshops was to provide instruction to tribes within USEPA Region VI on various USGS surface-water measurement methods and water-quality sampling protocols for the collection of surface-water samples for metals analysis. Workshop attendees included members from over 22 tribes and pueblos. USGS instructors came from Oklahoma, New Mexico, and Georgia. Workshops were held in eastern and south-central Oklahoma and New Mexico and covered many topics including presampling preparation, water-quality monitors, and sampling for metals in surface water. Attendees spent one full classroom day learning the field methods used by the USGS Water Resources Discipline and learning about the complexity of obtaining valid water-quality and quality-assurance data. Lectures included (1) a description of metal contamination sources in surface water; (2) an introduction on how to select field sites, equipment, and laboratories for sample analysis; (3) collection of sediment in surface water; and (4) utilization of proper protocol and methodology for sampling metals in surface water. Attendees also were provided USGS sampling equipment for use during the field portion of the class so they had actual "hands-on" experience to take back to their own organizations. The final 2 days of the workshop consisted of field demonstrations of current USGS water-quality sample-collection methods. The hands-on training ensured that attendees were exposed to and experienced proper sampling procedures. Attendees learned integrated-flow techniques during sample collection, field-property documentation, and discharge measurements and calculations. They also used enclosed chambers for sample processing and collected quality-assurance samples to verify their techniques. Benefits of integrated water-quality sample-collection methods are varied. Tribal environmental programs now have the ability to collect data that are comparable across watersheds. The use of consistent sample collection, manipulation, and storage techniques will provide consistent quality data that will enhance the understanding of local water resources. The improved data quality also will help the USEPA better document the condition of the region's water. Ultimately, these workshops equipped tribes to use uniform sampling methods and to provide consistent quality data that are comparable across the region.

  13. Sampling the Soils around a Residence Containing Lead-Based Paints: An X-Ray Fluorescence Experiment

    ERIC Educational Resources Information Center

    Bachofer, Steven J.

    2008-01-01

    Sampling experiments utilizing field portable instruments are instructional since students collect data following regulatory protocols, evaluate them, and begin to recognize their civic responsibilities upon collecting useful data. A lead-in-soil experiment educated students on a prevalent exposure pathway. The experimental site was a pre-1950…

  14. Chapter A5. Processing of Water Samples

    USGS Publications Warehouse

    Wilde, Franceska D.; Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.

    1999-01-01

    The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.

  15. Toxoplasma Gondii and Pre-treatment Protocols for Polymerase Chain Reaction Analysis of Milk Samples: A Field Trial in Sheep from Southern Italy.

    PubMed

    Vismarra, Alice; Barilli, Elena; Miceli, Maura; Mangia, Carlo; Bacci, Cristina; Brindani, Franco; Kramer, Laura

    2017-01-24

    Toxoplasmosis is a zoonotic disease caused by the protozoan Toxoplasma gondii. Ingestion of raw milk has been suggested as a risk for transmission to humans. Here the authors evaluated pre-treatment protocols for DNA extraction on T. gondii tachyzoite-spiked sheep milk with the aim of identifying the method that resulted in the most rapid and reliable polymerase chain reaction (PCR) positivity. This protocol was then used to analyse milk samples from sheep on three different farms in Southern Italy, including real-time PCR for DNA quantification and PCR-restriction fragment length polymorphism for genotyping. The pre-treatment protocol using ethylenediaminetetraacetic acid and Tris-HCl to remove casein gave the best results in the least amount of time compared to the others on spiked milk samples. One of the 21 samples collected from sheep farms was positive on one-step PCR and real-time PCR, and showed a Type I genotype at one locus (SAG3). Milk usually contains a low number of tachyzoites, and this could be a limiting factor for molecular identification. In this preliminary work we evaluated a rapid, cost-effective and sensitive protocol to treat milk before DNA extraction. The results of the present study also confirm the possibility of T. gondii transmission through consumption of raw milk and its unpasteurised derivatives.

  16. Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil

    NASA Astrophysics Data System (ADS)

    Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B. H.; Pinzari, F.

    2016-03-01

    A culture independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) marker to a discriminating tracing of two different species of bioinoculants in soil, after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose larvae live in soil menacing horticultural crops. Specificity of SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii in known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungus species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Further, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered as a relatively low cost solution for the detection, identification and traceability of fungal bio-inoculants in soil.
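    Quantification in a qPCR assay of this kind typically relies on a standard curve built from the known spore dilutions used as controls. The sketch below fits such a curve (Cq versus log10 copies) and back-calculates an unknown; the dilution series, Cq values, and efficiency are illustrative numbers, not data from the B. bassiana / B. brongniartii assays.

    ```python
    import numpy as np

    def fit_standard_curve(log10_copies, cq_values):
        """Fit a qPCR standard curve (Cq vs log10 copies) from samples spiked with
        known dilutions, then use it to quantify unknowns. Amplification
        efficiency is derived from the slope; all numbers here are illustrative.
        """
        slope, intercept = np.polyfit(log10_copies, cq_values, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0
        def quantify(cq):
            return 10 ** ((cq - intercept) / slope)   # copies (or spores) per reaction
        return slope, intercept, efficiency, quantify

    # Hypothetical dilution series: 10^2..10^6 spore equivalents per reaction
    slope, intercept, eff, quantify = fit_standard_curve(
        log10_copies=[2, 3, 4, 5, 6],
        cq_values=[33.1, 29.8, 26.4, 23.1, 19.7],
    )
    print(f"efficiency ~ {eff:.2f}, unknown at Cq 27.5 ~ {quantify(27.5):.0f} copies")
    ```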

  17. Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil

    PubMed Central

    Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B.H.; Pinzari, F.

    2016-01-01

    A culture independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) marker to a discriminating tracing of two different species of bioinoculants in soil, after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose larvae live in soil menacing horticultural crops. Specificity of SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii in known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungus species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Further, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered as a relatively low cost solution for the detection, identification and traceability of fungal bio-inoculants in soil. PMID:26975931

  18. Effect of different analyte diffusion/adsorption protocols on SERS signals

    NASA Astrophysics Data System (ADS)

    Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju

    2018-07-01

    The effect of different analyte diffusion/adsorption protocols, which is often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were compared: the highly concentrated dilution (HCD) protocol, the half-half dilution (HHD) protocol and the layered adsorption (LA) protocol. The SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified with polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes causes different SERS signal enhancements. The HHD protocol produced the most uniform and reproducible samples, and the corresponding analyte signal intensity was the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
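
    The abstract states that the diffusion/adsorption mechanisms were modelled with the diffusion equation but does not give the model itself. As a purely illustrative sketch (geometry, diffusion coefficient, boundary conditions, and time span are all assumptions, not the authors' parameters), a one-dimensional explicit finite-difference solution of the diffusion equation with an absorbing boundary at the substrate looks like this:

```python
import numpy as np

# Illustrative 1D diffusion of analyte above a SERS substrate (explicit
# finite differences). All parameter values are assumptions for illustration.
D = 1e-9          # diffusion coefficient, m^2/s (typical small molecule in water)
L = 1e-3          # depth of the liquid layer above the substrate, m
nx = 101
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D   # satisfies the explicit stability limit dt <= dx^2 / (2D)

c = np.ones(nx)   # initially uniform (normalised) analyte concentration
adsorbed = 0.0    # running tally removed at the substrate node (arbitrary units)

for _ in range(20000):
    lap = np.zeros_like(c)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c += D * dt * lap
    adsorbed += c[0]   # perfectly adsorbing substrate at x = 0
    c[0] = 0.0
    c[-1] = c[-2]      # no-flux condition at the free surface

print(f"analyte captured by the substrate (arbitrary units): {adsorbed:.3g}")
```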

  19. In-field Welding and Coating Protocols

    DOT National Transportation Integrated Search

    2009-05-12

    Gas Technology Institute (GTI) and Edison Welding Institute (EWI) created both laboratory and infield girth weld samples to evaluate the effects of weld geometry and hydrogen off-gassing on the performance of protective coatings. Laboratory made plat...

  20. Unbiased Strain-Typing of Arbovirus Directly from Mosquitoes Using Nanopore Sequencing: A Field-forward Biosurveillance Protocol.

    PubMed

    Russell, Joseph A; Campos, Brittany; Stone, Jennifer; Blosser, Erik M; Burkett-Cadena, Nathan; Jacobs, Jonathan L

    2018-04-03

    The future of infectious disease surveillance and outbreak response is trending towards smaller hand-held solutions for point-of-need pathogen detection. Here, samples of Culex cedecei mosquitoes collected in Southern Florida, USA were tested for Venezuelan Equine Encephalitis Virus (VEEV), a previously-weaponized arthropod-borne RNA-virus capable of causing acute and fatal encephalitis in animal and human hosts. A single 20-mosquito pool tested positive for VEEV by quantitative reverse transcription polymerase chain reaction (RT-qPCR) on the Biomeme two3. The virus-positive sample was subjected to unbiased metatranscriptome sequencing on the Oxford Nanopore MinION and shown to contain Everglades Virus (EVEV), an alphavirus in the VEEV serocomplex. Our results demonstrate, for the first time, the use of unbiased sequence-based detection and subtyping of a high-consequence biothreat pathogen directly from an environmental sample using field-forward protocols. The development and validation of methods designed for field-based diagnostic metagenomics and pathogen discovery, such as those suitable for use in mobile "pocket laboratories", will address a growing demand for public health teams to carry out their mission where it is most urgent: at the point-of-need.

  1. Sampling protocol for monitoring abiotic and biotic characteristics of mountain ponds and lakes

    USGS Publications Warehouse

    Hoffman, Robert L.; Tyler, Torrey J.; Larson, Gary L.; Adams, Michael J.; Wente, Wendy; Galvan, Stephanie

    2005-01-01

    This document describes field techniques and procedures used for sampling mountain ponds and lakes. These techniques and procedures will be used primarily to monitor, as part of long-term programs in National Parks and other protected areas, the abiotic and biotic characteristics of naturally occurring permanent montane lentic systems up to 75 ha in surface area. However, the techniques and procedures described herein also can be used to sample temporary or ephemeral montane lentic sites. Each Standard Operating Procedure (SOP) section addresses a specific component of the limnological investigation, and describes in detail field sampling methods pertaining to parameters to be measured for each component.

  2. Developing and evaluating rapid field methods to estimate peat carbon

    Treesearch

    Rodney A. Chimner; Cassandra A. Ott; Charles H. Perry; Randall K. Kolka

    2014-01-01

    Many international protocols (e.g., REDD+) are developing inventories of ecosystem carbon stocks and fluxes at country and regional scales, which can include peatlands. As the only nationally implemented field inventory and remeasurement of forest soils in the US, the USDA Forest Service Forest Inventory and Analysis Program (FIA) samples the top 20 cm of organic soils...

  3. Development of an efficient real-time quantitative PCR protocol for detection of Xanthomonas arboricola pv. pruni in Prunus species.

    PubMed

    Palacio-Bielsa, Ana; Cubero, Jaime; Cambra, Miguel A; Collados, Raquel; Berruete, Isabel M; López, María M

    2011-01-01

    Xanthomonas arboricola pv. pruni, the causal agent of bacterial spot disease of stone fruit, is considered a quarantine organism by the European Union and the European and Mediterranean Plant Protection Organization (EPPO). The bacterium can undergo an epiphytic phase and/or be latent and can be transmitted by plant material, but currently, only visual inspections are used to certify plants as being X. arboricola pv. pruni free. A novel and highly sensitive real-time TaqMan PCR detection protocol was designed based on a sequence of a gene for a putative protein related to an ABC transporter ATP-binding system in X. arboricola pv. pruni. Pathogen detection can be completed within a few hours with a sensitivity of 10² CFU ml⁻¹, thus surpassing the sensitivity of the existing conventional PCR. Specificity was assessed for X. arboricola pv. pruni strains from different origins as well as for closely related Xanthomonas species, non-Xanthomonas species, saprophytic bacteria, and healthy Prunus samples. The efficiency of the developed protocol was evaluated with field samples of 14 Prunus species and rootstocks. For symptomatic leaf samples, the protocol was very efficient even when washed tissues of the leaves were directly amplified without any previous DNA extraction. For samples of 117 asymptomatic leaves and 285 buds, the protocol was more efficient after a simple DNA extraction, and X. arboricola pv. pruni was detected in 9.4% and 9.1% of the 402 samples analyzed, respectively, demonstrating its frequent epiphytic or endophytic phase. This newly developed real-time PCR protocol can be used as a quantitative assay, offers a reliable and sensitive test for X. arboricola pv. pruni, and is suitable as a screening test for symptomatic as well as asymptomatic plant material.

  4. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR COLLECTION OF SURFACE WIPE SAMPLES FOR PESTICIDES OR METALS (UA-F-8.1)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures for collecting surface wipe samples inside a home for analysis of either metals or pesticides. This procedure covers the preparation of the surface wipe material and field activities. This protocol was followed to ensure con...

  5. X-ray-generated heralded macroscopical quantum entanglement of two nuclear ensembles.

    PubMed

    Liao, Wen-Te; Keitel, Christoph H; Pálffy, Adriana

    2016-09-19

    Heralded entanglement between macroscopical samples is an important resource for present quantum technology protocols, allowing quantum communication over large distances. In such protocols, optical photons are typically used as information and entanglement carriers between macroscopic quantum memories placed in remote locations. Here we investigate theoretically a new implementation which employs more robust x-ray quanta to generate heralded entanglement between two crystal-hosted macroscopical nuclear ensembles. Mössbauer nuclei in the two crystals interact collectively with an x-ray spontaneous parametric down conversion photon that generates heralded macroscopical entanglement with coherence times of approximately 100 ns at room temperature. The quantum phase between the entangled crystals can be conveniently manipulated by magnetic field rotations at the samples. The inherent long nuclear coherence times allow also for mechanical manipulations of the samples, for instance to check the stability of entanglement in the x-ray setup. Our results pave the way for first quantum communication protocols that use x-ray qubits.

  6. Chicago Lead in Drinking Water Study

    EPA Pesticide Factsheets

    EPA Region 5 and the Chicago Department of Water Management conducted a study on field sampling protocols for lead in drinking water. The purpose of the study was to evaluate the method used by public water systems to monitor lead levels.

  7. High-pressure freezing for scanning transmission electron tomography analysis of cellular organelles.

    PubMed

    Walther, Paul; Schmid, Eberhard; Höhn, Katharina

    2013-01-01

    Using an electron microscope's scanning transmission mode (STEM) for collection of tomographic datasets is advantageous compared to bright-field transmission electron microscopy (TEM). For image formation, inelastic scattering does not cause chromatic aberration, since in STEM mode no image-forming lenses are used after the beam has passed the sample, in contrast to regular TEM. Therefore, thicker samples can be imaged. It has been experimentally demonstrated that STEM is superior to TEM and energy-filtered TEM for tomography of samples as thick as 1 μm. Even when using the best electron microscope, adequate sample preparation is the key for interpretable results. We adapted protocols for high-pressure freezing of cultivated cells from a physiological state. In this chapter, we describe optimized high-pressure freezing and freeze substitution protocols for STEM tomography in order to obtain high membrane contrast.

  8. Regularly scheduled, day-time, slow-onset 60 Hz electric and magnetic field exposure does not depress serum melatonin concentration in nonhuman primates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rogers, W.R.; Smith, H.D.; Orr, J.L.

    Experiments conducted with laboratory rodents indicate that exposure to 60 Hz electric fields or magnetic fields can suppress nocturnal melatonin concentrations in pineal gland and blood. In three experiments employing three field-exposed and three sham-exposed nonhuman primates, each implanted with an indwelling venous cannula to allow repeated blood sampling, the authors studied the effects of either 6 kV/m and 50 μT (0.5 G) or 30 kV/m and 100 μT (1.0 G) on serum melatonin patterns. The fields were ramped on and off slowly, so that no transients occurred. Extensive quality control for the melatonin assay, computerized control and monitoring of field intensities, and consistent exposure protocols were used. No changes in nocturnal serum melatonin concentration resulted from 6 weeks of day-time exposure with slow field onset/offset and a highly regular exposure protocol. These results indicate that, under the conditions tested, day-time exposure to 60 Hz electric and magnetic fields in combination does not result in melatonin suppression in primates.

  9. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR COLLECTION OF SURFACE WIPE SAMPLES FOR PESTICIDES OR METALS ANALYSIS (UA-F-8.1)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures for collecting surface wipe samples inside a home for analysis of either metals or pesticides. This procedure covers the preparation of the surface wipe material and field activities. This protocol was followed to ensure con...

  10. Updated operational protocols for the U.S. Geological Survey Precipitation Chemistry Quality Assurance Project in support of the National Atmospheric Deposition Program

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn

    2017-02-06

    The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components which contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two currently (2016) contracted NADP laboratories, the Central Analytical Laboratory and the Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated sampler program evaluates the overall variability of NTN measurements and bias between dissimilar precipitation gages and sample collectors. This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.

  11. Planning and setting objectives in field studies: Chapter 2

    USGS Publications Warehouse

    Fisher, Robert N.; Dodd, C. Kenneth

    2016-01-01

    This chapter enumerates the steps required in designing and planning field studies on the ecology and conservation of reptiles, as these involve a high level of uncertainty and risk. To this end, the chapter differentiates between goals (descriptions of what one intends to accomplish) and objectives (the measurable steps required to achieve the established goals). Meeting a specific goal may therefore require many objectives, some of which cannot be defined until certain experiments have been conducted; evaluations of sampling protocols are often needed to increase certainty in the biological results. If sampling locations are fixed and sampling events are repeated over time, then both study-specific covariates and sampling-specific covariates should be recorded. Additionally, other critical design considerations for field studies include obtaining permits, as well as researching ethics and biosecurity issues.

  12. New Cost-Effective Method for Long-Term Groundwater Monitoring Programs

    DTIC Science & Technology

    2013-05-01

    with a small-volume, gas-tight syringe (< 1 mL) and injected directly into the field-portable GC. Alternatively, the well headspace sample can be...according to manufacturers' protocols. Isobutylene was used as the calibration standard for the PID. The standard gas mixtures were used for 3-point...monitoring wells are being evaluated: 1) direct headspace sampling, 2) sampling tube with gas permeable membrane, and 3) gas-filled passive vapor

  13. A novel hybridization approach for detection of citrus viroids.

    PubMed

    Murcia, N; Serra, P; Olmos, A; Duran-Vila, N

    2009-04-01

    Citrus plants are natural hosts of several viroid species, all belonging to the family Pospiviroidae. Previous attempts to detect viroids from field-grown species and cultivars yielded erratic results unless analyses were performed using Etrog citron as a secondary bio-amplification host. To avoid the use of Etrog citron, a number of RT-PCR approaches have been proposed with different degrees of success. Here we report the suitability of an easy-to-handle northern hybridization protocol for viroid detection in samples collected from field-grown citrus species and cultivars. The protocol involves: (i) nucleic acid preparations from bark tissue samples collected from field-grown trees regardless of the growing season and storage conditions; (ii) separation in 5% PAGE or 1% agarose, blotting to membrane and fixing; (iii) hybridization with viroid-specific DIG-labelled probes and detection with anti-DIG-alkaline phosphatase conjugate and autoradiography with the CSPD substrate. The method has been tested with viroid-infected trees of sweet orange, lemon, mandarin, grapefruit, sour orange, Swingle citrumelo, Tahiti lime and Mexican lime. This novel hybridization approach is extremely sensitive, easy to handle and shortens the time needed for reliable viroid indexing tests. The suitability of PCR-generated DIG-labelled probes and the sensitivity achieved when the samples are separated and blotted from non-denaturing gels are discussed.

  14. Publication trends of study protocols in rehabilitation.

    PubMed

    Jesus, Tiago S; Colquhoun, Heather L

    2017-09-04

    Growing evidence points to the need to publish study protocols in the health field. The aim was to observe whether the growing interest in publishing study protocols in the broader health field has been translated into increased publication of rehabilitation study protocols. This was an observational study using publication data and its indexation in PubMed. PubMed was searched with appropriate combinations of Medical Subject Headings up to December 2014. The effective presence of study protocols was manually screened. Regression models analyzed the yearly growth of publications. Two-sample Z-tests analyzed whether the proportion of Systematic Reviews (SRs) and Randomized Controlled Trials (RCTs) among study protocols differed from that of the same designs in the broader rehabilitation research. Up to December 2014, 746 publications of rehabilitation study protocols were identified, with an exponential growth since 2005 (r²=0.981; p<0.001). RCT protocols were the most common among rehabilitation study protocols (83%), and RCTs were significantly more prevalent among study protocols than among the broader rehabilitation research (83% vs. 35.8%; p<0.001). For SRs, the picture was reversed: significantly less common among study protocols (2.8% vs. 9.3%; p<0.001). Funding was more often reported by rehabilitation study protocols than by the broader rehabilitation research (90% vs. 53.1%; p<0.001). Rehabilitation journals published a significantly lower share of rehabilitation study protocols than they did of the broader rehabilitation research (1.8% vs. 16.7%; p<0.001). Identifying the reasons for these discrepancies and reverting unwarranted disparities (e.g. the low rate of publication of rehabilitation SR protocols) are likely new avenues for rehabilitation research and its publication. SRs, particularly those aggregating RCT results, are considered the best standard of evidence to guide rehabilitation clinical practice; however, that standard can be improved in rigor and/or transparency if the publication of rehabilitation SR protocols becomes more common.

  15. Effects of field storage method on E. coli concentrations measured in storm water runoff.

    PubMed

    Harmel, Daren; Wagner, Kevin; Martin, Emily; Smith, Doug; Wanjugi, Pauline; Gentry, Terry; Gregory, Lucas; Hendon, Tina

    2016-03-01

    Storm water runoff is increasingly assessed for fecal indicator organisms (e.g., Escherichia coli, E. coli) and its impact on contact recreation. Concurrently, use of autosamplers along with logistic, economic, technical, and personnel barriers is challenging conventional protocols for sample holding times and storage conditions in the field. A common holding time limit for E. coli is 8 h with a 10 °C storage temperature, but several research studies support longer hold time thresholds. The use of autosamplers to collect E. coli water samples has received little field research attention; thus, this study was implemented to compare refrigerated and unrefrigerated autosamplers and evaluate potential E. coli concentration differences due to field storage temperature (storms with holding times ≤24 h) and due to field storage time and temperature (storms >24 h). Data from 85 runoff events on four diverse watersheds showed that field storage times and temperatures had minor effects on mean and median E. coli concentrations. Graphs and error values did, however, indicate a weak tendency for higher concentrations in the refrigerated samplers, but it is unknown to what extent differing die-off and/or regrowth rates, heterogeneity in concentrations within samples, and laboratory analysis uncertainty contributed to the results. The minimal differences in measured E. coli concentrations cast doubt on the need for utilizing the rigid conventional protocols for field holding time and storage temperature. This is not to say that proper quality assurance and quality control is not important but to emphasize the need to consider the balance between data quality and practical constraints related to logistics, funding, travel time, and autosampler use in storm water studies.

  16. Detecting in situ copepod diet diversity using molecular technique: development of a copepod/symbiotic ciliate-excluding eukaryote-inclusive PCR protocol.

    PubMed

    Hu, Simin; Guo, Zhiling; Li, Tao; Carpenter, Edward J; Liu, Sheng; Lin, Senjie

    2014-01-01

    Knowledge of in situ copepod diet diversity is crucial for accurately describing pelagic food web structure but is challenging to achieve due to the lack of an easily applicable methodology. To enable analysis with whole copepod-derived DNAs, we developed a copepod-excluding 18S rDNA-based PCR protocol. Although it is effective in depressing amplification of copepod 18S rDNA, its applicability to detect diverse eukaryotes in both mono- and mixed-species samples has not been demonstrated. In addition, the protocol suffers from overrepresentation of sequences from symbiotic ciliates in the retrieved 18S rDNA libraries. In this study, we designed a blocking primer to make a combined primer set (copepod/symbiotic ciliate-excluding eukaryote-common: CEEC) to depress PCR amplification of symbiotic ciliate sequences while maximizing the range of eukaryotes amplified. We first examined the specificity and efficacy of CEEC by PCR-amplifying DNAs from 16 copepod species, 37 representative organisms that are potential prey of copepods and a natural microplankton sample, and then evaluated its efficiency in reconstructing diet composition by detecting the food of both lab-reared and field-collected copepods. Our results showed that the CEEC primer set can successfully amplify 18S rDNA from a wide range of isolated species and mixed-species samples while depressing amplification of that from copepods and the targeted symbiotic ciliates, indicating the universality of CEEC in specifically detecting prey of copepods. All of the predetermined food items offered to copepods in the laboratory were successfully retrieved, suggesting that the CEEC-based protocol can accurately reconstruct the diets of copepods without interference from copepods and their associated ciliates present in the DNA samples. Our initial application to analyzing the food composition of field-collected copepods uncovered diverse prey species, including those currently known, and those previously unsuspected, as copepod prey. While further testing is required, this protocol provides a useful strategy for depicting the in situ dietary composition of copepods.
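
    The CEEC primer and blocking-primer sequences are not given in this record. The following sketch only illustrates the kind of in silico screen used when designing a blocking primer, counting mismatches against target and non-target binding sites; every sequence shown is a hypothetical placeholder.

```python
# Minimal sketch of screening a candidate blocking primer against target
# (symbiotic ciliate) and non-target (prey) 18S rDNA fragments. All
# sequences are hypothetical placeholders, not the CEEC primers.

def mismatches(primer: str, site: str) -> int:
    """Count base mismatches between a primer and an equal-length binding site."""
    return sum(p != s for p, s in zip(primer.upper(), site.upper()))

blocking_primer = "ACGTTGGATCCAAGTCCTAA"   # hypothetical

binding_sites = {
    "symbiotic_ciliate": "ACGTTGGATCCAAGTCCTAA",  # perfect match -> amplification blocked
    "diatom_prey":       "ACGATGGTTCCAAGACCTGA",  # several mismatches -> still amplified
    "dinoflagellate":    "ACCTTGAATCCATGTCCTCA",
}

for taxon, site in binding_sites.items():
    n = mismatches(blocking_primer, site)
    verdict = "likely blocked" if n <= 2 else "likely amplified"
    print(f"{taxon}: {n} mismatches -> {verdict}")
```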

  17. National protocol framework for the inventory and monitoring of bees

    USGS Publications Warehouse

    Droege, Sam; Engler, Joseph D.; Sellers, Elizabeth A.; Lee O'Brien,

    2016-01-01

    This national protocol framework is a standardized tool for the inventory and monitoring of the approximately 4,200 native and non-native bee species that may be found within the National Wildlife Refuge System (NWRS) administered by the U.S. Fish and Wildlife Service (USFWS). However, this protocol framework may also be used by other organizations and individuals to monitor bees in any given habitat or location. Our goal is to provide USFWS stations within the NWRS (NWRS stations are land units managed by the USFWS such as national wildlife refuges, national fish hatcheries, wetland management districts, conservation areas, leased lands, etc.) with techniques for developing an initial baseline inventory of what bee species are present on their lands and to provide an inexpensive, simple technique for monitoring bees continuously and for monitoring and evaluating long-term population trends and management impacts. The latter long-term monitoring technique requires a minimal time burden for the individual station, yet can provide a good statistical sample of changing populations that can be investigated at the station, regional, and national levels within the USFWS’ jurisdiction, and compared to other sites within the United States and Canada. This protocol framework was developed in cooperation with the United States Geological Survey (USGS), the USFWS, and a worldwide network of bee researchers who have investigated the techniques and methods for capturing bees and tracking population changes. The protocol framework evolved from field and lab-based investigations at the USGS Bee Inventory and Monitoring Laboratory at the Patuxent Wildlife Research Center in Beltsville, Maryland starting in 2002 and was refined by a large number of USFWS, academic, and state groups. It includes a Protocol Introduction and a set of 8 Standard Operating Procedures or SOPs and adheres to national standards of protocol content and organization. The Protocol Narrative describes the history and need for the protocol framework and summarizes the basic elements of objectives, sampling design, field methods, training, data management, analysis, and reporting. The SOPs provide more detail and specific instructions for implementing the protocol framework. A central database for managing all the resulting data is under development. We welcome use of this protocol framework by our partners, as appropriate for their bee inventory and monitoring objectives.

  18. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches.

    PubMed

    Han, Yuling; Clement, T Prabhakar

    2018-01-01

    The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. Residues from the spill continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues.

  19. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches

    PubMed Central

    Han, Yuling

    2018-01-01

    The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. Residues from the spill continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues. PMID:29329313

  20. Electric field stimulated growth of Zn whiskers

    NASA Astrophysics Data System (ADS)

    Niraula, D.; McCulloch, J.; Warrell, G. R.; Irving, R.; Karpov, V. G.; Shvydka, Diana

    2016-07-01

    We have investigated the impact of strong (~10⁴ V/cm) electric fields on the development of Zn whiskers. The original samples, with considerable whisker infestation, were cut from Zn-coated steel floors and then exposed to electric field stress for 10-20 hours at room temperature. We used various electric field sources, from charges accumulated in samples irradiated by: (1) the electron beam of a scanning electron microscope (SEM), (2) the electron beam of a medical linear accelerator, and (3) the ion beam of a linear accelerator; we also used (4) the electric field produced by a Van de Graaff generator. In all cases, the exposed samples exhibited a considerable (tens of percent) increase in whisker concentration compared to the control sample. The acceleration factor, defined as the ratio of the measured whisker growth rate over that in zero field, was estimated to approach several hundred. The statistics of the lengths of e-beam-induced whiskers were found to follow the log-normal distribution known previously for metal whiskers. The observed accelerated whisker growth is attributed to electrostatic effects. These results offer promise for establishing whisker-related accelerated life testing protocols.
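
    A log-normal fit of whisker lengths, as referred to above, can be checked with standard statistical tooling. The sketch below uses synthetic lengths (not the SEM measurements) and a fixed zero location for the fit.

```python
import numpy as np
from scipy import stats

# Illustrative check that whisker lengths follow a log-normal distribution.
# The lengths below are synthetic, not the measured data.
rng = np.random.default_rng(0)
lengths_um = rng.lognormal(mean=np.log(20.0), sigma=0.6, size=200)  # micrometres

# Fit a log-normal with the location fixed at zero and test goodness of fit.
shape, loc, scale = stats.lognorm.fit(lengths_um, floc=0)
ks_stat, p_value = stats.kstest(lengths_um, "lognorm", args=(shape, loc, scale))

print(f"median length ~ {scale:.1f} um, sigma(log) ~ {shape:.2f}")
print(f"KS test p-value = {p_value:.2f} (large p: log-normal is not rejected)")
```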

  1. Implications of the field sampling procedure of the LUCAS Topsoil Survey for uncertainty in soil organic carbon concentrations.

    NASA Astrophysics Data System (ADS)

    Lark, R. M.; Rawlins, B. G.; Lark, T. A.

    2014-05-01

    The LUCAS Topsoil survey is a pan-European Union initiative in which soil data were collected according to standard protocols from 19,967 sites. Any inference about soil variables is subject to uncertainty due to different sources of variability in the data. In this study we examine the likely magnitude of uncertainty due to the field-sampling protocol. The published sampling protocol (LUCAS, 2009) describes a procedure to form a composite soil sample from aliquots collected to a depth of approximately 15-20 cm. A v-shaped hole to the target depth is cut with a spade, then a slice is cut from one of the exposed surfaces. This methodology gives rather less control of the sampling depth than protocols used in other soil and geochemical surveys; this may be a substantial source of variation in uncultivated soils with strong contrasts between an organic-rich A-horizon and an underlying B-horizon. We extracted all representative profile descriptions from soil series recorded in the memoir of the 1:250 000-scale map of Northern England (Soil Survey of England and Wales, 1984) where the base of the A-horizon is less than 20 cm below the surface. The Soil Associations in which these 14 series are significant members cover approximately 17% of the area of Northern England, and are expected to be the mineral soils with the largest organic content. Soil organic carbon (SOC) content, bulk density, and horizon thickness were extracted for the A- and B-horizons; where bulk density was not recorded, it was predicted by a pedotransfer function. For any proposed angle of the v-shaped hole, the proportions of A- and B-horizon in the resulting sample may be computed by trigonometry, and from the bulk density and SOC concentration of the horizons, the SOC concentration of the sample can be computed. For each soil series we drew 1000 random samples from a trapezoidal distribution of angles, with uniform density over the range corresponding to depths of 15-20 cm and zero density for angles corresponding to depths greater than 21 cm or less than 14 cm, and computed the corresponding variance of sample SOC contents. We found that the variance in SOC determinations attributable to variation in sample depth for these uncultivated soils was of the same order of magnitude as the estimate of the subsampling plus analytical variance component (both on a log scale) that we previously computed for soils in the UK (Rawlins et al., 2009). It seems unnecessary to accept this source of uncertainty, given the effort undertaken to reduce the analytical variation, which is no larger (and often smaller) than the variation due to the field protocol. If pan-European soil monitoring is to be based on the LUCAS Topsoil survey, as suggested by an initial report, uncertainty could be reduced if the sampling depth were specified as a single fixed depth rather than the current depth range. LUCAS. 2009. Instructions for Surveyors. Technical reference document C-1: General implementation, Land Cover and Use, Water management, Soil, Transect, Photos. European Commission, Eurostat. Rawlins, B.G., Scheib, A.J., Lark, R.M. & Lister, T.R. 2009. Sampling and analytical plus subsampling variance components for five soil indicators observed at regional scale. European Journal of Soil Science, 60, 740-747.
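
    The abstract fully specifies the Monte Carlo exercise (trapezoidal depth distribution, triangular cross-section of the v-shaped hole, mass-weighted SOC of the composite), so it can be sketched directly. The horizon properties below are illustrative values for a single hypothetical soil series, not figures from the Soil Survey memoir.

```python
import numpy as np

# Minimal sketch of the Monte Carlo exercise described above, with
# illustrative (not the memoir's) horizon properties for one soil series.
rng = np.random.default_rng(1)

a_thick = 0.12                 # A-horizon thickness, m (hypothetical)
soc_a, soc_b = 80.0, 10.0      # SOC, g/kg (hypothetical)
rho_a, rho_b = 0.6, 1.2        # bulk density, g/cm^3 (hypothetical)

def sample_depths(n):
    """Trapezoidal depth distribution: uniform on 15-20 cm, ramping to zero at 14 and 21 cm."""
    depths = []
    while len(depths) < n:
        d = rng.uniform(0.14, 0.21)
        if d < 0.15:
            p = (d - 0.14) / 0.01
        elif d > 0.20:
            p = (0.21 - d) / 0.01
        else:
            p = 1.0
        if rng.uniform() < p:      # rejection sampling against the trapezoid
            depths.append(d)
    return np.array(depths)

def composite_soc(depth):
    """Mass-weighted SOC of a v-shaped slice reaching `depth` (triangular cross-section)."""
    area_total = depth ** 2                   # proportional to the true cross-sectional area
    area_b = max(depth - a_thick, 0.0) ** 2   # similar triangle below the A-horizon base
    area_a = area_total - area_b
    mass_a, mass_b = rho_a * area_a, rho_b * area_b
    return (soc_a * mass_a + soc_b * mass_b) / (mass_a + mass_b)

soc = np.array([composite_soc(d) for d in sample_depths(1000)])
print(f"mean SOC {soc.mean():.1f} g/kg, variance of log10(SOC) {np.log10(soc).var():.4f}")
```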

  2. U.S. Geological Survey nutrient preservation experiment; nutrient concentration data for surface-, ground-, and municipal-supply water samples and quality-assurance samples

    USGS Publications Warehouse

    Patton, Charles J.; Truitt, Earl P.

    1995-01-01

    This report is a compilation of analytical results from a study conducted at the U.S. Geological Survey, National Water Quality Laboratory (NWQL) in 1992 to assess the effectiveness of three field treatment protocols to stabilize nutrient concentrations in water samples stored for about 1 month at 4°C. Field treatments tested were chilling, adjusting sample pH to less than 2 with sulfuric acid and chilling, and adding 52 milligrams of mercury (II) chloride per liter of sample and chilling. Field treatments of samples collected for determination of ammonium, nitrate plus nitrite, nitrite, dissolved Kjeldahl nitrogen, orthophosphate, and dissolved phosphorus included 0.45-micrometer membrane filtration. Only total Kjeldahl nitrogen and total phosphorus were determined in unfiltered samples. Data reported here pertain to water samples collected in April and May 1992 from 15 sites within the continental United States. Also included in this report are analytical results for nutrient concentrations in synthetic reference samples that were analyzed concurrently with real samples.

  3. Multiplex PCR method for MinION and Illumina sequencing of Zika and other virus genomes directly from clinical samples

    PubMed Central

    Quick, Josh; Grubaugh, Nathan D; Pullan, Steven T; Claro, Ingra M; Smith, Andrew D; Gangavarapu, Karthik; Oliveira, Glenn; Robles-Sikisaka, Refugio; Rogers, Thomas F; Beutler, Nathan A; Burton, Dennis R; Lewis-Ximenez, Lia Laura; de Jesus, Jaqueline Goes; Giovanetti, Marta; Hill, Sarah; Black, Allison; Bedford, Trevor; Carroll, Miles W; Nunes, Marcio; Alcantara, Luiz Carlos; Sabino, Ester C; Baylis, Sally A; Faria, Nuno; Loose, Matthew; Simpson, Jared T; Pybus, Oliver G; Andersen, Kristian G; Loman, Nicholas J

    2018-01-01

    Genome sequencing has become a powerful tool for studying emerging infectious diseases; however, genome sequencing directly from clinical samples without isolation remains challenging for viruses such as Zika, where metagenomic sequencing methods may generate insufficient numbers of viral reads. Here we present a protocol for generating coding-sequence complete genomes comprising an online primer design tool, a novel multiplex PCR enrichment protocol, optimised library preparation methods for the portable MinION sequencer (Oxford Nanopore Technologies) and the Illumina range of instruments, and a bioinformatics pipeline for generating consensus sequences. The MinION protocol does not require an internet connection for analysis, making it suitable for field applications with limited connectivity. Our method relies on multiplex PCR for targeted enrichment of viral genomes from samples containing as few as 50 genome copies per reaction. Viral consensus sequences can be achieved starting with clinical samples in 1-2 days following a simple laboratory workflow. This method has been successfully used by several groups studying Zika virus evolution and is facilitating an understanding of the spread of the virus in the Americas. PMID:28538739
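
    The primer design tool and the published primer scheme are not reproduced here. The sketch below only illustrates the tiling arithmetic behind such a multiplex scheme, overlapping amplicon windows split into two alternating pools so that neighbouring amplicons are not amplified in the same reaction; the amplicon length, overlap, and genome length are assumed round numbers.

```python
# Minimal sketch of tiling-amplicon window arithmetic for a multiplex PCR
# scheme: overlapping windows covering a viral genome, alternating between
# two primer pools. Numbers are illustrative, not the published scheme.

genome_length = 10_800   # bp, approximate Zika genome size
amplicon_len = 400
overlap = 100
step = amplicon_len - overlap

amplicons = []
start = 0
while start + amplicon_len < genome_length:
    amplicons.append((start, start + amplicon_len))
    start += step
amplicons.append((genome_length - amplicon_len, genome_length))  # close the 3' end

pool_1 = amplicons[0::2]   # odd-numbered amplicons
pool_2 = amplicons[1::2]   # even-numbered amplicons
print(f"{len(amplicons)} amplicons: {len(pool_1)} in pool 1, {len(pool_2)} in pool 2")
print("first three windows:", amplicons[:3])
```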

  4. HEALTH-SCREENING PROTOCOLS FOR VINACEOUS AMAZONS (AMAZONA VINACEA) IN A REINTRODUCTION PROJECT.

    PubMed

    Saidenberg, André B S; Zuniga, Eveline; Melville, Priscilla A; Salaberry, Sandra; Benites, Nilson R

    2015-12-01

    Reintroduction is a growing field in the conservation of endangered species. The vinaceous Amazon parrot (Amazona vinacea) is extinct in several areas, and a project to release confiscated individuals to their former range is currently underway. The objective of this study was to evaluate and improve the selection and treatment of individual release candidates by detecting possible pathogen carriers using samples taken before and during release. As part of prerelease health protocols, samples were obtained from 29 parrots on three different occasions while in captivity and once after their release. Samples were screened for paramyxovirus type 1, avian influenza, poxvirus, coronavirus, psittacine herpesvirus 1, Chlamydia psittaci , enteropathogenic Escherichia coli (EPEC), Salmonella spp., and endoparasites. The majority of samples returned negative results, with the exception of two individuals that tested positive for C. psittaci in the first sampling and for Ascaridia spp. in the second pooled sampling. Treatments for C. psittaci and endoparasites were administered prior to release, and negative results were obtained in subsequent exams. The number of positive results for E. coli (non-EPEC) decreased during the rehabilitation period. Adequate quarantine procedures and health examinations greatly minimize disease risks. The protocols employed in this study resulted in acceptable health status in accordance with current environmental legislation in Brazil. Additionally, protocols allowed informed decisions to release candidates, minimized risks, and favored the selection of healthy individuals, thereby contributing to the recovery of this species. It is important to determine appropriate minimum health-screening protocols when advanced diagnostics may not be available or high costs make the tests prohibitive in countries where confiscations occur. We hypothesize that a minimum panel of tests of pooled samples can serve as an alternative approach that minimizes costs and overall workload and supports projects intended to restore and promote flagship species and hamper their illegal trade.

  5. Refinement of NMR structures using implicit solvent and advanced sampling techniques.

    PubMed

    Chen, Jianhan; Im, Wonpil; Brooks, Charles L

    2004-12-15

    NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol of utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J.Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified force field and then refines these structures with implicit solvent using the REX method. We systematically examine the reliability and efficacy of this protocol using four proteins of various sizes ranging from the 56-residue B1 domain of Streptococcal protein G to the 370-residue Maltose-binding protein. Significant improvement in the structures was observed in all cases when refinement was based on low-redundancy restraint data. The proposed protocol is anticipated to be particularly useful in early stages of NMR structure determination where a reliable estimate of the native fold from limited data can significantly expedite the overall process. This refinement procedure is also expected to be useful when redundant experimental data are not readily available, such as for large multidomain biomolecules and in solid-state NMR structure determination.
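
    The refinement protocol described above combines restrained simulation with replica exchange (REX). The core of REX is the temperature-swap acceptance test sketched below; the energies, temperatures, and units are illustrative and are not tied to the force field or software used in the paper.

```python
import math
import random

# Minimal sketch of the replica-exchange (REX) swap step: neighbouring
# replicas at temperatures T_i < T_j exchange conformations with the
# Metropolis probability min(1, exp[(1/kT_i - 1/kT_j)(E_i - E_j)]).
K_B = 0.0019872  # kcal/(mol K)

def swap_accepted(e_i, e_j, t_i, t_j, rng=random.random):
    delta = (1.0 / (K_B * t_i) - 1.0 / (K_B * t_j)) * (e_i - e_j)
    if delta >= 0.0:            # energetically favourable swap is always accepted
        return True
    return rng() < math.exp(delta)

# Example: a cold replica stuck at higher energy than its warmer neighbour
# accepts the swap, letting near-native, low-energy structures migrate down
# the temperature ladder. Energy/temperature values are arbitrary.
print(swap_accepted(e_i=-1200.0, e_j=-1210.0, t_i=300.0, t_j=320.0))
```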

  6. Effects of field storage method on E. coli concentrations measured in storm water runoff

    USDA-ARS?s Scientific Manuscript database

    Storm water runoff is increasingly assessed for fecal indicator organisms (e.g., Escherichia coli, E. coli) and its impact on contact recreation. Concurrently, use of autosamplers along with logistic, economic, technical, and personnel barriers are challenging conventional protocols for sample hold...

  7. Marine Mammal Necropsy: An Introductory Guide for Stranding Responders and Field Biologists

    DTIC Science & Technology

    2007-09-01

    the researcher or lab for required tissues and proper sample storage protocols (chill, fix, freeze and/or place in viral transport media). The most...tissues and fluids such as: liver, kidney, serum, aqueous humor, stomach contents, intestinal contents, feces, and urine. Tissue samples can be stored...refer to the Figure (2-1) for further explanation on frozen sample storage. The first label is written in black Sharpie on a 1 - 2 square inch piece of

  8. Macro to microfluidics system for biological environmental monitoring.

    PubMed

    Delattre, Cyril; Allier, Cédric P; Fouillet, Yves; Jary, Dorothée; Bottausci, Frederic; Bouvier, Denis; Delapierre, Guillaume; Quinaud, Manuelle; Rival, Arnaud; Davoust, Laurent; Peponnet, Christine

    2012-01-01

    Biological environmental monitoring (BEM) is a growing field of research which challenges both microfluidics and system automation. The aim is to develop a transportable system whose analysis throughput satisfies the following requirements: (i) fully autonomous, (ii) complete protocol integration from sample collection to final analysis, (iii) detection of diluted molecules or biological species in a large real-life environmental sample volume, (iv) robustness and (v) flexibility and versatility. This paper discusses all these specifications in order to define an original fluidic architecture based on three connected modules: a sampling module, a sample preparation module and a detection module. The sample preparation module concentrates the pathogens present in a few mL of a complex, unknown sample and purifies the pathogens' nucleic acids into a few μL of a controlled buffer. To do so, a two-step concentration protocol based on magnetic beads is automated in a reusable macro-to-micro fluidic system. The detection module is a PCR-based miniaturized platform using digital microfluidics, where reactions are performed in 64 nL droplets handled by electrowetting on dielectric (EWOD) actuation. The design and manufacture of the two modules are reported, as well as their respective performances. To demonstrate the integration of the complete protocol in the same system, first results of pathogen detection are shown. Copyright © 2012 Elsevier B.V. All rights reserved.
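
    The module volumes quoted above ("a few mL" down to "a few μL", with 64 nL reaction droplets) imply the rough enrichment arithmetic sketched below. The specific volumes used are assumptions for illustration, not measured figures from the paper.

```python
# Rough arithmetic of the enrichment implied by the record above. Volumes
# are assumed order-of-magnitude values, not the paper's measured figures.
sample_ml = 5.0     # environmental sample processed by the prep module (assumed)
eluate_ul = 5.0     # nucleic acids recovered after the two bead steps (assumed)
droplet_nl = 64.0   # EWOD reaction droplet in the detection module

concentration_factor = (sample_ml * 1000.0) / eluate_ul
fraction_per_droplet = (droplet_nl / 1000.0) / eluate_ul

print(f"overall concentration factor ~ {concentration_factor:.0f}x")
print(f"fraction of the eluate interrogated per 64 nL droplet ~ {fraction_per_droplet:.1%}")
```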

  9. Measuring stream temperature with digital data loggers: a user's guide

    Treesearch

    Jason Dunham; Gwynne Chandler; Bruce Rieman; Don Martin

    2005-01-01

    Digital data loggers (thermographs) are among the most widespread instruments in use for monitoring physical conditions in aquatic ecosystems. The intent of this protocol is to provide guidelines for selecting and programming data loggers, sampling water temperatures in the field, data screening and analysis, and data archiving.

  10. Pre-Mission Input Requirements to Enable Successful Sample Collection by A Remote Field/EVA Team

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Lim, D. S. S.; Young, K. E.; Brunner, A.; Elphic, R. E.; Horne, A.; Kerrigan, M. C.; Osinski, G. R.; Skok, J. R.; Squyres, S. W.; hide

    2016-01-01

    The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team, part of the Solar System Exploration Virtual Institute (SSERVI), is a field-based research program aimed at generating strategic knowledge in preparation for human and robotic exploration of the Moon, near-Earth asteroids, Phobos and Deimos, and beyond. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, is sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. We examined the in situ sample characterization and real-time decision-making process of the astronauts, with a guiding hypothesis that pre-mission training that included detailed background information on the analytical fate of a sample would better enable future astronauts to select samples that would best meet science requirements. We conducted three tests of this hypothesis over several days in the field. Our investigation was designed to document processes, tools and procedures for crew sampling of planetary targets. This was not meant to be a blind, controlled test of crew efficacy, but rather an effort to explicitly recognize the relevant variables that enter into sampling protocols and to develop recommendations for crew and backroom training in future endeavors.

  11. Safe and cost-effective protocol for shipment of samples from Foot-and-Mouth Disease suspected cases for laboratory diagnostic.

    PubMed

    Romey, A; Relmy, A; Gorna, K; Laloy, E; Zientara, S; Blaise-Boisseau, S; Bakkali Kassimi, L

    2018-02-01

    An essential step towards the global control and eradication of foot-and-mouth disease (FMD) is the identification of circulating virus strains in endemic regions to implement adequate outbreak control measures. However, due to the high biological risk and the requirement for biological samples to be shipped frozen, the cost of shipping samples becomes one of the major obstacles hindering submission of suspected samples to reference laboratories for virus identification. In this study, we report the development of a cost-effective and safe method for shipment of FMD samples. The protocol is based on the inactivation of FMD virus (FMDV) on a lateral flow device (LFD, a pen-side test routinely used in the field for rapid immunodetection of FMDV), allowing its subsequent detection and typing by RT-PCR and recovery of live virus upon RNA transfection into permissive cells. After live FMDV is collected onto the LFD strip and soaked in 0.2% citric acid solution, the virus is totally inactivated. Viral RNA is still detectable by real-time RT-PCR following inactivation, and the virus strain can be characterized by sequencing of the VP1 coding region. In addition, live virus can be rescued by transfecting RNA extracted from the treated LFD into cells. This protocol should help promote submission of FMD-suspected samples to reference laboratories (by reducing the cost of sample shipping) and thus characterization of FMDV strains circulating in endemic regions. © 2017 Blackwell Verlag GmbH.

  12. Comparison of thermal and microwave paleointensity estimates in specimens that violate Thellier's laws

    NASA Astrophysics Data System (ADS)

    Grappone, J. M., Jr.; Biggin, A. J.; Barrett, T. J.; Hill, M. J.

    2017-12-01

    Deep in the Earth, thermodynamic behavior drives the geodynamo and creates the Earth's magnetic field. Determining how the strength of the field, its paleointensity (PI), varies with time is vital to our understanding of Earth's evolution. Thellier-style paleointensity experiments assume the presence of non-interacting, single domain (SD) magnetic particles, which follow Thellier's laws. Most natural rocks, however, contain larger, multi-domain (MD) or interacting single domain (ISD) particles that often violate these laws and cause experiments to fail. Even for samples that pass reliability criteria designed to minimize the impact of MD or ISD grains, different PI techniques can give systematically different estimates, implying violation of Thellier's laws. Our goal is to identify any disparities in PI results that may be explainable by protocol-specific MD and ISD behavior and determine optimum methods to maximize accuracy. Volcanic samples from the Hawai'ian SOH1 borehole previously produced method-dependent PI estimates. Previous studies showed consistently lower PI values when using a microwave (MW) system and the perpendicular method than when using the original thermal Thellier-Thellier (OT) technique. However, the data were ambiguous regarding the cause of the discrepancy. The diverging estimates appeared to be either the result of using OT instead of the perpendicular method or the result of using MW protocols instead of thermal protocols. Comparison experiments were conducted using the thermal perpendicular method and the microwave OT technique to bridge the gap. Preliminary data generally show that the perpendicular method gives lower estimates than OT for comparable H_lab values. MW estimates are also generally lower than thermal estimates using the same protocol.
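
    For context, a Thellier-style estimate is ultimately the slope of NRM remaining against pTRM gained (an Arai plot) scaled by the applied laboratory field. The sketch below shows that calculation on synthetic data; neither the points nor the assumed H_lab value come from the SOH1 samples.

```python
import numpy as np

# Minimal sketch of a Thellier-style paleointensity estimate: the ancient
# field is the laboratory field scaled by the (negative) slope of the Arai
# plot. The data points below are synthetic, not borehole measurements.
h_lab = 30.0  # microtesla, assumed laboratory field

ptrm_gained = np.array([0.0, 0.1, 0.25, 0.45, 0.65, 0.85])   # normalised
nrm_remaining = np.array([1.0, 0.93, 0.82, 0.68, 0.53, 0.38])

slope, intercept = np.polyfit(ptrm_gained, nrm_remaining, 1)
paleointensity = abs(slope) * h_lab
print(f"Arai slope = {slope:.2f}, paleointensity estimate ~ {paleointensity:.1f} uT")
```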

  13. A simplified and cost-effective enrichment protocol for the isolation of Campylobacter spp. from retail broiler meat without microaerobic incubation

    PubMed Central

    2011-01-01

    Background To simplify the methodology for the isolation of Campylobacter spp. from retail broiler meat, we evaluated 108 samples (breasts and thighs) using an unpaired sample design. The enrichment broths were incubated under aerobic conditions (subsamples A) and, for comparison, under microaerobic conditions (subsamples M) as recommended by current reference protocols. Sensors were used to measure the dissolved oxygen (DO) in the broth and the percentage of oxygen (O2) in the head space of the bags used for enrichment. Campylobacter isolates were identified with multiplex PCR assays and typed using pulsed-field gel electrophoresis (PFGE). Ribosomal intergenic spacer analyses (RISA) and denaturing gradient gel electrophoresis (DGGE) were used to study the bacterial communities of subsamples M and A after 48 h enrichment. Results The number of Campylobacter positive subsamples was similar for A and M when all samples were combined (P = 0.81) and when samples were analyzed by product (breast: P = 0.75; thigh: P = 1.00). Oxygen sensors showed that DO values in the broth were around 6 ppm and O2 values in the head space were 14-16% throughout incubation. PFGE demonstrated high genomic similarity of isolates in the majority of the samples in which isolates were obtained from subsamples A and M. RISA and DGGE results showed a large variability in the bacterial populations that could be attributed to sample-to-sample variations and not enrichment conditions (aerobic or microaerobic). These data also suggested that current sampling protocols are not optimized to determine the true number of Campylobacter positive samples in retail broiler meat. Conclusions Decreased DO in enrichment broths is naturally achieved. This simplified, cost-effective enrichment protocol with aerobic incubation could be incorporated into reference methods for the isolation of Campylobacter spp. from retail broiler meat. PMID:21812946

  14. Visual loop-mediated isothermal amplification (LAMP) for the rapid diagnosis of Enterocytozoon hepatopenaei (EHP) infection.

    PubMed

    T, Sathish Kumar; A, Navaneeth Krishnan; J, Joseph Sahaya Rajan; M, Makesh; K P, Jithendran; S V, Alavandi; K K, Vijayan

    2018-05-01

    The emerging microsporidian parasite Enterocytozoon hepatopenaei (EHP), the causative agent of hepatopancreatic microsporidiosis, has been widely reported in shrimp-farming countries. EHP infection can be detected by light microscopy observation of spores (1.7 × 1 μm) in stained hepatopancreas (HP) tissue smears, HP tissue sections, and fecal samples. EHP can also be detected by polymerase chain reaction (PCR) targeting the small subunit (SSU) ribosomal RNA (rRNA) gene or the spore wall protein gene (SWP). In this study, a rapid, sensitive, specific, and closed tube visual loop-mediated isothermal amplification (LAMP) protocol combined with FTA cards was developed for the diagnosis of EHP. LAMP primers were designed based on the SSU rRNA gene of EHP. The target sequence of EHP was amplified at constant temperature of 65 °C for 45 min and amplified LAMP products were visually detected in a closed tube system by using SYBR™ green I dye. Detection limit of this LAMP protocol was ten copies. Field and clinical applicability of this assay was evaluated using 162 field samples including 106 HP tissue samples and 56 fecal samples collected from shrimp farms. Out of 162 samples, EHP could be detected in 62 samples (47 HP samples and 15 fecal samples). When compared with SWP-PCR as the gold standard, this EHP LAMP assay had 95.31% sensitivity, 98.98% specificity, and a kappa value of 0.948. This simple, closed tube, clinically evaluated visual LAMP assay has great potential for diagnosing EHP at the farm level, particularly under low-resource circumstances.
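
    The reported diagnostic statistics can be reproduced from a 2×2 table against the SWP-PCR gold standard. The counts below are back-calculated to be consistent with the stated sensitivity, specificity, kappa, and totals (162 samples, 62 LAMP-positive); they are an inference, not figures quoted from the paper.

```python
# Back-calculation of the reported diagnostic statistics. The 2x2 counts
# below are inferred to be consistent with the reported figures; they are
# not quoted directly from the paper.
tp, fn = 61, 3    # SWP-PCR positive samples: LAMP positive / LAMP negative
fp, tn = 1, 97    # SWP-PCR negative samples: LAMP positive / LAMP negative
n = tp + fn + fp + tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# Cohen's kappa: observed agreement vs agreement expected by chance.
p_observed = (tp + tn) / n
p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

print(f"sensitivity {sensitivity:.2%}, specificity {specificity:.2%}, kappa {kappa:.3f}")
```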

  15. Movement of particles using sequentially activated dielectrophoretic particle trapping

    DOEpatents

    Miles, Robin R.

    2004-02-03

    Manipulation of DNA and cells/spores using dielectrophoretic (DEP) forces to perform sample preparation protocols for polymerase chain reaction (PCR)-based assays for various applications. This is accomplished by movement of particles using sequentially activated dielectrophoretic particle trapping. DEP forces induce a dipole in particles, and these particles can be trapped in non-uniform fields. The particles can be trapped in the high-field-strength region of one set of electrodes. By switching off this field and switching on an adjacent set of electrodes, particles can be moved down a channel with little or no flow.
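
    The abstract describes trapping particles in the high-field-strength region of an electrode set. The time-averaged DEP force on a spherical particle is conventionally written with the Clausius-Mossotti factor; the sketch below evaluates that textbook expression and is an illustration of the general principle, not material from the patent.

```python
# Sketch: textbook time-averaged dielectrophoretic (DEP) force on a spherical
# particle, F = 2*pi*eps_m*r^3 * Re[K(w)] * grad(|E_rms|^2), where K is the
# Clausius-Mossotti factor. Illustrative only; not taken from the patent.
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def clausius_mossotti(eps_p, eps_m, sigma_p, sigma_m, omega):
    """Complex Clausius-Mossotti factor for a particle (p) in a medium (m)."""
    ep = eps_p * EPS0 - 1j * sigma_p / omega
    em = eps_m * EPS0 - 1j * sigma_m / omega
    return (ep - em) / (ep + 2 * em)

def dep_force(radius, eps_m, re_K, grad_E2):
    """Magnitude of the time-averaged DEP force (N)."""
    return 2 * np.pi * eps_m * EPS0 * radius**3 * re_K * grad_E2

# Hypothetical 1-um-radius particle in water near a microelectrode edge.
K = clausius_mossotti(eps_p=2.5, eps_m=78.0, sigma_p=1e-2, sigma_m=1e-3,
                      omega=2 * np.pi * 1e6)
print(dep_force(radius=1e-6, eps_m=78.0, re_K=K.real, grad_E2=1e13))
```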

  16. Chapter A6. Section 6.6. Alkalinity and Acid Neutralizing Capacity

    USGS Publications Warehouse

    Rounds, Stewart A.; Wilde, Franceska D.

    2002-01-01

    Alkalinity (determined on a filtered sample) and Acid Neutralizing Capacity (ANC) (determined on a whole-water sample) are measures of the ability of a water sample to neutralize strong acid. Alkalinity and ANC provide information on the suitability of water for uses such as irrigation, determining the efficiency of wastewater processes, determining the presence of contamination by anthropogenic wastes, and maintaining ecosystem health. In addition, alkalinity is used to gain insights on the chemical evolution of an aqueous system. This section of the National Field Manual (NFM) describes the USGS field protocols for alkalinity/ANC determination using either the inflection-point or Gran function plot methods, including calculation of carbonate species, and provides guidance on equipment selection.
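
    For the Gran function plot method mentioned above, a minimal sketch is shown below: past the equivalence point the Gran function F = (V0 + V)·10^(-pH) is approximately linear in titrant volume, and its x-intercept estimates the equivalence volume from which alkalinity follows. The titration points and acid normality are hypothetical, and the calculation is a simplified illustration rather than the full USGS NFM procedure.

```python
# Sketch of the Gran function approach to an alkalinity titration: past the
# equivalence point, F = (V0 + V) * 10**(-pH) is linear in titrant volume V,
# and its x-intercept estimates the equivalence volume Ve.
# Simplified illustration only, not the full USGS NFM procedure.
import numpy as np

def gran_alkalinity(v_titrant_mL, pH, v_sample_mL, acid_normality):
    """Alkalinity in mg/L as CaCO3 from titration points judged to lie past
    the equivalence point."""
    v = np.asarray(v_titrant_mL, dtype=float)
    F = (v_sample_mL + v) * 10.0 ** (-np.asarray(pH, dtype=float))
    slope, intercept = np.polyfit(v, F, 1)         # linear fit to the Gran function
    v_eq = -intercept / slope                      # x-intercept = equivalence volume (mL)
    alk_eq_per_L = v_eq * acid_normality / v_sample_mL
    return alk_eq_per_L * 50044.0                  # eq/L -> mg/L as CaCO3

# Hypothetical post-equivalence titration points; 0.16 N acid assumed.
print(gran_alkalinity([4.0, 4.5, 5.0, 5.5], [4.30, 4.05, 3.88, 3.75],
                      v_sample_mL=100.0, acid_normality=0.16))
```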

  17. Isothermal amplification of environmental DNA (eDNA) for direct field-based monitoring and laboratory confirmation of Dreissena sp.

    PubMed

    Williams, Maggie R; Stedtfeld, Robert D; Engle, Cathrine; Salach, Paul; Fakher, Umama; Stedtfeld, Tiffany; Dreelin, Erin; Stevenson, R Jan; Latimore, Jo; Hashsham, Syed A

    2017-01-01

    Loop-mediated isothermal amplification (LAMP) of aquatic invasive species environmental DNA (AIS eDNA) was used for rapid, sensitive, and specific detection of Dreissena sp. relevant to the Great Lakes (USA) basin. The method was validated for two uses including i) direct amplification of eDNA using a hand filtration system and ii) confirmation of the results after DNA extraction using a conventional thermal cycler run at isothermal temperatures. Direct amplification eliminated the need for DNA extraction and purification and allowed detection of target invasive species in grab or concentrated surface water samples, containing both free DNA as well as larger cells and particulates, such as veligers, eggs, or seeds. The direct amplification method validation was conducted using Dreissena polymorpha and Dreissena bugensis and uses up to 1 L grab water samples for high target abundance (e.g., greater than 10 veligers (larval mussels) per L for Dreissena sp.) or 20 L samples concentrated through 35 μm nylon screens for low target abundance, at less than 10 veligers per liter water. Surface water concentrate samples were collected over a period of three years, mostly from inland lakes in Michigan with the help of a network of volunteers. Field samples collected from 318 surface water locations included i) filtered concentrate for direct amplification validation and ii) 1 L grab water sample for eDNA extraction and confirmation. Though the extraction-based protocol was more sensitive (resulting in more positive detections than direct amplification), direct amplification could be used for rapid screening, allowing for quicker action times. For samples collected between May and August, results of eDNA direct amplification were consistent with known presence/absence of selected invasive species. A cross-platform smartphone application was also developed to disseminate the analyzed results to volunteers. Field tests of the direct amplification protocol using a portable device (Gene-Z) showed the method could be used in the field to obtain results within one hr (from sample to result). Overall, the direct amplification has the potential to simplify the eDNA-based monitoring of multiple aquatic invasive species. Additional studies are warranted to establish quantitative correlation between eDNA copy number, veliger, biomass or organismal abundance in the field.

  18. Isothermal amplification of environmental DNA (eDNA) for direct field-based monitoring and laboratory confirmation of Dreissena sp.

    PubMed Central

    Stedtfeld, Robert D.; Engle, Cathrine; Salach, Paul; Fakher, Umama; Stedtfeld, Tiffany; Dreelin, Erin; Stevenson, R. Jan; Latimore, Jo; Hashsham, Syed A.

    2017-01-01

    Loop-mediated isothermal amplification (LAMP) of aquatic invasive species environmental DNA (AIS eDNA) was used for rapid, sensitive, and specific detection of Dreissena sp. relevant to the Great Lakes (USA) basin. The method was validated for two uses including i) direct amplification of eDNA using a hand filtration system and ii) confirmation of the results after DNA extraction using a conventional thermal cycler run at isothermal temperatures. Direct amplification eliminated the need for DNA extraction and purification and allowed detection of target invasive species in grab or concentrated surface water samples, containing both free DNA as well as larger cells and particulates, such as veligers, eggs, or seeds. The direct amplification method validation was conducted using Dreissena polymorpha and Dreissena bugensis and uses up to 1 L grab water samples for high target abundance (e.g., greater than 10 veligers (larval mussels) per L for Dreissena sp.) or 20 L samples concentrated through 35 μm nylon screens for low target abundance, at less than 10 veligers per liter water. Surface water concentrate samples were collected over a period of three years, mostly from inland lakes in Michigan with the help of a network of volunteers. Field samples collected from 318 surface water locations included i) filtered concentrate for direct amplification validation and ii) 1 L grab water sample for eDNA extraction and confirmation. Though the extraction-based protocol was more sensitive (resulting in more positive detections than direct amplification), direct amplification could be used for rapid screening, allowing for quicker action times. For samples collected between May and August, results of eDNA direct amplification were consistent with known presence/absence of selected invasive species. A cross-platform smartphone application was also developed to disseminate the analyzed results to volunteers. Field tests of the direct amplification protocol using a portable device (Gene-Z) showed the method could be used in the field to obtain results within one hr (from sample to result). Overall, the direct amplification has the potential to simplify the eDNA-based monitoring of multiple aquatic invasive species. Additional studies are warranted to establish quantitative correlation between eDNA copy number, veliger, biomass or organismal abundance in the field. PMID:29036210

  19. Quantification of trace elements and speciation of iron in atmospheric particulate matter

    NASA Astrophysics Data System (ADS)

    Upadhyay, Nabin

    Trace metal species play important roles in atmospheric redox processes and in the generation of oxidants in cloud systems. The chemical impact of these elements on atmospheric and cloud chemistry is dependent on their occurrence, solubility and speciation. First, analytical protocols have been developed to determine trace elements in particulate matter samples collected for carbonaceous analysis. The validated novel protocols were applied to the determination of trace elements in particulate samples collected in the remote marine atmosphere and urban areas in Arizona to study air pollution issues. The second part of this work investigates solubility and speciation in environmental samples. A detailed study on the impact of the nature and strength of buffer solutions on solubility and speciation of iron led to a robust protocol, allowing for comparative measurements in matrices representative of cloud water conditions. Application of this protocol to samples from different environments showed low iron solubility (less than 1%) in dust-impacted events and higher solubility (5%) in anthropogenically impacted urban samples. In most cases, Fe(II) was the dominant oxidation state in the soluble fraction of iron. The analytical protocol was then applied to investigate iron processing by fogs. Field observations showed that only a small fraction (1%) of iron was scavenged by fog droplets, in which the soluble and insoluble fractions were similar. A coarse time resolution limited detailed insights into redox cycling within the fog system. Overall results suggested that the major iron species in the droplets was Fe(II) (80% of soluble iron). Finally, the occurrence and sources of emerging organic pollutants in the urban atmosphere were investigated. Synthetic musk species are ubiquitous in the urban environment (less than 5 ng m-3) and investigations at wastewater treatment plants showed that wastewater aeration basins emit a substantial amount of these species to the atmosphere.
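
    The solubility and speciation figures quoted above (roughly 1-5% soluble iron, with about 80% of the soluble fraction present as Fe(II)) reduce to simple fractional calculations. The sketch below shows that arithmetic with hypothetical values chosen only to mirror those magnitudes.

```python
# Sketch: fractional iron solubility and the Fe(II) share of the soluble pool,
# expressed as percentages. Inputs share the same units (e.g. ng per filter)
# and the values are hypothetical, chosen only to mirror the magnitudes
# discussed in the abstract (~1-5% soluble, ~80% Fe(II)).
def iron_speciation_summary(total_fe, soluble_fe, soluble_fe2):
    pct_soluble = 100.0 * soluble_fe / total_fe
    pct_fe2_of_soluble = 100.0 * soluble_fe2 / soluble_fe
    return pct_soluble, pct_fe2_of_soluble

print(iron_speciation_summary(total_fe=200.0, soluble_fe=10.0, soluble_fe2=8.0))
# -> (5.0, 80.0): 5% of the iron is soluble, 80% of the soluble iron is Fe(II)
```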

  20. Sample of a client intake information protocol: a synopsis and rationale.

    PubMed

    Green, Shari

    2012-11-01

    The utilization of standardized comprehensive forms in the field of orofacial myology is crucial as this profession continues to grow and establish assessment and treatment protocols. This article formally presents a comprehensive health history intake form currently in use, and highlights the rationale for each particular question within this form in an effort to explore the evidence-based theory behind each question utilized. Obtaining a thorough health history, as it pertains to this profession, allows the clinician to plan the most appropriate therapeutic strategy and to assess the individual criteria necessary for successful orofacial myofunctional habituation.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ottesen, Elizabeth A.; Marin, Roman; Preston, Christina M.

    Planktonic microbial activity and community structure are dynamic and can change dramatically on time scales of hours to days. Yet, for logistical reasons, this temporal scale is typically undersampled in the marine environment. In order to facilitate higher-resolution, long-term observation of microbial diversity and activity, we developed a protocol for automated collection and fixation of marine microbes using the Environmental Sample Processor (ESP) platform. The protocol applies a preservative (RNALater) to cells collected on filters, for long-term storage and preservation of total cellular RNA. Microbial samples preserved using this protocol yielded high-quality RNA after 30 days of storage at room temperature, or onboard the ESP at in situ temperatures. Pyrosequencing of complementary DNA libraries generated from ESP-collected and preserved samples yielded transcript abundance profiles nearly indistinguishable from those derived from conventionally treated replicate samples. To demonstrate the utility of the method, we used a moored ESP to remotely and autonomously collect Monterey Bay seawater for metatranscriptomic analysis. Community RNA was extracted and pyrosequenced from samples collected at four time points over the course of a single day. In all four samples, the oxygenic photoautotrophs were predominantly eukaryotic, while the bacterial community was dominated by Polaribacter-like Flavobacteria and a Rhodobacterales bacterium sharing high similarity with Rhodobacterales sp. HTCC2255. However, each time point was associated with distinct species abundance and gene transcript profiles. These laboratory and field tests confirmed that autonomous collection and preservation is a feasible and useful approach for characterizing the expressed genes and environmental responses of marine microbial communities.

  2. An efficient protocol for tissue sampling and DNA isolation from the stem bark of Leguminosae trees.

    PubMed

    Novaes, R M L; Rodrigues, J G; Lovato, M B

    2009-02-03

    Traditionally, molecular studies of plant species have used leaves as the source of DNA. However, sampling leaves from tall tree species can be quite difficult and expensive. We developed a sequence of procedures for using stem bark as a source of DNA from Leguminosae trees of the Atlantic Forest and the Cerrado. Leguminosae is an important species-rich family in these two highly diverse and endangered biomes. A modified CTAB protocol for DNA isolation is described, and details of the procedures for sampling and storage of the bark are given. The procedures were initially developed for three species, and then their applicability for 15 other species was evaluated. DNA of satisfactory quality was obtained from the bark of all species. The amounts of DNA obtained from leaves were slightly higher than from bark samples, while its purity was the same. Storing the bark frozen or by drying in silica gel yielded similar results. Polymerase chain reaction amplification worked for both plastid and nuclear genomes. This alternative for isolating DNA from bark samples of trees facilitates field work with these tree species.
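
    The abstract compares DNA yield and purity between leaf and bark extractions without giving the measurement details. A common way to make such a comparison is UV absorbance; the sketch below assumes the usual conversion of 50 ng/µL per A260 unit for double-stranded DNA and the A260/A280 purity ratio, as a generic illustration rather than the paper's specific procedure.

```python
# Sketch: estimating double-stranded DNA concentration and purity from UV
# absorbance, a common way to compare yield and purity between extraction
# protocols. Generic calculation, not the paper's specific method.
def dsdna_yield_and_purity(a260, a280, dilution_factor=1.0):
    concentration_ng_per_uL = a260 * 50.0 * dilution_factor   # 1 A260 unit ~ 50 ng/uL dsDNA
    purity_ratio = a260 / a280                                 # ~1.8 suggests protein-free DNA
    return concentration_ng_per_uL, purity_ratio

print(dsdna_yield_and_purity(a260=0.45, a280=0.25, dilution_factor=10.0))
```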

  3. Use of Electronic Hand-held Devices for Collection of Savannah River Site Environmental Data - 13329

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marberry, Hugh; Moore, Winston

    2013-07-01

    Savannah River Nuclear Solutions has begun using Xplore Tablet PCs to collect data in the field for soil samples, groundwater samples, air samples and round sheets at the Savannah River Site (SRS). EPA guidelines for groundwater sampling are incorporated into the application to ensure the sample technician follows the proper protocol. The sample technician is guided through the process for sampling and round sheet data collection by a series of menus and input boxes. Field measurements and well stabilization information are entered into the tablet for uploading into the Environmental Restoration Data Management System (ERDMS). The process helps to eliminate input errors and provides data integrity. A soil sample technician has the ability to collect information about the sample location and field parameters, describe the soil sample, print bottle labels, and print the chain of custody for the sample that they have collected. An air sample technician has the ability to record flow, pressure, and hours of operation, and to print bottle labels and the chain of custody for samples they collect. Round sheets are collected using the information provided in the various procedures. The data are collected and uploaded into ERDMS. The equipment used is weatherproof and hardened for field use. Global Positioning System (GPS) capabilities are integrated into the applications to provide the location where samples were collected and to help sample technicians locate wells that are not visited often. (authors)
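
    As a rough illustration of the kind of structured record such a tablet application might capture before upload, the sketch below defines a minimal sample record with GPS coordinates, field parameters, and a chain-of-custody holder. The field names, example values, and upload target are assumptions for illustration, not the actual SRS/ERDMS schema.

```python
# Sketch: a minimal, hypothetical field-sample record of the kind a tablet
# application might capture before upload to a data management system such as
# ERDMS. Field names and structure are assumptions, not the SRS schema.
from dataclasses import dataclass, field, asdict
from datetime import datetime
from typing import Optional

@dataclass
class FieldSampleRecord:
    sample_id: str
    medium: str                      # e.g. "soil", "groundwater", "air"
    collected_at: datetime
    latitude: float                  # from the integrated GPS
    longitude: float
    field_parameters: dict = field(default_factory=dict)   # pH, turbidity, flow, ...
    custodian: Optional[str] = None  # chain-of-custody holder at time of record

record = FieldSampleRecord("GW-001", "groundwater", datetime.utcnow(),
                           33.27, -81.74, {"pH": 6.8, "turbidity_NTU": 3.2}, "J. Doe")
print(asdict(record))
```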

  4. Protocol for monitoring forest-nesting birds in National Park Service parks

    USGS Publications Warehouse

    Dawson, Deanna K.; Efford, Murray G.

    2013-01-01

    These documents detail the protocol for monitoring forest-nesting birds in National Park Service parks in the National Capital Region Network (NCRN). In the first year of sampling, counts of birds should be made at 384 points on the NCRN spatially randomized grid, developed to sample terrestrial resources. Sampling should begin on or about May 20 and continue into early July; on each day the sampling period begins at sunrise and ends five hours later. Each point should be counted twice, once in the first half of the field season and once in the second half, with visits made by different observers, balancing the within-season coverage of points and their spatial coverage by observers, and allowing observer differences to be tested. Three observers, skilled in identifying birds of the region by sight and sound and with previous experience in conducting timed counts of birds, will be needed for this effort. Observers should be randomly assigned to ‘routes’ consisting of eight points, in close proximity and, ideally, in similar habitat, that can be covered in one morning. Counts are 10 minutes in length, subdivided into four 2.5-min intervals. Within each time interval, new birds (i.e., those not already detected) are recorded as within or beyond 50 m of the point, based on where first detected. Binomial distance methods are used to calculate annual estimates of density for species. The data are also amenable to estimation of abundance and detection probability via the removal method. Generalized linear models can be used to assess between-year changes in density estimates or unadjusted count data. This level of sampling is expected to be sufficient to detect a 50% decline in 10 years for approximately 50 bird species, including 14 of 19 species that are priorities for conservation efforts, if analyses are based on unadjusted count data, and for 30 species (6 priority species) if analyses are based on density estimates. The estimates of required sample sizes are based on the mean number of individuals detected per 10 minutes in available data from surveys in three NCRN parks. Once network-wide data from the first year of sampling are available, this and other aspects of the protocol should be re-assessed, and changes made as desired or necessary before the start of the second field season. Thereafter, changes should not be made to the field methods, and sampling should be conducted annually for at least ten years. NCRN staff should keep apprised of new analytical methods developed for analysis of point-count data.
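
    For the removal-method analysis mentioned above, a minimal sketch is shown below: assuming a constant per-interval detection probability, the numbers of new birds first detected in each of the four 2.5-minute intervals are fitted by maximum likelihood, and abundance is the detected total scaled by the overall detection probability. The interval counts are hypothetical, and this is a generic constant-p removal model rather than the NCRN analysis itself.

```python
# Sketch: constant-probability removal model for point counts split into k
# equal intervals (here four 2.5-min intervals). The per-interval detection
# probability p is estimated by maximum likelihood from the numbers of NEW
# birds first detected in each interval; abundance is the detected total
# scaled by the overall detection probability. Counts below are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

def removal_estimates(new_detections):
    y = np.asarray(new_detections, dtype=float)   # new birds per interval, j = 1..k
    k, n = len(y), y.sum()

    def neg_log_lik(p):
        j = np.arange(1, k + 1)
        cell = p * (1 - p) ** (j - 1)             # P(first detected in interval j)
        cond = cell / (1 - (1 - p) ** k)          # conditional on being detected at all
        return -np.sum(y * np.log(cond))

    res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
    p_hat = res.x
    p_overall = 1 - (1 - p_hat) ** k
    return p_hat, p_overall, n / p_overall        # per-interval p, overall p, abundance

print(removal_estimates([14, 7, 4, 2]))
```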

  5. Updated archaeointensity dataset from the SW Pacific

    NASA Astrophysics Data System (ADS)

    Hill, Mimi; Nilsson, Andreas; Holme, Richard; Hurst, Elliot; Turner, Gillian; Herries, Andy; Sheppard, Peter

    2016-04-01

    It is well known that there are far more archaeomagnetic data from the Northern Hemisphere than from the Southern. Here we present a compilation of archaeointensity data from the SW Pacific region covering the past 3000 years. The results have primarily been obtained from a collection of ceramics from the SW Pacific Islands including Fiji, Tonga, Papua New Guinea, New Caledonia and Vanuatu. In addition we present results obtained from heated clay balls from Australia. The microwave method has predominantly been used with a variety of experimental protocols including IZZI and Coe variants. Standard Thellier archaeointensity experiments using the IZZI protocol have also been carried out on selected samples. The dataset is compared to regional predictions from current global geomagnetic field models, and the influence of the new data on constraining the pfm9k family of global geomagnetic field models is explored.
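
    Thellier-type experiments such as the IZZI protocol ultimately estimate the ancient field from the slope of NRM remaining against pTRM gained (the Arai plot), scaled by the laboratory field. The sketch below shows that core calculation with hypothetical, idealized data; it is not the authors' processing chain, which would also apply selection criteria and corrections.

```python
# Sketch: the core of a Thellier-type paleointensity estimate. The ancient
# field is the negative of the best-fit Arai-plot slope (NRM remaining versus
# pTRM gained) multiplied by the laboratory field. Generic calculation with
# hypothetical data, not the authors' processing chain.
import numpy as np

def paleointensity_from_arai(nrm_remaining, ptrm_gained, lab_field_uT):
    slope, _ = np.polyfit(np.asarray(ptrm_gained, float),
                          np.asarray(nrm_remaining, float), 1)
    return -slope * lab_field_uT

# Hypothetical normalized demagnetization data and a 40 uT laboratory field.
nrm = [1.00, 0.82, 0.65, 0.47, 0.30, 0.15]
ptrm = [0.00, 0.20, 0.39, 0.60, 0.79, 0.96]
print(paleointensity_from_arai(nrm, ptrm, lab_field_uT=40.0))   # roughly 35 uT
```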

  6. Capillary Electrophoresis Analysis of Organic Amines and Amino Acids in Saline and Acidic Samples Using the Mars Organic Analyzer

    NASA Astrophysics Data System (ADS)

    Stockton, Amanda M.; Chiesl, Thomas N.; Lowenstein, Tim K.; Amashukeli, Xenia; Grunthaner, Frank; Mathies, Richard A.

    2009-11-01

    The Mars Organic Analyzer (MOA) has enabled the sensitive detection of amino acid and amine biomarkers in laboratory standards and in a variety of field sample tests. However, the MOA is challenged when samples are extremely acidic and saline or contain polyvalent cations. Here, we have optimized the MOA analysis, sample labeling, and sample dilution buffers to handle such challenging samples more robustly. Higher ionic strength buffer systems with pKa values near pH 9 were developed to provide better buffering capacity and salt tolerance. The addition of ethylenediaminetetraacetic acid (EDTA) ameliorates the negative effects of multivalent cations. The optimized protocol utilizes a 75 mM borate buffer (pH 9.5) for Pacific Blue labeling of amines and amino acids. After labeling, 50 mM (final concentration) EDTA is added to samples containing divalent cations to ameliorate their effects. This optimized protocol was used to successfully analyze amino acids in a saturated brine sample from Saline Valley, California, and a subcritical water extract of a highly acidic sample from the Río Tinto, Spain. This work expands the analytical capabilities of the MOA and increases its sensitivity and robustness for samples from extraterrestrial environments that may exhibit pH and salt extremes as well as metal ions.
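
    The choice of buffers with pKa values near 9 for labeling at pH 9.5 follows from the Henderson-Hasselbalch relation: buffering capacity is greatest when the working pH is close to the pKa. The sketch below evaluates the conjugate-base fraction around pH 9.5, assuming a borate pKa of roughly 9.24; this is a generic illustration, not part of the MOA protocol itself.

```python
# Sketch: Henderson-Hasselbalch relation, pH = pKa + log10([A-]/[HA]), used
# here to show why a buffer with pKa near 9 (e.g. borate, pKa ~9.24 assumed)
# retains useful capacity at the pH 9.5 labeling conditions.

def base_fraction(pH, pKa):
    """Fraction of the buffer present as the conjugate base at a given pH."""
    ratio = 10.0 ** (pH - pKa)        # [A-]/[HA]
    return ratio / (1.0 + ratio)

for pH in (9.0, 9.5, 10.0):
    print(pH, round(base_fraction(pH, pKa=9.24), 3))
```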

  7. Capillary electrophoresis analysis of organic amines and amino acids in saline and acidic samples using the Mars organic analyzer.

    PubMed

    Stockton, Amanda M; Chiesl, Thomas N; Lowenstein, Tim K; Amashukeli, Xenia; Grunthaner, Frank; Mathies, Richard A

    2009-11-01

    The Mars Organic Analyzer (MOA) has enabled the sensitive detection of amino acid and amine biomarkers in laboratory standards and in a variety of field sample tests. However, the MOA is challenged when samples are extremely acidic and saline or contain polyvalent cations. Here, we have optimized the MOA analysis, sample labeling, and sample dilution buffers to handle such challenging samples more robustly. Higher ionic strength buffer systems with pK(a) values near pH 9 were developed to provide better buffering capacity and salt tolerance. The addition of ethylenediaminetetraacetic acid (EDTA) ameliorates the negative effects of multivalent cations. The optimized protocol utilizes a 75 mM borate buffer (pH 9.5) for Pacific Blue labeling of amines and amino acids. After labeling, 50 mM (final concentration) EDTA is added to samples containing divalent cations to ameliorate their effects. This optimized protocol was used to successfully analyze amino acids in a saturated brine sample from Saline Valley, California, and a subcritical water extract of a highly acidic sample from the Río Tinto, Spain. This work expands the analytical capabilities of the MOA and increases its sensitivity and robustness for samples from extraterrestrial environments that may exhibit pH and salt extremes as well as metal ions.

  8. NEON Data Products: Supporting the Validation of GCOS Essential Climate Variables

    NASA Astrophysics Data System (ADS)

    Petroy, S. B.; Fox, A. M.; Metzger, S.; Thorpe, A.; Meier, C. L.

    2014-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale ecological observation platform designed to collect and disseminate data that contributes to understanding and forecasting the impacts of climate change, land use change, and invasive species on ecology. NEON will collect in-situ and airborne data over 60 sites across the US, including Alaska, Hawaii, and Puerto Rico. The NEON Biomass, Productivity, and Biogeochemistry protocols currently direct the collection of samples from distributed, gradient, and tower plots at each site, with sampling occurring either multiple times during the growing season, annually, or on three- or five-year centers (e.g. for coarse woody debris). These data are processed into a series of field-derived data products (e.g. Biogeochemistry, LAI, above ground Biomass, etc.), and when combined with the NEON airborne hyperspectral and LiDAR imagery, are used to support validation efforts of algorithms for deriving vegetation characteristics from the airborne data. Sites are further characterized using airborne data combined with in-situ tower measurements, to create additional data products of interest to the GCOS community, such as Albedo and fPAR. Presented here are a summary of tower/field/airborne sampling and observation protocols and examples of provisional datasets collected at NEON sites that may be used to support the ongoing validation of GCOS Essential Climate Variables.

  9. ColiSense, today's sample today: A rapid on-site detection of β-D-Glucuronidase activity in surface water as a surrogate for E. coli.

    PubMed

    Heery, Brendan; Briciu-Burghina, Ciprian; Zhang, Dian; Duffy, Gillian; Brabazon, Dermot; O'Connor, Noel; Regan, Fiona

    2016-01-01

    A sensitive field-portable fluorimeter with incubating capability and triplicate sample chambers was designed and built. The system was optimised for the on-site analysis of E. coli in recreational waters using fluorescence-based enzyme assays. The target analyte was β-D-Glucuronidase (GUS), which hydrolyses the synthetic substrate 6-Chloro-4-Methyl-Umbelliferyl-β-D-Glucuronide (6-CMUG) to release the fluorescent molecule 6-Chloro-4-Methyl-Umbelliferyl (6-CMU). The system was calibrated with 6-CMU standards. An LOD of 5 nM and a resolution of less than 1 nM were determined, while enzyme kinetic tests showed detection of activities below 1 pmol min(-1) mL(-1) of sample. A field-portable sample preparation and enzyme extraction protocol and a continuous assay were applied with the system to analyse freshwater and marine samples. Results from a one-day field trial are shown, which demonstrated the ability of the system to deliver results on-site within a 75 min period. Copyright © 2015 Elsevier B.V. All rights reserved.
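
    Activities in pmol min(-1) mL(-1), as reported above, can be obtained by converting the fluorescence trace to 6-CMU concentration via a calibration slope and fitting a linear rate. The sketch below shows that conversion; the calibration slope, assay and sample volumes, and readings are hypothetical placeholders rather than ColiSense parameters.

```python
# Sketch: converting a continuous GUS-assay fluorescence trace into enzyme
# activity (pmol of 6-CMU released per min per mL of sample). The calibration
# slope, volumes, and readings below are hypothetical placeholders.
import numpy as np

def gus_activity(times_min, fluorescence, cal_slope_rfu_per_nM,
                 assay_volume_mL, sample_volume_mL):
    """Enzyme activity in pmol 6-CMU released per min per mL of sample."""
    conc = np.asarray(fluorescence, float) / cal_slope_rfu_per_nM   # RFU -> nM (= pmol/mL)
    rate_per_mL_assay, _ = np.polyfit(np.asarray(times_min, float), conc, 1)
    total_pmol_per_min = rate_per_mL_assay * assay_volume_mL        # whole-assay rate
    return total_pmol_per_min / sample_volume_mL                    # per mL of sample

print(gus_activity([0, 15, 30, 45, 60, 75], [120, 260, 395, 540, 670, 810],
                   cal_slope_rfu_per_nM=25.0, assay_volume_mL=2.0,
                   sample_volume_mL=0.5))
```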

  10. Protocol for quantitative tracing of surface water with synthetic DNA

    NASA Astrophysics Data System (ADS)

    Foppen, J. W.; Bogaard, T. A.

    2012-04-01

    Based on experiments we carried out in 2010 with various synthetic single stranded DNA markers with a size of 80 nucleotides (ssDNA; Foppen et al., 2011), we concluded that ssDNA can be used to carry out spatially distributed multi-tracer experiments in the environment. The main advantages are an, in principle, unlimited number of tracers, environmental friendliness, and tracer recovery at very high dilution rates (the detection limit is very low). However, when ssDNA was injected in headwater streams, we found that at selected downstream locations, the total mass recovery was less than 100%. The exact reason for the low mass recovery was unknown. In order to start identifying the cause of the loss of mass in these surface waters, and to increase our knowledge of the behaviour of synthetic ssDNA in the environment, we examined the effect of laboratory and field protocols working with artificial DNA by performing numerous batch experiments. Then, we carried out several field tests in different headwater streams in the Netherlands and in Luxembourg. Each laboratory experiment consisted of a batch of water in a vessel, with on the order of 10^10 ssDNA molecules injected into the batch. The total duration of each experiment was 10 hours, and, at regular time intervals, 100 µl samples were collected in a 1.5 ml Eppendorf vial for qPCR analyses. The waters we used ranged from milliQ water to river water with an Electrical Conductivity of around 400 μS/cm. The batch experiments were performed in different vessel types: polyethylene bottles, polypropylene copolymer bottles, and glass bottles. In addition, two filter types were tested: 1 µm pore size glass fibre filters and 0.2 µm pore size cellulose acetate filters. Lastly, stream bed sediment was added to the batch experiments to quantify interaction of the DNA with sediment. For each field experiment, around 10^15 ssDNA molecules were injected, and water samples were collected 100-600 m downstream of the point of injection. Additionally, the field tests were performed with salt and deuterium as tracers. To study possible decay of the synthetic DNA by sunlight and/or microbial activity, we also carried out batch experiments immediately in the field, for the duration of the entire experiment. All samples were stored in 1.5 ml Eppendorf vials in a cool-box on dry ice (-80°C). Quantitative PCR on a Mini Opticon (Bio Rad, Hercules, CA, USA) was carried out to determine DNA concentrations in the samples. Results showed the importance of a strict protocol for working with ssDNA as a tracer for quantitative tracing, since ssDNA interacts with surface areas of glass and plastic, depending on water quality and ionic strength. Interaction with the sediment and decay due to sunlight and/or microbial activity were negligible in most cases. The ssDNA protocol was then tested in natural streams. Promising results were obtained using ssDNA as a quantitative tracer. The breakthrough curves using ssDNA were similar to those of salt or deuterium. We will present the revised protocol to use ssDNA for multi-tracing experiments in natural streams and discuss the opportunities and limitations.
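
    The mass recovery discussed above is typically computed by integrating the breakthrough curve (concentration times discharge) over time and dividing by the injected amount. The sketch below does this directly in ssDNA copy numbers; the breakthrough values, discharge, and injected quantity are hypothetical.

```python
# Sketch: tracer mass (here, copy-number) recovery from a breakthrough curve,
# recovery = integral of C(t)*Q(t) dt / amount injected. All numbers below
# are hypothetical.
import numpy as np

def recovery_fraction(times_s, conc_copies_per_L, discharge_L_per_s, injected_copies):
    flux = np.asarray(conc_copies_per_L, float) * np.asarray(discharge_L_per_s, float)
    recovered = np.trapz(flux, np.asarray(times_s, float))   # copies passing the station
    return recovered / injected_copies

t = np.arange(0, 3600, 300)                                  # 5-min samples over 1 h
c = np.array([0, 2e8, 9e8, 1.6e9, 1.2e9, 6e8, 3e8, 1.5e8, 7e7, 3e7, 1e7, 5e6])
print(recovery_fraction(t, c, discharge_L_per_s=150.0, injected_copies=1e15))
```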

  11. Lakes and reservoirs—Guidelines for study design and sampling

    USGS Publications Warehouse

    ,

    2015-09-29

    The “National Field Manual for the Collection of Water-Quality Data” (NFM) is an online report with separately published chapters that provides the protocols and guidelines by which U.S. Geological Survey personnel obtain the data used to assess the quality of the Nation’s surface-water and groundwater resources. Chapter A10 reviews limnological principles, describes the characteristics that distinguish lakes from reservoirs, and provides guidance for developing temporal and spatial sampling strategies and data-collection approaches to be used in lake and reservoir environmental investigations. Within this chapter are references to other chapters of the NFM that provide more detailed guidelines related to specific topics and more detailed protocols for the quality assurance and assessment of the lake and reservoir data. Protocols and procedures to address and document the quality of lake and reservoir investigations are adapted from, or referenced to, the protocols and standard operating procedures contained in related chapters of this NFM. Before 2017, the U.S. Geological Survey (USGS) “National Field Manual for the Collection of Water-Quality Data” (NFM) chapters were released in the USGS Techniques of Water-Resources Investigations series. Effective in 2018, new and revised NFM chapters are being released in the USGS Techniques and Methods series; this series change does not affect the content and format of the NFM. More information is in the general introduction to the NFM (USGS Techniques and Methods, book 9, chapter A0, 2018) at https://doi.org/10.3133/tm9A0. The authoritative current versions of NFM chapters are available in the USGS Publications Warehouse at https://pubs.er.usgs.gov. Comments, questions, and suggestions related to the NFM can be addressed to nfm-owq@usgs.gov.

  12. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR STANDARD PROTOCOL FOR CLEANING LABORATORY AND FIELD SAMPLING APPARATUS (UA-L-5.1)

    EPA Science Inventory

    The purpose of this SOP is to describe the standard approach used for cleaning glassware and plasticware during the Arizona NHEXAS project and the "Border" study. Keywords: lab; equipment; cleaning.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal interagency...

  13. Cross-Validation of a PACER Prediction Equation for Assessing Aerobic Capacity in Hungarian Youth

    ERIC Educational Resources Information Center

    Saint-Maurice, Pedro F.; Welk, Gregory J.; Finn, Kevin J.; Kaj, Mónika

    2015-01-01

    Purpose: The purpose of this article was to evaluate the validity of the Progressive Aerobic Cardiovascular and Endurance Run (PACER) test in a sample of Hungarian youth. Method: Approximately 500 participants (aged 10-18 years old) were randomly selected across Hungary to complete both laboratory (maximal treadmill protocol) and field assessments…

  14. Tools and technologies needed for conducting planetary field geology while on EVA: Insights from the 2010 Desert RATS geologist crewmembers

    NASA Astrophysics Data System (ADS)

    Young, Kelsey; Hurtado, José M.; Bleacher, Jacob E.; Brent Garry, W.; Bleisath, Scott; Buffington, Jesse; Rice, James W.

    2013-10-01

    The tools used by crews while on extravehicular activity during future missions to other bodies in the Solar System will be a combination of traditional geologic field tools (e.g. hammers, rakes, sample bags) and state-of-the-art technologies (e.g. high definition cameras, digital situational awareness devices, and new geologic tools). In the 2010 Desert Research and Technology Studies (RATS) field test, four crews, each consisting of an astronaut/engineer and field geologist, tested and evaluated various technologies during two weeks of simulated spacewalks in the San Francisco volcanic field, Arizona. These tools consisted of both Apollo-style field geology tools and modern technological equipment not used during the six Apollo lunar landings. The underlying exploration driver for this field test was to establish the protocols and technology needed for an eventual manned mission to an asteroid, the Moon, or Mars. The authors of this paper represent Desert RATS geologist crewmembers as well as two engineers who worked on technology development. Here we present an evaluation and assessment of these tools and technologies based on our first-hand experience of using them during the analog field test. We intend this to serve as a basis for continued development of technologies and protocols used for conducting planetary field geology as the Solar System exploration community moves forward into the next generation of planetary surface exploration.

  15. Preliminary Validation of Direct Detection of Foot-And-Mouth Disease Virus within Clinical Samples Using Reverse Transcription Loop-Mediated Isothermal Amplification Coupled with a Simple Lateral Flow Device for Detection

    PubMed Central

    Waters, Ryan A.; Fowler, Veronica L.; Armson, Bryony; Nelson, Noel; Gloster, John; Paton, David J.; King, Donald P.

    2014-01-01

    Rapid, field-based diagnostic assays are desirable tools for the control of foot-and-mouth disease (FMD). Current approaches involve either 1) detection of FMD virus (FMDV) with immunochromatographic antigen lateral flow devices (LFD), which have relatively low analytical sensitivity, or 2) portable RT-qPCR, which has high analytical sensitivity but is expensive. Loop-mediated isothermal amplification (LAMP) may provide a platform upon which to develop field-based assays without these drawbacks. The objective of this study was to modify an FMDV-specific reverse transcription–LAMP (RT-LAMP) assay to enable detection of dual-labelled LAMP products with an LFD, and to evaluate simple sample processing protocols without nucleic acid extraction. The limit of detection of this assay was demonstrated to be equivalent to that of a laboratory-based real-time RT-qPCR assay and to have a 10,000-fold higher analytical sensitivity than the FMDV-specific antigen LFD currently used in the field. Importantly, this study demonstrated that FMDV RNA could be detected from epithelial suspensions without the need for prior RNA extraction, utilising a rudimentary heat source for amplification. Once optimised, this RT-LAMP-LFD protocol was able to detect multiple serotypes from field epithelial samples, in addition to detecting FMDV in the air surrounding infected cattle, pigs and sheep, including pre-clinical detection. This study describes the development and evaluation of an assay format, which may be used as a future basis for rapid and low-cost detection of FMDV. In addition, it provides "proof of concept" for the future use of LAMP assays to tackle other challenging diagnostic scenarios encompassing veterinary and human health. PMID:25165973

  16. Identification of phlebotomine sand fly blood meals by real-time PCR.

    PubMed

    Sales, Kamila Gaudêncio da Silva; Costa, Pietra Lemos; de Morais, Rayana Carla Silva; Otranto, Domenico; Brandão-Filho, Sinval Pinto; Cavalcanti, Milena de Paiva; Dantas-Torres, Filipe

    2015-04-16

    Phlebotomine sand flies are blood-feeding insects of great medical and veterinary significance acting as vectors of Leishmania parasites. Studying the blood-feeding pattern of these insects may help in the understanding of their interactions with potential reservoir hosts of Leishmania parasites. In this study, we developed real time PCR assays for the identification of sand fly blood meal. Six pairs of primers were designed based on cytochrome b gene sequences available in GenBank of the following potential hosts: dog, cat, horse, chicken, black rat, and human. Firstly, SYBR Green-based real time PCR assays were conducted using a standard curve with eight different concentrations (i.e., 10 ng, 1 ng, 100 pg, 10 pg, 1 pg, 100 fg, 10 fg and 1 fg per 2 μl) of DNA samples extracted from EDTA blood samples from each target animal. Then, DNA samples extracted from field-collected engorged female sand flies belonging to three species (i.e., Lutzomyia longipalpis, L. migonei and L. lenti) were tested by the protocols standardized herein. Additionally, female sand flies were experimentally fed on a black rat (Rattus rattus) and used for evaluating the time course of the detection of the protocol targeting this species. The protocols performed well with detection limits of 10 pg to 100 fg. Field-collected female sand flies were fed on blood from humans (73%), chickens (23%), dogs (22%), horses (15%), black rats (11%) and cats (2%). Interestingly, 76.1% of the L. longipalpis females were positive for human blood. In total, 48% of the tested females were fed on single sources, 31% on two and 12% on three. The analysis of the time course showed that the real time PCR protocol targeting the black rat DNA was able to detect small amounts of the host DNA up to 5 days after the blood meal. The real time PCR assays standardized herein successfully detected small amounts of host DNA in female sand flies fed on different vertebrate species and, specifically for the black rats, up to 5 days after the blood meal. These assays represent promising tools for the identification of blood meal in field-collected female sand flies.
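
    The assays above were standardized against eight-point dilution series. A standard curve is typically a linear fit of quantification cycle (Cq) against log10 input DNA, with amplification efficiency derived from the slope; the sketch below shows that calculation using the stated ten-fold series from 10 ng down to 1 fg, with hypothetical Cq values.

```python
# Sketch: building a qPCR standard curve (Cq versus log10 input DNA) from a
# dilution series and deriving amplification efficiency from its slope,
# E = 10**(-1/slope) - 1. The Cq values below are hypothetical.
import numpy as np

def standard_curve(input_ng, cq):
    log_input = np.log10(np.asarray(input_ng, float))
    slope, intercept = np.polyfit(log_input, np.asarray(cq, float), 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Eight-point ten-fold series from 10 ng down to 1 fg, as in the assay design.
dilutions = [10, 1, 0.1, 1e-2, 1e-3, 1e-4, 1e-5, 1e-6]          # ng per reaction
cq_values = [16.1, 19.5, 22.9, 26.3, 29.7, 33.2, 36.6, 39.9]     # hypothetical
print(standard_curve(dilutions, cq_values))
```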

  17. Continuous-flow centrifugation to collect suspended sediment for chemical analysis

    USGS Publications Warehouse

    Conn, Kathleen E.; Dinicola, Richard S.; Black, Robert W.; Cox, Stephen E.; Sheibley, Richard W.; Foreman, James R.; Senter, Craig A.; Peterson, Norman T.

    2016-12-22

    Recent advances in suspended-sediment monitoring tools and surrogate technologies have greatly improved the ability to quantify suspended-sediment concentrations and to estimate daily, seasonal, and annual suspended-sediment fluxes from rivers to coastal waters. However, little is known about the chemical composition of suspended sediment, and how it may vary spatially between water bodies and temporally within a single system owing to climate, seasonality, land use, and other natural and anthropogenic drivers. Many water-quality contaminants, such as organic and inorganic chemicals, nutrients, and pathogens, preferentially partition in sediment rather than water. Suspended sediment-bound chemical concentrations may be undetected during analysis of unfiltered water samples, owing to small water sample volumes and analytical limitations. Quantification of suspended sediment-bound chemical concentrations is needed to improve estimates of total chemical concentrations, chemical fluxes, and exposure levels of aquatic organisms and humans in receiving environments. Despite these needs, few studies or monitoring programs measure the chemical composition of suspended sediment, largely owing to the difficulty in consistently obtaining samples of sufficient quality and quantity for laboratory analysis. A field protocol is described here utilizing continuous-flow centrifugation for the collection of suspended sediment for chemical analysis. The centrifuge used for development of this method is small, lightweight, and portable for the field applications described in this protocol. Project scoping considerations, deployment of equipment and system layout options, and results from various field and laboratory quality control experiments are described. The testing confirmed the applicability of the protocol for the determination of many inorganic and organic chemicals sorbed on suspended sediment, including metals, pesticides, polycyclic aromatic hydrocarbons, and polychlorinated biphenyls. The particle-size distribution of the captured sediment changes to a more fine-grained sample during centrifugation, and the necessity to account for this change when extrapolating chemical concentrations on the centrifuged sediment sample to the environmental water system is discussed. The data produced using this method will help eliminate a data gap of suspended sediment-bound chemical concentrations, and will support management decisions, such as chemical source-control efforts or in-stream restoration activities. When coupled with streamflow and sediment flux data, it will improve estimates of riverine chemical fluxes, and will aid in assessing the importance and impacts of suspended sediment-bound chemicals to downstream freshwater and coastal marine ecosystems.

  18. In Search of a Dipole Field during the Plio-Pleistocene

    NASA Astrophysics Data System (ADS)

    Asefaw, H. F.; Tauxe, L.; Staudigel, H.; Shaar, R.; Cai, S.; Cromwell, G.; Behar, N.; Koppers, A. A. P.

    2017-12-01

    A geocentric axial dipole (GAD) field accounts for the majority of the modern field and is assumed to be a good first order approximation for the time averaged ancient field. A GAD field predicts a latitudinal dependence of intensity. Given this relationship, the intensity of the field measured at the North and South poles should be twice as strong as the intensity recorded at the equator. The current paleointensity database, archived at both http://earth.liv.ac.uk/pint/ and http://earthref.org/MagIC, shows no such dependency over the last 5 Myr (e.g., Lawrence et al., 2009, doi: 10.1029/2008GC002072; Cromwell et al., 2015, doi: 10.1002/2014JB011828). In order to investigate whether better experimental protocol or data selection approaches could resolve the problem, we: 1) applied a new data selection protocol (CCRIT) which has recovered historical field values with high precision and accuracy (Cromwell et al., 2015), 2) re-sampled the fine grained tops of lava flows in Antarctica (77.9° S) that were previously studied for paleodirections but failed to meet our strict selection criteria, 3) sampled cinder cones in the Golan Heights (33.08° N), and 4) acquired data from lava flows from the HSDP2 drill core in Hawaii (19.71° N). New and published Ar-Ar dates demonstrate that all the samples formed in the last 5 Myr. We conducted IZZI modified Thellier-Thellier experiments and then calculated paleointensities from the samples that passed a set of strict selection criteria. After applying the CCRIT criteria to our data, we find a time-averaged paleointensity of 35.7 ± 6.86 μT in the Golan Heights, 34.5 μT in Hawaii, and 34.22 ± 3.4 μT in Antarctica. New results from Iceland (64° N), published by Cromwell et al. (2015, doi: 10.1002/2014JB011828), also pass the CCRIT criteria and record an average intensity of 33.1 ± 8.3 μT. The average paleointensities from the Golan Heights, Antarctica, Iceland and Hawaii, which span the last 5 Myr and pass the CCRIT criteria, fail to show the variation of intensity with latitude that is expected of an ideal GAD field. The question remains as to why.
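
    The factor-of-two pole-to-equator contrast mentioned above follows from the GAD surface-intensity expression B(lambda) = (mu0*m / (4*pi*r^3)) * sqrt(1 + 3*sin^2(lambda)). The sketch below evaluates it at the study latitudes, assuming a round dipole moment near the present-day value of about 8 × 10^22 A m^2; it is an illustration of the expectation, not a model fit.

```python
# Sketch: surface intensity of a geocentric axial dipole (GAD) field,
# B(lat) = (mu0 * m / (4*pi*r**3)) * sqrt(1 + 3*sin(lat)**2), which is twice
# as strong at the poles as at the equator. The dipole moment below is an
# assumed round value near the present-day ~8e22 A m^2.
import numpy as np

MU0 = 4e-7 * np.pi        # vacuum permeability, T m/A
R_EARTH = 6.371e6         # mean Earth radius, m

def gad_intensity_uT(lat_deg, moment_Am2=8.0e22):
    lat = np.radians(lat_deg)
    b_tesla = MU0 * moment_Am2 / (4 * np.pi * R_EARTH**3) * np.sqrt(1 + 3 * np.sin(lat)**2)
    return b_tesla * 1e6

for site, lat in [("Antarctica", -77.9), ("Iceland", 64.0),
                  ("Golan Heights", 33.08), ("Hawaii", 19.71), ("Equator", 0.0)]:
    print(f"{site:14s} {gad_intensity_uT(lat):5.1f} uT")
```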

  19. The Development and Piloting of a Mobile Data Collection Protocol to Assess Compliance With a National Tobacco Advertising, Promotion, and Product Display Ban at Retail Venues in the Russian Federation

    PubMed Central

    Grant, Ashley S; Spires, Mark H; Cohen, Joanna E

    2016-01-01

    Background Tobacco control policies that lead to a significant reduction in tobacco industry marketing can improve public health by reducing consumption of tobacco and preventing initiation of tobacco use. Laws that ban or restrict advertising and promotion in point-of-sale (POS) environments, in the moment when consumers decide whether or not to purchase a tobacco product, must be correctly implemented to achieve the desired public health benefits. POS policy compliance assessments can support implementation; however, there are challenges to conducting evaluations that are rigorous, cost-effective, and timely. Data collection must be discreet, accurate, and systematic, and ideally collected both before and after policies take effect. The use of mobile phones and other mobile technology provides opportunities to efficiently collect data and support effective tobacco control policies. The Russian Federation (Russia) passed a comprehensive national tobacco control law that included a ban on most forms of tobacco advertising and promotion, effective November 15, 2013. The legislation further prohibited the display of tobacco products at retail trade sites and eliminated kiosks as a legal trade site, effective June 1, 2014. Objective The objective of the study was to develop and test a mobile data collection protocol including: (1) retailer sampling, (2) adaptation of survey instruments for mobile phones, and (3) data management protocols. Methods Two waves of observations were conducted; wave 1 took place during April-May 2014, after the advertising and promotion bans were in effect, and wave 2 took place in August-September 2014, after the product display ban and elimination of tobacco sales in kiosks came into effect. Sampling took place in 5 Russian cities: Moscow, St. Petersburg, Novosibirsk, Yekaterinburg, and Kazan. Lack of access to a comprehensive list of licensed tobacco retailers necessitated a sampling approach that included the development of a walking protocol to identify tobacco retailers to observe. Observation instruments were optimized for use on mobile devices and included the collection of images/photos and the geographic location of retailers. Data were uploaded in real-time to a remote (“cloud-based”) server accessible via the Internet and verified with the use of a data management protocol that included submission of daily field notes from the research team for review by project managers. Results The walking protocol was a practical means of identifying 780 relevant retail venues in Russia, in the absence of reliable sampling resources. Mobile phones were convenient tools for completing observation checklists discreetly and accurately. Daily field notes and meticulous oversight of collected data were critical to ensuring data quality. Conclusions Mobile technology can support timely and accurate data collection and also help monitor data quality through the use of real-time uploads. These protocols can be adapted to assess compliance with other types of public health policies. PMID:27580800

  20. The Development and Piloting of a Mobile Data Collection Protocol to Assess Compliance With a National Tobacco Advertising, Promotion, and Product Display Ban at Retail Venues in the Russian Federation.

    PubMed

    Grant, Ashley S; Kennedy, Ryan D; Spires, Mark H; Cohen, Joanna E

    2016-08-31

    Tobacco control policies that lead to a significant reduction in tobacco industry marketing can improve public health by reducing consumption of tobacco and preventing initiation of tobacco use. Laws that ban or restrict advertising and promotion in point-of-sale (POS) environments, in the moment when consumers decide whether or not to purchase a tobacco product, must be correctly implemented to achieve the desired public health benefits. POS policy compliance assessments can support implementation; however, there are challenges to conducting evaluations that are rigorous, cost-effective, and timely. Data collection must be discreet, accurate, and systematic, and ideally collected both before and after policies take effect. The use of mobile phones and other mobile technology provides opportunities to efficiently collect data and support effective tobacco control policies. The Russian Federation (Russia) passed a comprehensive national tobacco control law that included a ban on most forms of tobacco advertising and promotion, effective November 15, 2013. The legislation further prohibited the display of tobacco products at retail trade sites and eliminated kiosks as a legal trade site, effective June 1, 2014. The objective of the study was to develop and test a mobile data collection protocol including: (1) retailer sampling, (2) adaptation of survey instruments for mobile phones, and (3) data management protocols. Two waves of observations were conducted; wave 1 took place during April-May 2014, after the advertising and promotion bans were in effect, and wave 2 took place in August-September 2014, after the product display ban and elimination of tobacco sales in kiosks came into effect. Sampling took place in 5 Russian cities: Moscow, St. Petersburg, Novosibirsk, Yekaterinburg, and Kazan. Lack of access to a comprehensive list of licensed tobacco retailers necessitated a sampling approach that included the development of a walking protocol to identify tobacco retailers to observe. Observation instruments were optimized for use on mobile devices and included the collection of images/photos and the geographic location of retailers. Data were uploaded in real-time to a remote ("cloud-based") server accessible via the Internet and verified with the use of a data management protocol that included submission of daily field notes from the research team for review by project managers. The walking protocol was a practical means of identifying 780 relevant retail venues in Russia, in the absence of reliable sampling resources. Mobile phones were convenient tools for completing observation checklists discreetly and accurately. Daily field notes and meticulous oversight of collected data were critical to ensuring data quality. Mobile technology can support timely and accurate data collection and also help monitor data quality through the use of real-time uploads. These protocols can be adapted to assess compliance with other types of public health policies.

  1. Microfluidics for rapid cytokeratin immunohistochemical staining in frozen sections.

    PubMed

    Brajkovic, Saska; Dupouy, Diego G; de Leval, Laurence; Gijs, Martin Am

    2017-08-01

    Frozen sections (FS) of tumor samples represent a cornerstone of pathological intraoperative consultation and have an important role in the microscopic analysis of specimens during surgery. So far, immunohistochemical (IHC) stainings on FS have been demonstrated for a few markers using manual methods. Microfluidic technologies have proven to bring substantial improvement in many fields of diagnostics, though only a few microfluidic devices have been designed to improve the performance of IHC assays. In this work, we show optimization of a complete pan-cytokeratin chromogenic immunostaining protocol on FS using a microfluidic tissue processor into a protocol taking <12 min. Our results showed specificity and low levels of background. The dimensions of the microfluidic prototype device are compatible with the space constraints of an intraoperative pathology laboratory. We therefore anticipate that the adoption of microfluidic technologies in the field of surgical pathology can significantly improve the way FSs influence surgical procedures.

  2. Microfluidics for rapid cytokeratin immunohistochemical staining in frozen sections

    PubMed Central

    Brajkovic, Saska; Dupouy, Diego G.; de Leval, Laurence; Gijs, Martin A. M.

    2017-01-01

    Frozen sections (FS) of tumor samples represent a cornerstone of pathological intraoperative consultation and play an important role in the microscopic analysis of specimens during surgery. So far, immunohistochemical (IHC) stainings on FS have been demonstrated for a few markers using manual methods. Microfluidic technologies have proven to bring substantial improvement in many fields of diagnostics, though only a few microfluidic devices have been designed to improve the performance of IHC assays. In this work, we show optimization of a complete pan-cytokeratin chromogenic immunostaining protocol on FS using a microfluidic tissue processor, into a protocol taking less than 12 minutes. Our results showed specificity and low levels of background. The dimensions of the microfluidic prototype device are compatible with the space constraints of an intraoperative pathology laboratory. We therefore anticipate that the adoption of microfluidic technologies in the field of surgical pathology can significantly improve the way FSs influence surgical procedures. PMID:28553936

  3. Monitoring well utility in a heterogeneous DNAPL source zone area: Insights from proximal multilevel sampler wells and sampling capture-zone modelling

    NASA Astrophysics Data System (ADS)

    McMillan, Lindsay A.; Rivett, Michael O.; Wealthall, Gary P.; Zeeb, Peter; Dumble, Peter

    2018-03-01

    Groundwater-quality assessment at contaminated sites often involves the use of short-screen (1.5 to 3 m) monitoring wells. However, even over these intervals considerable variation may occur in contaminant concentrations in groundwater adjacent to the well screen. This is especially true in heterogeneous dense non-aqueous phase liquid (DNAPL) source zones, where cm-scale contamination variability may call into question the effectiveness of monitoring wells to deliver representative data. The utility of monitoring wells in such settings is evaluated by reference to high-resolution multilevel sampler (MLS) wells located proximally to short-screen wells, together with sampling capture-zone modelling to explore controls upon well sample provenance and sensitivity to monitoring protocols. Field data are analysed from the highly instrumented SABRE research site that contained an old trichloroethene source zone within a shallow alluvial aquifer at a UK industrial facility. With increased purging, monitoring-well samples tend to a flow-weighted average concentration but may exhibit sensitivity to the implemented protocol and degree of purging. Formation heterogeneity adjacent to the well screen in particular, alongside pump-intake position and water level, influences this sensitivity. Purging of low volumes is vulnerable to poor reproducibility arising from concentration variability predicted over the initial 1 to 2 screen volumes purged. Marked heterogeneity may also result in limited long-term sample concentration stabilization. Development of bespoke monitoring protocols that consider screen volumes purged, alongside water-quality indicator-parameter stabilization, is recommended to validate and reduce uncertainty when interpreting monitoring-well data within source zone areas. Generalised recommendations on monitoring-well-based protocols are also developed. A key utility of monitoring wells is their proportionately greater sample draw from permeable horizons that constitute a significant contaminant flux pathway and hence a representative fraction of the source mass flux. Acquisition of complementary, high-resolution, site monitoring data, however, vitally underpins optimal interpretation of monitoring-well datasets and appropriate advancement of a site conceptual model and remedial implementation.
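
    The recommendation above to track screen volumes purged alongside indicator-parameter stabilization rests on simple volume bookkeeping. The sketch below computes how many screen volumes have been purged from the screen geometry, pumping rate, and elapsed time; the well dimensions and rate are hypothetical.

```python
# Sketch: tracking how many well-screen volumes have been purged, the quantity
# the authors suggest recording alongside indicator-parameter stabilization.
# Well dimensions and pumping rate below are hypothetical.
import math

def screen_volumes_purged(screen_length_m, well_radius_m, pump_rate_L_per_min, minutes):
    screen_volume_L = math.pi * well_radius_m**2 * screen_length_m * 1000.0  # m^3 -> L
    purged_L = pump_rate_L_per_min * minutes
    return purged_L / screen_volume_L

# 2 m screen in a 50 mm diameter (0.025 m radius) well, purged at 0.5 L/min for 15 min.
print(round(screen_volumes_purged(2.0, 0.025, 0.5, 15), 2))
```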

  4. Comparison of three methods of DNA extraction from peripheral blood mononuclear cells and lung fragments of equines.

    PubMed

    Santos, E M; Paula, J F R; Motta, P M C; Heinemann, M B; Leite, R C; Haddad, J P A; Del Puerto, H L; Reis, J K P

    2010-08-17

    We compared three different protocols for DNA extraction from horse peripheral blood mononuclear cells (PBMC) and lung fragments, determining average final DNA concentration, purity, percentage of PCR amplification using beta-actin, and cost. Thirty-four samples from PBMC and 33 samples from lung fragments were submitted to DNA extraction by three different protocols. Protocol A consisted of a phenol-chloroform and isoamyl alcohol extraction, Protocol B used alkaline extraction with NaOH, and Protocol C used the DNAzol(R) reagent kit. Protocol A was the best option for DNA extraction from lung fragments, producing high DNA concentrations, with high sensitivity in PCR amplification (100%), followed by Protocols C and B. On the other hand, for PBMC samples, Protocol B gave the highest sensitivity in PCR amplification (100%), followed by Protocols C and A. We conclude that Protocol A should be used for PCR diagnosis from lung fragment samples, while Protocol B should be used for PBMC.

  5. Feasibility of Automatic Extraction of Electronic Health Data to Evaluate a Status Epilepticus Clinical Protocol.

    PubMed

    Hafeez, Baria; Paolicchi, Juliann; Pon, Steven; Howell, Joy D; Grinspan, Zachary M

    2016-05-01

    Status epilepticus is a common neurologic emergency in children. Pediatric medical centers often develop protocols to standardize care. Widespread adoption of electronic health records by hospitals affords the opportunity for clinicians to rapidly and electronically evaluate protocol adherence. We reviewed the clinical data of a small sample of 7 children with status epilepticus in order to (1) qualitatively determine the feasibility of automated data extraction and (2) demonstrate a timeline-style visualization of each patient's first 24 hours of care. Qualitatively, our observations indicate that most clinical data are well labeled in structured fields within the electronic health record, though some important information, particularly electroencephalography (EEG) data, may require manual abstraction. We conclude that a visualization that clarifies a patient's clinical course can be automatically created using the patient's electronic clinical data, supplemented with some manually abstracted data. Future work could use this timeline to evaluate adherence to status epilepticus clinical protocols. © The Author(s) 2015.
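
    The timeline-style visualization described above could, for example, be sketched as below once events have been pulled from structured fields; the event list, timestamps, and labels here are entirely hypothetical and only illustrate the plotting step, not the study's extraction pipeline.

    ```python
    from datetime import datetime
    import matplotlib.pyplot as plt

    # Hypothetical extracted events for one patient's first 24 hours of care.
    events = [
        ("2015-01-01 00:10", "seizure onset documented"),
        ("2015-01-01 00:25", "first benzodiazepine dose"),
        ("2015-01-01 01:05", "EEG started (manually abstracted)"),
        ("2015-01-01 04:40", "ICU admission"),
    ]
    times = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t, _ in events]

    fig, ax = plt.subplots(figsize=(8, 2))
    ax.plot(times, [1] * len(times), "o")  # one marker per event on a single track
    for t, (_, label) in zip(times, events):
        ax.annotate(label, (t, 1), xytext=(0, 10), textcoords="offset points", rotation=30)
    ax.set_yticks([])
    ax.set_title("First 24 hours of care (illustrative timeline)")
    plt.tight_layout()
    plt.show()
    ```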

  6. Guidelines and sample protocol for sampling forest gaps.

    Treesearch

    J.R. Runkle

    1992-01-01

    A protocol for sampling forest canopy gaps is presented. Methods used in published gap studies are reviewed. The sample protocol will be useful in developing a broader understanding of forest structure and dynamics through comparative studies across different forest ecosystems.

  7. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR STANDARD PROTOCOL FOR CLEANING LABORATORY AND FIELD SAMPLING APPARATUS (UA-L-5.1)

    EPA Science Inventory

    The purpose of this SOP is to describe the standard approach used for cleaning glassware and plasticware during the Arizona NHEXAS project and the Border study. Keywords: lab; equipment; cleaning.

    The U.S.-Mexico Border Program is sponsored by the Environmental Health Workgroup...

  8. It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two…).

    PubMed

    Rummel, John D; Kminek, Gerhard

    2018-04-01

    The last time NASA envisioned a sample return mission from Mars, the development of a protocol to support the analysis of the samples in a containment facility resulted in a "Draft Test Protocol" that outlined required preparations "for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth" (Rummel et al., 2002 ). This document comprised a specific protocol to be used to conduct a biohazard test for a returned martian sample, following the recommendations of the Space Studies Board of the US National Academy of Sciences. Given the planned launch of a sample-collecting and sample-caching rover (Mars 2020) in 2 years' time, and with a sample return planned for the end of the next decade, it is time to revisit the Draft Test Protocol to develop a sample analysis and biohazard test plan to meet the needs of these future missions. Key Words: Biohazard detection-Mars sample analysis-Sample receiving facility-Protocol-New analytical techniques-Robotic sample handling. Astrobiology 18, 377-380.

  9. Role of conformational sampling in computing mutation-induced changes in protein structure and stability.

    PubMed

    Kellogg, Elizabeth H; Leaver-Fay, Andrew; Baker, David

    2011-03-01

    The prediction of changes in protein stability and structure resulting from single amino acid substitutions is both a fundamental test of macromolecular modeling methodology and an important current problem as high throughput sequencing reveals sequence polymorphisms at an increasing rate. In principle, given the structure of a wild-type protein and a point mutation whose effects are to be predicted, an accurate method should recapitulate both the structural changes and the change in the folding-free energy. Here, we explore the performance of protocols which sample an increasing diversity of conformations. We find that surprisingly similar performances in predicting changes in stability are achieved using protocols that involve very different amounts of conformational sampling, provided that the resolution of the force field is matched to the resolution of the sampling method. Methods involving backbone sampling can in some cases closely recapitulate the structural changes accompanying mutations but not surprisingly tend to do more harm than good in cases where structural changes are negligible. Analysis of the outliers in the stability change calculations suggests areas needing particular improvement; these include the balance between desolvation and the formation of favorable buried polar interactions, and unfolded state modeling. Copyright © 2010 Wiley-Liss, Inc.

  10. IFSA: a microfluidic chip-platform for frit-based immunoassay protocols

    NASA Astrophysics Data System (ADS)

    Hlawatsch, Nadine; Bangert, Michael; Miethe, Peter; Becker, Holger; Gärtner, Claudia

    2013-03-01

    Point-of-care diagnostics (POC) is one of the key application fields for lab-on-a-chip devices. While in recent years much of the work has concentrated on integrating complex molecular diagnostic assays onto a microfluidic device, there is a need to also put comparatively simple immunoassay-type protocols on a microfluidic platform. In this paper, we present the development of a microfluidic cartridge using an immunofiltration approach. In this method, the sandwich immunoassay takes place in a porous frit on which the antibodies have immobilized. The device is designed to be able to handle three samples in parallel and up to four analytical targets per sample. In order to meet the critical cost targets for the diagnostic market, the microfluidic chip has been designed and manufactured using high-volume manufacturing technologies in mind. Validation experiments show comparable sensitivities in comparison with conventional immunofiltration kits.

  11. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    PubMed

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 year) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore the cost, of generating DNA sequences from mushrooms and other fungi compared with traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  12. A client-treatment matching protocol for therapeutic communities: first report.

    PubMed

    Melnick, G; De Leon, G; Thomas, G; Kressel, D

    2001-10-01

    The present study is the first report on a client-treatment matching protocol (CMP) to guide admissions to residential and outpatient substance abuse treatment settings. Two cohorts, a field test sample (n = 318) and a cross-validation sample (n = 407), were drawn from consecutive admissions to nine geographically distributed multisetting therapeutic communities (TCs). A passive matching design was employed. Clients received the CMP on admission, but agencies were "blind" to the CMP treatment recommendation (i.e., match) and assigned clients to treatment by the usual intake procedures. Bivariate and logistic regression analyses show that positive treatment dispositions (treatment completion or longer retention in treatment) were significantly higher among the CMP-matched clients. The present findings provide the empirical basis for studies assessing the validity and utility of the CMP with controlled designs. Though limited to TC-oriented agencies, the present research supports the use of objective matching criteria to improve treatment.
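
    The kind of logistic regression reported above (positive disposition regressed on CMP match status) can be sketched as follows; the simulated data, variable names, and effect size are hypothetical and stand in for the study's actual cohorts.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    # Hypothetical cohort: 'matched' = client placed in the CMP-recommended setting.
    df = pd.DataFrame({"matched": rng.integers(0, 2, size=400)})
    # Simulate a higher completion probability for matched clients (illustration only).
    df["completed"] = rng.binomial(1, np.where(df["matched"] == 1, 0.55, 0.40))

    model = sm.Logit(df["completed"], sm.add_constant(df[["matched"]])).fit(disp=False)
    print(model.summary())
    print("odds ratio for matching:", float(np.exp(model.params["matched"])))
    ```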

  13. Lethal cardiac amyloidosis: Modification of the Congo Red technique on a forensic case.

    PubMed

    Rancati, A; Andreola, S; Bailo, P; Boracchi, M; Fociani, P; Gentile, G; Zoja, R

    2018-05-26

    Congo Red staining is commonly used in diagnosing amyloidosis, a pathology characterized by the deposition of abnormal proteins in several human organs. When applied to samples fixed in formalin and embedded in paraffin, this staining can suffer from several artefacts, causing diagnostic and interpretative difficulties due to weak stainability and consequently reduced visibility of the amyloid. Over time, these complications have prompted several variations of the staining technique, especially in clinical practice, whereas in the forensic field no protocol has ever been adapted to cadaveric samples, a material already burdened by characteristically poor stainability. In our work, studying a sudden death caused by cardiac amyloidosis and diagnosed only by post-mortem examination, we present a modified Congo Red staining used to demonstrate amyloid in cadaveric material after all standard protocols had been applied unsuccessfully. Copyright © 2018. Published by Elsevier B.V.

  14. Residual stresses investigations in composite samples by speckle interferometry and specimen repositioning

    NASA Astrophysics Data System (ADS)

    Baldi, Alfonso; Jacquot, Pierre

    2003-05-01

    Graphite-epoxy laminates are subjected to the "incremental hole-drilling" technique in order to investigate the residual stresses acting within each layer of the composite samples. In-plane speckle interferometry is used to measure the displacement field created by each drilling increment around the hole. Our approach features two particularities: (1) we rely on the precise repositioning of the samples in the optical set-up after each new boring step, performed by means of a high-precision, numerically controlled milling machine in the workshop; (2) for each increment, we acquire three displacement fields, along the length, the width of the samples, and at 45°, using a single symmetrical double-beam illumination and a rotary stage holding the specimens. The experimental protocol is described in detail and the experimental results are presented, including a comparison with strain gages. Speckle interferometry appears to be a suitable method to respond to the increasing demand for residual stress determination in composite samples.

  15. Favorable Geochemistry from Springs and Wells in Colorado

    DOE Data Explorer

    Richard E. Zehner

    2012-02-01

    This layer contains favorable geochemistry for high-temperature geothermal systems, as interpreted by Richard "Rick" Zehner. The data were compiled from data obtained from the USGS. The original data set combines 15,622 samples collected in the State of Colorado from several sources, including 1) the original Geotherm geochemical database, 2) USGS NWIS (National Water Information System), 3) Colorado Geological Survey geothermal sample data, and 4) original samples collected by R. Zehner at various sites during the 2011 field season. These samples are also available in a separate shapefile, FlintWaterSamples.shp. Data from all samples were reportedly collected using standard water sampling protocols (filtering through a 0.45 micron filter, etc.). Sample information was standardized to ppm (micrograms/liter) in spreadsheet columns. Commonly used cation and silica geothermometer temperature estimates are included.
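
    For context on the silica geothermometer estimates mentioned above, one commonly used form is the Fournier (1977) quartz (no steam loss) equation; the sketch below applies it to a hypothetical dissolved-silica concentration and is not tied to the specific estimates included in this dataset.

    ```python
    import math

    def quartz_geothermometer_celsius(sio2_ppm: float) -> float:
        """Fournier (1977) quartz (no steam loss): T(°C) = 1309 / (5.19 - log10(SiO2)) - 273.15."""
        return 1309.0 / (5.19 - math.log10(sio2_ppm)) - 273.15

    # Hypothetical spring water with 90 ppm dissolved silica.
    print(f"estimated reservoir temperature: {quartz_geothermometer_celsius(90.0):.0f} °C")
    ```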

  16. Automated paleomagnetic and rock magnetic data acquisition with an in-line horizontal "2G" system

    NASA Astrophysics Data System (ADS)

    Mullender, Tom A. T.; Frederichs, Thomas; Hilgenfeldt, Christian; de Groot, Lennart V.; Fabian, Karl; Dekkers, Mark J.

    2016-09-01

    Today's paleomagnetic and magnetic proxy studies involve processing of large sample collections while simultaneously demanding high-quality data and high reproducibility. Here we describe a fully automated interface based on a commercial horizontal pass-through "2G" DC-SQUID magnetometer. This system has been operational at the universities of Bremen (Germany) and Utrecht (Netherlands) since 1998 and 2006, respectively, while a system is currently being built at NGU Trondheim (Norway). The magnetometers are equipped with "in-line" alternating field (AF) demagnetization, a direct-current bias field coil along the coaxial AF demagnetization coil for the acquisition of anhysteretic remanent magnetization (ARM), and a long pulse-field coil for the acquisition of isothermal remanent magnetization (IRM). Samples are contained in dedicated low-magnetization perspex holders that are manipulated by a pneumatic pick-and-place unit. If desired, samples can be measured in several positions, considerably enhancing data quality, in particular for magnetically weak samples. In the Bremen system, the peak of the IRM pulse fields is actively measured, which reduces the discrepancy between the set field and the field that is actually applied. Techniques for quantifying and removing gyroremanent overprints and for measuring the viscosity of IRM further extend the range of applications of the system. Typically, c. 300 paleomagnetic samples can be AF demagnetized per week (15 levels) in the three-position protocol. The versatility of the system is illustrated by several examples of paleomagnetic and rock magnetic data processing.

  17. Comprehensive Non-Destructive Conservation Documentation of Lunar Samples Using High-Resolution Image-Based 3D Reconstructions and X-Ray CT Data

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2015-01-01

    Established contemporary conservation methods within the fields of Natural and Cultural Heritage encourage an interdisciplinary approach to preservation of heritage material (both tangible and intangible) that holds "Outstanding Universal Value" for our global community. NASA's lunar samples were acquired from the moon for the primary purpose of intensive scientific investigation. These samples, however, also carry cultural significance, as evidenced by the millions of people per year that visit lunar displays in museums and heritage centers around the world. Being both scientifically and culturally significant, the lunar samples require a unique conservation approach. Government mandate dictates that NASA's Astromaterials Acquisition and Curation Office develop and maintain protocols for "documentation, preservation, preparation and distribution of samples for research, education and public outreach" for both current and future collections of astromaterials. Documentation, considered the first stage within the conservation methodology, has evolved many new techniques since curation protocols for the lunar samples were first implemented, and the development of new documentation strategies for current and future astromaterials is beneficial to keeping curation protocols up to date. We have developed and tested a comprehensive non-destructive documentation technique using high-resolution image-based 3D reconstruction and X-ray CT (XCT) data in order to create interactive 3D models of lunar samples that would ultimately be served to both researchers and the public. These data enhance preliminary scientific investigations, including targeted sample requests, and also provide a new visual platform for the public to experience and interact with the lunar samples. We intend to serve these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/. Providing 3D interior and exterior documentation of astromaterial samples addresses the increasing demands for accessibility to data and contemporary techniques for documentation, which can be realized for both current collections as well as future sample return missions.

  18. Determination of the magnetocaloric entropy change by field sweep using a heat flux setup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monteiro, J. C. B., E-mail: jolmiui@gmail.com; Reis, R. D. dos; Mansanares, A. M.

    2014-08-18

    We report on a simple setup using a heat flux sensor adapted to a Quantum Design Physical Property Measurement System to determine the magnetocaloric entropy change (ΔS). The major differences from existing setups are the simplicity of this assembly and the ease of obtaining the isothermal entropy change by either a field sweep or a temperature sweep process. We discuss the use of these two processes applied to Gd and Gd₅Ge₂Si₂ samples. The results are compared to the temperature sweep measurements and they show the advantages of this setup and of the field sweep procedure. We found a significant reduction of ΔS and of the refrigerating cooling power (RCP) at low field changes in a field sweep process when the sample is not driven to the same initial state for each temperature. We show that the field sweep process without any measuring protocol is the only correct way to experimentally determine ΔS and RCP for a practical regenerative refrigerator.
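
    For reference, the quantity such setups target is usually written via the Maxwell relation, while a heat-flux sensor approximates the isothermal form directly; the expressions below are a generic sketch (sign and unit conventions vary between papers) rather than the authors' exact formulation.

    ```latex
    % Maxwell-relation form (from magnetization data) and the isothermal heat-flux form
    % that a sensor-based field sweep approximates; Q_iso is the heat exchanged with the
    % bath at temperature T during the sweep.
    \Delta S_M(T, 0 \to H_{\max}) \;=\; \mu_0 \int_{0}^{H_{\max}}
      \left(\frac{\partial M}{\partial T}\right)_{\!H} dH,
    \qquad
    \Delta S_M(T) \;\approx\; \frac{Q_{\mathrm{iso}}(T)}{T}
      \;=\; \frac{1}{T}\int \dot{q}(t)\, dt .
    ```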

  19. Mars Sample Handling Protocol Workshop Series: Workshop 4

    NASA Technical Reports Server (NTRS)

    Race, Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)

    2001-01-01

    In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Eight sub-groups were then formed to discuss the following assigned topics: (1) review and assess the Draft Protocol for physical/chemical testing; (2) review and assess the Draft Protocol for life detection testing; (3) review and assess the Draft Protocol for biohazard testing; (4) environmental and health/monitoring and safety issues; (5) requirements of the Draft Protocol for facilities and equipment; (6) contingency planning for different outcomes of the Draft Protocol; (7) personnel management considerations in implementation of the Draft Protocol; and (8) Draft Protocol implementation process and update concepts. This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs. This Draft Protocol, which was compiled from deliberations and recommendations from earlier Workshops in the Series, represents a consensus that emerged from the discussions of all the sub-groups assembled over the course of the five Workshops of the Series. These discussions converged on a conceptual approach to sample handling, as well as on specific analytical requirements. Discussions also identified important issues requiring attention, as well as research and development needed for protocol implementation.

  20. Low-field and high-field magnetic resonance contrast imaging of magnetoferritin as a pathological model system of iron accumulation

    NASA Astrophysics Data System (ADS)

    Strbak, Oliver; Balejcikova, Lucia; Baciak, Ladislav; Kovac, Jozef; Masarova-Kozelova, Marta; Krafcik, Andrej; Dobrota, Dusan; Kopcansky, Peter

    2017-09-01

    Various pathological processes, including neurodegenerative disorders, are associated with the accumulation of iron, and ferritin is believed to be a precursor of iron accumulation. Physiological ferritin has low relaxivity, which results in only weak detection by magnetic resonance imaging (MRI) techniques. On the other hand, pathological ferritin is associated with disrupted iron homeostasis and structural changes in the mineral core, and should increase the hypointensive artefacts in MRI. On the basis of recent findings regarding the pathological ferritin structure, we prepared magnetoferritin particles as a possible pathological ferritin model system. The particles were characterised with dynamic light scattering, as well as with superconducting quantum interference device measurements. With the help of low-field (0.2 T) and high-field (4.7 T) MRI standard T2-weighted protocols we found that it is possible to clearly distinguish between native ferritin as a physiological model system and magnetoferritin as a pathological model system. Surprisingly, the T2-weighted short TI inversion recovery protocol at the low-field system showed the optimum contrast differentiation. Such findings are highly promising for exploiting iron accumulation as a noninvasive diagnostic tool for pathological processes, where the magnetoferritin particles could be utilised as MRI iron quantification calibration samples.

  1. Evaluation of storage and filtration protocols for alpine/subalpine lake water quality samples

    Treesearch

    John L. Korfmacher; Robert C. Musselman

    2007-01-01

    Many government agencies and other organizations sample natural alpine and subalpine surface waters using varying protocols for sample storage and filtration. Simplification of protocols would be beneficial if it could be shown that sample quality is unaffected. In this study, samples collected from low ionic strength waters in alpine and subalpine lake inlets...

  2. 7 CFR 301.92-11 - Inspection and sampling protocols.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 5 2010-01-01 2010-01-01 false Inspection and sampling protocols. 301.92-11 Section... Inspection and sampling protocols. Type(s) of plants in the nursery Type(s) of plants shipped interstate... interstate. (1) Annual inspection, sampling, and testing—(i) Inspection. The nursery must be inspected...

  3. 7 CFR 301.92-11 - Inspection and sampling protocols.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 5 2011-01-01 2011-01-01 false Inspection and sampling protocols. 301.92-11 Section... Inspection and sampling protocols. Type(s) of plants in the nursery Type(s) of plants shipped interstate... interstate. (1) Annual inspection, sampling, and testing—(i) Inspection. The nursery must be inspected...

  4. Identification of Phosphorylated Proteins on a Global Scale.

    PubMed

    Iliuk, Anton

    2018-05-31

    Liquid chromatography (LC) coupled with tandem mass spectrometry (MS/MS) has enabled researchers to analyze complex biological samples with unprecedented depth. It facilitates the identification and quantification of modifications within thousands of proteins in a single large-scale proteomic experiment. Analysis of phosphorylation, one of the most common and important post-translational modifications, has particularly benefited from such progress in the field. Here, detailed protocols are provided for a few well-regarded, common sample preparation methods for an effective phosphoproteomic experiment. © 2018 by John Wiley & Sons, Inc. Copyright © 2018 John Wiley & Sons, Inc.

  5. A technique for measuring petal gloss, with examples from the Namaqualand flora.

    PubMed

    Whitney, Heather M; Rands, Sean A; Elton, Nick J; Ellis, Allan G

    2012-01-01

    The degree of floral gloss varies between species. However, little is known about this distinctive floral trait, even though it could be a key feature of floral biotic and abiotic interactions. One reason for the absence of knowledge is the lack of a simple, repeatable method of gloss measurement that can be used in the field to study floral gloss. A protocol is described for measuring gloss in petal samples collected in the field, using a glossmeter. Repeatability of the technique is assessed. We demonstrate a simple yet highly accurate and repeatable method that can easily be implemented in the field. We also highlight the huge variety of glossiness found within flowers and between species in a sample of spring-blooming flowers collected in Namaqualand, South Africa. We discuss the potential uses of this method and its applications for furthering studies in plant-pollinator interactions. We also discuss the potential functions of gloss in flowers.

  6. Direct and long-term detection of gene doping in conventional blood samples.

    PubMed

    Beiter, T; Zimmermann, M; Fragasso, A; Hudemann, J; Niess, A M; Bitzer, M; Lauer, U M; Simon, P

    2011-03-01

    The misuse of somatic gene therapy for the purpose of enhancing athletic performance is perceived as a coming threat to the world of sports and categorized as 'gene doping'. This article describes a direct detection approach for gene doping that gives a clear yes-or-no answer based on the presence or absence of transgenic DNA in peripheral blood samples. By exploiting a priming strategy to specifically amplify intronless DNA sequences, we developed PCR protocols allowing the detection of very small amounts of transgenic DNA in genomic DNA samples to screen for six prime candidate genes. Our detection strategy was verified in a mouse model, giving positive signals from minute amounts (20 μl) of blood samples for up to 56 days following intramuscular adeno-associated virus-mediated gene transfer, one of the most likely candidate vector systems to be misused for gene doping. To make our detection strategy amenable for routine testing, we implemented a robust sample preparation and processing protocol that allows cost-efficient analysis of small human blood volumes (200 μl) with high specificity and reproducibility. The practicability and reliability of our detection strategy was validated by a screening approach including 327 blood samples taken from professional and recreational athletes under field conditions.

  7. Study of microtip-based extraction and purification of DNA from human samples for portable devices

    NASA Astrophysics Data System (ADS)

    Fotouhi, Gareth

    DNA sample preparation is essential for genetic analysis. However, rapid and easy-to-use methods are a major challenge to obtaining genetic information. Furthermore, DNA sample preparation technology must follow the growing need for point-of-care (POC) diagnostics. The current use of centrifuges, large robots, and laboratory-intensive protocols has to be minimized to meet the global challenge of limited access healthcare by bringing the lab to patients through POC devices. To address these challenges, a novel method for extracting genomic DNA from human samples is presented, using heat-cured polyethyleneimine-coated microtips generating a high electric field. The microtip extraction method is based on recent work using an electric field and capillary action integrated into an automated device. The main challenges to the method are: (1) to obtain a stable microtip surface for the controlled capture and release of DNA and (2) to improve the recovery of DNA from samples with a high concentration of inhibitors, such as human samples. The present study addresses these challenges by investigating the heat curing of polyethyleneimine (PEI) coated on the surface of the microtip. Heat-cured PEI-coated microtips are shown to control the capture and release of DNA. Protocols are developed for the extraction and purification of DNA from human samples. Heat-cured PEI-coated microtip methods of DNA sample preparation are used to extract genomic DNA from human samples. It is discovered through experiment that heat curing of a PEI layer on a gold-coated surface below 150°C could inhibit the signal of polymerase chain reaction (PCR). Below 150°C, the PEI layer is not completely cured and dissolves off the gold-coated surface. Dissolved PEI binds with DNA to inhibit PCR. Heat curing of a PEI layer above 150°C on a gold-coated surface prevents inhibition of PCR and gel electrophoresis. In comparison to gold-coated microtips, the 225°C-cured PEI-coated microtips improve the recovery of DNA to 45% efficiency. Furthermore, the 225°C-cured PEI-coated microtips recover more DNA than gold-coated microtips when the surface is washed. Heat-cured (225°C) PEI-coated microtips are used for the recovery of human genomic DNA from whole blood. A washing protocol is developed to remove inhibiting particles bound to the PEI-coated microtip surface after DNA extraction. From 1.25 µL of whole blood, an average of 1.83 ng of human genomic DNA is captured, purified, and released using a 225°C-cured PEI-coated microtip in less than 30 minutes. The extracted DNA is profiled by short tandem repeat (STR) analysis. For forensic and medical applications, genomic DNA is extracted from dried samples using heat-cured PEI-coated microtips that are integrated into an automated device. DNA extraction from dried samples is critical for forensics. The use of dried samples in the medical field is increasing because dried samples are convenient for storage, biosafety, and reduced contamination risk. The main challenge is the time required to properly extract DNA in a purified form. Typically, a 1 hour incubation period is required to complete this process. Overnight incubation is sometimes necessary. To address this challenge, a pre-extraction washing step is investigated to remove inhibiting particles from dried blood spots (DBS) before DNA is released from dried form into solution for microtip extraction.
The developed protocol is expanded to extract DNA from a variety of dried samples including nasal swabs, buccal swabs, and other forensic samples. In comparison to a commercial kit, the microtip-based extraction reduced the processing time from 1.5 hours to 30 minutes or less with an equivalent concentration of extracted DNA from dried blood spots. The developed assay will benefit genetic studies on newborn screening, forensic investigation, and POC diagnostics.

  8. The United States Department Of Agriculture Northeast Area-wide Tick Control Project: history and protocol.

    PubMed

    Pound, Joe Mathews; Miller, John Allen; George, John E; Fish, Durland

    2009-08-01

    The Northeast Area-wide Tick Control Project (NEATCP) was funded by the United States Department of Agriculture (USDA) as a large-scale cooperative demonstration project of the USDA-Agricultural Research Service (ARS)-patented 4-Poster tick control technology (Pound et al. 1994), involving the USDA-ARS and a consortium of universities, state agencies, and a consulting firm at research locations in the five states of Connecticut (CT), Maryland (MD), New Jersey (NJ), New York (NY), and Rhode Island (RI). The stated objective of the project was "A community-based field trial of ARS-patented tick control technology designed to reduce the risk of Lyme disease in northeastern states." Here we relate the rationale and history of the technology, a chronological listing of events leading to implementation of the project, the original protocol for selecting treatment and control sites, and protocols for deployment of treatments, sampling, assays, data analyses, and estimates of efficacy.

  9. A Draft Test Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth

    NASA Technical Reports Server (NTRS)

    Rummel, John D. (Editor); Race, Margaret S.; DeVincenzi, Donald L.; Schad, P. Jackson; Stabekis, Pericles D.; Viso, Michel; Acevedo, Sara E.

    2002-01-01

    This document presents the first complete draft of a protocol for detecting possible biohazards in Mars samples returned to Earth; it is the final product of the Mars Sample Handling Protocol Workshop Series, convened in 2000-2001 by NASA's Planetary Protection Officer. The goal of the five-workshop Series was to develop a comprehensive protocol by which returned martian sample materials could be assessed for the presence of any biological hazard(s) while safeguarding the purity of the samples from possible terrestrial contamination.

  10. Optimization and evaluation of single-cell whole-genome multiple displacement amplification.

    PubMed

    Spits, C; Le Caignec, C; De Rycke, M; Van Haute, L; Van Steirteghem, A; Liebaers, I; Sermon, K

    2006-05-01

    The scarcity of genomic DNA can be a limiting factor in some fields of genetic research. One of the methods developed to overcome this difficulty is whole genome amplification (WGA). Recently, multiple displacement amplification (MDA) has proved very efficient in the WGA of small DNA samples and pools of cells, the reaction being catalyzed by the phi29 or the Bst DNA polymerases. The aim of the present study was to develop a reliable, efficient, and fast protocol for MDA at the single-cell level. We first compared the efficiency of phi29 and Bst polymerases on DNA samples and single cells. The phi29 polymerase accurately generated, in a short time and from a single cell, sufficient DNA for a large set of tests, whereas the Bst enzyme showed a low efficiency and a high error rate. A single-cell protocol was optimized using the phi29 polymerase and was evaluated on 60 single cells; the DNA obtained was assessed by 22 locus-specific PCRs. This new protocol can be useful for many applications involving minute quantities of starting material, such as forensic DNA analysis, prenatal and preimplantation genetic diagnosis, or cancer research. (c) 2006 Wiley-Liss, Inc.

  11. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    NASA Astrophysics Data System (ADS)

    Lahiri, B. B.; Ranoo, Surojit; Philip, John

    2017-11-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology in which the alternating magnetic field induced heating of a magnetic fluid is utilized for ablating the cancerous cells or making them more susceptible to conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), which is defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature rise curves obtained under non-adiabatic experimental conditions, which is prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample, and spatial variation in the temperature profile within the sample. Using first-order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, which is found to be in good agreement with that obtained from the computationally intensive slope-corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiative heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area-to-volume ratio and the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shielding. The delayed heating is found to contribute up to ~25% uncertainty in SAR values. As the SAR values are very sensitive to the initial slope determination method, explicit mention of the range used for linear regression analysis is appropriate for reproducing the results. The effect of the sample volume-to-area ratio on the linear heat loss rate is systematically studied and the results are compared using a lumped-system thermal model. The various uncertainties involved in SAR estimation are categorized as material uncertainties, thermodynamic uncertainties, and parametric uncertainties. The adiabatic reconstruction is found to decrease the uncertainties in SAR measurement by approximately a factor of three. Additionally, a set of experimental guidelines for accurate SAR estimation using the adiabatic reconstruction protocol is recommended. These results warrant a universal experimental and data analysis protocol for SAR measurements during field-induced heating of magnetic fluids under non-adiabatic conditions.
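
    The initial-slope estimate and the lumped-system (linear heat loss) correction referred to above are commonly written as below; the notation is a hedged sketch and not the paper's exact expressions.

    ```latex
    % Initial-slope SAR from a non-adiabatic heating curve, and the lumped-system
    % heating/loss balance with loss time constant tau; C_p is the specific heat of the
    % suspension, m_s the sample mass, m_np the magnetic nanoparticle mass, P the heating power.
    \mathrm{SAR} \;=\; \frac{C_p\, m_s}{m_{\mathrm{np}}} \left.\frac{dT}{dt}\right|_{t \to 0},
    \qquad
    \frac{dT}{dt} \;=\; \frac{P}{C_p\, m_s} \;-\; \frac{T - T_{\mathrm{amb}}}{\tau}.
    ```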

  12. 77 FR 5492 - Magnuson-Stevens Act Provisions; General Provisions for Domestic Fisheries; Application for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... assess the performance of an approved sampling protocol and to allow for continued sample collection and... developmental sampling protocol. While this application was being reviewed and was available for public comment, the sampling protocol being tested was adopted into the National Shellfish Sanitation Program by the...

  13. Two-party quantum key agreement protocols under collective noise channel

    NASA Astrophysics Data System (ADS)

    Gao, Hao; Chen, Xiao-Guang; Qian, Song-Rong

    2018-06-01

    Recently, quantum communication has become a very popular research field. Quantum key agreement (QKA) plays an important role in quantum communication, based on its theoretically unconditional security. Among all kinds of QKA protocols, those resisting collective noise are widely studied. In this paper, we propose improved two-party QKA protocols resisting collective noise and present a feasible plan for information reconciliation. Our protocols achieve a qubit efficiency of 26.67%, the best among all two-party QKA protocols against collective noise, showing that they can improve the transmission efficiency of quantum key agreement.

  14. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method-comparison analytes that were determined by two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds that were measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
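
    The spiked-recovery comparison described above is conventionally computed as a percent recovery; the notation below is assumed rather than taken from the study.

    ```latex
    % Percent recovery of a spiked analyte in reagent, source, or treated water.
    \%R \;=\; 100 \times
      \frac{C_{\mathrm{spiked\ sample}} - C_{\mathrm{unspiked\ sample}}}{C_{\mathrm{added}}}
    ```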

  15. Use of Chironomidae (Diptera) Surface-Floating Pupal Exuviae as a Rapid Bioassessment Protocol for Water Bodies

    PubMed Central

    Kranzfelder, Petra; Anderson, Alyssa M.; Egan, Alexander T.; Mazack, Jane E.; Bouchard, R. William; Rufer, Moriya M.; Ferrington, Leonard C.

    2015-01-01

    Rapid bioassessment protocols using benthic macroinvertebrate assemblages have been successfully used to assess human impacts on water quality. Unfortunately, traditional benthic larval sampling methods, such as the dip-net, can be time-consuming and expensive. An alternative protocol involves collection of Chironomidae surface-floating pupal exuviae (SFPE). Chironomidae is a species-rich family of flies (Diptera) whose immature stages typically occur in aquatic habitats. Adult chironomids emerge from the water, leaving their pupal skins, or exuviae, floating on the water’s surface. Exuviae often accumulate along banks or behind obstructions by action of the wind or water current, where they can be collected to assess chironomid diversity and richness. Chironomids can be used as important biological indicators, since some species are more tolerant to pollution than others. Therefore, the relative abundance and species composition of collected SFPE reflect changes in water quality. Here, methods associated with field collection, laboratory processing, slide mounting, and identification of chironomid SFPE are described in detail. Advantages of the SFPE method include minimal disturbance at a sampling area, efficient and economical sample collection and laboratory processing, ease of identification, applicability in nearly all aquatic environments, and a potentially more sensitive measure of ecosystem stress. Limitations include the inability to determine larval microhabitat use and inability to identify pupal exuviae to species if they have not been associated with adult males. PMID:26274889

  16. Use of Chironomidae (Diptera) Surface-Floating Pupal Exuviae as a Rapid Bioassessment Protocol for Water Bodies.

    PubMed

    Kranzfelder, Petra; Anderson, Alyssa M; Egan, Alexander T; Mazack, Jane E; Bouchard, R William; Rufer, Moriya M; Ferrington, Leonard C

    2015-07-24

    Rapid bioassessment protocols using benthic macroinvertebrate assemblages have been successfully used to assess human impacts on water quality. Unfortunately, traditional benthic larval sampling methods, such as the dip-net, can be time-consuming and expensive. An alternative protocol involves collection of Chironomidae surface-floating pupal exuviae (SFPE). Chironomidae is a species-rich family of flies (Diptera) whose immature stages typically occur in aquatic habitats. Adult chironomids emerge from the water, leaving their pupal skins, or exuviae, floating on the water's surface. Exuviae often accumulate along banks or behind obstructions by action of the wind or water current, where they can be collected to assess chironomid diversity and richness. Chironomids can be used as important biological indicators, since some species are more tolerant to pollution than others. Therefore, the relative abundance and species composition of collected SFPE reflect changes in water quality. Here, methods associated with field collection, laboratory processing, slide mounting, and identification of chironomid SFPE are described in detail. Advantages of the SFPE method include minimal disturbance at a sampling area, efficient and economical sample collection and laboratory processing, ease of identification, applicability in nearly all aquatic environments, and a potentially more sensitive measure of ecosystem stress. Limitations include the inability to determine larval microhabitat use and inability to identify pupal exuviae to species if they have not been associated with adult males.

  17. White HDPE bottles as source of serious contamination of water samples with Ba and Zn.

    PubMed

    Reimann, Clemens; Grimstvedt, Andreas; Frengstad, Bjørn; Finne, Tor Erik

    2007-03-15

    During a recent study of surface water quality, factory-new white high-density polyethylene (HDPE) bottles were used for collecting the water samples. According to the established field protocol of the Geological Survey of Norway, the bottles were twice carefully rinsed with water in the field prior to sampling. Several blank samples using milli-Q (ELGA) water (>18.2 MΩ) were also prepared. On checking the analytical results, the blanks returned measurable values of Ag, Ba, Sr, V, Zn and Zr. For Ba and Zn the values (c. 300 µg/l and 95 µg/l) were about 10 times above the concentrations that can be expected in natural waters. A laboratory test of the bottles demonstrated that the bottles contaminate the samples with significant amounts of Ba and Zn and some Sr. Simple acid washing of the bottles prior to use did not solve the contamination problem for Ba and Zn. The results suggest that there may exist "clean" and "dirty" HDPE bottles, depending on the manufacturer or production process. When collecting water samples it is mandatory to check bottles regularly as a possible source of contamination.

  18. Recommended protocols for sampling macrofungi

    Treesearch

    Gregory M. Mueller; John Paul Schmit; Sabine M. Hubndorf Leif Ryvarden; Thomas E. O' Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz

    2004-01-01

    This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.

  19. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  20. Temperature management during semen processing: Impact on boar sperm quality under laboratory and field conditions.

    PubMed

    Schulze, M; Henning, H; Rüdiger, K; Wallner, U; Waberski, D

    2013-12-01

    Freshly collected boar spermatozoa are sensitive to a fast reduction in temperature because of lipid phase transition and phase separation processes. Temperature management during semen processing may determine the quality of stored samples. The aim of this study was to evaluate the influence of isothermic and hypothermic semen processing protocols on boar sperm quality under laboratory and field conditions. In the laboratory study, ejaculates (n = 12) were first diluted (1:1) with Beltsville Thawing Solution (BTS) at 32 °C, then processed either with isothermic (32 °C) or hypothermic (21 °C) BTS, stored at 17 °C, and assessed on days 1, 3, and 6. Temperature curves showed that 150 minutes after the first dilution, semen doses of both groups reached the same temperature. Two-step hypothermic processing resulted in lower sperm motility on days 1 and 6 (P < 0.05). Concomitantly, hypothermally processed samples contained less membrane intact sperm on days 3 and 6 (P < 0.05). Using AndroStar Plus extender instead of BTS reduced the negative effect of hypothermic processing. In the field study, 15 semen samples from each of 23 European artificial insemination studs were evaluated as part of an external quality control program. Semen quality based on motility, membrane integrity, mitochondrial activity, and a thermoresistance test was higher for stations using one-step isothermic dilutions (n = 7) compared with artificial insemination centers using two-step hypothermic protocols (n = 16). Both studies show that chilling injury associated with hypothermic dilution results in lower quality of stored boar semen compared with isothermic dilution and that the type of semen extender affects the outcomes. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Using Wireless Power Meters to Measure Energy Use of Miscellaneous and Electronic Devices in Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    UC Berkeley, Berkeley, CA USA; Brown, Richard; Lanzisera, Steven

    2011-05-24

    Miscellaneous and electronic devices consume about one-third of the primary energy used in U.S. buildings, and their energy use is increasing faster than other end-uses. Despite the success of policies, such as Energy Star, that promote more efficient miscellaneous and electronic products, much remains to be done to address the energy use of these devices if we are to achieve our energy and carbon reduction goals. Developing efficiency strategies for these products depends on better data about their actual usage, but very few studies have collected field data on the long-term energy used by a large sample of devices due to the difficulty and expense of collecting device-level energy data. This paper describes the development of an improved method for collecting device-level energy and power data using small, relatively inexpensive wireless power meters. These meters form a mesh network based on Internet standard protocols and can form networks of hundreds of metering points in a single building. Because the meters are relatively inexpensive and do not require manual data downloading, they can be left in the field for months or years to collect long time-series energy use data. In addition to the metering technology, we also describe a field protocol used to collect comprehensive, robust data on the miscellaneous and electronic devices in a building. The paper presents sample results from several case study buildings, in which all the plug-in devices for several homes were metered, and a representative sample of several hundred plug-in devices in a commercial office building were metered for several months.

  2. Integrating occupancy modeling and interview data for corridor identification: A case study for jaguars in Nicaragua

    USGS Publications Warehouse

    Zeller, K.A.; Nijhawan, S.; Salom-Perez, R.; Potosme, S.H.; Hines, J.E.

    2011-01-01

    Corridors are critical elements in the long-term conservation of wide-ranging species like the jaguar (Panthera onca). Jaguar corridors across the range of the species were initially identified using a GIS-based least-cost corridor model. However, due to inherent errors in remotely sensed data and model uncertainties, these corridors warrant field verification before conservation efforts can begin. We developed a novel corridor assessment protocol based on interview data and site occupancy modeling. We divided our pilot study area, in southeastern Nicaragua, into 71 6 × 6 km sampling units and conducted 160 structured interviews with local residents. Interviews were designed to collect data on jaguar and seven prey species so that detection/non-detection matrices could be constructed for each sampling unit. Jaguars were reportedly detected in 57% of the sampling units and had a detection probability of 28%. With the exception of white-lipped peccary, prey species were reportedly detected in 82-100% of the sampling units. Though the use of interview data may violate some assumptions of the occupancy modeling approach for determining 'proportion of area occupied', we countered these shortcomings through study design and by interpreting the occupancy parameter, ψ, as 'probability of habitat used'. Probability of habitat use was modeled for each target species using single-state or multistate models. A combination of the estimated probabilities of habitat use for jaguar and prey was selected to identify the final jaguar corridor. This protocol provides an efficient field methodology for identifying corridors for easily identifiable species across large study areas comprised of unprotected, private lands. © 2010 Elsevier Ltd.
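
    As background for the detection/non-detection matrices mentioned above, the standard single-season occupancy likelihood (MacKenzie et al. 2002) with a constant detection probability p over K occasions is shown below; the study's single-state and multistate models generalise this basic form, so this is a sketch rather than the exact model fitted.

    ```latex
    % y_i = number of reported detections in sampling unit i out of K interview "occasions";
    % psi = probability of habitat use, p = detection probability.
    L(\psi, p \mid y_1, \dots, y_n) \;=\; \prod_{i=1}^{n}
      \left[ \psi\, p^{\,y_i} (1 - p)^{K - y_i} \;+\; (1 - \psi)\, \mathbf{1}\{ y_i = 0 \} \right]
    ```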

  3. A Draft Test Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth

    NASA Technical Reports Server (NTRS)

    Rummel, John D.; Race, Margaret S.; DeVincenzi, Donald L.; Schad, P. Jackson; Stabekis, Pericles D.; Viso, Michel; Acevedo, Sara E.

    2002-01-01

    This document presents the first complete draft of a protocol for detecting possible biohazards in Mars samples returned to Earth; it is the final product of the Mars Sample Handling Protocol Workshop Series, convened in 2000-2001 by NASA's Planetary Protection Officer. The goal of the five-workshop Series was to develop a comprehensive protocol by which returned martian sample materials could be assessed for the presence of any biological hazard(s) while safeguarding the purity of the samples from possible terrestrial contamination. The reference numbers for the proceedings of the five individual Workshops are also provided.

  4. Automation of DNA and miRNA co-extraction for miRNA-based identification of human body fluids and tissues.

    PubMed

    Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter

    2016-10-01

    In the last years, microRNA (miRNA) analysis came into focus in the field of forensic genetics. Yet, no standardized and recommendable protocols for co-isolation of miRNA and DNA from forensic relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk and handling. All in all, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. Also, for aged and genuine samples of forensically relevant traces the miRNA and DNA yields were sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. Besides, this system enables flexible sample throughput and labor-saving sample processing with reduced risk of cross-contamination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Serum Dried Samples to Detect Dengue Antibodies: A Field Study.

    PubMed

    Maldonado-Rodríguez, Angelica; Rojas-Montes, Othon; Vazquez-Rosales, Guillermo; Chavez-Negrete, Adolfo; Rojas-Uribe, Magdalena; Posadas-Mondragon, Araceli; Aguilar-Faisal, Leopoldo; Cevallos, Ana Maria; Xoconostle-Cazares, Beatriz; Lira, Rosalia

    2017-01-01

    Dried blood and serum samples are useful resources for detecting antiviral antibodies. The conditions for elution of the sample need to be optimized for each disease. Dengue is a widespread disease in Mexico which requires continuous surveillance. In this study, we standardized and validated a protocol for the specific detection of dengue antibodies from dried serum spots (DSSs). Paired serum and DSS samples from 66 suspected cases of dengue were collected in a clinic in Veracruz, Mexico. Samples were sent to our laboratory, where the conditions for optimal elution of DSSs were established. The presence of anti-dengue antibodies was determined in the paired samples. DSS elution conditions were standardized as follows: 1 h at 4°C in 200 µl of DNase-, RNase-, and protease-free PBS (1x). The optimal volume of DSS eluate to be used in the IgG assay was 40 µl. Sensitivity of 94%, specificity of 93.3%, and kappa concordance of 0.87 were obtained when comparing the antidengue reactivity between DSSs and serum samples. DSS samples are useful for detecting anti-dengue IgG antibodies in the field.
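
    The agreement statistics reported above follow the usual definitions; the TP/FP/TN/FN notation below (with the paired serum result taken as the reference) is assumed for illustration.

    ```latex
    % p_o = observed agreement between DSS and serum results, p_e = agreement expected by chance.
    \mathrm{Sensitivity} = \frac{TP}{TP + FN}, \qquad
    \mathrm{Specificity} = \frac{TN}{TN + FP}, \qquad
    \kappa = \frac{p_o - p_e}{1 - p_e}
    ```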

  6. Serum Dried Samples to Detect Dengue Antibodies: A Field Study

    PubMed Central

    Maldonado-Rodríguez, Angelica; Rojas-Montes, Othon; Chavez-Negrete, Adolfo; Rojas-Uribe, Magdalena; Posadas-Mondragon, Araceli; Aguilar-Faisal, Leopoldo; Xoconostle-Cazares, Beatriz

    2017-01-01

    Background Dried blood and serum samples are useful resources for detecting antiviral antibodies. The conditions for elution of the sample need to be optimized for each disease. Dengue is a widespread disease in Mexico which requires continuous surveillance. In this study, we standardized and validated a protocol for the specific detection of dengue antibodies from dried serum spots (DSSs). Methods Paired serum and DSS samples from 66 suspected cases of dengue were collected in a clinic in Veracruz, Mexico. Samples were sent to our laboratory, where the conditions for optimal elution of DSSs were established. The presence of anti-dengue antibodies was determined in the paired samples. Results DSS elution conditions were standardized as follows: 1 h at 4°C in 200 µl of DNase-, RNase-, and protease-free PBS (1x). The optimal volume of DSS eluate to be used in the IgG assay was 40 µl. Sensitivity of 94%, specificity of 93.3%, and kappa concordance of 0.87 were obtained when comparing the antidengue reactivity between DSSs and serum samples. Conclusion DSS samples are useful for detecting anti-dengue IgG antibodies in the field. PMID:28630868

  7. A Field-Based Testing Protocol for Assessing Gross Motor Skills in Preschool Children: The Children's Activity and Movement in Preschool Study Motor Skills Protocol

    ERIC Educational Resources Information Center

    Williams, Harriet G.; Pfeiffer, Karin A.; Dowda, Marsha; Jeter, Chevy; Jones, Shaverra; Pate, Russell R.

    2009-01-01

    The purpose of this study was to develop a valid and reliable tool for use in assessing motor skills in preschool children in field-based settings. The development of the Children's Activity and Movement in Preschool Study Motor Skills Protocol included evidence of its reliability and validity for use in field-based environments as part of large…

  8. Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils

    PubMed Central

    Castejón, David; Fricke, Pascal; Cambero, María Isabel; Herrera, Antonio

    2016-01-01

    In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol. PMID:26891323

  9. Molecular Typing of Clostridium perfringens from a Food-Borne Disease Outbreak in a Nursing Home: Ribotyping versus Pulsed-Field Gel Electrophoresis

    PubMed Central

    Schalch, Barbara; Bader, Lutz; Schau, Hans-Peter; Bergmann, Rolf; Rometsch, Andrea; Maydl, Gertraud; Keßler, Silvia

    2003-01-01

    In 1998, 21 inhabitants of a German nursing home fell ill with acute gastroenteritis after consumption of minced beef heart (P. Graf and L. Bader, Epidemiol. Bull. 41:327-329, 2000). Two residents died during hospital treatment. Seventeen Clostridium perfringens strains were collected from two different dishes and from patients' stool samples and autopsy materials. A majority of these isolates were not typeable by restriction fragment length polymorphism-pulsed-field gel electrophoresis (PFGE). Subsequent ribotyping of C. perfringens distinguished four different groups. The same ribopattern was detected in a minced beef heart dish, in autopsy material from the two deceased patients, and additionally in stool samples from six further residents who had fallen ill with diarrhea. Three further ribopatterns from food and autopsy materials could be differentiated. As chromosomal macrorestriction with subsequent PFGE is generally regarded as more useful than ribotyping for molecular strain analysis, four selected isolates were lysed in parallel with a standard protocol and two nuclease-inhibiting modifications. None of these methods could differentiate all of the isolates. These results suggest that PFGE with the current standard protocols is not able to characterize all C. perfringens isolates from food-borne disease investigations and that ribotyping is still a helpful method for molecular identification of clonal relationships. PMID:12574310

  10. Testing the efficiency of rover science protocols for robotic sample selection: A GeoHeuristic Operational Strategies Test

    NASA Astrophysics Data System (ADS)

    Yingst, R. A.; Bartley, J. K.; Chidsey, T. C.; Cohen, B. A.; Gilleaudeau, G. J.; Hynek, B. M.; Kah, L. C.; Minitti, M. E.; Williams, R. M. E.; Black, S.; Gemperline, J.; Schaufler, R.; Thomas, R. J.

    2018-05-01

    The GHOST field tests are designed to isolate and test science-driven rover operations protocols, to determine best practices. During a recent field test at a potential Mars 2020 landing site analog, we tested two Mars Science Laboratory data-acquisition and decision-making methods to assess resulting science return and sample quality: a linear method, where sites of interest are studied in the order encountered, and a "walkabout-first" method, where sites of interest are examined remotely before down-selecting to a subset of sites that are interrogated with more resource-intensive instruments. The walkabout method required less time and fewer resources, while increasing confidence in interpretations. Contextual data critical to evaluating site geology were acquired earlier than with the linear method and given a higher priority, which resulted in development of more mature hypotheses earlier in the analysis process. Combined, this saved time and energy in the collection of data with more limited spatial coverage. Based on these results, we suggest that the walkabout method be used where doing so would provide early context and time for the science team to develop hypotheses and critical tests; and that in gathering context, coverage may be more important than higher resolution.

  11. Monitoring well utility in a heterogeneous DNAPL source zone area: Insights from proximal multilevel sampler wells and sampling capture-zone modelling.

    PubMed

    McMillan, Lindsay A; Rivett, Michael O; Wealthall, Gary P; Zeeb, Peter; Dumble, Peter

    2018-03-01

    Groundwater-quality assessment at contaminated sites often involves the use of short-screen (1.5 to 3 m) monitoring wells. However, even over these intervals considerable variation may occur in contaminant concentrations in groundwater adjacent to the well screen. This is especially true in heterogeneous dense non-aqueous phase liquid (DNAPL) source zones, where cm-scale contamination variability may call into question the effectiveness of monitoring wells to deliver representative data. The utility of monitoring wells in such settings is evaluated by reference to high-resolution multilevel sampler (MLS) wells located proximally to short-screen wells, together with sampling capture-zone modelling to explore controls upon well sample provenance and sensitivity to monitoring protocols. Field data are analysed from the highly instrumented SABRE research site that contained an old trichloroethene source zone within a shallow alluvial aquifer at a UK industrial facility. With increased purging, monitoring-well samples tend to a flow-weighted average concentration but may exhibit sensitivity to the implemented protocol and degree of purging. Formation heterogeneity adjacent to the well screen in particular, alongside pump-intake position and water level, influences this sensitivity. Purging of low volumes is vulnerable to poor reproducibility arising from concentration variability predicted over the initial 1 to 2 screen volumes purged. Marked heterogeneity may also result in limited long-term sample concentration stabilization. Development of bespoke monitoring protocols that consider screen volumes purged, alongside water-quality indicator parameter stabilization, is recommended to validate and reduce uncertainty when interpreting monitoring-well data within source zone areas. Generalised recommendations on monitoring-well-based protocols are also developed. A key utility of monitoring wells is their proportionately greater sample draw from permeable horizons that constitute a significant contaminant flux pathway and hence a representative fraction of source mass flux. Acquisition of complementary, high-resolution, site monitoring data, however, vitally underpins optimal interpretation of monitoring-well datasets and appropriate advancement of a site conceptual model and remedial implementation. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  12. A Constrained and Versioned Data Model for TEAM Data

    NASA Astrophysics Data System (ADS)

    Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.

    2009-04-01

    The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System - it consumes and executes spatio-temporal queries, and analytical functions that are performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g. bird, butterfly and trees), sampling unit, person, role, protocol, site and the relationship of these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations. Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block, sampling unit and time range. The operation insertSamplingUnit(sampling unit, site, protocol) saves a new sampling unit into the data model and links it with the site and protocol. The operation updateSamplingUnit(sampling_unit_id, attribute, value) changes the attribute (e.g. latitude or longitude) of the sampling unit to the specified value. The operation insertData(observation record, site, protocol, sampling unit, timestamps, data collectors) saves a new observation record into the database and associates it with specified objects. The operation updateData(protocol, data_id, attribute, value) modifies the attribute of an existing observation record to the specified value. All the insert or update operations require: 1) authorization to ensure the user has necessary privileges to perform the operation; 2) timestamp validation to ensure the observation timestamps are in the designated time range specified in the sampling schedule; 3) data validation to check that the data records use correct taxonomy terms and data values. No authorization is performed for get operations, but under some specific condition, a username may be required for the purpose of authentication. Along with the validations above, the TEAM data model also supports human-based data validation on observed data through the Data Review subsystem to ensure data quality. The data review is implemented by adding two attributes review_tag and review_comment to each observation data record. 
The attribute review_tag is used by a reviewer to specify the quality of data, and the attribute review_comment is for reviewers to give more information when a problem is identified. The review_tag attribute can be populated by either the system conducting QA/QC tests or by pre-specified scientific experts. The following is the review operation, which is actually a special case of the operation updateData: The operation updateReview(protocol, data_id, judgment, comment) sets the attributes review_tag and review_comment to the specified values. By systematically tracking every step, the TEAM data model can roll back to any previous state. This is achieved by introducing a historical data container for each editable object type. When the operation updateData is applied to an object to modify its attribute, the object will be tagged with the current timestamp and the name of the user who conducts the operation; the tagged object will then be moved into the historical data container, and finally a new object will be created with the new value for the specified attribute. The diagram illustrates the architecture of the TEAM data management system. A data collector can use the Data Ingestion subsystem to load new data records into the TEAM data model. The system establishes a first level of review (i.e. meets minimum data standards via QA/QC tests). Further review is done via experts, who can verify and provide their comments on data records through the Data Review subsystem. The data editor can then address data records based on the reviewer's comments. Users can use the Data Query and Download application to find data by sites, protocols and time ranges. The Data Query and Download system packages selected data with the data license and important metadata information into a single package and delivers it to the user.
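
    The operations described above (get, insertData, updateData, updateReview) and the roll-back-by-history mechanism can be summarized as a thin object layer. The sketch below is an illustrative in-memory rendering of that pattern only; all class and field names are hypothetical, not the TEAM Information System's actual API or schema.

```python
# Illustrative sketch of the versioned observation store described above.
# Names and structures are hypothetical; the real TEAM system is a
# database-backed spatio-temporal model, not this in-memory toy.
import copy
from datetime import datetime

class ObservationStore:
    def __init__(self):
        self.records = {}   # data_id -> current record dict
        self.history = []   # historical data container enabling roll-back
        self._next_id = 1

    def insert_data(self, site, protocol, sampling_unit, timestamp, attributes, collectors):
        data_id = self._next_id
        self._next_id += 1
        self.records[data_id] = {
            "site": site, "protocol": protocol, "sampling_unit": sampling_unit,
            "timestamp": timestamp, "collectors": collectors,
            "review_tag": None, "review_comment": None, **attributes,
        }
        return data_id

    def update_data(self, data_id, attribute, value, user):
        # Tag the old version and move it to the historical container first,
        # then apply the change -- this is what makes roll-back possible.
        old = copy.deepcopy(self.records[data_id])
        self.history.append({"data_id": data_id, "edited_by": user,
                             "edited_at": datetime.utcnow(), "record": old})
        self.records[data_id][attribute] = value

    def update_review(self, data_id, judgment, comment, reviewer):
        # Special case of update_data acting on the review attributes.
        self.update_data(data_id, "review_tag", judgment, reviewer)
        self.records[data_id]["review_comment"] = comment

    def get(self, site, protocol, start, end):
        return [r for r in self.records.values()
                if r["site"] == site and r["protocol"] == protocol
                and start <= r["timestamp"] <= end]
```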

  13. Multipinhole SPECT helical scan parameters and imaging volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Rutao, E-mail: rutaoyao@buffalo.edu; Deng, Xiao; Wei, Qingyang

    Purpose: The authors developed SPECT imaging capability on an animal PET scanner using a multiple-pinhole collimator and step-and-shoot helical data acquisition protocols. The objective of this work was to determine the preferred helical scan parameters, i.e., the angular and axial step sizes, and the imaging volume, that provide optimal imaging performance. Methods: The authors studied nine helical scan protocols formed by permuting three rotational and three axial step sizes. These step sizes were chosen around the reference values analytically calculated from the estimated spatial resolution of the SPECT system and the Nyquist sampling theorem. The nine helical protocols were evaluated by two figures-of-merit: the sampling completeness percentage (SCP) and the root-mean-square (RMS) resolution. SCP was an analytically calculated numerical index based on projection sampling. RMS resolution was derived from the reconstructed images of a sphere-grid phantom. Results: The RMS resolution results show that (1) the start and end pinhole planes of the helical scheme determine the axial extent of the effective field of view (EFOV), and (2) the diameter of the transverse EFOV is adequately calculated from the geometry of the pinhole opening, since the peripheral region beyond EFOV would introduce projection multiplexing and consequent effects. The RMS resolution results of the nine helical scan schemes show optimal resolution is achieved when the axial step size is half of, and the angular step size is about twice, the corresponding values derived from the Nyquist theorem. The SCP results agree in general with that of RMS resolution but are less critical in assessing the effects of helical parameters and EFOV. Conclusions: The authors quantitatively validated the effective FOV of multiple pinhole helical scan protocols and proposed a simple method to calculate optimal helical scan parameters.
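
    A minimal sketch of the Nyquist-based parameter calculation implied above: reference angular and axial step sizes are derived from an assumed system resolution and rotation radius, then scaled by the factors reported as optimal (axial step halved, angular step roughly doubled). The numerical values are placeholders, not the scanner's actual specifications.

```python
import math

def helical_step_sizes(resolution_mm, rotation_radius_mm):
    """Nyquist-based reference step sizes for a step-and-shoot helical SPECT scan.

    resolution_mm      -- assumed reconstructed spatial resolution (FWHM)
    rotation_radius_mm -- assumed radius of the pinhole aperture's rotation
    """
    nyquist_step_mm = resolution_mm / 2.0                   # sample at half the resolution
    axial_step_mm = nyquist_step_mm
    angular_step_deg = math.degrees(nyquist_step_mm / rotation_radius_mm)
    return angular_step_deg, axial_step_mm

if __name__ == "__main__":
    ang, ax = helical_step_sizes(resolution_mm=2.0, rotation_radius_mm=30.0)  # placeholder values
    # The study reports best resolution at roughly twice the Nyquist angular step
    # and half the Nyquist axial step.
    print(f"Nyquist reference:      {ang:.1f} deg, {ax:.2f} mm")
    print(f"Empirically preferred:  {2 * ang:.1f} deg, {ax / 2:.2f} mm")
```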

  14. Cleaning the IceMole: collection of englacial samples from Blood Falls, Antarctica

    NASA Astrophysics Data System (ADS)

    Mikucki, J.; Digel, I.; Chua, M.; Davis, J.; Ghosh, D.; Lyons, W. B.; Welch, K. A.; Purcell, A.; Francke, G.; Feldmann, M.; Espe, C.; Heinen, D.; Dachwald, B.; Kowalski, J.; Tulaczyk, S. M.

    2016-12-01

    The Minimally Invasive Direct Glacial Access project (MIDGE) used a maneuverable thermoelectric melting probe called the IceMole to collect the first englacial samples of brine from Blood Falls, Antarctica. In order to maintain the scientific integrity of samples collected and minimize impact to this specially protected ecosystem, microbial and chemical contamination of the IceMole needed to be minimized. Guidelines have been established for research in Antarctic subglacial systems by the scientific and regulatory community and have been detailed by the "Code of Conduct for the Exploration and Research of Subglacial Aquatic Environments" put forth by the Scientific Committee on Antarctic Research (SCAR) Action Group, and was submitted to the Antarctic Treaty System. This Code of Conduct (CoC) recognizes the ecological importance and pristine nature of subglacial habitats and recommends a path forward towards clean exploration. Similarly, the US and European space agencies (NASA and ESA) have detailed instrument preparation protocols for the exploration of icy worlds in our solar system for planetary protection. Given the synergistic aims of these two groups we have adopted protocols from both subglacial and space exploration approaches. Here we present our approach to cleaning the IceMole in the field and report on ability to reduce the bioload inherent on the melter. Specifically our protocol reduced the exterior bio-load by an order of magnitude, to levels common in most clean rooms, and 1-3 orders of magnitude below that of Taylor Glacier ice surrounding Blood Falls. Our results indicate that the collection of englacial samples for microbiological analysis is feasible with melting probes.

  15. A field protocol to monitor cavity-nesting birds

    Treesearch

    J. Dudley; V. Saab

    2003-01-01

    We developed a field protocol to monitor populations of cavity-nesting birds in burned and unburned coniferous forests of western North America. Standardized field methods are described for implementing long-term monitoring strategies and for conducting field research to evaluate the effects of habitat change on cavity-nesting birds. Key references (but not...

  16. A multigear protocol for sampling crayfish assemblages in Gulf of Mexico coastal streams

    Treesearch

    William R. Budnick; William E. Kelso; Susan B. Adams; Michael D. Kaller

    2018-01-01

    Identifying an effective protocol for sampling crayfish in streams that vary in habitat and physical/chemical characteristics has proven problematic. We evaluated an active, combined-gear (backpack electrofishing and dipnetting) sampling protocol in 20 Coastal Plain streams in Louisiana. Using generalized linear models and rarefaction curves, we evaluated environmental...

  17. Adaptive tracking of a time-varying field with a quantum sensor

    NASA Astrophysics Data System (ADS)

    Bonato, Cristian; Berry, Dominic W.

    2017-05-01

    Sensors based on single spins can enable magnetic-field detection with very high sensitivity and spatial resolution. Previous work has concentrated on sensing of a constant magnetic field or a periodic signal. Here, we instead investigate the problem of estimating a field with nonperiodic variation described by a Wiener process. We propose and study, by numerical simulations, an adaptive tracking protocol based on Bayesian estimation. The tracking protocol updates the probability distribution for the magnetic field based on measurement outcomes and adapts the choice of sensing time and phase in real time. By taking the statistical properties of the signal into account, our protocol strongly reduces the required measurement time. This leads to a reduction of the error in the estimation of a time-varying signal by up to a factor of four compared with protocols that do not take this information into account.
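
    The abstract above describes a Bayesian loop: update a probability distribution over the field after each measurement, widen it to account for Wiener-process drift, and adapt the sensing time to the current uncertainty. The sketch below is a schematic of such a loop assuming a Ramsey-type two-outcome measurement; the coupling constant, grid, and adaptation rule are illustrative choices, not the authors' protocol.

```python
import numpy as np

# Schematic Bayesian tracker for a field B(t), probed by a two-outcome
# (Ramsey-type) single-spin measurement. All constants are illustrative.

GAMMA = 2 * np.pi * 28e9                    # rad s^-1 T^-1, electron-spin-like coupling (placeholder)
grid = np.linspace(-1e-6, 1e-6, 2001)       # candidate field values (tesla)
prior = np.ones_like(grid) / grid.size      # flat initial belief

def likelihood(outcome, tau, phase):
    """P(outcome | B) for a Ramsey measurement of duration tau with readout phase."""
    p0 = 0.5 * (1 + np.cos(GAMMA * grid * tau + phase))
    return p0 if outcome == 0 else 1 - p0

def update(belief, outcome, tau, phase):
    post = belief * likelihood(outcome, tau, phase)
    return post / post.sum()

def diffuse(belief, kappa, dt):
    """Widen the belief to account for Wiener-process drift with rate kappa (T / sqrt(s))."""
    width = kappa * np.sqrt(dt) / (grid[1] - grid[0])   # kernel width in grid steps
    if width < 0.5:
        return belief
    x = np.arange(-4 * width, 4 * width + 1)
    kernel = np.exp(-0.5 * (x / width) ** 2)
    post = np.convolve(belief, kernel / kernel.sum(), mode="same")
    return post / post.sum()

def choose_tau(belief):
    """Adaptive rule: shorter sensing time when the current uncertainty is large."""
    var = np.sum(belief * grid**2) - np.sum(belief * grid)**2
    std = np.sqrt(max(var, 0.0))
    return min(100e-6, 1.0 / (GAMMA * max(std, 1e-9)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_B = 0.2e-6                                    # tesla; held static here for brevity
    for k in range(30):
        tau = choose_tau(prior)
        phase = 0.0 if k % 2 == 0 else np.pi / 2       # alternate phase to resolve the sign of B
        p0 = 0.5 * (1 + np.cos(GAMMA * true_B * tau + phase))
        outcome = 0 if rng.random() < p0 else 1
        prior = update(prior, outcome, tau, phase)
        prior = diffuse(prior, kappa=1e-9, dt=tau)     # weak drift between measurements
    print(f"posterior mean {np.sum(prior * grid) * 1e9:.1f} nT, true {true_B * 1e9:.1f} nT")
```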

  18. Mussel micronucleus cytome assay.

    PubMed

    Bolognesi, Claudia; Fenech, Michael

    2012-05-17

    The micronucleus (MN) assay is one of the most widely used genotoxicity biomarkers in aquatic organisms, providing an efficient measure of chromosomal DNA damage occurring as a result of either chromosome breakage or chromosome mis-segregation during mitosis. The MN assay is today applied in laboratory and field studies using hemocytes and gill cells from bivalves, mainly from the genera Mytilus. These represent 'sentinel' organisms because of their ability to survive under polluted conditions and to accumulate both organic and inorganic pollutants. Because the mussel MN assay also includes scoring of different cell types, including necrotic and apoptotic cells and other nuclear anomalies, it is in effect an MN cytome assay. The mussel MN cytome (MUMNcyt) assay protocol we describe here reports the recommended experimental design, sample size, cell preparation, cell fixation and staining methods. The protocol also includes criteria and photomicrographs for identifying different cell types and scoring criteria for micronuclei (MNi) and nuclear buds. The complete procedure requires approximately 10 h for each experimental point/sampling station (ten animals).

  19. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exist fundamental needs for considering pre-analytical variability, which can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia

    NASA Astrophysics Data System (ADS)

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods are compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.

  1. On the Ground or in the Air? A Methodological Experiment on Crop Residue Cover Measurement in Ethiopia.

    PubMed

    Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash

    2017-10-01

    Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods are compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
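
    A small sketch of the kind of benchmark comparison described in the two records above: each alternative method is scored against the line-transect benchmark both quantitatively (mean error and correlation) and categorically (agreement on a >30% cover threshold). The cover values below are placeholders, not the survey data.

```python
import numpy as np

def compare_to_benchmark(benchmark, estimate, threshold=30.0):
    """Agreement of a cover-measurement method with the line-transect benchmark.

    Inputs are plot-level percent-cover values; returns mean error (percentage
    points), Pearson correlation, and categorical agreement at the threshold.
    """
    benchmark = np.asarray(benchmark, dtype=float)
    estimate = np.asarray(estimate, dtype=float)
    mean_error = np.mean(estimate - benchmark)              # negative => underestimation
    corr = np.corrcoef(benchmark, estimate)[0, 1]
    categorical_agreement = np.mean((benchmark > threshold) == (estimate > threshold))
    return mean_error, corr, categorical_agreement

if __name__ == "__main__":
    # Hypothetical plot-level values for illustration only.
    transect = [12, 45, 33, 8, 60, 25]
    respondent = [10, 30, 25, 5, 50, 20]
    err, r, agree = compare_to_benchmark(transect, respondent)
    print(f"mean error {err:+.1f} pp, r={r:.2f}, categorical agreement {agree:.0%}")
```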

  2. Year 1 Field Work Report: Utah Bat Monitoring Protocol

    DTIC Science & Technology

    2010-01-28

    Plateau shrublands, Great Basin shrub steppe, Wasatch and Uinta montane forests, Mojave Desert and Wyoming Basin shrub steppe. A total 65, 20 x 20 km... Basin shrub steppe, Wasatch and Uinta montane forests) each harbored 20 sampling cells, while the limited size of the Mojave Desert and Wyoming Basin ...Wasatch and Uinta montane forest and Wyoming Basin shrub steppe). Site # A unique identifier between 1 and 20 within each ecoregion. UTM The

  3. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emerging contaminants.
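
    Method performance in the study above was assessed through spiked recoveries. A minimal sketch of that calculation follows; the concentrations and the acceptance window are illustrative assumptions, not the study's criteria.

```python
def percent_recovery(measured_spiked, measured_unspiked, spike_added):
    """Percent recovery of a spiked analyte in a water matrix."""
    return 100.0 * (measured_spiked - measured_unspiked) / spike_added

if __name__ == "__main__":
    # Hypothetical concentrations in ng/L for one analyte in treated water.
    rec = percent_recovery(measured_spiked=95.0, measured_unspiked=4.0, spike_added=100.0)
    acceptable = 70.0 <= rec <= 130.0   # illustrative acceptance window, not the study's criterion
    print(f"recovery = {rec:.0f}% (acceptable: {acceptable})")
```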

  4. The role of field auditing in environmental quality assurance management.

    PubMed

    Claycomb, D R

    2000-01-01

    Environmental data quality improvement continues to focus on analytical laboratory performance with little, if any, attention given to improving the performance of field consultants responsible for sample collection. Many environmental professionals often assume that the primary opportunity for data error lies within the activities conducted by the laboratory. Experience in the evaluation of environmental data and project-wide quality assurance programs indicates that an often-ignored factor affecting environmental data quality is the manner in which a sample is acquired and handled in the field. If a sample is not properly collected, preserved, stored, and transported in the field, even the best laboratory practices and analytical methods cannot deliver accurate and reliable data (i.e., bad data in equals bad data out). Poor quality environmental data may result in inappropriate decisions regarding site characterization and remedial action. Field auditing is becoming an often-employed technique for examining the performance of the environmental sampling field team and how their performance may affect data quality. The field audits typically focus on: (1) verifying that field consultants adhere to project control documents (e.g., Work Plans and Standard Operating Procedures [SOPs]) during field operations; (2) providing third-party independent assurance that field procedures, quality assurance/quality control (QA/QC) protocol, and field documentation are sufficient to produce data of satisfactory quality; (3) providing a defense in the event that field procedures are called into question; and (4) identifying ways to reduce sampling costs. Field audits are typically most effective when performed on a surprise basis; that is, the sampling contractor may be aware that a field audit will be conducted during some phase of sampling activities but is not informed of the specific day(s) that the audit will be conducted. The audit also should be conducted early on in the sampling program such that deficiencies noted during the audit can be addressed before the majority of field activities have been completed. A second audit should be performed as a follow-up to confirm that the recommended changes have been implemented. A field auditor is assigned to the project by matching, as closely as possible, the auditor's experience with the type of field activities being conducted. The auditor uses a project-specific field audit checklist developed from key information contained in project control documents. Completion of the extensive audit checklist during the audit focuses the auditor on evaluating each aspect of field activities being performed. Rather than examine field team performance after sampling, a field auditor can do so while the samples are being collected and can apply real-time corrective action as appropriate. As a result of field audits, responsible parties often observe vast improvements in their consultant's field procedures and, consequently, receive more reliable and representative field data at a lower cost. The cost savings and improved data quality that result from properly completed field audits make the field auditing process both cost-effective and functional.

  5. Circadian temperature rhythms of older people

    NASA Technical Reports Server (NTRS)

    Monk, T. H.; Buysse, D. J.; Reynolds, C. F. 3rd; Kupfer, D. J.; Houck, P. R.

    1995-01-01

    This collection of studies had the aim of exploring whether older (77+ years) men and women have circadian body temperature rhythms different from those of younger adults. A total of 20 older men and 28 older women were compared with either 22 young men or 14 middle-aged men in four protocols; all but the first protocol using a subset of the sample. The four protocols were: 1) 24 h, and 2) 72 h data collections on a normal laboratory routine (sleeping at night); 3) between 36 h and 153 h of field data collection at home; and 4) 36 h of a constant conditions routine (wakeful bedrest under temporal isolation) in the laboratory. There was some evidence for an age-related phase advance in temperature rhythm, especially for the older men on a normal routine, though this was not present in the constant conditions protocol, where 5 of the older subjects showed major delays in the timing of the body temperature trough (10:00 or later). There was no statistically significant evidence from any of the protocols that older subjects generally had lower temperature rhythm amplitudes than younger adults. Only when older men were compared with younger men in 24-h rhythm amplitude by simple t-test did any comparison involving amplitude achieve statistical significance (p < 0.05).

  6. Two alternative DNA extraction methods to improve the detection of Mycobacterium-tuberculosis-complex members in cattle and red deer tissue samples.

    PubMed

    Fell, Shari; Bröckl, Stephanie; Büttner, Mathias; Rettinger, Anna; Zimmermann, Pia; Straubinger, Reinhard K

    2016-09-15

    Bovine tuberculosis (bTB), which is caused by Mycobacterium bovis and M. caprae, is a notifiable animal disease in Germany. The diagnostic procedure is based on a prescribed protocol that is published in the framework of German bTB legislation. In this protocol, small sample volumes are used for DNA extraction followed by real-time PCR analyses. As mycobacteria tend to concentrate in granulomas and infected tissue in early stages of infection does not necessarily show any visible lesions, DNA extraction from only small tissue samples (20-40 mg) taken from a randomly chosen spot in the organ, followed by PCR testing, may result in false negative results. In this study, two DNA extraction methods were developed to process larger sample volumes and thereby increase the detection sensitivity of mycobacterial DNA in animal tissue. The first extraction method is based on magnetic capture, in which specific capture oligonucleotides were utilized. These nucleotides are linked to magnetic particles and capture Mycobacterium-tuberculosis-complex (MTC) DNA released from 10 to 15 g of tissue material. In a second approach, remaining sediments from the magnetic capture protocol were further processed with a less complex extraction protocol that can be used in daily routine diagnostics. A total of 100 tissue samples from 34 cattle (n = 74) and 18 red deer (n = 26) were analyzed with the developed protocols and results were compared to the prescribed protocol. All three extraction methods yielded reliable results in the real-time PCR analysis. The use of larger sample volumes led to an increase in DNA detection sensitivity, as shown by the decrease in Ct-values. Furthermore, five samples that tested negative or questionable with the official extraction protocol were detected as positive by real-time PCR when the alternative extraction methods were used. By calculating the kappa index, the three extraction protocols resulted in a moderate (0.52; protocol 1 vs 3) to almost perfect agreement (1.00; red deer sample testing with all protocols). Both new methods yielded increased detection rates for MTC DNA in large sample volumes and consequently improve on the official diagnostic protocol.

  7. An Evaluation Methodology for Protocol Analysis Systems

    DTIC Science & Technology

    2007-03-01

    Main Memory Requirement NS: Needham-Schroeder NSL: Needham-Schroeder-Lowe OCaml : Objective Caml POSIX: Portable Operating System...methodology is needed. A. PROTOCOL ANALYSIS FIELD As with any field, there is a specialized language used within the protocol analysis community. Figure...ProVerif requires that Objective Caml ( OCaml ) be installed on the system, OCaml version 3.09.3 was installed. C. WINDOWS CONFIGURATION OS

  8. Detecting and enumerating soil-transmitted helminth eggs in soil: New method development and results from field testing in Kenya and Bangladesh.

    PubMed

    Steinbaum, Lauren; Kwong, Laura H; Ercumen, Ayse; Negash, Makeda S; Lovely, Amira J; Njenga, Sammy M; Boehm, Alexandria B; Pickering, Amy J; Nelson, Kara L

    2017-04-01

    Globally, about 1.5 billion people are infected with at least one species of soil-transmitted helminth (STH). Soil is a critical environmental reservoir of STH, yet there is no standard method for detecting STH eggs in soil. We developed a field method for enumerating STH eggs in soil and tested the method in Bangladesh and Kenya. The US Environmental Protection Agency (EPA) method for enumerating Ascaris eggs in biosolids was modified through a series of recovery efficiency experiments; we seeded soil samples with a known number of Ascaris suum eggs and assessed the effect of protocol modifications on egg recovery. We found the use of 1% 7X as a surfactant compared to 0.1% Tween 80 significantly improved recovery efficiency (two-sided t-test, t = 5.03, p = 0.007) while other protocol modifications-including different agitation and flotation methods-did not have a significant impact. Soil texture affected the egg recovery efficiency; sandy samples resulted in higher recovery compared to loamy samples processed using the same method (two-sided t-test, t = 2.56, p = 0.083). We documented a recovery efficiency of 73% for the final improved method using loamy soil in the lab. To field test the improved method, we processed soil samples from 100 households in Bangladesh and 100 households in Kenya from June to November 2015. The prevalence of any STH (Ascaris, Trichuris or hookworm) egg in soil was 78% in Bangladesh and 37% in Kenya. The median concentration of STH eggs in soil in positive samples was 0.59 eggs/g dry soil in Bangladesh and 0.15 eggs/g dry soil in Kenya. The prevalence of STH eggs in soil was significantly higher in Bangladesh than Kenya (chi-square, χ2 = 34.39, p < 0.001) as was the concentration (Mann-Whitney, z = 7.10, p < 0.001). This new method allows for detecting STH eggs in soil in low-resource settings and could be used for standardizing soil STH detection globally.
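
    Two simple calculations underpin the method above: recovery efficiency from seeded samples and egg concentration normalized to dry soil mass. A brief sketch, with placeholder numbers, is shown below.

```python
def recovery_efficiency(eggs_recovered, eggs_seeded):
    """Fraction of seeded Ascaris suum eggs recovered from a soil sample."""
    return eggs_recovered / eggs_seeded

def eggs_per_gram_dry_soil(eggs_counted, wet_mass_g, moisture_fraction):
    """Concentration of STH eggs normalized to dry soil mass."""
    dry_mass_g = wet_mass_g * (1.0 - moisture_fraction)
    return eggs_counted / dry_mass_g

if __name__ == "__main__":
    # Hypothetical values for illustration only.
    print(f"recovery = {recovery_efficiency(73, 100):.0%}")
    conc = eggs_per_gram_dry_soil(12, wet_mass_g=25.0, moisture_fraction=0.18)
    print(f"concentration = {conc:.2f} eggs/g dry soil")
```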

  9. Detecting and enumerating soil-transmitted helminth eggs in soil: New method development and results from field testing in Kenya and Bangladesh

    PubMed Central

    Kwong, Laura H.; Ercumen, Ayse; Negash, Makeda S.; Lovely, Amira J.; Njenga, Sammy M.; Boehm, Alexandria B.; Pickering, Amy J.; Nelson, Kara L.

    2017-01-01

    Globally, about 1.5 billion people are infected with at least one species of soil-transmitted helminth (STH). Soil is a critical environmental reservoir of STH, yet there is no standard method for detecting STH eggs in soil. We developed a field method for enumerating STH eggs in soil and tested the method in Bangladesh and Kenya. The US Environmental Protection Agency (EPA) method for enumerating Ascaris eggs in biosolids was modified through a series of recovery efficiency experiments; we seeded soil samples with a known number of Ascaris suum eggs and assessed the effect of protocol modifications on egg recovery. We found the use of 1% 7X as a surfactant compared to 0.1% Tween 80 significantly improved recovery efficiency (two-sided t-test, t = 5.03, p = 0.007) while other protocol modifications—including different agitation and flotation methods—did not have a significant impact. Soil texture affected the egg recovery efficiency; sandy samples resulted in higher recovery compared to loamy samples processed using the same method (two-sided t-test, t = 2.56, p = 0.083). We documented a recovery efficiency of 73% for the final improved method using loamy soil in the lab. To field test the improved method, we processed soil samples from 100 households in Bangladesh and 100 households in Kenya from June to November 2015. The prevalence of any STH (Ascaris, Trichuris or hookworm) egg in soil was 78% in Bangladesh and 37% in Kenya. The median concentration of STH eggs in soil in positive samples was 0.59 eggs/g dry soil in Bangladesh and 0.15 eggs/g dry soil in Kenya. The prevalence of STH eggs in soil was significantly higher in Bangladesh than Kenya (chi-square, χ2 = 34.39, p < 0.001) as was the concentration (Mann-Whitney, z = 7.10, p < 0.001). This new method allows for detecting STH eggs in soil in low-resource settings and could be used for standardizing soil STH detection globally. PMID:28379956

  10. Field validation of protocols developed to evaluate in-line mastitis detection systems.

    PubMed

    Kamphuis, C; Dela Rue, B T; Eastwood, C R

    2016-02-01

    This paper reports on a field validation of previously developed protocols for evaluating the performance of in-line mastitis-detection systems. The protocols outlined 2 requirements of these systems: (1) to detect cows with clinical mastitis (CM) promptly and accurately to enable timely and appropriate treatment and (2) to identify cows with high somatic cell count (SCC) to manage bulk milk SCC levels. Gold standard measures, evaluation tests, performance measures, and performance targets were proposed. The current study validated the protocols on commercial dairy farms with automated in-line mastitis-detection systems using both electrical conductivity (EC) and SCC sensor systems that both monitor at whole-udder level. The protocol for requirement 1 was applied on 3 commercial farms. For requirement 2, the protocol was applied on 6 farms; 3 of them had low bulk milk SCC (128 × 10³ cells/mL) and were the same farms as used for field evaluation of requirement 1. Three farms with high bulk milk SCC (270 × 10³ cells/mL) were additionally enrolled. The field evaluation methodology and results were presented at a workshop including representation from 7 international suppliers of in-line mastitis-detection systems. Feedback was sought on the acceptance of standardized performance evaluation protocols and recommended refinements to the protocols. Although the methodology for requirement 1 was relatively labor intensive and required organizational skills over an extended period, no major issues were encountered during the field validation of both protocols. The validation, thus, proved the protocols to be practical. Also, no changes to the data collection process were recommended by the technology supplier representatives. However, 4 recommendations were made to refine the protocols: inclusion of an additional analysis that ignores small (low-density) clot observations in the definition of CM, extension of the time window from 4 to 5 milkings for timely alerts for CM, setting a maximum number of 10 milkings for the time window to detect a CM episode, and presentation of sensitivity for a larger range of false alerts per 1,000 milkings replacing minimum performance targets. The recommended refinements are discussed with suggested changes to the original protocols. The information presented is intended to inform further debate toward achieving international agreement on standard protocols to evaluate performance of in-line mastitis-detection systems. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
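
    The protocol for requirement 1 hinges on matching system alerts to observed clinical mastitis cases within a window of milkings, then reporting sensitivity against false alerts per 1,000 milkings. The sketch below illustrates that matching logic under simplified assumptions (a one-sided window, one alert consumed per case); it is not the published protocol's exact algorithm.

```python
def evaluate_alerts(cm_milkings, alert_milkings, total_milkings, window=5):
    """Sensitivity and false-alert rate for an in-line clinical mastitis detection system.

    cm_milkings    -- milking indices at which clinical mastitis (CM) was observed
    alert_milkings -- milking indices at which the system raised an alert
    window         -- number of milkings within which a preceding alert counts as timely
    """
    alerts = set(alert_milkings)
    detected, used_alerts = 0, set()
    for cm in cm_milkings:
        hits = [a for a in alerts if 0 <= cm - a <= window and a not in used_alerts]
        if hits:
            detected += 1
            used_alerts.add(min(hits))          # each alert can explain only one CM case
    sensitivity = detected / len(cm_milkings) if cm_milkings else float("nan")
    false_rate = 1000.0 * len(alerts - used_alerts) / total_milkings
    return sensitivity, false_rate

if __name__ == "__main__":
    # Hypothetical milking indices for illustration only.
    sens, far = evaluate_alerts(cm_milkings=[120, 480], alert_milkings=[118, 300, 479],
                                total_milkings=10000)
    print(f"sensitivity {sens:.0%}, {far:.1f} false alerts per 1,000 milkings")
```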

  11. Quality assured measurements of animal building emissions: odor concentrations.

    PubMed

    Jacobson, Larry D; Hetchler, Brian P; Schmidt, David R; Nicolai, Richard E; Heber, Albert J; Ni, Ji-Qin; Hoff, Steven J; Koziel, Jacek A; Zhang, Yuanhui; Beasley, David B; Parker, David B

    2008-06-01

    Standard protocols for sampling and measuring odor emissions from livestock buildings are needed to guide scientists, consultants, regulators, and policy-makers. A federally funded, multistate project has conducted field studies in six states to measure emissions of odor, coarse particulate matter (PM(10)), total suspended particulates, hydrogen sulfide, ammonia, and carbon dioxide from swine and poultry production buildings. The focus of this paper is on the intermittent measurement of odor concentrations at nearly identical pairs of buildings in each state and on protocols to minimize variations in these measurements. Air was collected from pig and poultry barns in small (10 L) Tedlar bags through a gas sampling system located in an instrument trailer housing gas and dust analyzers. The samples were analyzed within 30 hr by a dynamic dilution forced-choice olfactometer (a dilution apparatus). The olfactometers (AC'SCENT International Olfactometer, St. Croix Sensory, Inc.) used by all participating laboratories meet the olfactometry standards (American Society for Testing and Materials and European Committee for Standardization [CEN]) in the United States and Europe. Trained panelists (four to eight) at each laboratory measured odor concentrations (dilution to thresholds [DT]) from the bag samples. Odor emissions were calculated by multiplying odor concentration differences between inlet and outlet air by standardized (20 degrees C and 1 atm) building airflow rates.

  12. A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.

    PubMed

    Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R

    2001-12-01

    Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 ± 6.0% between clades and of 21.0 ± 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.

  13. A new, ultra-low latency data transmission protocol for Earthquake Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Hill, P.; Hicks, S. P.; McGowan, M.

    2016-12-01

    One measure used to assess the performance of Earthquake Early Warning Systems (EEWS) is the delay time between earthquake origin and issued alert. EEWS latency is dependent on a number of sources (e.g. P-wave propagation, digitisation, transmission, receiver processing, triggering, event declaration). Many regional seismic networks use the SEEDlink protocol; however, packet size is fixed to 512-byte miniSEED records, resulting in transmission latencies of >0.5 s. Data packetisation is seen as one of the main sources of delays in EEWS (Brown et al., 2011). Optimising data-logger and telemetry configurations is a cost-effective strategy to improve EEWS alert times (Behr et al., 2015). Digitisers with smaller, selectable packets can result in faster alerts (Sokos et al., 2016). We propose a new seismic protocol for regional seismic networks benefiting low-latency applications such as EEWS. The protocol, based on Güralp's existing GDI-link format is an efficient and flexible method to exchange data between seismic stations and data centers for a range of network configurations. The main principle is to stream data sample-by-sample instead of fixed-length packets to minimise transmission latency. Self-adaptive packetisation with compression maximises available telemetry bandwidth. Highly flexible metadata fields within GDI-link are compatible with existing miniSEED definitions. Data is sent as integers or floats, supporting a wide range of data formats, including discrete parameters such as Pd & τC for on-site earthquake early warning. Other advantages include: streaming station state-of-health information, instrument control, support of backfilling and fail-over strategies during telemetry outages. Based on tests carried out on the Güralp Minimus data-logger, we show our new protocol can reduce transmission latency to as low as 1 ms. The low-latency protocol is currently being implemented with common processing packages. The results of these tests will help to highlight latency levels that can be achieved with next-generation EEWS.
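
    The latency argument above comes down to how long a data-logger must wait before a packet can leave the station. A back-of-the-envelope sketch is given below; the header size and bytes-per-sample figures are rough assumptions, not GDI-link or miniSEED specifics.

```python
def packet_fill_latency(sample_rate_hz, record_bytes=512, header_bytes=64, bytes_per_sample=4.0):
    """Worst-case wait for a fixed-size record to fill before it can be transmitted.

    Assumes uncompressed 32-bit samples by default; compression (e.g. Steim-like)
    packs more samples per record and therefore lengthens the fill time further.
    """
    samples_per_record = (record_bytes - header_bytes) / bytes_per_sample
    return samples_per_record / sample_rate_hz

def per_sample_latency(sample_rate_hz):
    """Streaming sample-by-sample: the wait is a single sample interval."""
    return 1.0 / sample_rate_hz

if __name__ == "__main__":
    rate = 100.0  # Hz, an assumed typical broadband / strong-motion channel
    print(f"512-byte record fill time ~{packet_fill_latency(rate):.2f} s")
    print(f"per-sample streaming      ~{per_sample_latency(rate) * 1000:.0f} ms")
```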

  14. Incidence and rates of visual field progression after longitudinally measured optic disc change in glaucoma.

    PubMed

    Chauhan, Balwantray C; Nicolela, Marcelo T; Artes, Paul H

    2009-11-01

    To determine whether glaucoma patients with progressive optic disc change have subsequent visual field progression earlier and at a faster rate compared with those without disc change. Prospective, longitudinal, cohort study. Eighty-one patients with open-angle glaucoma. Patients underwent confocal scanning laser tomography and standard automated perimetry every 6 months. The complete follow-up was divided into initial and subsequent periods. Two initial periods, the first 3 years (Protocol A) and the first half of the total follow-up (Protocol B), were used, with the respective remainder being the subsequent follow-up. Disc change during the initial follow-up was determined with liberal, moderate, or conservative criteria of the Topographic Change Analysis. Subsequent field progression was determined with significant pattern deviation change in ≥3 locations (criterion used in the Early Manifest Glaucoma Trial). As a control analysis, field change during the initial follow-up was determined with significant pattern deviation change in ≥1, ≥2, or ≥3 locations. Survival time to subsequent field progression, rates of mean deviation (MD) change, and positive and negative likelihood ratios. The median (interquartile range) total follow-up was 11.0 (8.0-12.0) years with 22 (18-24) examinations. More patients had disc changes during the initial follow-up compared with field changes. The mean time to field progression was consistently shorter (protocol A, 0.8-1.7 years; protocol B, 0.3-0.7 years) in patients with prior disc change. In the control analysis, patients with prior field change had statistically earlier subsequent field progression (protocol A, 2.9-3.0 years; protocol B, 0.7-0.9). Similarly, patients with either prior disc or field change always had worse mean rates of subsequent MD change, although the distributions overlapped widely. Patients with subsequent field progression were up to 3 times more likely to have prior disc change compared with those without, and up to 5 times more likely to have prior field change compared with those without. Longitudinally measured optic disc change is predictive of subsequent visual field progression and may be an efficacious end point for functional outcomes in clinical studies and trials in glaucoma.

  15. New advances in scanning microscopy and its application to study parasitic protozoa.

    PubMed

    de Souza, Wanderley; Attias, Marcia

    2018-07-01

    Scanning electron microscopy has been used to observe and study parasitic protozoa for at least 40 years. However, field emission electron sources, as well as improvements in lenses and detectors, brought the resolution power of scanning electron microscopes (SEM) to a new level. Parallel to the refinement of instruments, protocols for preservation of the ultrastructure, immunolabeling, exposure of cytoskeleton and inner structures of parasites and host cells were developed. This review is focused on protozoan parasites of medical and veterinary relevance, e.g., Toxoplasma gondii, Tritrichomonas foetus, Giardia intestinalis, and Trypanosoma cruzi, compiling the main achievements in describing the fine ultrastructure of their surface, cytoskeleton and interaction with host cells. Two new resources, namely, Helium Ion Microscopy (HIM) and Slice and View, using either Focused Ion Beam (FIB) abrasion or Microtome Serial Sectioning (MSS) within the microscope chamber, combined with backscattered electron imaging of fixed (chemically or by quick freezing followed by freeze substitution) and resin-embedded samples, are bringing an exponential amount of valuable information. In HIM there is no need for conductive coating and the depth of field is much higher than in any field emission SEM. As for FIB- and MSS-SEM, high resolution 3-D models of areas and volumes larger than any other technique allows can be obtained. The main results achieved with all these technological tools and some protocols for sample preparation are included in this review. In addition, we included some results obtained with environmental/low vacuum scanning microscopy and cryo-scanning electron microscopy, both promising, but not yet widely employed, SEM modalities. Copyright © 2018. Published by Elsevier Inc.

  16. Correlation Between Iron and alpha and pi Glutathione-S-Transferase Levels in Humans

    DTIC Science & Technology

    2012-09-01

    assays were performed as described in the Biotrin High Sensitivity Alpha GST EIA kit protocol. First, serum samples were diluted 1:10 with wash solution...immunosorbent assays were performed as described in the Biotrin Pi GST EIA kit protocol. First, plasma samples were diluted 1:5 with sample diluent...immunosorbent assays were performed as described in the AssayMax Human Transferrin ELISA kit protocol. First, serum samples were diluted 1:2000 with MIX

  17. Providing an Authentic Research Experience for University of the Fraser Valley Undergraduate Students by Investigating and Documenting Seasonal and Longterm Changes in Fraser Valley Stream Water Chemistry.

    NASA Astrophysics Data System (ADS)

    Gillies, S. L.; Marsh, S. J.; Peucker-Ehrenbrink, B.; Janmaat, A.; Bourdages, M.; Paulson, D.; Groeneweg, A.; Bogaerts, P.; Robertson, K.; Clemence, E.; Smith, S.; Yakemchuk, A.; Faber, A.

    2017-12-01

    Undergraduate students in the Geography and Biology Departments at the University of the Fraser Valley (UFV) have been provided the opportunity to participate in the time-series sampling of the Fraser River at Fort Langley and of Fraser Valley tributaries as part of the Global Rivers Observatory (GRO, www.globalrivers.org), which is coordinated by the Woods Hole Oceanographic Institution and the Woods Hole Research Center. Student research has focussed on Clayburn, Willband, and Stoney Creeks, which flow from Sumas Mountain northwards to the Fraser River. These watercourses are increasingly being impacted by anthropogenic activity, including residential developments, industrial activity, and agricultural land use. Students are instructed in field sampling protocols, the collection of water chemistry data, and the care and maintenance of the field equipment. Students develop their own research projects and work in support of each other as teams in the field to collect the data and water samples. Students present their findings as research posters at local academic conferences and at UFV's Student Research Day. Through their involvement in our field research, our students have become more aware of the state of our local streams, the methods used to monitor water chemistry, and how water chemistry varies seasonally.

  18. A concise approach for building the s-T diagram for Mn-Fe-P-Si hysteretic magnetocaloric material

    NASA Astrophysics Data System (ADS)

    Christiaanse, T. V.; Campbell, O.; Trevizoli, P. V.; Misra, S.; van Asten, D.; Zhang, L.; Govindappa, P.; Niknia, I.; Teyber, R.; Rowe, A.

    2017-09-01

    The use of first-order magnetocaloric materials (FOMs) in magnetic cycles is of interest for the development of efficient magnetic heat pumps. FOMs present promising magnetocaloric properties; however, hysteresis reduces the reversible adiabatic temperature change (ΔTad) of these materials and consequently impacts performance. The present paper evaluates the reversible ΔTad in a FOM. Six samples of the Mn-Fe-P-Si material with different transition temperatures are examined. The samples are measured for heat capacity, magnetization, and adiabatic temperature change using heating and cooling protocols to characterize hysteresis. After correcting for demagnetizing fields, the entropy-temperature (s-T) diagrams are constructed and used to calculate the adiabatic temperature change along four different thermal paths. The post-calculated ΔTad is compared with experimental data from direct ΔTad measurements. Most of the Mn-Fe-P-Si samples show that the post-calculated ΔTad resulting from the heating zero-field and cooling in-field entropy curves aligns best with the ΔTad measurements. The impact of the demagnetizing field is shown in terms of the absolute variation of the post-calculated ΔTad. A functional representation is used to explain the observed data sensitivities in the post-calculated ΔTad.
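
    The construction described above can be sketched in a few lines: build s(T) for each field from heat-capacity data via s(T) − s(T0) = ∫ cp/T dT, then read ΔTad by following an isentrope from the zero-field curve to the in-field curve. The sketch below uses synthetic cp curves, not the measured Mn-Fe-P-Si data, and ignores the demagnetization correction discussed in the abstract:

    ```python
    import numpy as np

    # Synthetic heat-capacity data (J kg^-1 K^-1) on a common temperature grid;
    # placeholder curves, not the measured Mn-Fe-P-Si values.
    T = np.linspace(250.0, 320.0, 701)                                   # K
    cp_zero_field = 500.0 + 900.0 * np.exp(-((T - 285.0) / 4.0) ** 2)
    cp_in_field   = 500.0 + 700.0 * np.exp(-((T - 288.0) / 5.0) ** 2)

    def entropy_curve(T, cp):
        """s(T) - s(T0) = integral of cp/T dT, built from cumulative trapezoids."""
        ds = cp / T
        return np.concatenate(([0.0], np.cumsum(0.5 * (ds[1:] + ds[:-1]) * np.diff(T))))

    s0 = entropy_curve(T, cp_zero_field)   # zero-field (e.g. heating) curve
    s1 = entropy_curve(T, cp_in_field)     # in-field (e.g. cooling) curve

    # Adiabatic temperature change: follow a constant-entropy line from the
    # zero-field curve to the in-field curve, i.e. T_in_field(s) - T_zero_field(s).
    s_common = np.linspace(max(s0.min(), s1.min()), min(s0.max(), s1.max()), 500)
    T_of_s0 = np.interp(s_common, s0, T)
    T_of_s1 = np.interp(s_common, s1, T)
    delta_T_ad = T_of_s1 - T_of_s0

    print(f"peak |dT_ad| ~ {np.abs(delta_T_ad).max():.2f} K (synthetic data)")
    ```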

  19. Long Term Resource Monitoring Program procedures: fish monitoring

    USGS Publications Warehouse

    Ratcliff, Eric N.; Glittinger, Eric J.; O'Hara, T. Matt; Ickes, Brian S.

    2014-01-01

    This manual constitutes the second revision of the U.S. Army Corps of Engineers’ Upper Mississippi River Restoration-Environmental Management Program (UMRR-EMP) Long Term Resource Monitoring Program (LTRMP) element Fish Procedures Manual. The original (1988) manual merged and expanded on ideas and recommendations related to Upper Mississippi River fish sampling presented in several early documents. The first revision to the manual was made in 1995 reflecting important protocol changes, such as the adoption of a stratified random sampling design. The 1995 procedures manual has been an important document through the years and has been cited in many reports and scientific manuscripts. The resulting data collected by the LTRMP fish component represent the largest dataset on fish within the Upper Mississippi River System (UMRS) with more than 44,000 collections of approximately 5.7 million fish. The goal of this revision of the procedures manual is to document changes in LTRMP fish sampling procedures since 1995. Refinements to sampling methods become necessary as monitoring programs mature. Possible refinements are identified through field experiences (e.g., sampling techniques and safety protocols), data analysis (e.g., planned and studied gear efficiencies and reallocations of effort), and technological advances (e.g., electronic data entry). Other changes may be required because of financial necessity (i.e., unplanned effort reductions). This version of the LTRMP fish monitoring manual describes the most current (2014) procedures of the LTRMP fish component.
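
    The stratified random sampling design mentioned above can be illustrated with a small sketch that allocates a fixed amount of effort to each habitat stratum and draws sites at random within it; the stratum names and allocations here are hypothetical and are not the LTRMP values.

    ```python
    import random

    # Hypothetical strata and per-stratum allocations; not the LTRMP design values.
    allocations = {"main channel border": 4, "side channel": 3, "backwater": 5}
    candidate_sites = {s: [f"{s} site {i:02d}" for i in range(1, 21)] for s in allocations}

    random.seed(42)  # reproducible draw for a given sampling period
    selected = {s: random.sample(candidate_sites[s], k) for s, k in allocations.items()}

    for stratum, sites in selected.items():
        print(stratum, "->", sites)
    ```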

  20. Tissue Sampling Guides for Porcine Biomedical Models.

    PubMed

    Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas

    2016-04-01

    This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue as well as the intra-/interstudy comparability and reproducibility of results. © The Author(s) 2016.

  1. Library preparation and data analysis packages for rapid genome sequencing.

    PubMed

    Pomraning, Kyle R; Smith, Kristina M; Bredeweg, Erin L; Connolly, Lanelle R; Phatale, Pallavi A; Freitag, Michael

    2012-01-01

    High-throughput sequencing (HTS) has quickly become a valuable tool for comparative genetics and genomics and is now regularly carried out in laboratories that are not connected to large sequencing centers. Here we describe an updated version of our protocol for constructing single- and paired-end Illumina sequencing libraries, beginning with purified genomic DNA. The present protocol can also be used for "multiplexing," i.e. the analysis of several samples in a single flowcell lane by generating "barcoded" or "indexed" Illumina sequencing libraries in a way that is independent from Illumina-supported methods. To analyze sequencing results, we suggest several independent approaches but end users should be aware that this is a quickly evolving field and that currently many alignment (or "mapping") and counting algorithms are being developed and tested.
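
    Demultiplexing barcoded (indexed) libraries amounts to binning each read by its index sequence and trimming the index off before alignment. A minimal sketch, assuming a simple in-line barcode at the 5' end of each read and an illustrative barcode-to-sample map rather than the authors' index set:

    ```python
    from collections import defaultdict

    # Hypothetical barcode-to-sample assignments; not the published index sequences.
    barcodes = {"ACGT": "sample_A", "TGCA": "sample_B", "GATC": "sample_C"}
    BARCODE_LEN = 4

    def demultiplex(reads):
        """Group reads by exact match to a 5' in-line barcode; trim the barcode off."""
        binned = defaultdict(list)
        for read in reads:
            tag, insert = read[:BARCODE_LEN], read[BARCODE_LEN:]
            binned[barcodes.get(tag, "undetermined")].append(insert)
        return binned

    reads = ["ACGTTTGACCAGT", "TGCAGGATCCAAT", "NNNNGGATCCAAT"]
    for sample, inserts in demultiplex(reads).items():
        print(sample, inserts)
    ```

    In practice one would parse FASTQ records, tolerate a small number of mismatches against the expected barcodes, and keep the quality strings in sync with the trimmed sequences.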

  2. Soil pH Mapping with an On-The-Go Sensor

    PubMed Central

    Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan

    2011-01-01

    Soil pH is a key parameter for crop productivity; therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion-selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled tests on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r²) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany. PMID:22346591

  3. Soil pH mapping with an on-the-go sensor.

    PubMed

    Schirrmann, Michael; Gebbers, Robin; Kramer, Eckart; Seidel, Jan

    2011-01-01

    Soil pH is a key parameter for crop productivity; therefore, its spatial variation should be adequately addressed to improve precision management decisions. Recently, the Veris pH Manager™, a sensor for high-resolution mapping of soil pH at the field scale, has been made commercially available in the US. While driving over the field, soil pH is measured on-the-go directly within the soil by ion-selective antimony electrodes. The aim of this study was to evaluate the Veris pH Manager™ under farming conditions in Germany. Sensor readings were compared with data obtained by standard protocols of soil pH assessment. Experiments took place under different scenarios: (a) controlled tests in the lab, (b) semicontrolled tests on transects in a stop-and-go mode, and (c) tests under practical conditions in the field with the sensor working in its typical on-the-go mode. Accuracy issues, problems, options, and potential benefits of the Veris pH Manager™ were addressed. The tests demonstrated a high degree of linearity between standard laboratory values and sensor readings. Under practical conditions in the field (scenario c), the measure of fit (r²) for the regression between the on-the-go measurements and the reference data was 0.71, 0.63, and 0.84, respectively. Field-specific calibration was necessary to reduce systematic errors. Accuracy of the on-the-go maps was considerably higher compared with the pH maps obtained by following the standard protocols, and the error in calculating lime requirements was reduced by about one half. However, the system showed some weaknesses due to blockage by residual straw and weed roots. If these problems were solved, the on-the-go sensor investigated here could be an efficient alternative to standard sampling protocols as a basis for liming in Germany.
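
    The field-specific calibration described in these two records amounts to regressing reference laboratory pH on the on-the-go readings and applying the fitted line as a correction, with r² as the measure of fit. A minimal sketch with made-up paired values rather than the published measurements:

    ```python
    import numpy as np

    # Hypothetical paired observations: on-the-go sensor pH vs. standard lab pH.
    sensor_ph = np.array([5.2, 5.6, 6.0, 6.3, 6.8, 7.1])
    lab_ph    = np.array([5.5, 5.8, 6.1, 6.5, 6.9, 7.3])

    # Ordinary least-squares line lab = a * sensor + b (the field-specific calibration).
    a, b = np.polyfit(sensor_ph, lab_ph, 1)
    predicted = a * sensor_ph + b

    # Coefficient of determination between calibrated readings and the reference data.
    ss_res = np.sum((lab_ph - predicted) ** 2)
    ss_tot = np.sum((lab_ph - lab_ph.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    print(f"calibration: lab_pH = {a:.2f} * sensor_pH + {b:.2f}, r^2 = {r_squared:.2f}")
    ```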

  4. Lead Sampling Protocols: Why So Many and What Do They Tell You?

    EPA Science Inventory

    Sampling protocols can be broadly categorized based on their intended purpose of 1) Pb regulatory compliance/corrosion control efficacy, 2) Pb plumbing source determination or Pb type identification, and 3) Pb exposure assessment. Choosing the appropriate protocol is crucial to p...

  5. Forensic electrochemistry: indirect electrochemical sensing of the components of the new psychoactive substance "Synthacaine".

    PubMed

    Cumba, Loanda R; Kolliopoulos, Athanasios V; Smith, Jamie P; Thompson, Paul D; Evans, Peter R; Sutcliffe, Oliver B; do Carmo, Devaney R; Banks, Craig E

    2015-08-21

    "Synthacaine" is a New Psychoactive Substance which is, due to its inherent psychoactive properties, reported to imitate the effects of cocaine and is therefore consequently branded as "legal cocaine". The only analytical approach reported to date for the sensing of "Synthacaine" is mass spectrometry. In this paper, we explore and evaluate a range of potential analytical techniques for its quantification and potential use in the field screening "Synthacaine" using Raman spectroscopy, presumptive (colour) testing, High Performance Liquid Chromatography (HPLC) and electrochemistry. HPLC analysis of street samples reveals that "Synthacaine" comprises a mixture of methiopropamine (MPA) and 2-aminoindane (2-AI). Raman spectroscopy and presumptive (colour) tests, the Marquis, Mandelin, Simon's and Robadope test, are evaluated towards a potential in-the-field screening approach but are found to not be able to discriminate between the two when they are both present in the same sample, as is the case in the real street samples. We report for the first time a novel indirect electrochemical protocol for the sensing of MPA and 2-AI which is independently validated in street samples with HPLC. This novel electrochemical approach based upon one-shot disposable cost effective screen-printed graphite macroelectrodes holds potential for in-the-field screening for "Synthacaine".

  6. Size characterization by Sedimentation Field Flow Fractionation of silica particles used as food additives.

    PubMed

    Contado, Catia; Ravani, Laura; Passarella, Martina

    2013-07-25

    Four types of SiO2, available on the market as additives in food and personal care products, were size characterized using Sedimentation Field Flow Fractionation (SdFFF), SEM, TEM, and Photon Correlation Spectroscopy (PCS). The synergistic use of the different analytical techniques made it possible, for some samples, to confirm the presence of primary nanoparticles (10 nm) organized in clusters or aggregates of different dimensions and, for others, to discover that the available information is incomplete, particularly regarding the presence of small particles. A protocol to extract the silica particles from a simple food matrix was set up by enriching (0.25%, w/w) a nearly silica-free instant barley coffee powder with a known SiO2 sample. The SdFFF technique, in conjunction with SEM observations, made it possible to identify the added SiO2 particles and verify the new particle size distribution. The SiO2 content of different powdered foodstuffs was determined by graphite furnace atomic absorption spectroscopy (GFAAS); the concentrations ranged between 0.006 and 0.35% (w/w). The protocol to isolate the silica particles was then applied to the most SiO2-rich commercial products, and the derived suspensions were separated by SdFFF; SEM and TEM observations supported the size analyses, while GFAAS determinations on collected fractions permitted element identification. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Water-quality, biological, and habitat assessment of the Boeuf River Basin, southeastern Arkansas, 1994-96

    USGS Publications Warehouse

    Barks, C. Shane; Petersen, James C.; Usrey, Faron D.

    2002-01-01

    Water-quality and biological samples were collected at several sites in the Boeuf River Basin between November 1994 and December 1996. Water-quality and benthic macroinvertebrate community samples were collected and habitat was measured once at 25 ambient monitoring sites during periods of seasonal low flow. Water-quality storm-runoff samples were collected during 11 storm events at two sites (one draining a cotton field and one draining a forested area). Water-quality samples were collected at one site during the draining of a catfish pond. Water-quality samples from the 25 ambient sites indicate that streams in the Boeuf River Basin typically are turbid and nutrient enriched in late fall during periods of relatively low flow. Most suspended solids concentrations ranged from about 50 to 200 milligrams per liter (mg/L), most total nitrogen concentrations ranged from about 1.1 to 1.8 mg/L, and most total phosphorus concentrations ranged from about 0.25 to 0.40 mg/L. Suspended solids, total nitrogen, total ammonia plus organic nitrogen, total phosphorus, and dissolved orthophosphorus concentrations from samples collected during storm events were typically higher at the cotton field site than at the forested site. Estimated annual yields of suspended solids, nitrogen, and phosphorus were substantially higher from the cotton field than from the forested area. Dissolved chloride concentrations typically were higher at the forested site than from the cotton field site. Typically, the suspended solids and nutrient concentrations from the 25 ambient sites were lower than concentrations in runoff from the cotton field but higher than concentrations in runoff from the forest area. Concentrations of sulfate, chloride, suspended solids, and some nutrients in samples from the catfish pond generally were greater than concentrations in samples from other sites. Total phosphorus, orthophosphorus, and fecal coliform bacteria concentrations from the catfish pond generally were lower than concentrations in samples from other sites. Biological condition scores calculated using macroinvertebrate samples and U.S. Environmental Protection Agency Rapid Bioassessment Protocol II indicated that most of the 25 ambient sites would be in the 'moderately impaired' category. However, substantial uncertainty exists in this rating because bioassessment data were compared with data from a reference site outside of the Boeuf River Basin sampled using different methods. Several metrics indicated that communities at most of the ambient sites are composed of more tolerant macroinvertebrates than the community at the reference site. Habitat assessments (using Rapid Bioassessment Protocol II) indicated the reference site outside the Boeuf River Basin had better habitat than the ambient sites. Physical habitat scores for the 25 ambient sites indicated that most ambient sites had poor bottom substrate cover, embeddedness values, and flow and had poor to fair habitat related to most other factors. Most habitat factors at the reference site were considered good to excellent. Part of the variation in biological condition scores was explained by physical habitat scores and concentrations of suspended solids and dissolved oxygen. However, a considerable amount of variability in biological condition scores is not explained by these factors.

  8. Zoom Reconstruction Tool: Evaluation of Image Quality and Influence on the Diagnosis of Root Fracture.

    PubMed

    Queiroz, Polyane Mazucatto; Santaella, Gustavo Machado; Capelozza, Ana Lúcia Alvares; Rosalen, Pedro Luiz; Freitas, Deborah Queiroz; Haiter-Neto, Francisco

    2018-04-01

    This study evaluated the image quality and the diagnosis of root fractures when using the Zoom Reconstruction tool (J Morita, Kyoto, Japan). A utility wax phantom with a metal sample inside was used for objective evaluation, and a mandible with 27 single-rooted teeth (with and without obturation and with and without vertical or horizontal fractures) was used for diagnostic evaluation. The images were acquired in 3 protocols: protocol 1, field of view (FOV) of 4 × 4 cm and a voxel size of 0.08 mm; protocol 2, FOV of 10 × 10 cm and a voxel size of 0.2 mm; and protocol 3, Zoom Reconstruction of images from protocol 2 (FOV of 4 × 4 cm and a voxel size of 0.08 mm). The objective evaluation was achieved by measuring the image noise, and the diagnosis of fractures was performed by 3 evaluators. The area under the receiver operating characteristic curve was used to calculate accuracy, and analysis of variance compared the accuracy and image quality of the protocols. Regarding quality, protocol 1 was superior to protocol 2 (P < .0001) and Zoom Reconstruction (P < .0001). Additionally, images of protocol 2 presented less noise than the Zoom Reconstruction image (P < .0001); however, for diagnosis, Zoom Reconstruction was superior in relation to protocol 2 (P = .011) and did not differ from protocol 1 (P = .228) for the diagnosis of a vertical root fracture in filled teeth. The Zoom Reconstruction tool allows better accuracy for vertical root fracture detection in filled teeth, making it possible to obtain a higher-resolution image from a lower-resolution examination without having to expose the patient to more radiation. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
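
    The diagnostic comparison above rests on the area under the receiver operating characteristic curve computed from the evaluators' ratings. A minimal sketch of that calculation via the Mann-Whitney relationship, using made-up confidence scores rather than the study's ratings:

    ```python
    import numpy as np

    # Hypothetical 5-point confidence scores for "fracture present" and the ground truth;
    # illustrative values only, not the study's observer data.
    scores = np.array([1, 2, 2, 3, 4, 5, 5, 4, 3, 1, 2, 5])
    truth  = np.array([0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1])  # 1 = tooth with a fracture

    pos = scores[truth == 1]
    neg = scores[truth == 0]

    # AUC equals the probability that a random positive case outscores a random
    # negative case, counting ties as one half (the Mann-Whitney U relationship).
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    auc = (greater + 0.5 * ties) / (len(pos) * len(neg))
    print(f"AUC = {auc:.3f}")
    ```

    An AUC computed this way for each acquisition protocol can then be compared across protocols, as the abstract describes, with an analysis of variance.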

  9. Validation of an antibody-based biosensor for rapid quantification of 2,4,6-trinitrotoluene (TNT) contamination in ground water and river water.

    PubMed

    Bromage, Erin S; Vadas, George G; Harvey, Ellen; Unger, Michael A; Kaattari, Stephen L

    2007-10-15

    Nitroaromatics are common pollutants of soil and groundwater at military installations because of their manufacture, storage, and use at these sites. Long-term monitoring of these pollutants comprises a significant percentage of restoration costs. Further, remediation activities often have to be delayed while the samples are processed via traditional chemical assessment protocols. Here we describe a rapid (<5 min), cost-effective, accurate method using a KinExA Inline Biosensor for monitoring of 2,4,6-trinitrotoluene (TNT) in field water samples. The biosensor, which is based on KinExA technology, estimated the concentration of TNT in double-blind comparisons with accuracy similar to that of traditional high-performance liquid chromatography (HPLC). In the assessment of field samples, the biosensor accurately predicted the concentration of TNT over the range of 1-30,000 microg/L when compared to either HPLC or quantitative gas chromatography-mass spectrometry (GC-MS). Various pre-assessment techniques were explored to examine whether field samples could be assessed untreated, without the removal of particulates or the use of solvents. In most cases, the KinExA Inline Biosensor gave a uniform assessment of TNT concentration independent of pretreatment method. This indicates that this sensor possesses significant promise for rapid, on-site assessment of TNT pollution in environmental water samples.

  10. LPS-induced microvascular leukocytosis can be assessed by blue-field entoptic phenomenon.

    PubMed

    Kolodjaschna, Julia; Berisha, Fatmire; Lung, Solveig; Schaller, Georg; Polska, Elzbieta; Jilma, Bernd; Wolzt, Michael; Schmetterer, Leopold

    2004-08-01

    Administration of low doses of Escherichia coli endotoxin [a lipopolysaccharide (LPS)] to humans enables the study of inflammatory mechanisms. The purpose of the present study was to investigate whether the blue-field entoptic technique may be used to quantify the increase in circulating leukocytes in the ocular microvasculature after LPS infusion. In addition, combined laser Doppler velocimetry and retinal vessel size measurement were used to study red blood cell movement. Twelve healthy male volunteers received 20 IU/kg iv LPS as a bolus infusion. Outcome parameters were measured at baseline and 4 h after LPS administration. In the first protocol (n = 6 subjects), ocular hemodynamic effects were assessed with the blue-field entoptic technique, the retinal vessel analyzer, and laser Doppler velocimetry. In the second protocol (n = 6 subjects), white blood cell (WBC) counts from peripheral blood samples and blue-field entoptic technique measurements were performed. LPS caused peripheral blood leukocytosis and increased WBC density in ocular microvessels (by 49%; P = 0.036) but did not change WBC velocity. In addition, retinal venous diameter was increased (by 9%; P = 0.008), but red blood cell velocity remained unchanged. The LPS-induced changes in retinal WBC density and leukocyte counts were significantly correlated (r = 0.87). The present study indicates that the blue-field entoptic technique can be used to assess microvascular leukocyte recruitment in vivo. In addition, our data indicate retinal venous dilation in response to endotoxin.

  11. Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.

    PubMed

    Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic

    2015-08-01

    This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility became insignificant. Gy's sampling theory and practice was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.
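
    The theoretical relative variances referred to above come from Gy's sampling theory, in which the fundamental sampling error scales with the cube of the top particle size and the inverse of the sample mass. Below is a minimal sketch of the commonly quoted form of the formula; the composition, shape, granulometric, and liberation factors used here are illustrative assumptions, not the values from the study:

    ```python
    def gy_relative_variance(d_cm, sample_mass_g, lot_mass_g,
                             c_composition, f_shape=0.5, g_granulometric=0.25,
                             l_liberation=1.0):
        """
        Relative variance of the fundamental sampling error in the form commonly
        attributed to Gy: sigma^2 = c * f * g * l * d^3 * (1/Ms - 1/ML),
        with d the top particle size (cm) and masses in grams.
        All parameter values passed below are illustrative assumptions.
        """
        return (c_composition * f_shape * g_granulometric * l_liberation
                * d_cm ** 3 * (1.0 / sample_mass_g - 1.0 / lot_mass_g))

    # Unground (d ~ 0.2 cm, i.e. 2 mm) versus ground (d ~ 0.0075 cm, i.e. 75 um)
    # material, taking a 10 g subsample from a 1 kg lot.
    for label, d in [("unground", 0.2), ("ground", 0.0075)]:
        var = gy_relative_variance(d, sample_mass_g=10.0, lot_mass_g=1000.0, c_composition=50.0)
        print(f"{label}: relative std ~ {var ** 0.5 * 100:.2f} %")
    ```

    With these assumed parameters, grinding from a 2 mm to a 75 µm top size drops the predicted relative standard deviation of a 10 g subsample from several percent to well under a tenth of a percent, which is qualitatively consistent with the study's finding that grinding dominates the reproducibility of mass reduction.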

  12. Non-invasive surveillance for Plasmodium in reservoir macaque species.

    PubMed

    Siregar, Josephine E; Faust, Christina L; Murdiyarso, Lydia S; Rosmanah, Lis; Saepuloh, Uus; Dobson, Andrew P; Iskandriati, Diah

    2015-10-12

    Primates are important reservoirs for human diseases, but their infection status and disease dynamics are difficult to track in the wild. Within the last decade, a macaque malaria, Plasmodium knowlesi, has caused disease in hundreds of humans in Southeast Asia. In order to track cases and understand zoonotic risk, it is imperative to be able to quantify infection status in reservoir macaque species. In this study, protocols for the collection of non-invasive samples and isolation of malaria parasites from naturally infected macaques are optimized. Paired faecal and blood samples from 60 Macaca fascicularis and four Macaca nemestrina were collected. All animals came from Sumatra or Java and were housed in semi-captive breeding colonies around West Java. DNA was extracted from samples using a modified protocol. Nested polymerase chain reactions (PCR) were run to detect Plasmodium using primers targeting mitochondrial DNA. Sensitivity of screening faecal samples for Plasmodium was compared to other studies using Kruskal Wallis tests and logistic regression models. The best primer set was 96.7 % (95 % confidence intervals (CI): 83.3-99.4 %) sensitive for detecting Plasmodium in faecal samples of naturally infected macaques (n = 30). This is the first study to produce definitive estimates of Plasmodium sensitivity and specificity in faecal samples from naturally infected hosts. The sensitivity was significantly higher than some other studies involving wild primates. Faecal samples can be used for detection of malaria infection in field surveys of macaques, even when there are no parasites visible in thin blood smears. Repeating samples from individuals will improve inferences of the epidemiology of malaria in wild primates.
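
    The reported sensitivity of 96.7 % with n = 30 corresponds to detection in 29 of 30 infected animals, and the quoted 83.3-99.4 % interval is reproduced by a Wilson score interval for a binomial proportion. A small sketch of that calculation:

    ```python
    from math import sqrt

    def wilson_ci(successes, n, z=1.96):
        """Wilson score interval for a binomial proportion (95% for z = 1.96)."""
        p = successes / n
        denom = 1.0 + z * z / n
        centre = (p + z * z / (2 * n)) / denom
        half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
        return centre - half, centre + half

    # 29 of 30 naturally infected macaques positive by the faecal PCR:
    sens = 29 / 30
    lo, hi = wilson_ci(29, 30)
    print(f"sensitivity = {sens:.1%}, 95% CI = {lo:.1%} - {hi:.1%}")  # ~96.7%, 83.3% - 99.4%
    ```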

  13. Lessons Learned for Geologic Data Collection and Sampling: Insights from the Desert RATS 2010 Geologist Crewmembers

    NASA Technical Reports Server (NTRS)

    Hurtado, J. M., Jr.; Bleacher, J. E.; Rice, J.; Young, K.; Garry, W. B.; Eppler, D.

    2011-01-01

    Since 1997, Desert Research and Technology Studies (D-RATS) has conducted hardware and operations tests in the Arizona desert that advance human and robotic planetary exploration capabilities. D-RATS 2010 (8/31-9/13) simulated geologic traverses through a terrain of cinder cones, lava flows, and underlying sedimentary units using a pair of crewed rovers and extravehicular activities (EVAs) for geologic fieldwork. There were two sets of crews, each consisting of an engineer/commander and an experienced field geologist drawn from the academic community. A major objective of D-RATS was to examine the functions of a science support team, the roles of geologist crewmembers, and the protocols, tools, and technologies needed for effective data collection and sample documentation. Solutions to these problems must consider how terrestrial field geology practices are adapted to geologic fieldwork during EVAs.

  14. The room temperature preservation of filtered environmental DNA samples and assimilation into a phenol–chloroform–isoamyl alcohol DNA extraction

    PubMed Central

    Renshaw, Mark A; Olds, Brett P; Jerde, Christopher L; McVeigh, Margaret M; Lodge, David M

    2015-01-01

    Current research targeting filtered macrobial environmental DNA (eDNA) often relies upon cold ambient temperatures at various stages, including the transport of water samples from the field to the laboratory and the storage of water and/or filtered samples in the laboratory. This poses practical limitations for field collections in locations where refrigeration and frozen storage is difficult or where samples must be transported long distances for further processing and screening. This study demonstrates the successful preservation of eDNA at room temperature (20 °C) in two lysis buffers, CTAB and Longmire's, over a 2-week period of time. Moreover, the preserved eDNA samples were seamlessly integrated into a phenol–chloroform–isoamyl alcohol (PCI) DNA extraction protocol. The successful application of the eDNA extraction to multiple filter membrane types suggests the methods evaluated here may be broadly applied in future eDNA research. Our results also suggest that for many kinds of studies recently reported on macrobial eDNA, detection probabilities could have been increased, and at a lower cost, by utilizing the Longmire's preservation buffer with a PCI DNA extraction. PMID:24834966

  15. Refinement of Protocols for Measuring the Apparent Optical Properties of Seawater. Chapter 8

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B.; Zibordi, Giuseppe; Berthon, Jean-Francois; Nirek, Andre; Antoine, David

    2003-01-01

    Ocean color satellite missions, like the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) or the Moderate Resolution Imaging Spectroradiometer (MODIS) projects, are tasked with acquiring a global ocean color data set, validating and monitoring the accuracy and quality of the data, processing the radiometric data into geophysical units using a set of atmospheric and bio-optical algorithms, and distributing the final products to the scientific community. The long-standing requirement of the SeaWiFS Project, for example, is to produce spectral water-leaving radiances, LW(lambda), to within 5% absolute (lambda denotes wavelength) and chlorophyll a concentrations to within 35% (Hooker and Esaias 1993), and most ocean color sensors have the same or similar requirements. Although a diverse set of activities are required to ensure the accuracy requirements are met (Hooker and McClain 2000), the perspective here is with field observations. The accurate determination of upper ocean apparent optical properties (AOPs) is essential for the vicarious calibration of ocean color data and the validation of the derived data products, because the sea-truth measurements are used to evaluate the satellite observations (Hooker and McClain 2000). The uncertainties with in situ AOP measurements have various sources: a) the sampling procedures used in the field, including the environmental conditions encountered; b) the absolute characterization of the radiometers in the laboratory; c) the conversion of the light signals to geophysical units in a processing scheme; and d) the stability of the radiometers in the harsh environment they are subjected to during transport and use. Assuming ideal environmental conditions, so this aspect can be neglected, the SeaWiFS ground-truth uncertainty budget can only be satisfied if each uncertainty is on the order of 1-2%, or what is generally referred to as 1% radiometry. In recent years, progress has been made in estimating the magnitude of some of these uncertainties and in defining procedures for minimizing them. For the SeaWiFS Project, the first step was to convene a workshop to draft the SeaWiFS Ocean Optics Protocols (hereafter referred to as the Protocols). The Protocols initially adhered to the Joint Global Ocean Flux Study (JGOFS) sampling procedures (JGOFS 1991) and defined the standards for optical measurements to be used in SeaWiFS calibration and validation activities (Mueller and Austin 1992). Over time, the Protocols were revised (Mueller and Austin 1995) and then updated on essentially an annual basis (Mueller 2000, 2002, and 2003) as part of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) project.
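
    The statement that the overall 5% radiance requirement can only be met when every contributing uncertainty is held to roughly 1-2% follows from combining independent error sources in quadrature. The figures below are illustrative, one per source category named in the text, and are not the actual SeaWiFS budget values:

    ```python
    from math import sqrt

    # Illustrative independent uncertainty sources (percent), one per category named
    # in the text: field sampling, lab characterization, processing, radiometer stability.
    sources = {"field sampling": 2.0, "characterization": 1.5, "processing": 1.0, "stability": 2.0}

    # Independent errors add in quadrature (root-sum-square).
    combined = sqrt(sum(u ** 2 for u in sources.values()))
    print(f"combined uncertainty ~ {combined:.1f} %  (requirement: 5 %)")
    ```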

  16. Sticky trap and stem-tap sampling protocols for the Asian citrus psyllid (Hemiptera: Psyllidae)

    USDA-ARS?s Scientific Manuscript database

    Sampling statistics were obtained to develop a sampling protocol for estimating numbers of adult Diaphorina citri in citrus using two different sampling methods: yellow sticky traps and stem–tap samples. A 4.0 ha block of mature orange trees was stratified into ten 0.4 ha strata and sampled using...

  17. Evaluation of hemifield sector analysis protocol in multifocal visual evoked potential objective perimetry for the diagnosis and early detection of glaucomatous field defects.

    PubMed

    Mousa, Mohammad F; Cubbidge, Robert P; Al-Mansouri, Fatima; Bener, Abdulbari

    2014-02-01

    Multifocal visual evoked potential (mfVEP) is a newly introduced method used for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects, with results comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The differences between superior and inferior hemifields were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in 5 / 11 sectors (t-test, p < 0.01), and not statistically significant in most sectors of the normal group (1 / 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
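
    The core of the HSA protocol is a sector-by-sector comparison of signal-to-noise ratios across the horizontal midline. A minimal sketch of that comparison with simulated per-sector values (the numbers are illustrative, not the study data), using a paired t-test for each of the 11 mirrored sector pairs as in the analysis above; requires NumPy and SciPy:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Simulated mfVEP signal-to-noise ratios for 11 superior and 11 mirrored inferior
    # sectors in 20 hypothetical eyes (rows = eyes, columns = sectors).
    n_eyes, n_sectors = 20, 11
    superior = rng.normal(loc=2.0, scale=0.4, size=(n_eyes, n_sectors))
    inferior = rng.normal(loc=1.6, scale=0.4, size=(n_eyes, n_sectors))  # depressed hemifield

    # Paired t-test of the superior-inferior asymmetry, sector by sector.
    for sector in range(n_sectors):
        t, p = stats.ttest_rel(superior[:, sector], inferior[:, sector])
        print(f"sector {sector + 1:2d}: t = {t:5.2f}, p = {p:.4f}")
    ```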

  18. Evaluation of Hemifield Sector Analysis Protocol in Multifocal Visual Evoked Potential Objective Perimetry for the Diagnosis and Early Detection of Glaucomatous Field Defects

    PubMed Central

    Mousa, Mohammad F.; Cubbidge, Robert P.; Al-Mansouri, Fatima

    2014-01-01

    Purpose Multifocal visual evoked potential (mfVEP) is a newly introduced method used for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects, with results comparable to standard automated perimetry (SAP) visual field assessment, while others were less informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. Methods Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes), and glaucoma suspect patients (38 eyes). All subjects underwent two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test in one session. Analysis of the mfVEP results was done using the new analysis protocol, the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. Results Analysis of mfVEP results showed that there was a statistically significant difference between the three groups in the mean signal-to-noise ratio (ANOVA test, p < 0.001 with a 95% confidence interval). The differences between superior and inferior hemifields were statistically significant in the glaucoma patient group in all 11 sectors (t-test, p < 0.001), partially significant in 5 / 11 sectors (t-test, p < 0.01), and not statistically significant in most sectors of the normal group (1 / 11 sectors was significant, t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. Conclusions The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. PMID:24511212

  19. Evaluation of the effects of footwear hygiene protocols on nonspecific bacterial contamination of floor surfaces in an equine hospital.

    PubMed

    Stockton, Kelly A; Morley, Paul S; Hyatt, Doreene R; Burgess, Brandy A; Patterson, Gage; Dunowska, Magda; Lee, David E

    2006-04-01

    To evaluate the effects of footwear hygiene protocols on bacterial contamination of floor surfaces in an equine hospital. Field trial. Footwear hygiene protocols evaluated included use of rubber overboots with footbaths and footmats containing a quaternary ammonium disinfectant, rubber overboots with footbaths and footmats containing a peroxygen disinfectant, and no restrictions on footwear type but mandatory use of footbaths and footmats containing a peroxygen disinfectant. Nonspecific aerobic bacterial counts were determined via 2 procedures for sample collection and bacterial enumeration (contact plates vs swabbing combined with use of spread plates), and the effects of each footwear hygiene protocol were compared. There were no consistent findings suggesting that any of the protocols were associated with differences in numbers of bacteria recovered from floor surfaces. Although there were detectable differences in numbers of bacteria recovered in association with different footwear hygiene protocols, differences in least square mean bacterial counts did not appear to be clinically relevant (ie, were < 1 log10). Although cleaning and disinfection of footwear are important aids in reducing the risk of nosocomial transmission of infectious agents in veterinary hospitals, the numbers of aerobic bacteria recovered from floor surfaces were not affected by use of rubber overboots or the types of disinfectant used in this study. Further study is warranted to evaluate the usefulness of footwear hygiene practices relative to their efficacy for reducing transmission of specific pathogens or decreasing nosocomial disease risk.

  20. Landbird Monitoring Protocol for National Parks in the North Coast and Cascades Network

    USGS Publications Warehouse

    Siegel, Rodney B.; Wilkerson, Robert L.; Jenkins, Kurt J.; Kuntz, Robert C.; Boetsch, John R.; Schaberl, James P.; Happe, Patricia J.

    2007-01-01

    This protocol narrative outlines the rationale, sampling design and methods for monitoring landbirds in the North Coast and Cascades Network (NCCN) during the breeding season. The NCCN, one of 32 networks of parks in the National Park System, comprises seven national park units in the Pacific Northwest, including three large, mountainous, natural area parks (Mount Rainier [MORA] and Olympic [OLYM] National Parks, North Cascades National Park Service Complex [NOCA]), and four small historic cultural parks (Ebey's Landing National Historical Reserve [EBLA], Lewis and Clark National Historical Park [LEWI], Fort Vancouver National Historical Park [FOVA], and San Juan Island National Historical Park [SAJH]). The protocol reflects decisions made by the NCCN avian monitoring group, which includes NPS representatives from each of the large parks in the Network as well as personnel from the U.S. Geological Survey Forest and Rangeland Ecosystem Science Center (USGS-FRESC) Olympic Field Station, and The Institute for Bird Populations, at meetings held between 2000 (Siegel and Kuntz, 2000) and 2005. The protocol narrative describes the monitoring program in relatively broad terms, and its structure and content adhere to the outline and recommendations developed by Oakley and others (2003) and adopted by NPS. Finer details of the methodology are addressed in a set of standard operating procedures (SOPs) that accompany the protocol narrative. We also provide appendixes containing additional supporting materials that do not clearly belong in either the protocol narrative or the standard operating procedures.

  1. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  2. High-frequency DOC and nitrate measurements provide new insights into their export and their relationships to rainfall-runoff processes

    NASA Astrophysics Data System (ADS)

    Schwab, Michael; Klaus, Julian; Pfister, Laurent; Weiler, Markus

    2015-04-01

    Over the past decades, stream sampling protocols for environmental tracers were often limited by logistical and technological constraints. Long-term sampling programs would typically rely on weekly sampling campaigns, while high-frequency sampling would remain restricted to a few days or hours at best. We stipulate that the currently predominant sampling protocols are too coarse to capture and understand the full amplitude of rainfall-runoff processes and their relation to water quality fluctuations. Weekly sampling protocols are not suited to providing insights into the hydrological system during high-flow conditions. Likewise, high-frequency measurements of a few isolated events do not capture inter-event variability in contributions and processes. Our working hypothesis is based on the potential of a new generation of field-deployable instruments for measuring environmental tracers at high temporal frequencies over an extended period. With this new generation of instruments we expect to gain new insights into rainfall-runoff dynamics, both at intra- and inter-event scales. Here, we present the results of one year of DOC and nitrate measurements with the field-deployable UV-Vis spectrometer spectro::lyser (scan Messtechnik GmbH). The instrument measures the absorption spectrum from 220 to 720 nm in situ and at high frequencies and derives DOC and nitrate concentrations. The measurements were carried out at 15-minute intervals in the Weierbach catchment (0.47 km²) in Luxembourg. This fully forested catchment is characterized by cambisol soils and fractured schist as underlying bedrock. The time series of DOC and nitrate give insights into the high-frequency dynamics of stream water. Peaks in DOC concentrations are closely linked to discharge peaks that occur during or right after a rainfall event. These first discharge peaks can be linked to fast near-surface runoff processes and are responsible for a remarkable amount of DOC export. A special characteristic of the Weierbach catchment is the delayed second discharge peak that occurs a few days after a rainfall event. Nitrate concentrations follow this second peak. We assume that this delayed response originates from subsurface or upper groundwater flow carrying nitrate-enriched water. On an inter-event scale, during low-flow/baseflow conditions, we observe interesting diurnal patterns of both DOC and nitrate concentrations. Overall, the long-term high-frequency measurements of DOC and nitrate provide the opportunity to separate different rainfall-runoff processes, to link the amount of DOC and nitrate export to them, and to quantify the overall relevance of the different processes.

  3. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 4; Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols; Revised

    NASA Technical Reports Server (NTRS)

    Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.

  4. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4, Volume IV: Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols

    NASA Technical Reports Server (NTRS)

    Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.

  5. Protocol for determining bull trout presence

    USGS Publications Warehouse

    Peterson, James; Dunham, Jason B.; Howell, Philip; Thurow, Russell; Bonar, Scott

    2002-01-01

    The Western Division of the American Fisheries Society was requested to develop protocols for determining presence/absence and potential habitat suitability for bull trout. The general approach adopted is similar to the process for the marbled murrelet, whereby interim guidelines are initially used, and the protocols are subsequently refined as data are collected. Current data were considered inadequate to precisely identify suitable habitat but could be useful in stratifying sampling units for presence/absence surveys. The presence/absence protocol builds on previous approaches (Hillman and Platts 1993; Bonar et al. 1997), except it uses the variation in observed bull trout densities instead of a minimum threshold density and adjusts for measured differences in sampling efficiency due to gear types and habitat characteristics. The protocol consists of: 1. recommended sample sizes with 80% and 95% detection probabilities for juvenile and resident adult bull trout for day and night snorkeling and electrofishing adjusted for varying habitat characteristics for 50m and 100m sampling units, 2. sampling design considerations, including possible habitat characteristics for stratification, 3. habitat variables to be measured in the sampling units, and 4. guidelines for training sampling crews. Criteria for habitat strata consist of coarse, watershed-scale characteristics (e.g., mean annual air temperature) and fine-scale, reach and habitat-specific features (e.g., water temperature, channel width). The protocols will be revised in the future using data from ongoing presence/absence surveys, additional research on sampling efficiencies, and development of models of habitat/species occurrence.
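
    The recommended sample sizes for 80% and 95% detection probabilities in item 1 follow from the probability of missing fish in every sampled unit: with independent units and a per-unit detection probability p, sampling n units detects presence with probability 1 − (1 − p)^n. A small sketch; the per-unit probability used here is illustrative, not a value from the protocol:

    ```python
    from math import ceil, log

    def units_needed(per_unit_detection, target_probability):
        """Smallest n with 1 - (1 - p)^n >= target, assuming independent sampling units."""
        return ceil(log(1.0 - target_probability) / log(1.0 - per_unit_detection))

    # Illustrative per-unit detection probability for, e.g., day snorkeling in one habitat class.
    p = 0.25
    for target in (0.80, 0.95):
        print(f"target {target:.0%}: {units_needed(p, target)} sampling units")
    ```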

  6. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  7. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  8. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...

  9. Mars Sample Handling Protocol Workshop Series: Workshop 2

    NASA Technical Reports Server (NTRS)

    Rummel, John D. (Editor); Acevedo, Sara E. (Editor); Kovacs, Gregory T. A. (Editor); Race, Margaret S. (Editor); DeVincenzi, Donald L. (Technical Monitor)

    2001-01-01

    Numerous NASA reports and studies have identified Planetary Protection (PP) as an important part of any Mars sample return mission. The mission architecture, hardware, on-board experiments, and related activities must be designed in ways that prevent both forward- and back-contamination and also ensure maximal return of scientific information. A key element of any PP effort for sample return missions is the development of guidelines for containment and analysis of returned sample(s). As part of that effort, NASA and the Space Studies Board (SSB) of the National Research Council (NRC) have each assembled experts from a wide range of scientific fields to identify and discuss issues pertinent to sample return. In 1997, the SSB released its report on recommendations for handling and testing of returned Mars samples. In particular, the NRC recommended that: a) samples returned from Mars by spacecraft should be contained and treated as potentially hazardous until proven otherwise, and b) rigorous physical, chemical, and biological analyses [should] confirm that there is no indication of the presence of any exogenous biological entity. Also in 1997, a Mars Sample Quarantine Protocol workshop was convened at NASA Ames Research Center to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent 'uncontrolled release' of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. In 1999, a study by NASA's Mars Sample Handling and Requirements Panel (MSHARP) addressed three other specific areas in anticipation of returning samples from Mars: 1) sample collection and transport back to Earth; 2) certification of the samples as non-hazardous; and 3) sample receiving, curation, and distribution. To further refine the requirements for sample hazard testing and the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened an additional series of workshops beginning in March 2000. The overall objective of these workshops was to develop comprehensive protocols to assess whether the returned materials contain any biological hazards, and to safeguard the purity of the samples from possible terrestrial contamination. This document is the report of the second Workshop in the Series. The information herein will ultimately be integrated into a final document reporting the proceedings of the entire Workshop Series along with additional information and recommendations.

  10. Collaboration During the NASA ABoVE Airborne SAR Campaign: Sampling Strategies Used by NGEE Arctic and Other Partners in Alaska and Western Canada

    NASA Astrophysics Data System (ADS)

    Wullschleger, S. D.; Charsley-Groffman, L.; Baltzer, J. L.; Berg, A. A.; Griffith, P. C.; Jafarov, E. E.; Marsh, P.; Miller, C. E.; Schaefer, K. M.; Siqueira, P.; Wilson, C. J.; Kasischke, E. S.

    2017-12-01

    There is considerable interest in using L- and P-band Synthetic Aperture Radar (SAR) data to monitor variations in aboveground woody biomass, soil moisture, and permafrost conditions in high-latitude ecosystems. Such information is useful for quantifying spatial heterogeneity in surface and subsurface properties, and for model development and evaluation. To conduct these studies, it is desirable that field studies share a common sampling strategy so that the data from multiple sites can be combined and used to analyze variations in conditions across different landscape geomorphologies and vegetation types. In 2015, NASA launched the decade-long Arctic-Boreal Vulnerability Experiment (ABoVE) to study the sensitivity and resilience of these ecosystems to disturbance and environmental change. NASA is able to leverage its remote sensing strengths to collect airborne and satellite observations to capture important ecosystem properties and dynamics across large spatial scales. A critical component of this effort includes collection of ground-based data that can be used to analyze, calibrate, and validate remote sensing products. ABoVE researchers at a large number of sites located in important Arctic and boreal ecosystems in Alaska and western Canada are following common design protocols and strategies for measuring soil moisture, thaw depth, biomass, and wetland inundation. Here we elaborate on those sampling strategies as used in the 2017 summer SAR campaign and address the sampling design and measurement protocols for supporting the ABoVE aerial activities. Plot size, transect length, and the distribution of replicates across the landscape were chosen systematically to allow investigators to optimally sample a site for soil moisture, thaw depth, and organic layer thickness. Specific examples and data sets are described for the Department of Energy's Next-Generation Ecosystem Experiments (NGEE Arctic) project field sites near Nome and Barrow, Alaska. Future airborne and satellite campaigns will be conducted by the NASA ABoVE team, and additional collaboration is encouraged.

  11. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    USGS Publications Warehouse

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) to respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this protocol expands upon and reconciles differences in the sample collection protocols outlined in the USGS “National Field Manual for the Collection of Water-Quality Data,” which should be used in conjunction with this SOP. A new data entry and sample tracking system is also presented to ensure all relevant data and metadata are gathered at the sample locations and in the laboratories.

  12. Quarantine and protocol

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The purpose of the Orbiting Quarantine Facility is to provide maximum protection of the terrestrial biosphere by ensuring that the returned Martian samples are safe to bring to Earth. The protocol designed to detect the presence of biologically active agents in the Martian soil is described. The protocol determines one of two things about the sample: (1) that it is free from nonterrestrial life forms and can be sent to a terrestrial containment facility where extensive chemical, biochemical, geological, and physical investigations can be conducted; or (2) that it exhibits "biological effects" of the type that dictate second order testing. The quarantine protocol is designed to be conducted on a small portion of the returned sample, leaving the bulk of the sample undisturbed for study on Earth.

  13. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466
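
    The coefficient-of-variation comparison quoted above (13% for SAMBED versus 16% for manual extractions) can be reproduced for any set of replicate metabolite measurements with a few lines of Python; the replicate values below are invented placeholders, not data from the study.

```python
import numpy as np

def percent_cv(replicates):
    """Coefficient of variation (%) = 100 * standard deviation / mean."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

# Hypothetical peak areas for one metabolite across replicate extractions.
sambed = [1.02e6, 1.10e6, 0.95e6, 1.05e6]
manual = [1.00e6, 1.25e6, 0.88e6, 1.07e6]

print(f"SAMBED CV: {percent_cv(sambed):.1f}%  manual CV: {percent_cv(manual):.1f}%")
```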

  14. A novel PFIB sample preparation protocol for correlative 3D X-ray CNT and FIB-TOF-SIMS tomography.

    PubMed

    Priebe, Agnieszka; Audoit, Guillaume; Barnes, Jean-Paul

    2017-02-01

    We present a novel sample preparation method that allows correlative 3D X-ray Computed Nano-Tomography (CNT) and Focused Ion Beam Time-Of-Flight Secondary Ion Mass Spectrometry (FIB-TOF-SIMS) tomography to be performed on the same sample. In addition, our invention ensures that samples stay structurally and chemically unmodified between the subsequent experiments. The main principle is based on modifying the topography of the X-ray CNT experimental setup before FIB-TOF-SIMS measurements by incorporating a square washer around the sample. This affects the distribution of extraction field lines and therefore influences the trajectories of secondary ions, which are now guided more efficiently towards the detector. As a result, secondary ion detection is significantly improved and higher, i.e., statistically better, signals are obtained. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Judges' Agreement and Disagreement Patterns When Encoding Verbal Protocols.

    ERIC Educational Resources Information Center

    Schael, Jocelyne; Dionne, Jean-Paul

    The basis of agreement or disagreement among judges/evaluators when applying a coding scheme to concurrent verbal protocols was studied. The sample included 20 university graduates, from varied backgrounds; 10 subjects had and 10 subjects did not have experience in protocol analysis. The total sample was divided into four balanced groups according…

  16. Thermal relaxation and collective dynamics of interacting aerosol-generated hexagonal NiFe2O4 nanoparticles.

    PubMed

    Ortega, D; Kuznetsov, M V; Morozov, Yu G; Belousova, O V; Parkin, I P

    2013-12-28

    This article reports on the magnetic properties of interacting uncoated nickel ferrite (NiFe2O4) nanoparticles synthesized through an aerosol levitation-jet technique. A comprehensive set of samples with different compositions of background gas and metal precursors, as well as applied electric field intensities, has been studied. Nanoparticles prepared under a field of 210 kV m(-1) show moderately high-field irreversibility and shifted hysteresis loops after field-cooling, also exhibiting a joint temperature decrease of the exchange field and coercivity. The appearance of memory effects has been checked using the genuine ZFC protocol and the observed behavior cannot be fully explained in terms of thermal relaxation. Although dipolar interactions prevail, exchange interactions occur to a certain extent within a narrow range of applied fields. The origin of the slow dynamics in the system is found to be given by the interplay of the distribution of energy barriers due to size dispersion and the cooperative dynamics associated with frustrated interactions.

  17. The Effect of Postmastectomy Radiation Therapy on Breast Implants: Material Analysis on Silicone and Polyurethane Prosthesis.

    PubMed

    Lo Torto, Federico; Relucenti, Michela; Familiari, Giuseppe; Vaia, Nicola; Casella, Donato; Matassa, Roberto; Miglietta, Selenia; Marinozzi, Franco; Bini, Fabiano; Fratoddi, Ilaria; Sciubba, Fabio; Cassese, Raffaele; Tombolini, Vincenzo; Ribuffo, Diego

    2018-05-17

    The pathogenic mechanism underlying capsular contracture is still unknown. It is certainly a multifactorial process, resulting from human body reaction, biofilm activation, bacteremic seeding, or silicone exposure. The aim of the present article is to investigate the effect of a hypofractionated radiotherapy protocol (2.66 Gy × 16 sessions) on both silicone and polyurethane breast implants. Silicone and polyurethane implants underwent irradiation according to a hypofractionated radiotherapy protocol for the treatment of breast cancer. After irradiation, implant shells underwent mechanical, chemical, and microstructural evaluation by means of tensile testing, infrared spectroscopy in attenuated total reflectance mode, nuclear magnetic resonance, and field emission scanning electron microscopy. On surface analysis, irradiated silicone samples showed several visible secondary and tertiary blebs. Polyurethane implants showed an open-cell structure that closely resembles a sponge. Morphological observation of struts from the treated polyurethane sample showed a more compact structure, with significantly shorter and thicker struts compared with the untreated sample. The attenuated total reflectance infrared spectra of irradiated and control samples were compared for both the silicone and polyurethane samples. In the case of silicone-based membranes, treated and control specimens showed similar bands, with little difference in the treated one. Nuclear magnetic resonance spectra of the fraction soluble in CDCl3 support these observations. Tensile tests on silicone samples showed a softer behavior of the treated ones. Tensile tests on polyurethane samples showed no significant differences. Polyurethane implants seem to be more resistant to radiotherapy damage, whereas silicone prostheses showed more structural, mechanical, and chemical modifications.

  18. Creating and validating GIS measures of urban design for health research.

    PubMed

    Purciel, Marnie; Neckerman, Kathryn M; Lovasi, Gina S; Quinn, James W; Weiss, Christopher; Bader, Michael D M; Ewing, Reid; Rundle, Andrew

    2009-12-01

    Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design - imageability, enclosure, human scale, transparency, and complexity - created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources.
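
    A minimal sketch of the validation step described above, assuming paired GIS-derived and field-observed scores for each block face are available as arrays; the data here are random placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_block_faces = 588

# Placeholder scores for one urban-design dimension (e.g., enclosure).
observed = rng.normal(size=n_block_faces)
gis_measure = 0.6 * observed + rng.normal(scale=0.8, size=n_block_faces)

r, p = pearsonr(gis_measure, observed)
print(f"GIS vs. field observation: r = {r:.2f}, p = {p:.3g}")
```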

  19. Creating and validating GIS measures of urban design for health research

    PubMed Central

    Purciel, Marnie; Neckerman, Kathryn M.; Lovasi, Gina S.; Quinn, James W.; Weiss, Christopher; Bader, Michael D.M.; Ewing, Reid; Rundle, Andrew

    2012-01-01

    Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design – imageability, enclosure, human scale, transparency, and complexity – created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources. PMID:22956856

  20. Malaria rapid diagnostic test as point-of-care test: study protocol for evaluating the VIKIA Malaria Ag Pf/Pan.

    PubMed

    Kim, Saorin; Nhem, Sina; Dourng, Dany; Ménard, Didier

    2015-03-14

    Malaria rapid diagnostic tests (RDTs) are generally considered as point-of-care tests. However, most of the studies assessing the performance of malaria RDTs are conducted by research teams that are not representative of the classical end-users, who are typically unskilled in traditional laboratory techniques for diagnosing malaria. To evaluate the performance of a malaria RDT by end-users in a malaria-endemic area, a study protocol was designed and the VIKIA Malaria Ag Pf/Pan test, previously evaluated in 2013, was re-evaluated by representative end-users. Twenty end-users with four different profiles in seven communes in Kampot Province (Cambodia) were selected. A set of 20 calibrated aliquots, including negative samples, low positive samples (200 parasites/μL of Plasmodium falciparum and Plasmodium vivax) and high positive samples (2,000 parasites/μL of P. falciparum and P. vivax) was used. Testing was performed directly by the end-users without any practical training on the VIKIA Malaria Ag Pf/Pan kit. All results obtained by the end-users were consistent with the expected results, except for the low positive (200 parasites/μL) P. vivax aliquot (35% of concordant results). No significant difference was observed between the different end-users. End-user interviews evaluating ease-of-use and ease-of-reading of the VIKIA Malaria Ag Pf/Pan kit recorded 159 positive answers and only one negative answer. Out of 20 end-users, only one considered the test was not easy to perform with the support of the quick guide. The data presented in this study clearly demonstrate that the performance of the VIKIA Malaria Ag Pf/Pan test when performed by traditional end-users in field conditions is similar to that obtained by a research team and that this RDT can be considered as a point-of-care tool/assay. Furthermore, the protocol designed for this study could be used systematically in parallel to conventional evaluation studies to determine the performance of malaria RDTs in field conditions.
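
    The headline figure in this kind of evaluation is the per-aliquot concordance rate (the share of end-user readings matching the expected result). A minimal sketch, assuming each end-user's readings are tallied per aliquot type; the counts below are placeholders chosen only to be consistent with the reported rates, not the study data.

```python
def concordance(n_concordant, n_total):
    """Percentage of end-user results matching the expected result."""
    return 100.0 * n_concordant / n_total

# Hypothetical tallies for 20 end-users reading one aliquot of each type.
aliquots = {
    "P. falciparum 2000 p/uL": (20, 20),
    "P. vivax 200 p/uL":       (7, 20),   # low-positive aliquot, poorest agreement
    "negative":                (20, 20),
}

for name, (hits, total) in aliquots.items():
    print(f"{name:28s} {concordance(hits, total):5.1f}% concordant")
```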

  1. Comparative effectiveness of light-microscopic techniques and PCR in detecting Thelohania solenopsae (Microsporidia) infections in red imported fire ants (Solenopsis invicta).

    PubMed

    Milks, Maynard L; Sokolova, Yuliya Y; Isakova, Irina A; Fuxa, James R; Mitchell, Forrest; Snowden, Karen F; Vinson, S Bradleigh

    2004-01-01

    The main goal of this study was to compare the effectiveness of three staining techniques (calcofluor white M2R, Giemsa and modified trichrome), and the polymerase chain reaction (PCR) in detecting the microsporidium Thelohania solenopsae in red imported fire ants (Solenopsis invicta). The effect of the number of ants in a sample on the sensitivity of the staining techniques and the PCR, and the effect of three DNA extraction protocols on the sensitivity of PCR were also examined. In the first protocol, the ants were macerated and the crude homogenate was used immediately in the PCR. In the second protocol, the homogenate was placed on a special membrane (FTA card) that traps DNA, which is subsequently used in the PCR. In the third protocol, the DNA was purified from the homogenate by traditional phenol-chloroform extraction. Except for PCR using FTA cards, the sensitivity (number of samples positive for T. solenopsae) of all detection techniques increased with the number of ants in the sample. Overall, Giemsa was the least sensitive of all detection techniques. Calcofluor was more sensitive than modified trichrome with ants from one site and was equally as sensitive as PCR with crude DNA or a FTA card with ants from both sites. Trichrome staining was equally as sensitive as PCR with a FTA card at both sites, but it was less sensitive than PCR with crude DNA at one site. PCR on FTA cards was less sensitive than PCR with crude DNA for ants from one site but not the other. There was no difference whether crude or phenol-chloroform purified DNA was used as template. In summary, the results of this study show that PCR based on a crude DNA solution is equal to or more sensitive in detecting T. solenopsae than the other detection techniques investigated, and that it can be used as a reliable diagnostic tool for screening field samples of S. invicta for T. solenopsae. Nevertheless, ant smear stained with calcofluor or modified trichrome should be used to buttress findings from PCR.

  2. Thermogravimetric Analysis of Single-Wall Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Arepalli, Sivram; Nikolaev, Pavel; Gorelik, Olga

    2010-01-01

    An improved protocol for thermogravimetric analysis (TGA) of samples of single-wall carbon nanotube (SWCNT) material has been developed to increase the degree of consistency among results so that meaningful comparisons can be made among different samples. This improved TGA protocol is suitable for incorporation into the protocol for characterization of carbon nanotube material. In most cases, TGA of carbon nanotube materials is performed in gas mixtures that contain oxygen at various concentrations. The improved protocol is summarized.

  3. Efficacy and safety of imidacloprid/moxidectin spot-on solution and fenbendazole in the treatment of dogs naturally infected with Angiostrongylus vasorum (Baillet, 1866).

    PubMed

    Willesen, J L; Kristensen, A T; Jensen, A L; Heine, J; Koch, J

    2007-07-20

    A randomized, blinded, controlled multicentre field trial was conducted to evaluate the efficacy and safety of imidacloprid 10%/moxidectin 2.5% spot-on solution and fenbendazole in treating dogs naturally infected with Angiostrongylus vasorum. Dogs were randomly treated either with a single dose of 0.1 ml/kg bodyweight of imidacloprid 10%/moxidectin 2.5% spot-on solution or with 25 mg/kg bodyweight fenbendazole per os for 20 days. The study period was 42 days, with dogs being examined on days 0, 7 and 42. The primary efficacy parameter was the presence of L1 larvae in faecal samples from three consecutive days, evaluated by a Baermann test. Thoracic radiographs taken at each visit served as a paraclinical parameter to support the results of the Baermann test. Twenty-seven dogs in the imidacloprid/moxidectin group and 23 dogs in the fenbendazole group completed the study according to protocol. The efficacies of the two treatment protocols were 85.2% (imidacloprid/moxidectin) and 91.3% (fenbendazole), with no significant difference between treatment groups. On radiographic evaluation, the pulmonary parenchyma showed similar improvement in each group. No serious adverse effects of treatment were recorded: most of the minor adverse effects were gastrointestinal, such as diarrhea (nine dogs), vomiting (eight dogs) and salivation (three dogs). In general, these adverse effects occurred within the first few days after the start of treatment, were of short duration (1-2 days), and required little or no treatment. This prospective study demonstrates that both treatment protocols are efficacious under field conditions, and that treatment of mildly to moderately infected dogs with either of these protocols is safe and yields an excellent prognosis for recovery from the infection.
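
    The statement that the two efficacies (85.2% versus 91.3%) did not differ significantly can be illustrated with a two-by-two test on the per-protocol counts. The sketch below uses Fisher's exact test as one reasonable choice (the abstract does not state which test the authors used) and assumes "efficacy" means the proportion of dogs negative for L1 larvae at the end of the study.

```python
from scipy.stats import fisher_exact

# Per-protocol groups: 27 dogs (imidacloprid/moxidectin), 23 dogs (fenbendazole).
# 85.2% of 27 = 23 successes; 91.3% of 23 = 21 successes.
table = [[23, 27 - 23],   # imidacloprid/moxidectin: success, failure
         [21, 23 - 21]]   # fenbendazole:            success, failure

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")  # p well above 0.05
```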

  4. Purifying, Separating, and Concentrating Cells From a Sample Low in Biomass

    NASA Technical Reports Server (NTRS)

    Benardini, James N.; LaDuc, Myron T.; Diamond, Rochelle

    2012-01-01

    Low-biomass samples frequently cannot be processed and analyzed because of the limited amount of relevant biomaterial they contain. Furthermore, molecular biological protocols geared towards increasing the density of recovered cells and biomolecules of interest, by their very nature, also concentrate unwanted inhibitory humic acids and other particulates that have an adverse effect on downstream analysis. A novel and robust fluorescence-activated cell-sorting (FACS)-based technology has been developed for purifying (removing cells from sampling matrices), separating (based on size, density, morphology), and concentrating cells (spores, prokaryotic, eukaryotic) from a sample low in biomass. The technology capitalizes on fluorescent cell-sorting technologies to purify and concentrate bacterial cells from a low-biomass, high-volume sample. Over the past decade, cell-sorting detection systems have undergone enhancements and increased sensitivity, making bacterial cell sorting a feasible concept. Although there are many unknown limitations with regard to the applicability of this technology to environmental samples (smaller cells, few cells, mixed populations), established principles support the theoretical effectiveness of this technique upon thorough testing and proper optimization. Furthermore, the pilot study on which this report is based proved effective and demonstrated that this technology is capable of sorting and concentrating bacterial endospores and bacterial cells of varying size and morphology. Two commercial off-the-shelf bacterial counting kits were used to optimize a bacterial stain/dye FACS protocol. A LIVE/DEAD BacLight Viability and Counting Kit was used to distinguish between the live and dead cells. A Bacterial Counting Kit comprising SYTO BC (a mixture of SYTO dyes) was employed as a broad-spectrum bacterial counting agent. Optimization using epifluorescence microscopy was performed with these two dyes/stains. This refined protocol was further validated using varying ratios and mixtures of cells to ensure homogeneous staining compared to that of individual cells, and was then utilized for flow analyzer and FACS labeling. This technology focuses on the purification and concentration of cells from low-biomass spacecraft assembly facility samples. Currently, purification and concentration of low-biomass samples plague planetary protection downstream analyses. Having a capability to use flow cytometry to concentrate cells out of low-biomass, high-volume spacecraft/facility sample extracts will be of extreme benefit to the fields of planetary protection and astrobiology. Successful research and development of this novel methodology will significantly increase the knowledge base for designing more effective cleaning protocols, and ultimately lead to a more empirical and true account of the microbial diversity present on spacecraft surfaces. Refined cleaning and an enhanced ability to resolve microbial diversity may decrease the overall cost of spacecraft assembly and/or provide a means to begin to assess challenging planetary protection missions.
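
    A minimal sketch of the kind of two-channel gating such a FACS protocol relies on, assuming per-event fluorescence intensities have already been exported (a SYTO-type "all cells" channel and a membrane-integrity channel such as the one provided by the LIVE/DEAD kit); the thresholds and data below are invented placeholders, not the optimized values from this work.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder per-event intensities (arbitrary units) for 10,000 events.
syto = rng.lognormal(mean=3.0, sigma=1.0, size=10_000)   # total-cell stain
pi   = rng.lognormal(mean=1.0, sigma=1.2, size=10_000)   # membrane-compromised (dead) stain

SYTO_GATE = 20.0   # hypothetical gate separating cells from debris
PI_GATE = 15.0     # hypothetical gate separating dead from live cells

cells = syto > SYTO_GATE
dead = cells & (pi > PI_GATE)
live = cells & ~dead

print(f"events: {syto.size}, gated cells: {cells.sum()}, "
      f"live: {live.sum()}, dead: {dead.sum()}")
```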

  5. Highlighting the complexities of a groundwater pilot study during an avian influenza outbreak: Methods, lessons learned, and select contaminant results.

    PubMed

    Hubbard, Laura E; Kolpin, Dana W; Fields, Chad L; Hladik, Michelle L; Iwanowicz, Luke R

    2017-10-01

    The highly pathogenic avian influenza (H5N2) outbreak in the Midwestern United States (US) in 2015 was historic due to the number of birds and poultry operations impacted and the corresponding economic loss to the poultry industry and was the largest animal health emergency in US history. The U.S. Geological Survey (USGS), with the assistance of several state and federal agencies, aided the response to the outbreak by developing a study to determine the extent of virus transport in the environment. The study goals were to: develop the appropriate sampling methods and protocols for measuring avian influenza virus (AIV) in groundwater, provide the first baseline data on AIV and outbreak- and poultry-related contaminant occurrence and movement into groundwater, and document climatological factors that may have affected both survival and transport of AIV to groundwater during the months of the 2015 outbreak. While site selection was expedient, there were often delays in sample response times due to both relationship building between agencies, groups, and producers and logistical time constraints. This study's design and sampling process highlights the unpredictable nature of disease outbreaks and the corresponding difficulty in environmental sampling of such events. The lessons learned, including field protocols and approaches, can be used to improve future research on AIV in the environment. Published by Elsevier Inc.

  6. Highlighting the complexities of a groundwater pilot study during an avian influenza outbreak: Methods, lessons learned, and select contaminant results

    USGS Publications Warehouse

    Hubbard, Laura E.; Kolpin, Dana W.; Fields, Chad L.; Hladik, Michelle L.; Iwanowicz, Luke R.

    2017-01-01

    The highly pathogenic avian influenza (H5N2) outbreak in the Midwestern United States (US) in 2015 was historic due to the number of birds and poultry operations impacted and the corresponding economic loss to the poultry industry and was the largest animal health emergency in US history. The U.S. Geological Survey (USGS), with the assistance of several state and federal agencies, aided the response to the outbreak by developing a study to determine the extent of virus transport in the environment. The study goals were to: develop the appropriate sampling methods and protocols for measuring avian influenza virus (AIV) in groundwater, provide the first baseline data on AIV and outbreak- and poultry-related contaminant occurrence and movement into groundwater, and document climatological factors that may have affected both survival and transport of AIV to groundwater during the months of the 2015 outbreak. While site selection was expedient, there were often delays in sample response times due to both relationship building between agencies, groups, and producers and logistical time constraints. This study's design and sampling process highlights the unpredictable nature of disease outbreaks and the corresponding difficulty in environmental sampling of such events. The lessons learned, including field protocols and approaches, can be used to improve future research on AIV in the environment.

  7. Homogenization of sample absorption for the imaging of large and dense fossils with synchrotron microtomography.

    PubMed

    Sanchez, Sophie; Fernandez, Vincent; Pierce, Stephanie E; Tafforeau, Paul

    2013-09-01

    Propagation phase-contrast synchrotron radiation microtomography (PPC-SRμCT) has proved to be very successful for examining fossils. Because fossils range widely in taphonomic preservation, size, shape and density, X-ray computed tomography protocols are constantly being developed and refined. Here we present a 1-h procedure that combines a filtered high-energy polychromatic beam with long-distance PPC-SRμCT (sample to detector: 4-16 m) and an attenuation protocol normalizing the absorption profile (tested on 13-cm-thick and 5.242 g cm(-3) locally dense samples but applicable to 20-cm-thick samples). This approach provides high-quality imaging results, which show marked improvement relative to results from images obtained without the attenuation protocol in apparent transmission, contrast and signal-to-noise ratio. The attenuation protocol involves immersing samples in a tube filled with aluminum or glass balls in association with a U-shaped aluminum profiler. This technique therefore provides access to a larger dynamic range of the detector used for tomographic reconstruction. This protocol homogenizes beam-hardening artifacts, thereby rendering it effective for use with conventional μCT scanners.
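
    The attenuation protocol can be understood with a simple Beer-Lambert calculation: if the free path around an irregular sample is filled with a material whose attenuation roughly matches the fossil matrix, the total transmission, and hence the dynamic range the detector must cover, becomes nearly independent of the ray path through the sample. The coefficients below are illustrative placeholders, not values from the paper.

```python
import numpy as np

MU_SAMPLE = 0.30      # hypothetical linear attenuation coefficient of the fossil (1/cm)
MU_FILLER = 0.28      # hypothetical coefficient of the surrounding aluminum/glass balls (1/cm)
TUBE_DIAMETER = 16.0  # total path length through the filled tube (cm)

sample_path = np.linspace(0.0, 13.0, 6)  # ray path lengths through the sample itself (cm)

t_bare = np.exp(-MU_SAMPLE * sample_path)
t_filled = np.exp(-MU_SAMPLE * sample_path - MU_FILLER * (TUBE_DIAMETER - sample_path))

print("path (cm)  T bare     T with filler")
for x, tb, tf in zip(sample_path, t_bare, t_filled):
    print(f"{x:8.1f}  {tb:9.3e}  {tf:9.3e}")
# T bare spans orders of magnitude; T with filler varies only weakly with path length.
```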

  8. Development of a real-time microchip PCR system for portable plant disease diagnosis.

    PubMed

    Koo, Chiwan; Malapi-Wight, Martha; Kim, Hyun Soo; Cifci, Osman S; Vaughn-Diaz, Vanessa L; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing, making it more suitable for field operations, but it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25 × 16 × 8 cm(3) in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of battery charge. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, and other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate to a detection limit of 5 ng/8 µl sample.
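
    The battery-charge figure quoted above (110 mAh per PCR run) translates directly into a field-endurance estimate for any given battery; a minimal back-of-the-envelope sketch, with the battery capacity and derating below chosen purely as examples.

```python
CHARGE_PER_RUN_MAH = 110        # reported consumption per PCR run
BATTERY_CAPACITY_MAH = 2200     # hypothetical battery pack
DERATING = 0.8                  # assume only ~80% of nominal capacity is usable

usable = BATTERY_CAPACITY_MAH * DERATING
runs = int(usable // CHARGE_PER_RUN_MAH)
print(f"Approximately {runs} PCR runs per charge "
      f"({usable:.0f} mAh usable / {CHARGE_PER_RUN_MAH} mAh per run)")
```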

  9. Development of a Real-Time Microchip PCR System for Portable Plant Disease Diagnosis

    PubMed Central

    Kim, Hyun Soo; Cifci, Osman S.; Vaughn-Diaz, Vanessa L.; Ma, Bo; Kim, Sungman; Abdel-Raziq, Haron; Ong, Kevin; Jo, Young-Ki; Gross, Dennis C.; Shim, Won-Bo; Han, Arum

    2013-01-01

    Rapid and accurate detection of plant pathogens in the field is crucial to prevent the proliferation of infected crops. The polymerase chain reaction (PCR) is the most reliable and accepted method for plant pathogen diagnosis; however, current conventional PCR machines are not portable and require additional post-processing steps to detect the amplified DNA (amplicon) of pathogens. Real-time PCR can directly quantify the amplicon during DNA amplification without the need for post-processing, making it more suitable for field operations, but it still takes time and requires large instruments that are costly and not portable. Microchip PCR systems have emerged in the past decade to miniaturize conventional PCR systems and to reduce operation time and cost. Real-time microchip PCR systems have also emerged, but unfortunately all reported portable real-time microchip PCR systems require various auxiliary instruments. Here we present a stand-alone real-time microchip PCR system composed of a PCR reaction chamber microchip with an integrated thin-film heater, a compact fluorescence detector to detect amplified DNA, a microcontroller to control the entire thermocycling operation with data acquisition capability, and a battery. The entire system is 25×16×8 cm3 in size and 843 g in weight. The disposable microchip requires only an 8-µl sample volume, and a single PCR run consumes 110 mAh of battery charge. A DNA extraction protocol, notably without the use of liquid nitrogen, chemicals, and other large lab equipment, was developed for field operations. The developed real-time microchip PCR system and the DNA extraction protocol were used to successfully detect six different fungal and bacterial plant pathogens with a 100% success rate to a detection limit of 5 ng/8 µl sample. PMID:24349341

  10. Assessment protocols of maximum oxygen consumption in young people with Down syndrome--a review.

    PubMed

    Seron, Bruna Barboza; Greguol, Márcia

    2014-03-01

    Maximum oxygen consumption is considered the gold standard measure of cardiorespiratory fitness. Young people with Down syndrome (DS) present low values of this indicator compared to their peers without disabilities and to young people with an intellectual disability but without DS. The use of reliable and valid assessment methods provides more reliable results for the diagnosis of cardiorespiratory fitness and the response of this variable to exercise. The aim of the present study was to review the literature on the assessment protocols used to measure maximum oxygen consumption in children and adolescents with Down syndrome giving emphasis to the protocols used, the validation process and their feasibility. The search was carried out in eight electronic databases--Scopus, Medline-Pubmed, Web of science, SportDiscus, Cinhal, Academic Search Premier, Scielo, and Lilacs. The inclusion criteria were: (a) articles which assessed VO2peak and/or VO2max (independent of the validation method), (b) samples composed of children and/or adolescents with Down syndrome, (c) participants of up to 20 years old, and (d) studies performed after 1990. Fifteen studies were selected and, of these, 11 measured the VO2peak using tests performed in a laboratory, 2 used field tests and the remaining 2 used both laboratory and field tests. The majority of the selected studies used maximal tests and conducted familiarization sessions. All the studies took into account the clinical conditions that could hamper testing or endanger the individuals. However, a large number of studies used tests which had not been specifically validated for the evaluated population. Finally, the search emphasized the small number of studies which use field tests to evaluate oxygen consumption. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. 21 CFR 660.46 - Samples; protocols; official release.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    21 Food and Drugs 7 (2011-04-01) — Samples; protocols; official release. Section 660.46, Food and Drug Administration, Department of Health and Human Services. ... a sample of product not iodinated with 125I means a sample from each filling of each lot packaged as...

  12. Technology transfer opportunities: new development: computerized field manual provides valuable resource for hydrologic investigations

    USGS Publications Warehouse

    Chapel, Paul

    1996-01-01

    The U.S. Geological Survey (USGS) is known throughout the world for conducting quality scientific investigations in hydrologic environments. Proper and consistent field techniques have been an integral part of this good research. Over the past few decades, the USGS has developed and published detailed, standard protocols for conducting studies in most aspects of the hydrologic environment. These protocols have been published in a number of diverse documents. The wealth of information contained in these diverse documents can benefit other scientists in industry, government, and academia who are involved in conducting hydrologic studies. Scientists at the USGS have brought together many of the most important field protocols in a user-friendly, graphically interfaced field manual that will be useful both in the field and in the office. This electronic field manual can assist hydrologists and other scientists in conducting and documenting their field activities in a manner that is recognized as standard throughout the hydrologic community.

  13. A study of the effectiveness of particulate cleaning protocols on intentionally contaminated niobium surfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reece, Charles E.; Ciancio, Elizabeth J.; Keyes, Katharine A.

    2009-11-01

    Particulate contamination on the surface of SRF cavities limits their performance via the enhanced generation of field-emitted electrons. Considerable efforts are expended to actively clean and avoid such contamination on niobium surfaces. The protocols in active use have been developed via feedback from cavity testing. This approach has the risk of over-conservatively ratcheting up an ever-increasing complexity of methods tied to particular circumstances. A complementary and perhaps helpful approach is to quantitatively assess the effectiveness of candidate methods at removing intentional, representative particulate contamination. Toward this end, we developed a standardized contamination protocol using water suspensions of Nb2O5 and SS 316 powders applied to BCP'd surfaces of standardized niobium samples, yielding particle densities of order 200 particles/mm2. From these starting conditions, high-pressure water rinsing, ultrasonic cleaning, or CO2 snow jet cleaning was applied in a controlled manner and the resulting surfaces were examined via SEM/scanning EDS with particle recognition software. Results of initial parametric variations of each method will be reported.
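
    Since the cleaning studies above are scored by particle counts from SEM/EDS particle-recognition maps, the basic figures of merit are areal particle density and fractional removal efficiency. A minimal sketch with invented counts (the abstract reports densities of order 200 particles/mm2 before cleaning):

```python
def particle_density(count, area_mm2):
    """Particles per mm2 on the imaged area."""
    return count / area_mm2

def removal_efficiency(density_before, density_after):
    """Fraction of particles removed by the cleaning step."""
    return 1.0 - density_after / density_before

# Hypothetical counts from the same imaged area before and after cleaning.
area = 4.0                             # mm2 of surface analyzed
before = particle_density(820, area)   # ~205 particles/mm2
after = particle_density(35, area)

print(f"before: {before:.0f}/mm2, after: {after:.0f}/mm2, "
      f"removal efficiency: {100 * removal_efficiency(before, after):.1f}%")
```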

  14. Dark field imaging system for size characterization of magnetic micromarkers

    NASA Astrophysics Data System (ADS)

    Malec, A.; Haiden, C.; Kokkinis, G.; Keplinger, F.; Giouroudi, I.

    2017-05-01

    In this paper we demonstrate a dark field video imaging system for the detection and size characterization of individual magnetic micromarkers suspended in liquid and the detection of pathogens utilizing magnetically labelled E.coli. The system follows dynamic processes and interactions of moving micro/nano objects close to or below the optical resolution limit, and is especially suitable for small sample volumes ( 10 μl). The developed detection method can be used to obtain clinical information about liquid contents when an additional biological protocol is provided, i.e., binding of microorganisms (e.g. E.coli) to specific magnetic markers. Some of the major advantages of our method are the increased sizing precision in the micro- and nano-range as well as the setup's simplicity making it a perfect candidate for miniaturized devices. Measurements can thus be carried out in a quick, inexpensive, and compact manner. A minor limitation is that the concentration range of micromarkers in a liquid sample needs to be adjusted in such a manner that the number of individual particles in the microscope's field of view is sufficient.

  15. A comprehensive benchmarking study of protocols and sequencing platforms for 16S rRNA community profiling

    DOE PAGES

    Podar, Mircea; Shakya, Migun; D'Amore, Rosalinda; ...

    2016-01-14

    In the last 5 years, the rapid pace of innovations and improvements in sequencing technologies has completely changed the landscape of metagenomic and metagenetic experiments. Therefore, it is critical to benchmark the various methodologies for interrogating the composition of microbial communities, so that we can assess their strengths and limitations. Here, the most common phylogenetic marker for microbial community diversity studies is the 16S ribosomal RNA gene and in the last 10 years the field has moved from sequencing a small number of amplicons and samples to more complex studies where thousands of samples and multiple different gene regions are interrogated.

  16. Isolation of PCR quality microbial community DNA from heavily contaminated environments.

    PubMed

    Gunawardana, Manjula; Chang, Simon; Jimenez, Abraham; Holland-Moritz, Daniel; Holland-Moritz, Hannah; La Val, Taylor P; Lund, Craig; Mullen, Madeline; Olsen, John; Sztain, Terra A; Yoo, Jennifer; Moss, John A; Baum, Marc M

    2014-07-01

    Asphalts, biochemically degraded oil, contain persistent, water-soluble compounds that pose a significant challenge to the isolation of PCR quality DNA. The adaptation of existing DNA purification protocols and commercial kits proved unsuccessful at overcoming this hurdle. Treatment of aqueous asphalt extracts with a polyamide resin afforded genomic microbial DNA templates that could readily be amplified by PCR. Physicochemically distinct asphalt samples from five natural oil seeps successfully generated the expected 291 bp amplicons targeting a region of the 16S rRNA gene, illustrating the robustness of the method. DNA recovery yields were in the 50-80% range depending on how the asphalt sample was seeded with exogenous DNA. The scope of the new method was expanded to include soil with high humic acid content. DNA from soil samples spiked with a range of humic acid concentrations was extracted with a commercial kit followed by treatment with the polyamide resin. The additional step significantly improved the purity of the DNA templates, especially at high humic acid concentrations, based on qPCR analysis of the bacterial 16S rRNA genes. The new method has the advantages of being inexpensive, simple, and rapid and should provide a valuable addition to protocols in the field of petroleum and soil microbiology. Copyright © 2014 Elsevier B.V. All rights reserved.
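
    The 50-80% recovery yields mentioned above are simply the ratio of DNA recovered after the polyamide-resin treatment to the amount of exogenous DNA seeded into the sample, quantified for example by qPCR; a minimal sketch with placeholder numbers.

```python
def recovery_yield(recovered_ng, spiked_ng):
    """Percent of seeded exogenous DNA recovered after extraction and cleanup."""
    return 100.0 * recovered_ng / spiked_ng

# Hypothetical qPCR-derived quantities for three spiked asphalt extracts.
spiked = 10.0  # ng of exogenous DNA added to each sample
for sample, recovered in {"seep A": 7.8, "seep B": 5.4, "seep C": 6.9}.items():
    print(f"{sample}: {recovery_yield(recovered, spiked):.0f}% recovery")
```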

  17. Statistical Methods and Tools for Uxo Characterization (SERDP Final Technical Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulsipher, Brent A.; Gilbert, Richard O.; Wilson, John E.

    2004-11-15

    The Strategic Environmental Research and Development Program (SERDP) issued a statement of need for FY01 titled Statistical Sampling for Unexploded Ordnance (UXO) Site Characterization that solicited proposals to develop statistically valid sampling protocols for cost-effective, practical, and reliable investigation of sites contaminated with UXO; protocols that could be validated through subsequent field demonstrations. The SERDP goal was the development of a sampling strategy for which a fraction of the site is initially surveyed by geophysical detectors to confidently identify clean areas and subsections (target areas, TAs) that had elevated densities of anomalous geophysical detector readings that could indicate the presence of UXO. More detailed surveys could then be conducted to search the identified TAs for UXO. SERDP funded three projects: those proposed by the Pacific Northwest National Laboratory (PNNL) (SERDP Project No. UXO 1199), Sandia National Laboratory (SNL), and Oak Ridge National Laboratory (ORNL). The projects were closely coordinated to minimize duplication of effort and facilitate use of shared algorithms where feasible. This final report for PNNL Project 1199 describes the methods developed by PNNL to address SERDP's statement-of-need for the development of statistically-based geophysical survey methods for sites where 100% surveys are unattainable or cost prohibitive.
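
    One building block behind such statistically based survey designs is the geometric probability that a transect survey intersects a target area at all. Under the simplifying assumptions that the TA is roughly circular and its center falls uniformly at random relative to a set of parallel transects, the traversal probability is min(1, D/S) for TA diameter D and transect spacing S. The sketch below is a generic illustration of that relationship, not the algorithm implemented in the PNNL, SNL, or ORNL tools.

```python
def traversal_probability(target_diameter_m, transect_spacing_m):
    """P(at least one parallel transect crosses a circular target area)."""
    return min(1.0, target_diameter_m / transect_spacing_m)

# Example: how spacing trades off against confidence of crossing a 50 m TA.
for spacing in (25, 50, 100, 200):
    p = traversal_probability(50.0, spacing)
    print(f"spacing {spacing:4d} m -> traversal probability {p:.2f}")
```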

  18. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    PubMed Central

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297
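
    Manta-net counts are commonly converted to a sea-surface concentration by dividing the number of particles by the area swept by the net (mouth width times tow distance); a minimal sketch of that normalization, with the net width, tow parameters, and counts below chosen as placeholders rather than taken from the protocol.

```python
def particles_per_km2(n_particles, net_width_m, tow_speed_kn, tow_minutes):
    """Concentration normalized to the sea-surface area swept by the manta net."""
    tow_distance_m = tow_speed_kn * 1852.0 / 60.0 * tow_minutes  # knots -> m per minute
    swept_area_km2 = net_width_m * tow_distance_m / 1e6
    return n_particles / swept_area_km2

# Hypothetical tow: 0.6 m net mouth, 2 kn for 30 min, 42 particles in the cod end.
conc = particles_per_km2(42, net_width_m=0.6, tow_speed_kn=2.0, tow_minutes=30)
print(f"{conc:,.0f} particles per km2")
```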

  19. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis.

    PubMed

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-12-16

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world.

  20. Dual-view plane illumination microscopy for rapid and spatially isotropic imaging

    PubMed Central

    Kumar, Abhishek; Wu, Yicong; Christensen, Ryan; Chandris, Panagiotis; Gandler, William; McCreedy, Evan; Bokinsky, Alexandra; Colón-Ramos, Daniel A; Bao, Zhirong; McAuliffe, Matthew; Rondeau, Gary; Shroff, Hari

    2015-01-01

    We describe the construction and use of a compact dual-view inverted selective plane illumination microscope (diSPIM) for time-lapse volumetric (4D) imaging of living samples at subcellular resolution. Our protocol enables a biologist with some prior microscopy experience to assemble a diSPIM from commercially available parts, to align optics and test system performance, to prepare samples, and to control hardware and data processing with our software. Unlike existing light sheet microscopy protocols, our method does not require the sample to be embedded in agarose; instead, samples are prepared conventionally on glass coverslips. Tissue culture cells and Caenorhabditis elegans embryos are used as examples in this protocol; successful implementation of the protocol results in isotropic resolution and acquisition speeds up to several volumes per s on these samples. Assembling and verifying diSPIM performance takes ~6 d, sample preparation and data acquisition take up to 5 d and postprocessing takes 3–8 h, depending on the size of the data. PMID:25299154

  1. Jack Healy Remembers - Anecdotal Evidence for the Origin of the Approximate 24-hour Urine Sampling Protocol Used for Worker Bioassay Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.

    2008-10-01

    The origin of the approximate 24-hour urine sampling protocol used at Hanford for routine bioassay is attributed to an informal study done in the mid-1940s. While the actual data were never published and have been lost, anecdotal recollections by staff involved in the initial bioassay program design and administration suggest that the sampling protocol had a solid scientific basis. Numerous alternate methods for normalizing partial day samples to represent a total 24-hour collection have since been proposed and used, but no one method is obviously preferred.
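
    One of the simplest of the normalization methods alluded to above is time-proportional scaling of a partial collection to a 24-hour equivalent. The sketch below shows that calculation only as a generic illustration; it is not the Hanford protocol, and other approaches (for example, creatinine normalization) are also used.

```python
def scale_to_24h(measured_amount, collection_hours):
    """Scale an analyte amount in a partial urine collection to a 24-hour equivalent,
    assuming excretion is roughly uniform over the day (a simplifying assumption)."""
    return measured_amount * 24.0 / collection_hours

# Hypothetical example: 0.8 units measured in a 14-hour overnight collection.
print(f"24-hour equivalent: {scale_to_24h(0.8, 14.0):.2f} units")
```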

  2. Cholera diagnosis in human stool and detection in water: protocol for a systematic review of available technologies.

    PubMed

    Diaconu, Karin; Falconer, Jennifer; O'May, Fiona; Jimenez, Miguel; Matragrano, Joe; Njanpop-Lafourcade, Betty; Ager, Alastair

    2018-02-20

    Cholera is a highly infectious diarrheal disease spread via fecal contamination of water and food sources; it is endemic in parts of Africa and Asia and recent outbreaks have been reported in Haiti, Zambia, and the Democratic Republic of the Congo. If left untreated, the disease can be fatal in less than 24 h and result in case fatality ratios of 30-50%. Cholera disproportionately affects those living in areas with poor access to water and sanitation: the long-term public health response is focused on improving water and hygiene facilities and access. Short-term measures for infection prevention and control, and disease characterization and surveillance, are impaired by diagnostic delays: culture methods are slow and rely on the availability of infrastructure and specialist equipment. Rapid diagnostic tests have shown promise under field conditions and further innovations in this area have been proposed. This paper is the protocol for a systematic review focused on identifying current technologies and methods used for cholera diagnosis in stool, and detection in water. We will synthesize and appraise information on product technical specifications, accuracy and design features in order to inform infection prevention and control and innovation development. Embase, MEDLINE, CINAHL, Proquest, IndMed and the WHO and Campbell libraries will be searched. We will include studies reporting on field evaluations, including within-study comparisons against a reference standard, and laboratory evaluations reporting on product validation against field stool or water samples. We will extract data according to protocol and attempt meta-analyses if appropriate given data availability and quality. The systematic review builds on a previous scoping review in this field and expands upon it by synthesising data on both product technical characteristics and design features. The review will be of particular value to stakeholders engaged in diagnostic procurement and manufacturers interested in developing cholera or diarrheal disease diagnostics. PROSPERO CRD42016048428.
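
    Accuracy data extracted in such a review are usually summarized as sensitivity and specificity from a two-by-two table of test results against the reference standard (for example, culture); a minimal sketch with invented counts, shown only to make the planned synthesis concrete.

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical field evaluation of a cholera RDT against culture.
sens, spec = sensitivity_specificity(tp=92, fp=11, fn=8, tn=189)
print(f"sensitivity {100*sens:.1f}%, specificity {100*spec:.1f}%")
```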

  3. The case of biobank with the law: between a legal and scientific fiction.

    PubMed

    Sándor, Judit; Bárd, Petra; Tamburrini, Claudio; Tännsjö, Torbjörn

    2012-06-01

    According to estimates more than 400 biobanks currently operate across Europe. The term 'biobank' indicates a specific field of genetic study that has quietly developed without any significant critical reflection across European societies. Although scientists now routinely use this phrase, the wider public is still confused when the word 'bank' is being connected with the collection of their biological samples. There is a striking lack of knowledge of this field. In the recent Eurobarometer survey it was demonstrated that even in 2010 two-thirds of the respondents had never even heard about biobanks. The term gives the impression that a systematic collection of biological samples can constitute a 'bank' of considerable financial worth, where the biological samples, which are insignificant in isolation but are valuable as a collection, can be preserved, analysed and put to 'profitable use'. By studying the practices of the numerous already existing biobanks, the authors address the following questions: to what extent does the term 'biobank' reflect the normative concept of using biological samples for the purposes of biomedical research? Furthermore, is it in harmony with the so far agreed legal-ethical consensus in Europe or does it deliberately pull science to the territory of a new, ambiguous commercial field? In other words, do biobanks constitute a medico-legal fiction or are they substantively different from other biomedical research protocols on human tissues?

  4. A Draft Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth

    NASA Technical Reports Server (NTRS)

    Viso, M.; DeVincenzi, D. L.; Race, M. S.; Schad, P. J.; Stabekis, P. D.; Acevedo, S. E.; Rummel, J. D.

    2002-01-01

    In preparation for missions to Mars that will involve the return of samples, it is necessary to prepare for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but a specific protocol for handling and testing of returned samples from Mars remained to be developed. To refine the requirements for Mars sample hazard testing and to develop criteria for the subsequent release of sample materials from precautionary containment, the NASA Planetary Protection Officer, working in collaboration with CNES, convened a series of workshops to produce a Protocol by which returned martian sample materials could be assessed for biological hazards and examined for evidence of life (extant or extinct), while safeguarding the samples from possible terrestrial contamination. The Draft Protocol was then reviewed by an Oversight and Review Committee formed specifically for that purpose and composed of senior scientists. In order to preserve the scientific value of returned martian samples under safe conditions, while avoiding false indications of life within the samples, the Sample Receiving Facility (SRF) is required to allow handling and processing of the Mars samples in a way that prevents their terrestrial contamination while maintaining strict biological containment. It is anticipated that samples can be shipped among appropriate containment facilities wherever necessary, under procedures developed in cooperation with the appropriate international institutions. The SRF will need to provide different types of laboratory environments for carrying out, beyond sample description and curation, the various aspects of the protocol: Physical/Chemical analysis, Life Detection testing, and Biohazard testing. The main principles of these tests will be described and the criteria for release will be discussed, as well as the requirements for the SRF and its personnel.

  5. Active SAmpling Protocol (ASAP) to Optimize Individual Neurocognitive Hypothesis Testing: A BCI-Inspired Dynamic Experimental Design.

    PubMed

    Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie

    2016-01-01

    The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.

  6. Sensitivity comparison of sequential monadic and side-by-side presentation protocols in affective consumer testing.

    PubMed

    Colyar, Jessica M; Eggett, Dennis L; Steele, Frost M; Dunn, Michael L; Ogden, Lynn V

    2009-09-01

    The relative sensitivity of side-by-side and sequential monadic consumer liking protocols was compared. In the side-by-side evaluation, all samples were presented at once and evaluated together 1 characteristic at a time. In the sequential monadic evaluation, 1 sample was presented and evaluated on all characteristics, then returned before panelists received and evaluated another sample. Evaluations were conducted on orange juice, frankfurters, canned chili, potato chips, and applesauce. Five commercial brands, having a broad quality range, were selected as samples for each product category to assure a wide array of consumer liking scores. Without their knowledge, panelists rated the same 5 retail brands by 1 protocol and then 3 wk later by the other protocol. For 3 of the products, both protocols yielded the same order of overall liking. Slight differences in order of overall liking for the other 2 products were not significant. Of the 50 pairwise overall liking comparisons, 44 were in agreement. The different results obtained by the 2 protocols in order of liking and significance of paired comparisons were due to the experimental variation and differences in sensitivity. Hedonic liking scores were subjected to statistical power analyses and used to calculate minimum number of panelists required to achieve varying degrees of sensitivity when using side-by-side and sequential monadic protocols. In most cases, the side-by-side protocol was more sensitive, thus providing the same information with fewer panelists. Side-by-side protocol was less sensitive in cases where sensory fatigue was a factor.
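
    The "minimum number of panelists" calculation described above follows the usual power formula for a paired comparison: n is approximately ((z_(1-alpha/2) + z_(1-beta)) * s_d / delta)^2, where s_d is the standard deviation of within-panelist score differences and delta is the liking difference to detect. A minimal sketch under normal-approximation assumptions, with placeholder inputs rather than the study's estimates.

```python
from math import ceil
from scipy.stats import norm

def panelists_needed(delta, sd_diff, alpha=0.05, power=0.80):
    """Approximate panelists required to detect a mean paired difference `delta`."""
    z_a = norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    return ceil(((z_a + z_b) * sd_diff / delta) ** 2)

# Hypothetical: detect a 0.5-point hedonic difference, SD of differences = 1.8.
print(panelists_needed(delta=0.5, sd_diff=1.8))  # ~102 panelists
```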

  7. Human DNA extraction from whole saliva that was fresh or stored for 3, 6 or 12 months using five different protocols

    PubMed Central

    GARBIERI, Thais Francini; BROZOSKI, Daniel Thomas; DIONÍSIO, Thiago José; SANTOS, Carlos Ferreira; NEVES, Lucimara Teixeira das

    2017-01-01

    Abstract Compared to blood collection, saliva has the following advantages: it requires no specialized personnel for collection, allows for remote collection by the patient, is painless, well accepted by participants, has decreased risks of disease transmission, does not clot, can be frozen before DNA extraction and possibly has a longer storage time. Objective and Material and Methods This study aimed to compare the quantity and quality of human DNA extracted from saliva that was fresh or frozen for three, six and twelve months using five different DNA extraction protocols: protocol 1 – Oragene™ commercial kit, protocol 2 – QIAamp DNA mini kit, protocol 3 – DNA extraction using ammonium acetate, protocol 4 – Instagene™ Matrix and protocol 5 – Instagene™ Matrix diluted 1:1 using proteinase K and 1% SDS. Briefly, DNA was analyzed using spectrophotometry, electrophoresis and PCR. Results Time spent in storage typically decreased the DNA quantity, with the exception of protocol 1. The purity of DNA was generally not affected by storage times for the commercial-based protocols, while the purity of the DNA samples extracted by the noncommercial protocols typically decreased when the saliva was stored longer. Only protocol 1 consistently extracted unfragmented DNA samples. In general, DNA samples extracted through protocols 1, 2, 3 and 4, regardless of storage time, were amplified by human-specific primers whereas protocol 5 produced almost no samples that were able to be amplified by human-specific primers. Depending on the protocol used, it was possible to extract DNA in high quantities and of good quality using whole saliva, and furthermore, for the purposes of DNA extraction, saliva can be reliably stored for relatively long time periods. Conclusions In summary, a complicated picture emerges when taking into account the extracted DNA’s quantity, purity and quality; depending on a given researcher’s needs, one protocol’s particular strengths and costs might be the deciding factor for its employment. PMID:28403355

  8. Protocol for Detection of Yersinia pestis in Environmental ...

    EPA Pesticide Factsheets

    Methods Report This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a step-by-step sample-processing procedure for each sample type. The protocol covers real-time PCR, traditional microbiological culture, and the Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because this protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.

  9. Processing Protocol for Soil Samples Potentially ...

    EPA Pesticide Factsheets

    Method Operating Procedures This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection of 14 spores/gram of soil for loamy and sandy soils.

  10. Processing protocol for soil samples potentially contaminated with Bacillus anthracis spores [HS7.52.02 - 514

    USGS Publications Warehouse

    Silvestri, Erin E.; Griffin, Dale W.

    2017-01-01

    This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection of 14 spores/gram of soil for loamy and sandy soils.

  11. A preliminary architecture for building communication software from traffic captures

    NASA Astrophysics Data System (ADS)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to more complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual representation of the packet data. Then, we extract field data and types, along with inter- and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating an Internet Control Message Protocol (ICMP) client program that communicates with other hosts.
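
    The first stages of such a pipeline (packet capture to PDML, then field extraction) can be prototyped with a few lines of Python around TShark. The sketch below is an assumption-laden illustration rather than the authors' tool: the capture file name is hypothetical, TShark must be installed on the system, and the attributes read from each field element follow the general PDML layout rather than the paper's exact extraction rules.

    ```python
    # Minimal sketch: convert a pcap to PDML with TShark, then list per-packet
    # dissected field names and values as a starting point for field/type extraction.
    import subprocess
    import xml.etree.ElementTree as ET

    pdml = subprocess.run(
        ["tshark", "-r", "capture.pcap", "-T", "pdml"],   # capture.pcap is a placeholder
        capture_output=True, text=True, check=True
    ).stdout

    root = ET.fromstring(pdml)
    for packet in root.iter("packet"):
        for field in packet.iter("field"):
            name = field.get("name")
            value = field.get("show")
            if name:                       # skip unnamed layout/padding entries
                print(name, value)
    ```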

  12. Real-Time DNA Sequencing in the Antarctic Dry Valleys Using the Oxford Nanopore Sequencer

    PubMed Central

    Johnson, Sarah S.; Zaikova, Elena; Goerlitz, David S.; Bai, Yu; Tighe, Scott W.

    2017-01-01

    The ability to sequence DNA outside of the laboratory setting has enabled novel research questions to be addressed in the field in diverse areas, ranging from environmental microbiology to viral epidemics. Here, we demonstrate the application of offline DNA sequencing of environmental samples using a hand-held nanopore sequencer in a remote field location: the McMurdo Dry Valleys, Antarctica. Sequencing was performed using a MK1B MinION sequencer from Oxford Nanopore Technologies (ONT; Oxford, United Kingdom) that was equipped with software to operate without internet connectivity. One-direction (1D) genomic libraries were prepared using portable field techniques on DNA isolated from desiccated microbial mats. By adequately insulating the sequencer and laptop, it was possible to run the sequencing protocol for up to 2½ h under arduous conditions. PMID:28337073

  13. Conducting field studies for testing pesticide leaching models

    USGS Publications Warehouse

    Smith, Charles N.; Parrish, Rudolph S.; Brown, David S.

    1990-01-01

    A variety of predictive models are being applied to evaluate the transport and transformation of pesticides in the environment. These include well known models such as the Pesticide Root Zone Model (PRZM), the Risk of Unsaturated-Saturated Transport and Transformation Interactions for Chemical Concentrations Model (RUSTIC) and the Groundwater Loading Effects of Agricultural Management Systems Model (GLEAMS). The potentially large impacts of using these models as tools for developing pesticide management strategies and regulatory decisions necessitates development of sound model validation protocols. This paper offers guidance on many of the theoretical and practical problems encountered in the design and implementation of field-scale model validation studies. Recommendations are provided for site selection and characterization, test compound selection, data needs, measurement techniques, statistical design considerations and sampling techniques. A strategy is provided for quantitatively testing models using field measurements.

  14. Field Guide to the Plant Community Types of Voyageurs National Park

    USGS Publications Warehouse

    Faber-Langendoen, Don; Aaseng, Norman; Hop, Kevin; Lew-Smith, Michael

    2007-01-01

    INTRODUCTION The objective of the U.S. Geological Survey-National Park Service Vegetation Mapping Program is to classify, describe, and map vegetation for most of the park units within the National Park Service (NPS). The program was created in response to the NPS Natural Resources Inventory and Monitoring Guidelines issued in 1992. Products for each park include digital files of the vegetation map and field data, keys and descriptions to the plant communities, reports, metadata, map accuracy verification summaries, and aerial photographs. Interagency teams work in each park and, following standardized mapping and field sampling protocols, develop products and vegetation classification standards that document the various vegetation types found in a given park. The use of a standard national vegetation classification system and mapping protocol facilitates effective resource stewardship by ensuring compatibility and widespread use of the information throughout the NPS as well as by other Federal and state agencies. These vegetation classifications and maps and associated information support a wide variety of resource assessment, park management, and planning needs, and provide a structure for framing and answering critical scientific questions about plant communities and their relation to environmental processes across the landscape. This field guide is intended to make the classification accessible to park visitors and researchers at Voyageurs National Park, allowing them to identify any stand of natural vegetation and showing how the classification can be used in conjunction with the vegetation map (Hop and others, 2001).

  15. Conduct of a personal radiofrequency electromagnetic field measurement study: proposed study protocol.

    PubMed

    Röösli, Martin; Frei, Patrizia; Bolte, John; Neubauer, Georg; Cardis, Elisabeth; Feychting, Maria; Gajsek, Peter; Heinrich, Sabine; Joseph, Wout; Mann, Simon; Martens, Luc; Mohler, Evelyn; Parslow, Roger C; Poulsen, Aslak Harbo; Radon, Katja; Schüz, Joachim; Thuroczy, György; Viel, Jean-François; Vrijheid, Martine

    2010-05-20

    The development of new wireless communication technologies that emit radio frequency electromagnetic fields (RF-EMF) is ongoing, but little is known about the RF-EMF exposure distribution in the general population. Previous attempts to measure personal exposure to RF-EMF have used different measurement protocols and analysis methods, making comparisons between exposure situations across different study populations very difficult. As a result, observed differences in exposure levels between study populations may not reflect real exposure differences but may be partly or wholly due to methodological differences. The aim of this paper is to develop a study protocol for future personal RF-EMF exposure studies based on experience drawn from previous research. Using the current knowledge base, we propose procedures for the measurement of personal exposure to RF-EMF, data collection, data management and analysis, and methods for the selection and instruction of study participants. We have identified two basic types of personal RF-EMF measurement studies: population surveys and microenvironmental measurements. In the case of a population survey, the unit of observation is the individual and a randomly selected representative sample of the population is needed to obtain reliable results. For microenvironmental measurements, study participants are selected in order to represent typical behaviours in different microenvironments. These two study types require different methods and procedures. Applying our proposed common core procedures in future personal measurement studies will allow direct comparisons of personal RF-EMF exposures in different populations and study areas.

  16. Conduct of a personal radiofrequency electromagnetic field measurement study: proposed study protocol

    PubMed Central

    2010-01-01

    Background The development of new wireless communication technologies that emit radio frequency electromagnetic fields (RF-EMF) is ongoing, but little is known about the RF-EMF exposure distribution in the general population. Previous attempts to measure personal exposure to RF-EMF have used different measurement protocols and analysis methods, making comparisons between exposure situations across different study populations very difficult. As a result, observed differences in exposure levels between study populations may not reflect real exposure differences but may be partly or wholly due to methodological differences. Methods The aim of this paper is to develop a study protocol for future personal RF-EMF exposure studies based on experience drawn from previous research. Using the current knowledge base, we propose procedures for the measurement of personal exposure to RF-EMF, data collection, data management and analysis, and methods for the selection and instruction of study participants. Results We have identified two basic types of personal RF-EMF measurement studies: population surveys and microenvironmental measurements. In the case of a population survey, the unit of observation is the individual and a randomly selected representative sample of the population is needed to obtain reliable results. For microenvironmental measurements, study participants are selected in order to represent typical behaviours in different microenvironments. These two study types require different methods and procedures. Conclusion Applying our proposed common core procedures in future personal measurement studies will allow direct comparisons of personal RF-EMF exposures in different populations and study areas. PMID:20487532

  17. A MORE COST-EFFECTIVE EMAP-ESTUARIES BENTHIC MACROFAUNAL SAMPLING PROTOCOL

    EPA Science Inventory

    The standard benthic macrofaunal sampling protocol in the U.S. Environmental Protection Agency's Pacific Coast Environmental Monitoring and Assessment Program (EMAP) is to collect a minimum of 30 random benthic samples per reporting unit (e.g., estuary) using a 0.1 m2 grab and to...

  18. It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two....)

    NASA Astrophysics Data System (ADS)

    Rummel, J. D.

    2018-04-01

    A Mars Sample Return (MSR) will involve analysis of those samples in containment, including their safe receiving, handling, testing, and archiving. With an MSR planned for the end of the next decade, it is time to update the existing MSR protocol.

  19. Exosome-like vesicles in uterine aspirates: a comparison of ultracentrifugation-based isolation protocols.

    PubMed

    Campoy, Irene; Lanau, Lucia; Altadill, Tatiana; Sequeiros, Tamara; Cabrera, Silvia; Cubo-Abert, Montserrat; Pérez-Benavente, Assumpción; Garcia, Angel; Borrós, Salvador; Santamaria, Anna; Ponce, Jordi; Matias-Guiu, Xavier; Reventós, Jaume; Gil-Moreno, Antonio; Rigau, Marina; Colas, Eva

    2016-06-18

    Uterine aspirates are used in the diagnostic process of endometrial disorders, yet further applications could emerge if their complex milieu were simplified. Exosome-like vesicles isolated from uterine aspirates could become an attractive source of biomarkers, but there is a need to standardize isolation protocols. The objective of the study was to determine whether exosome-like vesicles exist in the fluid fraction of uterine aspirates and to compare protocols for their isolation, characterization, and analysis. We collected uterine aspirates from 39 pre-menopausal women suffering from benign gynecological diseases. The fluid fractions of 27 of those aspirates were pooled and split into equal volumes to evaluate three differential centrifugation-based procedures: (1) a standard protocol, (2) a filtration protocol, and (3) a sucrose cushion protocol. Characterization of isolated vesicles was assessed by electron microscopy, nanoparticle tracking analysis and immunoblot. Specifically for RNA material, we evaluated the effect of sonication and RNase A treatment at different steps of the protocol. We finally confirmed the efficiency of the selected methods in non-pooled samples. All protocols were useful to isolate exosome-like vesicles. However, the Standard procedure was the best-performing protocol to isolate exosome-like vesicles from uterine aspirates: nanoparticle tracking analysis revealed a higher concentration of vesicles with a mode of 135 ± 5 nm, and immunoblot showed a higher expression of exosome-related markers (CD9, CD63, and CD81), thus verifying an enrichment in this type of vesicles. RNA contained in exosome-like vesicles was successfully extracted without sonication and with digestion of exogenous nucleic acids by RNase A, allowing the analysis of the specific inner cargo by real-time qPCR. We confirmed the existence of exosome-like vesicles in the fluid fraction of uterine aspirates. They were successfully isolated by differential centrifugation, giving sufficient proteomic and transcriptomic material for further analyses. The Standard protocol was the best-performing procedure, since the other two tested protocols improved neither the yield nor the purity of exosome-like vesicles. This study contributes to establishing the basis for future comparative studies to foster the field of biomarker research in gynecology.

  20. Framework and indicator testing protocol for developing and piloting quality indicators for the UK quality and outcomes framework.

    PubMed

    Campbell, Stephen M; Kontopantelis, Evangelos; Hannon, Kerin; Burke, Martyn; Barber, Annette; Lester, Helen E

    2011-08-10

    Quality measures should be subjected to a testing protocol before being used in practice using key attributes such as acceptability, feasibility and reliability, as well as identifying issues derived from actual implementation and unintended consequences. We describe the methodologies and results of an indicator testing protocol (ITP) using data from proposed quality indicators for the United Kingdom Quality and Outcomes Framework (QOF). The indicator testing protocol involved a multi-step and methodological process: 1) The RAND/UCLA Appropriateness Method, to test clarity and necessity, 2) data extraction from patients' medical records, to test technical feasibility and reliability, 3) diaries, to test workload, 4) cost-effectiveness modelling, and 5) semi-structured interviews, to test acceptability, implementation issues and unintended consequences. Testing was conducted in a sample of representative family practices in England. These methods were combined into an overall recommendation for each tested indicator. Using an indicator testing protocol as part of piloting was seen as a valuable way of testing potential indicators in 'real world' settings. Pilot 1 (October 2009-March 2010) involved thirteen indicators across six clinical domains, and twelve of the indicators passed the indicator testing protocol. However, the indicator testing protocol identified a number of implementation issues and unintended consequences that can be rectified or removed prior to national roll-out. A palliative care indicator is used as an exemplar of the value of piloting using a multiple-attribute indicator testing protocol - while technically feasible and reliable, it was unacceptable to practice staff and raised concerns about potentially causing actual patient harm. This indicator testing protocol is one example of a protocol that may be useful in assessing potential quality indicators when adapted to specific country health care settings and may be of use to policy-makers and researchers worldwide to test the likely effect of implementing indicators prior to roll-out. It builds on and codifies existing literature and other testing protocols to create a field-testing methodology that can be used to produce country-specific quality indicators for pay-for-performance or quality improvement schemes.

  1. Sampling design for an integrated socioeconomic and ecological survey by using satellite remote sensing and ordination

    PubMed Central

    Binford, Michael W.; Lee, Tae Jeong; Townsend, Robert M.

    2004-01-01

    Environmental variability is an important risk factor in rural agricultural communities. Testing models requires empirical sampling that generates data that are representative in both economic and ecological domains. Detrended correspondence analysis of satellite remote sensing data was used to design an effective low-cost sampling protocol for a field study to create an integrated socioeconomic and ecological database when no prior information on the ecology of the survey area existed. We stratified the sample for the selection of tambons from various preselected provinces in Thailand based on factor analysis of spectral land-cover classes derived from satellite data. We conducted the survey for the sampled villages in the chosen tambons. The resulting data capture interesting variations in soil productivity and in the timing of good and bad years, which a purely random sample would likely have missed. Thus, this database will allow tests of hypotheses concerning the effect of credit on productivity, the sharing of idiosyncratic risks, and the economic influence of environmental variability. PMID:15254298
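
    The stratified selection step can be illustrated with a short sketch. The fragment below is not the authors' procedure: it substitutes a k-means clustering of illustrative land-cover fractions for the ordination/factor analysis of spectral classes used in the study, then draws a fixed number of units at random from each stratum.

    ```python
    # Minimal sketch of stratified selection: group candidate units (tambons) into
    # strata by land-cover composition, then sample randomly within each stratum.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # Illustrative land-cover fractions (e.g. forest, paddy, urban) for 200 candidate units.
    land_cover = rng.dirichlet(alpha=[2.0, 3.0, 1.0], size=200)

    strata = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(land_cover)

    selected = []
    for s in range(4):
        members = np.flatnonzero(strata == s)
        selected.extend(rng.choice(members, size=min(5, len(members)), replace=False))
    print("selected unit indices:", sorted(int(i) for i in selected))
    ```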

  2. Biological response in vitro of skeletal muscle cells treated with different intensity continuous and pulsed ultrasound fields

    NASA Astrophysics Data System (ADS)

    Abrunhosa, Viviane M.; Mermelstein, Claudia S.; Costa, Manoel L.; Costa-Felix, Rodrigo P. B.

    2011-02-01

    Therapeutic ultrasound has been used in physiotherapy to accelerate tissue healing. Although the ultrasonic wave is widely used in clinical practice, not much is known about the biological effects of ultrasound on cells and tissues. This study aims to evaluate the biological response to ultrasound in primary cultures of chick myogenic cells. To ensure the metrological reliability of the whole measurement process, the ultrasound equipment was calibrated in accordance with IEC 61689:2007. The skeletal muscle cells were divided into four samples. One sample was used as a control group and the others were exposed to ultrasound with different durations, intensities and operation modes: 1) 0.5 W/cm2 continuous for 5 minutes, 2) 0.5 W/cm2 pulsed for 5 minutes, 3) 1.0 W/cm2 pulsed for 10 minutes. The samples were analyzed with phase-contrast optical microscopy before and after the treatment. The results showed alignment of myogenic cells in the sample treated with 0.5 W/cm2 continuous ultrasound for 5 minutes when compared with the control group and the other samples. This study is a first step towards a metrologically and scientifically based protocol for treating cells and tissues under different ultrasound field exposures.

  3. The room temperature preservation of filtered environmental DNA samples and assimilation into a phenol-chloroform-isoamyl alcohol DNA extraction.

    PubMed

    Renshaw, Mark A; Olds, Brett P; Jerde, Christopher L; McVeigh, Margaret M; Lodge, David M

    2015-01-01

    Current research targeting filtered macrobial environmental DNA (eDNA) often relies upon cold ambient temperatures at various stages, including the transport of water samples from the field to the laboratory and the storage of water and/or filtered samples in the laboratory. This poses practical limitations for field collections in locations where refrigeration and frozen storage are difficult or where samples must be transported long distances for further processing and screening. This study demonstrates the successful preservation of eDNA at room temperature (20 °C) in two lysis buffers, CTAB and Longmire's, over a 2-week period. Moreover, the preserved eDNA samples were seamlessly integrated into a phenol-chloroform-isoamyl alcohol (PCI) DNA extraction protocol. The successful application of the eDNA extraction to multiple filter membrane types suggests the methods evaluated here may be broadly applied in future eDNA research. Our results also suggest that for many kinds of studies recently reported on macrobial eDNA, detection probabilities could have been increased, and at a lower cost, by utilizing the Longmire's preservation buffer with a PCI DNA extraction. © 2014 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  4. Improved diagnosis of common bile duct stone with single-shot balanced turbo field-echo sequence in MRCP.

    PubMed

    Noda, Yoshifumi; Goshima, Satoshi; Kojima, Toshihisa; Kawaguchi, Shimpei; Kawada, Hiroshi; Kawai, Nobuyuki; Koyasu, Hiromi; Matsuo, Masayuki; Bae, Kyongtae T

    2017-04-01

    To evaluate the value of adding a single-shot balanced turbo field-echo (b-TFE) sequence to conventional magnetic resonance cholangiopancreatography (MRCP) for the detection of common bile duct (CBD) stone. One hundred thirty-seven consecutive patients with suspected CBD stone underwent MRCP including a single-shot b-TFE sequence. Twenty-five patients were confirmed to have a CBD stone by endoscopic retrograde cholangiopancreatography or ultrasonography. Two radiologists reviewed two image protocols: protocol A (conventional MRCP protocol: unenhanced T1-, T2-, and respiratory-triggered three-dimensional fat-suppressed single-shot turbo spin-echo MRCP sequence) and protocol B (protocol A plus the single-shot b-TFE sequence). The sensitivity, specificity, positive (PPV) and negative predictive value (NPV), and area under the receiver-operating-characteristic (ROC) curve (AUC) for the detection of CBD stone were compared. The sensitivity (72%) and NPV (94%) were the same between the two protocols. However, protocol B showed greater specificity (99%) and PPV (94%) than protocol A (92% and 67%, respectively) (P = 0.0078 and 0.031, respectively). The AUC was significantly greater for protocol B (0.93) than for protocol A (0.86) (P = 0.026). Inclusion of the single-shot b-TFE sequence in conventional MRCP significantly improved the specificity and PPV for the detection of CBD stone.
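
    For readers unfamiliar with the reported metrics, the sketch below shows how sensitivity, specificity, PPV and NPV fall out of a 2x2 confusion matrix. The counts are illustrative values chosen only to be roughly consistent with the percentages quoted above; they are not taken from the study.

    ```python
    # Minimal sketch: diagnostic performance metrics from a 2x2 confusion matrix.
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)   # true positives among all diseased
        specificity = tn / (tn + fp)   # true negatives among all non-diseased
        ppv = tp / (tp + fp)           # positive predictive value
        npv = tn / (tn + fn)           # negative predictive value
        return sensitivity, specificity, ppv, npv

    # Illustrative counts only (25 stone-positive, 112 stone-negative patients).
    sens, spec, ppv, npv = diagnostic_metrics(tp=18, fp=1, fn=7, tn=111)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
    ```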

  5. Field investigations of bacterial contaminants and their effects on extended porcine semen.

    PubMed

    Althouse, G C; Kuster, C E; Clark, S G; Weisiger, R M

    2000-03-15

    Field investigations (n=23) were made over a 3-yr period at North American boar studs and farms in which the primary complaint was sperm agglutination in association with decreased sperm longevity of extended semen, and increased regular returns to estrus and/or vaginal discharges across parity. Microscopic examination of extended semen from these units revealed depressed gross motility (usually <30%), sperm agglutination, and sperm cell death occurring within 2 d of semen collection and processing regardless of the semen extender used. The extended semen exhibited a high number of induced acrosome abnormalities (>20%). Sample pH was acidic (5.7 to 6.4) in 93% of the submitted samples. Aerobic culture yielded a variety of bacteria from different genera. A single bacterial contaminant was obtained from 66% of the submitted samples (n=37 doses); 34% contained 2 or more different bacterial genera. The most frequently isolated contaminant bacteria from porcine extended semen were Alcaligenes xylosoxydans (n=3), Burkholderia cepacia (n=6), Enterobacter cloacae (n=6), Escherichia coli (n=6), Serratia marcescens (n=5), and Stenotrophomonas [Xanthomonas] maltophilia (n=6); these 6 bacteria accounted for 71% of all contaminated samples, and were spermicidal when re-inoculated and incubated in fresh, high-quality extended semen. All contaminant bacteria were found to be resistant to the aminoglycoside gentamicin, a common preservative antibiotic used in commercial porcine semen extenders. Eleven genera were spermicidal in conjunction with an acidic environment, while 2 strains (E. coli, S. maltophilia) were spermicidal without this characteristic acidic environment. Bacteria originated from multiple sources at the stud/farm, and were of animal and nonanimal origin. A minimum contamination technique (MCT) protocol was developed to standardize hygiene and sanitation. This protocol focused on MCTs during boar preparation, semen collection, semen processing and laboratory sanitation. Implementation of the MCT, in addition to specific recommendations in stud management, resulted in the control of bacterial contamination in the extended semen.

  6. Improving Leishmania Species Identification in Different Types of Samples from Cutaneous Lesions

    PubMed Central

    Cruz-Barrera, Mónica L.; Ovalle-Bracho, Clemencia; Ortegon-Vergara, Viviana; Pérez-Franco, Jairo E.

    2015-01-01

    The discrimination of Leishmania species from patient samples has epidemiological and clinical relevance. In this study, different gene target PCR-restriction fragment length polymorphism (RFLP) protocols were evaluated for their robustness as Leishmania species discriminators in 61 patients with cutaneous leishmaniasis. We modified the hsp70-PCR-RFLP protocol and found it to be the most reliable protocol for species identification. PMID:25609727

  7. Development and layout of a protocol for the field performance of concrete deck and crack sealers.

    DOT National Transportation Integrated Search

    2009-09-01

    The main objective of this project was to develop and lay out a protocol for the long-term monitoring and assessment of the performance of concrete deck and crack sealants in the field. To accomplish this goal, a total of six bridge decks were chosen ...

  8. Bioassessment Tools for Stony Corals: Field Testing of Monitoring Protocols in the US Virgin Islands (St. Croix)

    EPA Science Inventory

    Survey protocols for assessing coral reef condition were field tested at 61 reef stations in St. Croix, US Virgin Islands (USVI) during 2006. Three observations for stony corals were recorded: species, size, and percent live tissue. Stony corals were selected because they are pri...

  9. 2010 CEOS Field Reflectance Intercomparisons Lessons Learned

    NASA Technical Reports Server (NTRS)

    Thome, Kurtis; Fox, Nigel

    2011-01-01

    This paper summarizes lessons learned from the 2009 and 2010 joint field campaigns to Tuz Golu, Turkey. Emphasis is placed on the 2010 campaign related to understanding the equipment and measurement protocols, processing schemes, and traceability to SI quantities. Participants in both 2009 and 2010 used an array of measurement approaches to determine surface reflectance. One lesson learned is that even with all of the differences in collection between groups, the differences in reflectance are currently dominated by instrumental artifacts including knowledge of the white reference. Processing methodology plays a limited role once the bi-directional reflectance of the white reference is used rather than a hemispheric-directional value. The lack of a basic set of measurement protocols, or best practices, limits a group's ability to ensure SI traceability and the development of proper error budgets. Finally, rigorous attention to sampling methodology and its impact on instrument behavior is needed. The results of the 2009 and 2010 joint campaigns clearly demonstrate both the need and utility of such campaigns and such comparisons must continue in the future to ensure a coherent set of data that can span multiple sensor types and multiple decades.

  10. Virioplankton 'pegylation': use of PEG (polyethylene glycol) to concentrate and purify viruses in pelagic ecosystems.

    PubMed

    Colombet, J; Robin, A; Lavie, L; Bettarel, Y; Cauchie, H M; Sime-Ngando, T

    2007-12-01

    We have described the use of polyethylene glycol (PEG) for the precipitation of natural communities of aquatic viruses, and its comparison with the usual concentration method based on ultracentrifugation. Experimental samples were obtained from different freshwater ecosystems whose trophic status varied. Based on transmission electron microscope observations and counting of phage-shaped particles, our results showed that the greatest recovery efficiency for all ecosystems was obtained when we used the PEG protocol. On average, this protocol allowed the recovery of >2-fold more viruses, compared to ultracentrifugation. In addition, the diversity of virioplankton, based on genomic size profiling using pulsed field gel electrophoresis, was higher and better discriminated when we used the PEG method. We conclude that pegylation offers a valid, simple and cheaper alternative to ultracentrifugation for the concentration and purification of pelagic viruses.

  11. Reliability and utility of citizen science reef monitoring data collected by Reef Check Australia, 2002-2015.

    PubMed

    Done, Terence; Roelfsema, Chris; Harvey, Andrew; Schuller, Laura; Hill, Jocelyn; Schläppy, Marie-Lise; Lea, Alexandra; Bauer-Civiello, Anne; Loder, Jennifer

    2017-04-15

    Reef Check Australia (RCA) has collected data on benthic composition and cover at >70 sites along >1000 km of Australia's Queensland coast from 2002 to 2015. This paper quantifies the accuracy, precision and power of RCA benthic composition data, to guide its application and interpretation. A simulation study established that the inherent accuracy of the Reef Check point sampling protocol is high (<±7% absolute error), in the range of estimates of benthic cover from 1% to 50%. A field study at three reef sites indicated that, despite minor observer- and deployment-related biases, the protocol does reliably document moderate ecological changes in coral communities. The error analyses were then used to guide the interpretation of inter-annual variability and long-term trends at three study sites in RCA's major 2002-2015 data series for the Queensland coast. Copyright © 2017 Elsevier Ltd. All rights reserved.
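
    The simulation study mentioned above essentially propagates binomial point-sampling error through a cover estimate. The sketch below reproduces that idea under stated assumptions: the number of point intercepts per survey is a placeholder, not the Reef Check figure, and the sampling is treated as simple binomial.

    ```python
    # Minimal sketch: spread of a percent-cover estimate from point-intercept sampling
    # of a transect with known true cover.
    import numpy as np

    rng = np.random.default_rng(1)
    n_points = 160                     # assumed point intercepts per survey (placeholder)
    for true_cover in (0.01, 0.10, 0.25, 0.50):
        estimates = rng.binomial(n_points, true_cover, size=10_000) / n_points
        err95 = np.percentile(np.abs(estimates - true_cover), 95) * 100
        print(f"true cover {true_cover:>4.0%}: 95% of estimates within ±{err95:.1f}% absolute")
    ```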

  12. Quantification of the overall measurement uncertainty associated with the passive moss biomonitoring technique: Sample collection and processing.

    PubMed

    Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A

    2017-05-01

    In this study we examined 6080 data points gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g⁻¹)² for Cd (sample treatment), 35.1 (μg·g⁻¹)² for Cu (sample treatment), and 861.7 (ng·g⁻¹)² for Hg (material selection). These variances correspond to standard deviations that constitute 67%, 126% and 59% of the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m size of the sampling area and once a year sampling) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m size of the sampling area and once a week sampling), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and once a year sampling). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb and Zn in the study area and much higher levels of variation in polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well-defined aspects of the protocol. Further standardization of the protocol together with application of the information on the overall measurement uncertainty would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific research. Copyright © 2017 Elsevier Ltd. All rights reserved.
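
    The link between the quoted variance components and the percentages of regional background is simple arithmetic, sketched below. The background concentrations used here are back-calculated from the percentages in the abstract purely for illustration; they are not reported values.

    ```python
    # Minimal sketch: convert a variance component to a standard deviation and express
    # it as a percentage of an (illustrative) regional background concentration.
    variance_components = {      # element: variance attributed to one protocol aspect
        "Cd": 2851.0,            # (ng/g)^2, sample treatment
        "Cu": 35.1,              # (ug/g)^2, sample treatment
        "Hg": 861.7,             # (ng/g)^2, material selection
    }
    assumed_background = {"Cd": 80.0, "Cu": 4.7, "Hg": 50.0}   # illustrative levels only

    for element, variance in variance_components.items():
        sd = variance ** 0.5
        share = 100 * sd / assumed_background[element]
        print(f"{element}: SD = {sd:.1f}, about {share:.0f}% of the assumed background")
    ```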

  13. Understanding biological mechanisms underlying adverse birth outcomes in developing countries: protocol for a prospective cohort (AMANHI bio-banking) study.

    PubMed

    Baqui, Abdullah H; Khanam, Rasheda; Rahman, Mohammad Sayedur; Ahmed, Aziz; Rahman, Hasna Hena; Moin, Mamun Ibne; Ahmed, Salahuddin; Jehan, Fyezah; Nisar, Imran; Hussain, Atiya; Ilyas, Muhammad; Hotwani, Aneeta; Sajid, Muhammad; Qureshi, Shahida; Zaidi, Anita; Sazawal, Sunil; Ali, Said M; Deb, Saikat; Juma, Mohammed Hamad; Dhingra, Usha; Dutta, Arup; Ame, Shaali Makame; Hayward, Caroline; Rudan, Igor; Zangenberg, Mike; Russell, Donna; Yoshida, Sachiyo; Polašek, Ozren; Manu, Alexander; Bahl, Rajiv

    2017-12-01

    The AMANHI study aims to identify biomarkers as predictors of important pregnancy-related outcomes, and establish a biobank in developing countries for future research as new methods and technologies become available. AMANHI is using harmonised protocols to enrol 3000 women in early pregnancies (8-19 weeks of gestation) for population-based follow-up in pregnancy up to 42 days postpartum in Bangladesh, Pakistan and Tanzania, with collection taking place between August 2014 and June 2016. Urine pregnancy tests will be used to confirm reported or suspected pregnancies, which will then be dated accurately by screening ultrasound performed by trained sonographers. Trained study field workers will collect very detailed phenotypic and epidemiological data from the pregnant woman and her family at scheduled home visits during pregnancy (enrolment, 24-28 weeks, 32-36 weeks & 38+ weeks) and postpartum (days 0-6 or 42-60). Trained phlebotomists will collect maternal and umbilical blood samples, centrifuge and obtain aliquots of serum, plasma and the buffy coat for storage. They will also measure HbA1C and collect a dried spot sample of whole blood. Maternal urine samples will also be collected and stored, alongside placenta, umbilical cord tissue and membrane samples, which will both be frozen and prepared for histology examination. Maternal and newborn stool (for microbiota) as well as paternal and newborn saliva samples (for DNA extraction) will also be collected. All samples will be stored at -80°C in the biobank in each of the three sites. These samples will be linked to numerous epidemiological and phenotypic data with unique study identification numbers. The AMANHI biobank demonstrates that biobanking is feasible in LMICs, but biobank creation is only the first step in addressing current global challenges.

  14. Kiloampere, Variable-Temperature, Critical-Current Measurements of High-Field Superconductors

    PubMed Central

    Goodrich, LF; Cheggour, N; Stauffer, TC; Filla, BJ; Lu, XF

    2013-01-01

    We review variable-temperature, transport critical-current (Ic) measurements made on commercial superconductors over a range of critical currents from less than 0.1 A to about 1 kA. We have developed and used a number of systems to make these measurements over the last 15 years. Two exemplary variable-temperature systems with coil sample geometries will be described: a probe that is only variable-temperature and a probe that is variable-temperature and variable-strain. The most significant challenge for these measurements is temperature stability, since large amounts of heat can be generated by the flow of high current through the resistive sample fixture. Therefore, a significant portion of this review is focused on the reduction of temperature errors to less than ±0.05 K in such measurements. A key feature of our system is a pre-regulator that converts a flow of liquid helium to gas and heats the gas to a temperature close to the target sample temperature. The pre-regulator is not in close proximity to the sample and it is controlled independently of the sample temperature. This allows us to independently control the total cooling power, and thereby fine tune the sample cooling power at any sample temperature. The same general temperature-control philosophy is used in all of our variable-temperature systems, but the addition of another variable, such as strain, forces compromises in design and results in some differences in operation and protocol. These aspects are analyzed to assess the extent to which the protocols for our systems might be generalized to other systems at other laboratories. Our approach to variable-temperature measurements is also placed in the general context of measurement-system design, and the perceived advantages and disadvantages of design choices are presented. To verify the accuracy of the variable-temperature measurements, we compared critical-current values obtained on a specimen immersed in liquid helium (“liquid” or Ic liq) at 5 K to those measured on the same specimen in flowing helium gas (“gas” or Ic gas) at the same temperature. These comparisons indicate the temperature control is effective over the superconducting wire length between the voltage taps, and this condition is valid for all types of sample investigated, including Nb-Ti, Nb3Sn, and MgB2 wires. The liquid/gas comparisons are used to study the variable-temperature measurement protocol that was necessary to obtain the “correct” critical current, which was assumed to be the Ic liq. We also calibrated the magnetoresistance effect of resistive thermometers for temperatures from 4 K to 35 K and magnetic fields from 0 T to 16 T. This calibration reduces systematic errors in the variable-temperature data, but it does not affect the liquid/gas comparison since the same thermometers are used in both cases. PMID:26401435

  15. Calibrated work function mapping by Kelvin probe force microscopy

    NASA Astrophysics Data System (ADS)

    Fernández Garrillo, Pablo A.; Grévin, Benjamin; Chevalier, Nicolas; Borowik, Łukasz

    2018-04-01

    We propose and demonstrate the implementation of an alternative work function tip calibration procedure for Kelvin probe force microscopy under ultrahigh vacuum, using monocrystalline metallic materials with known crystallographic orientation as reference samples, instead of the often-used highly oriented pyrolytic graphite calibration sample. The implementation of this protocol allows the acquisition of absolute and reproducible work function values, with an improved uncertainty with respect to protocols based on unprepared highly oriented pyrolytic graphite. The developed protocol allows the local investigation of absolute work function values over nanostructured samples and can be applied to the characterization of electronic structures and devices, as demonstrated on a nanostructured semiconductor sample presenting Al0.7Ga0.3As and GaAs layers with variable thickness. Additionally, using our protocol we find that the work function of annealed highly oriented pyrolytic graphite is equal to 4.6 ± 0.03 eV.
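
    The calibration arithmetic behind such a protocol is compact: the tip work function is first fixed against a reference of known work function, and the calibrated tip is then used to convert a contact-potential-difference (CPD) map into absolute work-function values. The sketch below uses illustrative numbers and one common sign convention; the actual convention depends on how the bias is applied in a given instrument.

    ```python
    # Minimal sketch of tip-calibrated work-function mapping (values are illustrative).
    import numpy as np

    PHI_REFERENCE_EV = 4.60      # assumed known work function of the reference sample (eV)
    v_cpd_reference = -0.15      # CPD measured on the reference (V), illustrative

    # Calibrate the tip: phi_tip = phi_ref + e * V_cpd_ref (one common sign convention).
    phi_tip = PHI_REFERENCE_EV + v_cpd_reference

    # Apply the calibrated tip to a sample CPD map to obtain absolute work functions.
    v_cpd_map = np.array([[0.05, 0.07], [0.02, 0.04]])   # illustrative CPD image (V)
    work_function_map = phi_tip - v_cpd_map              # eV, same convention as above
    print(work_function_map)
    ```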

  16. Effectiveness of the Preservation Protocol within EPA Method 200.8 for Soluble and Particulate Lead Recovery in Drinking Water

    EPA Science Inventory

    The purpose of this project was to investigate the effectiveness of the sample preservation protocol outlined in Method 200.8 in recovering lead from water samples. Lead recoveries were studied in various water samples spiked with lead by evaluating lead sorption and desorption f...

  17. 21 CFR 660.36 - Samples and protocols.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Samples and protocols. 660.36 Section 660.36 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS... Research Sample Custodian (ATTN: HFM-672) (see mailing addresses in § 600.2 of this chapter), within 30...

  18. BIASES IN CASTNET FILTER PACK RESULTS ASSOCIATED WITH SAMPLING PROTOCOL

    EPA Science Inventory

    In the current study, single filter weekly (w) results are compared with weekly results aggregated from day and night (dn) weekly samples. Comparisons of the two sampling protocols for all major constituents (SO42-, NO3-, NH4+, HNO3, and SO2) show median bias (MB) of < 5 nmol m-3...

  19. Protocol-based care: the standardisation of decision-making?

    PubMed

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the Promoting Action on Research Implementation in Health Services (PARIHS) framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, post-observation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel with paying attention to the influence of context.

  20. Integration of electromagnetic induction sensor data in soil sampling scheme optimization using simulated annealing.

    PubMed

    Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G

    2015-07-01

    Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, and sometimes even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors could be effectively used to direct soil sampling design for assessing spatial variability of soil moisture. A protocol, using a field-scale bulk ECa survey, has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used as a method to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the grid ECa data as weighting function; and the third criterion (mean of average ordinary kriging variance, MAOKV) minimizes mean kriging estimation variance of the target variable. The last criterion utilizes the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented by the software MSANOS, which is able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach has found the optimal solution in a reasonable computation time. The use of the bulk ECa gradient as an exhaustive variable, known at any node of an interpolation grid, has allowed the optimization of the sampling scheme, distinguishing among areas with different priority levels.
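
    A bare-bones version of spatial simulated annealing under the MMSD criterion is sketched below. It is an illustration under stated assumptions (a unit-square field, 15 sample points, a simple geometric cooling law), not the MSANOS implementation.

    ```python
    # Minimal sketch: spatial simulated annealing that perturbs one sample point at a
    # time and accepts moves that reduce the mean distance from grid nodes to the
    # nearest sample (MMSD), or occasionally worse moves via the Metropolis rule.
    import numpy as np

    rng = np.random.default_rng(0)
    grid = np.stack(np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40)), -1).reshape(-1, 2)

    def mmsd(samples):
        # mean, over all grid nodes, of the distance to the nearest sample point
        d = np.linalg.norm(grid[:, None, :] - samples[None, :, :], axis=2)
        return d.min(axis=1).mean()

    samples = rng.random((15, 2))           # initial random scheme of 15 points
    temperature, cooling = 0.05, 0.995      # assumed cooling law
    current = mmsd(samples)

    for _ in range(3000):
        trial = samples.copy()
        i = rng.integers(len(trial))
        trial[i] = np.clip(trial[i] + rng.normal(0, 0.05, 2), 0, 1)   # perturb one point
        candidate = mmsd(trial)
        if candidate < current or rng.random() < np.exp((current - candidate) / temperature):
            samples, current = trial, candidate
        temperature *= cooling

    print(f"optimized MMSD: {current:.4f}")
    ```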

  1. Metabolomic analysis-Addressing NMR and LC-MS related problems in human feces sample preparation.

    PubMed

    Moosmang, Simon; Pitscheider, Maria; Sturm, Sonja; Seger, Christoph; Tilg, Herbert; Halabalaki, Maria; Stuppner, Hermann

    2017-10-31

    Metabolomics is a well-established field in fundamental clinical research with applications in different human body fluids. However, metabolomic investigations in feces are currently an emerging field. Fecal sample preparation is a demanding task due to the high complexity and heterogeneity of the matrix. To gain access to the information enclosed in human feces it is necessary to extract the metabolites and make them accessible to analytical platforms like NMR or LC-MS. In this study, different pre-analytical parameters and factors were investigated, i.e., water content, different extraction solvents, influence of freeze-drying and homogenization, ratios of sample weight to extraction solvent, and their respective impact on metabolite profiles acquired by NMR and LC-MS. The results indicate that profiles are strongly biased by selection of extraction solvent or drying of samples, which causes different metabolites to be lost, under- or overstated. Additionally, signal intensity and reproducibility of the measurement were found to be strongly dependent on sample pre-treatment steps: freeze-drying and homogenization led to improved release of metabolites and thus increased signals, but at the same time induced variations and thus deteriorated reproducibility. We established the first protocol for extraction of human fecal samples and subsequent measurement with both complementary techniques, NMR and LC-MS. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Region-Based Collision Avoidance Beaconless Geographic Routing Protocol in Wireless Sensor Networks.

    PubMed

    Lee, JeongCheol; Park, HoSung; Kang, SeokYoon; Kim, Ki-Il

    2015-06-05

    Due to the lack of dependency on beacon messages for location exchange, the beaconless geographic routing protocol has attracted considerable attention from the research community. However, existing beaconless geographic routing protocols are likely to generate duplicated data packets when multiple winners in the greedy area are selected. Furthermore, these protocols are designed for a uniform sensor field, so they cannot be directly applied to practical irregular sensor fields with partial voids. To prevent the failure of finding a forwarding node and to remove unnecessary duplication, in this paper, we propose a region-based collision avoidance beaconless geographic routing protocol to increase forwarding opportunities for randomly-deployed sensor networks. By assigning different contention priorities to the mutually-communicable nodes and the rest of the nodes in the greedy area, every neighbor node in the greedy area can be used for data forwarding without any packet duplication. Moreover, simulation results are given to demonstrate the increased packet delivery ratio and shortened end-to-end delay relative to well-known comparative protocols.
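
    The contention mechanism that beaconless protocols of this kind rely on can be sketched as a delay function of geographic progress: every neighbor in the greedy (positive-progress) area starts a timer that is shorter the more progress it offers, and the first timer to expire wins the forwarding role. The fragment below is a simplified illustration; the region-based priority offset stands in for the paper's mutually-communicable-region rule, and the timer formula and constants are assumptions rather than the published design.

    ```python
    # Minimal sketch: contention delay assignment for beaconless geographic forwarding.
    import math

    T_MAX = 0.010          # maximum contention delay in seconds (assumed)
    RADIO_RANGE = 100.0    # metres (assumed)

    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def contention_delay(neighbour, forwarder, destination, high_priority_region):
        progress = distance(forwarder, destination) - distance(neighbour, destination)
        if progress <= 0:
            return None                      # outside the greedy area: never contends
        delay = T_MAX * (1.0 - progress / RADIO_RANGE)   # more progress -> shorter timer
        if not high_priority_region:
            delay += T_MAX                   # lower-priority region contends later
        return delay

    forwarder, destination = (0.0, 0.0), (300.0, 0.0)
    neighbours = {"n1": ((80.0, 10.0), True), "n2": ((60.0, -40.0), False)}
    delays = {k: contention_delay(p, forwarder, destination, hp) for k, (p, hp) in neighbours.items()}
    winner = min((k for k, d in delays.items() if d is not None), key=delays.get)
    print(delays, "winner:", winner)
    ```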

  3. Region-Based Collision Avoidance Beaconless Geographic Routing Protocol in Wireless Sensor Networks

    PubMed Central

    Lee, JeongCheol; Park, HoSung; Kang, SeokYoon; Kim, Ki-Il

    2015-01-01

    Due to the lack of dependency on beacon messages for location exchange, the beaconless geographic routing protocol has attracted considerable attention from the research community. However, existing beaconless geographic routing protocols are likely to generate duplicated data packets when multiple winners in the greedy area are selected. Furthermore, these protocols are designed for a uniform sensor field, so they cannot be directly applied to practical irregular sensor fields with partial voids. To prevent the failure of finding a forwarding node and to remove unnecessary duplication, in this paper, we propose a region-based collision avoidance beaconless geographic routing protocol to increase forwarding opportunities for randomly-deployed sensor networks. By assigning different contention priorities to the mutually-communicable nodes and the rest of the nodes in the greedy area, every neighbor node in the greedy area can be used for data forwarding without any packet duplication. Moreover, simulation results are given to demonstrate the increased packet delivery ratio and shortened end-to-end delay relative to well-known comparative protocols. PMID:26057037

  4. Well installation and documentation, and ground-water sampling protocols for the pilot National Water-Quality Assessment Program

    USGS Publications Warehouse

    Hardy, M.A.; Leahy, P.P.; Alley, W.M.

    1989-01-01

    Several pilot projects are being conducted as part of the National Water Quality Assessment (NAWQA) Program. The purpose of the pilot program is to test and refine concepts for a proposed full-scale program. Three of the pilot projects are specifically designed to assess groundwater. The purpose of this report is to describe the criteria that are being used in the NAWQA pilot projects for selecting and documenting wells, installing new wells, and sampling wells for different water quality constituents. Guidelines are presented for the selection of wells for sampling. Information needed to accurately document each well includes site characteristics related to the location of the well, land use near the well, and important well construction features. These guidelines ensure the consistency of the information collected and will provide comparable data for interpretive purposes. Guidelines for the installation of wells are presented and include procedures that need to be followed for preparations prior to drilling, the selection of the drilling technique and casing type, the grouting procedure, and the well-development technique. A major component of the protocols is related to water quality sampling. Tasks are identified that need to be completed prior to visiting the site for sampling. Guidelines are presented for purging the well prior to sampling, both in terms of the volume of water pumped and the chemical stability of field parameters. Guidelines are presented concerning sampler selection as related to both inorganic and organic constituents. Documentation needed to describe the measurements and observations related to sampling each well and treating and preserving the samples is also presented. Procedures are presented for the storage and shipping of water samples, equipment cleaning, and quality assurance. Quality assurance guidelines include the description of the general distribution of the various quality assurance samples (blanks, spikes, duplicates, and reference samples) that will be used in the pilot program. (Lantz-PTT)
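
    The purge-stabilization guideline lends itself to a simple check: field parameters are logged during purging and sampling begins once consecutive readings agree within set tolerances. The sketch below illustrates that logic; the parameter list and tolerances are placeholders, not the NAWQA or National Field Manual values.

    ```python
    # Minimal sketch: decide whether a well is ready to sample based on agreement of
    # the last few field-parameter readings within assumed tolerances.
    STABILITY_TOLERANCES = {"pH": 0.1, "spec_cond_uS_cm": 5.0, "temp_C": 0.2, "do_mg_L": 0.3}

    def is_stable(readings, n_consecutive=3):
        """readings: list of dicts of field-parameter values, oldest first."""
        if len(readings) < n_consecutive:
            return False
        window = readings[-n_consecutive:]
        for param, tol in STABILITY_TOLERANCES.items():
            values = [r[param] for r in window]
            if max(values) - min(values) > tol:
                return False
        return True

    purge_log = [
        {"pH": 7.31, "spec_cond_uS_cm": 412, "temp_C": 14.9, "do_mg_L": 4.1},
        {"pH": 7.28, "spec_cond_uS_cm": 409, "temp_C": 14.8, "do_mg_L": 4.0},
        {"pH": 7.27, "spec_cond_uS_cm": 408, "temp_C": 14.8, "do_mg_L": 3.9},
    ]
    print("ready to sample:", is_stable(purge_log))
    ```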

  5. Soil sampling strategies for site assessments in petroleum-contaminated areas.

    PubMed

    Kim, Geonha; Chowdhury, Saikat; Lin, Yen-Min; Lu, Chih-Jen

    2017-04-01

    Environmental site assessments are frequently executed for monitoring and remediation performance evaluation purposes, especially in total petroleum hydrocarbon (TPH)-contaminated areas, such as gas stations. As a key issue, reproducibility of the assessment results must be ensured, especially if attempts are made to compare results between different institutions. Although it is widely known that uncertainties associated with soil sampling are much higher than those with chemical analyses, field guides or protocols to deal with these uncertainties are not stipulated in detail in the relevant regulations, causing serious errors and distortion of the reliability of environmental site assessments. In this research, uncertainties associated with soil sampling and sample reduction for chemical analysis were quantified using laboratory-scale experiments and the theory of sampling. The research results showed that the TPH mass assessed by sampling tends to be overestimated and sampling errors are high, especially for the low range of TPH concentrations. Homogenization of soil was found to be an efficient method to suppress uncertainty, but high-resolution sampling could be an essential way to minimize this.
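
    The effect described above, higher sampling error at low TPH concentrations, can be reproduced with a toy Monte Carlo experiment. The sketch below assumes a hypothetical lot in which a small fraction of contaminated "nuggets" carries most of the TPH mass; all numbers are illustrative and are unrelated to the cited laboratory experiments.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy lot: 100,000 soil increments; 1% are contaminated "nuggets"
    # carrying most of the TPH mass (illustrative numbers only).
    n_increments = 100_000
    tph = np.where(rng.random(n_increments) < 0.01,
                   rng.lognormal(mean=7.0, sigma=0.5, size=n_increments),  # nuggets
                   rng.lognormal(mean=2.0, sigma=0.3, size=n_increments))  # matrix
    true_mean = tph.mean()

    for sample_size in (5, 50, 500, 5000):
        means = np.array([rng.choice(tph, sample_size, replace=False).mean()
                          for _ in range(1000)])
        rsd = means.std() / true_mean * 100
        print(f"n={sample_size:5d}  relative sampling error ~ {rsd:5.1f}% "
              f"(true mean {true_mean:.0f})")
    ```

    Small samples from such a nugget-dominated lot show large run-to-run scatter and a skewed (often overestimating) distribution of estimated means, which is the behavior the abstract attributes to low-concentration TPH sites; homogenization or larger, higher-resolution sampling reduces the scatter.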

  6. Qualitative and quantitative assessment of unresolved complex mixture in PM2.5 of Bakersfield, CA

    NASA Astrophysics Data System (ADS)

    Nallathamby, Punith Dev; Lewandowski, Michael; Jaoui, Mohammed; Offenberg, John H.; Kleindienst, Tadeusz E.; Rubitschun, Caitlin; Surratt, Jason D.; Usenko, Sascha; Sheesley, Rebecca J.

    2014-12-01

    The 2010 CalNex (California Nexus) field experiment offered an opportunity for detailed characterization of atmospheric particulate carbon composition and sources in Bakersfield, CA. In the current study, the authors describe and employ a new protocol for reporting unresolved complex mixture (UCM) in over 30 daily samples. The Bakersfield, CA site has significant contribution from UCM, 2.9 ± 2.2% of the daily OC, which makes it an ideal first application. The new protocol reports two UCM peaks for Bakersfield with unique mean vapor pressure, retention time, mass spectra and daily ambient concentration trends. The first UCM peak, UCM-A, was comprised of semi-volatile compounds including alkanes, alkenes, and alkynes, with a mean vapor pressure of 2E-04 Torr and medium to heavy-duty diesel exhaust as a likely source. The second UCM peak, UCM-B, was comprised of linear, branched, and cyclic alkanes, with a mean vapor pressure of 1E-08 Torr. UCM-B had strong similarities to UCM in the NIST Standard Reference Material 1649b (urban dust) and to previously reported, detailed UCM for a representative Bakersfield sample, with possible sources including: motor vehicle exhaust, agricultural activities, and construction activities.

  7. Protocol for Biomarker Ratio Imaging Microscopy with Specific Application to Ductal Carcinoma In situ of the Breast

    PubMed Central

    Clark, Andrea J.; Petty, Howard R.

    2016-01-01

    This protocol describes the methods and steps involved in performing biomarker ratio imaging microscopy (BRIM) using formalin fixed paraffin-embedded (FFPE) samples of human breast tissue. The technique is based on the acquisition of two fluorescence images of the same microscopic field using two biomarkers and immunohistochemical tools. The biomarkers are selected such that one biomarker correlates with breast cancer aggressiveness while the second biomarker anti-correlates with aggressiveness. When the former image is divided by the latter image, a computed ratio image is formed that reflects the aggressiveness of tumor cells while increasing contrast and eliminating path-length and other artifacts from the image. For example, the aggressiveness of epithelial cells may be assessed by computing ratio images of N-cadherin and E-cadherin images or CD44 and CD24 images, which specifically reflect the mesenchymal or stem cell nature of the constituent cells, respectively. This methodology is illustrated for tissue samples of ductal carcinoma in situ (DCIS) and invasive breast cancer. This tool should be useful in tissue studies of experimental cancer as well as the management of cancer patients. PMID:27857940
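
    The core computation of BRIM, dividing one registered fluorescence image by the other, can be expressed in a few lines. The sketch below assumes two pre-registered single-channel images stored as NumPy arrays; the epsilon and clipping choices are illustrative conveniences, not part of the published protocol.

    ```python
    import numpy as np

    def ratio_image(numerator, denominator, eps=1e-6, clip_percentile=99.5):
        """Compute a biomarker ratio image from two registered fluorescence
        images (e.g., N-cadherin over E-cadherin) as a toy illustration of the
        BRIM computation described above. Division cancels path-length and
        illumination artifacts common to both channels; eps avoids division by
        zero and the result is clipped for display.
        """
        num = numerator.astype(float)
        den = denominator.astype(float)
        ratio = num / (den + eps)
        hi = np.percentile(ratio, clip_percentile)
        return np.clip(ratio, 0, hi)

    # example with synthetic 256 x 256 images standing in for the two channels
    img_ncad = np.random.rand(256, 256)
    img_ecad = np.random.rand(256, 256) + 0.1
    r = ratio_image(img_ncad, img_ecad)
    ```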

  8. Test/QA plan for the validation of the verification protocol for high speed pesticide spray drift reduction technologies for row and field crops

    EPA Science Inventory

    This test/QA plan for evaluating the generic test protocol for high-speed wind tunnel (representing aerial application) pesticide spray drift reduction technologies (DRT) for row and field crops conforms with EPA Requirements for Quality Assurance Project Plans (EPA QA/R...

  9. Test/QA plan for the validation of the verification protocol for low speed pesticide spray drift reduction technologies for row and field crops

    EPA Science Inventory

    This test/QA plan for evaluating the generic test protocol for high-speed wind tunnel, representing aerial application, pesticide spray drift reduction technologies (DRT) for row and field crops conforms with EPA Requirements for Quality Assurance Project Plans (EPA QA/R...

  10. Polymorphism at the merozoite surface protein-3alpha locus of Plasmodium vivax: global and local diversity.

    PubMed

    Bruce, M C; Galinski, M R; Barnwell, J W; Snounou, G; Day, K P

    1999-10-01

    Allelic diversity at the Plasmodium vivax merozoite surface protein-3alpha (PvMsp-3alpha) locus was investigated using a combined polymerase chain reaction/restriction fragment length polymorphism (PCR/RFLP) protocol. Symptomatic patient isolates from global geographic origins showed a high level of polymorphism at the nucleotide level. These samples were used to validate the sensitivity, specificity, and reproducibility of the PCR/RFLP method. It was then used to investigate PvMsp3alpha diversity in field samples from children living in a single village in a malaria-endemic region of Papua New Guinea, with the aim of assessing the usefulness of this locus as an epidemiologic marker of P. vivax infections. Eleven PvMsp-3alpha alleles were distinguishable in 16 samples with single infections, revealing extensive parasite polymorphism within this restricted area. Multiple infections were easily detected and accounted for 5 (23%) of 22 positive samples. Pairs of samples from individual children provided preliminary evidence for high turnover of P. vivax populations.

  11. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  12. Clauser-Horne-Shimony-Holt versus three-party pseudo-telepathy: on the optimal number of samples in device-independent quantum private query

    NASA Astrophysics Data System (ADS)

    Basak, Jyotirmoy; Maitra, Subhamoy

    2018-04-01

    In the device-independent (DI) paradigm, the trust assumptions on the devices are removed and a CHSH test is performed to check the functionality of the devices toward certifying the security of the protocol. The existing DI protocols consider an infinite number of samples from a theoretical point of view, though this is not practically implementable. For a finite-sample analysis of the existing DI protocols, we may also consider strategies other than the CHSH test for checking device independence. In this direction, here we present a comparative analysis between the CHSH game and the three-party pseudo-telepathy game for the quantum private query protocol in the DI paradigm that appeared recently in Maitra et al. (Phys Rev A 95:042344, 2017).
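
    For reference, the CHSH test mentioned above certifies the devices through the standard correlator (textbook form, not specific to the cited protocol):

    ```latex
    S \;=\; E(a_1,b_1) + E(a_1,b_2) + E(a_2,b_1) - E(a_2,b_2),
    \qquad |S|_{\text{classical}} \le 2,
    \qquad |S|_{\text{quantum}} \le 2\sqrt{2}.
    ```

    Finite-sample certification then amounts to estimating S from a limited number of measurement rounds, which is why both the choice of test and the number of samples matter for practical DI security.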

  13. Patient-centred screening for primary immunodeficiency, a multi-stage diagnostic protocol designed for non-immunologists: 2011 update

    PubMed Central

    de Vries, E

    2012-01-01

    Members of the European Society for Immunodeficiencies (ESID) and other colleagues have updated the multi-stage expert-opinion-based diagnostic protocol for non-immunologists incorporating newly defined primary immunodeficiency diseases (PIDs). The protocol presented here aims to increase the awareness of PIDs among doctors working in different fields. Prompt identification of PID is important for prognosis, but this may not be an easy task. The protocol therefore starts from the clinical presentation of the patient. Because PIDs may present at all ages, this protocol is aimed at both adult and paediatric physicians. The multi-stage design allows cost-effective screening for PID of the large number of potential cases in the early phases, with more expensive tests reserved for definitive classification in collaboration with a specialist in the field of immunodeficiency at a later stage. PMID:22132890

  14. FoldGPCR: structure prediction protocol for the transmembrane domain of G protein-coupled receptors from class A.

    PubMed

    Michino, Mayako; Chen, Jianhan; Stevens, Raymond C; Brooks, Charles L

    2010-08-01

    Building reliable structural models of G protein-coupled receptors (GPCRs) is a difficult task because of the paucity of suitable templates, low sequence identity, and the wide variety of ligand specificities within the superfamily. Template-based modeling is known to be the most successful method for protein structure prediction. However, refinement of homology models within 1-3 Å Cα RMSD of the native structure remains a major challenge. Here, we address this problem by developing a novel protocol (foldGPCR) for modeling the transmembrane (TM) region of GPCRs in complex with a ligand, aimed to accurately model the structural divergence between the template and target in the TM helices. The protocol is based on predicted conserved inter-residue contacts between the template and target, and exploits an all-atom implicit membrane force field. The placement of the ligand in the binding pocket is guided by biochemical data. The foldGPCR protocol is implemented by a stepwise hierarchical approach, in which the TM helical bundle and the ligand are assembled by simulated annealing trials in the first step, and the receptor-ligand complex is refined with replica exchange sampling in the second step. The protocol is applied to model the human β2-adrenergic receptor (β2AR) bound to carazolol, using contacts derived from the template structure of bovine rhodopsin. Comparison with the X-ray crystal structure of the β2AR shows that our protocol is particularly successful in accurately capturing helix backbone irregularities and helix-helix packing interactions that distinguish rhodopsin from β2AR. © 2010 Wiley-Liss, Inc.

  15. Comparison of two DNA extraction protocols from leaf samples of Cotinus coggygria, Citrus sinensis and genus Juglans.

    PubMed

    Fallah, F; Minaei Chenar, H; Amiri, H; Omodipour, S; Shirbande Ghods, F; Kahrizi, D; Sohrabi, M; Ghorbani, T; Kazemi, E

    2017-02-28

    High quality DNA is essential for molecular research. Secondary metabolites can affect the quantity and quality of DNA. In the current research, two DNA isolation methods, CTAB and Dellaporta (protocols 1 and 2, respectively), were applied to leaf samples from Cotinus coggygria, Citrus sinensis and the genus Juglans, whose leaves are rich in secondary metabolites. We isolated DNA from C. coggygria, C. sinensis and Juglans using the two protocols described above. Good quality DNA was obtained from all three species using protocol 1, while protocol 2 failed to produce usable DNA from these sources. The highest amount of DNA (1.3-1.6) was also obtained using protocol 1. We conclude that protocol 1 may work better for plants rich in secondary metabolites.

  16. Distributional assumptions in food and feed commodities- development of fit-for-purpose sampling protocols.

    PubMed

    Paoletti, Claudia; Esbensen, Kim H

    2015-01-01

    Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of non-random distribution within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked because the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity, the Kernel Lot Distribution Assessment, are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.
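
    Even under the idealized random-mixing assumption criticized above, the binomial sampling floor is easy to compute. The sketch below evaluates the relative standard deviation of an estimated GM kernel fraction near the 0.9% EU labelling threshold; it is a hypothetical illustration, and real lots with distributional heterogeneity would show substantially larger errors than this floor.

    ```python
    import math

    def kernel_sampling_rsd(p_gm, n_kernels):
        """Relative standard deviation of the estimated GM fraction when
        n_kernels are drawn at random from a lot with true GM fraction p_gm.
        This binomial bound holds only under the idealized assumption of a
        perfectly mixed (random) lot, exactly the assumption the abstract
        argues is unrealistic for real commodity lots.
        """
        return math.sqrt((1.0 - p_gm) / (n_kernels * p_gm))

    for n in (100, 1000, 10000):
        print(n, f"{kernel_sampling_rsd(0.009, n):.1%}")  # near the 0.9% threshold
    ```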

  17. High sensitivity Troponin T: an audit of implementation of its protocol in a district general hospital.

    PubMed

    Kalim, Shahid; Nazir, Shaista; Khan, Zia Ullah

    2013-01-01

    Protocols based on newer high-sensitivity Troponin T (hsTropT) assays can rule in a suspected acute myocardial infarction (AMI) as early as 3 hours. We conducted this study to audit adherence to our Trust's newly introduced AMI diagnostic protocol based on paired hsTropT testing at 0 and 3 hours. We retrospectively reviewed data of all patients who had an hsTropT test done between 1st and 7th May 2012. Patients' demographics, use of single or paired samples, time interval between paired samples, presenting symptoms, and ECG findings were noted, and means, medians, standard deviations, and proportions were calculated. A total of 66 patients had an hsTropT test done during this period. Mean age was 63.30 +/- 17.46 years and 38 (57.57%) were males. Twenty-four (36.36%) patients had only a single hsTropT sample taken, rather than the protocol-recommended paired samples. Among the 42 (63.63%) patients with paired samples, the mean time interval was 4.41 +/- 5.7 hours. Contrary to the recommendations, 15 (22.73%) had a very long and 2 (3.03%) a very short interval between the two samples. A subgroup analysis of patients with single samples found only 2 (3.03%) patients with ST-segment elevation, for whom single testing was appropriate. Our study confirmed that in a large number of patients the protocol for paired sampling, or the recommended 3-hour interval between the two samples, was not being followed.

  18. Performance of Identifiler Direct and PowerPlex 16 HS on the Applied Biosystems 3730 DNA Analyzer for processing biological samples archived on FTA cards.

    PubMed

    Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal

    2012-09-01

    Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with a high-throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in a 10 μL PCR volume for both STR systems. These protocols proved effective in generating high quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The newly developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Multisite tumor sampling enhances the detection of intratumor heterogeneity at all different temporal stages of tumor evolution.

    PubMed

    Erramuzpe, Asier; Cortés, Jesús M; López, José I

    2018-02-01

    Intratumor heterogeneity (ITH) is an inherent process of tumor development that has received much attention in recent years, as it has become a major obstacle to the success of targeted therapies. ITH is also temporally unpredictable across tumor evolution, which makes its characterization even more problematic since detection success depends on the temporal snapshot at which ITH is analyzed. New and more efficient strategies for tumor sampling, which currently relies entirely on the pathologist's interpretation, are needed to overcome these difficulties. Recently, we showed that a new strategy, multisite tumor sampling, works better than the routine sampling protocol for ITH detection when tumor time evolution is not taken into consideration. Here, we extend this work and compare the ITH detection of multisite tumor sampling and routine sampling protocols across tumor time evolution; in particular, we provide in silico analyses of both strategies at early and late temporal stages for four different models of tumor evolution (linear, branched, neutral, and punctuated). Our results indicate that multisite tumor sampling outperforms routine protocols in detecting ITH at all temporal stages of tumor evolution. We conclude that multisite tumor sampling is more advantageous than routine protocols for detecting intratumor heterogeneity.
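
    The intuition behind multisite sampling can be conveyed with a toy Monte Carlo sketch, shown below, in which scattered small samples are compared with clustered samples for the chance of hitting one small subclonal region. The grid size, subclone fraction, and sampling patterns are hypothetical and are not the in silico tumor-evolution models used in the cited study.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def detect(n_samples, scattered, subclone_frac=0.05, grid=50, trials=2000):
        """Toy comparison: probability that at least one sample hits a small
        subclonal region. 'scattered' samples are drawn uniformly over the
        tumor section (multisite-like); otherwise samples are clustered in one
        quadrant (routine-like). Purely illustrative."""
        hits = 0
        side = int(np.sqrt(subclone_frac) * grid)  # side of the subclonal patch
        for _ in range(trials):
            x0, y0 = rng.integers(0, grid - side, size=2)  # random patch location
            if scattered:
                xs = rng.integers(0, grid, n_samples)
                ys = rng.integers(0, grid, n_samples)
            else:
                xs = rng.integers(0, grid // 2, n_samples)
                ys = rng.integers(0, grid // 2, n_samples)
            in_patch = ((xs >= x0) & (xs < x0 + side) &
                        (ys >= y0) & (ys < y0 + side))
            hits += in_patch.any()
        return hits / trials

    print("routine (clustered) :", detect(n_samples=8, scattered=False))
    print("multisite (scattered):", detect(n_samples=8, scattered=True))
    ```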

  20. Environmental DNA sampling protocol - filtering water to capture DNA from aquatic organisms

    USGS Publications Warehouse

    Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.; Strickler, Katherine M.

    2015-09-29

    Environmental DNA (eDNA) analysis is an effective method of determining the presence of aquatic organisms such as fish, amphibians, and other taxa. This publication is meant to guide researchers and managers in the collection, concentration, and preservation of eDNA samples from lentic and lotic systems. A sampling workflow diagram and three sampling protocols are included as well as a list of suggested supplies. Protocols include filter and pump assembly using: (1) a hand-driven vacuum pump, ideal for sample collection in remote sampling locations where no electricity is available and when equipment weight is a primary concern; (2) a peristaltic pump powered by a rechargeable battery-operated driver/drill, suitable for remote sampling locations when weight consideration is less of a concern; (3) a 120-volt alternating current (AC) powered peristaltic pump suitable for any location where 120-volt AC power is accessible, or for roadside sampling locations. Images and detailed descriptions are provided for each step in the sampling and preservation process.

  1. Magnetic resonance imaging (MRI): A review of genetic damage investigations.

    PubMed

    Vijayalaxmi; Fatahi, Mahsa; Speck, Oliver

    2015-01-01

    Magnetic resonance imaging (MRI) is a powerful, non-invasive diagnostic medical imaging technique widely used to acquire detailed information about anatomy and function of different organs in the body, in both health and disease. It utilizes electromagnetic fields of three different frequency bands: static magnetic field (SMF), time-varying gradient magnetic fields (GMF) in the kHz range and pulsed radiofrequency fields (RF) in the MHz range. There have been some investigations examining the extent of genetic damage following exposure of bacterial and human cells to all three frequency bands of electromagnetic fields, as used during MRI: the rationale for these studies is the well-documented evidence of a positive correlation between significantly increased genetic damage and carcinogenesis. Overall, the published data were not sufficiently informative and useful because of small sample sizes, inappropriate comparison of experimental groups, etc. Moreover, when increased damage was observed in MRI-exposed cells, the fate of such lesions was not further explored through multiple 'downstream' events. This review provides: (i) information on the basic principles used in MRI technology, (ii) detailed experimental protocols, results and critical comments on the genetic damage investigations thus far conducted using MRI equipment and (iii) a discussion of several gaps in knowledge in the current scientific literature on MRI. Comprehensive, international, multi-centered collaborative studies, using a common and widely used MRI exposure protocol (cardiac or brain scan) incorporating several genetic/epigenetic damage end-points as well as epidemiological investigations, in a large number of individuals/patients, are warranted to reduce and perhaps eliminate the uncertainties raised in genetic damage investigations in cells exposed in vitro and in vivo to MRI. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Protocols for Robust Herbicide Resistance Testing in Different Weed Species.

    PubMed

    Panozzo, Silvia; Scarabel, Laura; Collavo, Alberto; Sattin, Maurizio

    2015-07-02

    Robust protocols to test putative herbicide resistant weed populations at whole plant level are essential to confirm the resistance status. The presented protocols, based on whole-plant bioassays performed in a greenhouse, can be readily adapted to a wide range of weed species and herbicides through appropriate variants. Seed samples from plants that survived a field herbicide treatment are collected and stored dry at low temperature until used. Germination methods differ according to weed species and seed dormancy type. Seedlings at similar growth stage are transplanted and maintained in the greenhouse under appropriate conditions until plants have reached the right growth stage for herbicide treatment. Accuracy is required to prepare the herbicide solution to avoid unverifiable mistakes. Other critical steps such as the application volume and spray speed are also evaluated. The advantages of this protocol, compared to others based on whole plant bioassays using one herbicide dose, are related to the higher reliability and the possibility of inferring the resistance level. Quicker and less expensive in vivo or in vitro diagnostic screening tests have been proposed (Petri dish bioassays, spectrophotometric tests), but they provide only qualitative information and their widespread use is hindered by the laborious set-up that some species may require. For routine resistance testing, the proposed whole plant bioassay can be applied at only one herbicide dose, so reducing the costs.

  3. Performance assessment of two lysis methods for direct identification of yeasts from clinical blood cultures using MALDI-TOF mass spectrometry.

    PubMed

    Jeddi, Fakhri; Yapo-Kouadio, Gisèle Cha; Normand, Anne-Cécile; Cassagne, Carole; Marty, Pierre; Piarroux, Renaud

    2017-02-01

    In cases of fungal infection of the bloodstream, rapid species identification is crucial to provide adapted therapy and thereby ameliorate patient outcome. Currently, the commercial Sepsityper kit and the sodium-dodecyl sulfate (SDS) method coupled with MALDI-TOF mass spectrometry are the most commonly reported lysis protocols for direct identification of fungi from positive blood culture vials. However, the performance of these two protocols has never been compared on clinical samples. Accordingly, we performed a two-step survey on two distinct panels of clinical positive blood culture vials to identify the most efficient protocol, establish an appropriate log score (LS) cut-off, and validate the best method. We first compared the performance of the Sepsityper and the SDS protocols on 71 clinical samples. For 69 monomicrobial samples, mass spectrometry LS values were significantly higher with the SDS protocol than with the Sepsityper method (P < .0001), especially when the best score of four deposited spots was considered. Next, we established the LS cut-off for accurate identification at 1.7, based on specimen DNA sequence data. Using this LS cut-off, 66 (95.6%) and 46 (66.6%) isolates were correctly identified at the species level with the SDS and the Sepsityper protocols, respectively. In the second arm of the survey, we validated the SDS protocol on an additional panel of 94 clinical samples. Ninety-two (98.9%) of 93 monomicrobial samples were correctly identified at the species level (median LS = 2.061). Overall, our data suggest that the SDS method yields more accurate species identification of yeasts, than the Sepsityper protocol. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Optimization of Native and Formaldehyde iPOND Techniques for Use in Suspension Cells.

    PubMed

    Wiest, Nathaniel E; Tomkinson, Alan E

    2017-01-01

    The isolation of proteins on nascent DNA (iPOND) technique developed by the Cortez laboratory allows a previously unparalleled ability to examine proteins associated with replicating and newly synthesized DNA in mammalian cells. Both the original, formaldehyde-based iPOND technique and a more recent derivative, accelerated native iPOND (aniPOND), have mostly been performed in adherent cell lines. Here, we describe modifications to both protocols for use with suspension cell lines. These include cell culture, pulse, and chase conditions that optimize sample recovery in both protocols using suspension cells and several key improvements to the published aniPOND technique that reduce sample loss, increase signal to noise, and maximize sample recovery. Additionally, we directly and quantitatively compare the iPOND and aniPOND protocols to test the strengths and limitations of both. Finally, we present a detailed protocol to perform the optimized aniPOND protocol in suspension cell lines. © 2017 Elsevier Inc. All rights reserved.

  5. Corrosion and mechanical performance of AZ91 exposed to simulated inflammatory conditions.

    PubMed

    Brooks, Emily K; Der, Stephanie; Ehrensberger, Mark T

    2016-03-01

    Magnesium (Mg) and its alloys, including Mg-9%Al-1%Zn (AZ91), are biodegradable metals with potential use as temporary orthopedic implants. Invasive orthopedic procedures can provoke an inflammatory response that produces hydrogen peroxide (H2O2) and an acidic environment near the implant. This study assessed the influence of inflammation on both the corrosion and mechanical properties of AZ91. The AZ91 samples in the inflammatory protocol were immersed for three days in a complex biologically relevant electrolyte (AMEM culture media) that contained serum proteins (FBS), 150 mM of H2O2, and was titrated to a pH of 5. The control protocol immersed AZ91 samples in the same biologically relevant electrolyte (AMEM & FBS) but without H2O2 and the acid titration. After 3 days all samples were switched into fresh AMEM & FBS for an additional 3-day immersion. During the initial immersion, inflammatory protocol samples showed increased corrosion rate determined by mass loss testing, increased Mg and Al ion released to solution, and a completely corroded surface morphology as compared to the control protocol. Although corrosion in both protocols slowed once the test electrolyte solution was replaced at 3 days, the samples originally exposed to the simulated inflammatory conditions continued to display enhanced corrosion rates as compared to the control protocol. These lingering effects may indicate the initial inflammatory corrosion processes modified components of the surface oxide and corrosion film or initiated aggressive localized processes that subsequently left the interface more vulnerable to continued enhanced corrosion. The electrochemical properties of the interfaces were also evaluated by EIS, which found that the corrosion characteristics of the AZ91 samples were potentially influenced by the role of intermediate adsorption layer processes. The increased corrosion observed for the inflammatory protocol did not affect the flexural mechanical properties of the AZ91 at any time point assessed. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Two Decades into the LCR: What We Do and Still Don’t Know to Solve Lead Problems

    EPA Science Inventory

    Site selection and sampling protocol biases in LCR sampling underestimate peak lead and copper concentrations while missing erratic lead release episodes resulting from distribution system chemical and physical disturbances. Possible site targeting and sampling protocol changes could...

  7. Vascular Blood Collection protocol samples into MELFI

    NASA Image and Video Library

    2011-10-18

    iss029e028495 (10/18/2011) --- Japan Aerospace Exploration Agency astronaut Satoshi Furukawa, Expedition 29 flight engineer, prepares to put samples from the CSA (Canadian Space Agency) Vascular Blood Collection protocol into the MELFI-1 (Minus Eighty Laboratory Freezer for ISS 1) unit.

  8. Two Decades into the LCR: What We Do and Still Don’t Know to Solve Lead Problems - abstract

    EPA Science Inventory

    Site selection and sampling protocol biases in LCR sampling underestimate peak lead and copper concentrations while missing erratic lead release episodes resulting from distribution system chemical and physical disturbances. Possible site targeting and sampling protocol changes could...

  9. Rapid pulsed-field gel electrophoresis protocol for subtyping of Streptococcus suis serotype 2.

    PubMed

    Luey, Cindy K Y; Chu, Yiu Wai; Cheung, Terence K M; Law, Catherine C P; Chu, Man Yu; Cheung, Danny T L; Kam, Kai Man

    2007-03-01

    A rapid pulsed-field gel electrophoresis (PFGE) protocol for subtyping of Streptococcus suis serotype 2 was developed and evaluated using 27 clinical isolates from 22 epidemiologically unrelated patients. Results were matched against antibiogram, virulence genotyping and multilocus sequence typing (MLST). PFGE appeared to be the most discriminatory, with a numerical index of discrimination (D) equal to 0.87.

  10. Monitoring riparian-vegetation composition and cover along the Colorado River downstream of Glen Canyon Dam, Arizona

    USGS Publications Warehouse

    Palmquist, Emily C.; Ralston, Barbara E.; Sarr, Daniel A.; Johnson, Taylor C.

    2018-06-05

    Vegetation in the riparian zone (the area immediately adjacent to streams, such as stream banks) along the Colorado River downstream of Glen Canyon Dam, Arizona, supports many ecosystem and societal functions. In both Glen Canyon and Grand Canyon, this ecosystem has changed over time in response to flow alterations, invasive species, and recreational use. Riparian-vegetation cover and composition are likely to continue to change as these pressures persist and new ones emerge. Because this system is a valuable resource that is known to change in response to flow regime and other disturbances, a long-term monitoring protocol has been designed with three primary objectives: (1) annually measure and summarize the status (composition and cover) of native and non-native vascular-plant species within the riparian zone of the Colorado River between Glen Canyon Dam and Lake Mead; (2) at 5-year intervals, assess change in vegetation composition and cover in the riparian zone, as related to geomorphic setting and dam operations, particularly flow regime; and (3) collect data in a manner that can be used by multiple stakeholders, particularly the basinwide monitoring program overseen by the National Park Service's Northern Colorado Plateau Network Inventory and Monitoring program. A protocol for the long-term monitoring of riparian vegetation is described in detail and standard operating procedures are included herein for all tasks. Visual estimates of foliar and ground covers are collected in conjunction with environmental measurements to assess correlations of foliar cover with abiotic and flow variables. Sample quadrats are stratified by frequency of inundation, geomorphic feature, and by river segment to account for differences in vegetation type. Photographs of sites are also taken to illustrate qualitative characteristics of the site at the time of sampling. Procedures for field preparation, generating random samples, data collection, data management, collecting and managing unknown species collections, and reporting are also described. Although this protocol is intended to be consistent over the long-term, procedures for minor and major revisions to the protocol are also outlined.

  11. Exposing Underrepresented Groups to Climate Change and Atmospheric Science Through Service Learning and Community-Based Participatory Research

    NASA Astrophysics Data System (ADS)

    Padgett, D.

    2016-12-01

    Tennessee State University (TSU) is among seven partner institutions in the NASA-funded project "Mission Earth: Fusing Global Learning and Observations to Benefit the Environment (GLOBE) with NASA Assets to Build Systemic Innovation in STEM Education." The primary objective at the TSU site is to expose high school students from racial and ethnic groups traditionally underrepresented in STEM to atmospheric science and physical systems associated with climate change. Currently, undergraduate students enrolled in TSU's urban and physical courses develop lessons for high school students focused upon the analysis of global warming phenomena and related extreme weather events. The GLOBE Atmosphere Protocols are emphasized in exercises focused upon the urban heat island (UHI) phenomenon and air quality measurements. Pre-service teachers at TSU, and in-service teachers at four local high schools are being certified in the Atmosphere Protocols. Precipitation, ambient air temperature, surface temperature and other data are collected at the schools through a collaborative learning effort among the high school students, TSU undergraduates, and high school teachers. Data collected and recorded manually in the field are compared to each school's automated Weatherbug station measurements. Students and teachers engage in analysis of NASA imagery as part of the GLOBE Surface Temperature Protocol. At off-campus locations, US Clean Air Act (CAA) criteria air pollutant and Toxic Release Inventory (TRI) air pollutant sampling is being conducted in community-based participatory research (CBPR) format. Students partner with non-profit environmental organizations. Data collected using low-cost air sampling devices is being compared with readings from government air monitors. The GLOBE Aerosols Protocol is used in comparative assessments with air sampling results. Project deliverables include four new GLOBE schools, the enrollment of which is nearly entirely comprised of students underrepresented in STEM. A model for service learning activities with GLOBE to increase underrepresented groups participation in STEM is a second deliverable. A third deliverable, a comprehensive citizen science guidebook for grassroots level air quality assessment, is being developed for wide distribution.

  12. Using Digital Image Correlation to Characterize Local Strains on Vascular Tissue Specimens.

    PubMed

    Zhou, Boran; Ravindran, Suraj; Ferdous, Jahid; Kidane, Addis; Sutton, Michael A; Shazly, Tarek

    2016-01-24

    Characterization of the mechanical behavior of biological and engineered soft tissues is a central component of fundamental biomedical research and product development. Stress-strain relationships are typically obtained from mechanical testing data to enable comparative assessment among samples and in some cases identification of constitutive mechanical properties. However, errors may be introduced through the use of average strain measures, as significant heterogeneity in the strain field may result from geometrical non-uniformity of the sample and stress concentrations induced by mounting/gripping of soft tissues within the test system. When strain field heterogeneity is significant, accurate assessment of the sample mechanical response requires measurement of local strains. This study demonstrates a novel biomechanical testing protocol for calculating local surface strains using a mechanical testing device coupled with a high resolution camera and a digital image correlation technique. A series of sample surface images are acquired and then analyzed to quantify the local surface strain of a vascular tissue specimen subjected to ramped uniaxial loading. This approach can improve accuracy in experimental vascular biomechanics and has potential for broader use among other native soft tissues, engineered soft tissues, and soft hydrogel/polymeric materials. In the video, we demonstrate how to set up the system components and perform a complete experiment on native vascular tissue.
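
    Once DIC has produced a displacement field, local strains follow from its spatial gradients. The sketch below assumes displacement components already estimated on a regular grid and computes Green-Lagrange surface strains; the correlation step itself (matching speckle subsets between images) is not shown, and the field names and grid spacing are illustrative rather than part of the published protocol.

    ```python
    import numpy as np

    def green_lagrange_strain(ux, uy, dx=1.0, dy=1.0):
        """Compute local Green-Lagrange surface strains from a displacement
        field (ux, uy) sampled on a regular grid, as a downstream step one
        might apply to DIC output. Rows are taken as the y direction and
        columns as the x direction; dx and dy are the grid spacings.
        """
        dux_dy, dux_dx = np.gradient(ux, dy, dx)
        duy_dy, duy_dx = np.gradient(uy, dy, dx)
        exx = dux_dx + 0.5 * (dux_dx**2 + duy_dx**2)
        eyy = duy_dy + 0.5 * (dux_dy**2 + duy_dy**2)
        exy = 0.5 * (dux_dy + duy_dx + dux_dx * dux_dy + duy_dx * duy_dy)
        return exx, eyy, exy

    # example: a synthetic uniform 1% stretch in x gives exx ~ 0.01 everywhere
    y, x = np.mgrid[0:50, 0:50].astype(float)
    exx, eyy, exy = green_lagrange_strain(0.01 * x, np.zeros_like(x))
    print(exx.mean())
    ```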

  13. Sampled-data-based consensus and containment control of multiple harmonic oscillators: A motion-planning approach

    NASA Astrophysics Data System (ADS)

    Liu, Yongfang; Zhao, Yu; Chen, Guanrong

    2016-11-01

    This paper studies the distributed consensus and containment problems for a group of harmonic oscillators with a directed communication topology. First, for consensus without a leader, a class of distributed consensus protocols is designed by using motion planning and Pontryagin's principle. The proposed protocol only requires relative information measurements at the sampling instants, without requiring information exchange over the sampled interval. By using stability theory and the properties of stochastic matrices, it is proved that the distributed consensus problem can be solved in the motion planning framework. Second, for the case with multiple leaders, a class of distributed containment protocols is developed for followers such that their positions and velocities can ultimately converge to the convex hull formed by those of the leaders. Compared with the existing consensus algorithms, a remarkable advantage of the proposed sampled-data-based protocols is that the sampling periods, communication topologies and control gains are all decoupled and can be separately designed, which relaxes many restrictions in controllers design. Finally, some numerical examples are given to illustrate the effectiveness of the analytical results.
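
    As generic background (a standard sampled-data consensus update, not the motion-planning protocol of the paper), each agent i updates its state using only relative information measured at the sampling instants t_k:

    ```latex
    x_i(t_{k+1}) \;=\; x_i(t_k) \;+\; h \sum_{j \in \mathcal{N}_i} a_{ij}\,\bigl(x_j(t_k) - x_i(t_k)\bigr),
    \qquad t_{k+1} - t_k = T,
    ```

    with consensus reached when the resulting update matrix is stochastic and the communication graph contains a directed spanning tree. The cited work instead designs the inter-sample control by motion planning and Pontryagin's principle, which is what allows the sampling period, topology, and gains to be designed separately.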

  14. Quality-control results for ground-water and surface-water data, Sacramento River Basin, California, National Water-Quality Assessment, 1996-1998

    USGS Publications Warehouse

    Munday, Cathy; Domagalski, Joseph L.

    2003-01-01

    Evaluating the extent that bias and variability affect the interpretation of ground- and surface-water data is necessary to meet the objectives of the National Water-Quality Assessment (NAWQA) Program. Quality-control samples used to evaluate the bias and variability include annual equipment blanks, field blanks, field matrix spikes, surrogates, and replicates. This report contains quality-control results for the constituents critical to the ground- and surface-water components of the Sacramento River Basin study unit of the NAWQA Program. A critical constituent is one that was detected frequently (more than 50 percent of the time in blank samples), was detected at amounts exceeding water-quality standards or goals, or was important for the interpretation of water-quality data. Quality-control samples were collected along with ground- and surface-water samples during the high intensity phase (cycle 1) of the Sacramento River Basin NAWQA beginning early in 1996 and ending in 1998. Ground-water field blanks indicated contamination of varying levels of significance when compared with concentrations detected in environmental ground-water samples for ammonia, dissolved organic carbon, aluminum, and copper. Concentrations of aluminum in surface-water field blanks were significant when compared with environmental samples. Field blank samples collected for pesticide and volatile organic compound analyses revealed no contamination in either ground- or surface-water samples that would effect the interpretation of environmental data, with the possible exception of the volatile organic compound trichloromethane (chloroform) in ground water. Replicate samples for ground water and surface water indicate that variability resulting from sample collection, processing, and analysis was generally low. Some of the larger maximum relative percentage differences calculated for replicate samples occurred between samples having lowest absolute concentration differences and(or) values near the reporting limit. Surrogate recoveries for pesticides analyzed by gas chromatography/mass spectrometry (GC/MS), pesticides analyzed by high performance liquid chromatography (HPLC), and volatile organic compounds in ground- and surface-water samples were within the acceptable limits of 70 to 130 percent and median recovery values between 82 and 113 percent. The recovery percentages for surrogate compounds analyzed by HPLC had the highest standard deviation, 20 percent for ground-water samples and 16 percent for surface-water samples, and the lowest median values, 82 percent for ground-water samples and 91 percent for surface-water samples. Results were consistent with the recovery results described for the analytical methods. Field matrix spike recoveries for pesticide compounds analyzed using GC/MS in ground- and surface-water samples were comparable with published recovery data. Recoveries of carbofuran, a critical constituent in ground- and surface-water studies, and desethyl atrazine, a critical constituent in the ground-water study, could not be calculated because of problems with the analytical method. Recoveries of pesticides analyzed using HPLC in ground- and surface-water samples were generally low and comparable with published recovery data. Other methodological problems for HPLC analytes included nondetection of the spike compounds and estimated values of spike concentrations. 
Recovery of field matrix spikes for volatile organic compounds generally were within the acceptable range, 70 and 130 percent for both ground- and surface-water samples, and median recoveries from 62 to 127 percent. High or low recoveries could be related to errors in the field, such as double spiking or using spike solution past its expiration date, rather than problems during analysis. The methodological changes in the field spike protocol during the course of the Sacramento River Basin study, which included decreasing the amount of spike solu

  15. Sampling protocol, estimation, and analysis procedures for the down woody materials indicator of the FIA program

    Treesearch

    Christopher W. Woodall; Vicente J. Monleon

    2008-01-01

    The USDA Forest Service's Forest Inventory and Analysis program conducts an inventory of forests of the United States including down woody materials (DWM). In this report we provide the rationale and context for a national inventory of DWM, describe the components sampled, discuss the sampling protocol used and corresponding estimation procedures, and provide...

  16. Simple Sodium Dodecyl Sulfate-Assisted Sample Preparation Method for LC-MS-based Proteomic Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jianying; Dann, Geoffrey P.; Shi, Tujin

    2012-03-10

    Sodium dodecyl sulfate (SDS) is one of the most popular laboratory reagents used for highly efficient biological sample extraction; however, SDS presents a significant challenge to LC-MS-based proteomic analyses due to its severe interference with reversed-phase LC separations and electrospray ionization interfaces. This study reports a simple SDS-assisted proteomic sample preparation method facilitated by a novel peptide-level SDS removal protocol. After SDS-assisted protein extraction and digestion, SDS was effectively (>99.9%) removed from peptides through ion substitution-mediated DS⁻ precipitation with potassium chloride (KCl) followed by ~10 min centrifugation. Excellent peptide recovery (>95%) was observed for less than 20 μg of peptides. Further experiments demonstrated the compatibility of this protocol with LC-MS/MS analyses. The resulting proteome coverage from this SDS-assisted protocol was comparable to or better than those obtained from other standard proteomic preparation methods in both mammalian tissues and bacterial samples. These results suggest that this SDS-assisted protocol is a practical, simple, and broadly applicable proteomic sample processing method, which can be particularly useful when dealing with samples difficult to solubilize by other methods.

  17. Effective use of metadata in the integration and analysis of multi-dimensional optical data

    NASA Astrophysics Data System (ADS)

    Pastorello, G. Z.; Gamon, J. A.

    2012-12-01

    Data discovery and integration relies on adequate metadata. However, creating and maintaining metadata is time consuming and often poorly addressed or avoided altogether, leading to problems in later data analysis and exchange. This is particularly true for research fields in which metadata standards do not yet exist or are under development, or within smaller research groups without enough resources. Vegetation monitoring using in-situ and remote optical sensing is an example of such a domain. In this area, data are inherently multi-dimensional, with spatial, temporal and spectral dimensions usually being well characterized. Other equally important aspects, however, might be inadequately translated into metadata. Examples include equipment specifications and calibrations, field/lab notes and field/lab protocols (e.g., sampling regimen, spectral calibration, atmospheric correction, sensor view angle, illumination angle), data processing choices (e.g., methods for gap filling, filtering and aggregation of data), quality assurance, and documentation of data sources, ownership and licensing. Each of these aspects can be important as metadata for search and discovery, but they can also be used as key data fields in their own right. If each of these aspects is also understood as an "extra dimension," it is possible to take advantage of them to simplify the data acquisition, integration, analysis, visualization and exchange cycle. Simple examples include selecting data sets of interest early in the integration process (e.g., only data collected according to a specific field sampling protocol) or applying appropriate data processing operations to different parts of a data set (e.g., adaptive processing for data collected under different sky conditions). More interesting scenarios involve guided navigation and visualization of data sets based on these extra dimensions, as well as partitioning data sets to highlight relevant subsets to be made available for exchange. The DAX (Data Acquisition to eXchange) Web-based tool uses a flexible metadata representation model and takes advantage of multi-dimensional data structures to translate metadata types into data dimensions, effectively reshaping data sets according to available metadata. With that, metadata is tightly integrated into the acquisition-to-exchange cycle, allowing for more focused exploration of data sets while also increasing the value of, and incentives for, keeping good metadata. The tool is being developed and tested with optical data collected in different settings, including laboratory, field, airborne, and satellite platforms.

  18. Earth's magnetic field strength in the Early Cambrian: Thellier paleointensity estimates of Itabaiana mafic dykes, Northeast Brazil

    NASA Astrophysics Data System (ADS)

    Trindade, R. I. F.; Macouin, M.; Poitou, C.; Chauvin, A.; Hill, M.

    2012-04-01

    Thellier and microwave paleointensity experiments were carried out on Early Cambrian dykes from Itabaiana (NE Brazil) dated at 525 ± 5 Ma. A previous paleomagnetic study on these dykes revealed a very stable characteristic component, whose thermoremanent nature is confirmed by positive baked contact tests performed on three different dykes. The main magnetic carrier is Ti-poor to pure magnetite in the PSD to SD domain state. Hysteresis parameters and first-order reversal curve (FORC) diagrams will be presented in order to characterize the two different behaviors shown by the samples during paleointensity experiments. From the 96 samples (from 13 dykes) analyzed in two laboratories using slightly different Thellier experimental protocols, we retained 12 samples (3 dykes) for paleointensity estimates. Paleointensity values range from 18.1 up to 40 μT. This corresponds to equivalent VDMs of 4.3 ± 0.5, 4.4 ± 1.4 and 5.3 ± 0.9 × 10²² Am² for the three dykes, respectively. These results, the first obtained for rapidly cooled Cambrian rocks, document a moderate Earth field at the Precambrian-Cambrian transition.
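
    For context, converting a paleointensity estimate into a virtual dipole moment uses the standard dipole-field relations (textbook geomagnetism, not specific to this study), with θ_m the magnetic colatitude obtained from the paleomagnetic inclination I, R_E the Earth's radius, and B_anc the ancient field strength:

    ```latex
    \tan I \;=\; 2\cot\theta_m,
    \qquad
    \mathrm{VDM} \;=\; \frac{4\pi R_E^{3}}{\mu_0}\,
    \frac{B_{\mathrm{anc}}}{\sqrt{1 + 3\cos^{2}\theta_m}} .
    ```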

  19. Use of low density polyethylene membranes for assessment of genotoxicity of PAHs in the Seine River.

    PubMed

    Vincent-Hubert, Françoise; Uher, Emmanuelle; Di Giorgio, Carole; Michel, Cécile; De Meo, Michel; Gourlay-France, Catherine

    2017-03-01

    The genotoxicity of contaminants dissolved in river water is usually estimated after grab sampling of river water. A time-integrated measure of water contamination can now be obtained with passive samplers. Since it was verified that low density polyethylene (LDPE) membranes accumulate labile hydrophobic compounds, their use was proposed as a passive sampler. This study was designed to test the applicability of passive sampling for combined chemical and genotoxicity measurements. The LDPE extracts were tested with the umu test (TA1535/pSK1002 ± S9) and the Ames assay (TA98, TA100 and YG1041 ± S9). We describe here this new protocol and its application in two field studies at four sites on the Seine River. Field LDPE extracts were negative with YG1041 and TA100 and weakly positive with TA98 + S9 and the umu test. Concentrations of labile mutagenic PAHs were higher upstream of Paris than downstream of Paris. Improvement of the method is needed to determine the genotoxicity of low concentrations of labile dissolved organic contaminants.

  20. Residual indicator bacteria in autosampler tubing: a field and laboratory assessment.

    PubMed

    Hathaway, J M; Hunt, W F; Guest, R M; McCarthy, D T

    2014-01-01

    Microbial contamination in surface waters has become a worldwide cause for concern. As efforts are made to reduce this contamination, monitoring is integral to documenting and evaluating water quality improvements. Autosamplers are beneficial in such monitoring efforts, as large data sets can be generated with minimal effort. The extent to which autosamplers can be utilized for microbial monitoring is largely unknown due to concerns over contamination. Strict sterilization regimes for components contacting the water being sampled are difficult, and sometimes logistically implausible, when utilizing autosamplers. Field experimentation showed contamination by fecal coliforms in autosamplers to be more of a concern than contamination by Escherichia coli. Further study in a controlled laboratory environment suggested that tubing configuration has a significant effect on residual E. coli concentrations in sampler tubing. The time elapsed since the last sample was collected through a given sampler's tubing (the antecedent dry weather period, DWP) was also a significant factor. At a DWP of 7 days, little to no contamination was found. Thus, simple protocols such as providing positive drainage of tubing between sample events and programming samplers to include rinses will reduce concerns of contamination in autosamplers.

  1. DNA elution from buccal cells stored on Whatman FTA Classic Cards using a modified methanol fixation method.

    PubMed

    Johanson, Helene C; Hyland, Valentine; Wicking, Carol; Sturm, Richard A

    2009-04-01

    We describe here a method for DNA elution from buccal cells and whole blood both collected onto Whatman FTA technology, using methanol fixation followed by an elution PCR program. Extracted DNA is comparable in quality to published Whatman FTA protocols, as judged by PCR-based genotyping. Elution of DNA from the dried sample is a known rate-limiting step in the published Whatman FTA protocol; this method enables the use of each 3-mm punch of sample for several PCR reactions instead of the standard, one PCR reaction per sample punch. This optimized protocol therefore extends the usefulness and cost effectiveness of each buccal swab sample collected, when used for nucleic acid PCR and genotyping.

  2. Avoidance of harvesting and sampling artefacts in hydraulic analyses: a protocol tested on Malus domestica

    PubMed Central

    Beikircher, Barbara; Mayr, Stefan

    2016-01-01

    A prerequisite for reliable hydraulic measurements is an accurate collection of the plant material. Thereby, the native hydraulic state of the sample has to be preserved during harvesting (i.e., cutting the plant or plant parts) and preparation (i.e., excising the target section). This is particularly difficult when harvesting has to be done under transpiring conditions. In this article, we present a harvesting and sampling protocol designed for hydraulic measurements on Malus domestica Borkh. and checked for possible sampling artefacts. To test for artefacts, we analysed the percentage loss of hydraulic conductivity, maximum specific conductivity and water contents of bark and wood of branches, taking into account conduit length, time of day of harvesting, different shoot ages and seasonal effects. Our results prove that use of appropriate protocols can avoid artefactual embolization or refilling even when the xylem is under tension at harvest. The presented protocol was developed for Malus but may also be applied for other angiosperms with similar anatomy and refilling characteristics. PMID:26705311

  3. FISH-in-CHIPS: A Microfluidic Platform for Molecular Typing of Cancer Cells.

    PubMed

    Perez-Toralla, Karla; Mottet, Guillaume; Tulukcuoglu-Guneri, Ezgi; Champ, Jérôme; Bidard, François-Clément; Pierga, Jean-Yves; Klijanienko, Jerzy; Draskovic, Irena; Malaquin, Laurent; Viovy, Jean-Louis; Descroix, Stéphanie

    2017-01-01

    Microfluidics offer powerful tools for the control, manipulation, and analysis of cells, in particular for the assessment of cell malignancy or the study of cell subpopulations. However, implementing complex biological protocols on chip remains a challenge. Sample preparation is often performed off chip using multiple manually performed steps, and protocols usually include different dehydration and drying steps that are not always compatible with a microfluidic format.Here, we report the implementation of a Fluorescence in situ Hybridization (FISH) protocol for the molecular typing of cancer cells in a simple and low-cost device. The geometry of the chip allows integrating the sample preparation steps to efficiently assess the genomic content of individual cells using a minute amount of sample. The FISH protocol can be fully automated, thus enabling its use in routine clinical practice.

  4. Electronic data collection for the analysis of surgical maneuvers on patients submitted to rhinoplasty

    PubMed Central

    Berger, Cezar; Freitas, Renato; Malafaia, Osvaldo; Pinto, José Simão de Paula; Mocellin, Marcos; Macedo, Evaldo; Fagundes, Marina Serrato Coelho

    2012-01-01

    Introduction: In the health field, computerization has become increasingly necessary in professional practice, since it facilitates data recovery and assists in the development of research with greater scientific rigor. Objective: The present work aimed to develop, apply, and validate specific electronic protocols for patients referred for rhinoplasty. Methods: The prospective research had 3 stages: (1) preparation of theoretical databases; (2) creation of a master protocol using the Integrated System of Electronic Protocols (SINPE©); and (3) elaboration, application, and validation of a specific protocol for the nose and sinuses regarding rhinoplasty. Results: After preparation of the master protocol, which covered the entire field of otorhinolaryngology, we designed a specific protocol covering all matters related to patients with aesthetic and functional nasal complaints referred for surgical treatment (i.e., rhinoplasty), organized into 6 main hierarchical categories: anamnesis, physical examination, complementary exams, diagnosis, treatment, and outcome. The protocol used these categories and their sub-items: finality; access; surgical maneuvers on the nasal dorsum, tip, and base; clinical evolution after 3, 6, and 12 months; revisional surgery; and quantitative and qualitative evaluations. Conclusion: The developed specific electronic protocol is feasible and useful for registering information from patients referred for rhinoplasty. PMID:25991979

  5. Determination of free CO2 in emergent groundwaters using a commercial beverage carbonation meter

    NASA Astrophysics Data System (ADS)

    Vesper, Dorothy J.; Edenborn, Harry M.

    2012-05-01

    Dissolved CO2 in groundwater is frequently supersaturated relative to its equilibrium with atmospheric partial pressure and will degas when it is conveyed to the surface. Estimates of dissolved CO2 concentrations can vary widely between different hydrochemical facies because they have different sources of error (e.g., rapid degassing, low alkalinity, non-carbonate alkalinity). We sampled 60 natural spring and mine waters using a beverage industry carbonation meter, which measures dissolved CO2 based on temperature and pressure changes as the sample volume is expanded. Using a modified field protocol, the meter was found to be highly accurate in the range 0.2-35 mM CO2. The meter provided rapid, accurate and precise measurements of dissolved CO2 in natural waters for a range of hydrochemical facies. Dissolved CO2 concentrations measured in the field with the carbonation meter were similar to CO2 determined using the pH-alkalinity approach, but provided immediate results and avoided errors from alkalinity and pH determination. The portability and ease of use of the carbonation meter in the field made it well-suited to sampling in difficult terrain. The carbonation meter has proven useful in the study of aquatic systems where CO2 degassing drives geochemical changes that result in surficial mineral precipitation and deposition, such as tufa, travertine and mine drainage deposits.
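
    The pH-alkalinity comparison mentioned above follows from carbonate equilibria: if carbonate alkalinity is approximated by [HCO3-], then [CO2*] = [HCO3-][H+]/K1. A minimal sketch of that calculation, assuming 25 °C constants and neglecting temperature and ionic-strength corrections (this is not the authors' code):

    ```python
    # Minimal sketch: estimate dissolved CO2 from pH and carbonate alkalinity.
    # Assumes alkalinity ~ [HCO3-] and uses K1 at 25 C; a real implementation
    # would correct K1 for temperature and ionic strength.

    K1 = 10 ** -6.35  # first dissociation constant of carbonic acid at 25 C (mol/L)

    def dissolved_co2_mmol_per_l(ph: float, alkalinity_meq_per_l: float) -> float:
        """Return dissolved CO2 (H2CO3*) in mmol/L from pH and carbonate alkalinity."""
        h = 10 ** -ph                          # [H+] in mol/L
        hco3 = alkalinity_meq_per_l / 1000.0   # approximate [HCO3-] in mol/L
        co2 = hco3 * h / K1                    # [CO2*] = [HCO3-][H+] / K1
        return co2 * 1000.0                    # back to mmol/L

    # Example: pH 6.2 and alkalinity 3.5 meq/L give roughly 4.9 mmol/L CO2
    print(round(dissolved_co2_mmol_per_l(6.2, 3.5), 2))
    ```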

  6. Near-optimal protocols in complex nonequilibrium transformations

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...

    2016-08-29

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
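
    To illustrate the core idea of sampling a dissipation-biased ensemble of protocols, here is a hedged toy sketch in which a protocol is a sequence of control values with fixed endpoints and the dissipation is replaced by a simple sum of squared control increments; it is not the Ising-model calculation from the paper:

    ```python
    # Toy Metropolis sampler over protocols, biased toward low "dissipation"
    # W = sum of squared control increments (a stand-in, not the paper's model).
    import math, random

    N_STEPS, EPSILON, N_SWEEPS = 20, 0.05, 2000
    random.seed(0)

    def dissipation(protocol):
        return sum((b - a) ** 2 for a, b in zip(protocol, protocol[1:]))

    # start from a random protocol with fixed endpoints 0 -> 1
    protocol = [0.0] + [random.random() for _ in range(N_STEPS - 1)] + [1.0]
    w = dissipation(protocol)

    trace = []
    for _ in range(N_SWEEPS):
        i = random.randint(1, N_STEPS - 1)          # pick an interior control point
        old = protocol[i]
        protocol[i] = old + random.gauss(0.0, 0.1)  # propose a local change
        w_new = dissipation(protocol)
        # Metropolis rule for P(protocol) ~ exp(-W / EPSILON): bias toward low W
        if w_new <= w or random.random() < math.exp(-(w_new - w) / EPSILON):
            w = w_new
        else:
            protocol[i] = old
        trace.append(w)

    print("mean sampled dissipation:", round(sum(trace[500:]) / len(trace[500:]), 3))
    # Lowering EPSILON concentrates the ensemble near the uniform-speed ramp
    # (W = 1/N_STEPS), the minimum-dissipation protocol in this toy model.
    ```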

  7. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity.

    PubMed

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-07-01

    Obtaining useful estimates of wildlife abundance or density requires thoughtful attention to potential sources of bias and precision, and it is widely understood that addressing incomplete detection is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear ( Ursus spp.) populations can be difficult to sample and are often monitored using mark-recapture distance sampling (MRDS) methods, although obtaining adequate sample sizes can be cost prohibitive. With the goal of improving inference, we examined the underlying methodological assumptions and estimator efficiency of three datasets collected under an MRDS protocol designed specifically for bears. We analyzed these data using MRDS, conventional distance sampling (CDS), and open-distance sampling approaches to evaluate the apparent bias-precision tradeoff relative to the assumptions inherent under each approach. We also evaluated the incorporation of informative priors on detection parameters within a Bayesian context. We found that the CDS estimator had low apparent bias and was more efficient than the more complex MRDS estimator. When combined with informative priors on the detection process, precision was increased by >50% compared to the MRDS approach with little apparent bias. In addition, open-distance sampling models revealed a serious violation of the assumption that all bears were available to be sampled. Inference is directly related to the underlying assumptions of the survey design and the analytical tools employed. We show that for aerial surveys of bears, avoidance of unnecessary model complexity, use of prior information, and the application of open population models can be used to greatly improve estimator performance and simplify field protocols. Although we focused on distance sampling-based aerial surveys for bears, the general concepts we addressed apply to a variety of wildlife survey contexts.
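
    For context, the conventional distance sampling (CDS) estimator favored above can be sketched with a half-normal detection function, g(x) = exp(-x^2 / (2 sigma^2)); the distances, transect length, and closed-form maximum-likelihood estimate below are illustrative assumptions, not the study's data or full analysis:

    ```python
    # CDS sketch: with no truncation, the MLE of sigma^2 is mean(x^2), the
    # effective strip half-width is sigma * sqrt(pi/2), and density = n / (2 L esw).
    import math

    def cds_density(perp_distances_km, transect_length_km):
        n = len(perp_distances_km)
        sigma2 = sum(x * x for x in perp_distances_km) / n     # MLE of sigma^2
        esw = math.sqrt(sigma2) * math.sqrt(math.pi / 2.0)     # effective strip half-width
        return n / (2.0 * transect_length_km * esw)            # animals per km^2

    # Hypothetical perpendicular distances (km) from an aerial survey
    distances = [0.05, 0.12, 0.02, 0.30, 0.18, 0.07, 0.22, 0.10]
    print(round(cds_density(distances, transect_length_km=120.0), 3))
    ```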

  8. Exploring the Implementation of Steganography Protocols on Quantum Audio Signals

    NASA Astrophysics Data System (ADS)

    Chen, Kehan; Yan, Fei; Iliyasu, Abdullah M.; Zhao, Jianping

    2018-02-01

    Two quantum audio steganography (QAS) protocols are proposed, each of which manipulates or modifies the least significant qubit (LSQb) of the host quantum audio signal that is encoded as an FRQA (flexible representation of quantum audio) audio content. The first protocol (i.e. the conventional LSQb QAS protocol or simply the cLSQ stego protocol) is built on the exchanges between qubits encoding the quantum audio message and the LSQb of the amplitude information in the host quantum audio samples. In the second protocol, the embedding procedure implants information from a quantum audio message deep into the constraint-imposed most significant qubit (MSQb) of the host quantum audio samples; we refer to it as the pseudo-MSQb QAS protocol or simply the pMSQ stego protocol. The cLSQ stego protocol is designed to guarantee high imperceptibility between the host quantum audio and its stego version, whereas the pMSQ stego protocol ensures that the resulting stego quantum audio signal is better immune to illicit tampering and copyright violations (a.k.a. robustness). Built on the circuit model of quantum computation, the circuit networks to execute the embedding and extraction algorithms of both QAS protocols are determined and simulation-based experiments are conducted to demonstrate their implementation. Outcomes attest that both protocols offer promising trade-offs in terms of imperceptibility and robustness.
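
    The classical intuition behind the cLSQ protocol is ordinary least-significant-bit embedding; the sketch below shows that classical analogue only, on hypothetical 8-bit samples, and does not implement the quantum circuits described in the paper:

    ```python
    # Classical LSB analogue of the cLSQ idea: hide one message bit in the
    # least significant bit of each integer audio sample.

    def embed_lsb(samples, message_bits):
        """Replace the LSB of each sample with one message bit."""
        out = list(samples)
        for i, bit in enumerate(message_bits):
            out[i] = (out[i] & ~1) | bit
        return out

    def extract_lsb(samples, n_bits):
        return [s & 1 for s in samples[:n_bits]]

    host = [52, 17, 203, 44, 98, 7, 150, 61]      # hypothetical 8-bit host samples
    msg = [1, 0, 1, 1]
    stego = embed_lsb(host, msg)
    assert extract_lsb(stego, len(msg)) == msg    # message recovered
    print(stego)                                  # host samples change by at most 1
    ```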

  9. Efficacy and Safety of Low-field Synchronized Transcranial Magnetic Stimulation (sTMS) for Treatment of Major Depression.

    PubMed

    Leuchter, Andrew F; Cook, Ian A; Feifel, David; Goethe, John W; Husain, Mustafa; Carpenter, Linda L; Thase, Michael E; Krystal, Andrew D; Philip, Noah S; Bhati, Mahendra T; Burke, William J; Howland, Robert H; Sheline, Yvette I; Aaronson, Scott T; Iosifescu, Dan V; O'Reardon, John P; Gilmer, William S; Jain, Rakesh; Burgoyne, Karl S; Phillips, Bill; Manberg, Paul J; Massaro, Joseph; Hunter, Aimee M; Lisanby, Sarah H; George, Mark S

    2015-01-01

    Transcranial Magnetic Stimulation (TMS) customarily uses high-field electromagnets to achieve therapeutic efficacy in Major Depressive Disorder (MDD). Low-field magnetic stimulation also may be useful for treatment of MDD, with fewer treatment-emergent adverse events. To examine efficacy, safety, and tolerability of low-field magnetic stimulation synchronized to an individual's alpha frequency (IAF) (synchronized TMS, or sTMS) for treatment of MDD. Six-week double-blind sham-controlled treatment trial of a novel device that used three rotating neodymium magnets to deliver sTMS treatment. IAF was determined from a single-channel EEG prior to first treatment. Subjects had baseline 17-item Hamilton Depression Rating Scale (HamD17) ≥ 17. 202 subjects comprised the intent-to-treat (ITT) sample, and 120 subjects completed treatment per-protocol (PP). There was no difference in efficacy between active and sham in the ITT sample. Subjects in the PP sample (N = 59), however, had significantly greater mean decrease in HamD17 than sham (N = 60) (-9.00 vs. -6.56, P = 0.033). PP subjects with a history of poor response or intolerance to medication showed greater improvement with sTMS than did treatment-naïve subjects (-8.58 vs. -4.25, P = 0.017). Efficacy in the PP sample reflects exclusion of subjects who received fewer than 80% of scheduled treatments or were inadvertently treated at the incorrect IAF; these subgroups failed to separate from sham. There was no difference in adverse events between sTMS and sham, and no serious adverse events attributable to sTMS. Results suggest that sTMS may be effective, safe, and well tolerated for treating MDD when administered as intended. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Optimization of scat detection methods for a social ungulate, the wild pig, and experimental evaluation of factors affecting detection of scat

    USGS Publications Warehouse

    Keiter, David A.; Cunningham, Fred L.; Rhodes, Olin E.; Irwin, Brian J.; Beasley, James

    2016-01-01

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow researchers to optimize sampling protocols to maximize utility of noninvasive sampling for wild pigs and other social ungulates.

  11. Optimization of Scat Detection Methods for a Social Ungulate, the Wild Pig, and Experimental Evaluation of Factors Affecting Detection of Scat.

    PubMed

    Keiter, David A; Cunningham, Fred L; Rhodes, Olin E; Irwin, Brian J; Beasley, James C

    2016-01-01

    Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow researchers to optimize sampling protocols to maximize utility of noninvasive sampling for wild pigs and other social ungulates.
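
    The detection relationship reported above (pellet size, pellet count, ground cover, and recent rain as predictors) has the familiar logistic form; the sketch below is purely illustrative, with hypothetical coefficients rather than the estimates from the study:

    ```python
    # Illustrative logistic detection model for scat surveys; coefficients are
    # hypothetical placeholders chosen only to match the reported direction of effects.
    import math

    def detection_probability(pellet_size_mm, n_pellets, pct_ground_cover, recent_rain):
        b0, b_size, b_count, b_cover, b_rain = -1.0, 0.08, 0.05, -0.02, -0.7
        z = (b0 + b_size * pellet_size_mm + b_count * n_pellets
             + b_cover * pct_ground_cover + b_rain * (1 if recent_rain else 0))
        return 1.0 / (1.0 + math.exp(-z))

    # Large, numerous pellets on bare ground are easier to find than small
    # scats in dense cover after rain.
    print(round(detection_probability(25, 40, 10, False), 2))
    print(round(detection_probability(10, 10, 80, True), 2))
    ```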

  12. Feasibility and acceptability of the DSM-5 Field Trial procedures in the Johns Hopkins Community Psychiatry Programs†

    PubMed Central

    Clarke, Diana E.; Wilcox, Holly C.; Miller, Leslie; Cullen, Bernadette; Gerring, Joan; Greiner, Lisa H.; Newcomer, Alison; Mckitty, Mellisha V.; Regier, Darrel A.; Narrow, William E.

    2014-01-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) contains criteria for psychiatric diagnoses that reflect advances in the science and conceptualization of mental disorders and address the needs of clinicians. DSM-5 also recommends research on dimensional measures of cross-cutting symptoms and diagnostic severity, which are expected to better capture patients’ experiences with mental disorders. Prior to its May 2013 release, the American Psychiatric Association (APA) conducted field trials to examine the feasibility, clinical utility, reliability, and where possible, the validity of proposed DSM-5 diagnostic criteria and dimensional measures. The methods and measures proposed for the DSM-5 field trials were pilot tested in adult and child/adolescent clinical samples, with the goal to identify and correct design and procedural problems with the proposed methods before resources were expended for the larger DSM-5 Field Trials. Results allowed for the refinement of the protocols, procedures, and measures, which facilitated recruitment, implementation, and completion of the DSM-5 Field Trials. These results highlight the benefits of pilot studies in planning large multisite studies. PMID:24615761

  13. Environmental contaminant studies by the Patuxent Wildlife Research Center

    USGS Publications Warehouse

    Heinz, G.H.; Hill, E.F.; Stickel, W.H.; Stickel, L.F.; Kenaga, E.E.

    1979-01-01

    Evaluation of the effects of environmental contaminants on wildlife is geared to interpreting events in the field, especially population effects, and both field and laboratory studies are planned for this purpose; procedures are adapted to specific problems and therefore do not include strict protocols or routine testing. Field evaluations include measurements of cholinesterase inhibition in brain or blood, search for dead or disabled animals, study of nesting success of birds, and general ecological observations. Residue analyses are used in evaluating organochlorine chemicals; samples may include whole bodies for determining level of exposure, brains for mortality diagnosis, whole blood for certain special studies, and eggs to help in evaluation of possible reproductive effects. Bird counts, singing-male census counts, small mammal trapping, and cage-in-field tests have proven to be ineffective or misleading and are not considered suitable for field evaluations under most circumstances. Usefulness of simulated field trials is limited to very special situations. Experimental studies that help predict and interpret field effects include determinations of lethal diagnostic levels, comparative lethal dietary toxicity tests, tests of secondary poisoning, measurement of residue loss rates, measurement of blood enzymes, tests of behavioral effects, and studies of reproductive effects.

  14. DEVELOPMENT OF LARGE RIVER BIOASSESSMENT PROTOCOLS (LR-BPS) FOR BENTHIC MACROINVERTEBRATES IN EPA REGION 5

    EPA Science Inventory

    Non-wadeable rivers have been largely overlooked by bioassessment programs because of sampling difficulties and a lack of appropriate methods and biological indicators. We are in the process of developing a Large River Bioassessment Protocol (LR-BP) for sampling macroinvertebrat...

  15. 21 CFR 660.46 - Samples; protocols; official release.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Samples; protocols; official release. 660.46 Section 660.46 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS ADDITIONAL STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface...

  16. Optimization of a Sample Processing Protocol for Recovery of ...

    EPA Pesticide Factsheets

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps.

  17. Infection control in healthcare settings: perspectives for mfDNA analysis in monitoring sanitation procedures.

    PubMed

    Valeriani, Federica; Protano, Carmela; Gianfranceschi, Gianluca; Cozza, Paola; Campanella, Vincenzo; Liguori, Giorgio; Vitali, Matteo; Divizia, Maurizio; Romano Spica, Vincenzo

    2016-08-09

    Appropriate sanitation procedures and monitoring of their actual efficacy represent critical points for improving hygiene and reducing the risk of healthcare-associated infections. Presently, surveillance is based on traditional protocols and classical microbiology. Innovation in monitoring is required not only to enhance safety or speed up controls but also to prevent cross infections due to novel or uncultivable pathogens. In order to improve surveillance monitoring, we propose that biological fluid microflora (mf) on reprocessed devices is a potential indicator of sanitation failure, when tested by an mfDNA-based approach. The survey focused on oral microflora traces in dental care settings. Experimental tests (n = 48) and an "in-field" trial (n = 83) were performed on dental instruments. Conventional microbiology and amplification of bacterial genes by multiple real-time PCR were applied to detect traces of salivary microflora. Six different sanitation protocols were considered. A monitoring protocol was developed and the performance of the mfDNA assay was evaluated by sensitivity and specificity. Contaminated samples tested positive for saliva traces with the proposed approach (CT < 35). In accordance with guidelines, only fully sanitized samples were considered negative (100 %). Culture-based tests confirmed disinfectant efficacy, but failed to detect incomplete sanitation. The method provided sensitivity and specificity over 95 %. The principle of detecting biological fluids by mfDNA analysis seems promising for monitoring the effectiveness of instrument reprocessing. The molecular approach is simple, fast and can provide valid support for surveillance in dental care or other hospital settings.

  18. Metabolic profiling of body fluids and multivariate data analysis.

    PubMed

    Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten

    2017-01-01

    Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and the constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling, from the time of collection up to the analysis, is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: (1) sample collection, (2) metabolite extraction, (3) data acquisition, and (4) data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis for biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include: robust and reproducible metabolomics results that take into account pre-analytical variations occurring during the sampling process; small required sample volume; rapid and cost-effective processing of biological samples; and logistic regression-based determination of biomarker signatures for in-depth data analysis.

  19. A priori collaboration in population imaging: The Uniform Neuro-Imaging of Virchow-Robin Spaces Enlargement consortium.

    PubMed

    Adams, Hieab H H; Hilal, Saima; Schwingenschuh, Petra; Wittfeld, Katharina; van der Lee, Sven J; DeCarli, Charles; Vernooij, Meike W; Katschnig-Winter, Petra; Habes, Mohamad; Chen, Christopher; Seshadri, Sudha; van Duijn, Cornelia M; Ikram, M Kamran; Grabe, Hans J; Schmidt, Reinhold; Ikram, M Arfan

    2015-12-01

    Virchow-Robin spaces (VRS), or perivascular spaces, are compartments of interstitial fluid enclosing cerebral blood vessels and are potential imaging markers of various underlying brain pathologies. Despite a growing interest in the study of enlarged VRS, the heterogeneity in rating and quantification methods combined with small sample sizes have so far hampered advancement in the field. The Uniform Neuro-Imaging of Virchow-Robin Spaces Enlargement (UNIVRSE) consortium was established with primary aims to harmonize rating and analysis (www.uconsortium.org). The UNIVRSE consortium brings together 13 (sub)cohorts from five countries, totaling 16,000 subjects and over 25,000 scans. Eight different magnetic resonance imaging protocols were used in the consortium. VRS rating was harmonized using a validated protocol that was developed by the two founding members, with high reliability independent of scanner type, rater experience, or concomitant brain pathology. Initial analyses revealed risk factors for enlarged VRS including increased age, sex, high blood pressure, brain infarcts, and white matter lesions, but this varied by brain region. Early collaborative efforts between cohort studies with respect to data harmonization and joint analyses can advance the field of population (neuro)imaging. The UNIVRSE consortium will focus efforts on other potential correlates of enlarged VRS, including genetics, cognition, stroke, and dementia.

  20. 76 FR 24862 - Proposed Information Collection; Comment Request; Protocol for Access to Tissue Specimen Samples...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-03

    ... Collection; Comment Request; Protocol for Access to Tissue Specimen Samples From the National Marine Mammal Tissue Bank AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION: Notice... National Marine Mammal Tissue Bank (NMMTB) was established by the National Marine Fisheries Service (NMFS...

  1. Development of reagents for immunoassay of Phytophthora ramorum in nursery water samples

    Treesearch

    Douglas G. Luster; Timothy Widmer; Michael McMahon; C. André Lévesque

    2017-01-01

    Current regulations under the August 6, 2014 USDA APHIS Official Regulatory Protocol (Confirmed Nursery Protocol: Version 8.2) for Nurseries Containing Plants Infected with Phytophthora ramorum mandate the sampling of water in affected nurseries to demonstrate they are free of P. ramorum. Currently, detection of

  2. A MORE COST-EFFECTIVE EMAP BENTHIC MACROFAUNAL SAMPLING PROTOCOL

    EPA Science Inventory

    Benthic macrofaunal sampling protocols in the U.S. Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP) are to collect 30 to 50 random benthic macrofauna [defined as animals retained on a 0.5 mm (East and Gulf Coasts, USA) or a 1.0 mm mesh siev...

  3. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Research, determines that the reliability and consistency of the finished product can be assured with a... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Samples; protocols; official release. 660.6 Section 660.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES...

  4. 21 CFR 660.6 - Samples; protocols; official release.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Research, determines that the reliability and consistency of the finished product can be assured with a... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Samples; protocols; official release. 660.6 Section 660.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...

  5. Authentication of beef versus horse meat using 60 MHz 1H NMR spectroscopy

    PubMed Central

    Jakes, W.; Gerdova, A.; Defernez, M.; Watson, A.D.; McCallum, C.; Limer, E.; Colquhoun, I.J.; Williamson, D.C.; Kemsley, E.K.

    2015-01-01

    This work reports a candidate screening protocol to distinguish beef from horse meat based upon comparison of triglyceride signatures obtained by 60 MHz 1H NMR spectroscopy. Using a simple chloroform-based extraction, we obtained classic low-field triglyceride spectra from typically a 10 min acquisition time. Peak integration was sufficient to differentiate samples of fresh beef (76 extractions) and horse (62 extractions) using Naïve Bayes classification. Principal component analysis gave a two-dimensional “authentic” beef region (p = 0.001) against which further spectra could be compared. This model was challenged using a subset of 23 freeze–thawed training samples. The outcomes indicated that storing samples by freezing does not adversely affect the analysis. Of a further collection of extractions from previously unseen samples, 90/91 beef spectra were classified as authentic, and 16/16 horse spectra as non-authentic. We conclude that 60 MHz 1H NMR represents a feasible high-throughput approach for screening raw meat. PMID:25577043
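
    The analysis pattern described above (peak integrals classified with Naive Bayes, plus a PCA model of the authentic-beef region) can be sketched as follows, assuming scikit-learn and NumPy are available and using synthetic integrals rather than real 60 MHz spectra; this is not the authors' code or data:

    ```python
    # Synthetic illustration of peak-integral classification and a 2-D PCA
    # "authentic" model; numbers are made up for demonstration.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(0)
    beef = rng.normal(loc=[1.0, 0.30, 0.10], scale=0.02, size=(76, 3))   # synthetic integrals
    horse = rng.normal(loc=[1.0, 0.45, 0.20], scale=0.02, size=(62, 3))
    X = np.vstack([beef, horse])
    y = np.array([0] * len(beef) + [1] * len(horse))                     # 0 = beef, 1 = horse

    clf = GaussianNB().fit(X, y)                  # peak-integral classifier
    pca_beef = PCA(n_components=2).fit(beef)      # 2-D model of the beef-only data
    print("training accuracy:", clf.score(X, y))
    print("explained variance (beef PCs):", pca_beef.explained_variance_ratio_.round(2))
    ```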

  6. Water-quality sampling by the U.S. Geological Survey-Standard protocols and procedures

    USGS Publications Warehouse

    Wilde, Franceska D.

    2010-01-01

    The U.S. Geological Survey (USGS) develops the sampling procedures and collects the data necessary for the accurate assessment and wise management of our Nation's surface-water and groundwater resources. Federal and State agencies, water-resource regulators and managers, and many organizations and interested parties in the public and private sectors depend on the reliability, timeliness, and integrity of the data we collect and the scientific soundness and impartiality of our data assessments and analysis. The standard data-collection methods uniformly used by USGS water-quality personnel are peer reviewed, kept up-to-date, and published in the National Field Manual for the Collection of Water-Quality Data (http://pubs.water.usgs.gov/twri9A/).

  7. A computer program for geochemical analysis of acid-rain and other low-ionic-strength, acidic waters

    USGS Publications Warehouse

    Johnsson, P.A.; Lord, D.G.

    1987-01-01

    ARCHEM, a computer program written in FORTRAN 77, is designed primarily for use in the routine geochemical interpretation of low-ionic-strength, acidic waters. On the basis of chemical analyses of the water, and either laboratory or field determinations of pH, temperature, and dissolved oxygen, the program calculates the equilibrium distribution of major inorganic aqueous species and of inorganic aluminum complexes. The concentration of the organic anion is estimated from the dissolved organic carbon concentration. Ionic ferrous iron is calculated from the dissolved oxygen concentration. Ionic balances and comparisons of computed with measured specific conductances are performed as checks on the analytical accuracy of the chemical analyses. ARCHEM may be tailored easily to fit different sampling protocols, and may be run on multiple sample analyses. (Author's abstract)
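
    A minimal sketch of the ionic-balance check mentioned above (an assumed form of the calculation; ARCHEM itself is FORTRAN 77 and its exact implementation is not reproduced here):

    ```python
    # Convert major-ion concentrations (mg/L) to milliequivalents per litre and
    # report the charge-balance error as a percentage.

    EQ_WEIGHT = {           # mg per milliequivalent (molar mass / charge)
        "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
        "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45, "NO3": 62.00,
    }
    CATIONS = {"Ca", "Mg", "Na", "K"}

    def charge_balance_error(mg_per_l):
        cats = sum(v / EQ_WEIGHT[k] for k, v in mg_per_l.items() if k in CATIONS)
        ans = sum(v / EQ_WEIGHT[k] for k, v in mg_per_l.items() if k not in CATIONS)
        return 100.0 * (cats - ans) / (cats + ans)   # percent difference

    sample = {"Ca": 2.1, "Mg": 0.6, "Na": 1.3, "K": 0.4,
              "HCO3": 6.0, "SO4": 3.5, "Cl": 1.9, "NO3": 1.5}
    print(round(charge_balance_error(sample), 1))    # large imbalances flag analytical problems
    ```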

  8. Human breath metabolomics using an optimized noninvasive exhaled breath condensate sampler

    PubMed Central

    Zamuruyev, Konstantin O.; Aksenov, Alexander A.; Pasamontes, Alberto; Brown, Joshua F.; Pettit, Dayna R.; Foutouhi, Soraya; Weimer, Bart C.; Schivo, Michael; Kenyon, Nicholas J.; Delplanque, Jean-Pierre; Davis, Cristina E.

    2017-01-01

    Exhaled breath condensate (EBC) analysis is a developing field with tremendous promise to advance personalized, non-invasive health diagnostics as new analytical instrumentation platforms and detection methods are developed. Multiple commercially-available and researcher-built experimental samplers are reported in the literature. However, there is very limited information available to determine an effective breath sampling approach, especially regarding the dependence of breath sample metabolomic content on the collection device design and sampling methodology. This lack of an optimal standard procedure results in a range of reported results that are sometimes contradictory. Here, we present a design of a portable human EBC sampler optimized for collection and preservation of the rich metabolomic content of breath. The performance of the engineered device is compared to two commercially available breath collection devices: the RTube™ and TurboDECCS. A number of design and performance parameters are considered, including: condenser temperature stability during sampling, collection efficiency, condenser material choice, and saliva contamination in the collected breath samples. The significance of the biological content of breath samples, collected with each device, is evaluated with a set of mass spectrometry methods and was the primary factor for evaluating device performance. The design includes an adjustable mass-size threshold for aerodynamic filtering of saliva droplets from the breath flow. Engineering an inexpensive device that allows efficient collection of metabolomic-rich breath samples is intended to aid further advancement in the field of breath analysis for non-invasive health diagnostics. EBC sampling from human volunteers was performed under UC Davis IRB protocol 63701-3 (09/30/2014-07/07/2017). PMID:28004639

  9. Human breath metabolomics using an optimized non-invasive exhaled breath condensate sampler.

    PubMed

    Zamuruyev, Konstantin O; Aksenov, Alexander A; Pasamontes, Alberto; Brown, Joshua F; Pettit, Dayna R; Foutouhi, Soraya; Weimer, Bart C; Schivo, Michael; Kenyon, Nicholas J; Delplanque, Jean-Pierre; Davis, Cristina E

    2016-12-22

    Exhaled breath condensate (EBC) analysis is a developing field with tremendous promise to advance personalized, non-invasive health diagnostics as new analytical instrumentation platforms and detection methods are developed. Multiple commercially-available and researcher-built experimental samplers are reported in the literature. However, there is very limited information available to determine an effective breath sampling approach, especially regarding the dependence of breath sample metabolomic content on the collection device design and sampling methodology. This lack of an optimal standard procedure results in a range of reported results that are sometimes contradictory. Here, we present a design of a portable human EBC sampler optimized for collection and preservation of the rich metabolomic content of breath. The performance of the engineered device is compared to two commercially available breath collection devices: the RTube™ and TurboDECCS. A number of design and performance parameters are considered, including: condenser temperature stability during sampling, collection efficiency, condenser material choice, and saliva contamination in the collected breath samples. The significance of the biological content of breath samples, collected with each device, is evaluated with a set of mass spectrometry methods and was the primary factor for evaluating device performance. The design includes an adjustable mass-size threshold for aerodynamic filtering of saliva droplets from the breath flow. Engineering an inexpensive device that allows efficient collection of metabolomic-rich breath samples is intended to aid further advancement in the field of breath analysis for non-invasive health diagnostics. EBC sampling from human volunteers was performed under UC Davis IRB protocol 63701-3 (09/30/2014-07/07/2017).

  10. Field trials of line transect methods applied to estimation of desert tortoise abundance

    USGS Publications Warehouse

    Anderson, David R.; Burnham, Kenneth P.; Lubow, Bruce C.; Thomas, L. E. N.; Corn, Paul Stephen; Medica, Philip A.; Marlow, R.W.

    2001-01-01

    We examine the degree to which field observers can meet the assumptions underlying line transect sampling to monitor populations of desert tortoises (Gopherus agassizii). We present the results of 2 field trials using artificial tortoise models in 3 size classes. The trials were conducted on 2 occasions on an area south of Las Vegas, Nevada, where the density of the test population was known. In the first trials, conducted largely by experienced biologists who had been involved in tortoise surveys for many years, the density of adult tortoise models was well estimated (-3.9% bias), while the bias was higher (-20%) for subadult tortoise models. The bias for combined data was -12.0%. The bias was largely attributed to the failure to detect all tortoise models on or near the transect centerline. The second trials were conducted with a group of largely inexperienced student volunteers and used somewhat different searching methods, and the results were similar to the first trials. Estimated combined density of subadult and adult tortoise models had a negative bias (-7.3%), again attributable to failure to detect some models on or near the centerline. Experience in desert tortoise biology, either comparing the first and second trials or in the second trial with 2 experienced biologists versus 16 novices, did not have an apparent effect on the quality of the data or the accuracy of the estimates. Observer training, specific to line transect sampling, and field testing are important components of a reliable survey. Line transect sampling represents a viable method for large-scale monitoring of populations of desert tortoise; however, field protocol must be improved to assure the key assumptions are met.

  11. Comparison of precipitation chemistry measurements obtained by the Canadian Air and Precipitation Monitoring Network and National Atmospheric Deposition Program for the period 1995-2004

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Shaw, Michael J.; Latysh, Natalie E.; Lehmann, Christopher M.B.; Rothert, Jane E.

    2010-01-01

    Precipitation chemistry and depth measurements obtained by the Canadian Air and Precipitation Monitoring Network (CAPMoN) and the US National Atmospheric Deposition Program/National Trends Network (NADP/NTN) were compared for the 10-year period 1995–2004. Colocated sets of CAPMoN and NADP instrumentation, consisting of precipitation collectors and rain gages, were operated simultaneously per standard protocols for each network at Sutton, Ontario and Frelighsburg, Ontario, Canada and at State College, PA, USA. CAPMoN samples were collected daily, NADP samples were collected weekly, and samples were analyzed exclusively by each network's laboratory for pH, H+, Ca2+, Mg2+, Na+, K+, NH4+, Cl−, NO3−, and SO42−. Weekly and annual precipitation-weighted mean concentrations for each network were compared. This study is a follow-up to an earlier internetwork comparison for the period 1986–1993, published by Alain Sirois, Robert Vet, and Dennis Lamb in 2000. Median weekly internetwork differences for 1995–2004 data were the same to slightly lower than for data for the previous study period (1986–1993) for all analytes except NO3−, SO42−, and sample depth. A 1994 NADP sampling protocol change and a 1998 change in the types of filters used to process NADP samples reversed the previously identified negative bias in NADP data for hydrogen-ion and sodium concentrations. Statistically significant biases (α = 0.10) for sodium and hydrogen-ion concentrations observed in the 1986–1993 data were not significant for 1995–2004. Weekly CAPMoN measurements generally are higher than weekly NADP measurements due to differences in sample filtration and field instrumentation, not sample evaporation, contamination, or analytical laboratory differences.

  12. Electrical stimulation systems for cardiac tissue engineering

    PubMed Central

    Tandon, Nina; Cannizzaro, Christopher; Chao, Pen-Hsiu Grace; Maidhof, Robert; Marsano, Anna; Au, Hoi Ting Heidi; Radisic, Milica; Vunjak-Novakovic, Gordana

    2009-01-01

    We describe a protocol for tissue engineering of synchronously contractile cardiac constructs by culturing cardiac cells with the application of pulsatile electrical fields designed to mimic those present in the native heart. Tissue culture is conducted in a customized chamber built to allow for cultivation of (i) engineered three-dimensional (3D) cardiac tissue constructs, (ii) cell monolayers on flat substrates or (iii) cells on patterned substrates. This also allows for analysis of the individual and interactive effects of pulsatile electrical field stimulation and substrate topography on cell differentiation and assembly. The protocol is designed to allow for delivery of predictable electrical field stimuli to cells, monitoring environmental parameters, and assessment of cell and tissue responses. The duration of the protocol is 5 d for two-dimensional cultures and 10 d for 3D cultures. PMID:19180087

  13. Development, implementation, and experimentation of parametric routing protocol for sensor networks

    NASA Astrophysics Data System (ADS)

    Nassr, Matthew S.; Jun, Jangeun; Eidenbenz, Stephan J.; Frigo, Janette R.; Hansson, Anders A.; Mielke, Angela M.; Smith, Mark C.

    2006-09-01

    The development of a scalable and reliable routing protocol for sensor networks is traced from its theoretical beginnings through positive simulation results to verification experiments in large and heavily loaded networks. Design decisions and explanations, as well as implementation hurdles, are presented to give a complete picture of protocol development. Additional software and hardware are required to accurately test the performance of our protocol in field experiments. The developed protocol is also tested in TinyOS on Mica2 motes against well-established routing protocols frequently used in sensor networks. Our protocol outperforms both the standard (MINTRoute) and the trivial (Gossip) protocols in a variety of scenarios.

  14. Spin-Orbit-Coupled Interferometry with Ring-Trapped Bose-Einstein Condensates

    NASA Astrophysics Data System (ADS)

    Helm, J. L.; Billam, T. P.; Rakonjac, A.; Cornish, S. L.; Gardiner, S. A.

    2018-02-01

    We propose a method of atom interferometry using a spinor Bose-Einstein condensate with a time-varying magnetic field acting as a coherent beam splitter. Our protocol creates long-lived superpositional counterflow states, which are of fundamental interest and can be made sensitive to both the Sagnac effect and magnetic fields on the sub-μG scale. We split a ring-trapped condensate, initially in the mF = 0 hyperfine state, into superpositions of internal mF = ±1 states and condensate superflow, which are spin-orbit coupled. After interrogation, the relative phase accumulation can be inferred from a population transfer to the mF = ±1 states. The counterflow generation protocol is adiabatically deterministic and does not rely on coupling to additional optical fields or mechanical stirring techniques. Our protocol can maximize the classical Fisher information for any rotation, magnetic field, or interrogation time and so has the maximum sensitivity available to uncorrelated particles. Precision can increase with the interrogation time and so is limited only by the lifetime of the condensate.

  15. Writing Interview Protocols and Conducting Interviews: Tips for Students New to the Field of Qualitative Research

    ERIC Educational Resources Information Center

    Jacob, Stacy A.; Furgerson, S. Paige

    2012-01-01

    Students new to doing qualitative research in the ethnographic and oral traditions, often have difficulty creating successful interview protocols. This article offers practical suggestions for students new to qualitative research for both writing interview protocol that elicit useful data and for conducting the interview. This piece was originally…

  16. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). The three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The analytical basis of all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable differences in the sample preparation procedures used in the three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water-soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was assessed. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
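
    All three methods share the same back-calculation from a 540 nm absorbance to an airborne concentration; the sketch below illustrates that arithmetic with hypothetical calibration constants and sample values, not data or calibration curves from the study:

    ```python
    # Absorbance -> Cr(VI) mass via a linear calibration curve, then divide by
    # the sampled air volume to get an airborne concentration in ug/m^3.

    def cr6_airborne_ug_m3(absorbance, sample_volume_ml, flow_l_min, minutes,
                           cal_slope=0.040, cal_intercept=0.002):
        """Return Cr(VI) concentration in ug/m^3 for one filter sample."""
        conc_ug_ml = (absorbance - cal_intercept) / cal_slope   # from calibration curve
        mass_ug = conc_ug_ml * sample_volume_ml                 # total Cr(VI) on filter
        air_volume_m3 = flow_l_min * minutes / 1000.0           # litres -> cubic metres
        return mass_ug / air_volume_m3

    # e.g. A = 0.250 in a 10 mL extract, sampled at 2 L/min for 240 min (0.48 m^3 of air)
    print(round(cr6_airborne_ug_m3(0.250, 10.0, 2.0, 240.0), 1))
    ```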

  17. Diverse protocols for correlative super-resolution fluorescence imaging and electron microscopy of chemically fixed samples

    PubMed Central

    Kopek, Benjamin G.; Paez-Segala, Maria G.; Shtengel, Gleb; Sochacki, Kem A.; Sun, Mei G.; Wang, Yalin; Xu, C. Shan; van Engelenburg, Schuyler B.; Taraska, Justin W.; Looger, Loren L.; Hess, Harald F.

    2017-01-01

    Our groups have recently developed related approaches for sample preparation for super-resolution imaging within endogenous cellular environments using correlative light and electron microscopy (CLEM). Four distinct techniques for preparing and acquiring super-resolution CLEM datasets on aldehyde-fixed specimens are provided, including Tokuyasu cryosectioning, whole-cell mount, cell unroofing and platinum replication, and resin embedding and sectioning. Choice of the best protocol for a given application depends on a number of criteria that are discussed in detail. Tokuyasu cryosectioning is relatively rapid but is limited to small, delicate specimens. Whole-cell mount has the simplest sample preparation but is restricted to surface structures. Cell unroofing and platinum replica creates high-contrast, 3-dimensional images of the cytoplasmic surface of the plasma membrane, but is more challenging than whole-cell mount. Resin embedding permits serial sectioning of large samples, but is limited to osmium-resistant probes, and is technically difficult. Expected results from these protocols include super-resolution localization (~10–50 nm) of fluorescent targets within the context of electron microscopy ultrastructure, which can help address cell biological questions. These protocols can be completed in 2–7 days, are compatible with a number of super-resolution imaging protocols, and are broadly applicable across biology. PMID:28384138

  18. A rapid and efficient DNA extraction protocol from fresh and frozen human blood samples.

    PubMed

    Guha, Pokhraj; Das, Avishek; Dutta, Somit; Chaudhuri, Tapas Kumar

    2018-01-01

    Different methods available for extraction of human genomic DNA suffer from one or more drawbacks, including low yield, compromised quality, cost, time consumption, use of toxic organic solvents, and more. Herein, we aimed to develop a method to extract DNA from 500 μL of fresh or frozen human blood. Five hundred microliters of fresh and frozen human blood samples were used for standardization of the extraction procedure. The absorbance at 260 and 280 nm (A260/A280) was measured to check the quality and quantity of the extracted DNA. Qualitative assessment of the extracted DNA was performed by polymerase chain reaction (PCR) and double digestion of the DNA sample. Our protocol resulted in average yields of 22±2.97 μg and 20.5±3.97 μg from 500 μL of fresh and frozen blood, respectively, which were comparable to many reference protocols and kits. Besides yielding a large amount of DNA, our protocol is rapid, economical, and avoids toxic organic solvents such as phenol. Because its quality is unaffected, the DNA is suitable for downstream applications. The protocol may also be useful for basic molecular research in laboratories with limited funds. © 2017 Wiley Periodicals, Inc.
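
    The A260/A280 yield and purity check described above follows the standard spectrophotometric conversion; a minimal sketch with hypothetical readings (not the authors' measurements):

    ```python
    # dsDNA concentration (ug/mL) ~ A260 x 50 x dilution factor; an A260/A280
    # ratio near 1.8 indicates DNA largely free of protein contamination.

    def dna_yield_and_purity(a260, a280, dilution_factor, elution_volume_ul):
        conc_ug_ml = a260 * 50.0 * dilution_factor        # 1 A260 unit ~ 50 ug/mL dsDNA
        yield_ug = conc_ug_ml * elution_volume_ul / 1000.0
        ratio = a260 / a280
        return yield_ug, ratio

    yield_ug, ratio = dna_yield_and_purity(a260=0.110, a280=0.060,
                                           dilution_factor=20, elution_volume_ul=200)
    print(f"yield ~ {yield_ug:.1f} ug, A260/A280 = {ratio:.2f}")
    ```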

  19. Use of a Filter Cartridge for Filtration of Water Samples and Extraction of Environmental DNA.

    PubMed

    Miya, Masaki; Minamoto, Toshifumi; Yamanaka, Hiroki; Oka, Shin-Ichiro; Sato, Keiichi; Yamamoto, Satoshi; Sado, Tetsuya; Doi, Hideyuki

    2016-11-25

    Recent studies demonstrated the use of environmental DNA (eDNA) from fishes to be appropriate as a non-invasive monitoring tool. Most of these studies employed disk fiber filters to collect eDNA from water samples, although a number of microbial studies in aquatic environments have employed filter cartridges, because the cartridge has the advantage of accommodating large water volumes and of overall ease of use. Here we provide a protocol for filtration of water samples using the filter cartridge and extraction of eDNA from the filter without having to cut open the housing. The main portions of this protocol consist of (1) filtration of water samples (water volumes ≤4 L or >4 L); (2) extraction of DNA on the filter using a roller shaker placed in a preheated incubator; and (3) purification of DNA using a commercial kit. With the use of this and previously used protocols, we perform metabarcoding analysis of eDNA taken from a huge aquarium tank (7,500 m3) with known species composition, and show the number of detected species per library from the two protocols as the representative results. This protocol has been developed for metabarcoding eDNA from fishes, but is also applicable to eDNA from other organisms.

  20. D-RATS 2011: RAFT Protocol Overview

    NASA Technical Reports Server (NTRS)

    Utz, Hans

    2011-01-01

    A brief overview presentation on the protocol used during the D-RATS 2011 field test for file transfer from the field-test robots at Black Point Lava Flow, AZ, to Johnson Space Center, Houston, TX, over a simulated time delay. The file transfer uses a commercial implementation of an open communications standard. The focus of the work lies on how to make the state of the distributed system observable.

  1. The DNA isolation method has effect on allele drop-out and on the results of fluorescent PCR and DNA fragment analysis.

    PubMed

    Nagy, Bálint; Bán, Zoltán; Papp, Zoltán

    2005-10-01

    The quality and quantity of isolated DNA have an effect on PCR amplification. The authors studied the effects of three DNA isolation protocols (a resin-binding method using fresh and frozen amniotic fluid samples, and a silica adsorption method using fresh samples) on the quantity and quality of the isolated DNA. Amniotic fluid samples were obtained from 20 pregnant women. The isolated DNA concentrations were determined with a real-time fluorimeter using the SYBR Green I method. Each sample was studied for the presence of 8 STR markers. The authors compared the number of detected alleles, the electrophoretograms, and the peak areas. There were significant differences in DNA concentration and in peak areas between the three isolation protocols. The numbers of detected alleles also differed: the most allele drop-outs occurred with the resin-binding protocol applied to fresh samples (182 alleles detected), followed by the resin-binding protocol applied to frozen samples (243 alleles) and the silica adsorption method (264 alleles). The authors demonstrated that the DNA isolation method has an effect on the quantity and quality of the isolated DNA, and on further PCR amplifications.

  2. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims to provide recommendations concerning the validation of analytical protocols using routine samples. It is intended as a case study of how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach; for the sediment matrices, the estimation of proportional/constant bias was also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing 25-35% analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and until now this methodology had not been applied to organochlorine compounds in environmental matrices.
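
    As a generic illustration of the kind of uncertainty budget summarized above (a GUM-style quadrature of precision and bias components; the paper's actual budget is not reproduced here), consider:

    ```python
    # Combine relative precision and bias components in quadrature, then expand
    # with a coverage factor k = 2 (roughly 95% coverage for a normal distribution).
    import math

    def expanded_uncertainty(rel_precision, rel_bias, k=2.0):
        u_combined = math.sqrt(rel_precision ** 2 + rel_bias ** 2)
        return k * u_combined

    # e.g. 12% intermediate precision and 8% recovery bias -> ~29% expanded uncertainty
    print(round(100 * expanded_uncertainty(0.12, 0.08), 1))
    ```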

  3. Digital gene expression analysis with sample multiplexing and PCR duplicate detection: A straightforward protocol.

    PubMed

    Rozenberg, Andrey; Leese, Florian; Weiss, Linda C; Tollrian, Ralph

    2016-01-01

    Tag-Seq is a high-throughput approach used for discovering SNPs and characterizing gene expression. In comparison to RNA-Seq, Tag-Seq eases data processing and allows detection of rare mRNA species using only one tag per transcript molecule. However, reduced library complexity raises the issue of PCR duplicates, which distort gene expression levels. Here we present a novel Tag-Seq protocol that uses the least biased methods for RNA library preparation combined with a novel approach for joint PCR template and sample labeling. In our protocol, input RNA is fragmented by hydrolysis, and poly(A)-bearing RNAs are selected and directly ligated to mixed DNA-RNA P5 adapters. The P5 adapters contain i5 barcodes composed of sample-specific (moderately) degenerate base regions (mDBRs), which later allow detection of PCR duplicates. The P7 adapter is attached via reverse transcription with individual i7 barcodes added during the amplification step. The resulting libraries can be sequenced on an Illumina sequencer. After sample demultiplexing and PCR duplicate removal with a free software tool we designed, the data are ready for downstream analysis. Our protocol was tested on RNA samples from predator-induced and control Daphnia microcrustaceans.
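
    The duplicate-removal step described above can be sketched as follows; the grouping key (sample barcode, mDBR sequence, mapped tag position) is an assumed simplification of what the authors' unnamed software tool does, and the reads are hypothetical:

    ```python
    # Collapse reads that share the same sample barcode, mDBR sequence, and mapped
    # tag position: these are treated as PCR copies of a single template molecule.
    from collections import OrderedDict

    def collapse_pcr_duplicates(reads):
        """reads: iterable of dicts with 'sample', 'mdbr', 'position', 'sequence'."""
        unique = OrderedDict()
        for read in reads:
            key = (read["sample"], read["mdbr"], read["position"])
            unique.setdefault(key, read)          # keep the first read per template
        return list(unique.values())

    reads = [
        {"sample": "S1", "mdbr": "ACGT", "position": 1021, "sequence": "TTAGC"},
        {"sample": "S1", "mdbr": "ACGT", "position": 1021, "sequence": "TTAGC"},  # PCR copy
        {"sample": "S1", "mdbr": "GGAT", "position": 1021, "sequence": "TTAGC"},  # distinct template
    ]
    print(len(collapse_pcr_duplicates(reads)))    # 2 unique templates
    ```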

  4. Crystallization of Macromolecules

    PubMed Central

    Friedmann, David; Messick, Troy; Marmorstein, Ronen

    2014-01-01

    X-ray crystallography has evolved into a very powerful tool to determine the three-dimensional structure of macromolecules and macromolecular complexes. The major bottleneck in structure determination by X-ray crystallography is the preparation of suitable crystalline samples. This unit outlines steps for the crystallization of a macromolecule, starting with a purified, homogeneous sample. The first protocols describe preparation of the macromolecular sample (i.e., proteins, nucleic acids, and macromolecular complexes). The preparation and assessment of crystallization trials is then described, along with a protocol for confirming whether the crystals obtained are composed of macromolecule as opposed to a crystallization reagent. Next, the optimization of crystallization conditions is presented. Finally, protocols that facilitate the growth of larger crystals through seeding are described. PMID:22045560

  5. SeaWiFS technical report series. Volume 5: Ocean optics protocols for SeaWiFS validation

    NASA Technical Reports Server (NTRS)

    Mueller, James L.; Austin, Roswell W.; Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor)

    1992-01-01

    Protocols are presented for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish foundations for a measurement strategy to verify the challenging SeaWiFS accuracy goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April, 1991). This report is the proceedings of that workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are a first prescription to approach the unprecedented measurement accuracies implied by the SeaWiFS goals, and research and development are needed to improve the state-of-the-art in specific areas. The protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle.

  6. Protein precipitation of diluted samples in SDS-containing buffer with acetone leads to higher protein recovery and reproducibility in comparison with TCA/acetone approach.

    PubMed

    Santa, Cátia; Anjo, Sandra I; Manadas, Bruno

    2016-07-01

    Proteomic approaches are extremely valuable in many fields of research, where mass spectrometry methods have gained increasing interest, especially because of the ability to perform quantitative analysis. Nonetheless, sample preparation prior to mass spectrometry analysis is of the utmost importance. In this work, two protein precipitation approaches, widely used for cleaning and concentrating protein samples, were tested and compared on very diluted samples solubilized in a strong buffer (containing SDS). The amount of protein recovered after acetone and TCA/acetone precipitation was assessed, and the protein identification and relative quantification yields obtained by SWATH-MS were compared with the results from the same sample without precipitation. From this study, it was possible to conclude that in the case of diluted samples in denaturing buffers, the use of cold acetone as the precipitation protocol is more favourable than the use of TCA/acetone in terms of reproducibility in protein recovery and the number of identified and quantified proteins. Furthermore, the reproducibility of relative protein quantification is even higher in samples precipitated with acetone than in the original sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. The development of radioactive sample surrogates for training and exercises

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martha Finck; Bevin Brush; Dick Jansen

    2012-03-01

    Source term information is required to reconstruct a radiological dispersal device. Simulating a radioactive environment to train and exercise sampling and sample characterization methods with suitable sample materials is a continued challenge. The Idaho National Laboratory has developed and permitted a Radioactive Response Training Range (RRTR), an 800 acre test range approved for open air dispersal of activated KBr, for training first responders in the entry and exit from radioactively contaminated areas and for testing protocols for environmental sampling and field characterization. Members from the Department of Defense, Law Enforcement, and the Department of Energy participated in the first contamination exercise conducted at the RRTR in July 2011. The range was contaminated using the short-lived radioactive isotope Br-82 (activated KBr). Soil samples contaminated with KBr (dispersed as a solution) and glass particles containing activated potassium bromide that emulated dispersed radioactive materials (such as ceramic-based sealed source materials) were collected to assess environmental sampling and characterization techniques. This presentation summarizes the performance of a radioactive materials surrogate for use as a training aid for nuclear forensics.

  8. IONAC-Lite

    NASA Technical Reports Server (NTRS)

    Torgerson, Jordan L.; Clare, Loren P.; Pang, Jackson

    2011-01-01

    The Interplanetary Overlay Networking Protocol Accelerator (IONAC) described previously in The Interplanetary Overlay Networking Protocol Accelerator (NPO-45584), NASA Tech Briefs, Vol. 32, No. 10, (October 2008) p. 106 (http://www.techbriefs.com/component/content/article/3317) provides functions that implement the Delay Tolerant Networking (DTN) bundle protocol. New missions that require high-speed downlink-only use of DTN can now be accommodated by the unidirectional IONAC-Lite to support high data rate downlink mission applications. Due to constrained energy resources, a conventional software implementation of the DTN protocol can provide only limited throughput for any given reasonable energy consumption rate. The IONAC-Lite DTN Protocol Accelerator is able to reduce this energy consumption by an order of magnitude and increase the throughput capability by two orders of magnitude. In addition, a conventional DTN implementation requires a bundle database with a considerable storage requirement. In very high downlink data-rate missions such as near-Earth radar science missions, the storage space utilization needs to be maximized for science data and minimized for communications protocol-related storage needs. The IONAC-Lite DTN Protocol Accelerator is implemented in a reconfigurable hardware device to accomplish exactly what's needed for high-throughput DTN downlink-only scenarios. The salient features of the IONAC-Lite implementation are: an implementation of the Bundle Protocol for an environment that requires a very high bundle egress data rate; minimized interaction with the C&DH (command and data handling) processor and temporary storage, since the C&DH subsystem is also expected to be very constrained; a fully pipelined design so that a bundle processing database is not required; a lookup table-based approach that eliminates the multi-pass processing requirement imposed by the Bundle Protocol header's length field structure and the SDNV (self-delimiting numeric value) data field formatting; an 8-bit parallel datapath to support high data-rate missions; and a reduced resource utilization implementation for missions that do not require custody transfer features. There was no known implementation of the DTN protocol in a field programmable gate array (FPGA) device prior to the current implementation. The combination of energy and performance optimization that embodies this design makes the work novel.
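
    The multi-pass parsing burden mentioned above stems from the Bundle Protocol's SDNV encoding, in which every byte contributes 7 value bits and the most significant bit flags continuation. The following sketch decodes one SDNV from a byte string; it is a generic software illustration of the format as defined for the Bundle Protocol, not the FPGA lookup-table implementation described in this record.

```python
def decode_sdnv(data, offset=0):
    """Decode one SDNV starting at `offset` in `data` (bytes).

    Each byte supplies 7 value bits; the most significant bit is 1 on
    every byte except the last. Returns (value, new_offset).
    """
    value = 0
    while True:
        byte = data[offset]
        offset += 1
        value = (value << 7) | (byte & 0x7F)
        if byte & 0x80 == 0:          # continuation bit clear: last byte
            return value, offset

# 0x81 0x7F encodes (1 << 7) | 0x7F = 255
print(decode_sdnv(bytes([0x81, 0x7F])))   # -> (255, 2)
```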

  9. Consensus for second-order multi-agent systems with position sampled data

    NASA Astrophysics Data System (ADS)

    Wang, Rusheng; Gao, Lixin; Chen, Wenhai; Dai, Dameng

    2016-10-01

    In this paper, the consensus problem with position sampled data for second-order multi-agent systems is investigated. The interaction topology among the agents is depicted by a directed graph. Full-order and reduced-order observers with position sampled data are proposed, by which two kinds of sampled data-based consensus protocols are constructed. With the proposed sampled protocols, the consensus convergence analysis of a continuous-time multi-agent system is equivalently transformed into that of a discrete-time system. Then, by using matrix theory and a sampled control analysis method, some sufficient and necessary consensus conditions based on the coupling parameters, the spectrum of the Laplacian matrix and the sampling period are obtained. As the sampling period tends to zero, the established necessary and sufficient conditions degenerate to those of the continuous-time protocol, which are consistent with the existing results for the continuous-time case. Finally, the effectiveness of the established results is illustrated by a simple simulation example. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LY13F030005) and the National Natural Science Foundation of China (Grant No. 61501331).
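
    As a rough illustration of sampled-data consensus (and not the observer-based protocols analyzed in the paper), the sketch below simulates a second-order multi-agent system in which each agent's control input is recomputed only at sampling instants from sampled relative positions and its own velocity; the graph, gains and sampling period are illustrative assumptions.

```python
import numpy as np

# Directed ring over 4 agents: A[i, j] = 1 means agent i receives information from agent j.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0]], dtype=float)
n = A.shape[0]
k1, k2 = 1.0, 1.5            # protocol gains (illustrative choices)
T, dt = 0.1, 0.001           # sampling period and integration step
steps_per_sample = int(round(T / dt))

x = np.array([3.0, -1.0, 0.5, 2.0])   # initial positions
v = np.zeros(n)                       # initial velocities
u = np.zeros(n)

for k in range(int(round(60.0 / dt))):
    if k % steps_per_sample == 0:     # control uses positions sampled at t = kT only
        u = k1 * (A @ x - A.sum(axis=1) * x) - k2 * v
    x = x + v * dt                    # double-integrator dynamics between samples
    v = v + u * dt

print("position spread:", np.ptp(x))        # near zero once consensus is reached
print("residual velocity:", np.linalg.norm(v))
```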

  10. Robust DNA Isolation and High-throughput Sequencing Library Construction for Herbarium Specimens.

    PubMed

    Saeidi, Saman; McKain, Michael R; Kellogg, Elizabeth A

    2018-03-08

    Herbaria are an invaluable source of plant material that can be used in a variety of biological studies. The use of herbarium specimens is associated with a number of challenges including sample preservation quality, degraded DNA, and destructive sampling of rare specimens. In order to more effectively use herbarium material in large sequencing projects, a dependable and scalable method of DNA isolation and library preparation is needed. This paper demonstrates a robust, beginning-to-end protocol for DNA isolation and high-throughput library construction from herbarium specimens that does not require modification for individual samples. This protocol is tailored for low quality dried plant material and takes advantage of existing methods by optimizing tissue grinding, modifying library size selection, and introducing an optional reamplification step for low yield libraries. Reamplification of low yield DNA libraries can rescue samples derived from irreplaceable and potentially valuable herbarium specimens, negating the need for additional destructive sampling and without introducing discernible sequencing bias for common phylogenetic applications. The protocol has been tested on hundreds of grass species, but is expected to be adaptable for use in other plant lineages after verification. This protocol can be limited by extremely degraded DNA, where fragments do not exist in the desired size range, and by secondary metabolites present in some plant material that inhibit clean DNA isolation. Overall, this protocol introduces a fast and comprehensive method that allows for DNA isolation and library preparation of 24 samples in less than 13 h, with only 8 h of active hands-on time with minimal modifications.

  11. WebTag: Web browsing into sensor tags over NFC.

    PubMed

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Alvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols and the maximization of life-cycle autonomy. This work aims to improve the data transmission link in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (such as ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser; it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is a definite step towards the evolution of the Internet of Things paradigm.

  12. WebTag: Web Browsing into Sensor Tags over NFC

    PubMed Central

    Echevarria, Juan Jose; Ruiz-de-Garibay, Jonathan; Legarda, Jon; Álvarez, Maite; Ayerbe, Ana; Vazquez, Juan Ignacio

    2012-01-01

    Information and Communication Technologies (ICTs) continue to overcome many of the challenges related to wireless sensor monitoring, such as the design of smarter embedded processors, the improvement of network architectures, the development of efficient communication protocols and the maximization of life-cycle autonomy. This work aims to improve the data transmission link in wireless sensor monitoring. The upstream communication link is usually based on standard IP technologies, but the downstream side is always masked by the proprietary protocols used for the wireless link (such as ZigBee, Bluetooth, RFID, etc.). This work presents a novel solution (WebTag) for direct IP-based access to a sensor tag over Near Field Communication (NFC) technology for secure applications. WebTag allows direct web access to the sensor tag by means of a standard web browser; it reads the sensor data, configures the sampling rate and implements IP-based security policies. It is a definite step towards the evolution of the Internet of Things paradigm. PMID:23012511

  13. Development of an HPV Educational Protocol for Adolescents

    PubMed Central

    Wetzel, Caitlin; Tissot, Abbigail; Kollar, Linda M.; Hillard, Paula A.; Stone, Rachel; Kahn, Jessica A.

    2007-01-01

    Study Objectives To develop an educational protocol about HPV and Pap tests for adolescents, to evaluate the protocol for understandability and clarity, and to evaluate the protocol for its effectiveness in increasing knowledge about HPV. Design In phase 1, investigators and adolescents developed the protocol. In phase 2, adolescents evaluated the protocol qualitatively, investigators evaluated its effectiveness in increasing HPV knowledge in a sample of adolescents, and the protocol was revised. In phase 3, investigators evaluated the effectiveness of the revised protocol in an additional adolescent sample. Setting Urban, hospital-based teen health center. Participants A total of 252 adolescent girls and boys in the three study phases. Main Outcome Measures Pre- and post-protocol knowledge about HPV, measured using a 10- or 11-item scale. Results Scores on the HPV knowledge scale increased significantly (p<.0001) among adolescents who participated in phases 2 and 3 after they received the protocol. Initial differences in scores based on race, insurance type and condom use were not noted post-protocol. Conclusion The protocol significantly increased knowledge scores about HPV in this population, regardless of sociodemographic characteristics and risk behaviors. Effective, developmentally appropriate educational protocols about HPV and Pap tests are particularly important in clinical settings as cervical cancer screening guidelines evolve, HPV DNA testing is integrated into screening protocols, and HPV vaccines become available. In-depth, one-on-one education about HPV may also prevent adverse psychosocial responses and promote healthy sexual and Pap screening behaviors in adolescents with abnormal HPV or Pap test results. Synopsis The investigators developed an educational protocol about HPV and Pap tests and evaluated its effectiveness in increasing knowledge about HPV among adolescents. PMID:17868894

  14. Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil

    USGS Publications Warehouse

    Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W

    2016-01-01

    Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included: identifying an ideal extraction diluent, variation in the number of wash steps, variation in the initial centrifugation speed, sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol with an approximate matrix limit of detection at 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.

  15. Evanescent Field Based Photoacoustics: Optical Property Evaluation at Surfaces

    PubMed Central

    Goldschmidt, Benjamin S.; Rudy, Anna M.; Nowak, Charissa A.; Tsay, Yowting; Whiteside, Paul J. D.; Hunt, Heather K.

    2016-01-01

    Here, we present a protocol to estimate material and surface optical properties using the photoacoustic effect combined with total internal reflection. Optical property evaluation of thin films and the surfaces of bulk materials is an important step in understanding new optical material systems and their applications. The method presented can estimate thickness and refractive index, and can use the absorptive properties of materials for detection. This metrology system uses evanescent field-based photoacoustics (EFPA), a field of research based upon the interaction of an evanescent field with the photoacoustic effect. This interaction and its resulting family of techniques allow the system to probe optical properties within a few hundred nanometers of the sample surface. This optical near field allows for the highly accurate estimation of material properties on the same scale as the field itself, such as refractive index and film thickness. With the use of EFPA and its sub-techniques, such as total internal reflection photoacoustic spectroscopy (TIRPAS) and optical tunneling photoacoustic spectroscopy (OTPAS), it is possible to evaluate a material at the nanoscale in a consolidated instrument without the need for many instruments and experiments that may be cost prohibitive. PMID:27500652

  16. A comparison of macroinvertebrate and habitat methods of data collection in the Little Colorado River Watershed, Arizona 2007

    USGS Publications Warehouse

    Spindler, Patrice; Paretti, Nick V.

    2007-01-01

    The Arizona Department of Environmental Quality (ADEQ) and the U.S. Environmental Protection Agency (USEPA) Ecological Monitoring and Assessment Program (EMAP), use different field methods for collecting macroinvertebrate samples and habitat data for bioassessment purposes. Arizona’s Biocriteria index was developed using a riffle habitat sampling methodology, whereas the EMAP method employs a multi-habitat sampling protocol. There was a need to demonstrate comparability of these different bioassessment methodologies to allow use of the EMAP multi-habitat protocol for both statewide probabilistic assessments for integration of the EMAP data into the national (305b) assessment and for targeted in-state bioassessments for 303d determinations of standards violations and impaired aquatic life conditions. The purpose of this study was to evaluate whether the two methods yield similar bioassessment results, such that the data could be used interchangeably in water quality assessments. In this Regional EMAP grant funded project, a probabilistic survey of 30 sites in the Little Colorado River basin was conducted in the spring of 2007. Macroinvertebrate and habitat data were collected using both ADEQ and EMAP sampling methods, from adjacent reaches within these stream channels.


    All analyses indicated that the two macroinvertebrate sampling methods were significantly correlated. ADEQ and EMAP samples were classified into the same scoring categories (meeting, inconclusive, violating the biocriteria standard) 82% of the time. When the ADEQ-IBI was applied to both the ADEQ and EMAP taxa lists, the resulting IBI scores were significantly correlated (r=0.91), even though only 4 of the 7 metrics in the IBI were significantly correlated. The IBI scores from both methods were significantly correlated to the percent of riffle habitat, even though the average percent riffle habitat was only 30% of the stream reach. Multivariate analyses found that the percent riffle was an important attribute for both datasets in classifying IBI scores into assessment categories.


    Habitat measurements generated from EMAP and ADEQ methods were also significantly correlated; 13 of 16 habitat measures were significantly correlated (p<0.01). The visual-based percentage estimates of percent riffle and pool habitats, vegetative cover and percent canopy cover, and substrate measurements of percent fine substrate and embeddedness were all remarkably similar, given the different field methods used. A multivariate analysis identified substrate and flow conditions, as well as canopy cover, as important combinations of habitat attributes affecting IBI scores from both methods. These results indicate that similar habitat measures can be obtained using two different field sampling protocols. In addition, similar combinations of these habitat parameters were important to macroinvertebrate community condition in multivariate analyses of both ADEQ and EMAP datasets.


    These results indicate the two sampling methods for macroinvertebrates and habitat data were very similar in terms of bioassessment results and stressors. While the bioassessment category was not identical for all sites, overall the assessments were significantly correlated, providing similar bioassessment results for the cold water streams used in this study. The findings of this study indicate that ADEQ can utilize either a riffle-based sampling methodology or a multi-habitat sampling approach in cold water streams as both yield similar results relative to the macroinvertebrate assemblage. These results will allow for use of either macroinvertebrate dataset to determine water quality standards compliance with the ADEQ Indexes of Biological Integrity, for which threshold values were just recently placed into the Arizona Surface Water Quality Standards. While this survey did not include warm water desert streams of Arizona, we would predict that EMAP and ADEQ sampling methodologies would provide similar bioassessment results and would not be significantly different, as we have found that the percent riffle habitat in cold and warm water perennial, wadeable streams is not significantly different. However, a comparison study of sampling methodologies in warm water streams should be conducted to confirm the predicted similarity of bioassessment results. ADEQ will continue to implement a monitoring strategy that includes probabilistic monitoring for a statewide ecological assessment of stream conditions. Conclusions from this study will guide decisions regarding the most appropriate sampling methods for future probabilistic monitoring sample plans.
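
    The comparability analysis summarized above rests on two simple statistics: the correlation between IBI scores produced by the two field methods and the proportion of paired samples assigned to the same assessment category. The sketch below computes both on hypothetical paired scores; the site values and category thresholds are invented for illustration and are not ADEQ's actual biocriteria thresholds.

```python
import numpy as np

# Hypothetical paired IBI scores from adjacent reaches (ADEQ vs EMAP methods).
adeq = np.array([62, 48, 71, 55, 39, 66, 58, 44, 70, 51], dtype=float)
emap = np.array([60, 50, 69, 52, 42, 64, 61, 40, 68, 49], dtype=float)

# Pearson correlation between the two methods' scores.
r = np.corrcoef(adeq, emap)[0, 1]

def category(score, violating=45, meeting=60):
    """Assign an assessment category from illustrative thresholds."""
    if score < violating:
        return "violating"
    if score >= meeting:
        return "meeting"
    return "inconclusive"

# Proportion of sites placed in the same scoring category by both methods.
same = np.mean([category(a) == category(e) for a, e in zip(adeq, emap)])
print(f"Pearson r = {r:.2f}, same category at {same:.0%} of sites")
```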

  17. Slow histidine H/D exchange protocol for thermodynamic analysis of protein folding and stability using mass spectrometry.

    PubMed

    Tran, Duc T; Banerjee, Sambuddha; Alayash, Abdu I; Crumbliss, Alvin L; Fitzgerald, Michael C

    2012-02-07

    Described here is a mass spectrometry-based protocol to study the thermodynamic stability of proteins and protein-ligand complexes using the chemical denaturant dependence of the slow H/D exchange reaction of the imidazole C(2) proton in histidine side chains. The protocol is developed using several model protein systems including: ribonuclease (Rnase) A, myoglobin, bovine carbonic anhydrase (BCA) II, hemoglobin (Hb), and the hemoglobin-haptoglobin (Hb-Hp) protein complex. Folding free energies consistent with those previously determined by other more conventional techniques were obtained for the two-state folding proteins, Rnase A and myoglobin. The protocol successfully detected a previously observed partially unfolded intermediate stabilized in the BCA II folding/unfolding reaction, and it could be used to generate a K(d) value of 0.24 nM for the Hb-Hp complex. The compatibility of the protocol with conventional mass spectrometry-based proteomic sample preparation and analysis methods was also demonstrated in an experiment in which the protocol was used to detect the binding of zinc to superoxide dismutase in the yeast cell lysate sample. The yeast cell sample analyses also helped define the scope of the technique, which requires the presence of globally protected histidine residues in a protein's three-dimensional structure for successful application. © 2011 American Chemical Society
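
    The thermodynamic readout in protocols of this kind is commonly obtained by fitting a two-state, linear-extrapolation model to the denaturant dependence of the measured signal (here, the extent of histidine H/D exchange). The sketch below fits such a model to hypothetical data to recover the folding free energy in the absence of denaturant and the m-value; it is a generic two-state analysis under stated assumptions, not the specific equations used by the authors.

```python
import numpy as np
from scipy.optimize import curve_fit

R = 1.987e-3  # kcal mol^-1 K^-1
T = 298.15    # K

def two_state(c, dG_h2o, m, y_folded, y_unfolded):
    """Two-state denaturation curve with linear extrapolation:
    dG_unf(c) = dG_h2o - m*c; the observable is a population-weighted average."""
    dG = dG_h2o - m * c
    f_unf = 1.0 / (1.0 + np.exp(dG / (R * T)))   # unfolded fraction
    return y_folded + (y_unfolded - y_folded) * f_unf

# Hypothetical signal (e.g., fraction of deuterium retained) vs [denaturant] in M.
conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
signal = np.array([0.95, 0.94, 0.91, 0.82, 0.55, 0.25, 0.10, 0.06, 0.05])

popt, _ = curve_fit(two_state, conc, signal, p0=[5.0, 2.0, 0.95, 0.05])
dG_h2o, m_value = popt[0], popt[1]
print(f"dG_H2O ~ {dG_h2o:.1f} kcal/mol, m ~ {m_value:.1f} kcal/mol/M")
```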

  18. Combining archeomagnetic and volcanic data with historical geomagnetic observations to reconstruct global field evolution over the past 1000 years, including new paleomagnetic data from historical lava flows on Fogo, Cape Verde

    NASA Astrophysics Data System (ADS)

    Korte, M. C.; Senftleben, R.; Brown, M. C.; Finlay, C. C.; Feinberg, J. M.; Biggin, A. J.

    2016-12-01

    Geomagnetic field evolution of the recent past can be studied using different data sources: Jackson et al. (2000) combined historical observations with modern field measurements to derive a global geomagnetic field model (gufm1) spanning 1590 to 1990. Several published young archeo- and volcanic paleomagnetic data fall into this time interval. Here, we directly combine data from these different sources to derive a global field model covering the past 1000 years. We particularly focus on reliably recovering dipole moment evolution prior to the times of the first direct absolute intensity observations at around 1840. We first compared the different data types and their agreement with the gufm1 model to assess their compatibility and reliability. We used these results, in combination with statistical modelling tests, to obtain suitable uncertainty estimates as weighting factors for the data in the final model. In addition, we studied samples from seven lava flows from the island of Fogo, Cape Verde, erupted between 1664 and 1857. Oriented samples were available for two of them, providing declination and inclination results. Due to the complicated mineralogy of three of the flows, microwave paleointensity experiments using a modified version of the IZZI protocol were carried out on flows erupted in 1664, 1769, 1816 and 1847. The new directional results are compared with nearby historical data and the influence on, and agreement with, the new model are discussed.

  19. Spherical nanoindentation stress–strain curves

    DOE PAGES

    Pathak, Siddhartha; Kalidindi, Surya R.

    2015-03-24

    Although indentation experiments have long been used to measure the hardness and Young's modulus, the utility of this technique in analyzing the complete elastic–plastic response of materials under contact loading has only been realized in the past few years – mostly due to recent advances in testing equipment and analysis protocols. This paper provides a timely review of the recent progress made in this respect in extracting meaningful indentation stress–strain curves from the raw datasets measured in instrumented spherical nanoindentation experiments. These indentation stress–strain curves have produced highly reliable estimates of the indentation modulus and the indentation yield strength in the sample, as well as certain aspects of their post-yield behavior, and have been critically validated through numerical simulations using finite element models as well as direct in situ scanning electron microscopy (SEM) measurements on micro-pillars. Much of this recent progress was made possible through the introduction of a new measure of indentation strain and the development of new protocols to locate the effective zero-point of initial contact between the indenter and the sample in the measured datasets. As a result, this has led to a key advance in this field where it is now possible to reliably identify and analyze the initial loading segment in the indentation experiments.
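
    The two data-analysis steps named above, locating the effective zero point of initial contact and converting load-displacement data into indentation stress and strain, can be sketched as follows. The example fits the initial elastic segment to the Hertzian relation P = (4/3) E_eff sqrt(R_eff) h_e^(3/2) to estimate the zero-point offsets, then applies the commonly used definitions sigma_ind = P/(pi a^2) and epsilon_ind = 4 h_e/(3 pi a) with the elastic contact radius a = sqrt(R_eff h_e). It is a simplified, elastic-only illustration on synthetic data, not the authors' full protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

R_eff = 10e-6            # effective indenter radius, m (assumed)
E_eff = 100e9            # effective modulus, Pa (assumed known here)

# Synthetic raw data with small offsets in displacement and load.
h_star, P_star = 2e-9, 5e-6                       # true zero-point offsets
h_raw = np.linspace(0, 60e-9, 200) + h_star
P_raw = (4/3) * E_eff * np.sqrt(R_eff) * np.clip(h_raw - h_star, 0, None)**1.5 + P_star

def hertz(h, h0, P0):
    """Hertzian loading with unknown zero point (h0, P0)."""
    he = np.clip(h - h0, 0, None)
    return P0 + (4/3) * E_eff * np.sqrt(R_eff) * he**1.5

(h0, P0), _ = curve_fit(hertz, h_raw, P_raw, p0=[1e-9, 1e-6])   # rough initial guess

# Corrected elastic displacement and load, then stress-strain conversion.
h_e = np.clip(h_raw - h0, 1e-12, None)
P = P_raw - P0
a = np.sqrt(R_eff * h_e)                # elastic contact radius
sigma = P / (np.pi * a**2)              # indentation stress
epsilon = 4 * h_e / (3 * np.pi * a)     # indentation strain
print(f"zero point: h0 = {h0*1e9:.2f} nm, P0 = {P0*1e6:.2f} uN")
print(f"max indentation stress ~ {sigma.max()/1e9:.2f} GPa")
```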

  20. Comparison of Cryopreservation Protocols (Single and Two-steps) and Thawing (Fast and Slow) for Canine Sperm.

    PubMed

    Brito, Maíra M; Lúcio, Cristina F; Angrimani, Daniel S R; Losano, João Diego A; Dalmazzo, Andressa; Nichi, Marcílio; Vannucchi, Camila I

    2017-01-02

    Although several cryopreservation protocols exist, no systematic research has been carried out to confirm the most suitable protocol for canine sperm. This study aims to assess the effect of adding 5% glycerol during cryopreservation at 37°C (one-step) and 5°C (two-steps), in addition to testing two thawing protocols (37°C for 30 seconds, and 70°C for 8 seconds). We used 12 sperm samples divided into four experimental groups: Single-Step - Slow Thawing Group; Two-Step - Slow Thawing Group; Single-Step - Fast Thawing Group; and Two-Step - Fast Thawing Group. Frozen-thawed samples were submitted to automated analysis of sperm motility, evaluation of plasma membrane integrity, acrosomal integrity, mitochondrial activity, sperm morphology, sperm susceptibility to oxidative stress, and a sperm binding assay to the perivitelline membrane of chicken egg yolk. Considering the comparison between freezing protocols, no statistical differences were verified for any of the response variables. When the thawing protocols were compared, the slow thawing protocol yielded a higher number of sperm bound to the perivitelline membrane of chicken egg yolk than the fast thawing protocol. Regardless of the freezing process, the slow thawing protocol can be recommended for the large-scale cryopreservation of canine semen, since it shows a consistently better functional result.

  1. Development of bull trout sampling protocols

    Treesearch

    R. F. Thurow; J. T. Peterson; J. W. Guzevich

    2001-01-01

    This report describes results of research conducted in Washington in 2000 through Interagency Agreement #134100H002 between the U.S. Fish and Wildlife Service (USFWS) and the U.S. Forest Service Rocky Mountain Research Station (RMRS). The purpose of this agreement is to develop a bull trout (Salvelinus confluentus) sampling protocol by integrating...

  2. Assessment of levels of bacterial contamination of large wild game meat in Europe.

    PubMed

    Membré, Jeanne-Marie; Laroche, Michel; Magras, Catherine

    2011-08-01

    The variations in prevalence and levels of pathogens and fecal contamination indicators in large wild game meat were studied to assess their potential impact on consumers. This analysis was based on hazard analysis, data generation and statistical analysis. A total of 2919 meat samples from three species (red deer, roe deer, wild boar) were collected at French game meat traders' facilities using two sampling protocols. Information was gathered on the types of meat cuts (forequarter or haunch; first sampling protocol) or type of retail-ready meat (stewing meat or roasting meat; second protocol), and also on the meat storage conditions (frozen or chilled), country of origin (eight countries) and shooting season (autumn, winter, spring). The samples were analyzed in both protocols for detection and enumeration of Escherichia coli, coagulase+staphylococci and Clostridium perfringens. In addition, detection and enumeration of thermotolerant coliforms and Listeria monocytogenes were performed for samples collected in the first and second protocols, respectively. The levels of bacterial contamination of the raw meat were determined by performing statistical analysis involving probabilistic techniques and Bayesian inference. C. perfringens was found in the highest numbers for the three indicators of microbial quality, hygiene and good handling, and L. monocytogenes in the lowest. Differences in contamination levels between game species and between meats distributed as chilled or frozen products were not significant. These results might be included in quantitative exposure assessments. Copyright © 2011 Elsevier Ltd. All rights reserved.
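
    The abstract notes that contamination levels were characterized with probabilistic techniques and Bayesian inference. A minimal example of that style of analysis is a beta-binomial estimate of prevalence from presence/absence counts; the sketch below uses hypothetical counts and a uniform Beta(1, 1) prior and is not the model actually fitted by the authors.

```python
from scipy.stats import beta

# Hypothetical detection counts: (positives, samples tested) per species.
counts = {
    "red deer":  (12, 950),
    "roe deer":  (18, 1010),
    "wild boar": (25, 959),
}

a0, b0 = 1.0, 1.0   # uniform Beta prior on prevalence
for species, (pos, n) in counts.items():
    post = beta(a0 + pos, b0 + n - pos)           # conjugate posterior
    lo, hi = post.ppf(0.025), post.ppf(0.975)     # 95% credible interval
    print(f"{species:9s} prevalence ~ {post.mean():.3f} "
          f"(95% CrI {lo:.3f}-{hi:.3f})")
```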

  3. On the Serpentinization Degree (S) of IODP Expedition 357 Atlantis Massif Rocks: Insights from Rock Magnetic Properties and Microscopic Magnetic Mineralogy Study of six Sites

    NASA Astrophysics Data System (ADS)

    Herrero-Bervera, E.; Whattam, S. A.; Frederichs, T.

    2016-12-01

    We have studied the magnetic properties of 37 serpentinized samples recovered via drilling during IODP Expedition 357, Atlantis Massif. We have recovered various lithologies including ultramafic rocks (primarily extensively serpentinized), subordinate gabbros, dolerites (small-scale melt injections) and schists. We have conducted remanence and induced magnetization experiments on the samples to determine, for instance, the degree of serpentinization (S). Stepwise alternating field and thermal demagnetization experiments from 2.5 to 70 mT and from 28 to 700°C, respectively, yielded univectorial diagrams showing the removal of secondary components (e.g., VRM, IRM, CRM) by isolating a characteristic component (ChRM) at various fields and temperatures. The normalized intensity of demagnetization (J/Jo) shows that about 50% of the original magnetization is lost at about 5 mT and 100°C (i.e., the median destructive field). The stereograms show the magnetic stability of the specimens through their directional behavior after 4 demagnetization steps (7.5-10 mT fields and low temperatures). Induced magnetization experiments such as SIRMs, hysteresis saturation loops, back-field curves and FORC measurements were performed. Diagnostic values of Mrs/Ms and Bcr/Bc determine the domain structure of a magnetic sample. The magnetic grain sizes were determined using the protocol of Dunlop [2000]. Most of the samples were distributed over the single domain (SD) and pseudo-single domain (PSD) ranges, and a few over the multi-domain (MD) range, with a certain degree of clustering in the PSD range. Curie points were obtained by measuring low-field susceptibility vs. temperature from 28°C up to 700°C in an argon atmosphere, showing 1-4 magnetic mineral phases with temperatures ranging from 100°C up to 640°C. These phases are predominantly Ti-poor and Ti-rich magnetite, maghemite and magnetite, as corroborated by microscopic analysis as well as the Verwey transition (Tv≈110-120K). The samples studied show appreciable variation in bulk susceptibility (77.8 x 10-3 to 0.31 x 10-3 SI units). The samples are characterized by low, intermediate and high degrees of serpentinization based on the results of their magnetic properties (e.g., kappa, density, magnetic stability and Mrs/Ms vs Bcr/Bc).

  4. Flow cytometric method for measuring chromatin fragmentation in fixed sperm from yellow perch (Perca flavescens).

    PubMed

    Jenkins, J A; Draugelis-Dale, R O; Pinkney, A E; Iwanowicz, L R; Blazer, V S

    2015-03-15

    Declining harvests of yellow perch, Perca flavescens, in urbanized watersheds of Chesapeake Bay have prompted investigations of their reproductive fitness. The purpose of this study was to establish a flow cytometric technique for DNA analysis of fixed samples sent from the field to provide reliable gamete quality measurements. Similar to the sperm chromatin structure assay, measures were made on the susceptibility of nuclear DNA to acid-induced denaturation, but used fixed rather than live or thawed cells. Nuclei were best exposed to the acid treatment for 1 minute at 37 °C followed by the addition of cold (4 °C) propidium iodide staining solution before flow cytometry. The rationale for protocol development is presented graphically through cytograms. Field results collected in 2008 and 2009 revealed DNA fragmentation up to 14.5%. In 2008, DNA fragmentation from the more urbanized watersheds was significantly greater than from reference sites (P = 0.026) and in 2009, higher percentages of haploid testicular cells were noted from the less urbanized watersheds (P = 0.032) indicating better reproductive condition at sites with less urbanization. For both years, total and progressive live sperm motilities by computer-assisted sperm motion analysis ranged from 19.1% to 76.5%, being significantly higher at the less urbanized sites (P < 0.05). This flow cytometric method takes advantage of the propensity of fragmented DNA to be denatured under standard conditions, or 1 minute at 37 °C with 10% buffered formalin-fixed cells. The study of fixed sperm makes possible the retrospective investigation of germplasm fragmentation, spermatogenic ploidy patterns, and chromatin compaction levels from samples translocated over distance and time. The protocol provides an approach that can be modified for other species across taxa. Published by Elsevier Inc.

  5. Flow cytometric method for measuring chromatin fragmentation in fixed sperm from yellow perch (Perca flavescens)

    USGS Publications Warehouse

    Jenkins, Jill A.; Draugelis-Dale, Rassa O.; Pinkney, Alfred E.; Iwanowicz, Luke R.; Blazer, Vicki

    2015-01-01

    Declining harvests of yellow perch, Perca flavescens, in urbanized watersheds of Chesapeake Bay have prompted investigations of their reproductive fitness. The purpose of this study was to establish a flow cytometric technique for DNA analysis of fixed samples sent from the field to provide reliable gamete quality measurements. Similar to the sperm chromatin structure assay, measures were made on the susceptibility of nuclear DNA to acid-induced denaturation, but used fixed rather than live or thawed cells. Nuclei were best exposed to the acid treatment for 1 minute at 37 °C followed by the addition of cold (4 °C) propidium iodide staining solution before flow cytometry. The rationale for protocol development is presented graphically through cytograms. Field results collected in 2008 and 2009 revealed DNA fragmentation up to 14.5%. In 2008, DNA fragmentation from the more urbanized watersheds was significantly greater than from reference sites (P = 0.026) and in 2009, higher percentages of haploid testicular cells were noted from the less urbanized watersheds (P = 0.032) indicating better reproductive condition at sites with less urbanization. For both years, total and progressive live sperm motilities by computer-assisted sperm motion analysis ranged from 19.1% to 76.5%, being significantly higher at the less urbanized sites (P < 0.05). This flow cytometric method takes advantage of the propensity of fragmented DNA to be denatured under standard conditions, or 1 minute at 37 °C with 10% buffered formalin–fixed cells. The study of fixed sperm makes possible the retrospective investigation of germplasm fragmentation, spermatogenic ploidy patterns, and chromatin compaction levels from samples translocated over distance and time. The protocol provides an approach that can be modified for other species across taxa.

  6. Phase II of the International Study of Asthma and Allergies in Childhood (ISAAC II): rationale and methods.

    PubMed

    Weiland, S K; Björkstén, B; Brunekreef, B; Cookson, W O C; von Mutius, E; Strachan, D P

    2004-09-01

    International comparative studies, investigating whether disease incidence or prevalence rates differ between populations and, if so, which factors explain the observed differences, have made important contributions to the understanding of disease aetiology in many areas. In Phase I of the International Study of Asthma and Allergies in Childhood (ISAAC), the prevalence rates of symptoms of asthma, allergic rhinitis and atopic eczema in 13-14-yr-olds, assessed by standardised questionnaires, were found to differ >20-fold between the 155 study centres around the world. Phase II of ISAAC aims to identify determinants of these differences by studying informative populations. A detailed study protocol was developed for use in community-based random samples of children aged 9-11 yrs. The study modules include standardised questionnaires with detailed questions on the occurrence and severity of symptoms of asthma, allergic rhinitis and atopic eczema, their clinical management, and a broad range of previous and current exposure conditions. In addition, standardised protocols were applied for examination of flexural dermatitis, skin-prick testing, bronchial challenge with hypertonic saline, blood sampling for immunoglobulin E analyses and genotyping, and dust sampling for assessment of indoor exposures to allergens and endotoxin. To date, ISAAC II field work has been completed or started in 30 study centres in 22 countries. The majority of centres are in countries that participated in International Study of Asthma and Allergies in Childhood Phase I and reflect almost the full range of the observed variability in Phase I prevalence rates.

  7. iSANLA: intelligent sensor and actuator network for life science applications.

    PubMed

    Schloesser, Mario; Schnitzer, Andreas; Ying, Hong; Silex, Carmen; Schiek, Michael

    2008-01-01

    In the fields of neurological rehabilitation and neurophysiological research there is a strong need for miniaturized, multi-channel, battery-driven, wirelessly networking DAQ systems enabling real-time digital signal processing and feedback experiments. For the scientific investigation of the passive auditory-based 3D orientation of barn owls and the research on vegetative locomotor coordination of Parkinson's disease patients during rehabilitation, we developed our 'intelligent Sensor and Actuator Network for Life science Application' (iSANLA) system. Implemented on the ultra-low-power microcontroller MSP430, sample rates up to 96 kHz have been realised for single-channel DAQ. The system includes lossless local data storage of up to 4 GB. With outer dimensions of 20 mm per rim and a weight of less than 15 g including the lithium-ion battery, our modularly designed sensor node is capable of up to eight-channel recordings at an 8 kHz sample rate each and provides sufficient computational power for digital signal processing, ready to start our first mobile experiments. For wireless mobility, a compact communication protocol based on the IEEE 802.15.4 wireless standard with net data rates up to 141 kbit/s has been implemented. To merge the losslessly acquired data of the distributed iNODEs, a time synchronization protocol that preserves causality has been developed. Hence the necessary time-synchronous start of data acquisition inside a network of multiple sensors has been realized with a precision better than one sample period at the highest sample rate.

  8. Radionuclide observables during the Integrated Field Exercise of the Comprehensive Nuclear-Test-Ban Treaty.

    PubMed

    Burnett, Jonathan L; Miley, Harry S; Milbrath, Brian D

    2016-03-01

    In 2014 the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) undertook an Integrated Field Exercise (IFE14) in Jordan. The exercise consisted of a simulated 0.5-2 kT underground nuclear explosion triggering an On-site Inspection (OSI) to search for evidence of a Treaty violation. This research paper evaluates two of the OSI techniques used during the IFE14, laboratory-based gamma-spectrometry of soil samples and in-situ gamma-spectrometry, both of which were implemented to search for 17 OSI relevant particulate radionuclides indicative of nuclear explosions. The detection sensitivity is evaluated using real IFE and model data. It indicates that higher sensitivity laboratory measurements are the optimum technique during the IFE and within the Treaty/Protocol-specified OSI timeframes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. How to Improve the Peer Review Method: Free-Selection vs Assigned-Pair Protocol Evaluated in a Computer Networking Course

    ERIC Educational Resources Information Center

    Papadopoulos, Pantelis M.; Lagkas, Thomas D.; Demetriadis, Stavros N.

    2012-01-01

    This study provides field research evidence on the efficiency of a "free-selection" peer review assignment protocol as compared to the typically implemented "assigned-pair" protocol. The study employed 54 sophomore students who were randomly assigned into three groups: Assigned-Pair (AP) (the teacher assigns student works for review to student…

  10. Method- and species-specific detection probabilities of fish occupancy in Arctic lakes: Implications for design and management

    USGS Publications Warehouse

    Haynes, Trevor B.; Rosenberger, Amanda E.; Lindberg, Mark S.; Whitman, Matthew; Schmutz, Joel A.

    2013-01-01

    Studies examining species occurrence often fail to account for false absences in field sampling. We investigate detection probabilities of five gear types for six fish species in a sample of lakes on the North Slope, Alaska. We used an occupancy modeling approach to provide estimates of detection probabilities for each method. Variation in gear- and species-specific detection probability was considerable. For example, detection probabilities for the fyke net ranged from 0.82 (SE = 0.05) for least cisco (Coregonus sardinella) to 0.04 (SE = 0.01) for slimy sculpin (Cottus cognatus). Detection probabilities were also affected by site-specific variables such as depth of the lake, year, day of sampling, and lake connection to a stream. With the exception of the dip net and shore minnow traps, each gear type provided the highest detection probability of at least one species. Results suggest that a multimethod approach may be most effective when attempting to sample the entire fish community of Arctic lakes. Detection probability estimates will be useful for designing optimal fish sampling and monitoring protocols in Arctic lakes.
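
    The detection probabilities described above come from an occupancy-modeling framework in which each lake is occupied with probability psi and each survey detects the species, if present, with probability p. A minimal sketch of the corresponding single-season likelihood and its maximization is shown below; the detection histories are made up, and psi and p are held constant, whereas the study's models also included gear- and site-specific covariates.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical detection histories: rows = lakes, columns = repeat surveys
# (1 = species detected, 0 = not detected).
Y = np.array([
    [1, 0, 1],
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
])

def neg_log_lik(theta, Y):
    """Single-season occupancy likelihood with constant psi and p."""
    psi = 1 / (1 + np.exp(-theta[0]))      # logit-scale parameters
    p = 1 / (1 + np.exp(-theta[1]))
    k = Y.shape[1]
    ll = 0.0
    for y in Y:
        d = y.sum()
        if d > 0:       # occupied and detected at least once
            ll += np.log(psi) + d * np.log(p) + (k - d) * np.log(1 - p)
        else:           # never detected: occupied-but-missed or unoccupied
            ll += np.log(psi * (1 - p) ** k + (1 - psi))
    return -ll

res = minimize(neg_log_lik, x0=[0.0, 0.0], args=(Y,), method="Nelder-Mead")
psi_hat, p_hat = 1 / (1 + np.exp(-res.x))
print(f"psi ~ {psi_hat:.2f}, p ~ {p_hat:.2f}")
```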

  11. Quantum memory with a controlled homogeneous splitting

    NASA Astrophysics Data System (ADS)

    Hétet, G.; Wilkowski, D.; Chanelière, T.

    2013-04-01

    We propose a quantum memory protocol where an input light field can be stored onto and released from a single ground-state atomic ensemble by dynamically controlling the strength of an external static and homogeneous field. The technique relies on the adiabatic following of a polaritonic excitation onto a state for which the forward collective radiative emission is forbidden. The resemblance with the archetypal electromagnetically induced transparency is only formal because no ground-state coherence-based slow-light propagation is considered here. As compared to the other grand category of protocols derived from the photon-echo technique, our approach only involves a homogeneous static field. We discuss two physical situations where the effect can be observed, and show that in the limit where the excited-state lifetime is longer than the storage time, the protocols are perfectly efficient and noise free. We compare the technique with other quantum memories, and propose atomic systems where the experiment can be realized.

  12. Direct and sensitive detection of foodborne pathogens within fresh produce samples using a field-deployable handheld device.

    PubMed

    You, David J; Geshell, Kenneth J; Yoon, Jeong-Yeol

    2011-10-15

    Direct and sensitive detection of foodborne pathogens from fresh produce samples was accomplished using a handheld lab-on-a-chip device, requiring little to no sample processing and enrichment steps for a near-real-time detection and truly field-deployable device. The detection of Escherichia coli K12 and O157:H7 in iceberg lettuce was achieved utilizing optimized Mie light scatter parameters with a latex particle immunoagglutination assay. The system exhibited good sensitivity, with a limit of detection of 10 CFU mL(-1) and an assay time of <6 min. Minimal pretreatment with no detrimental effects on assay sensitivity and reproducibility was accomplished with a simple and cost-effective KimWipes filter and disposable syringe. Mie simulations were used to determine the optimal parameters (particle size d, wavelength λ, and scatter angle θ) for the assay that maximize light scatter intensity of agglutinated latex microparticles and minimize light scatter intensity of the tissue fragments of iceberg lettuce, which were experimentally validated. This introduces a powerful method for detecting foodborne pathogens in fresh produce and other potential sample matrices. The integration of a multi-channel microfluidic chip allowed for differential detection of the agglutinated particles in the presence of the antigen, revealing a true field-deployable detection system with decreased assay time and improved robustness over comparable benchtop systems. Additionally, two sample preparation methods were evaluated through simulated field studies based on overall sensitivity, protocol complexity, and assay time. Preparation of the plant tissue sample by grinding resulted in a two-fold improvement in scatter intensity over washing, accompanied with a significant increase in assay time: ∼5 min (grinding) versus ∼1 min (washing). Specificity studies demonstrated binding of E. coli O157:H7 EDL933 to only O157:H7 antibody conjugated particles, with no cross-reactivity to K12. This suggests the adaptability of the system for use with a wide variety of pathogens, and the potential to detect in a variety of biological matrices with little to no sample pretreatment. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. Lab on a Chip

    NASA Astrophysics Data System (ADS)

    Puget, P.

    The reliable and fast detection of chemical or biological molecules, or the measurement of their concentrations in a sample, are key problems in many fields such as environmental analysis, medical diagnosis, or the food industry. There are traditionally two approaches to this problem. The first aims to carry out a measurement in situ in the sample using chemical and biological sensors. The constraints imposed by detection limits, specificity, and in some cases stability are entirely imputed to the sensor. The second approach uses so-called total analysis systems to process the sample according to a protocol made up of different steps, such as extractions, purifications, concentrations, and a final detection stage. The latter is made in better conditions than with the first approach, which may justify the greater complexity of the process. It is this approach that is implemented in most methods for identifying pathogens, whether they be in biological samples (especially for in vitro diagnosis) or samples taken from the environment. The instrumentation traditionally used to carry out these protocols comprises a set of bulky benchtop apparatus, which needs to be plugged into the mains in order to function. However, there are many specific applications (to be discussed in this chapter) for which analysis instruments with the following characteristics are needed: Possibility of use outside the laboratory, i.e., instruments as small as possible, consuming little energy, and largely insensitive to external conditions of temperature, humidity, vibrations, and so on. Possibility of use by non-specialised agents, or even unmanned operation. Possibility of handling a large number of samples in a limited time, typically for high-throughput screening applications. Possibility of handling small samples. At the same time, a high level of performance is required, in particular in terms of (1) the detection limit, which must be as low as possible, (2) specificity, i.e., the ability to detect a particular molecule in a complex mixture, and (3) speed.

  14. Understanding biological mechanisms underlying adverse birth outcomes in developing countries: protocol for a prospective cohort (AMANHI bio–banking) study

    PubMed Central

    Baqui, Abdullah H; Khanam, Rasheda; Rahman, Mohammad Sayedur; Ahmed, Aziz; Rahman, Hasna Hena; Moin, Mamun Ibne; Ahmed, Salahuddin; Jehan, Fyezah; Nisar, Imran; Hussain, Atiya; Ilyas, Muhammad; Hotwani, Aneeta; Sajid, Muhammad; Qureshi, Shahida; Zaidi, Anita; Sazawal, Sunil; Ali, Said M; Deb, Saikat; Juma, Mohammed Hamad; Dhingra, Usha; Dutta, Arup; Ame, Shaali Makame; Hayward, Caroline; Rudan, Igor; Zangenberg, Mike; Russell, Donna; Yoshida, Sachiyo; Polašek, Ozren; Manu, Alexander; Bahl, Rajiv

    2017-01-01

    Objectives The AMANHI study aims to identify biomarkers as predictors of important pregnancy–related outcomes, and establish a biobank in developing countries for future research as new methods and technologies become available. Methods AMANHI is using harmonised protocols to enrol 3000 women in early pregnancies (8–19 weeks of gestation) for population–based follow–up in pregnancy up to 42 days postpartum in Bangladesh, Pakistan and Tanzania, with collection taking place between August 2014 and June 2016. Urine pregnancy tests will be used to confirm reported or suspected pregnancies, followed by screening ultrasound by trained sonographers to accurately date the pregnancy. Trained study field workers will collect very detailed phenotypic and epidemiological data from the pregnant woman and her family at scheduled home visits during pregnancy (enrolment, 24–28 weeks, 32–36 weeks & 38+ weeks) and postpartum (days 0–6 or 42–60). Trained phlebotomists will collect maternal and umbilical blood samples, centrifuge and obtain aliquots of serum, plasma and the buffy coat for storage. They will also measure HbA1C and collect a dried spot sample of whole blood. Maternal urine samples will also be collected and stored, alongside placenta, umbilical cord tissue and membrane samples, which will both be frozen and prepared for histology examination. Maternal and newborn stool (for microbiota) as well as paternal and newborn saliva samples (for DNA extraction) will also be collected. All samples will be stored at –80°C in the biobank in each of the three sites. These samples will be linked to numerous epidemiological and phenotypic data with unique study identification numbers. Importance of the study The AMANHI biobank proves that biobanking is feasible to implement in LMICs, but recognises that biobank creation is only the first step in addressing current global challenges. PMID:29163938

  15. Network Analysis with SiLK

    DTIC Science & Technology

    2015-01-06

    Slide excerpts on SiLK flow analysis. rwcut default display fields: sIP, sPort; dIP, dPort; protocol; packets, bytes; flags; sTime, eTime, duration. Example TCP/IP sockets: IP address 10.0.0.1, L4 protocol TCP, high-numbered ephemeral port; IP address 203.0.113.1, L4 protocol TCP, low-numbered port. Fields found to be useful in analysis: source address, destination address; source port, destination port (Internet Control Message Protocol ...)
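
    For context, the sketch below shows how delimited flow-record text with the fields listed above (sIP, sPort, dIP, dPort, protocol, packets, bytes) might be parsed for a simple traffic summary; the pipe-delimited layout and header row are assumptions for illustration, not a guaranteed SiLK rwcut output format.

```python
import csv
from collections import Counter
from io import StringIO

# Assumed pipe-delimited export with a header row of field names.
sample = StringIO("""\
sIP|sPort|dIP|dPort|protocol|packets|bytes
10.0.0.1|49732|203.0.113.1|80|6|12|4310
10.0.0.1|49733|203.0.113.1|443|6|30|22100
10.0.0.2|51000|203.0.113.7|53|17|2|310
""")

reader = csv.DictReader(sample, delimiter="|")
bytes_per_dport = Counter()
for row in reader:
    bytes_per_dport[row["dPort"]] += int(row["bytes"])

# Rank destination ports by traffic volume.
for dport, total in bytes_per_dport.most_common():
    print(f"dPort {dport}: {total} bytes")
```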

  16. A Secure and Efficient Handover Authentication Protocol for Wireless Networks

    PubMed Central

    Wang, Weijia; Hu, Lei

    2014-01-01

    Handover authentication protocol is a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we firstly review an efficient handover authentication protocol, named PairHand, and its existing security attacks and improvements. Then, we present an improved key recovery attack by using the linear combining method and reanalyze its feasibility on the improved PairHand protocol. Finally, we present a new handover authentication protocol, which not only achieves the same desirable efficiency features of PairHand, but also enjoys provable security in the random oracle model. PMID:24971471

  17. Design of the frame structure for a multiservice interactive system using ATM-PON

    NASA Astrophysics Data System (ADS)

    Nam, Jae-Hyun; Jang, Jongwook; Lee, Jung-Tae

    1998-10-01

    The MAC (Medium Access Control) protocol controls the B-NT1s' (Optical Network Units') access to the shared capacity on the PON; this protocol is very important if TDMA (Time Division Multiple Access) multiplexing is used on the upstream. To control the upstream traffic, some kind of access protocol has to be implemented. There are roughly two different approaches to the use of request cells: a collision-free approach, and one in which collisions in a request slot are allowed. The objective of this paper is to describe a MAC-protocol structure that supports both approaches and hybrids of them. In our paper we guarantee the QoS (Quality of Service) of each B-NT1 through the LOC, LOV and LOA fields, which are the length fields of the cells transmitted at each B-NT1. Each B-NT1 transmits its request status in a request cell.
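
    As a rough illustration of the request/grant idea behind such a TDMA MAC (and not the frame structure actually proposed in the paper), the sketch below has each B-NT1 report three queue-length fields, hypothetically named LOC, LOV and LOA since the record does not expand them, and a head-end scheduler allocate upstream slots in proportion to the reported backlog.

```python
from dataclasses import dataclass

@dataclass
class RequestCell:
    """Hypothetical upstream request cell: per-B-NT1 queue-length report."""
    nt_id: int
    loc: int   # length field 1 (field name taken from the record; meaning assumed)
    lov: int   # length field 2
    loa: int   # length field 3

def allocate_grants(requests, slots_per_frame=53):
    """Grant upstream slots proportionally to each B-NT1's reported backlog."""
    backlog = {r.nt_id: r.loc + r.lov + r.loa for r in requests}
    total = sum(backlog.values()) or 1
    grants = {nt: (slots_per_frame * b) // total for nt, b in backlog.items()}
    # Hand out slots lost to integer rounding to the most backlogged B-NT1s.
    leftover = slots_per_frame - sum(grants.values())
    for nt, _ in sorted(backlog.items(), key=lambda kv: -kv[1])[:leftover]:
        grants[nt] += 1
    return grants

requests = [RequestCell(1, 10, 4, 0), RequestCell(2, 2, 2, 2), RequestCell(3, 0, 0, 8)]
print(allocate_grants(requests))   # e.g. {1: 27, 2: 11, 3: 15}
```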

  18. A Protocol for Collecting Human Cardiac Tissue for Research.

    PubMed

    Blair, Cheavar A; Haynes, Premi; Campbell, Stuart G; Chung, Charles; Mitov, Mihail I; Dennis, Donna; Bonnell, Mark R; Hoopes, Charles W; Guglin, Maya; Campbell, Kenneth S

    2016-01-01

    This manuscript describes a protocol at the University of Kentucky that allows a translational research team to collect human myocardium that can be used for biological research. We have gained a great deal of practical experience since we started this protocol in 2008, and we hope that other groups might be able to learn from our endeavors. To date, we have procured ~4000 samples from ~230 patients. The tissue that we collect comes from organ donors and from patients who are receiving a heart transplant or a ventricular assist device because they have heart failure. We begin our manuscript by describing the importance of human samples in cardiac research. Subsequently, we describe the process for obtaining consent from patients, the cost of running the protocol, and some of the issues and practical difficulties that we have encountered. We conclude with some suggestions for other researchers who may be considering starting a similar protocol.

  19. A Protocol for Collecting Human Cardiac Tissue for Research

    PubMed Central

    Blair, Cheavar A.; Haynes, Premi; Campbell, Stuart G.; Chung, Charles; Mitov, Mihail I.; Dennis, Donna; Bonnell, Mark R.; Hoopes, Charles W.; Guglin, Maya; Campbell, Kenneth S.

    2016-01-01

    This manuscript describes a protocol at the University of Kentucky that allows a translational research team to collect human myocardium that can be used for biological research. We have gained a great deal of practical experience since we started this protocol in 2008, and we hope that other groups might be able to learn from our endeavors. To date, we have procured ~4000 samples from ~230 patients. The tissue that we collect comes from organ donors and from patients who are receiving a heart transplant or a ventricular assist device because they have heart failure. We begin our manuscript by describing the importance of human samples in cardiac research. Subsequently, we describe the process for obtaining consent from patients, the cost of running the protocol, and some of the issues and practical difficulties that we have encountered. We conclude with some suggestions for other researchers who may be considering starting a similar protocol. PMID:28042604

  20. A Mobile Satellite Experiment (MSAT-X) network definition

    NASA Technical Reports Server (NTRS)

    Wang, Charles C.; Yan, Tsun-Yee

    1990-01-01

    The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described. The results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which the Mobile Satellite Systems (MSSs) operator can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The Communication Interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure two basic protocols are presented: the channel access protocol, and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds developed for experimentally simulating the channel access protocol and link control protocol, respectively, is presented. A sample network configuration and some future network activities of the MSAT-X project are also presented.

  1. DOC and nitrate export linked to dominant rainfall-runoff processes, end-members and seasonality - a long-term high frequency measurement campaign

    NASA Astrophysics Data System (ADS)

    Schwab, M. P.; Klaus, J.; Pfister, L.; Weiler, M.

    2016-12-01

    Over the past decades, stream sampling protocols for hydro-geochemical parameters were often limited by logistical and technological constraints. While long-term monitoring protocols were typically based on weekly sampling intervals, high frequency sampling was commonly limited to a few single events. In this contribution, we combined high frequency and long-term measurements to understand DOC and nitrate dynamics in a forest headwater for different runoff events and seasons. Our study area is the forested Weierbach catchment (0.47 km2) in Luxembourg, where the fractured schist bedrock is covered by cambisol soils. The runoff response is characterized by a double peak behaviour. The first peak occurs during or right after a rainfall event, triggered by fast near-surface runoff generation processes, while a second delayed peak lasts several days and is generated by subsurface flow. This second peak occurs only if a distinct storage threshold of the catchment is exceeded. Our observations were carried out with a field-deployable UV-Vis spectrometer measuring DOC and nitrate concentrations in situ at 15 min intervals for more than two years. In addition, a long-term validation was carried out with data obtained from the analysis of water collected with grab samples. The long-term, high-frequency measurements allowed us to calculate a complete and detailed balance of DOC and nitrate export over two years. Transport behaviour of DOC and nitrate showed different dynamics between the first and second hydrograph peaks: DOC is mainly exported during the first peaks, while nitrate is mostly exported during the delayed second peaks. Biweekly end-member measurements of soil and groundwater over several years enable us to link the behaviour of DOC and nitrate export to various end-members in the catchment. Altogether, the long-term and high-frequency time series provide the opportunity to study DOC and nitrate export processes without having to rely solely on a few single-event measurements or on coarse measurement protocols.

  2. ENVIRONMENTAL EVALUATION FOR UTILIZATION OF ASH IN SOIL STABILIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David J. Hassett; Loreal V. Heebink

    2001-08-01

    The Minnesota Pollution Control Agency (MPCA) approved the use of coal ash in soil stabilization, indicating that environmental data needed to be generated. The overall project goal is to evaluate the potential for release of constituents into the environment from ash used in soil stabilization projects. Supporting objectives are: (1) To ensure sample integrity through implementation of a sample collection, preservation, and storage protocol to avoid analyte concentration or loss. (2) To evaluate the potential of each component (ash, soil, water) of the stabilized soil to contribute to environmental release of analytes of interest. (3) To use laboratory leaching methods to evaluate the potential for release of constituents to the environment. (4) To facilitate collection of and to evaluate samples from a field runoff demonstration effort. The results of this study indicated limited mobility of the coal combustion fly ash constituents in laboratory tests and the field runoff samples. The results presented support previous work showing little to negligible impact on water quality. This and past work indicates that soil stabilization is an environmentally beneficial CCB utilization application as encouraged by the U.S. Environmental Protection Agency. This project addressed the regulatory-driven environmental aspect of fly ash use for soil stabilization, but the demonstrated engineering performance and economic advantages also indicate that the use of CCBs in soil stabilization can and should become an accepted engineering option.

  3. Detection of Noble Gas Radionuclides from an Underground Nuclear Explosion During a CTBT On-Site Inspection

    NASA Astrophysics Data System (ADS)

    Carrigan, Charles R.; Sun, Yunwei

    2014-03-01

    The development of a technically sound approach to detecting the subsurface release of noble gas radionuclides is a critical component of the on-site inspection (OSI) protocol under the Comprehensive Nuclear Test Ban Treaty. In this context, we are investigating a variety of technical challenges that have a significant bearing on policy development and technical guidance regarding the detection of noble gases and the creation of a technically justifiable OSI concept of operation. The work focuses on optimizing the ability to capture radioactive noble gases subject to the constraints of possible OSI scenarios. This focus results from recognizing the difficulty of detecting gas releases in geologic environments—a lesson we learned previously from the non-proliferation experiment (NPE). Most of our evaluations of a sampling or transport issue necessarily involve computer simulations. This is partly due to the lack of OSI-relevant field data, such as that provided by the NPE, and partly a result of the ability of computer-based models to test a range of geologic and atmospheric scenarios far beyond what could ever be studied by field experiments, making this approach highly cost-effective. We review some highlights of the transport and sampling issues we have investigated and complete the discussion of these issues with a description of a preliminary design for subsurface sampling that addresses some of the sampling challenges discussed here.

  4. Mars Sample Quarantine Protocol Workshop

    NASA Technical Reports Server (NTRS)

    DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)

    1999-01-01

    The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.

  5. Adaptive Peer Sampling with Newscast

    NASA Astrophysics Data System (ADS)

    Tölgyesi, Norbert; Jelasity, Márk

    The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates, which can result in very poor random sampling especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters: without involving failure detectors, nodes passively monitor local protocol events and use them as feedback in a local control loop that self-tunes the protocol parameters. The proposed solution is evaluated by simulation experiments.
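
    A minimal sketch of a Newscast-style exchange (assuming the usual merge-and-keep-the-freshest cache rule; the adaptive self-tuning proposed in the paper is not reproduced here) might look as follows:

        import random

        CACHE_SIZE = 8  # "c", the cache size in the Newscast literature

        class Node:
            def __init__(self, name):
                self.name = name
                self.clock = 0   # stand-in for the wall-clock timestamps Newscast uses
                self.cache = {}  # peer name -> freshness of its descriptor

            def gossip_with(self, other):
                """One exchange: both peers merge caches and keep the freshest entries."""
                self.clock += 1
                other.clock += 1
                merged = {**self.cache, **other.cache,
                          self.name: self.clock, other.name: other.clock}
                freshest = sorted(merged.items(), key=lambda kv: -kv[1])[:CACHE_SIZE]
                self.cache, other.cache = dict(freshest), dict(freshest)

            def sample_peer(self):
                """The peer-sampling service call: a random peer currently in the cache."""
                peers = [p for p in self.cache if p != self.name]
                return random.choice(peers) if peers else None

        # Tiny demo: a few gossip rounds among 20 nodes, then draw one sample.
        nodes = [Node(f"n{i}") for i in range(20)]
        for _ in range(50):
            a, b = random.sample(nodes, 2)
            a.gossip_with(b)
        print(nodes[0].sample_peer())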

  6. An efficacious oral health care protocol for immunocompromised patients.

    PubMed

    Solomon, C S; Shaikh, A B; Arendorf, T M

    1995-01-01

    A twice-weekly oral and perioral examination was provided to 120 patients receiving antineoplastic therapy. Sixty patients were monitored while following the traditional hospital oral care protocol (chlorhexidine, hydrogen peroxide, sodium bicarbonate, thymol glycol, benzocaine mouthrinse, and nystatin). The mouth care protocol was then changed (experimental protocol = chlorhexidine, benzocaine lozenges, amphotericin B lozenges), and patients were monitored until the sample size matched that of the hospital mouth care regime. There was a statistically significant reduction in oral complications upon introduction and maintenance of the experimental protocol.

  7. PROTOCOL FOR EXAMINATION OF THE INNER CAN CLOSURE WELD REGION FOR 3013 DE CONTAINERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mickalonis, J.

    2014-09-16

    The protocol for the examination of the inner can closure weld region (ICCWR) for 3013 DE containers is presented within this report. The protocol includes sectioning of the inner can lid section, documenting the surface condition, measuring corrosion parameters, and storing of samples. This protocol may change as the investigation develops since findings may necessitate additional steps be taken. Details of the previous analyses, which formed the basis for this protocol, are also presented.

  8. On fixed-area plot sampling for downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Paul C. Van Deusen

    2011-01-01

    The use of fixed-area plots for sampling down coarse woody debris is reviewed. A set of clearly defined protocols for two previously described methods is established and a new method, which we call the 'sausage' method, is developed. All methods (protocols) are shown to be unbiased for volume estimation, but not necessarily for estimation of population...

  9. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR LABORATORY ANALYSIS OF HAIR SAMPLES FOR MERCURY (RTI-L-1.0)

    EPA Science Inventory

    The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...

  10. 21 CFR 610.2 - Requests for samples and protocols; official release.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Biologics Evaluation and Research, a manufacturer shall not distribute a lot of a product until the lot is... Evaluation and Research, a manufacturer shall not distribute a lot of a biological product until the lot is... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Requests for samples and protocols; official...

  11. 21 CFR 610.2 - Requests for samples and protocols; official release.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Biologics Evaluation and Research, a manufacturer shall not distribute a lot of a product until the lot is... Evaluation and Research, a manufacturer shall not distribute a lot of a biological product until the lot is... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Requests for samples and protocols; official...

  12. Sampling and measurement protocols for long-term silvicultural studies on the Penobscot Experimental Forest

    Treesearch

    Justin D. Waskiewicz; Laura S. Kenefic; Nicole S. Rogers; Joshua J. Puhlick; John C. Brissette; Richard J. Dionne

    2015-01-01

    The U.S. Forest Service, Northern Research Station has been conducting research on the silviculture of northern conifers on the Penobscot Experimental Forest (PEF) in Maine since 1950. Formal study plans provide guidance and specifications for the experimental treatments, but documentation is also needed to ensure consistency in data collection and sampling protocols....

  13. Effects of Different Conditioning Activities on 100-m Dash Performance in High School Track and Field Athletes.

    PubMed

    Ferreira-Júnior, João B; Guttierres, Ana P M; Encarnação, Irismar G A; Lima, Jorge R P; Borba, Diego A; Freitas, Eduardo D S; Bemben, Michael G; Vieira, Carlos A; Bottaro, Martim

    2018-06-01

    This study compared the effects of different conditioning activities on the 100-m dash performance of 11 male, high school track and field athletes (mean age = 16.3; SD = 1.2 years). Participants performed a 100-m dash seven minutes after each of four randomized conditioning protocols, with each condition and 100-m dash separated by 3-10 days. The conditioning protocols were (a) control, no conditioning activity; (b) weighted plyometric, three sets of 10 repetitions of alternate leg bounding with an additional load of 10% of body mass; (c) free sprint, two 20-m sprints; and (d) resisted sprint (RS), two 20-m resisted sprints using an elastic tubing tool. We obtained session ratings of perceived exertion (SRPE) immediately after each conditioning protocol. There were no significant differences among the three experimental conditioning activities in 100-m sprint time, but the RS protocol improved 100-m sprint time compared with the control (no conditioning) protocol (p < .001). The RS protocol also led to greater sprint velocity and higher SRPE compared with the control condition (p < .01). There was no significant association between SRPE and 100-m performance (p = .77, r = .05). These results suggest that elastic tubing resisted-sprint warm-up activities prior to the 100-m dash can benefit young male track and field athletes.

  14. Semi-Autonomous Small Unmanned Aircraft Systems for Sampling Tornadic Supercell Thunderstorms

    NASA Astrophysics Data System (ADS)

    Elston, Jack S.

    This work describes the development of a network-centric unmanned aircraft system (UAS) for in situ sampling of supercell thunderstorms. UAS have been identified as a well-suited platform for meteorological observations given their portability, endurance, and ability to mitigate atmospheric disturbances. They represent a unique tool for performing targeted sampling in regions of a supercell thunderstorm previously unreachable through other methods. Doppler radar can provide unique measurements of the wind field in and around supercell thunderstorms. In order to exploit this capability, a planner was developed that can optimize ingress trajectories for severe storm penetration. The resulting trajectories were examined to determine the feasibility of such a mission, and to optimize ingress in terms of flight time and exposure to precipitation. A network-centric architecture was developed to handle the large amount of distributed data produced during a storm sampling mission. Creation of this architecture was performed through a bottom-up design approach which reflects and enhances the interplay between networked communication and autonomous aircraft operation. The advantages of the approach are demonstrated through several field and hardware-in-the-loop experiments containing different hardware, networking protocols, and objectives. Results are provided from field experiments involving the resulting network-centric architecture. An airmass boundary was sampled in the Collaborative Colorado Nebraska Unmanned Aircraft Experiment (CoCoNUE). Utilizing lessons learned from CoCoNUE, a new concept of operations (CONOPS) and UAS were developed to perform in situ sampling of supercell thunderstorms. Deployment during the Verification of the Origins of Rotation in Tornadoes Experiment 2 (VORTEX2) resulted in the first ever sampling of the airmass associated with the rear flank downdraft of a tornadic supercell thunderstorm by a UAS. Hardware-in-the-loop simulation capability was added to the UAS to enable further assessment of the system and CONOPS. The simulation combines a full six degree-of-freedom aircraft dynamic model with wind and precipitation data from simulations of severe convective storms. Interfaces were written to involve as much of the system's field hardware as possible, including the creation of a simulated radar product server. A variety of simulations were conducted to evaluate different aspects of the CONOPS used for the 2010 VORTEX2 field campaign.

  15. Quantum Sensors for the Generating Functional of Interacting Quantum Field Theories

    NASA Astrophysics Data System (ADS)

    Bermudez, A.; Aarts, G.; Müller, M.

    2017-10-01

    Difficult problems described in terms of interacting quantum fields evolving in real time or out of equilibrium abound in condensed-matter and high-energy physics. Addressing such problems via controlled experiments in atomic, molecular, and optical physics would be a breakthrough in the field of quantum simulations. In this work, we present a quantum-sensing protocol to measure the generating functional of an interacting quantum field theory and, with it, all the relevant information about its in- or out-of-equilibrium phenomena. Our protocol can be understood as a collective interferometric scheme based on a generalization of the notion of Schwinger sources in quantum field theories, which make it possible to probe the generating functional. We show that our scheme can be realized in crystals of trapped ions acting as analog quantum simulators of self-interacting scalar quantum field theories.
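
    For reference, the generating functional probed by such a protocol is the standard textbook object for a scalar field with a classical source J (written here in path-integral form; the paper's contribution is the trapped-ion scheme for measuring it, not this definition):

        Z[J] = \int \mathcal{D}\phi \, \exp\!\Big( i S[\phi] + i \int d^{d}x \, J(x)\,\phi(x) \Big),
        \qquad
        \langle T\,\phi(x_1)\cdots\phi(x_n) \rangle
            = \frac{1}{Z[0]} \,
              \frac{\delta^{n} Z[J]}{\,i\,\delta J(x_1)\cdots i\,\delta J(x_n)\,} \bigg|_{J=0},

    so that access to Z[J] for arbitrary sources yields all n-point correlation functions of the theory.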

  16. Coexistence of WiFi and WiMAX systems based on PS-request protocols.

    PubMed

    Kim, Jongwoo; Park, Suwon; Rhee, Seung Hyong; Choi, Yong-Hoon; Chung, Young-uk; Hwang, Ho Young

    2011-01-01

    We introduce both the coexistence zone within the WiMAX frame structure and a PS-Request protocol for the coexistence of WiFi and WiMAX systems sharing a frequency band. Because the original PS-Request protocol has known drawbacks, we propose a revised PS-Request protocol to improve its performance. Both PS-Request protocols are based on time division operation (TDO) of the WiFi and WiMAX systems to avoid mutual interference, and they use the vestigial power management (PwrMgt) bit within the Frame Control field of the frames transmitted by a WiFi AP. The performance of the revised PS-Request protocol is evaluated by computer simulation and compared to the case without a coexistence protocol and to the original PS-Request protocol.
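
    The coexistence logic itself is not reproduced here, but as a small, hedged illustration of the bit the protocol reuses, the sketch below sets or clears the Power Management flag inside a 2-byte 802.11 Frame Control field (bit 4 of the flags byte in the standard layout):

        # Power Management is bit 4 of the flags (second) byte of the 2-byte
        # 802.11 Frame Control field in the standard layout.
        PWR_MGT_BIT = 1 << 4

        def set_pwr_mgt(frame_control: bytes, enabled: bool) -> bytes:
            """Return a copy of a 2-byte Frame Control field with PwrMgt set or cleared."""
            assert len(frame_control) == 2
            first_byte, flags = frame_control[0], frame_control[1]
            flags = flags | PWR_MGT_BIT if enabled else flags & ~PWR_MGT_BIT
            return bytes([first_byte, flags])

        # Example: a data frame (0x08 0x00) with the PwrMgt flag raised, the bit the
        # PS-Request schemes reuse to quiet WiFi stations during WiMAX transmissions.
        print(set_pwr_mgt(b"\x08\x00", True).hex())  # -> "0810"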

  17. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which validated the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
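
    As a rough sketch of the purge-until-stable idea behind such automated monitoring (this is not the Robowell implementation; the tolerances and the read_sensors() callback are illustrative assumptions), one might write:

        import time

        # Illustrative stabilization tolerances (assumed, not the USGS values).
        TOLERANCES = {"pH": 0.1, "spec_cond": 0.03, "temp_C": 0.2, "DO_mgL": 0.3}

        def stable(prev, curr):
            """True when every monitored parameter changed by less than its tolerance
            (specific conductance is compared as a relative change)."""
            for key, tol in TOLERANCES.items():
                delta = abs(curr[key] - prev[key])
                if key == "spec_cond":
                    if prev[key] and delta / abs(prev[key]) > tol:
                        return False
                elif delta > tol:
                    return False
            return True

        def purge_until_stable(read_sensors, max_minutes=60, interval_s=60):
            """Keep purging (and polling the hypothetical read_sensors() callback)
            until successive readings stabilize, then report the well as ready."""
            prev = read_sensors()
            for _ in range(int(max_minutes * 60 / interval_s)):
                time.sleep(interval_s)
                curr = read_sensors()
                if stable(prev, curr):
                    return curr  # well is ready to sample
                prev = curr
            raise RuntimeError("parameters did not stabilize within the purge window")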

  18. Intercomparison of thermal-optical method with different temperature protocols: Implications from source samples and solvent extraction

    NASA Astrophysics Data System (ADS)

    Cheng, Yuan; Duan, Feng-kui; He, Ke-bin; Du, Zhen-yu; Zheng, Mei; Ma, Yong-liang

    2012-12-01

    Three temperature protocols with different peak inert mode temperatures (Tpeak-inert) were compared based on source and ambient samples (both untreated and extracted using a mixture of hexane, methylene chloride, and acetone) collected in Beijing, China. The ratio of EC580 (elemental carbon measured by the protocol with a Tpeak-inert of 580 °C; similar hereinafter) to EC850 could be as high as 4.8 for biomass smoke samples, whereas the ratio was about 1.0 for diesel and gasoline exhaust samples. The EC580 to EC850 ratio averaged 1.95 ± 0.89 and 1.13 ± 0.20 for the untreated and extracted ambient samples, whereas the EC580 to EC650 ratio of ambient samples was 1.22 ± 0.10 and 1.20 ± 0.12 before and after extraction. We suggest that there are two competing mechanisms for the effects of Tpeak-inert on the EC results: when Tpeak-inert is increased, one mechanism tends to decrease EC by increasing the amount of charring, whereas the other tends to increase EC by promoting more charring to evolve before native EC. Results from this study showed that EC does not always decrease when the peak inert mode temperature is increased. Moreover, reducing the amount of charring could improve agreement between protocols on EC measurements, whereas the temperature protocol would not influence the EC results if no charring is formed. This study also demonstrated the benefits of allowing the OC and EC split to occur in the inert mode when a high Tpeak-inert is used (e.g., 850 °C).

  19. Thermal/optical methods for elemental carbon quantification in soils and urban dusts: equivalence of different analysis protocols.

    PubMed

    Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng

    2013-01-01

    Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences of crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of samples has never been examined. In this study, urban street dust and soil samples from Baoji, China were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were found always higher and less sensitive to temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86) while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within filter increases ECR while decreases ECT from the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.

  20. Soil sampling and analytical strategies for mapping fallout in nuclear emergencies based on the Fukushima Dai-ichi Nuclear Power Plant accident.

    PubMed

    Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long

    2015-01-01

    The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and (131)I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed at evaluating the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating the soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required for reducing measurement uncertainty in the emergency soil sampling protocol proposed here.
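
    The effect of taking several cores can be illustrated with the usual standard-error argument (the activity values below are invented for the example): the uncertainty of the composite mean shrinks roughly as 1/sqrt(n).

        import math
        import statistics

        # Hypothetical activities of five individual cores (Bq/kg); invented numbers.
        cores_Bq_per_kg = [5200, 6100, 4800, 5700, 5500]
        n = len(cores_Bq_per_kg)
        mean = statistics.mean(cores_Bq_per_kg)
        se = statistics.stdev(cores_Bq_per_kg) / math.sqrt(n)
        print(f"mean = {mean:.0f} Bq/kg, standard error with n = {n}: {se:.0f} Bq/kg")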

  1. SeaWiFS technical report series. Volume 25: Ocean optics protocols for SeaWiFS validation, revision 1

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Mueller, James L.; Austin, Roswell W.

    1995-01-01

    This report presents protocols for measuring optical properties, and other environmental variables, to validate the radiometric performance of the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), and to develop and validate bio-optical algorithms for use with SeaWiFS data. The protocols are intended to establish foundations for a measurement strategy to verify the challenging SeaWiFS uncertainty goals of 5 percent in water-leaving radiances and 35 percent in chlorophyll a concentration. The protocols first specify the variables which must be measured, and briefly review the rationale for measuring each variable. Subsequent chapters cover detailed protocols for instrument performance specifications, characterizing and calibrating instruments, methods of making measurements in the field, and methods of data analysis. These protocols were developed at a workshop sponsored by the SeaWiFS Project Office (SPO) and held at the Naval Postgraduate School in Monterey, California (9-12 April 1991). This report began as the proceedings of the workshop, as interpreted and expanded by the authors and reviewed by workshop participants and other members of the bio-optical research community. The protocols are an evolving prescription to allow the research community to approach the unprecedented measurement uncertainties implied by the SeaWiFS goals; research and development are needed to improve the state-of-the-art in specific areas. These protocols should be periodically revised to reflect technical advances during the SeaWiFS Project cycle. The present edition (Revision 1) incorporates new protocols in several areas, including expanded protocol descriptions for Case-2 waters and other improvements, as contributed by several members of the SeaWiFS Science Team.

  2. Flow cytometry for enrichment and titration in massively parallel DNA sequencing

    PubMed Central

    Sandberg, Julia; Ståhl, Patrik L.; Ahmadian, Afshin; Bjursell, Magnus K.; Lundeberg, Joakim

    2009-01-01

    Massively parallel DNA sequencing is revolutionizing genomics research throughout the life sciences. However, the reagent costs and labor requirements in current sequencing protocols are still substantial, although improvements are continuously being made. Here, we demonstrate an effective alternative to existing sample titration protocols for the Roche/454 system using Fluorescence Activated Cell Sorting (FACS) technology to determine the optimal DNA-to-bead ratio prior to large-scale sequencing. Our method, which eliminates the need for the costly pilot sequencing of samples during titration, is capable of rapidly providing accurate DNA-to-bead ratios that are not biased by the quantification and sedimentation steps included in current protocols. Moreover, we demonstrate that FACS sorting can be readily used to highly enrich fractions of beads carrying template DNA, with near total elimination of empty beads and no downstream sacrifice of DNA sequencing quality. Automated enrichment by FACS is a simple approach to obtain pure samples for bead-based sequencing systems, and offers an efficient, low-cost alternative to current enrichment protocols. PMID:19304748

  3. Feasibility of Providing Safe Mouth Care and Collecting Oral and Fecal Microbiome Samples from Nursing Home Residents with Dysphagia: Proof of Concept Study.

    PubMed

    Jablonski, Rita A; Winstead, Vicki; Azuero, Andres; Ptacek, Travis; Jones-Townsend, Corteza; Byrd, Elizabeth; Geisinger, Maria L; Morrow, Casey

    2017-09-01

    Individuals with dysphagia who reside in nursing homes often receive inadequate mouth care and experience poor oral health. From a policy perspective, the absence of evidence-based mouth care protocols, coupled with insufficient dental coverage, creates a pool of individuals at great risk for preventable infectious illnesses that contribute to high health care costs. The purpose of the current study was to determine (a) the safety of a mouth care protocol tailored for individuals with dysphagia residing in nursing homes without access to suction equipment, and (b) the feasibility of collecting oral and fecal samples for microbiota analyses. The mouth care protocol resulted in improved oral hygiene without aspiration, and oral and fecal samples were safely collected from participants. Policies supporting ongoing testing of evidence-based mouth care protocols for individuals with dysphagia are important to improve quality, demonstrate efficacy, and save health care costs. [Journal of Gerontological Nursing, 43(9), 9-15.].

  4. CONCEPTS AND APPROACHES FOR THE ...

    EPA Pesticide Factsheets

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provides information on development of monitoring designs to address certain types of environmental questions and approaches for documenting and reporting data quality and performance characteristics for large river biological monitoring. The approaches presented are not intended to replace existing program components but may in some cases be useful for refining them. The goal of this research is to develop methods and indicators that are useful for evaluating the condition of aquatic communities, for assessing the restoration of aquatic communities in response to mitigation and best management practices, and for determining the exposure of aquatic communities to different classes of stressors (i.e., pesticides, sedimentation, habitat alteration).

  5. METHOD FOR MICRORNA ISOLATION FROM CLINICAL SERUM SAMPLES

    PubMed Central

    Li, Yu; Kowdley, Kris V.

    2012-01-01

    MicroRNAs are a group of intracellular non-coding RNA molecules that have been implicated in a variety of human diseases. Due to their high stability in blood, microRNAs released into circulation could potentially be utilized as non-invasive biomarkers for diagnosis or prognosis. Current microRNA isolation protocols are specifically designed for solid tissues and are impractical for biomarker development utilizing small-volume serum samples on a large scale. Thus, a protocol for microRNA isolation from serum is needed to accommodate these conditions in biomarker development. To establish such a protocol, we developed a simplified approach to normalize sample input by using a single synthetic spike-in microRNA. We evaluated three commonly used commercial microRNA isolation kits for the best performance by comparing RNA quality and yield. The manufacturer’s protocol was further modified to improve the microRNA yield from 200 μL of human serum. MicroRNAs isolated from a large set of clinical serum samples were tested on the miRCURY LNA real-time PCR panel and confirmed to be suitable for high-throughput microRNA profiling. In conclusion, we have established a proven method for microRNA isolation from clinical serum samples suitable for microRNA biomarker development. PMID:22982505

  6. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
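
    As a numerical illustration of the two sampling models (a toy calculation, not the paper's finite-key bound), the snippet below compares the tail probability of the observed error count under simple random sampling (hypergeometric) and under Bernoulli sampling (binomial), for made-up parameter values:

        from scipy.stats import hypergeom, binom

        N = 10_000   # total signals sent
        K = 500      # signals that would register an error (unknown in practice)
        n = 1_000    # size of the test sample
        k = 70       # threshold on the number of errors observed in the sample

        # Simple random sampling of exactly n positions: hypergeometric tail.
        p_srs = hypergeom.sf(k, N, K, n)
        # Bernoulli sampling: each position joins the test set independently with
        # probability n/N, so the sampled error count is Binomial(K, n/N).
        p_bern = binom.sf(k, K, n / N)
        print(f"P(errors > {k})  simple random sampling: {p_srs:.3e}   Bernoulli: {p_bern:.3e}")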

  7. Protocol for Monitoring Fish Assemblages in Pacific Northwest National Parks

    USGS Publications Warehouse

    Brenkman, Samuel J.; Connolly, Patrick J.

    2008-01-01

    Rivers and streams that drain from Olympic, Mount Rainier, and North Cascades National Parks are among the most protected corridors in the lower 48 States, and represent some of the largest tracts of contiguous, undisturbed habitat throughout the range of several key fish species of the Pacific Northwest. These watersheds are of high regional importance as freshwater habitat sanctuaries for native fish, where habitat conditions are characterized as having little to no disturbance from development, channelization, impervious surfaces, roads, diversions, or hydroelectric projects. Fishery resources are of high ecological and cultural importance in Pacific Northwest National Parks, and significantly contribute to economically important recreational, commercial, and tribal fisheries. This protocol describes procedures to monitor trends in fish assemblages, fish abundance, and water temperature in eight rivers and five wadeable streams in Olympic National Park during summer months, and is based on 4 years of field testing. Fish assemblages link freshwater, marine, and terrestrial ecosystems. They also serve as focal resources of national parks and are excellent indicators of ecological conditions of rivers and streams. Despite the vital importance of native anadromous and resident fish populations, there is no existing monitoring program for fish assemblages in the North Coast and Cascades Network. Specific monitoring objectives of this protocol are to determine seasonal and annual trends in: (1) fish species composition, (2) timing of migration of adult fish, (3) relative abundance, (4) age and size structure, (5) extent of non-native and hatchery fish, and (6) water temperature. To detect seasonal and annual trends in fish assemblages in reference sites, we rely on repeated and consistent annual sampling at each monitoring site. The general rationale for the repeated sampling of reference sites is to ensure that we account for the high interannual variability in fish movements and abundances in rivers. One underlying assumption is that the monitoring program is designed in perpetuity, and consequently our capability to detect trends substantially increases with time. The protocol describes sampling designs, methods, training procedures, safety considerations, data management, data analysis, and reporting. The allocation of sampling effort represents a balance between ecological considerations, a sound monitoring approach, and practical limitations caused by logistical constraints and a limited annual budget of $55,000. The widespread declines of native fish species in western North America highlight the importance and urgency of understanding trends in fish assemblages from undisturbed habitats. Seasonal and annual trends in fish assemblages will provide insights at the individual, population, and assemblage level. This protocol will allow managers to detect increases and decreases in abundance of priority management species, and occurrence of non-native, hatchery, and federally listed fish. The detection of trends in fish assemblages will allow for specific management actions that may include: implementation of more appropriate fishing regulations, evaluation of existing hatchery releases, control of non-native fish species, and prioritization of habitat restoration projects.
Dissemination and communication of scientific findings on North Coast and Cascades Network fish assemblages will be a core product of this protocol, which will have much relevance to decision makers, park visitors, researchers, and educators.

  8. An Examination of the Design, Development, and Implementation of an Internet Protocol Version 6 Network: The ADTRAN Inc. Case Study

    ERIC Educational Resources Information Center

    Perigo, Levi

    2013-01-01

    In this dissertation, the author examined the capabilities of Internet Protocol version 6 (IPv6) in regard to replacing Internet Protocol version 4 (IPv4) as the internetworking technology for Medium-sized Businesses (MBs) in the Information Systems (IS) field. Transition to IPv6 is inevitable, and, thus, organizations are adopting this protocol…

  9. Field-based evaluation of two herbaceous plant community composition sampling methods for long-term monitoring in Northern Great Plains National Parks

    USGS Publications Warehouse

    Symstad, Amy J.; Wienk, Cody L.; Thorstenson, Andy

    2006-01-01

    The Northern Great Plains Inventory & Monitoring (I&M) Network (Network) of the National Park Service (NPS) consists of 13 NPS units in North Dakota, South Dakota, Nebraska, and eastern Wyoming. The Network is in the planning phase of a long-term program to monitor the health of park ecosystems. Plant community composition is one of the 'Vital Signs,' or indicators, that will be monitored as part of this program for three main reasons. First, plant community composition is information-rich; a single sampling protocol can provide information on the diversity of native and non-native species, the abundance of individual dominant species, and the abundance of groups of plants. Second, plant community composition is of specific management concern. The abundance and diversity of exotic plants, both absolute and relative to native species, is one of the greatest management concerns in almost all Network parks (Symstad 2004). Finally, plant community composition reflects the effects of a variety of current or anticipated stressors on ecosystem health in the Network parks including invasive exotic plants, large ungulate grazing, lack of fire in a fire-adapted system, chemical exotic plant control, nitrogen deposition, increased atmospheric carbon dioxide concentrations, and climate change. Before the Network begins its Vital Signs monitoring, a detailed plan describing specific protocols used for each of the Vital Signs must go through rigorous development and review. The pilot study on which we report here is one of the components of this protocol development. The goal of the work we report on here was to determine a specific method to use for monitoring plant community composition of the herb layer (< 2 m tall).

  10. A semi-nested real-time PCR method to detect low chimerism percentage in small quantity of hematopoietic stem cell transplant DNA samples.

    PubMed

    Aloisio, Michelangelo; Bortot, Barbara; Gandin, Ilaria; Severini, Giovanni Maria; Athanasakis, Emmanouil

    2017-02-01

    Chimerism status evaluation of post-allogeneic hematopoietic stem cell transplantation samples is essential to predict post-transplant relapse. The most commonly used technique capable of detecting small increments of chimerism is quantitative real-time PCR. Although this method is already used in several laboratories, previously described protocols often lack sensitivity, and the amount of DNA required for each chimerism analysis is too high. In the present study, we compared a novel semi-nested allele-specific real-time PCR (sNAS-qPCR) protocol with our in-house standard allele-specific real-time PCR (gAS-qPCR) protocol. We selected two genetic markers and analyzed technical parameters (slope, y-intercept, R2, and standard deviation) useful for assessing the performance of the two protocols. The sNAS-qPCR protocol showed better sensitivity and precision. Moreover, the sNAS-qPCR protocol requires, as input, only 10 ng of DNA, at least 10-fold less than the gAS-qPCR protocols described in the literature. Finally, the proposed sNAS-qPCR protocol could prove very useful for performing chimerism analysis with a small amount of DNA, as in the case of blood cell subsets.
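
    For context, the standard-curve slope mentioned above is conventionally converted into an amplification efficiency using the general qPCR relation (not specific to this study):

        E = 10^{-1/\mathrm{slope}} - 1,

    so a slope of about -3.32 corresponds to an efficiency close to 100%.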

  11. New archaeomagnetic data recovered from the study of celtiberic remains from central Spain (Numantia and Ciadueña, 3rd-1st centuries BC). Implications on the fidelity of the Iberian paleointensity database

    NASA Astrophysics Data System (ADS)

    Osete, M. L.; Chauvin, A.; Catanzariti, G.; Jimeno, A.; Campuzano, S. A.; Benito-Batanero, J. P.; Tabernero-Galán, C.; Roperch, P.

    2016-11-01

    Variations of the geomagnetic field in the Iberian Peninsula prior to Roman times are poorly constrained. Here we report new archaeomagnetic results from four ceramic collections and two combustion structures recovered in two pre-Roman (Celtiberian) archaeological sites in central Spain. The studied materials have been dated by archaeological evidence, supported by five radiocarbon dates. Rock magnetic experiments indicate that the characteristic remanent magnetization (ChRM) is carried by a low coercivity magnetic phase with Curie temperatures of 530-575 °C, most likely Ti-poor titanomagnetite/titanomaghemite. Archaeointensity determinations were carried out using the classical Thellier-Thellier protocol, including tests and corrections for magnetic anisotropy and cooling rate dependency. Two types of magnetic behaviour were observed during the laboratory treatment. Black potsherds and poorly heated samples from the kilns presented two magnetization components, alterations, or curved Arai plots and were therefore rejected. In contrast, well-heated specimens (red ceramic fragments and well-heated samples from the kilns) showed a single, well-defined component of magnetization going through the origin and linear Arai plots, providing successful archaeointensity determinations. The effect of anisotropy of the thermoremanent magnetization (ATRM) on paleointensity analysis was systematically investigated, yielding very high ATRM corrections on fine pottery specimens. In some cases, differences between the uncorrected and ATRM-corrected paleointensity values reached up to 86%. The mean intensity values obtained from the three selected sets of samples were 64.3 ± 5.8 μT, 56.8 ± 3.8 μT, and 56.7 ± 4.6 μT (NUS2, CI2 and CIA, respectively), which contribute to a better understanding of the evolution of the palaeofield intensity in central Iberia during the 3rd-1st centuries BC. The direction of the field in the first century BC has also been determined from oriented samples from the CIA kilns (D = 357.2°; I = 62.2°; N = 10, α95 = 2.7°). The new archaeointensity data disagree with previous results from Iberian ceramics, which were not corrected for the ATRM effect. On the contrary, they are in agreement with the most recent French paleointensity curve and the latest European intensity model, both based on a selection of high-quality paleointensity data. This result reinforces the idea that the puzzling scatter often observed in the global paleointensity database is likely due to differences in the laboratory protocols. Further data from well-established laboratory protocols are still necessary to confidently delineate the evolution of the geomagnetic palaeofield during the first millennium BC.
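
    For reference, in Thellier-type experiments of this kind the ancient field intensity follows from the slope of the linear Arai segment through the standard relation

        B_{\mathrm{anc}} = \left| \frac{\Delta\,\mathrm{NRM}}{\Delta\,\mathrm{pTRM}} \right| B_{\mathrm{lab}},

    with the anisotropy (ATRM) and cooling-rate corrections then applied as multiplicative factors to this raw estimate, which is why the corrected and uncorrected values quoted above can differ so strongly.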

  12. Saccadic vector optokinetic perimetry in children with neurodisability or isolated visual pathway lesions: observational cohort study.

    PubMed

    Tailor, Vijay; Glaze, Selina; Unwin, Hilary; Bowman, Richard; Thompson, Graham; Dahlmann-Noor, Annegret

    2016-10-01

    Children and adults with neurological impairments are often not able to access conventional perimetry; however, information about the visual field is valuable. A new technology, saccadic vector optokinetic perimetry (SVOP), may have improved accessibility, but its accuracy has not been evaluated. We aimed to explore accessibility, testability and accuracy of SVOP in children with neurodisability or isolated visual pathway deficits. Cohort study; recruitment October 2013-May 2014, at children's eye clinics at a tertiary referral centre and a regional Child Development Centre; full orthoptic assessment, SVOP (central 30° of the visual field) and confrontation visual fields (CVF). Group 1: age 1-16 years, neurodisability (n=16), group 2: age 10-16 years, confirmed or suspected visual field defect (n=21); group 2 also completed Goldmann visual field testing (GVFT). Group 1: testability with a full 40-point test protocol is 12.5%; with reduced test protocols, testability is 100%, but plots may be clinically meaningless. Children (44%) and parents/carers (62.5%) find the test easy. SVOP and CVF agree in 50%. Group 2: testability is 62% for the 40-point protocol, and 90.5% for reduced protocols. Corneal changes in childhood glaucoma interfere with SVOP testing. All children and parents/carers find SVOP easy. Overall agreement with GVFT is 64.7%. While SVOP is highly accessible to children, many cannot complete a full 40-point test. Agreement with current standard tests is moderate to poor. Abnormal saccades cause an apparent non-specific visual field defect. In children with glaucoma or nystagmus SVOP calibration often fails.

  13. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    PubMed

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.

  14. A quarantine protocol for analysis of returned extraterrestrial samples

    NASA Technical Reports Server (NTRS)

    Bagby, J. R.; Sweet, H. C.; Devincenzi, D. L.

    1983-01-01

    A protocol is presented for the analysis, at an earth-orbiting quarantine facility, of returned samples of extraterrestrial material that might contain (nonterrestrial) life forms. The protocol consists of a series of tests designed to determine whether the sample, conceptualized as a 1-kg sample of Martian soil, is free from nonterrestrial biologically active agents and so may safely be sent to a terrestrial containment facility, or whether it exhibits biological activity requiring further (second-order) testing outside the biosphere. The first-order testing procedure seeks to detect the presence of any replicating organisms or toxic substances through a series of experiments including gas sampling, analysis of radioactivity, stereomicroscopic inspection, chemical analysis, microscopic examination, the search for metabolic products under growth conditions, microbiological assays, and the challenge of cultured cells with any agents found or with the extraterrestrial material as is. Detailed plans for the second-order testing would be developed in response to the actual data received from primary testing.

  15. Counting at low concentrations: the statistical challenges of verifying ballast water discharge standards

    USGS Publications Warehouse

    Frazier, Melanie; Miller, A. Whitman; Lee, Henry; Reusser, Deborah A.

    2013-01-01

    Discharge from the ballast tanks of ships is one of the primary vectors of nonindigenous species in marine environments. To mitigate this environmental and economic threat, international, national, and state entities are establishing regulations to limit the concentration of living organisms that may be discharged from the ballast tanks of ships. The proposed discharge standards have ranged from zero detectable organisms to 3. If standard sampling methods are used, verifying whether ballast discharge complies with these stringent standards will be challenging due to the inherent stochasticity of sampling. Furthermore, at low concentrations, very large volumes of water must be sampled to find enough organisms to accurately estimate concentration. Despite these challenges, adequate sampling protocols comprise a critical aspect of establishing standards because they help define the actual risk level associated with a standard. A standard that appears very stringent may be effectively lax if it is paired with an inadequate sampling protocol. We describe some of the statistical issues associated with sampling at low concentrations to help regulators understand the uncertainties of sampling as well as to inform the development of sampling protocols that ensure discharge standards are adequately implemented.
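    The core statistical difficulty described above can be illustrated with a short calculation. The sketch below is not from the paper; it assumes organisms are randomly (Poisson) distributed in the discharge, so the chance of counting zero organisms in a sampled volume V at true concentration c is exp(-c*V). The concentrations and volumes used are purely illustrative.

    ```python
    from math import exp

    def prob_zero_detected(concentration_per_m3: float, sampled_volume_m3: float) -> float:
        """Probability of counting zero organisms in the sampled volume,
        assuming organisms are randomly (Poisson) distributed in the discharge."""
        expected_count = concentration_per_m3 * sampled_volume_m3
        return exp(-expected_count)

    # Example: discharge truly at 10 organisms/m^3, but only 0.1 m^3 sampled:
    # the expected count is 1, and zero organisms are still observed ~37% of the time,
    # so the discharge could wrongly appear compliant with a "zero detected" standard.
    print(prob_zero_detected(10, 0.1))   # ~0.368
    print(prob_zero_detected(10, 1.0))   # ~4.5e-5 -> larger sampled volumes shrink this risk
    ```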

  16. Preparation for frontline end-of-life care: exploring the perspectives of paramedics and emergency medical technicians.

    PubMed

    Waldrop, Deborah P; Clemency, Brian; Maguin, Eugene; Lindstrom, Heather

    2014-03-01

    Prehospital emergency providers (emergency medical technicians [EMTs] and paramedics) who respond to emergency calls for patients near the end of life (EOL) make critical decisions in the field about initiating care and transport to an emergency department. The study aimed to identify how a sample of prehospital providers learned about EOL care, their perceived confidence with such calls, and their perspectives on improved preparation for them. This descriptive study used a cross-sectional survey design with mixed methods. One hundred seventy-eight prehospital providers (76 EMT-basics and 102 paramedics) from an emergency medical services agency participated. Multiple choice and open-ended survey questions addressed how they learned about EOL calls, their confidence with advance directives, and perspectives on improving care in the field. The response rate was 86%. Education about do-not-resuscitate (DNR) orders was formal (92%), experiential (77%), and self-directed (38%). Education about medical orders for life-sustaining treatment (MOLST) was formal (72%), experiential (67%), and self-directed (25%). Ninety-three percent were confident in upholding a DNR order, 87% were confident interpreting MOLST, and 87% were confident sorting out conflict between differing patient and family wishes. Qualitative data analysis yielded six themes on improving preparation of prehospital providers for EOL calls: (1) prehospital provider education; (2) public education; (3) educating health care providers on scope of practice; (4) conflict resolution skills; (5) handling emotional families; and (6) clarification of transfer protocols. These study results suggest the need for addressing the potential interrelationship between prehospital and EOL care through improved education and protocols for care in the field.

  17. Feasibility and acceptability of the DSM-5 Field Trial procedures in the Johns Hopkins Community Psychiatry Programs.

    PubMed

    Clarke, Diana E; Wilcox, Holly C; Miller, Leslie; Cullen, Bernadette; Gerring, Joan; Greiner, Lisa H; Newcomer, Alison; McKitty, Mellisha V; Regier, Darrel A; Narrow, William E

    2014-06-01

    The Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5) contains criteria for psychiatric diagnoses that reflect advances in the science and conceptualization of mental disorders and address the needs of clinicians. DSM-5 also recommends research on dimensional measures of cross-cutting symptoms and diagnostic severity, which are expected to better capture patients' experiences with mental disorders. Prior to its May 2013 release, the American Psychiatric Association (APA) conducted field trials to examine the feasibility, clinical utility, reliability, and where possible, the validity of proposed DSM-5 diagnostic criteria and dimensional measures. The methods and measures proposed for the DSM-5 field trials were pilot tested in adult and child/adolescent clinical samples, with the goal to identify and correct design and procedural problems with the proposed methods before resources were expended for the larger DSM-5 Field Trials. Results allowed for the refinement of the protocols, procedures, and measures, which facilitated recruitment, implementation, and completion of the DSM-5 Field Trials. These results highlight the benefits of pilot studies in planning large multisite studies. Copyright © 2013, American Psychiatric Association. All rights reserved.

  18. Fragmentation modeling of a resin bonded sand

    NASA Astrophysics Data System (ADS)

    Hilth, William; Ryckelynck, David

    2017-06-01

    Cemented sands exhibit a complex mechanical behavior that can lead to sophisticated models with numerous parameters lacking real physical meaning. However, using a rather simple generalized critical-state bonded-soil model has proven to be a relevant compromise between easy calibration and good results. The constitutive model adopts a non-associated elasto-plastic formulation within the critical-state framework. The calibration procedure, using standard laboratory tests, is complemented by the study of a uniaxial compression test observed by tomography. Using finite element simulations, this test is simulated as a non-homogeneous 3D medium. Tomography of the compression sample gives access to 3D displacement fields via image correlation techniques. Unfortunately, these fields contain missing experimental data because of the low resolution of the correlations at small displacement magnitudes. We propose a recovery method that reconstructs full 3D displacement fields and 2D boundary displacement fields. These fields are required for the calibration of the constitutive parameters using 3D finite element simulations. The proposed recovery technique is based on a singular value decomposition of the available experimental data. This calibration protocol enables an accurate prediction of the fragmentation of the specimen.
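    As an illustration of the kind of SVD-based gap filling the abstract refers to, the sketch below applies a generic truncated-SVD (low-rank) completion to a snapshot matrix with missing entries. It is a simplified stand-in under that assumption, not the authors' recovery method; the matrix sizes and rank are arbitrary.

    ```python
    import numpy as np

    def svd_gap_fill(snapshots, mask, rank=2, n_iter=100):
        """Fill missing entries of a snapshot matrix by iterating truncated-SVD
        projections: missing values start at zero, the matrix is projected onto its
        leading singular vectors, and measured entries are re-imposed each pass.
        A generic low-rank completion sketch, not the paper's exact recovery method."""
        filled = np.where(mask, snapshots, 0.0)
        for _ in range(n_iter):
            U, s, Vt = np.linalg.svd(filled, full_matrices=False)
            low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
            filled = np.where(mask, snapshots, low_rank)   # keep measured values
        return filled

    # Toy usage: a rank-2 synthetic "displacement" matrix with ~30% of entries missing.
    rng = np.random.default_rng(0)
    truth = rng.normal(size=(40, 2)) @ rng.normal(size=(2, 25))
    mask = rng.random(truth.shape) > 0.3
    recovered = svd_gap_fill(truth, mask)
    print(np.abs(recovered - truth)[~mask].max())   # reconstruction error on the missing entries (small for this low-rank example)
    ```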

  19. Operational Procedures for Collecting Water-Quality Samples at Monitoring Sites on Maple Creek Near Nickerson and the Platte River at Louisville, Eastern Nebraska

    USGS Publications Warehouse

    Johnson, Steven M.; Swanson, Robert B.

    1994-01-01

    Prototype stream-monitoring sites were operated during part of 1992 in the Central Nebraska Basins (CNBR) and three other study areas of the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey. Results from the prototype project provide information needed to operate a network of intensive fixed-station stream-monitoring sites. This report evaluates operating procedures for two NAWQA prototype sites at Maple Creek near Nickerson and the Platte River at Louisville, eastern Nebraska. Each site was sampled intensively in the spring and late summer of 1992, with less intensive sampling in midsummer. In addition, multiple samples were collected during two high-flow periods at the Maple Creek site--one early and the other late in the growing season. Water-sample analyses included determination of pesticides, nutrients, major ions, suspended sediment, and measurements of physical properties. Equipment and protocols for the water-quality sampling procedures were evaluated. Operation of the prototype stream-monitoring sites included development and comparison of onsite and laboratory sample-processing procedures. Onsite processing was labor intensive but allowed for immediate preservation of all sampled constituents. Laboratory processing required less field labor and decreased the risk of contamination, but allowed for no immediate preservation of the samples.

  20. Strain-specific quantification of root colonization by plant growth promoting rhizobacteria Bacillus firmus I-1582 and Bacillus amyloliquefaciens QST713 in non-sterile soil and field conditions.

    PubMed

    Mendis, Hajeewaka C; Thomas, Varghese P; Schwientek, Patrick; Salamzade, Rauf; Chien, Jung-Ting; Waidyarathne, Pramuditha; Kloepper, Joseph; De La Fuente, Leonardo

    2018-01-01

    Bacillus amyloliquefaciens QST713 and B. firmus I-1582 are bacterial strains which are used as active ingredients of the commercially-available soil application and seed treatment products Serenade® and VOTiVO®, respectively. These bacteria colonize plant roots, promoting plant growth and offering protection against pathogens/pests. The objective of this study was to develop a qPCR protocol to quantitate the dynamics of root colonization by these two strains under field conditions. Primers and TaqMan® probes were designed based on genome comparisons of the two strains with publicly-available and unpublished bacterial genomes of the same species. An optimized qPCR protocol was developed to quantify bacterial colonization of corn roots after seed treatment. Treated corn seeds were planted in non-sterile soil in the greenhouse and grown for 28 days. Specific detection of bacteria was quantified weekly, and showed stable colonization between ~10^4-10^5 CFU/g during the experimental period for both bacteria, and the protocol detected as few as 10^3 CFU/g of bacteria on roots. In a separate experiment, streptomycin-resistant QST713 and rifampicin-resistant I-1582 strains were used to compare dilution-plating on TSA with the newly developed qPCR method. Results also indicated that the presence of natural microflora and another inoculated strain does not affect root colonization of either one of these strains. The same qPCR protocol was used to quantitate root colonization by QST713 and I-1582 in two corn and two soybean varieties grown in the field. Both bacteria were quantitated up to two weeks after seeds were planted in the field and there were no significant differences in root colonization among varieties for either bacterial strain. Results presented here confirm that the developed qPCR protocol can be successfully used to understand dynamics of root colonization by these bacteria in plants growing in the growth chamber, greenhouse and field.
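    The back-calculation from qPCR signal to CFU per gram of root typically runs through a standard curve. The sketch below is a generic, hypothetical example of that step: the Cq values, dilution levels, extract fraction and root mass are invented for illustration and are not taken from this study.

    ```python
    import numpy as np

    # Hypothetical standard curve: Cq values measured on serial dilutions of known CFU.
    log10_cfu_std = np.array([3.0, 4.0, 5.0, 6.0, 7.0])        # log10 CFU per reaction
    cq_std        = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # measured Cq (invented)

    slope, intercept = np.polyfit(cq_std, log10_cfu_std, 1)    # log10(CFU) = slope*Cq + intercept
    efficiency = 10 ** (-slope) - 1                            # ~1.0 means ~100% amplification efficiency

    def cfu_per_gram(cq_sample, extract_fraction=0.05, root_mass_g=0.5):
        """Back-calculate CFU per gram of root from a sample Cq, assuming a known
        fraction of the DNA extract is loaded per reaction (both values hypothetical)."""
        cfu_in_reaction = 10 ** (slope * cq_sample + intercept)
        return cfu_in_reaction / extract_fraction / root_mass_g

    print(f"efficiency ~ {efficiency:.2f}")
    print(f"{cfu_per_gram(27.5):.2e} CFU/g")   # a Cq of 27.5 sits between the 10^4 and 10^5 standards
    ```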

  1. Benchmarking of protein carbonylation analysis in Caenorhabditis elegans: specific considerations and general advice.

    PubMed

    Pyr Dit Ruys, S; Bonzom, J-M; Frelon, S

    2016-10-01

    Oxidative stress has been extensively studied due to its correlation with cellular disorders and aging. In proteins, one biomarker of oxidative stress is the presence of carbonyl groups, such as aldehyde and ketone, in specific amino acid side chains such as lysine, proline, arginine and threonine, so-called protein carbonylation (PC). PC study is now a growing field in general and medical science since PC accumulation is associated with various pathologies and disorders. At present, enzyme-linked immunosorbent assays (ELISA) seem to be the most robust method of quantifying the presence of carbonyl groups in proteins, despite having some recognised caveats. In parallel, gel-based approaches present cross-comparison difficulties, along with other technical problems. As generic PC analyses still suffer from poor homogeneity, leading to cross-data analysis difficulties and poor results overlap, the need for harmonisation in the field of carbonyl detection is now widely accepted. This study aims to highlight some of the technical challenges in proteomic gel-based multiplexing experiments when dealing with PC in difficult samples like those from Caenorhabditis elegans, from protein extraction to carbonyl detection. We demonstrate that some critical technical parameters, such as labelling time, probe concentration, and total and carbonylated protein recovery rates, should be re-addressed in a sample-specific way. We also defined a procedure to cost-effectively adapt CyDye™-hydrazide-based protocols to specific samples, especially when the experimental interest is focused on studying differences between stimulating conditions with a maximised signal-to-noise ratio. Moreover, we have improved an already-existing powerful solubilisation buffer, making it potentially useful for hard-to-solubilise protein pellets. Lastly, the depicted methodology exemplifies a simple way of normalising carbonyl-related signal to total protein in SDS-PAGE multiplexing experiments. Within that scope, we also proposed a simple way to quantify carbonyl groups by on-gel spotting diluted dye-containing labelling buffer. Proof of the robustness of the procedure was also highlighted by the high linear correlation between the level of carbonyls and the ultraviolet exposure duration of whole worms (R^2 = 0.993). Altogether, these results will help to standardise existing protocols in the growing field of proteomic carbonylation studies. Copyright © 2016 Elsevier Inc. All rights reserved.
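    A minimal sketch of the normalisation step described above: the carbonyl-channel signal of each lane is divided by its total-protein signal, and the normalised values are regressed against UV exposure time to obtain an R^2 analogous to the one reported. All numbers are illustrative, not the study's densitometry data.

    ```python
    import numpy as np

    # Illustrative densitometry values per gel lane (arbitrary units, not study data).
    uv_exposure_min = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
    carbonyl_signal = np.array([1.1, 2.3, 3.4, 5.9, 11.2])    # CyDye-hydrazide channel
    total_protein   = np.array([10.2, 9.8, 10.1, 9.9, 10.3])  # total-protein channel

    normalised = carbonyl_signal / total_protein              # carbonyl per unit protein

    # Linear fit and R^2, analogous to the exposure-time correlation reported in the study.
    slope, intercept = np.polyfit(uv_exposure_min, normalised, 1)
    predicted = slope * uv_exposure_min + intercept
    ss_res = np.sum((normalised - predicted) ** 2)
    ss_tot = np.sum((normalised - normalised.mean()) ** 2)
    print("R^2 =", round(1 - ss_res / ss_tot, 3))
    ```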

  2. Efficient Genome-Wide Sequencing and Low-Coverage Pedigree Analysis from Noninvasively Collected Samples

    PubMed Central

    Snyder-Mackler, Noah; Majoros, William H.; Yuan, Michael L.; Shaver, Amanda O.; Gordon, Jacob B.; Kopp, Gisela H.; Schlebusch, Stephen A.; Wall, Jeffrey D.; Alberts, Susan C.; Mukherjee, Sayan; Zhou, Xiang; Tung, Jenny

    2016-01-01

    Research on the genetics of natural populations was revolutionized in the 1990s by methods for genotyping noninvasively collected samples. However, these methods have remained largely unchanged for the past 20 years and lag far behind the genomics era. To close this gap, here we report an optimized laboratory protocol for genome-wide capture of endogenous DNA from noninvasively collected samples, coupled with a novel computational approach to reconstruct pedigree links from the resulting low-coverage data. We validated both methods using fecal samples from 62 wild baboons, including 48 from an independently constructed extended pedigree. We enriched fecal-derived DNA samples up to 40-fold for endogenous baboon DNA and reconstructed near-perfect pedigree relationships even with extremely low-coverage sequencing. We anticipate that these methods will be broadly applicable to the many research systems for which only noninvasive samples are available. The lab protocol and software (“WHODAD”) are freely available at www.tung-lab.org/protocols-and-software.html and www.xzlab.org/software.html, respectively. PMID:27098910
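    The reported up-to-40-fold enrichment can be expressed as a simple ratio of endogenous-read proportions before and after capture. The sketch below is a generic calculation with invented read counts, assuming "endogenous" means reads mapping to the host (baboon) genome.

    ```python
    def enrichment_fold(endogenous_reads_pre, total_reads_pre,
                        endogenous_reads_post, total_reads_post):
        """Fold enrichment of endogenous (host) DNA: the proportion of reads mapping
        to the host genome after capture divided by the proportion before capture."""
        pre = endogenous_reads_pre / total_reads_pre
        post = endogenous_reads_post / total_reads_post
        return post / pre

    # Illustrative numbers only: 2% endogenous before capture, 80% after -> 40-fold,
    # the same order of enrichment the abstract reports (up to 40-fold).
    print(enrichment_fold(20_000, 1_000_000, 800_000, 1_000_000))  # 40.0
    ```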

  3. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method.

    PubMed

    Prest, E I; Hammes, F; Kötzsch, S; van Loosdrecht, M C M; Vrouwenvelder, J S

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (or so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result), and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types, unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool for application in the drinking water field, e.g. for rapid screening of the microbial water quality and stability during water treatment and distribution in networks and premise plumbing. Copyright © 2013 Elsevier Ltd. All rights reserved.
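    Two of the summary quantities used above, the relative standard deviation of triplicate cell counts and the percentage of high nucleic acid (HNA) cells above a fixed gate, reduce to short calculations. The sketch below is illustrative only: the counts, the simulated fluorescence distribution and the gate value are invented, and a real analysis would operate on exported FCM event data.

    ```python
    import numpy as np

    def relative_std_percent(triplicate_counts):
        """Relative standard deviation (%) of triplicate FCM cell concentrations."""
        x = np.asarray(triplicate_counts, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()

    def percent_hna(green_fluorescence, hna_gate):
        """Share of cells above a fixed green-fluorescence threshold (HNA gate)."""
        f = np.asarray(green_fluorescence, dtype=float)
        return 100.0 * np.count_nonzero(f > hna_gate) / f.size

    # Invented values: triplicate concentrations (cells/mL) of one tap-water sample.
    triplicates = [1.02e5, 1.05e5, 1.03e5]
    print(round(relative_std_percent(triplicates), 2))   # well below the ~2.5% reported

    rng = np.random.default_rng(1)
    fluorescence = rng.lognormal(mean=2.0, sigma=0.6, size=10_000)  # fake per-cell signal
    print(round(percent_hna(fluorescence, hna_gate=10.0), 1))       # %HNA at a fixed gate
    ```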

  4. Simultaneous dense coding affected by fluctuating massless scalar field

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming; Ye, Yiyong; Luo, Darong

    2018-04-01

    In this paper, we investigate the simultaneous dense coding (SDC) protocol affected by a fluctuating massless scalar field. The noisy model of the SDC protocol is constructed and the master equation that governs the SDC evolution is deduced. The success probabilities of the SDC protocol are discussed for different locking operators under the influence of vacuum fluctuations. We find that the joint success probability is independent of the locking operators, but the other success probabilities are not. For the quantum Fourier transform and double controlled-NOT operators, the success probabilities drop with increasing two-atom distance, but this is not the case for the SWAP operator. Also unlike with the SWAP operator, the success probabilities of Bob and Charlie differ. For different noisy interval values, different locking operators show different robustness to noise.

  5. Research and development of a field-ready protocol for sampling of phosgene from stationary source emissions: Diethylamine reagent studies. Research report, 11 July 1995--30 September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steger, J.L.; Bursey, J.T.; Merrill, R.G.

    1999-03-01

    This report presents the results of laboratory studies to develop and evaluate a method for the sampling and analysis of phosgene from stationary sources of air emissions using diethylamine (DEA) in toluene as the collection medium. The method extracts stack gas from emission sources and stabilizes the reactive gas for subsequent analysis. DEA was evaluated both in a benchtop study and in a laboratory train spiking study. This report includes results for both the benchtop study and the train spiking study. Benchtop studies to evaluate the suitability of DEA for collecting and analyzing phosgene investigated five variables: storage time, DEA concentration, moisture/pH, phosgene concentration, and sample storage temperature. Prototype sampling train studies were performed to determine if the benchtop chemical studies were transferable to a Modified Method 5 sampling train collecting phosgene in the presence of clean air mixed with typical stack gas components. Four conditions, which varied the moisture and phosgene spike, were evaluated in triplicate. In addition to research results, the report includes a detailed draft method for sampling and analysis of phosgene from stationary source emissions.

  6. Sediment Sampling in Estuarine Mudflats with an Aerial-Ground Robotic Team

    PubMed Central

    Deusdado, Pedro; Guedes, Magno; Silva, André; Marques, Francisco; Pinto, Eduardo; Rodrigues, Paulo; Lourenço, André; Mendonça, Ricardo; Santana, Pedro; Corisco, José; Almeida, Susana Marta; Portugal, Luís; Caldeira, Raquel; Barata, José; Flores, Luis

    2016-01-01

    This paper presents a robotic team suited for bottom sediment sampling and retrieval in mudflats, targeting environmental monitoring tasks. The robotic team encompasses a four-wheel-steering ground vehicle, equipped with a drilling tool designed to be able to retain wet soil, and a multi-rotor aerial vehicle for dynamic aerial imagery acquisition. On-demand aerial imagery, properly fused into an aerial mosaic, is used by remote human operators for specifying the robotic mission and supervising its execution. This is crucial for the success of an environmental monitoring study, as it often depends on human expertise to ensure the statistical significance and accuracy of the sampling procedures. Although the literature on environmental monitoring sampling procedures is rich, there is a gap regarding the inclusion of robotic elements in mudflat settings. This paper closes this gap by also proposing a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Field trials on the south bank of the River Tagus estuary show the ability of the robotic system to successfully extract and transport bottom sediment samples for offline analysis. The results also show the efficiency of the extraction and the benefits when compared to (conventional) human-based sampling. PMID:27618060

  7. Current density imaging sequence for monitoring current distribution during delivery of electric pulses in irreversible electroporation.

    PubMed

    Serša, Igor; Kranjc, Matej; Miklavčič, Damijan

    2015-01-01

    Electroporation is gaining importance in everyday clinical practice for cancer treatment. For its success it is extremely important that coverage of the target tissue, i.e., the treated tumor, with the electric field is within the specified range. Therefore, an efficient tool for monitoring the electric field in the tumor during delivery of electroporation pulses is needed. The electric field can be reconstructed by the magnetic resonance electric impedance tomography method from current density distribution data. In this study, the use of current density imaging with MRI for monitoring current density distribution during delivery of irreversible electroporation pulses was demonstrated. Using a modified single-shot RARE sequence, where four 3000 V and 100 μs long pulses were included at the start, current distribution between a pair of electrodes inserted in a liver tissue sample was imaged. Two repetitions of the sequence with phases of refocusing radiofrequency pulses 90° apart were needed to acquire one current density image. For each sample, a total of 45 current density images were acquired, following a standard protocol for irreversible electroporation in which 90 electric pulses are delivered at 1 Hz. Acquired current density images showed that the current density in the middle of the sample increased from the first to the last electric pulses by 60%, i.e., from 8 kA/m² to 13 kA/m², and that the direction of the current path did not change significantly with repeated electric pulses. The presented single-shot RARE-based current density imaging sequence was used successfully to image current distribution during delivery of short high-voltage electric pulses. The method has the potential to enable monitoring of tumor coverage by the electric field during irreversible electroporation tissue ablation.

  8. Energy neutral protocol based on hierarchical routing techniques for energy harvesting wireless sensor network

    NASA Astrophysics Data System (ADS)

    Muhammad, Umar B.; Ezugwu, Absalom E.; Ofem, Paulinus O.; Rajamäki, Jyri; Aderemi, Adewumi O.

    2017-06-01

    Recently, researchers in the field of wireless sensor networks have resorted to energy harvesting techniques that allow energy to be harvested from the ambient environment to power sensor nodes. Using such energy harvesting techniques together with proper routing protocols, an energy-neutral state can be achieved so that sensor nodes can run perpetually. In this paper, we propose an Energy Neutral LEACH routing protocol, which is an extension of the traditional LEACH protocol. The goal of the proposed protocol is to use a gateway node in each cluster so as to reduce the data transmission ranges of cluster-head nodes. Simulation results show that the proposed routing protocol achieves higher throughput and ensures the energy-neutral status of the entire network.
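    For readers unfamiliar with LEACH, the sketch below shows the classic probabilistic cluster-head election threshold and a first-order radio-energy model in which transmission cost grows with distance squared, which is why forwarding through a nearby gateway node can save cluster-head energy. The radio constants, distances and packet size are illustrative assumptions, and this is not the authors' simulation code.

    ```python
    import random

    P = 0.05   # desired fraction of nodes serving as cluster head per round (assumed)

    def is_cluster_head(current_round, was_head_this_epoch, rng=random):
        """Classic LEACH election: a node that has not yet served in the current epoch
        becomes cluster head with probability T(n) = P / (1 - P * (r mod 1/P))."""
        if was_head_this_epoch:
            return False
        threshold = P / (1 - P * (current_round % int(1 / P)))
        return rng.random() < threshold

    def tx_energy(bits, distance_m, e_elec=50e-9, e_amp=10e-12):
        """First-order radio model (illustrative constants): cost grows with distance
        squared, so shortening cluster-head links via a gateway node saves energy."""
        return bits * e_elec + bits * e_amp * distance_m ** 2

    bits = 4000
    direct = tx_energy(bits, 200)                              # cluster head straight to base station
    via_gateway = tx_energy(bits, 40) + tx_energy(bits, 170)   # short hop to gateway, then onward
    print(f"direct: {direct:.2e} J, via gateway: {via_gateway:.2e} J")
    print([round(P / (1 - P * (r % int(1 / P))), 3) for r in (0, 10, 19)])  # threshold rises across the epoch
    ```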

  9. Preventing disease transmission by deceased tissue donors by testing blood for viral nucleic acid.

    PubMed

    Strong, D Michael; Nelson, Karen; Pierce, Marge; Stramer, Susan L

    2005-01-01

    Nucleic acid testing (NAT) has reduced the risk of transmitting infectious disease through blood transfusion. Currently, NAT assays for HIV-1 and HCV are FDA licensed and performed by nearly all blood collection facilities, but HBV NAT is performed under an investigational study protocol. Residual risk estimates indicate that NAT could potentially reduce disease transmission through transplanted tissue. However, tissue donor samples obtained post-mortem have the potential to produce an invalid NAT result due to inhibition of amplification reactions by hemolysis and other factors. The studies reported here summarize the development of protocols to allow NAT of deceased donor samples with reduced rates of invalid results. Using these protocols, inventories from two tissue centers were tested, with greater than 99% of samples producing a valid test result.

  10. Short Review on Quantum Key Distribution Protocols.

    PubMed

    Giampouris, Dimitris

    2017-01-01

    Cryptographic protocols and mechanisms are widely investigated under the notion of quantum computing. Quantum cryptography offers particular advantages over classical approaches, whereas in some cases established protocols have to be revisited in order to maintain their functionality. The purpose of this paper is to provide the basic definitions and review the most important theoretical advancements concerning the BB84 and E91 protocols. It also aims to offer a summary of some key developments in the field of quantum key distribution closely related to the two aforementioned protocols. The main goal of this study is to provide the necessary background information along with a thorough review of the theoretical aspects of QKD, concentrating on specific protocols. The BB84 and E91 protocols have been chosen because most other protocols are similar to these, a fact that makes them important for the general understanding of how the QKD mechanism functions.
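    To make the BB84 mechanism concrete, the sketch below simulates only the sifting step, under idealised assumptions: a noiseless channel, no eavesdropper, and random basis choices for both parties. Positions where the bases match form the sifted key and agree exactly in this ideal case.

    ```python
    import random

    def bb84_sift(n_bits=32, seed=7):
        """Minimal BB84 sketch (ideal channel, no eavesdropper): Alice sends random
        bits in random bases, Bob measures in random bases, and only positions where
        the bases match are kept (the sifted key) after public basis comparison."""
        rng = random.Random(seed)
        alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
        alice_bases = [rng.choice("+x") for _ in range(n_bits)]   # rectilinear / diagonal
        bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
        # With matching bases Bob recovers Alice's bit; otherwise his outcome is random.
        bob_bits = [b if ab == bb else rng.randint(0, 1)
                    for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
        sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
                  if ab == bb]
        return sifted

    key_pairs = bb84_sift()
    print(len(key_pairs), "sifted bits; mismatches:", sum(a != b for a, b in key_pairs))  # 0 mismatches
    ```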

  11. Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, David R.; Crawford, Aladsair J.; Viswanathan, Vilayanur V.

    2014-06-01

    The Protocol for Uniformly Measuring and Expressing the Performance of Energy Storage Systems (PNNL-22010) was first issued in November 2012 as a first step toward providing a foundational basis for developing an initial standard for the uniform measurement and expression of energy storage system (ESS) performance. Its subsequent use in the field and review by the protocol working group and most importantly the users’ subgroup and the thermal subgroup has led to the fundamental modifications reflected in this update of the 2012 Protocol. As an update of the 2012 Protocol, this document (the June 2014 Protocol) is intended to supersede its predecessor and be used as the basis for measuring and expressing ESS performance. The foreword provides general and specific details about what additions, revisions, and enhancements have been made to the 2012 Protocol and the rationale for them in arriving at the June 2014 Protocol.

  12. Geochemical and mineralogical data for soils of the conterminous United States

    USGS Publications Warehouse

    Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Solano, Federico; Kilburn, James E.; Fey, David L.

    2013-01-01

    In 2007, the U.S. Geological Survey initiated a low-density (1 site per 1,600 square kilometers, 4,857 sites) geochemical and mineralogical survey of soils of the conterminous United States as part of the North American Soil Geochemical Landscapes Project. Sampling and analytical protocols were developed at a workshop in 2003, and pilot studies were conducted from 2004 to 2007 to test and refine these recommended protocols. The final sampling protocol for the national-scale survey included, at each site, a sample from a depth of 0 to 5 centimeters, a composite of the soil A horizon, and a deeper sample from the soil C horizon or, if the top of the C horizon was at a depth greater than 1 meter, from a depth of approximately 80–100 centimeters. The <2-millimeter fraction of each sample was analyzed for a suite of 45 major and trace elements by methods that yield the total or near-total elemental content. The major mineralogical components in the samples from the soil A and C horizons were determined by a quantitative X-ray diffraction method using Rietveld refinement. Sampling in the conterminous United States was completed in 2010, with chemical and mineralogical analyses completed in May 2013. The resulting dataset provides an estimate of the abundance and spatial distribution of chemical elements and minerals in soils of the conterminous United States and represents a baseline for soil geochemistry and mineralogy against which future changes may be recognized and quantified. This report (1) describes the sampling, sample preparation, and analytical methods used; (2) gives details of the quality control protocols used to monitor the quality of chemical and mineralogical analyses over approximately six years; and (3) makes available the soil geochemical and mineralogical data in downloadable tables.

  13. Toward establishing model organisms for marine protists: Successful transfection protocols for Parabodo caudatus (Kinetoplastida: Excavata).

    PubMed

    Gomaa, Fatma; Garcia, Paulo A; Delaney, Jennifer; Girguis, Peter R; Buie, Cullen R; Edgcomb, Virginia P

    2017-09-01

    We developed protocols for, and demonstrated successful transfection of, the free-living kinetoplastid flagellate Parabodo caudatus with three plasmids carrying a fluorescence reporter gene (pEF-GFP with the EF1 alpha promoter, pUB-GFP with Ubiquitin C promoter, and pEYFP-Mitotrap with CMV promoter). We evaluated three electroporation approaches: (1) a square-wave electroporator designed for eukaryotes, (2) a novel microfluidic transfection system employing hydrodynamically-controlled electric field waveforms, and (3) a traditional exponential decay electroporator. We found the microfluidic device provides a simple and efficient platform to quickly test a wide range of electric field parameters to find the optimal set of conditions for electroporation of target species. It also allows for processing large sample volumes (>10 ml) within minutes, increasing throughput 100 times over cuvettes. Fluorescence signal from the reporter gene was detected a few hours after transfection and persisted for 3 days in cells transfected by pEF-GFP and pUB-GFP plasmids and for at least 5 days post-transfection for cells transfected with pEYFP-Mitotrap. Expression of the reporter genes (GFP and YFP) was also confirmed using reverse transcription-PCR (RT-PCR). This work opens the door for further efforts with this taxon and close relatives toward establishing model systems for genome editing. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  14. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
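    The regression-and-bootstrap step described above can be sketched generically: multispecimen ratios are regressed against the applied laboratory field, the x-intercept gives the paleointensity estimate, the y-intercept can be checked against its theoretically prescribed value, and bootstrap resampling of specimens yields a confidence interval. The ratios and field values below are invented, and this is not MSP-Tool's VBA code.

    ```python
    import numpy as np

    def msp_regression(lab_fields, q_ratios, n_boot=2000, seed=0):
        """Fit Q = a*H_lab + b, take the x-intercept (-b/a) as the paleointensity
        estimate, and bootstrap the specimens for a 95% confidence interval. The
        y-intercept b can additionally be checked against its theoretically
        prescribed value, as the abstract describes."""
        H = np.asarray(lab_fields, dtype=float)
        Q = np.asarray(q_ratios, dtype=float)
        a, b = np.polyfit(H, Q, 1)
        rng = np.random.default_rng(seed)
        boot = []
        while len(boot) < n_boot:
            idx = rng.integers(0, H.size, H.size)   # resample specimens with replacement
            if np.unique(H[idx]).size < 2:          # skip degenerate resamples
                continue
            ab, bb = np.polyfit(H[idx], Q[idx], 1)
            boot.append(-bb / ab)
        lo, hi = np.percentile(boot, [2.5, 97.5])
        return -b / a, (lo, hi), b

    # Invented multispecimen ratios (one specimen per lab field, fields in microtesla):
    fields = [10, 20, 30, 40, 50, 60]
    ratios = [-0.62, -0.28, 0.05, 0.33, 0.71, 0.95]
    estimate, ci, y_intercept = msp_regression(fields, ratios)
    print(f"paleointensity ~ {estimate:.1f} uT, 95% CI {ci[0]:.1f}-{ci[1]:.1f} uT, y-intercept {y_intercept:.2f}")
    ```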

  15. Signatures of two-step impurity mediated vortex lattice melting in Bose-Einstein condensate

    NASA Astrophysics Data System (ADS)

    Dey, Bishwajyoti

    2017-04-01

    We study impurity-mediated vortex lattice melting in a rotating two-dimensional Bose-Einstein condensate (BEC). Impurities are introduced either through a protocol in which the vortex lattice is produced in an impurity potential, or by first creating the vortex lattice in the absence of random pinning and then cranking up the impurity potential. These two protocols have an obvious relation to the two commonly known protocols for creating a vortex lattice in a type-II superconductor: the zero-field-cooling and field-cooling protocols, respectively. A time-splitting Crank-Nicolson method has been used to numerically simulate the vortex lattice dynamics. It is shown that the vortex lattice follows a two-step melting via loss of positional and orientational order. This vortex lattice melting process in BEC closely mimics the recently observed two-step melting of vortex matter in the weakly pinned type-II superconductor Co-intercalated NbSe2. Also, using numerical perturbation analysis, we compare the states obtained with the two protocols and show that the vortex lattice states are metastable and more disordered when impurities are introduced after the formation of an ordered vortex lattice. The author would like to thank SERB, Govt. of India and BCUD-SPPU for financial support through research Grants.
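    The simulations above use a time-splitting Crank-Nicolson scheme for the condensate dynamics. The sketch below illustrates only the splitting idea, in 1D, without rotation, impurities or pinning, and with a Fourier treatment of the kinetic step for brevity rather than Crank-Nicolson; the parameters are arbitrary.

    ```python
    import numpy as np

    # Split-step integration of a 1D Gross-Pitaevskii equation (dimensionless units):
    #   i dpsi/dt = [-0.5 d^2/dx^2 + V(x) + g|psi|^2] psi
    # Kinetic half-steps are applied in Fourier space, the potential plus nonlinearity
    # in real space. This is a simplified illustration of the time-splitting idea only.

    n, L, dt, g = 256, 20.0, 1e-3, 50.0
    x = np.linspace(-L / 2, L / 2, n, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    V = 0.5 * x ** 2                                     # harmonic trap

    psi = np.exp(-x ** 2 / 2).astype(complex)
    psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * (L / n))   # normalise to unit norm

    kinetic_half = np.exp(-0.5j * dt * 0.5 * k ** 2)     # half kinetic step
    for _ in range(2000):
        psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))
        psi *= np.exp(-1j * dt * (V + g * np.abs(psi) ** 2))   # potential + nonlinearity
        psi = np.fft.ifft(kinetic_half * np.fft.fft(psi))

    print(np.sum(np.abs(psi) ** 2) * (L / n))            # norm stays ~1 (unitary evolution)
    ```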

  16. Interpreting and Reporting Radiological Water-Quality Data

    USGS Publications Warehouse

    McCurdy, David E.; Garbarino, John R.; Mullin, Ann H.

    2008-01-01

    This document provides information to U.S. Geological Survey (USGS) Water Science Centers on interpreting and reporting radiological results for samples of environmental matrices, most notably water. The information provided is intended to be broadly useful throughout the United States, but scientists who work at sites containing radioactive hazardous wastes are advised to consult additional sources for more detailed information. The document is largely based on recognized national standards and guidance documents for radioanalytical sample processing, most notably the Multi-Agency Radiological Laboratory Analytical Protocols Manual (MARLAP), and on documents published by the U.S. Environmental Protection Agency and the American National Standards Institute. It does not include discussion of standard USGS practices, including field quality-control sample analysis, interpretive report policies, and related issues, all of which shall always be included in any effort by the Water Science Centers. The use of 'shall' in this report signifies a policy requirement of the USGS Office of Water Quality.

  17. Prospects for phenological monitoring in an arid southwestern U.S. rangeland using field observations with hyperspatial and moderate resolution imagery

    NASA Astrophysics Data System (ADS)

    Browning, D. M.; Laliberte, A. S.; Rango, A.; Herrick, J. E.

    2009-12-01

    Relating field observations of plant phenological events to remotely sensed depictions of land surface phenology remains a challenge to the vertical integration of data from disparate sources. This research, conducted at the Jornada Basin Long-Term Ecological Research site in southern New Mexico, capitalizes on legacy datasets of reproductive phenology and biomass together with hyperspatial imagery. Large amounts of exposed bare soil and modest cover from shrubs and grasses in these arid and semi-arid ecosystems challenge the integration of field observations of phenology and remotely sensed data to monitor changes in land surface phenology. Drawing on established field protocols for reproductive phenology, hyperspatial imagery (4 cm), and object-based image analysis, we explore the utility of two approaches to scale detailed observations (i.e., field and 4 cm imagery) to the extent of long-term field plots (50 x 50 m) and moderate resolution Landsat Thematic Mapper (TM) imagery (30 x 30 m). Very high resolution color-infrared imagery was collected in June 2007 across 15 LTER study sites that transect five distinct vegetation communities along a continuum of grass to shrub dominance. We examined two methods for scaling spectral vegetation indices (SVI) at 4 cm resolution: pixel averaging and object-based integration. Pixel averaging yields the mean SVI value for all pixels within the plot or TM pixel. Alternatively, the object-based method is based on a weighted average of SVI values that correspond to discrete image objects (e.g., individual shrubs or grass patches). Object-based image analysis of 4 cm imagery provides a detailed depiction of ground cover and allows us to extract species-specific contributions to upscaled SVI values. The ability to discern species- or functional-group contributions to remotely sensed signals of vegetation greenness can greatly enhance the design of field sampling protocols for phenological research. Furthermore, imagery from unmanned aerial vehicles (UAVs) is a cost-effective and increasingly available resource, and UAV mosaics can now be generated so that larger study areas can be addressed. This technology can provide a robust basis for scaling relationships for phenology-based research applications.
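    The two scaling approaches compared above, pixel averaging and object-based integration, can be contrasted with a small synthetic example. The NDVI raster and object map below are invented (not Jornada imagery); the point is that the area-weighted object summary reproduces the plot mean while also exposing per-class (shrub, grass, bare soil) contributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic 4 cm NDVI raster for one plot and a matching object map
    # (0 = bare soil, 1 = grass patch, 2 = shrub); purely illustrative values.
    ndvi    = rng.uniform(0.05, 0.15, size=(500, 500))
    objects = np.zeros((500, 500), dtype=int)
    objects[100:200, 100:300] = 1
    objects[350:450, 300:420] = 2
    ndvi[objects == 1] = rng.uniform(0.25, 0.35, size=(objects == 1).sum())
    ndvi[objects == 2] = rng.uniform(0.45, 0.60, size=(objects == 2).sum())

    # Approach 1: plain pixel averaging over the plot / TM pixel footprint.
    pixel_mean = ndvi.mean()

    # Approach 2: object-based scaling -- mean NDVI per object class, weighted by the
    # fractional area each class covers, which also exposes per-class contributions.
    classes, counts = np.unique(objects, return_counts=True)
    weights = counts / objects.size
    class_means = np.array([ndvi[objects == c].mean() for c in classes])
    object_based = np.sum(weights * class_means)

    print(round(pixel_mean, 3), round(object_based, 3))   # equal here, by construction
    for c, w, m in zip(classes, weights, class_means):
        print(f"class {c}: area fraction {w:.2f}, mean NDVI {m:.2f}, contribution {w * m:.3f}")
    ```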

  18. Anti-malarial drug quality in Lagos and Accra - a comparison of various quality assessments

    PubMed Central

    2010-01-01

    Background Two major cities in West Africa, Accra, the capital of Ghana, and Lagos, the largest city of Nigeria, have significant problems with substandard pharmaceuticals. Both have actively combated the problem in recent years, particularly by screening products on the market using the Global Pharma Health Fund e.V. Minilab® protocol. Random sampling of medicines from the two cities at least twice over the past 30 months allows a tentative assessment of whether improvements in drug quality have occurred. Since intelligence provided by investigators indicates that some counterfeit producers may be adapting products to pass Minilab tests, the results are compared with those from a Raman spectrometer and discrepancies are discussed. Methods Between mid-2007 and early-2010, samples of anti-malarial drugs were bought covertly from pharmacies in Lagos on three different occasions (October 2007, December 2008, February 2010), and from pharmacies in Accra on two different occasions (October 2007, February 2010). All samples were tested using the Minilab® protocol, which includes disintegration and active ingredient assays as well as visual inspection, and most samples were also tested by Raman spectrometry. Results In Lagos, the failure rate in the 2010 sampling fell to 29% of the 2007 finding using the Minilab® protocol, 53% using Raman spectrometry, and 46% using visual inspection. In Accra, the failure rate in the 2010 sampling fell to 54% of the 2007 finding using the Minilab® protocol, 72% using Raman spectrometry, and 90% using visual inspection. Conclusions The evidence presented shows that drug quality is probably improving in both cities, especially Lagos, since major reductions of failure rates over time occur with all means of assessment. Many more samples failed when examined by Raman spectrometry than by the Minilab® protocol. The discrepancy is most likely caused by the two techniques measuring different aspects of the medication, and hence may reflect natural variation between the techniques. But other explanations are possible and are discussed. PMID:20537190

  19. Efficacy of a polyvalent mastitis vaccine against Staphylococcus aureus on a dairy Mediterranean buffalo farm: results of two clinical field trials.

    PubMed

    Guccione, Jacopo; Pesce, Antonella; Pascale, Massimo; Salzano, Caterina; Tedeschi, Gianni; D'Andrea, Luigi; De Rosa, Angela; Ciaramella, Paolo

    2017-01-19

    In recent years, knowledge of Mediterranean Buffalo (MB) mastitis has improved remarkably; nevertheless, attention has never been focused on vaccination as a preventive strategy for the control of mastitis in these ruminants. The aim of the current study was to assess the clinical efficacy over time of two different preventive vaccination protocols against S. aureus mastitis in primiparous MB. Vaccinated (VG) and non-vaccinated (N-VG) groups of 30 MB each were selected from two different herds (herd A: VG1 and N-VG1; herd B: VG2 and N-VG2) of the same farm. Herd A received a double vaccination (Startvac®, 45 and 10 days before calving, protocol A), while in herd B an additional administration was performed (52 days after calving, protocol B). Bacteriological milk culture and assessment of somatic cell count (SCC) were performed at 10, 30, 60 and 90 days in milk (DIM) from composite milk samples. After 90 DIM, daily milk yields and SCC values were recorded monthly until dry-off. The overall incidence of MB positive for S. aureus was 40.8% (49/120) in VG1 and 43.3% (52/120) in N-VG1 (protocol A), and 45.8% (55/120) and 50.8% (61/120) in VG2 and N-VG2 (protocol B). The latter protocol was associated with a significant decrease in prevalence (at 90 DIM) and incidence of mastitis (animals positive for S. aureus, SCC > 200 × 10^3 cells/mL, with or without clinical signs) in the vaccinated MB, while no difference occurred with protocol A. Moreover, herd B showed a significant reduction in prevalence of intramammary infection (animals positive for S. aureus, SCC < 200 × 10^3 cells/mL, no clinical signs) in the vaccinated MB at 60 DIM, while no differences were detected in herd A at any sampling time; N-VG2 had significantly higher overall SCC values than VG2 (4.97 ± 4.75 and 4.84 ± 4.60 Log10 cells/mL ± standard deviation, respectively), while no differences were recorded in herd A. The current investigation explores for the first time the clinical efficacy of vaccination against S. aureus infections in MB, showing encouraging results regarding reduction in mastitis and somatic cell count; the polyvalent mastitis vaccine may be considered an additional tool against in-herd S. aureus infection and should be combined with other control procedures to maximize its benefits.

  20. Human immunodeficiency virus bDNA assay for pediatric cases.

    PubMed

    Avila, M M; Liberatore, D; Martínez Peralta, L; Biglione, M; Libonatti, O; Coll Cárdenas, P; Hodara, V L

    2000-01-01

    Techniques to quantify plasma HIV-1 RNA viral load (VL) are commercially available, and they are adequate for monitoring adults infected by HIV and treated with antiretroviral drugs. Little experience with HIV VL has been reported in pediatric cases. In Argentina, the evaluation of several assays for VL in pediatrics is now being considered. To evaluate the pediatric protocol for the bDNA assay in HIV-infected children, 25 samples from HIV-infected children (according to CDC criteria for pediatric AIDS) were analyzed by using the Quantiplex HIV RNA 2.0 Assay (Chiron Corporation) following the manufacturer's recommendations in a protocol that uses 50 microliters of patient's plasma (sensitivity: 10,000 copies/ml). When HIV-RNA was not detected, samples were run with the 1 ml standard bDNA protocol (sensitivity: 500 HIV-RNA c/ml). Nine samples belonged to infants under 12 months of age (group A) and 16 were over 12 months (group B). All infants under one year of age had high HIV-RNA copies in plasma. VL ranged from 30,800 to 2,560,000 RNA copies/ml (median = 362,000 c/ml) for group A and < 10,000 to 554,600 c/ml (median = < 10,000) for group B. Only 25% of children in group B had detectable HIV-RNA. By using the standard test of quantification, none of the patients had undetectable HIV-RNA; values ranged between 950 and 226,200 c/ml for group B (median = 23,300 RNA c/ml). The suggested pediatric protocol could be useful in children under 12 months of age, but the 1 ml standard protocol must be used for older children. Samples with undetectable results from children under one year of age should be repeated using the standard protocol.
