Processing Protocol for Soil Samples Potentially ...
Method Operating Procedures. This protocol describes the processing steps for 45 g and 9 g soil samples potentially contaminated with Bacillus anthracis spores. The protocol is designed to separate and concentrate the spores from bulk soil down to a pellet that can be used for further analysis. Soil extraction solution and mechanical shaking are used to disrupt soil particle aggregates and to aid in the separation of spores from soil particles. Soil samples are washed twice with soil extraction solution to maximize recovery. Differential centrifugation is used to separate spores from the majority of the soil material. The 45 g protocol has been demonstrated by two laboratories using both loamy and sandy soil types. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol would be robust enough to use at multiple laboratories while achieving comparable recoveries. The 45 g protocol has demonstrated a matrix limit of detection of 14 spores/gram of soil for loamy and sandy soils.
Silvestri, Erin E.; Griffin, Dale W.
2017-01-01
Laurin, Nancy; DeMoors, Anick; Frégeau, Chantal
2012-09-01
Direct amplification of STR loci from biological samples collected on FTA cards without prior DNA purification was evaluated using Identifiler Direct and PowerPlex 16 HS in conjunction with the use of a high-throughput Applied Biosystems 3730 DNA Analyzer. In order to reduce the overall sample processing cost, reduced PCR volumes combined with various FTA disk sizes were tested. Optimized STR profiles were obtained using a 0.53 mm disk size in a 10 μL PCR volume for both STR systems. These protocols proved effective in generating high-quality profiles on the 3730 DNA Analyzer from both blood and buccal FTA samples. Reproducibility, concordance, robustness, sample stability and profile quality were assessed using a collection of blood and buccal samples on FTA cards from volunteer donors as well as from convicted offenders. The newly developed protocols offer enhanced throughput capability and cost effectiveness without compromising the robustness and quality of the STR profiles obtained. These results support the use of these protocols for processing convicted offender samples submitted to the National DNA Data Bank of Canada. Similar protocols could be applied to the processing of casework reference samples or in paternity or family relationship testing.
Metabolic profiling of body fluids and multivariate data analysis.
Trezzi, Jean-Pierre; Jäger, Christian; Galozzi, Sara; Barkovits, Katalin; Marcus, Katrin; Mollenhauer, Brit; Hiller, Karsten
2017-01-01
Metabolome analyses of body fluids are challenging due to pre-analytical variations, such as pre-processing delay and temperature, and to the constant dynamic changes of biochemical processes within the samples. Therefore, proper sample handling from the time of collection up to the analysis is crucial to obtain high-quality samples and reproducible results. A metabolomics analysis is divided into 4 main steps: 1) Sample collection, 2) Metabolite extraction, 3) Data acquisition and 4) Data analysis. Here, we describe a protocol for gas chromatography coupled to mass spectrometry (GC-MS) based metabolic analysis of biological matrices, especially body fluids. This protocol can be applied to blood serum/plasma, saliva and cerebrospinal fluid (CSF) samples of humans and other vertebrates. It covers sample collection, sample pre-processing, metabolite extraction, GC-MS measurement and guidelines for the subsequent data analysis. Advantages of this protocol include:
• Robust and reproducible metabolomics results, taking into account pre-analytical variations that may occur during the sampling process
• Small sample volume required
• Rapid and cost-effective processing of biological samples
• Logistic regression based determination of biomarker signatures for in-depth data analysis
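The logistic-regression step used for biomarker signature determination can be sketched in a few lines of plain Python. The toy data, learning rate and epoch count below are illustrative assumptions, not values from the published protocol:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a logistic-regression classifier by stochastic gradient descent.
    X: list of feature vectors (e.g., metabolite intensities); y: 0/1 labels."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted class probability
            err = p - yi                      # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability that sample x belongs to class 1."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Toy example: one hypothetical "metabolite" separates the two groups.
X = [[0.2], [0.4], [0.3], [1.5], [1.7], [1.6]]
y = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(X, y)
assert predict(w, b, [0.25]) < 0.5 < predict(w, b, [1.6])
```

In practice the biomarker signature is the fitted weight vector over many metabolites; this sketch shows only the mechanics of the fit.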
Elliott, Paul; Peakman, Tim C
2008-04-01
UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml of blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used, appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines.
A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.
Optimization of a sample processing protocol for recovery of Bacillus anthracis spores from soil
Silvestri, Erin E.; Feldhake, David; Griffin, Dale; Lisle, John T.; Nichols, Tonya L.; Shah, Sanjiv; Pemberton, A; Schaefer III, Frank W
2016-01-01
Following a release of Bacillus anthracis spores into the environment, there is a potential for lasting environmental contamination in soils. There is a need for detection protocols for B. anthracis in environmental matrices. However, identification of B. anthracis within a soil is a difficult task. Processing soil samples helps to remove debris, chemical components, and biological impurities that can interfere with microbiological detection. This study aimed to optimize a previously used indirect processing protocol, which included a series of washing and centrifugation steps. Optimization of the protocol included: identifying an ideal extraction diluent, variation in the number of wash steps, variation in the initial centrifugation speed, and sonication and shaking mechanisms. The optimized protocol was demonstrated at two laboratories in order to evaluate the recovery of spores from loamy and sandy soils. The new protocol demonstrated an improved limit of detection for loamy and sandy soils over the non-optimized protocol, with an approximate matrix limit of detection of 14 spores/g of soil. There were no significant differences overall between the two laboratories for either soil type, suggesting that the processing protocol will be robust enough to use at multiple laboratories while achieving comparable recoveries.
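Spike-and-recover evaluations of this kind reduce to a simple percent-recovery calculation. The recovery figure and spike level combination below are hypothetical, used only to show the arithmetic:

```python
def percent_recovery(spores_recovered, spores_spiked):
    """Recovery efficiency of a sample-processing protocol, in percent."""
    return 100.0 * spores_recovered / spores_spiked

# Hypothetical check: a 45 g sample spiked at the reported matrix
# detection limit of 14 spores/g, with an assumed 60% recovery.
spiked = 14 * 45            # 630 spores across the whole sample
assert spiked == 630
assert percent_recovery(378, spiked) == 60.0
```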
Rapid Waterborne Pathogen Detection with Mobile Electronics.
Wu, Tsung-Feng; Chen, Yu-Chen; Wang, Wei-Chung; Kucknoor, Ashwini S; Lin, Che-Jen; Lo, Yu-Hwa; Yao, Chun-Wei; Lian, Ian
2017-06-09
Pathogen detection in water samples, without complex and time-consuming procedures such as fluorescent labeling or culture-based incubation, is essential to public safety. We propose an immunoagglutination-based protocol, together with a microfluidic device, to quantify pathogen levels directly from water samples. Utilizing the ubiquitous complementary metal-oxide-semiconductor (CMOS) imagers of mobile electronics, we developed a low-cost, one-step detection protocol that enables field detection of waterborne pathogens. 10 mL pathogen-containing water samples were processed using the developed protocol, including filtration enrichment, immunoreaction detection and image processing. A limit of detection of 10 E. coli O157:H7 cells/10 mL was demonstrated within a 10 min turnaround time. The protocol can readily be integrated into mobile electronics such as smartphones for rapid and reproducible field detection of waterborne pathogens.
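The filtration-enrichment step implies a straightforward concentration-factor calculation. The 100 µL concentrate volume below is an assumption for illustration, not a value reported in the paper:

```python
def concentration_factor(sample_volume_ul, concentrate_volume_ul):
    """Fold enrichment from filtering a large sample into a small volume."""
    return sample_volume_ul / concentrate_volume_ul

# 10 mL (10,000 uL) filtered down to an assumed 100 uL concentrate gives
# 100-fold enrichment, so the 10 cells/10 mL detection limit corresponds
# to roughly 10 cells in the imaged concentrate.
assert concentration_factor(10_000, 100) == 100.0
```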
Granato, G.E.; Smith, K.P.
1999-01-01
Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
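The plume-arrival detection that the automated record made possible can be caricatured as a threshold test on a monitored property such as specific conductance. The readings and the 20% deviation threshold below are invented for illustration:

```python
def detect_plume(readings, baseline, threshold=0.20):
    """Return the index of the first reading deviating from baseline by
    more than `threshold` (fractional change), or None if none does."""
    for i, value in enumerate(readings):
        if abs(value - baseline) / baseline > threshold:
            return i
    return None

# Simulated daily specific-conductance record (uS/cm); values invented.
record = [410, 405, 415, 408, 620, 750, 790]
assert detect_plume(record, baseline=410) == 4   # plume flagged on day 4
```

A real deployment would trigger manual sampling when such a deviation is flagged, as the case study describes.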
Fell, Shari; Bröckl, Stephanie; Büttner, Mathias; Rettinger, Anna; Zimmermann, Pia; Straubinger, Reinhard K
2016-09-15
Bovine tuberculosis (bTB), which is caused by Mycobacterium bovis and M. caprae, is a notifiable animal disease in Germany. The diagnostic procedure is based on a prescribed protocol that is published in the framework of German bTB legislation. In this protocol, small sample volumes are used for DNA extraction followed by real-time PCR analyses. As mycobacteria tend to concentrate in granulomas, and infected tissue in early stages of infection does not necessarily show any visible lesions, it is likely that DNA extraction from only small tissue samples (20-40 mg) taken from a randomly chosen spot on the organ, followed by PCR testing, may produce false negative results. In this study two DNA extraction methods were developed to process larger sample volumes and thereby increase the detection sensitivity for mycobacterial DNA in animal tissue. The first extraction method is based on magnetic capture, in which specific capture oligonucleotides are utilized. These oligonucleotides are linked to magnetic particles and capture Mycobacterium-tuberculosis-complex (MTC) DNA released from 10 to 15 g of tissue material. In a second approach, sediments remaining from the magnetic capture protocol were further processed with a less complex extraction protocol that can be used in routine daily diagnostics. A total of 100 tissue samples from 34 cattle (n = 74) and 18 red deer (n = 26) were analyzed with the developed protocols, and results were compared to the prescribed protocol. All three extraction methods yielded reliable results in the real-time PCR analysis. The use of larger sample volumes led to an increase in DNA detection sensitivity, shown by the decrease in Ct values. Furthermore, five samples that tested negative or questionable with the official extraction protocol were detected as positive by real-time PCR when the alternative extraction methods were used.
By calculating the kappa index, the three extraction protocols showed moderate (0.52; protocol 1 vs 3) to almost perfect agreement (1.00; red deer sample testing with all protocols). Both new methods yielded increased detection rates for MTC DNA in large sample volumes and consequently improved the official diagnostic protocol.
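The kappa values quoted above follow from the standard Cohen's kappa computation, which a short function makes explicit; the sample calls below are invented toy data:

```python
from collections import Counter

def cohens_kappa(calls_a, calls_b):
    """Cohen's kappa: chance-corrected agreement between two protocols'
    categorical calls (e.g., 'pos'/'neg' per sample)."""
    assert len(calls_a) == len(calls_b)
    n = len(calls_a)
    observed = sum(a == b for a, b in zip(calls_a, calls_b)) / n
    freq_a, freq_b = Counter(calls_a), Counter(calls_b)
    # Expected agreement if the two sets of calls were independent.
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Perfect agreement -> 1.0; agreement no better than chance -> 0.0
assert cohens_kappa(["pos", "pos", "neg", "neg"],
                    ["pos", "pos", "neg", "neg"]) == 1.0
assert cohens_kappa(["pos", "pos", "neg", "neg"],
                    ["pos", "neg", "pos", "neg"]) == 0.0
```

Values near 0.52 indicate moderate agreement and 1.00 perfect agreement, matching the scale used in the abstract.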
Corrosion and mechanical performance of AZ91 exposed to simulated inflammatory conditions.
Brooks, Emily K; Der, Stephanie; Ehrensberger, Mark T
2016-03-01
Magnesium (Mg) and its alloys, including Mg-9%Al-1%Zn (AZ91), are biodegradable metals with potential use as temporary orthopedic implants. Invasive orthopedic procedures can provoke an inflammatory response that produces hydrogen peroxide (H2O2) and an acidic environment near the implant. This study assessed the influence of inflammation on both the corrosion and mechanical properties of AZ91. The AZ91 samples in the inflammatory protocol were immersed for three days in a complex biologically relevant electrolyte (AMEM culture media) that contained serum proteins (FBS) and 150 mM H2O2 and was titrated to a pH of 5. The control protocol immersed AZ91 samples in the same biologically relevant electrolyte (AMEM & FBS) but without the H2O2 and acid titration. After 3 days, all samples were switched into fresh AMEM & FBS for an additional 3-day immersion. During the initial immersion, inflammatory-protocol samples showed an increased corrosion rate as determined by mass-loss testing, increased Mg and Al ion release into solution, and a completely corroded surface morphology as compared to the control protocol. Although corrosion in both protocols slowed once the test electrolyte was replaced at 3 days, the samples originally exposed to the simulated inflammatory conditions continued to display enhanced corrosion rates compared to the control protocol. These lingering effects may indicate that the initial inflammatory corrosion processes modified components of the surface oxide and corrosion film, or initiated aggressive localized processes, that subsequently left the interface more vulnerable to continued enhanced corrosion. The electrochemical properties of the interfaces were also evaluated by electrochemical impedance spectroscopy (EIS), which found that the corrosion characteristics of the AZ91 samples were potentially influenced by intermediate adsorption-layer processes.
The increased corrosion observed for the inflammatory protocol did not affect the flexural mechanical properties of the AZ91 at any time point assessed.
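Mass-loss corrosion rates of the kind reported above are conventionally computed with the ASTM G31-style formula CR = K·W/(A·t·D). The coupon numbers below are purely illustrative, not data from the study:

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, hours, density_g_cm3):
    """Mass-loss corrosion rate in mm/year (ASTM G31-style):
    CR = K * W / (A * t * D), with K = 8.76e4 for these units."""
    K = 8.76e4
    return K * mass_loss_g / (area_cm2 * hours * density_g_cm3)

# Illustrative numbers only: 10 mg lost from a 2 cm^2 AZ91 coupon
# (density ~1.81 g/cm^3) over a 3-day (72 h) immersion.
rate = corrosion_rate_mm_per_year(0.010, 2.0, 72.0, 1.81)
assert 3.0 < rate < 4.0   # ~3.4 mm/year
```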
Non-wadeable rivers have been largely overlooked by bioassessment programs because of sampling difficulties and a lack of appropriate methods and biological indicators. We are in the process of developing a Large River Bioassessment Protocol (LR-BP) for sampling macroinvertebrat...
Optimizing the MAC Protocol in Localization Systems Based on IEEE 802.15.4 Networks
Pérez-Solano, Juan J; Claver, Jose M; Ezpeleta, Santiago
2017-01-01
Radio frequency signals are commonly used in the development of indoor localization systems. The infrastructure of these systems includes beacons placed at known positions that exchange radio packets with the users to be located. When the system is implemented using wireless sensor networks, the wireless transceivers integrated in the network motes are usually based on the IEEE 802.15.4 standard. However, CSMA-CA, which is the basis for the medium access protocols in this category of communication systems, is not suitable when several users want to exchange bursts of radio packets with the same beacon to acquire the radio signal strength indicator (RSSI) values needed in the location process. Therefore, new protocols are necessary to avoid the packet collisions that occur when multiple users try to communicate with the same beacons. On the other hand, the RSSI sampling process should be carried out very quickly, because some systems cannot tolerate a large delay in the location process. This is even more important when the RSSI sampling process includes measurements at different signal power levels or on different frequency channels. The principal objective of this work is to speed up the RSSI sampling process in indoor localization systems. To achieve this objective, the main contribution is the proposal of a new MAC protocol that eliminates the medium access contention periods and decreases the number of packet collisions to accelerate the RSSI collection process. Moreover, the protocol increases the overall network throughput by taking advantage of frequency channel diversity. The presented results show the suitability of this protocol for reducing the RSSI gathering delay and increasing the network throughput in simulated and real environments. PMID:28684666
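One way to realize the contention-free idea described above is a fixed (time slot, frequency channel) assignment per user, so packet bursts never collide and several users are served in parallel. This is a simplified sketch of the scheduling concept, not the protocol actually proposed in the paper:

```python
def schedule(users, channels):
    """Contention-free RSSI collection: each user gets a dedicated
    (time slot, frequency channel) window, eliminating packet collisions
    while exploiting channel diversity for parallelism."""
    plan = {}
    for i, user in enumerate(users):
        plan[user] = (i // len(channels), channels[i % len(channels)])
    return plan

# Five users, three IEEE 802.15.4 channel numbers (choice is illustrative).
plan = schedule(["u1", "u2", "u3", "u4", "u5"], channels=[11, 15, 20])
assert len(set(plan.values())) == 5                  # no shared window
assert max(slot for slot, _ in plan.values()) == 1   # 2 slots instead of 5
```

With c channels, n users need only ceil(n/c) time slots rather than n, which is the source of the reduced RSSI gathering delay.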
Protocol for Detection of Yersinia pestis in Environmental ...
Methods Report. This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a sample processing procedure for each sample type in a step-by-step manner. The protocol includes real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because the protocol is available without restriction to all government departments and agencies and their contractors, the nation will now have increased laboratory capacity to analyze large numbers of samples during a wide-area plague incident.
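A minimal decision rule capturing the RV-PCR principle (growth during incubation lowers the Ct value, indicating viable organisms) can be sketched as follows. The 6-cycle cutoff is an assumed illustration, not the value prescribed by the protocol:

```python
def rv_pcr_viable(ct_time0, ct_after_incubation, delta_ct_cutoff=6.0):
    """Rapid Viability PCR call: if growth during incubation lowers the
    Ct value by at least `delta_ct_cutoff` cycles, score the sample as
    containing viable organisms. The cutoff here is illustrative only."""
    if ct_after_incubation is None:   # no amplification after incubation
        return False
    return (ct_time0 - ct_after_incubation) >= delta_ct_cutoff

assert rv_pcr_viable(ct_time0=34.0, ct_after_incubation=24.5) is True
assert rv_pcr_viable(ct_time0=34.0, ct_after_incubation=33.1) is False
```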
Bayesian adaptive survey protocols for resource management
Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.
2011-01-01
Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. 
Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
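In its simplest form, the survey-design logic above reduces to a Bayesian update on occupancy after repeated non-detections. This sketch assumes a constant per-survey detection probability, unlike the covariate-dependent (date, water temperature) model in the study, and the numbers are illustrative:

```python
def posterior_occupancy(prior, detect_prob, n_missed_surveys):
    """Posterior probability that the species is present after
    n_missed_surveys surveys with no detection, assuming a constant
    per-survey detection probability when present."""
    miss = (1.0 - detect_prob) ** n_missed_surveys
    return prior * miss / (prior * miss + (1.0 - prior))

# With a 0.5 prior and 30% per-survey detectability, five empty surveys
# still leave roughly a 1-in-7 chance the species is present.
p = posterior_occupancy(prior=0.5, detect_prob=0.3, n_missed_surveys=5)
assert 0.10 < p < 0.20
```

Resource managers can then keep surveying until this posterior falls below whatever risk threshold the consequences of a false absence justify.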
Mars Sample Handling Protocol Workshop Series: Workshop 4
NASA Technical Reports Server (NTRS)
Race Margaret S. (Editor); DeVincenzi, Donald L. (Editor); Rummel, John D. (Editor); Acevedo, Sara E. (Editor)
2001-01-01
In preparation for missions to Mars that will involve the return of samples to Earth, it will be necessary to prepare for the receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but specific detailed protocols for the handling and testing of returned samples must still be developed. To further refine the requirements for sample hazard testing and to develop the criteria for subsequent release of sample materials from quarantine, the NASA Planetary Protection Officer convened a series of workshops in 2000-2001. The overall objective of the Workshop Series was to produce a Draft Protocol by which returned martian sample materials can be assessed for biological hazards and examined for evidence of life (extant or extinct) while safeguarding the purity of the samples from possible terrestrial contamination. This report also provides a record of the proceedings of Workshop 4, the final Workshop of the Series, which was held in Arlington, Virginia, June 5-7, 2001. During Workshop 4, the sub-groups were provided with a draft of the protocol compiled in May 2001 from the work done at prior Workshops in the Series. Eight sub-groups were then formed to discuss the following assigned topics:
• Review and Assess the Draft Protocol for Physical/Chemical Testing
• Review and Assess the Draft Protocol for Life Detection Testing
• Review and Assess the Draft Protocol for Biohazard Testing
• Environmental and Health/Monitoring and Safety Issues
• Requirements of the Draft Protocol for Facilities and Equipment
• Contingency Planning for Different Outcomes of the Draft Protocol
• Personnel Management Considerations in Implementation of the Draft Protocol
• Draft Protocol Implementation Process and Update Concepts
This report provides the first complete presentation of the Draft Protocol for Mars Sample Handling to meet planetary protection needs.
This Draft Protocol, which was compiled from deliberations and recommendations from earlier Workshops in the Series, represents a consensus that emerged from the discussions of all the sub-groups assembled over the course of the five Workshops of the Series. These discussions converged on a conceptual approach to sample handling, as well as on specific analytical requirements. Discussions also identified important issues requiring attention, as well as research and development needed for protocol implementation.
Dual-view plane illumination microscopy for rapid and spatially isotropic imaging
Kumar, Abhishek; Wu, Yicong; Christensen, Ryan; Chandris, Panagiotis; Gandler, William; McCreedy, Evan; Bokinsky, Alexandra; Colón-Ramos, Daniel A; Bao, Zhirong; McAuliffe, Matthew; Rondeau, Gary; Shroff, Hari
2015-01-01
We describe the construction and use of a compact dual-view inverted selective plane illumination microscope (diSPIM) for time-lapse volumetric (4D) imaging of living samples at subcellular resolution. Our protocol enables a biologist with some prior microscopy experience to assemble a diSPIM from commercially available parts, to align optics and test system performance, to prepare samples, and to control hardware and data processing with our software. Unlike existing light sheet microscopy protocols, our method does not require the sample to be embedded in agarose; instead, samples are prepared conventionally on glass coverslips. Tissue culture cells and Caenorhabditis elegans embryos are used as examples in this protocol; successful implementation of the protocol results in isotropic resolution and acquisition speeds up to several volumes per s on these samples. Assembling and verifying diSPIM performance takes ~6 d, sample preparation and data acquisition take up to 5 d and postprocessing takes 3–8 h, depending on the size of the data. PMID:25299154
NASA Astrophysics Data System (ADS)
Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato
2017-07-01
An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
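The contrast between simple random sampling (hypergeometric statistics) and Bernoulli sampling (binomial statistics) can be illustrated numerically. The toy parameters below are arbitrary, chosen only to show that the binomial tail is the looser of the two in this regime, reflecting the larger variance of sampling with replacement:

```python
from math import comb

def hypergeom_tail(N, K, n, k):
    """P[X >= k]: simple random sampling of n items without replacement
    from N items, of which K are 'marked' (e.g., erroneous)."""
    return sum(comb(K, x) * comb(N - K, n - x)
               for x in range(k, min(K, n) + 1)) / comb(N, n)

def binom_tail(n, p, k):
    """P[X >= k]: Bernoulli sampling, each item marked with probability p."""
    return sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(k, n + 1))

# Arbitrary toy instance: same mean (n*K/N = 2), tail probability
# beyond the mean is smaller without replacement.
N, K, n, k = 100, 10, 20, 4
assert hypergeom_tail(N, K, n, k) < binom_tail(n, K / N, k)
```

In finite-key security proofs, working with the simpler binomial quantities trades a small amount of tightness for a much more concise estimation procedure, which is the spirit of the proposal above.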
Laboratory procedures to generate viral metagenomes.
Thurber, Rebecca V; Haynes, Matthew; Breitbart, Mya; Wegley, Linda; Rohwer, Forest
2009-01-01
This collection of laboratory protocols describes the steps to collect viruses from various samples with the specific aim of generating viral metagenome sequence libraries (viromes). Viral metagenomics, the study of uncultured viral nucleic acid sequences from different biomes, relies on several concentration, purification, extraction, sequencing and heuristic bioinformatic methods. No single technique can provide an all-inclusive approach, and therefore the protocols presented here will be discussed in terms of hypothetical projects. However, care must be taken to individualize each step depending on the source and type of viral particles. This protocol is a description of the processes we have successfully used to: (i) concentrate viral particles from various types of samples, (ii) eliminate contaminating cells and free nucleic acids and (iii) extract, amplify and purify viral nucleic acids. Overall, a sample can be processed to isolate viral nucleic acids suitable for high-throughput sequencing in approximately 1 week.
An optimised protocol for molecular identification of Eimeria from chickens☆
Kumar, Saroj; Garg, Rajat; Moftah, Abdalgader; Clark, Emily L.; Macdonald, Sarah E.; Chaudhry, Abdul S.; Sparagano, Olivier; Banerjee, Partha S.; Kundu, Krishnendu; Tomley, Fiona M.; Blake, Damer P.
2014-01-01
Molecular approaches supporting identification of Eimeria parasites infecting chickens have been available for more than 20 years, although they have largely failed to replace traditional measures such as microscopy and pathology. Limitations of microscopy-led diagnostics, including a requirement for specialist parasitological expertise and low sample throughput, are yet to be outweighed by the difficulties associated with accessing genomic DNA from environmental Eimeria samples. A key step towards the use of Eimeria species-specific PCR as a sensitive and reproducible discriminatory tool for use in the field is the production of a standardised protocol that includes sample collection and DNA template preparation, as well as primer selection from the numerous PCR assays now published. Such a protocol will facilitate development of valuable epidemiological datasets which may be easily compared between studies and laboratories. The outcome of an optimisation process undertaken in laboratories in India and the UK is described here, identifying four steps. First, samples were collected into a 2% (w/v) potassium dichromate solution. Second, oocysts were enriched by flotation in saturated saline. Third, genomic DNA was extracted using a QIAamp DNA Stool mini kit protocol including a mechanical homogenisation step. Finally, nested PCR was carried out using previously published primers targeting the internal transcribed spacer region 1 (ITS-1). Alternative methods tested included sample processing in the presence of faecal material, DNA extraction using a traditional phenol/chloroform protocol, the use of SCAR multiplex PCR (one tube and two tube versions) and speciation using the morphometric tool COCCIMORPH for the first time with field samples. PMID:24138724
Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M
2008-05-01
An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.
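The bleach wash described above is a straightforward C1·V1 = C2·V2 dilution; a 1/5th dilution of the 10.8% stock gives nominally 2% (2.16%) hypochlorite. A minimal dilution calculator, with illustrative volumes not taken from the study:

```python
def dilution_volumes(stock_pct, target_pct, final_volume_ml):
    """Volumes of stock and diluent for a target concentration,
    from the dilution identity C1 * V1 = C2 * V2."""
    v_stock = target_pct * final_volume_ml / stock_pct
    return v_stock, final_volume_ml - v_stock

# 500 mL of ~2.16% wash from 10.8% commercial bleach (a 1/5th dilution):
v_stock, v_water = dilution_volumes(10.8, 2.16, 500.0)
# v_stock = 100.0 mL stock, v_water = 400.0 mL water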
Bobik, Krzysztof; Dunlap, John R.; Burch-Smith, Tessa M.
2014-01-01
Since the 1940s transmission electron microscopy (TEM) has been providing biologists with ultra-high resolution images of biological materials. Yet, because of laborious and time-consuming protocols that also demand experience in preparation of artifact-free samples, TEM is not considered a user-friendly technique. Traditional sample preparation for TEM used chemical fixatives to preserve cellular structures. High-pressure freezing is the cryofixation of biological samples under high pressures to produce very fast cooling rates, thereby restricting ice formation, which is detrimental to the integrity of cellular ultrastructure. High-pressure freezing and freeze substitution are currently the methods of choice for producing the highest quality morphology in resin sections for TEM. These methods minimize the artifacts normally associated with conventional processing for TEM of thin sections. After cryofixation the frozen water in the sample is replaced with liquid organic solvent at low temperatures, a process called freeze substitution. Freeze substitution is typically carried out over several days in dedicated, costly equipment. A recent innovation allows the process to be completed in three hours, instead of the usual two days. This is typically followed by several more days of sample preparation that includes infiltration and embedding in epoxy resins before sectioning. Here we present a protocol combining high-pressure freezing and quick freeze substitution that enables plant sample fixation to be accomplished within hours. The protocol can readily be adapted for working with other tissues or organisms. Plant tissues are of special concern because of the presence of aerated spaces and water-filled vacuoles that impede ice-free freezing of water. In addition, the process of chemical fixation is especially long in plants due to cell walls impeding the penetration of the chemicals to deep within the tissues. 
Plant tissues are therefore particularly challenging, but this protocol is reliable and produces samples of the highest quality. PMID:25350384
Protocol for determining bull trout presence
Peterson, James; Dunham, Jason B.; Howell, Philip; Thurow, Russell; Bonar, Scott
2002-01-01
The Western Division of the American Fisheries Society was requested to develop protocols for determining presence/absence and potential habitat suitability for bull trout. The general approach adopted is similar to the process for the marbled murrelet, whereby interim guidelines are initially used, and the protocols are subsequently refined as data are collected. Current data were considered inadequate to precisely identify suitable habitat but could be useful in stratifying sampling units for presence/absence surveys. The presence/absence protocol builds on previous approaches (Hillman and Platts 1993; Bonar et al. 1997), except it uses the variation in observed bull trout densities instead of a minimum threshold density and adjusts for measured differences in sampling efficiency due to gear types and habitat characteristics. The protocol consists of: 1. recommended sample sizes with 80% and 95% detection probabilities for juvenile and resident adult bull trout for day and night snorkeling and electrofishing adjusted for varying habitat characteristics for 50m and 100m sampling units, 2. sampling design considerations, including possible habitat characteristics for stratification, 3. habitat variables to be measured in the sampling units, and 4. guidelines for training sampling crews. Criteria for habitat strata consist of coarse, watershed-scale characteristics (e.g., mean annual air temperature) and fine-scale, reach and habitat-specific features (e.g., water temperature, channel width). The protocols will be revised in the future using data from ongoing presence/absence surveys, additional research on sampling efficiencies, and development of models of habitat/species occurrence.
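The sample-size logic behind such detection-probability targets can be sketched with the standard occupancy identity 1 - (1 - p)^n: how many independent sampling units are needed before the cumulative chance of at least one detection reaches the target. The per-unit detection probability below is hypothetical, not a value from the protocol.

```python
import math

def units_for_detection(p_single, target):
    """Number of independent sampling units needed so that the overall
    detection probability reaches `target`, given per-unit detection
    probability `p_single`; solves 1 - (1 - p)^n >= target for n."""
    return math.ceil(math.log(1 - target) / math.log(1 - p_single))

# e.g. assuming a 30% chance of detecting bull trout in any one unit:
units_for_detection(0.30, 0.80)  # -> 5 units for 80% detection
units_for_detection(0.30, 0.95)  # -> 9 units for 95% detection
```

The protocol's tabulated sample sizes additionally adjust p_single for gear type and habitat, which is the part this one-line identity does not capture.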
Evaluating mixed samples as a source of error in non-invasive genetic studies using microsatellites
Roon, David A.; Thomas, M.E.; Kendall, K.C.; Waits, L.P.
2005-01-01
The use of noninvasive genetic sampling (NGS) for surveying wild populations is increasing rapidly. Currently, only a limited number of studies have evaluated potential biases associated with NGS. This paper evaluates the potential errors associated with analysing mixed samples drawn from multiple animals. Most NGS studies assume that mixed samples will be identified and removed during the genotyping process. We evaluated this assumption by creating 128 mixed samples of extracted DNA from brown bear (Ursus arctos) hair samples. These mixed samples were genotyped and screened for errors at six microsatellite loci according to protocols consistent with those used in other NGS studies. Five mixed samples produced acceptable genotypes after the first screening. However, all mixed samples produced multiple alleles at one or more loci, amplified as only one of the source samples, or yielded inconsistent electropherograms by the final stage of the error-checking process. These processes could potentially reduce the number of individuals observed in NGS studies, but errors should be conservative within demographic estimates. Researchers should be aware of the potential for mixed samples and carefully design gel analysis criteria and error checking protocols to detect mixed samples.
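The primary screening signal the study relies on, more than two alleles at any diploid microsatellite locus, is easy to express as a check over genotyping results. Locus names and allele sizes below are hypothetical, for illustration only:

```python
def flag_mixed(genotype):
    """Return the loci showing more than two alleles: for diploid loci
    this is the primary signal that a noninvasively collected sample
    contains DNA from more than one individual."""
    return [locus for locus, alleles in genotype.items() if len(set(alleles)) > 2]

sample = {
    "G10B": {180, 184},        # two alleles: consistent with one bear
    "G10L": {142, 150, 156},   # three alleles: indicates a mixture
}
flag_mixed(sample)  # -> ["G10L"]
```

As the abstract notes, mixtures can also amplify as only one source sample, so an empty result from a check like this does not rule out mixing.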
Environmental DNA sampling protocol - filtering water to capture DNA from aquatic organisms
Laramie, Matthew B.; Pilliod, David S.; Goldberg, Caren S.; Strickler, Katherine M.
2015-09-29
Environmental DNA (eDNA) analysis is an effective method of determining the presence of aquatic organisms such as fish, amphibians, and other taxa. This publication is meant to guide researchers and managers in the collection, concentration, and preservation of eDNA samples from lentic and lotic systems. A sampling workflow diagram and three sampling protocols are included as well as a list of suggested supplies. Protocols include filter and pump assembly using: (1) a hand-driven vacuum pump, ideal for sample collection in remote sampling locations where no electricity is available and when equipment weight is a primary concern; (2) a peristaltic pump powered by a rechargeable battery-operated driver/drill, suitable for remote sampling locations when weight consideration is less of a concern; (3) a 120-volt alternating current (AC) powered peristaltic pump suitable for any location where 120-volt AC power is accessible, or for roadside sampling locations. Images and detailed descriptions are provided for each step in the sampling and preservation process.
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
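The ~16 nM limit of detection reported above can be contextualized with a common way of estimating an LOD from calibration data, LOD = 3.3·σ(blank)/slope. This is a generic ICH-style estimate, not necessarily how the authors computed their figure, and the blank readings and slope below are invented for illustration:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """Generic LOD estimate: 3.3 * sd(blank signals) / calibration slope."""
    return 3.3 * statistics.stdev(blank_signals) / slope

# hypothetical blank fluorescence readings and calibration slope (signal/nM):
limit_of_detection([1.2, 1.4, 1.1, 1.3, 1.2], slope=0.024)  # ~16 nM scale
```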
Preparing Protocols for Institutional Review Boards.
ERIC Educational Resources Information Center
Lyons, Charles M.
1983-01-01
Introduces the process by which Institutional Review Boards (IRBs) review proposals for research involving human subjects. Describes the composition of IRBs. Presents the Nuremberg code, the elements of informed consent, the judging criteria for proposals, and a sample protocol format. References newly published regulations governing research with…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, Jordan S.; Khosravani, Ali; Castillo, Andrew
2016-06-14
Recent spherical nanoindentation protocols have proven robust at capturing the local elastic-plastic response of polycrystalline metal samples at length scales much smaller than the grain size. In this work, we extend these protocols to length scales that include multiple grains to recover microindentation stress-strain curves. These new protocols are first established in this paper and then demonstrated for Al-6061 by comparing the measured indentation stress-strain curves with the corresponding measurements from uniaxial tension tests. More specifically, the scaling factor between the uniaxial yield strength and the indentation yield strength was determined to be about 1.9, which is significantly lower than the value of 2.8 used commonly in the literature. Furthermore, the reasons for this difference are discussed. Second, the benefits of these new protocols in facilitating high throughput exploration of process-property relationships are demonstrated through a simple case study.
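The practical consequence of the scaling-factor finding is a simple division when converting indentation measurements to uniaxial estimates. A minimal sketch, using a hypothetical indentation yield strength (the 520 MPa value is illustrative, not from the study):

```python
def uniaxial_yield_from_indentation(y_ind_mpa, scaling=1.9):
    """Estimate uniaxial yield strength from an indentation yield strength.
    The study found Y_ind / Y_uniaxial of about 1.9 for Al-6061 at
    multi-grain length scales, versus the ~2.8 commonly assumed."""
    return y_ind_mpa / scaling

y_ind = 520.0  # hypothetical indentation yield strength, MPa
uniaxial_yield_from_indentation(y_ind)        # estimate with the 1.9 factor
uniaxial_yield_from_indentation(y_ind, 2.8)   # lower estimate with the legacy 2.8
```

The gap between the two estimates shows why the choice of scaling factor matters for any property extracted this way.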
Maturo, Donna; Powell, Alexis; Major-Wilson, Hannah; Sanchez, Kenia; De Santis, Joseph P; Friedman, Lawrence B
2015-01-01
Advances in care and treatment of adolescents/young adults with HIV infection have made survival into adulthood possible, requiring transition to adult care. Researchers have documented that the transition process is challenging for adolescents/young adults. To ensure successful transition, a formal transition protocol is needed. Despite existing research, little quantitative evaluation of the transition process has been conducted. The purpose of the study was to pilot test the "Movin' Out" Transitioning Protocol, a formalized protocol developed to assist transition to adult care. A retrospective medical/nursing record review was conducted with 38 clients enrolled in the "Movin' Out" Transitioning Protocol at a university-based adolescent medicine clinic providing care to adolescents/young adults with HIV infection. Almost half of the participants were able to successfully transition to adult care. Reasons for failure to transition included relocation, attrition, lost to follow-up, and transfer to another adult service. Failure to transition to adult care was not related to adherence issues, χ²(1, N=38)=2.49, p=.288; substance use, χ²(1, N=38)=1.71, p=.474; mental health issues, χ²(1, N=38)=2.23, p=.322; or pregnancy/childrearing, χ²(1, N=38)=0.00, p=.627. Despite the small sample size, the "Movin' Out" Transitioning Protocol appears to be useful in guiding the transition process of adolescents/young adults with HIV infection to adult care. More research is needed with a larger sample to fully evaluate the "Movin' Out" Transitioning Protocol. Copyright © 2015 Elsevier Inc. All rights reserved.
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
A simplified protocol for molecular identification of Eimeria species in field samples.
Haug, Anita; Thebo, Per; Mattsson, Jens G
2007-05-15
This study aimed to find a fast, sensitive and efficient protocol for molecular identification of chicken Eimeria spp. in field samples. Various methods for each of the three steps of the protocol were evaluated: oocyst wall rupturing methods, DNA extraction methods, and identification of species-specific DNA sequences by PCR. We then compared and evaluated five complete protocols. Three series of oocyst suspensions of known number of oocysts from Eimeria mitis, Eimeria praecox, Eimeria maxima and Eimeria tenella were prepared and ground using glass beads or mini-pestle. DNA was extracted from ruptured oocysts using commercial systems (GeneReleaser, Qiagen Stoolkit and Prepman) or phenol-chloroform DNA extraction, followed by identification of species-specific ITS-1 sequences by optimised single species PCR assays. The Stoolkit and Prepman protocols showed insufficient repeatability, and the former was also expensive and relatively time-consuming. In contrast, both the GeneReleaser protocol and phenol-chloroform protocols were robust and sensitive, detecting less than 0.4 oocysts of each species per PCR. Finally, we evaluated our new protocol on 68 coccidia positive field samples. Our data suggest that rupturing the oocysts by mini-pestle grinding, preparing the DNA with GeneReleaser, followed by optimised single species PCR assays, makes a robust and sensitive procedure for identifying chicken Eimeria species in field samples. Importantly, it also provides minimal hands-on time in the pre-PCR process, lower contamination risk and no handling of toxic chemicals.
Influenza A Virus Isolation, Culture and Identification
Eisfeld, Amie J.; Neumann, Gabriele; Kawaoka, Yoshihiro
2017-01-01
Influenza A viruses (IAV) cause epidemics and pandemics that result in considerable financial burden and loss of human life. To manage annual IAV epidemics and prepare for future pandemics, improved understanding of how IAVs emerge, transmit, cause disease, and acquire pandemic potential is urgently needed. Fundamental techniques essential for procuring such knowledge are IAV isolation and culture from experimental and surveillance samples. Here, we present a detailed protocol for IAV sample collection and processing, amplification in chicken eggs and mammalian cells, and identification from samples containing unknown pathogens. This protocol is robust, and allows for generation of virus cultures that can be used for downstream analyses. Once experimental or surveillance samples are obtained, virus cultures can be generated and the presence of IAV can be verified in 3–5 days. Increased time-frames may be required for less experienced laboratory personnel, or when large numbers of samples will be processed. PMID:25321410
Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Su, Gloria H; Emmert-Buck, Michael R
2005-02-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This article reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies, and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing, and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high quality, appropriately anatomically tagged scientific results. A lack of optimized protocols is a source of inefficiency in current life science research. Improvement in this area will significantly increase life science quality and productivity. The article is divided into introduction, materials, protocols, and notes sections. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. 
To get the greatest benefit from this article, readers are advised to read through the entire article first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.
Immune system changes during simulated planetary exploration on Devon Island, high arctic
Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence
2007-01-01
Background: Dysregulation of the immune system has been shown to occur during spaceflight, although the detailed nature of the phenomenon and the clinical risks for exploration class missions have yet to be established. Also, the growing clinical significance of immune system evaluation combined with epidemic infectious disease rates in third world countries provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002 NASA performed a comprehensive immune assessment on field team members participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate the effect of mission-associated stressors on the human immune system. To perform the study, the development of techniques for processing immune samples in remote field locations was required. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, whole-blood mitogenic culture for functional assessments and cell-sample preservation on-location at Devon Island. Specific assays included peripheral leukocyte distribution; constitutively activated T cells, intracellular cytokine profiles, plasma cortisol and EBV viral antibody levels. Study timepoints were 30 days prior to mission start, mid-mission and 60 days after mission completion. Results: The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed on Devon Island, and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in astronauts following spaceflight. 
Conclusion: The immune system changes described during the HMP field deployment validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology. The sample processing protocol developed for this study may have applications for immune studies in remote terrestrial field locations. Elements of this protocol could possibly be adapted for future in-flight immunology studies conducted during space missions. PMID:17521440
Zboromyrska, Y; Rubio, E; Alejo, I; Vergara, A; Mons, A; Campo, I; Bosch, J; Marco, F; Vila, J
2016-06-01
The current gold standard method for the diagnosis of urinary tract infections (UTI) is urine culture that requires 18-48 h for the identification of the causative microorganisms and an additional 24 h until the results of antimicrobial susceptibility testing (AST) are available. The aim of this study was to shorten the time of urine sample processing by a combination of flow cytometry for screening and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for bacterial identification followed by AST directly from urine. The study was divided into two parts. During the first part, 675 urine samples were processed by a flow cytometry device and a cut-off value of bacterial count was determined to select samples for direct identification by MALDI-TOF-MS at ≥5 × 10^6 bacteria/mL. During the second part, 163 of 1029 processed samples reached the cut-off value. The sample preparation protocol for direct identification included two centrifugation and two washing steps. Direct AST was performed by the disc diffusion method if a reliable direct identification was obtained. Direct MALDI-TOF-MS identification was performed in 140 urine samples; 125 of the samples were positive by urine culture, 12 were contaminated and 3 were negative. Reliable direct identification was obtained in 108 (86.4%) of the 125 positive samples. AST was performed in 102 identified samples, and the results were fully concordant with the routine method among 83 monomicrobial infections. In conclusion, the turnaround time of the protocol described to diagnose UTI was about 1 h for microbial identification and 18-24 h for AST. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
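The screening step above amounts to a triage rule on the flow-cytometry bacterial count. A minimal sketch of that routing logic, using the study's cut-off (the function name and return labels are illustrative):

```python
CUTOFF = 5e6  # bacteria/mL: the flow-cytometry screening cut-off from the study

def triage(bacteria_per_ml):
    """Route a urine sample: counts at or above the cut-off go to direct
    MALDI-TOF-MS identification (and, when identification succeeds,
    direct disc-diffusion AST); the rest follow routine culture."""
    return "direct MALDI-TOF-MS" if bacteria_per_ml >= CUTOFF else "routine culture"

triage(8.2e6)  # -> "direct MALDI-TOF-MS"
triage(3.0e5)  # -> "routine culture"
```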
A Draft Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth
NASA Technical Reports Server (NTRS)
Viso, M.; DeVincenzi, D. L.; Race, M. S.; Schad, P. J.; Stabekis, P. D.; Acevedo, S. E.; Rummel, J. D.
2002-01-01
In preparation for missions to Mars that will involve the return of samples, it is necessary to prepare for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth. Previous groups and committees have studied selected aspects of sample return activities, but a specific protocol for handling and testing of returned samples from Mars remained to be developed. To refine the requirements for Mars sample hazard testing and to develop criteria for the subsequent release of sample materials from precautionary containment, the NASA Planetary Protection Officer, working in collaboration with CNES, convened a series of workshops to produce a Protocol by which returned martian sample materials could be assessed for biological hazards and examined for evidence of life (extant or extinct), while safeguarding the samples from possible terrestrial contamination. The Draft Protocol was then reviewed by an Oversight and Review Committee formed specifically for that purpose and composed of senior scientists. In order to preserve the scientific value of returned martian samples under safe conditions, while avoiding false indications of life within the samples, the Sample Receiving Facility (SRF) is required to allow handling and processing of the Mars samples to prevent their terrestrial contamination while maintaining strict biological containment. It is anticipated that samples will be able to be shipped among appropriate containment facilities wherever necessary, under procedures developed in cooperation with appropriate international institutions. The SRF will need to provide different types of laboratory environments for carrying out, beyond sample description and curation, the various aspects of the protocol: Physical/Chemical analysis, Life Detection testing, and Biohazard testing. 
The main principle of these tests will be described and the criteria for release will be discussed, as well as the requirements for the SRF and its personnel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Jianying; Dann, Geoffrey P.; Shi, Tujin
2012-03-10
Sodium dodecyl sulfate (SDS) is one of the most popular laboratory reagents used for highly efficient biological sample extraction; however, SDS presents a significant challenge to LC-MS-based proteomic analyses due to its severe interference with reversed-phase LC separations and electrospray ionization interfaces. This study reports a simple SDS-assisted proteomic sample preparation method facilitated by a novel peptide-level SDS removal protocol. After SDS-assisted protein extraction and digestion, SDS was effectively (>99.9%) removed from peptides through ion substitution-mediated DS- precipitation with potassium chloride (KCl) followed by ~10 min centrifugation. Excellent peptide recovery (>95%) was observed for less than 20 μg of peptides. Further experiments demonstrated the compatibility of this protocol with LC-MS/MS analyses. The resulting proteome coverage from this SDS-assisted protocol was comparable to or better than those obtained from other standard proteomic preparation methods in both mammalian tissues and bacterial samples. These results suggest that this SDS-assisted protocol is a practical, simple, and broadly applicable proteomic sample processing method, which can be particularly useful when dealing with samples difficult to solubilize by other methods.
Erramuzpe, Asier; Cortés, Jesús M; López, José I
2018-02-01
Intratumor heterogeneity (ITH) is an inherent process of tumor development that has received much attention in recent years, as it has become a major obstacle for the success of targeted therapies. ITH is also temporally unpredictable across tumor evolution, which makes its precise characterization even more problematic, since detection success depends on the temporal snapshot at which ITH is analyzed. New and more efficient strategies for tumor sampling are needed to overcome these difficulties; current strategies rely entirely on the pathologist's interpretation. Recently, we showed that a new strategy, multisite tumor sampling, works better than the routine sampling protocol for ITH detection when tumor time evolution is not taken into consideration. Here, we extend this work and compare ITH detection by multisite tumor sampling and routine sampling protocols across tumor time evolution; in particular, we provide in silico analyses of both strategies at early and late temporal stages for four different models of tumor evolution (linear, branched, neutral, and punctuated). Our results indicate that multisite tumor sampling outperforms routine protocols in detecting ITH at all temporal stages of tumor evolution. We conclude that multisite tumor sampling is more advantageous than routine protocols in detecting intratumor heterogeneity.
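The intuition behind the in silico comparison can be caricatured with a toy Monte Carlo in which a subclone occupies a contiguous region of a one-dimensional tumor, "routine" sampling takes adjacent samples from one block, and "multisite" sampling spreads them at random (all sizes and counts here are invented for illustration and are not the paper's models):

```python
import random

def detects(positions, clone_start, clone_len):
    """A subclone is detected if any sample lands inside its region."""
    return any(clone_start <= p < clone_start + clone_len for p in positions)

def detection_rate(strategy, n_trials=2000, tumor=100, n_samples=4,
                   clone_len=10, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        clone_start = rng.randrange(tumor - clone_len)
        if strategy == "routine":
            # clustered: n adjacent samples from a single block of the tumor
            base = rng.randrange(tumor - n_samples)
            positions = [base + i for i in range(n_samples)]
        else:
            # multisite: samples spread independently across the tumor
            positions = [rng.randrange(tumor) for _ in range(n_samples)]
        hits += detects(positions, clone_start, clone_len)
    return hits / n_trials
```

With these toy numbers the spread-out strategy detects the subclone substantially more often than the clustered one, mirroring the qualitative conclusion above.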
A Protocol for Collecting Human Cardiac Tissue for Research.
Blair, Cheavar A; Haynes, Premi; Campbell, Stuart G; Chung, Charles; Mitov, Mihail I; Dennis, Donna; Bonnell, Mark R; Hoopes, Charles W; Guglin, Maya; Campbell, Kenneth S
2016-01-01
This manuscript describes a protocol at the University of Kentucky that allows a translational research team to collect human myocardium that can be used for biological research. We have gained a great deal of practical experience since we started this protocol in 2008, and we hope that other groups might be able to learn from our endeavors. To date, we have procured ~4000 samples from ~230 patients. The tissue that we collect comes from organ donors and from patients who are receiving a heart transplant or a ventricular assist device because they have heart failure. We begin our manuscript by describing the importance of human samples in cardiac research. Subsequently, we describe the process for obtaining consent from patients, the cost of running the protocol, and some of the issues and practical difficulties that we have encountered. We conclude with some suggestions for other researchers who may be considering starting a similar protocol.
Comparison of human umbilical cord blood processing with or without hydroxyethyl starch.
Souri, Milad; Nikougoftar Zarif, Mahin; Rasouli, Mahboobeh; Golzadeh, Khadijeh; Nakhlestani Hagh, Mozhdeh; Ezzati, Nasim; Atarodi, Kamran
2017-11-01
Umbilical cord blood (UCB) processing with hydroxyethyl starch (HES) is the most common protocol in cord blood banks. The quality of UCB volume reduction is guaranteed by minimum manipulation of cord blood samples in a closed system. This study aimed to analyze and compare cell recovery and viability of UCB processed using the Sepax automated system in the presence and absence of HES. Thirty UCB bags with a total nucleated cell (TNC) count of more than 2.5 × 10⁹ were each divided into two bags of equal volume. HES solution was added to one bag, while the other was left untreated. Both bags were processed with the Sepax. To determine cell recovery, viability, and the potential of colony-forming cells (CFCs), preprocessing, postprocessing, and post-thaw samples were analyzed. The mean TNC recovery after processing and after thaw was significantly better with the HES method (p < 0.01 for the postprocessing step and p < 0.05 for the post-thaw step). There were no significant differences in mononuclear cell (MNC) and CD34+ cell recovery between the two methods after processing and after thaw. TNC and MNC viability was significantly higher without HES after processing and after thaw (p < 0.01). The results of the CFC assay were similar for both methods after processing and after thaw. These results suggest that processing of UCB using the Sepax system with the HES-free protocol, which entails less manipulation of the samples, could be used as an eligible protocol for reducing the volume of UCB.
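The recovery figures compared above are simple ratios of post- to pre-processing counts; a minimal helper (the example counts below are hypothetical, not the study's data):

```python
def percent_recovery(pre_count, post_count):
    """Cell recovery across a processing step, as a percentage of the
    pre-processing count (applies equally to TNC, MNC, or CD34+ counts)."""
    return 100.0 * post_count / pre_count
```

For instance, a bag entering processing with 2.5 × 10⁹ TNCs and leaving with 2.0 × 10⁹ has an 80% TNC recovery.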
Collecting, archiving and processing DNA from wildlife samples using FTA® databasing paper
Smith, LM; Burgoyne, LA
2004-01-01
Background Methods involving the analysis of nucleic acids have become widespread in the fields of traditional biology and ecology, however the storage and transport of samples collected in the field to the laboratory in such a manner to allow purification of intact nucleic acids can prove problematical. Results FTA® databasing paper is widely used in human forensic analysis for the storage of biological samples and for purification of nucleic acids. The possible uses of FTA® databasing paper in the purification of DNA from samples of wildlife origin were examined, with particular reference to problems expected due to the nature of samples of wildlife origin. The processing of blood and tissue samples, the possibility of excess DNA in blood samples due to nucleated erythrocytes, and the analysis of degraded samples were all examined, as was the question of long term storage of blood samples on FTA® paper. Examples of the end use of the purified DNA are given for all protocols and the rationale behind the processing procedures is also explained to allow the end user to adjust the protocols as required. Conclusions FTA® paper is eminently suitable for collection of, and purification of nucleic acids from, biological samples from a wide range of wildlife species. This technology makes the collection and storage of such samples much simpler. PMID:15072582
Paoletti, Claudia; Esbensen, Kim H
2015-01-01
Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for the assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of nonrandom distribution within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked, as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating the acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS) and practically tested over 60 years provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity (Kernel Lot Distribution Assessment) are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for the development of valid sampling protocols free from distributional constraints.
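TOS quantifies the irreducible part of the material error through Gy's formula for the relative variance of the fundamental sampling error; a sketch (the sampling constant and particle size used in the test are arbitrary illustrative values, not from this study):

```python
def fundamental_sampling_variance(sampling_constant, top_particle_cm,
                                  sample_mass_g, lot_mass_g):
    """Gy's formula for the relative variance of the fundamental sampling
    error: sigma^2 = C * d^3 * (1/Ms - 1/ML), where C is a
    material-dependent sampling constant (g/cm^3) and d is the top
    particle size (cm). Larger sample mass Ms reduces the variance."""
    return (sampling_constant * top_particle_cm ** 3
            * (1.0 / sample_mass_g - 1.0 / lot_mass_g))
```

The formula makes the central TOS point explicit: for a heterogeneous lot, only a larger (or finer-ground) sample, not a luckier one, reduces the sampling variance.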
A review of blood sample handling and pre-processing for metabolomics studies.
Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta
2017-09-01
Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly in untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exist fundamental needs to consider the pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design, sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and the reduction of undesirable variation.
Optimized Setup and Protocol for Magnetic Domain Imaging with In Situ Hysteresis Measurement.
Liu, Jun; Wilson, John; Davis, Claire; Peyton, Anthony
2017-11-07
This paper elaborates the sample preparation protocols required to obtain optimal domain patterns using the Bitter method, focusing on the extra steps compared to standard metallographic sample preparation procedures. The paper proposes a novel bespoke rig for dynamic domain imaging with in situ BH (magnetic hysteresis) measurements and elaborates the protocols for the sensor preparation and the use of the rig to ensure accurate BH measurement. The protocols for static and ordinary dynamic domain imaging (without in situ BH measurements) are also presented. The reported method takes advantage of the convenience and high sensitivity of the traditional Bitter method and enables in situ BH measurement without interrupting or interfering with the domain wall movement processes. This facilitates establishing a direct and quantitative link between the interactions of domain wall movement processes with microstructural features in ferritic steels and their BH loops. This method is anticipated to become a useful tool for the fundamental study of microstructure-magnetic property relationships in steels and to help interpret electromagnetic sensor signals for non-destructive evaluation of steel microstructures.
Rozenberg, Andrey; Leese, Florian; Weiss, Linda C; Tollrian, Ralph
2016-01-01
Tag-Seq is a high-throughput approach used for discovering SNPs and characterizing gene expression. In comparison to RNA-Seq, Tag-Seq eases data processing and allows detection of rare mRNA species using only one tag per transcript molecule. However, reduced library complexity raises the issue of PCR duplicates, which distort gene expression levels. Here we present a novel Tag-Seq protocol that uses the least biased methods for RNA library preparation combined with a novel approach for joint PCR template and sample labeling. In our protocol, input RNA is fragmented by hydrolysis, and poly(A)-bearing RNAs are selected and directly ligated to mixed DNA-RNA P5 adapters. The P5 adapters contain i5 barcodes composed of sample-specific (moderately) degenerate base regions (mDBRs), which later allow detection of PCR duplicates. The P7 adapter is attached via reverse transcription with individual i7 barcodes added during the amplification step. The resulting libraries can be sequenced on an Illumina sequencer. After sample demultiplexing and PCR duplicate removal with a free software tool we designed, the data are ready for downstream analysis. Our protocol was tested on RNA samples from predator-induced and control Daphnia microcrustaceans.
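The role of the degenerate i5 regions (mDBRs) can be illustrated with a toy deduplication pass: reads that share both the same mDBR read-out and the same insert sequence are taken to be PCR copies of a single template and collapsed to one (this mimics the idea only; the authors' own software tool implements it on real sequencing data):

```python
def remove_pcr_duplicates(reads):
    """Collapse PCR duplicates among (mDBR, insert_sequence) read pairs.

    Two reads from independent mRNA molecules are very unlikely to carry
    both the same degenerate-region sequence AND the same insert, so such
    pairs are treated as amplification copies and only one is kept.
    """
    seen = set()
    unique = []
    for mdbr, insert_seq in reads:
        key = (mdbr, insert_seq)
        if key not in seen:
            seen.add(key)
            unique.append((mdbr, insert_seq))
    return unique
```

Reads with identical inserts but different mDBRs survive, preserving genuine expression signal while removing amplification bias.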
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormone analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS-based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples.
Rodriguez-Canales, Jaime; Hanson, Jeffrey C; Hipp, Jason D; Balis, Ulysses J; Tangrea, Michael A; Emmert-Buck, Michael R; Bova, G Steven
2013-01-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of any tissue-based biological phenomenon. This updated chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification and quantification. We provide a detailed comparison of some current tissue microdissection technologies and provide detailed example protocols for tissue component handling upstream and downstream from microdissection. We also discuss some of the physical and chemical issues related to optimal tissue processing and include methods specific to cytology specimens. We encourage each laboratory to use these as a starting point for optimization of their overall process of moving from collected tissue to high-quality, appropriately anatomically tagged scientific results. Improvement in this area will significantly increase life science quality and productivity. The chapter is divided into introduction, materials, protocols, and notes subheadings. Because many protocols are covered in each of these sections, information relating to a single protocol is not contiguous. 
To get the greatest benefit from this chapter, readers are advised to read through the entire chapter first, identify protocols appropriate to their laboratory for each step in their workflow, and then reread entries in each section pertaining to each of these single protocols.
Modeling unobserved sources of heterogeneity in animal abundance using a Dirichlet process prior
Dorazio, R.M.; Mukherjee, B.; Zhang, L.; Ghosh, M.; Jelks, H.L.; Jordan, F.
2008-01-01
In surveys of natural populations of animals, a sampling protocol is often spatially replicated to collect a representative sample of the population. In these surveys, differences in abundance of animals among sample locations may induce spatial heterogeneity in the counts associated with a particular sampling protocol. For some species, the sources of heterogeneity in abundance may be unknown or unmeasurable, leading one to specify the variation in abundance among sample locations stochastically. However, choosing a parametric model for the distribution of unmeasured heterogeneity is potentially subject to error and can have profound effects on predictions of abundance at unsampled locations. In this article, we develop an alternative approach wherein a Dirichlet process prior is assumed for the distribution of latent abundances. This approach allows for uncertainty in model specification and for natural clustering in the distribution of abundances in a data-adaptive way. We apply this approach in an analysis of counts based on removal samples of an endangered fish species, the Okaloosa darter. Results of our data analysis and simulation studies suggest that our implementation of the Dirichlet process prior has several attractive features not shared by conventional, fully parametric alternatives.
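A minimal stick-breaking sketch shows how a Dirichlet process prior induces clustering of latent site abundances (the gamma base measure and all parameter values are illustrative choices, not those fitted in the paper):

```python
import random

def stick_breaking(alpha, n_atoms, rng):
    """Truncated stick-breaking weights for a DP with concentration alpha."""
    weights, remaining = [], 1.0
    for _ in range(n_atoms):
        piece = rng.betavariate(1.0, alpha)
        weights.append(piece * remaining)
        remaining *= 1.0 - piece
    return weights

def dp_site_abundances(n_sites, alpha=1.0, n_atoms=50, seed=7):
    """Draw latent mean abundances for n_sites locations from a
    (truncated) DP whose base measure is a gamma distribution."""
    rng = random.Random(seed)
    weights = stick_breaking(alpha, n_atoms, rng)
    atoms = [rng.gammavariate(2.0, 5.0) for _ in range(n_atoms)]
    return [rng.choices(atoms, weights=weights)[0] for _ in range(n_sites)]
```

Because many sites share the same atom, the draws cluster naturally, which is exactly the data-adaptive grouping of abundances the abstract describes.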
Microfluidic, marker-free isolation of circulating tumor cells from blood samples
Karabacak, Nezihi Murat; Spuhler, Philipp S; Fachin, Fabio; Lim, Eugene J; Pai, Vincent; Ozkumur, Emre; Martel, Joseph M; Kojic, Nikola; Smith, Kyle; Chen, Pin-i; Yang, Jennifer; Hwang, Henry; Morgan, Bailey; Trautwein, Julie; Barber, Thomas A; Stott, Shannon L; Maheswaran, Shyamala; Kapur, Ravi; Haber, Daniel A; Toner, Mehmet
2014-01-01
The ability to isolate and analyze rare circulating tumor cells (CTCs) has the potential to further our understanding of cancer metastasis and enhance the care of cancer patients. In this protocol, we describe the procedure for isolating rare CTCs from blood samples by using tumor antigen–independent microfluidic CTC-iChip technology. The CTC-iChip uses deterministic lateral displacement, inertial focusing and magnetophoresis to sort up to 10⁷ cells/s. By using two-stage magnetophoresis and depletion antibodies against leukocytes, we achieve 3.8-log depletion of white blood cells and a 97% yield of rare cells with a sample processing rate of 8 ml of whole blood/h. The CTC-iChip is compatible with standard cytopathological and RNA-based characterization methods. This protocol describes device production, assembly, blood sample preparation, system setup and the CTC isolation process. Sorting 8 ml of blood sample requires 2 h including setup time, and chip production requires 2–5 d. PMID:24577360
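The headline numbers translate directly into cell-count arithmetic; a sketch (the input white-cell concentration of 5 × 10⁶/ml is a typical-value assumption, not from the paper):

```python
import math

def log_depletion(cells_in, cells_out):
    """Depletion in log10 units, the convention behind '3.8-log depletion'."""
    return math.log10(cells_in / cells_out)

def residual_cells(volume_ml, cells_per_ml, depletion_logs):
    """Cells remaining after a given log-scale depletion."""
    return volume_ml * cells_per_ml / 10 ** depletion_logs
```

For 8 ml of blood at an assumed 5 × 10⁶ leukocytes/ml, a 3.8-log depletion leaves on the order of a few thousand leukocytes alongside the recovered rare cells.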
Nieto, Sonia; Dragna, Justin M.; Anslyn, Eric V.
2010-01-01
A protocol for the rapid determination of the absolute configuration and enantiomeric excess of α-chiral primary amines, with potential applications in asymmetric reaction discovery, has been developed. The protocol requires derivatization of α-chiral primary amines via condensation with pyridine carboxaldehyde to quantitatively yield the corresponding imine. Binding of the imine to the Cu(I) complex of 2,2'-bis(diphenylphosphino)-1,1'-binaphthyl (BINAP-CuI) yields a metal-to-ligand charge-transfer (MLCT) band in the visible region of the circular dichroism spectrum. Diastereomeric host-guest complexes give CD signals of the same sign but different amplitudes, allowing for differentiation of enantiomers. Processing the primary optical data from the CD spectrum with linear discriminant analysis (LDA) allows for the determination of absolute configuration and identification of the amines, and processing with a supervised multi-layer perceptron artificial neural network (MLP-ANN) allows for the simultaneous determination of ee and concentration. The primary optical data necessary to determine the ee of unknown samples are obtained in 2 minutes per sample. To demonstrate the utility of the protocol in asymmetric reaction discovery, the ee's and concentrations for an asymmetric metal-catalyzed reaction are determined. The potential of the protocol's application in high-throughput screening (HTS) of ee is discussed. PMID:19946914
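The discriminant step can be sketched in miniature with a hand-rolled two-class Fisher LDA on two hypothetical CD amplitude features (the published workflow uses full multi-wavelength CD data and standard LDA software; everything below is an invented toy):

```python
def fisher_lda_2d(class0, class1):
    """Fisher's linear discriminant for two classes of 2-D points:
    w = S_w^-1 (m1 - m0), with S_w the pooled within-class scatter."""
    m0 = [sum(c) / len(class0) for c in zip(*class0)]
    m1 = [sum(c) / len(class1) for c in zip(*class1)]
    S = [[0.0, 0.0], [0.0, 0.0]]
    for data, m in ((class0, m0), (class1, m1)):
        for p in data:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    S[i][j] += d[i] * d[j]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [Sinv[0][0] * dm[0] + Sinv[0][1] * dm[1],
         Sinv[1][0] * dm[0] + Sinv[1][1] * dm[1]]
    return w, m0, m1

def classify(point, w, m0, m1):
    """Assign 1 if the point projects past the class midpoint toward class 1."""
    mid = [(m0[0] + m1[0]) / 2, (m0[1] + m1[1]) / 2]
    score = w[0] * (point[0] - mid[0]) + w[1] * (point[1] - mid[1])
    return 1 if score > 0 else 0
```

Training on CD amplitudes of samples with known configuration and then projecting an unknown sample onto w is the essence of using LDA to call absolute configuration.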
Field Immune Assessment during Simulated Planetary Exploration in the Canadian Arctic
NASA Technical Reports Server (NTRS)
Crucian, Brian; Lee, Pascal; Stowe, Raymond; Jones, Jeff; Effenhauser, Rainer; Widen, Raymond; Sams, Clarence
2006-01-01
Dysregulation of the immune system has been shown to occur during space flight, although the detailed nature of the phenomenon and the clinical risks for exploration-class missions have yet to be established. In addition, the growing clinical significance of immune system evaluation, combined with epidemic infectious disease rates in third world countries, provides a strong rationale for the development of field-compatible clinical immunology techniques and equipment. In July 2002, NASA performed a comprehensive field immunology assessment on crewmembers participating in the Haughton-Mars Project (HMP) on Devon Island in the high Canadian Arctic. The purpose of the study was to evaluate mission-associated effects on the human immune system, as well as to evaluate techniques developed for processing immune samples in remote field locations. Ten HMP-2002 participants volunteered for the study. A field protocol was developed at NASA-JSC for performing sample collection, blood staining/processing for immunophenotype analysis, whole-blood mitogenic culture for functional assessments, and cell-sample preservation on location at Devon Island. Specific assays included peripheral leukocyte distribution, constitutively activated T cells, intracellular cytokine profiles, and plasma EBV viral antibody levels. Study timepoints were L-30, midmission and R+60. The protocol developed for immune sample processing in remote field locations functioned properly. Samples were processed in the field location and stabilized for subsequent analysis at the Johnson Space Center in Houston. The data indicated that some phenotype, immune function and stress hormone changes occurred in the HMP field participants that were largely distinct from pre-mission baseline and post-mission recovery data. These immune changes appear similar to those observed in astronauts following spaceflight.
The sample processing protocol developed for this study may have applications for immune assessment during exploration-class space missions or in remote terrestrial field locations. The data validate the use of the HMP as a ground-based spaceflight/planetary exploration analog for some aspects of human physiology.
Carretero, M I; Giuliano, S M; Arraztoa, C C; Santa Cruz, R C; Fumuso, F G; Neild, D M
2017-08-01
Seminal plasma (SP) of South American camelids could interfere with the interaction of spermatozoa with the extenders; it therefore becomes necessary to improve semen management using enzymatic treatment. Our objective was to compare two cooling protocols for llama semen. Twelve ejaculates were incubated in 0.1% collagenase and then divided into two aliquots. One was extended in lactose and egg yolk (LEY) (Protocol A: collagenase and SP present). The other aliquot was centrifuged, and the pellet was resuspended in LEY (Protocol B: collagenase and SP absent). Both samples were maintained at 5°C for 24 hr. Routine and DNA evaluations were carried out on raw and cooled semen. Both cooling protocols maintained sperm viability, membrane function and DNA fragmentation levels, with Protocol A showing significantly lowered total and progressive motility (p < .05) and Protocol B showing a significant increase in chromatin decondensation (p < .05). Protocol A avoids centrifugation, reducing processing times and making application in the field simpler. However, as neither protocol showed a significant superiority over the other, studies should be carried out in vivo to evaluate the effect on pregnancy rates of the presence of collagenase and SP in semen samples prior to either cooling or freeze-thawing.
Brito, Maíra M; Lúcio, Cristina F; Angrimani, Daniel S R; Losano, João Diego A; Dalmazzo, Andressa; Nichi, Marcílio; Vannucchi, Camila I
2017-01-02
Although several cryopreservation protocols exist, no systematic research has been carried out to confirm the most suitable protocol for canine sperm. This study aims to assess the effect of adding 5% glycerol during cryopreservation at 37°C (one-step) and 5°C (two-steps), in addition to testing two thawing protocols (37°C for 30 seconds, and 70°C for 8 seconds). We used 12 sperm samples divided into four experimental groups: Single-Step - Slow Thawing Group; Two-Step - Slow Thawing Group; Single-Step - Fast Thawing Group; and Two-Step - Fast Thawing Group. Frozen-thawed samples were subjected to automated analysis of sperm motility, evaluation of plasma membrane integrity, acrosomal integrity, mitochondrial activity, sperm morphology, sperm susceptibility to oxidative stress, and a sperm binding assay to the perivitelline membrane of chicken egg yolk. Considering the comparison between freezing protocols, no statistical differences were verified for any of the response variables. When the thawing protocols were compared, the slow thawing protocol presented a higher sperm count bound to the perivitelline membrane of chicken egg yolk than the fast thawing protocol. Regardless of the freezing process, the slow thawing protocol can be recommended for large-scale cryopreservation of canine semen, since it shows a consistently better functional result.
Aboal, J R; Boquete, M T; Carballeira, A; Casanova, A; Debén, S; Fernández, J A
2017-05-01
In this study we examined 6080 data points gathered by our research group during more than 20 years of research on the moss biomonitoring technique, in order to quantify the variability generated by different aspects of the protocol and to calculate the overall measurement uncertainty associated with the technique. The median variance of the concentrations of different pollutants measured in moss tissues attributed to the different methodological aspects was high, reaching values of 2851 (ng·g⁻¹)² for Cd (sample treatment), 35.1 (μg·g⁻¹)² for Cu (sample treatment), and 861.7 (ng·g⁻¹)² for Hg (material selection). These variances correspond to standard deviations that constitute 67, 126, and 59% of the regional background levels of these elements in the study region. The overall measurement uncertainty associated with the worst experimental protocol (5 subsamples, refrigerated, washed, 5 × 5 m size of the sampling area and once a year sampling) was between 2 and 6 times higher than that associated with the optimal protocol (30 subsamples, dried, unwashed, 20 × 20 m size of the sampling area and once a week sampling), and between 1.5 and 7 times higher than that associated with the standardized protocol (30 subsamples and once a year sampling). The overall measurement uncertainty associated with the standardized protocol could generate variations of between 14 and 47% in the regional background levels of Cd, Cu, Hg, Pb and Zn in the study area and much higher levels of variation in polluted sampling sites. We demonstrated that although the overall measurement uncertainty of the technique is still high, it can be reduced by using already well defined aspects of the protocol.
Further standardization of the protocol together with application of the information on the overall measurement uncertainty would improve the reliability and comparability of the results of different biomonitoring studies, thus extending use of the technique beyond the context of scientific research.
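The reported variances convert to standard deviations and implied background levels with two lines of arithmetic; a check against the abstract's own numbers (the 0.67 fraction is the 67% quoted for Cd):

```python
import math

def sd_from_variance(variance):
    """Standard deviation implied by a reported variance (same units)."""
    return math.sqrt(variance)

def implied_background(variance, sd_as_fraction_of_background):
    """Regional background level back-calculated from a variance and the
    standard deviation expressed as a fraction of that background."""
    return math.sqrt(variance) / sd_as_fraction_of_background
```

For Cd, a variance of 2851 (ng·g⁻¹)² gives a standard deviation of about 53 ng/g, implying a regional background near 80 ng/g.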
Protein blotting protocol for beginners.
Petrasovits, Lars A
2014-01-01
The transfer and immobilization of biological macromolecules onto solid nitrocellulose, nylon, or polyvinylidene difluoride (PVDF) membranes, followed by specific detection, is referred to as blotting. DNA blots are called Southerns after the inventor of the technique, Edwin Southern. By analogy, RNA blots are referred to as northerns and protein blots as westerns (Burnette, Anal Biochem 112:195-203, 1981). With few exceptions, western blotting involves five steps, namely, sample collection, preparation, separation, immobilization, and detection. In this chapter, protocols for the entire process from sample collection to detection are described.
Quantification of trace elements and speciation of iron in atmospheric particulate matter
NASA Astrophysics Data System (ADS)
Upadhyay, Nabin
Trace metal species play important roles in atmospheric redox processes and in the generation of oxidants in cloud systems. The chemical impact of these elements on atmospheric and cloud chemistry depends on their occurrence, solubility and speciation. First, analytical protocols were developed to determine trace elements in particulate matter samples collected for carbonaceous analysis. The validated novel protocols were applied to the determination of trace elements in particulate samples collected in the remote marine atmosphere and in urban areas in Arizona to study air pollution issues. The second part of this work investigates solubility and speciation in environmental samples. A detailed study on the impact of the nature and strength of buffer solutions on the solubility and speciation of iron led to a robust protocol, allowing for comparative measurements in matrices representative of cloud water conditions. Application of this protocol to samples from different environments showed low iron solubility (less than 1%) in dust-impacted events and higher solubility (5%) in anthropogenically impacted urban samples. In most cases, Fe(II) was the dominant oxidation state in the soluble fraction of iron. The analytical protocol was then applied to investigate iron processing by fogs. Field observations showed that only a small fraction (1%) of iron was scavenged by fog droplets, within which the soluble and insoluble fractions were similar. A coarse time resolution limited detailed insights into redox cycling within the fog system. Overall results suggested that the major iron species in the droplets was Fe(II) (80% of soluble iron). Finally, the occurrence and sources of emerging organic pollutants in the urban atmosphere were investigated.
Synthetic musk species are ubiquitous in the urban environment (less than 5 ng·m⁻³), and investigations at wastewater treatment plants showed that wastewater aeration basins emit a substantial amount of these species to the atmosphere.
NASA Astrophysics Data System (ADS)
Krotov, Aleksei; Pankin, Victor
2017-09-01
The assessment of central circulation (including heart function) parameters is vital in the preventive diagnostics of congenital and acquired heart failure and during polychemotherapy. The protocols currently applied in Russia do not fully utilize first-pass radionuclide angiography (FPRNA), which results in poor data formalization, even though FPRNA is one of the fastest, most affordable and most compact of the radioisotope diagnostic methods. A non-imaging algorithm based on existing protocols has been designed that uses the readings of an additional detector above the vena subclavia to determine the total blood volume (TBV), without requiring blood sampling, in contrast to current protocols. An automated processing of precordial detector readings is presented in order to determine the heart stroke volume (SV). Two techniques to estimate the ejection fraction (EF) of the heart are discussed.
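The count-based quantities mentioned above can be sketched in a few lines. This is a minimal illustration, not the authors' algorithm: it assumes that background-corrected end-diastolic (ED) and end-systolic (ES) count rates from the precordial detector act as proxies for ventricular volume, which is the standard premise of count-based EF estimation.

```python
def ejection_fraction(ed_counts: float, es_counts: float,
                      background: float = 0.0) -> float:
    """Ejection fraction from end-diastolic (ED) and end-systolic (ES)
    count rates, each corrected for a common background:

        EF = (ED - ES) / (ED - BKG)

    Counts stand in for volume, so no absolute calibration is needed.
    """
    ed = ed_counts - background
    es = es_counts - background
    if ed <= 0:
        raise ValueError("ED counts must exceed background")
    return (ed - es) / ed

# Illustrative numbers only: ED = 1200, ES = 480, background = 200
# gives EF = (1000 - 280) / 1000 = 0.72
ef = ejection_fraction(1200, 480, 200)
```

The same background-corrected difference (ED minus ES) is, up to a calibration factor, the count-based analogue of the stroke volume discussed in the abstract.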
CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS
This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...
A Systematic Evaluation of Blood Serum and Plasma Pre-Analytics for Metabolomics Cohort Studies
Jobard, Elodie; Trédan, Olivier; Postoly, Déborah; André, Fabrice; Martin, Anne-Laure; Elena-Herrmann, Bénédicte; Boyault, Sandrine
2016-01-01
The recent thriving development of biobanks and associated high-throughput phenotyping studies requires the elaboration of large-scale approaches for monitoring biological sample quality and compliance with standard protocols. We present a metabolomic investigation of human blood samples that delineates pitfalls and guidelines for the collection, storage and handling procedures for serum and plasma. A series of eight pre-processing technical parameters is systematically investigated along variable ranges commonly encountered across clinical studies. While metabolic fingerprints, as assessed by nuclear magnetic resonance, are not significantly affected by altered centrifugation parameters or delays between sample pre-processing (blood centrifugation) and storage, our metabolomic investigation highlights that both the delay and storage temperature between blood draw and centrifugation are the primary parameters impacting serum and plasma metabolic profiles. Storing the blood drawn at 4 °C is shown to be a reliable routine to confine variability associated with idle time prior to sample pre-processing. Based on their fine sensitivity to pre-analytical parameters and protocol variations, metabolic fingerprints could be exploited as valuable ways to determine compliance with standard procedures and quality assessment of blood samples within large multi-omic clinical and translational cohort studies. PMID:27929400
Cheah, Pike See; Mohidin, Norhani; Mohd Ali, Bariah; Maung, Myint; Latif, Azian Abdul
2008-01-01
This study illustrates and quantifies the differences in corneal tissue between paraffin-embedded and resin-embedded blocks and thereby identifies the better choice for investigational ophthalmology and optometry via light microscopy. Corneas of two cynomolgus monkeys (Macaca fascicularis) were used in this study. The formalin-fixed cornea was prepared in a paraffin block via the conventional tissue processing protocol (4-day protocol) and stained with haematoxylin and eosin. The glutaraldehyde-fixed cornea was prepared in a resin block via a rapid, modified tissue processing procedure (1.2-day protocol) and stained with toluidine blue. The paraffin-embedded sample exhibited various kinds of undesired tissue damage and artifacts, such as a thinner epithelium (due to substantial volumetric extraction from the tissue), a thicker stroma layer (due to separation of the lamellae and the presence of voids) and a distorted endothelium. In contrast, the resin-embedded corneal tissue demonstrated satisfactory preservation of the corneal ultrastructure. The rapid, modified tissue processing method for preparing resin-embedded blocks is particularly beneficial for accelerating microscopic evaluation in ophthalmology and optometry. PMID:22570589
Gang, Yadong; Zhou, Hongfu; Jia, Yao; Liu, Ling; Liu, Xiuli; Rao, Gong; Li, Longhui; Wang, Xiaojun; Lv, Xiaohua; Xiong, Hanqing; Yang, Zhongqin; Luo, Qingming; Gong, Hui; Zeng, Shaoqun
2017-01-01
Resin embedding has been widely applied to fix biological tissues for sectioning and imaging, but has long been regarded as incompatible with green fluorescent protein (GFP)-labeled samples because it reduces fluorescence. Recently, it has been reported that resin-embedded GFP-labeled brain tissue can be imaged at high resolution. Here we describe an optimized protocol for resin embedding and chemical reactivation of fluorescent protein-labeled mouse brain; we used mice as the experimental model, but the protocol should be applicable to other species. This method involves whole-brain embedding and chemical reactivation of the fluorescent signal in resin-embedded tissue. The whole-brain embedding process takes a total of 7 days. The duration of chemical reactivation is ~2 min for penetrating 4 μm below the surface of the resin-embedded brain. This protocol provides an efficient way to prepare fluorescent protein-labeled samples for high-resolution optical imaging. Such samples have been imaged by various optical micro-imaging methods, and fine structures labeled with GFP can be detected across a whole brain. PMID:28352214
A high-throughput microRNA expression profiling system.
Guo, Yanwen; Mastriano, Stephen; Lu, Jun
2014-01-01
As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capture from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low cost, high throughput, high detection specificity, and the flexibility to profile both small and large numbers of samples, this protocol can be adapted to a wide range of laboratory settings.
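High-throughput 96-well workflows like the one described depend on reliable sample bookkeeping. As a minimal illustration (not part of the published protocol), a plate label such as "B7" can be mapped to a row-major sample index, assuming the standard 8-row (A-H) by 12-column layout:

```python
def well_to_index(well: str) -> int:
    """Map a 96-well plate label (e.g. 'B7') to a 0-based row-major index.

    Rows A-H, columns 1-12: 'A1' -> 0, 'A12' -> 11, 'B1' -> 12, 'H12' -> 95.
    """
    row = ord(well[0].upper()) - ord('A')
    col = int(well[1:]) - 1
    if not (0 <= row < 8 and 0 <= col < 12):
        raise ValueError(f"not a 96-well position: {well!r}")
    return row * 12 + col

# 'B7' sits in the second row, seventh column: 1 * 12 + 6 = 18
idx = well_to_index("B7")
```

Such an index makes it straightforward to align per-well miRNA measurements with sample metadata stored in flat arrays.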
A Data Scheduling and Management Infrastructure for the TEAM Network
NASA Astrophysics Data System (ADS)
Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.; Unwin, R.
2009-04-01
The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "to generate real-time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network collects data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols; some sites also implement additional protocols. There are currently 7 TEAM Sites, with plans to grow the network to 15 by June 30, 2009 and to 50 by the end of 2010. Climate Protocol: the Climate Protocol entails the collection of climate data via meteorological stations located at the TEAM Sites, including precipitation, temperature, wind direction and strength, and various solar radiation measurements. Vegetation Protocol: the Vegetation Protocol collects standardized information on tropical forest trees and lianas. A TEAM Site has between six and nine 1-ha plots where trees and lianas larger than a pre-specified size are mapped, identified and measured; as a result, each TEAM Site repeatedly measures between 3,000 and 5,000 trees annually. Terrestrial Vertebrate Protocol: the Terrestrial Vertebrate Protocol collects standardized information on mid-sized tropical forest fauna (i.e. birds and mammals). This information is collected via camera traps (digital cameras with motion sensors housed in weatherproof casings). The images taken by the camera traps are reviewed to identify which species were captured; the image together with its interpretation constitutes the data for the Terrestrial Vertebrate Protocol. The amount of data collected through the TEAM protocols presents a significant yet exciting IT challenge.
The TEAM Network is currently partnering with the San Diego Supercomputer Center to build the data management infrastructure. Data collected through the three core protocols, as well as others, are currently made available through the TEAM Network portal, which provides the content management framework, the data scheduling and management framework, an administrative framework to implement and manage TEAM Sites, collaborative tools, and a number of tools and applications utilizing Google Maps and Google Earth products. A critical element of the TEAM Network data management infrastructure is to make the data publicly available in as close to real time as possible (the TEAM Network Data Use Policy: http://www.teamnetwork.org/en/data/policy). This requires two essential tasks to be accomplished: 1) a data collection schedule has to be planned, proposed and approved for a given TEAM Site. This is a challenging process, since TEAM Sites are geographically distributed across the tropics and hence have different seasons in which they schedule field sampling for the different TEAM protocols. Capturing this information and ensuring that TEAM Sites follow the outlined legal contract is key to the data collection process. 2) A streamlined and efficient information management system must ensure that data collected in the field meet the minimum data standards (i.e. are of the highest scientific quality) and are securely transferred, archived, processed and rapidly made publicly available as a finished, consumable product via the TEAM Network portal. The TEAM Network is achieving these goals by implementing an end-to-end framework consisting of the Sampling Scheduler application and the Data Management Framework. Sampling Scheduler: the Sampling Scheduler is a project management, calendar-based portal application that allows scientists at a TEAM Site to schedule field sampling for each of the TEAM protocols implemented at that site.
The Sampling Scheduler reconciles the specific requirements established in the TEAM protocols with the logistical scheduling needs of each TEAM Site. For example, each TEAM protocol defines when data must be collected (e.g. time of day, number of times per year, during which seasons, etc.) as well as where data must be collected (from which sampling units, which trees, etc.). Each TEAM Site has a limited number of resources and must create plans that both satisfy the requirements of the protocols and are logistically feasible for that site. With 15 TEAM Sites (and many more coming soon), the schedule of each TEAM Site must be communicated to the Network Office to ensure data are being collected as scheduled and to address the many problems that arise when working in difficult environments like tropical forests. The Sampling Scheduler provides built-in proposal and approval functionality to ensure that the TEAM Sites and the Network Office stay in sync, as well as the capability to modify schedules when needed. The Data Management Framework: the Data Management Framework is a three-tier data ingestion, editing and review application for the protocols defined in the TEAM Network. The data ingestion framework provides online web forms for field personnel to submit and edit data collected at TEAM Sites; these web forms are accessible from the TEAM content management site. Once the data are securely uploaded, curated, processed and approved, they are made publicly available for consumption by the scientific community. The Data Management Framework, combined with the Sampling Scheduler, provides a closed-loop data scheduling and management infrastructure: everything from the data collection plan, through tools to input, modify and curate data, review and run QA/QC tests, and verify that data are collected as planned, is included. Finally, TEAM Network data are available for download via the Data Query and Download Application.
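The core check the Sampling Scheduler performs, validating a site's proposed visit dates against a protocol's timing requirements, can be sketched as follows. This is a hypothetical simplification of the application described above; the rule names, fields, and thresholds are illustrative assumptions, not the TEAM Network's actual data model.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ProtocolRule:
    """Hypothetical sampling requirement: at least `min_events` visits,
    all falling within the allowed months (e.g. a field season)."""
    name: str
    min_events: int
    allowed_months: set

def check_schedule(rule: ProtocolRule, proposed: list) -> list:
    """Return human-readable problems; an empty list means the proposed
    dates satisfy the rule and the schedule can be approved."""
    problems = []
    out_of_season = [d for d in proposed if d.month not in rule.allowed_months]
    if out_of_season:
        problems.append(
            f"{rule.name}: {len(out_of_season)} visit(s) outside allowed months")
    if len(proposed) < rule.min_events:
        problems.append(
            f"{rule.name}: only {len(proposed)} of {rule.min_events} required visits")
    return problems

# Hypothetical vegetation rule: two visits, June through August only
veg = ProtocolRule("Vegetation", 2, {6, 7, 8})
ok = check_schedule(veg, [date(2009, 6, 15), date(2009, 8, 1)])   # no problems
bad = check_schedule(veg, [date(2009, 3, 2)])                      # two problems
```

In a real proposal/approval workflow, the Network Office would run checks of this kind before accepting a site's schedule, and again when a site requests a modification.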
This application utilizes a Google Maps custom interface to search, visualize, and download TEAM Network data. References • TEAM Network, http://www.teamnetwork.org • Center for Applied Biodiversity Science, Conservation International. http://science.conservation.org/portal/server.pt • TEAM Data Query and Download Application, http://www.teamnetwork.org/en/data/query
Hermannsdörfer, Justus; de Jonge, Niels
2017-02-05
Samples fully embedded in liquid can be studied at a nanoscale spatial resolution with Scanning Transmission Electron Microscopy (STEM) using a microfluidic chamber assembled in the specimen holder for Transmission Electron Microscopy (TEM) and STEM. The microfluidic system consists of two silicon microchips supporting thin Silicon Nitride (SiN) membrane windows. This article describes the basic steps of sample loading and data acquisition. Most important of all is to ensure that the liquid compartment is correctly assembled, thus providing a thin liquid layer and a vacuum seal. This protocol also includes a number of tests necessary to perform during sample loading in order to ensure correct assembly. Once the sample is loaded in the electron microscope, the liquid thickness needs to be measured. Incorrect assembly may result in a too-thick liquid, while a too-thin liquid may indicate the absence of liquid, such as when a bubble is formed. Finally, the protocol explains how images are taken and how dynamic processes can be studied. A sample containing AuNPs is imaged both in pure water and in saline. PMID:28190028
Identifying thermal breakdown products of thermoplastics.
Guillemot, Marianne; Oury, Benoît; Melin, Sandrine
2017-07-01
Polymers processed to produce plastic articles are subjected to temperatures between 150 °C and 450 °C or more during overheated processing and breakdowns. Heat-based processing of this nature can lead to the emission of volatile organic compounds (VOCs) into the thermoplastic processing shop. In this study, laboratory experiments and qualitative and quantitative emission measurements in thermoplastic factories were carried out. The first step was to identify the compounds released depending on the nature of the thermoplastic, the temperature and the type of process. A thermal degradation protocol that can extrapolate the laboratory results to industrial scenarios was then developed. The influence of three parameters on the released thermal breakdown products was studied: the sample preparation method (manual cutting, ambient grinding, or cold grinding), the heating rate during thermal degradation (5, 10, 20, and 50 °C/min), and the decomposition method (thermogravimetric analysis or pyrolysis). Laboratory results were compared to atmospheric measurements taken at 13 companies to validate the protocol and thereby ensure its representativeness of industrial thermal processing. This protocol was applied to the most commonly used thermoplastics to determine their thermal breakdown products and thermal behaviour. Emissions data collected by personal exposure monitoring and sampling at the process emission area show airborne concentrations of the detected compounds to be in the range of 0-3 mg/m³ under normal operating conditions. Laser cutting and purging operations generate higher pollution levels, in particular formaldehyde, which was found in some cases at concentrations above the workplace exposure limit.
Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritz, Brad G.; Abrecht, David G.; Hayes, James C.
2016-10-31
Soil gas sampling is currently conducted in support of Nuclear Test Ban Treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Issues that can impact the sampling and analysis of these samples include excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of a molecular sieve to remove CO₂ from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.
Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.
Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels
2012-10-01
We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.
Smart, Kathleen F; Aggio, Raphael B M; Van Houtte, Jeremy R; Villas-Bôas, Silas G
2010-09-01
This protocol describes an analytical platform for the analysis of intra- and extracellular metabolites of microbial cells (yeast, filamentous fungi and bacteria) using gas chromatography-mass spectrometry (GC-MS). The protocol is subdivided into sampling, sample preparation, chemical derivatization of metabolites, GC-MS analysis, and data processing and analysis. This protocol uses two robust quenching methods for microbial cultures. The first, cold glycerol-saline quenching, causes reduced leakage of intracellular metabolites, thus allowing a more reliable separation of intra- and extracellular metabolites while simultaneously stopping cell metabolism. The second, fast filtration, is specifically designed for quenching filamentous micro-organisms. These sampling techniques are combined with an easy sample-preparation procedure and a fast chemical derivatization reaction using methyl chloroformate. This reaction takes place at room temperature, in aqueous medium, and is less prone to matrix effects than other derivatizations. This protocol takes an average of 10 d to complete and enables the simultaneous analysis of hundreds of metabolites from central carbon metabolism (amino and nonamino organic acids, phosphorylated organic acids and fatty acid intermediates) using an in-house MS library and a data analysis pipeline consisting of two free software programs (the Automated Mass Deconvolution and Identification System (AMDIS) and R).
NASA Astrophysics Data System (ADS)
Roubinet, Claire; Moreira, Manuel A.
2018-02-01
Noble gases in oceanic basalts always show the presence, in variable proportions, of a component having elemental and isotopic compositions similar to those of the atmosphere and distinct from the mantle composition. Although this component could be mantle-derived (e.g. subduction of air or seawater-derived noble gases trapped in altered oceanic crust and sediments), it is most often suggested that this air component is added after sample collection, probably during storage in ambient air, although the mechanism remains unknown. In an attempt to reduce this atmospheric component observed in MORBs, four experimental protocols have been followed in this study. These protocols are based on the hypothesis that air can be removed from the samples, as it appears to be sheltered in vesicles distinct from those filled with mantle gases. All of the protocols involve a glove box filled with nitrogen, and in certain cases the samples are stored under primary vacuum (lower than 10⁻² mbar) to pump air out or, alternatively, under high pressure of N₂ to expel atmospheric noble gases. In all protocols, three components are observed: atmospheric, fractionated atmospheric and magmatic. The fractionated air component seems to derive from the non-vitreous part of the pillow lava, which cooled more slowly. This component is enriched in Ne relative to Ar, reflecting a diffusive process. This contaminant has already been observed in other studies and thus seems to be relatively common. Although it is less visible, unfractionated air has also been detected in some crushing steps, which tends to indicate that despite the experiments, air is still present in the vesicles. This result is surprising, since studies have demonstrated that atmospheric contamination could be limited if samples were stored under nitrogen quickly after their recovery from the seafloor.
Thus, the failure of the protocols could be explained by their insufficient duration or by the inaccessibility of the air-filled vesicles, as suggested by Ballentine and Barfod (2000).
Hilliard, Mark; Alley, William R; McManus, Ciara A; Yu, Ying Qing; Hallinan, Sinead; Gebler, John; Rudd, Pauline M
Glycosylation is an important attribute of biopharmaceutical products to monitor from development through production. However, glycosylation analysis has traditionally been a time-consuming process with long sample preparation protocols and manual interpretation of the data. To address the challenges associated with glycan analysis, we developed a streamlined analytical solution that covers the entire process from sample preparation to data analysis. In this communication, we describe the complete analytical solution that begins with a simplified and fast N-linked glycan sample preparation protocol that can be completed in less than 1 hr. The sample preparation includes labelling with RapiFluor-MS tag to improve both fluorescence (FLR) and mass spectral (MS) sensitivities. Following HILIC-UPLC/FLR/MS analyses, the data are processed and a library search based on glucose units has been included to expedite the task of structural assignment. We then applied this total analytical solution to characterize the glycosylation of the NIST Reference Material mAb 8761. For this glycoprotein, we confidently identified 35 N-linked glycans and all three major classes, high mannose, complex, and hybrid, were present. The majority of the glycans were neutral and fucosylated; glycans featuring N-glycolylneuraminic acid and those with two galactoses connected via an α1,3-linkage were also identified.
Microwave Processing of Crowns from Winter Cereals for Light Microscopy.
USDA-ARS?s Scientific Manuscript database
Microwave processing of tissue considerably shortens the time it takes to prepare samples for light and electron microscopy. However, plant tissues from different species and from different regions of the plant respond differently, making it impossible to use a single protocol for all plant tissue. The ...
A Synopsis of Technical Issues of Concern for Monitoring Trace Elements in Highway and Urban Runoff
Breault, Robert F.; Granato, Gregory E.
2000-01-01
Trace elements, which are regulated for aquatic life protection, are a primary concern in highway- and urban-runoff studies because stormwater runoff may transport these constituents from the land surface to receiving waters. Many of these trace elements are essential for biological activity and become detrimental only when geologic or anthropogenic sources exceed concentrations beyond ranges typical of the natural environment. The Federal Highway Administration and State Transportation Agencies are concerned about the potential effects of highway runoff on the watershed scale and for the management and protection of watersheds. Transportation agencies need information that is documented as valid, current, and scientifically defensible to support planning and management decisions. There are many technical issues of concern for monitoring trace elements; therefore, trace-element data commonly are considered suspect, and the responsibility to provide data-quality information to support the validity of reported results rests with the data-collection agency. Paved surfaces are fundamentally different physically, hydraulically, and chemically from the natural surfaces typical of most freshwater systems that have been the focus of many trace-element monitoring studies. Existing scientific conceptions of the behavior of trace elements in the environment are based largely upon research on natural systems, rather than on systems typical of pavement runoff. Additionally, the logistics of stormwater sampling are difficult because of the great uncertainty in the occurrence and magnitude of storm events. Therefore, trace-element monitoring programs may be enhanced if monitoring and sampling programs are automated. Automation would standardize the process and provide a continuous record of the variations in flow and water-quality characteristics.
Great care is required to collect and process samples in a manner that will minimize potential contamination or attenuation of trace elements and other sources of bias and variability in the sampling process. Trace elements have both natural and anthropogenic sources that may affect the sampling process, including the sample-collection and handling materials used in many trace-element monitoring studies. Trace elements also react with these materials within the timescales typical for collection, processing and analysis of runoff samples. To study the characteristics and potential effects of trace elements in highway and urban runoff, investigators typically sample one or more operationally defined matrixes including: whole water, dissolved (filtered water), suspended sediment, bottom sediment, biological tissue, and contaminant sources. The sampling and analysis of each of these sample matrixes can provide specific information about the occurrence and distribution of trace elements in runoff and receiving waters. There are, however, technical concerns specific to each matrix that must be understood and addressed through use of proper collection and processing protocols. Valid protocols are designed to minimize inherent problems and to maximize the accuracy, precision, comparability, and representativeness of data collected. Documentation, including information about monitoring protocols, quality assurance and quality control efforts, and ancillary data also is necessary to establish data quality. This documentation is especially important for evaluation of historical trace-element monitoring data, because trace-element monitoring protocols and analysis methods have been constantly changing over the past 30 years.
Modeling abundance using multinomial N-mixture models
Royle, Andy
2016-01-01
Multinomial N-mixture models are a generalization of the binomial N-mixture models described in Chapter 6 that allows for more complex and informative sampling protocols beyond simple counts. Many commonly used protocols, such as multiple-observer sampling, removal sampling, and capture-recapture, produce a multivariate count frequency that has a multinomial distribution and for which multinomial N-mixture models can be developed. Such protocols typically result in more precise estimates than binomial mixture models because they provide direct information about parameters of the observation process. We demonstrate the analysis of these models in BUGS using several distinct formulations that afford great flexibility in the types of models that can be developed, and we demonstrate likelihood analysis using the unmarked package. Spatially stratified capture-recapture models are one class of models that fall into the multinomial N-mixture framework, and we discuss analysis of stratified versions of classical models such as Mb and Mh, along with other classes of models that can only be described within the multinomial N-mixture framework.
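To make the multinomial structure concrete, consider removal sampling, one of the protocols named above. If each individual is detected with probability p on each occasion and removed once detected, the probability of first capture on occasion j is p(1-p)^(j-1), and the remaining probability mass goes to the "never captured" cell. These are the cell probabilities that enter the multinomial N-mixture likelihood. The sketch below is an illustration of that standard construction, not code from the book:

```python
def removal_cell_probs(p: float, occasions: int) -> list:
    """Multinomial cell probabilities for a removal-sampling protocol.

    Cell j (1-based) is the probability of first capture on occasion j,
    pi_j = p * (1 - p)**(j - 1); the last cell is the probability of
    never being captured, (1 - p)**occasions. The cells sum to 1.
    """
    pis = [p * (1 - p) ** (j - 1) for j in range(1, occasions + 1)]
    pis.append((1 - p) ** occasions)  # never-captured cell
    return pis

# Three occasions with p = 0.5: [0.5, 0.25, 0.125] captured,
# plus 0.125 never captured
probs = removal_cell_probs(0.5, 3)
```

Multiplying each capture-cell probability by the latent abundance N gives the expected removal counts, which is exactly the extra information about the observation process that makes these protocols more precise than simple counts.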
Comparison of Two Methods of RNA Extraction from Formalin-Fixed Paraffin-Embedded Tissue Specimens
Gouveia, Gisele Rodrigues; Ferreira, Suzete Cleusa; Ferreira, Jerenice Esdras; Siqueira, Sheila Aparecida Coelho; Pereira, Juliana
2014-01-01
The present study aimed to compare two different methods of extracting RNA from formalin-fixed paraffin-embedded (FFPE) specimens of diffuse large B-cell lymphoma (DLBCL). We further aimed to identify possible influences of variables—such as tissue size, duration of paraffin block storage, fixative type, primers used for cDNA synthesis, and endogenous genes tested—on the success of amplification from the samples. Both tested protocols used the same commercial kit for RNA extraction (the RecoverAll Total Nucleic Acid Isolation Optimized for FFPE Samples from Ambion). However, the second protocol included an additional step of washing with saline buffer just after sample rehydration. Following each protocol, we compared the RNA amount and purity and the amplification success as evaluated by standard PCR and real-time PCR. The results revealed that the extra washing step added to the RNA extraction process resulted in significantly improved RNA quantity and quality and improved success of amplification from paraffin-embedded specimens. PMID:25105117
Muiños-Bühl, Anixa; González-Recio, Oscar; Muñoz, María; Óvilo, Cristina; García-Casco, Juan; Fernández, Ana I
2018-06-01
There is growing interest in understanding the role of the gut microbiome in productive and meat quality-related traits in livestock species, in order to develop new tools for improving pig production systems and industry. Faecal samples are analysed as a proxy for the gut microbiota, and the selection of suitable protocols for faecal sampling and DNA isolation is a critical first step in obtaining reliable results, all the more so when comparing results across studies. The aim of the current study was to establish, in a cost-effective way using the automated ribosomal intergenic spacer analysis technique, a protocol for porcine faecal sampling and storage at the farm and slaughterhouse, and to determine the most efficient microbiota DNA isolation kit among those most widely used. Operational Taxonomic Unit (OTU) profiles were compared from Iberian pig faecal samples collected from the rectum or from the ground; stored in liquid N₂, at room temperature, or in RNAlater; and processed with the QIAamp DNA Stool (Qiagen), PowerFecal DNA Isolation (Mobio) or SpeedTools Tissue DNA extraction (Biotools) commercial kits. The results, based on DNA yield and quality, OTU number and Sørensen similarity indexes, and focused on prokaryote sampling, indicate that the recommended protocol for porcine faecal microbiome sampling at the farm should include: collection from the porcine rectum to avoid contamination; storage in liquid N₂ or even at room temperature, but not in RNAlater; and isolation of microbiota DNA using the PowerFecal DNA Isolation kit. These conditions provide more reliable DNA samples for further microbiome analysis.
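The Sørensen similarity index used to compare OTU profiles above has a simple closed form: for two presence/absence profiles A and B it is 2|A ∩ B| / (|A| + |B|). A minimal sketch (the OTU names are made up for illustration):

```python
def sorensen_index(otus_a: set, otus_b: set) -> float:
    """Sørensen similarity between two OTU presence/absence profiles:

        S = 2 * |A intersect B| / (|A| + |B|)

    Ranges from 0 (no shared OTUs) to 1 (identical profiles).
    """
    if not otus_a and not otus_b:
        return 1.0  # two empty profiles are trivially identical
    return 2 * len(otus_a & otus_b) / (len(otus_a) + len(otus_b))

# Two hypothetical profiles sharing 2 of their 3 OTUs each:
# S = 2 * 2 / (3 + 3) = 2/3
s = sorensen_index({"OTU1", "OTU2", "OTU3"}, {"OTU2", "OTU3", "OTU4"})
```

Comparing such indexes across storage and extraction conditions, as done in the study, shows how much each protocol choice perturbs the recovered community profile.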
The influence of solvent processing on polyester bioabsorbable polymers.
Manson, Joanne; Dixon, Dorian
2012-01-01
Solvent-based methods are commonly employed for the production of polyester-based samples and coatings in both medical device production and research. The influence of solvent casting and subsequent drying time was studied using thermal analysis, spectroscopy and weight measurement for four grades of 50:50 poly(lactic-co-glycolic acid) (PLGA) produced using chloroform, dichloromethane, and acetone. The results demonstrate that solvent choice and PLGA molecular weight are critical factors in terms of solvent removal rate and maintaining sample integrity, respectively. The protocols widely employed result in high levels of residual solvent; a new protocol is presented, together with solutions to commonly encountered problems.
A rapid and efficient SDS-based RNA isolation protocol from different tissues of coffee.
Huded, Arun Kumar C; Jingade, Pavankumar; Mishra, Manoj Kumar
2018-03-01
Isolation of high-quality RNA from coffee is challenging because of high levels of polysaccharides, polyphenols and other secondary metabolites. In the present study, a rapid and efficient RNA extraction protocol for different tissues of coffee was optimized. RNA of sufficiently high quality and quantity (225.6-454.8 µg/g) was obtained using the optimized protocol. The presence of two distinct bands of 28S rRNA and 18S rRNA in agarose gel confirmed the intactness of the RNA samples. The average spectrophotometric values of the isolated RNA ranged from 1.96 to 2.02 (A260/280) and 1.95 to 2.14 (A260/230), indicating high-quality RNA devoid of polyphenol, polysaccharide and protein contamination. In the optimized protocol, addition of PVPP to the extraction buffer, a brief incubation of samples at 65 °C and subsequent purification with potassium acetate resulted in good-quality RNA isolation. The suitability of the RNA for downstream processing was confirmed by PCR amplification with cytochrome c oxidase gene-specific primers. The amplification of a single 392 bp fragment from cDNA and a 1.5 kb fragment from genomic DNA confirmed the absence of DNA contamination. The present protocol is rapid and yielded RNA of good quality and quantity suitable for functional genomics studies.
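The A260/280 and A260/230 purity ratios reported above are simple quotients of absorbance readings. The sketch below shows the arithmetic; the ~1.8-2.2 acceptance window is a common rule of thumb assumed here, not a threshold taken from this paper:

```python
# Spectrophotometric purity check for a nucleic-acid extract.
# A260/280 flags protein contamination; A260/230 flags polyphenols,
# polysaccharides and salts. Readings and window are hypothetical.

def purity_ratios(a230, a260, a280):
    """Return (A260/280, A260/230) from raw absorbance readings."""
    return a260 / a280, a260 / a230

def looks_clean(a230, a260, a280, lo=1.8, hi=2.2):
    """True if both ratios fall inside the assumed acceptance window."""
    r280, r230 = purity_ratios(a230, a260, a280)
    return lo <= r280 <= hi and lo <= r230 <= hi

# Absorbance readings for a hypothetical coffee-tissue RNA extract.
print(looks_clean(a230=0.50, a260=1.00, a280=0.51))  # both ratios ~2.0
```

The reported ranges (1.96-2.02 and 1.95-2.14) would pass such a check, which is what "devoid of contamination" summarizes.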
Abras, Alba; Ballart, Cristina; Llovet, Teresa; Roig, Carme; Gutiérrez, Cristina; Tebar, Silvia; Berenguer, Pere; Pinazo, María-Jesús; Posada, Elizabeth; Gascón, Joaquim; Schijman, Alejandro G; Gállego, Montserrat; Muñoz, Carmen
2018-01-01
Polymerase chain reaction (PCR) has become a useful tool for the diagnosis of Trypanosoma cruzi infection. The development of automated DNA extraction methodologies and PCR systems is an important step toward the standardization of protocols in routine diagnosis. To date, there are only two commercially available Real-Time PCR assays for the routine laboratory detection of T. cruzi DNA in clinical samples: TCRUZIDNA.CE (Diagnostic Bioprobes Srl) and RealCycler CHAG (Progenie Molecular). Our aim was to evaluate the RealCycler CHAG assay taking into account the whole process. We assessed the usefulness of an automated DNA extraction system based on magnetic particles (EZ1 Virus Mini Kit v2.0, Qiagen) combined with a commercially available Real-Time PCR assay targeting satellite DNA (SatDNA) of T. cruzi (RealCycler CHAG), a methodology used for routine diagnosis in our hospital. It was compared with a well-known strategy combining a commercial DNA isolation kit based on silica columns (High Pure PCR Template Preparation Kit, Roche Diagnostics) with an in-house Real-Time PCR targeting SatDNA. The results of the two methodologies were in almost perfect agreement, indicating they can be used interchangeably. However, when variations in protocol factors were applied (sample treatment, extraction method and Real-Time PCR), the results were less convincing. A comprehensive fine-tuning of the whole procedure is the key to successful results. Guanidine EDTA-blood (GEB) samples are not suitable for DNA extraction based on magnetic particles due to inhibition, at least when samples are not processed immediately. This is the first study to evaluate the RealCycler CHAG assay taking into account the overall process, including three variables (sample treatment, extraction method and Real-Time PCR). 
Our findings may contribute to the harmonization of protocols between laboratories and to a wider application of Real-Time PCR in molecular diagnostic laboratories associated with health centers.
NASA Astrophysics Data System (ADS)
Preciado, C., Jr.
2016-12-01
Compound-specific stable isotope analysis (CSIA) has become a powerful tool for reconstructing consumer-resource relationships in modern and ancient systems. Stable nitrogen isotope analysis (δ15N) of "trophic" and "source" amino acids provides independent proxies of trophic position and the δ15N value at the base of the food web, respectively. When applied to avian egg tissues (e.g., shell protein, membrane), which can be preserved in the archaeological record, this approach can be used to address complex questions in food web architecture and biogeochemical cycling. In this study, we examined how sample-processing protocols affected the chromatography and reliability of δ15N values of individual amino acids in avian (chicken and penguin) egg components (shell protein, membrane, yolk, and albumen) via gas chromatography-combustion-isotope ratio mass spectrometry. "Unprocessed" egg tissues underwent standard acid hydrolysis protocols prior to derivatization, and resulted in poor chromatography with highly variable δ15N values across replicates. "Processed" samples were eluted through cation exchange columns (Dowex 50WX*400) prior to derivatization, followed by a P-buffer (KH2PO4 + Na2HPO4)-chloroform centrifugation extraction to remove confounding peaks and matrix impurities. These additional procedures greatly improved the chromatography of the "processed" samples, revealing better peak separation and baseline integration as well as lower δ15N variability. Additionally, a standard lipid extraction was necessary for yolk but not membrane, albumen, and shell. While the additional procedures applied to the "columned" samples did result in a significant reduction in sample yield (~20%), it was non-fractionating and thus only affected the total sample sizes necessary for δ15N CSIA (shell protein ~50 mg; membrane, yolk, and albumen ~0.5 mg). The protocols developed here will streamline CSIA of egg tissues for future work in avian ecology.
Gutiérrez-Cepeda, L; Fernández, A; Crespo, F; Gosálvez, J; Serres, C
2011-03-01
For many years, human assisted-reproduction procedures have used special protocols to prepare and improve sperm quality. Colloidal centrifugation (CC) is a useful technique that has been proved to enhance semen quality by selecting the best spermatozoa in different species. Its use is recommended to improve fertility of subfertile stallions, but current CC protocols are clinically impractical for equine sperm processing due to economic and technical difficulties. The aim of this study was to determine the optimal processing procedures to adapt the use of a CC product (EquiPure™) to the equine reproduction industry. A total of nineteen ejaculates were collected from 10 Purebred Spanish Horses (P.R.E. horses) using a Missouri artificial vagina. Gel-free semen aliquots were analyzed prior to treatment (control). Semen was subjected to one of six CC protocols with EquiPure™, and centrifuged samples were statistically evaluated by ANOVA and Duncan tests (p<0.05) for sperm quality and recovery rate. We obtained higher values after colloidal centrifugation for the LIN, STR and BCF variables, and the DNA fragmentation index tended to be lower in most of the CC protocols. The studied protocols proved as efficient in improving equine sperm quality as the current commercial EquiPure™ protocol, with the added advantage of being much more economical and simple to use. According to these results, it appears possible to incorporate single-layer and/or high-volume colloidal centrifugation protocols, which would make them simple, economical and clinically viable for equine sperm processing. Copyright © 2011 Elsevier B.V. All rights reserved.
Ozarda, Yesim; Ichihara, Kiyoshi; Barth, Julian H; Klee, George
2013-05-01
The reference intervals (RIs) given in laboratory reports have an important role in aiding clinicians in interpreting test results in reference to values of healthy populations. In this report, we present a proposed protocol and standard operating procedures (SOPs) for common use in conducting multicenter RI studies on a national or international scale. The protocols and consensus on their contents were refined through discussions in recent C-RIDL meetings. The protocol describes in detail (1) the scheme and organization of the study, (2) the target population, inclusion/exclusion criteria, ethnicity, and sample size, (3) health status questionnaire, (4) target analytes, (5) blood collection, (6) sample processing and storage, (7) assays, (8) cross-check testing, (9) ethics, (10) data analyses, and (11) reporting of results. In addition, the protocol proposes the common measurement of a panel of sera when no standard materials exist for harmonization of test results. It also describes the requirements of the central laboratory, including the method of cross-check testing between the central laboratory of each country and local laboratories. This protocol and the SOPs remain largely exploratory and may require a reevaluation from the practical point of view after their implementation in the ongoing worldwide study. The paper is mainly intended to be a basis for discussion in the scientific community.
Vipie: web pipeline for parallel characterization of viral populations from multiple NGS samples.
Lin, Jake; Kramna, Lenka; Autio, Reija; Hyöty, Heikki; Nykter, Matti; Cinek, Ondrej
2017-05-15
Next generation sequencing (NGS) technology allows laboratories to investigate virome composition in clinical and environmental samples in a culture-independent way. There is a need for bioinformatic tools capable of parallel processing of virome sequencing data by exactly identical methods: this is especially important in studies of multifactorial diseases, or in parallel comparison of laboratory protocols. We have developed a web-based application allowing direct upload of sequences from multiple virome samples using custom parameters. The samples are then processed in parallel using an identical protocol, and can be easily reanalyzed. The pipeline performs de-novo assembly, taxonomic classification of viruses as well as sample analyses based on user-defined grouping categories. Tables of virus abundance are produced from cross-validation by remapping the sequencing reads to a union of all observed reference viruses. In addition, read sets and reports are created after processing unmapped reads against known human and bacterial ribosome references. Secured interactive results are dynamically plotted with population and diversity charts, clustered heatmaps and a sortable and searchable abundance table. The Vipie web application is a unique tool for multi-sample metagenomic analysis of viral data, producing searchable hits tables, interactive population maps, alpha diversity measures and clustered heatmaps that are grouped in applicable custom sample categories. Known references such as human genome and bacterial ribosomal genes are optionally removed from unmapped ('dark matter') reads. Secured results are accessible and shareable on modern browsers. Vipie is a freely available web-based tool whose code is open source.
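The cross-validation step described for Vipie (remapping reads against a union of observed reference viruses to build per-sample abundance tables) can be illustrated with a toy counter. Matching here is naive substring lookup standing in for a real aligner, and all sample names and sequences are invented:

```python
from collections import Counter

def abundance_table(samples, references):
    """Per-sample counts of reads matching each reference (naive matching)."""
    table = {}
    for name, reads in samples.items():
        counts = Counter()
        for read in reads:
            for ref_name, ref_seq in references.items():
                if read in ref_seq:  # stand-in for a real read aligner
                    counts[ref_name] += 1
                    break
        table[name] = dict(counts)
    return table

# Invented reference viruses and sequencing reads.
references = {"virusA": "ACGTACGTTT", "virusB": "GGGTTTCCCA"}
samples = {
    "s1": ["ACGTA", "CGTTT", "GGGTT"],
    "s2": ["TTCCC", "AAAAA"],  # 'AAAAA' maps nowhere ('dark matter')
}
print(abundance_table(samples, references))
# {'s1': {'virusA': 2, 'virusB': 1}, 's2': {'virusB': 1}}
```

Reads that map to no reference correspond to the unmapped "dark matter" fraction the pipeline reports separately.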
Chung, K Y; Carter, G J; Stancliffe, J D
1999-02-01
A new European/International Standard (prEN ISO 10882-1) on the sampling of airborne particulates generated during welding and allied processes has been proposed. The use of a number of samplers and sampling procedures is allowable within the defined protocol. The influence of these variables on welding fume exposures measured during welding and grinding of stainless and mild steel using the gas metal arc (GMA) and flux-cored arc (FCA) processes, and during GMA welding of aluminium, has been examined. Results show that use of any of the samplers will not give significantly different measured exposures. The effect on exposure measurement of placing the samplers on either side of the head was variable; consequently, sampling position cannot be meaningfully defined. All samplers collected significant amounts of grinding dust; therefore, gravimetric determination of welding fume exposure in atmospheres containing grinding dust will be inaccurate. A new size-selective sampler can, to some extent, be used to give a more accurate estimate of exposure. The reliability of fume analysis data for welding consumables has caused concern, and the reason for differences between the material safety data sheets and the analysis of collected fume samples requires further investigation.
The Earth Microbiome Project and modeling the planet's microbial potential (Invited)
NASA Astrophysics Data System (ADS)
Gilbert, J. A.
2013-12-01
The understanding of Earth's climate and ecology requires multiscale observations of the biosphere, of which microbial life is a major component. However, acquiring and processing physical samples of soil, water and air at the spatial and temporal resolution needed to capture the immense variation in microbial dynamics would require a herculean effort and immense financial resources, dwarfing even the most ambitious projects to date. To overcome this hurdle we created the Earth Microbiome Project (EMP), a crowd-sourced effort to acquire physical samples from researchers around the world that are, importantly, contextualized with physical, chemical and biological data detailing the environmental properties of each sample at the location and time it was acquired. The EMP leverages these existing efforts to target a systematic analysis of microbial taxonomic and functional dynamics across a vast array of environmental parameter gradients. The EMP captures the environmental gradients, location, time and sampling protocol information for every sample donated by our valued collaborators. Physical samples are then processed using a standardized DNA extraction, PCR, and shotgun sequencing protocol to generate comparable data on the microbial community structure and function in each sample. To date we have processed >17,000 samples from 40 different biomes. One of the key goals of the EMP is to map the spatiotemporal variability of microbial communities, capturing changes in important functional processes that need to be appropriately expressed in models to provide reliable forecasts of ecosystem phenotype across our changing planet. This is essential if we are to develop economically sound strategies to be good stewards of our Earth.
The EMP recognizes that environments are comprised of complex sets of interdependent parameters and that the development of useful predictive computational models of both terrestrial and atmospheric systems requires recognition and accommodation of sources of uncertainty.
Validating the Inactivation Effectiveness of Chemicals on Ebola Virus.
Haddock, Elaine; Feldmann, Friederike
2017-01-01
While viruses such as Ebola virus must be handled in high-containment laboratories, there remains the need to process virus-infected samples for downstream research testing. This processing often includes removal to lower containment areas and therefore requires assurance of complete viral inactivation within the sample before removal from high-containment. Here we describe methods for the removal of chemical reagents used in inactivation procedures, allowing for validation of the effectiveness of various inactivation protocols.
Growth and Visual Information Processing in Infants in Southern Ethiopia
ERIC Educational Resources Information Center
Kennedy, Tay; Thomas, David G.; Woltamo, Tesfaye; Abebe, Yewelsew; Hubbs-Tait, Laura; Sykova, Vladimira; Stoecker, Barbara J.; Hambidge, K. Michael
2008-01-01
Speed of information processing and recognition memory can be assessed in infants using a visual information processing (VIP) paradigm. In a sample of 100 infants 6-8 months of age from Southern Ethiopia, we assessed relations between growth and VIP. The 69 infants who completed the VIP protocol had a mean weight z score of -1.12 plus or minus…
Matrone, M.; Keid, L.B.; Rocha, V.C.M.; Vejarano, M.P.; Ikuta, C.Y.; Rodriguez, C.A.R.; Ferreira, F.; Dias, R.A.; Ferreira Neto, J.S
2009-01-01
The objective of the present study was to improve the detection of B. abortus by PCR in organs of aborted fetuses from infected cows, an important mechanism for finding infected herds in the eradication phase of the program. Thus, different DNA extraction protocols were compared, focusing on PCR detection of B. abortus in clinical samples collected from aborted fetuses or calves born from cows challenged with the B. abortus 2308 strain. Two gold-standard groups were built based on classical bacteriology, comprising 32 lungs (17 positives), 26 spleens (11 positives), 23 livers (8 positives) and 22 bronchial lymph nodes (7 positives). All samples were submitted to three DNA extraction protocols, followed by the same amplification process with primers B4 and B5. From the accumulated results per organ, the proportion of positives for the lungs was higher than for the livers (p=0.04) or bronchial lymph nodes (p=0.004) and equal to the spleens (p=0.18). From the accumulated results per DNA extraction protocol, the proportion of positives for the Boom protocol was higher than for the PK (p<0.0001) and GT (p=0.0004) protocols. There was no difference between the PK and GT protocols (p=0.5). Some samples positive by classical bacteriology were negative by PCR, and vice versa. Therefore, the best strategy for B. abortus detection in the organs of aborted fetuses or calves born from infected cows is the parallel use of isolation by classical bacteriology and PCR, with DNA extraction performed by the Boom protocol. PMID:24031391
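Comparisons of positive proportions between protocols, like those reported above, are commonly made with a two-proportion z-test. The sketch below uses invented counts and is not the specific test or data from the paper:

```python
import math

def two_proportion_z(pos_a, n_a, pos_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = pos_a / n_a, pos_b / n_b
    pooled = (pos_a + pos_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical counts: protocol A detects 40/43 positives, protocol B 20/43.
z, p = two_proportion_z(pos_a=40, n_a=43, pos_b=20, n_b=43)
print(z > 0 and p < 0.001)  # a large gap in positives is highly significant
```

With such small p-values the difference between extraction protocols would be declared significant, matching the style of comparison in the abstract.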
Zhao, Xiaoyan; Qureshi, Ferhan; Eastman, P Scott; Manning, William C; Alexander, Claire; Robinson, William H; Hesterberg, Lyndal K
2012-04-30
Variability in pre-analytical blood sampling and handling can significantly impact results obtained in quantitative immunoassays. Understanding the impact of these variables is critical for accurate quantification and validation of biomarker measurements. In particular, in the design and execution of large clinical trials, even small differences in sample processing and handling can have dramatic effects on analytical reliability, results interpretation, trial management and outcome. The effects of two common blood sampling methods (serum vs. plasma) and two widely-used serum handling methods (on the clot with ambient temperature shipping, "traditional", vs. centrifuged with cold chain shipping, "protocol") on protein and autoantibody concentrations were examined. Matched serum and plasma samples were collected from 32 rheumatoid arthritis (RA) patients representing a wide range of disease activity status. Additionally, a set of matched serum samples with two sample handling methods was collected. One tube was processed per manufacturer's instructions and shipped overnight on cold packs (protocol). The matched tube, without prior centrifugation, was simultaneously shipped overnight at ambient temperatures (traditional). Upon delivery, the traditional tube was centrifuged. All samples were subsequently aliquoted and frozen prior to analysis of protein and autoantibody biomarkers. Median correlation between paired serum and plasma across all autoantibody assays was 0.99 (0.98-1.00) with a median % difference of -3.3 (-7.5 to 6.0). In contrast, observed protein biomarker concentrations were significantly affected by sample type, with median correlation of 0.99 (0.33-1.00) and a median % difference of -10 (-55 to 23). When the two serum collection/handling methods were compared, the median correlation between paired samples for autoantibodies was 0.99 (0.91-1.00) with a median difference of 4%.
In contrast, significant increases were observed in protein biomarker concentrations among certain biomarkers in samples processed with the 'traditional' method. Autoantibody quantification appears robust to both sample type (plasma vs. serum) and pre-analytical sample collection/handling methods (protocol vs. traditional). In contrast, for non-antibody protein biomarker concentrations, sample type had a significant impact; plasma samples generally exhibit decreased protein biomarker concentrations relative to serum. Similarly, sample handling significantly impacted the variability of protein biomarker concentrations. When biomarker concentrations are combined algorithmically into a single test score such as a multi-biomarker disease activity test for rheumatoid arthritis (MBDA), changes in protein biomarker concentrations may result in a bias of the score. These results illustrate the importance of characterizing pre-analytical methodology, sample type, sample processing and handling procedures for clinical testing in order to ensure test accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
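The paired-sample summaries used in this study, correlation of matched serum/plasma values and the median percent difference of plasma relative to serum, can be computed as follows. The biomarker values are invented for illustration:

```python
from statistics import median

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def median_pct_diff(serum, plasma):
    """Median percent difference of plasma relative to matched serum."""
    return median(100.0 * (p - s) / s for s, p in zip(serum, plasma))

serum = [10.0, 20.0, 15.0, 40.0, 25.0]   # hypothetical biomarker values
plasma = [9.0, 18.5, 13.0, 36.0, 22.0]   # matched plasma, mostly lower

print(round(pearson(serum, plasma), 2))          # high paired correlation
print(round(median_pct_diff(serum, plasma), 1))  # negative: plasma < serum
```

A high correlation alongside a consistently negative percent difference is exactly the pattern the abstract describes for non-antibody protein biomarkers.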
RNA extraction from decaying wood for (meta)transcriptomic analyses.
Adamo, Martino; Voyron, Samuele; Girlanda, Mariangela; Marmeisse, Roland
2017-10-01
Wood decomposition is a key step of the terrestrial carbon cycle and is of economic importance. It is essentially a microbiological process performed by fungi and to an unknown extent by bacteria. To gain access to the genes expressed by the diverse microbial communities participating in wood decay, we developed an RNA extraction protocol from this recalcitrant material rich in polysaccharides and phenolic compounds. This protocol was implemented on 22 wood samples representing as many tree species from 11 plant families in the Angiosperms and Gymnosperms. RNA was successfully extracted from all samples and converted into cDNAs from which were amplified both fungal and bacterial protein coding genes, including genes encoding hydrolytic enzymes participating in lignocellulose hydrolysis. This protocol applicable to a wide range of decomposing wood types represents a first step towards a metatranscriptomic analysis of wood degradation under natural conditions.
To Master or Perform? Exploring Relations between Achievement Goals and Conceptual Change Learning
ERIC Educational Resources Information Center
Ranellucci, John; Muis, Krista R.; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M.
2013-01-01
Background: Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. Aims: To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Sample and Method:…
2015-10-01
individuals and collaborators affiliated with the research team. These same individuals are in the process of learning the colony management skills … as well as the experimental skills needed for the proposed prostate cancer models. We have also become familiar with the sample processing protocols
2013-06-01
lenses of unconsolidated sand and rounded river gravel overlain by as much as 5 m of silt. Gravel consists mostly of quartz and metamorphic rock. [Fragmentary record; the source report's figure list referenced multi-increment sampling using a systematic-random design, the small-arms firing Range 16 Record berms at Fort Wainwright, and berms sampled using ISM and grab methods.]
Mena, Marisa; Lloveras, Belen; Tous, Sara; Bogers, Johannes; Maffini, Fausto; Gangane, Nitin; Kumar, Rekha Vijay; Somanathan, Thara; Lucas, Eric; Anantharaman, Devasena; Gheit, Tarik; Castellsagué, Xavier; Pawlita, Michael; de Sanjosé, Silvia; Alemany, Laia; Tommasino, Massimo
2017-01-01
Worldwide use of formalin-fixed paraffin-embedded (FFPE) blocks is extensive in diagnosis and research. Yet, there is a lack of optimized/standardized protocols to process the blocks and verify the quality and presence of the targeted tissue. In the context of an international study on head and neck cancer (HNC), HPV-AHEAD, a standardized protocol for optimizing the use of FFPE blocks in molecular epidemiology was developed and validated. First, a protocol for sectioning the FFPE blocks was developed to prevent cross-contamination and distributed among participating centers. Before processing blocks, all sectioning centers underwent a quality control to guarantee a satisfactory training process. The first and last sections of the FFPE blocks were used for histopathological assessment. A consensus histopathology evaluation form was developed by an international panel of pathologists and evaluated for four indicators in a pilot analysis in order to validate it: 1) presence/type of tumor tissue, 2) identification of other tissue components that could affect the molecular diagnosis and 3) quality of the tissue. No HPV DNA was found in sections from empty FFPE blocks generated in any of the histology laboratories of the HPV-AHEAD consortium, and all centers passed quality assurance for processing after quality control. The pilot analysis to validate the histopathology form included 355 HNC cases. The form was filled in by six pathologists, and each case was randomly assigned to two of them. Most samples (86%) were considered satisfactory. Presence of >50% invasive carcinoma was observed in all sections of 66% of cases. Substantial necrosis (>50%) was present in <2% of samples. The concordance for the indicators targeted to validate the histopathology form was very high (kappa > 0.85) between first and last sections and fair to high between pathologists (kappa/PABAK 0.21-0.72). The protocol allowed all FFPE blocks in the study to be processed correctly, without signs of contamination.
The histopathology evaluation of the cases assured the presence of the targeted tissue, identified the presence of other tissues that could disturb the molecular diagnosis and allowed the assessment of tissue quality.
Santos, E M; Paula, J F R; Motta, P M C; Heinemann, M B; Leite, R C; Haddad, J P A; Del Puerto, H L; Reis, J K P
2010-08-17
We compared three different protocols for DNA extraction from horse peripheral blood mononuclear cells (PBMC) and lung fragments, determining average final DNA concentration, purity, percentage of PCR amplification using beta-actin, and cost. Thirty-four PBMC samples and 33 lung-fragment samples were submitted to DNA extraction by three different protocols. Protocol A consisted of phenol-chloroform-isoamyl alcohol extraction, Protocol B used alkaline extraction with NaOH, and Protocol C used the DNAzol® reagent kit. Protocol A was the best option for DNA extraction from lung fragments, producing high DNA concentrations with high sensitivity in PCR amplification (100%), followed by Protocols C and B. On the other hand, for PBMC samples, Protocol B gave the highest sensitivity in PCR amplification (100%), followed by Protocols C and A. We conclude that Protocol A should be used for PCR diagnosis from lung fragment samples, while Protocol B should be used for PBMC.
Robotic Enrichment Processing of Roche 454 Titanium Emulsion PCR at the DOE Joint Genome Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamilton, Matthew; Wilson, Steven; Bauer, Diane
2010-05-28
Enrichment of emulsion PCR product is the most laborious and pipette-intensive step in the 454 Titanium process, posing the biggest obstacle for production-oriented scale-up. The Joint Genome Institute has developed a pair of custom-made robots, based on the Hamilton Microlab STAR liquid handling deck, to mediate the complexity and ergonomic demands of the 454 enrichment process. Each robot includes a custom-built centrifuge, magnetic deck positions, and heating and cooling elements. At present processing eight emulsion cup samples in a single 2.5-hour run, these robots are capable of processing up to 24 emulsion cup samples. Sample emulsions are broken using the standard 454 breaking process, transferred from a pair of 50 ml conical tubes to a single 2 ml tube, and loaded on the robot. The robot performs the enrichment protocol and produces beads in 2 ml tubes ready for counting. The robot follows the Roche 454 enrichment protocol, with slight exceptions: beads are resuspended by pipette mixing rather than vortexing, and a set number of null bead removal washes is used. The robotic process is broken down into similar discrete steps: First Melt and Neutralization, Enrichment Primer Annealing, Enrichment Bead Incubation, Null Bead Removal, Second Melt and Neutralization, and Sequencing Primer Annealing. Data indicating our improvements in enrichment efficiency and total number of bases per run will also be shown.
Optimized Setup and Protocol for Magnetic Domain Imaging with In Situ Hysteresis Measurement
Liu, Jun; Wilson, John; Davis, Claire; Peyton, Anthony
2017-01-01
This paper elaborates the sample preparation protocols required to obtain optimal domain patterns using the Bitter method, focusing on the extra steps required beyond standard metallographic sample preparation procedures. The paper proposes a novel bespoke rig for dynamic domain imaging with in situ BH (magnetic hysteresis) measurements and elaborates the protocols for sensor preparation and use of the rig to ensure accurate BH measurement. The protocols for static and ordinary dynamic domain imaging (without in situ BH measurements) are also presented. The reported method takes advantage of the convenience and high sensitivity of the traditional Bitter method and enables in situ BH measurement without interrupting or interfering with domain wall movement. This facilitates establishing a direct and quantitative link between the interactions of domain wall movement with microstructural features in ferritic steels and their BH loops. This method is anticipated to become a useful tool for the fundamental study of microstructure-magnetic property relationships in steels and to help interpret electromagnetic sensor signals for non-destructive evaluation of steel microstructures. PMID:29155796
Kranzfelder, Petra; Anderson, Alyssa M.; Egan, Alexander T.; Mazack, Jane E.; Bouchard, R. William; Rufer, Moriya M.; Ferrington, Leonard C.
2015-01-01
Rapid bioassessment protocols using benthic macroinvertebrate assemblages have been successfully used to assess human impacts on water quality. Unfortunately, traditional benthic larval sampling methods, such as the dip-net, can be time-consuming and expensive. An alternative protocol involves collection of Chironomidae surface-floating pupal exuviae (SFPE). Chironomidae is a species-rich family of flies (Diptera) whose immature stages typically occur in aquatic habitats. Adult chironomids emerge from the water, leaving their pupal skins, or exuviae, floating on the water’s surface. Exuviae often accumulate along banks or behind obstructions by action of the wind or water current, where they can be collected to assess chironomid diversity and richness. Chironomids can be used as important biological indicators, since some species are more tolerant to pollution than others. Therefore, the relative abundance and species composition of collected SFPE reflect changes in water quality. Here, methods associated with field collection, laboratory processing, slide mounting, and identification of chironomid SFPE are described in detail. Advantages of the SFPE method include minimal disturbance at a sampling area, efficient and economical sample collection and laboratory processing, ease of identification, applicability in nearly all aquatic environments, and a potentially more sensitive measure of ecosystem stress. Limitations include the inability to determine larval microhabitat use and inability to identify pupal exuviae to species if they have not been associated with adult males. PMID:26274889
Kranzfelder, Petra; Anderson, Alyssa M; Egan, Alexander T; Mazack, Jane E; Bouchard, R William; Rufer, Moriya M; Ferrington, Leonard C
2015-07-24
Rapid bioassessment protocols using benthic macroinvertebrate assemblages have been successfully used to assess human impacts on water quality. Unfortunately, traditional benthic larval sampling methods, such as the dip-net, can be time-consuming and expensive. An alternative protocol involves collection of Chironomidae surface-floating pupal exuviae (SFPE). Chironomidae is a species-rich family of flies (Diptera) whose immature stages typically occur in aquatic habitats. Adult chironomids emerge from the water, leaving their pupal skins, or exuviae, floating on the water's surface. Exuviae often accumulate along banks or behind obstructions by action of the wind or water current, where they can be collected to assess chironomid diversity and richness. Chironomids can be used as important biological indicators, since some species are more tolerant to pollution than others. Therefore, the relative abundance and species composition of collected SFPE reflect changes in water quality. Here, methods associated with field collection, laboratory processing, slide mounting, and identification of chironomid SFPE are described in detail. Advantages of the SFPE method include minimal disturbance at a sampling area, efficient and economical sample collection and laboratory processing, ease of identification, applicability in nearly all aquatic environments, and a potentially more sensitive measure of ecosystem stress. Limitations include the inability to determine larval microhabitat use and inability to identify pupal exuviae to species if they have not been associated with adult males.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tfaily, Malak M.; Chu, Rosalie K.; Toyoda, Jason
A vast number of organic compounds are present in soil organic matter (SOM) and play an important role in the terrestrial carbon cycle, facilitate interactions between organisms, and represent a sink for atmospheric CO2. The diversity of different SOM compounds and their molecular characteristics is a function of the organic source material and biogeochemical history. By understanding how SOM composition changes with sources and the processes by which it is biogeochemically altered in different terrestrial ecosystems, it may be possible to predict nutrient and carbon cycling, responses to system perturbations, and the impact climate change will have on SOM composition. In this study, a sequential chemical extraction procedure was developed to reveal the diversity of organic matter (OM) in different ecosystems and was compared to the previously published protocol using parallel solvent extraction (PSE). We compared six extraction methods using three sample types, peat soil, spruce forest soil and river sediment, so as to select the best method for extracting a representative fraction of organic matter from soils and sediments from a wide range of ecosystems. We estimated the extraction yield of dissolved organic carbon (DOC) by total organic carbon analysis, and measured the composition of extracted OM using high resolution mass spectrometry. This study showed that OM composition depends primarily on soil and sediment characteristics. Two sequential extraction protocols, progressing from polar to non-polar solvents, were found to provide the highest number and diversity of organic compounds extracted from the soil and sediments. Water (H2O) is the first solvent used for both protocols, followed either by co-extraction with a methanol-chloroform (MeOH-CHCl3) mixture or by acetonitrile (ACN) and CHCl3 sequentially.
The sequential extraction protocol developed in this study offers improved sensitivity and requires less sample compared to the PSE workflow, where a new sample is used for each solvent type. Furthermore, a comparison of SOM composition from the different sample types revealed that our sequential protocol allows for ecosystem comparisons based on the diversity of compounds present, which in turn could provide new insights about the source and processing of organic compounds in different soil and sediment types.
Tfaily, Malak M; Chu, Rosalie K; Toyoda, Jason; Tolić, Nikola; Robinson, Errol W; Paša-Tolić, Ljiljana; Hess, Nancy J
2017-06-15
A vast number of organic compounds are present in soil organic matter (SOM) and play an important role in the terrestrial carbon cycle, facilitate interactions between organisms, and represent a sink for atmospheric CO2. The diversity of different SOM compounds and their molecular characteristics is a function of the organic source material and biogeochemical history. By understanding how SOM composition changes with sources and the processes by which it is biogeochemically altered in different terrestrial ecosystems, it may be possible to predict nutrient and carbon cycling, responses to system perturbations, and the impact climate change will have on SOM composition. In this study, a sequential chemical extraction procedure was developed to reveal the diversity of organic matter (OM) in different ecosystems and was compared to the previously published protocol using parallel solvent extraction (PSE). We compared six extraction methods using three sample types, peat soil, spruce forest soil and river sediment, so as to select the best method for extracting a representative fraction of organic matter from soils and sediments from a wide range of ecosystems. We estimated the extraction yield of dissolved organic carbon (DOC) by total organic carbon analysis, and measured the composition of extracted OM using high resolution mass spectrometry. This study showed that OM composition depends primarily on soil and sediment characteristics. Two sequential extraction protocols, progressing from polar to non-polar solvents, were found to provide the highest number and diversity of organic compounds extracted from the soil and sediments. Water (H2O) is the first solvent used for both protocols, followed either by co-extraction with a methanol-chloroform (MeOH-CHCl3) mixture or by acetonitrile (ACN) and CHCl3 sequentially.
The sequential extraction protocol developed in this study offers improved sensitivity and requires less sample compared to the PSE workflow, where a new sample is used for each solvent type. Furthermore, a comparison of SOM composition from the different sample types revealed that our sequential protocol allows for ecosystem comparisons based on the diversity of compounds present, which in turn could provide new insights about the source and processing of organic compounds in different soil and sediment types. Copyright © 2017 Elsevier B.V. All rights reserved.
Guidelines and sample protocol for sampling forest gaps.
J.R. Runkle
1992-01-01
A protocol for sampling forest canopy gaps is presented. Methods used in published gap studies are reviewed. The sample protocol will be useful in developing a broader understanding of forest structure and dynamics through comparative studies across different forest ecosystems.
It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two…).
Rummel, John D; Kminek, Gerhard
2018-04-01
The last time NASA envisioned a sample return mission from Mars, the development of a protocol to support the analysis of the samples in a containment facility resulted in a "Draft Test Protocol" that outlined required preparations "for the safe receiving, handling, testing, distributing, and archiving of martian materials here on Earth" (Rummel et al., 2002). This document comprised a specific protocol to be used to conduct a biohazard test for a returned martian sample, following the recommendations of the Space Studies Board of the US National Academy of Sciences. Given the planned launch of a sample-collecting and sample-caching rover (Mars 2020) in 2 years' time, and with a sample return planned for the end of the next decade, it is time to revisit the Draft Test Protocol to develop a sample analysis and biohazard test plan to meet the needs of these future missions. Key Words: Biohazard detection-Mars sample analysis-Sample receiving facility-Protocol-New analytical techniques-Robotic sample handling. Astrobiology 18, 377-380.
Lee, Ju Seok; Chen, Junghuei; Deaton, Russell; Kim, Jin-Woo
2014-01-01
Genetic material extracted from in situ microbial communities has high promise as an indicator of biological system status. However, the challenge is to access genomic information from all organisms at the population or community scale to monitor the biosystem's state. Hence, there is a need for a better diagnostic tool that provides a holistic view of a biosystem's genomic status. Here, we introduce an in vitro methodology for genomic pattern classification of biological samples that taps large amounts of genetic information from all genes present and uses that information to detect changes in genomic patterns and classify them. We developed a biosensing protocol, termed Biological Memory, that has in vitro computational capabilities to "learn" and "store" genomic sequence information directly from genomic samples without knowledge of their explicit sequences, and that discovers differences in vitro between previously unknown inputs and learned memory molecules. The Memory protocol was designed and optimized based upon (1) common in vitro recombinant DNA operations using 20-base random probes, including polymerization, nuclease digestion, and magnetic bead separation, to capture a snapshot of the genomic state of a biological sample as a DNA memory and (2) the thermal stability of DNA duplexes between new input and the memory to detect similarities and differences. For efficient read out, a microarray was used as an output method. When the microarray-based Memory protocol was implemented to test its capability and sensitivity using genomic DNA from two model bacterial strains, i.e., Escherichia coli K12 and Bacillus subtilis, results indicate that the Memory protocol can "learn" input DNA, "recall" similar DNA, differentiate between dissimilar DNA, and detect relatively small concentration differences in samples. 
This study demonstrated not only the in vitro information processing capabilities of DNA, but also its promise as a genomic pattern classifier that could access information from all organisms in a biological system without explicit genomic information. The Memory protocol has high potential for many applications, including in situ biomonitoring of ecosystems, screening for diseases, biosensing of pathological features in water and food supplies, and non-biological information processing of memory devices, among many others.
Optimizing Urine Processing Protocols for Protein and Metabolite Detection.
Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K
In urine, factors such as timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add more complexity in sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically-obtained urine samples that allows for the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquotted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35-65 years of age provided paired 1st morning and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM compared to random "spot" voids. The addition of BA did not significantly change proteins, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.
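The creatinine normalization described above is a simple ratio correction for urine dilution; a minimal sketch (the function name and example concentrations are hypothetical, not taken from the study):

```python
def creatinine_normalize(analyte_mg_dl: float, creatinine_mg_dl: float) -> float:
    """Express a urine analyte concentration per mg of creatinine,
    correcting for dilution differences between voids."""
    if creatinine_mg_dl <= 0:
        raise ValueError("creatinine concentration must be positive")
    return analyte_mg_dl / creatinine_mg_dl

# Example: 12 mg/dL protein in a void with 80 mg/dL creatinine
ratio = creatinine_normalize(12.0, 80.0)  # -> 0.15 mg protein per mg creatinine
```

Comparing such ratios, rather than raw concentrations, is what makes first-morning and random voids comparable across processing conditions.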
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flewett, S.; Saintenoy, T.; Sepulveda, M.
Archeological ceramic paste material typically consists of a mix of a clay matrix and various millimeter and sub-millimeter sized mineral inclusions. Micro X-ray Fluorescence (μXRF) is a standard compositional classification tool, and in this work we propose and demonstrate an improved fluorescence map processing protocol where the mineral inclusions are automatically separated from the clay matrix to allow independent statistical analysis of the two parts. Application of this protocol allowed us to enhance the discrimination between different ceramic shards compared with the standard procedure of working with only the spatially averaged elemental concentrations. Using the new protocol, we performed an initial compositional classification of a set of 83 ceramic shards from the western slopes of the south central Andean region in the Arica y Parinacota region of present-day far northern Chile. Comparing the classifications obtained using the new versus the old (average concentrations only) protocols, we found that some samples were erroneously classified with the old protocol. From an archaeological perspective, a very broad and heterogeneous sample set was used in this study because this was the first such study performed on ceramics from this region. This allowed a general overview to be obtained; however, further work on more specific sample sets will be necessary to extract concrete archaeological conclusions.
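Separating inclusions from the clay matrix in a fluorescence map can be sketched as a robust outlier mask on an elemental intensity image. This is an illustrative approach only: the paper's actual segmentation algorithm is not given here, and the array values and `n_sigma` cutoff below are invented.

```python
import numpy as np

def split_inclusions(elem_map: np.ndarray, n_sigma: float = 2.0) -> np.ndarray:
    """Flag pixels whose intensity deviates strongly from the map median
    as mineral inclusions; the remaining pixels are treated as clay matrix."""
    med = np.median(elem_map)
    mad = np.median(np.abs(elem_map - med)) or 1e-12  # robust spread estimate
    return np.abs(elem_map - med) > n_sigma * 1.4826 * mad

# Toy 2-D elemental map: uniform matrix plus one bright "inclusion" patch
rng = np.random.default_rng(0)
fe_map = rng.normal(100.0, 5.0, size=(64, 64))
fe_map[10:14, 10:14] = 400.0
mask = split_inclusions(fe_map)
matrix_mean = fe_map[~mask].mean()       # statistics of the matrix alone
inclusion_mean = fe_map[mask].mean()     # statistics of the inclusions alone
```

Computing statistics on the two masked populations separately, instead of one spatial average, is the core idea behind the improved discrimination described above.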
Tensile and Microindentation Stress-Strain Curves of Al-6061
Weaver, Jordan S [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Center for Integrated Nanotechnologies (CINT)]; Khosravani, Ali [Georgia Inst. of Technology, Atlanta, GA (United States)]; Castillo, Andrew [Georgia Inst. of Technology, Atlanta, GA (United States)]; Kalidindi, Surya R [Georgia Inst. of Technology, Atlanta, GA (United States)]
2016-07-13
Recent spherical microindentation stress-strain protocols were developed and validated on Al-6061 (DOI: 10.1186/s40192-016-0054-3). The scaling factor between the uniaxial yield strength and the indentation yield strength was determined to be about 1.9. The microindentation stress-strain protocols were then applied to a microstructurally graded sample in an effort to extract high throughput process-property relationships. The tensile and microindentation force-displacement and stress-strain data are presented in this data set.
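The quoted scaling factor of about 1.9 implies a simple conversion from indentation yield strength to an estimated uniaxial yield strength; a minimal sketch (the example stress value is hypothetical, not from the data set):

```python
SCALING_FACTOR = 1.9  # indentation yield / uniaxial yield, per the study

def uniaxial_yield_from_indentation(y_ind_mpa: float,
                                    factor: float = SCALING_FACTOR) -> float:
    """Estimate uniaxial yield strength (MPa) from a spherical
    microindentation yield strength using the empirical scaling factor."""
    return y_ind_mpa / factor

# e.g. an indentation yield of 532 MPa implies roughly 280 MPa uniaxial yield
est = uniaxial_yield_from_indentation(532.0)
```

This kind of one-line conversion is what makes the high-throughput indentation measurements comparable to tensile data.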
Rosskopf, Johannes; Müller, Hans-Peter; Dreyhaupt, Jens; Gorges, Martin; Ludolph, Albert C; Kassubek, Jan
2015-03-01
Diffusion tensor imaging (DTI) for assessing ALS-associated white matter alterations has still not reached the level of a neuroimaging biomarker. Since large-scale multicentre DTI studies in ALS may be hampered by differences in scanning protocols, an approach for pooling of DTI data acquired with different protocols was investigated. Three hundred and nine datasets from 170 ALS patients and 139 controls were collected ex post facto from a monocentric database reflecting different scanning protocols. A 3D correction algorithm was introduced for a combined analysis of DTI metrics despite different acquisition protocols, with the focus on the corticospinal tract (CST) as the tract correlate of ALS neuropathological stage 1. A homogeneous set of data was obtained by application of 3D correction matrices. Results showed that a fractional anisotropy (FA) threshold of 0.41 could be defined to discriminate ALS patients from controls (sensitivity/specificity, 74%/72%). For the remaining test sample, sensitivity/specificity values of 68%/74% were obtained. In conclusion, the objective was to merge data recorded with different DTI protocols with 3D correction matrices for analyses at group level. These post-processing tools might facilitate analysis of large study samples in a multicentre setting for DTI analysis at group level to aid in establishing DTI as a non-invasive biomarker for ALS.
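The reported discrimination amounts to a single-threshold classifier on FA values. A minimal sketch of how a 0.41 cutoff yields sensitivity and specificity (the toy cohort values below are invented for illustration, not study data):

```python
def threshold_classify(fa_values, labels, cutoff=0.41):
    """Classify subjects as ALS (positive) when mean CST fractional
    anisotropy falls below the cutoff; return (sensitivity, specificity).
    labels: True for ALS patients, False for controls."""
    tp = sum(1 for fa, als in zip(fa_values, labels) if als and fa < cutoff)
    fn = sum(1 for fa, als in zip(fa_values, labels) if als and fa >= cutoff)
    tn = sum(1 for fa, als in zip(fa_values, labels) if not als and fa >= cutoff)
    fp = sum(1 for fa, als in zip(fa_values, labels) if not als and fa < cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Toy cohort: 4 patients, 4 controls
fa = [0.36, 0.39, 0.43, 0.40, 0.45, 0.42, 0.38, 0.44]
als = [True, True, True, True, False, False, False, False]
sens, spec = threshold_classify(fa, als)  # -> (0.75, 0.75)
```

Reduced FA along the CST is the expected direction of change in ALS, which is why patients fall below the cutoff.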
ISPyB: an information management system for synchrotron macromolecular crystallography.
Delagenière, Solange; Brenchereau, Patrice; Launer, Ludovic; Ashton, Alun W; Leal, Ricardo; Veyrier, Stéphanie; Gabadinho, José; Gordon, Elspeth J; Jones, Samuel D; Levik, Karl Erik; McSweeney, Seán M; Monaco, Stéphanie; Nanao, Max; Spruce, Darren; Svensson, Olof; Walsh, Martin A; Leonard, Gordon A
2011-11-15
Individual research groups now analyze thousands of samples per year at synchrotron macromolecular crystallography (MX) resources. The efficient management of experimental data is thus essential if the best possible experiments are to be performed and the best possible data used in downstream processes in structure determination pipelines. Information System for Protein crystallography Beamlines (ISPyB), a Laboratory Information Management System (LIMS) with an underlying data model allowing for the integration of analyses downstream of the data collection experiment, was developed to facilitate such data management. ISPyB is now a multisite, generic LIMS for synchrotron-based MX experiments. Its initial functionality has been enhanced to include improved sample tracking and reporting of experimental protocols, the direct ranking of the diffraction characteristics of individual samples, and the archiving of raw data and results from ancillary experiments and post-experiment data processing protocols. This latter feature paves the way for ISPyB to play a central role in future macromolecular structure solution pipelines and validates the application of the approach used in ISPyB to other experimental techniques, such as biological solution Small Angle X-ray Scattering and spectroscopy, which have similar sample tracking and data handling requirements.
The UK Biobank sample handling and storage validation studies.
Peakman, Tim C; Elliott, Paul
2008-04-01
Background and aims: UK Biobank is a large prospective study in the United Kingdom to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of middle and old age. It involves the collection of blood and urine from 500 000 individuals aged between 40 and 69 years. How the samples are collected, processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. A series of validation studies was recommended to test the robustness of the draft sample handling and storage protocol. Samples of blood and urine were collected from 40 healthy volunteers and either processed immediately according to the protocol or maintained at specified temperatures (4 degrees C for all tubes with the exception of vacutainers containing acid citrate dextrose that were maintained at 18 degrees C) for 12, 24 or 36 h prior to processing. A further sample was maintained for 24 h at 4 degrees C, processed and the aliquots frozen at -80 degrees C for 20 days and then thawed under controlled conditions. The stability of the samples was compared for the different times in a wide variety of assays. The samples maintained at 4 degrees C were stable for at least 24 h after collection for a wide range of assays. Small but significant changes were observed in metabonomic studies in samples maintained at 4 degrees C for 36 h. There was no degradation of the samples for a range of biochemical assays after short-term freezing and thawing under controlled conditions. Whole blood maintained at 18 degrees C for 24 h in vacutainers containing acid citrate dextrose is suitable for viral immortalization techniques. The validation studies reported in this supplement provide justification for the sample handling and storage procedures adopted in the UK Biobank project.
Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang
2013-07-25
Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase the labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol to handle biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% desired labeling could be obtained in all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation on the peptide amines. This work illustrates the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Sampling protocol for post-landfall Deepwater Horizon oil release, Gulf of Mexico, 2010
Wilde, F.D.; Skrobialowski, S.C.; Hart, J.S.
2010-01-01
The protocols and procedures described in this report are designed to be used by U.S. Geological Survey (USGS) field teams for the collection of environmental data and samples in coastal areas affected by the 2010 Deepwater Horizon oil spill in the Gulf of Mexico. This sampling protocol focuses specifically on sampling for water, sediments, benthic invertebrates, and microorganisms (ambient bacterial populations) after shoreline arrival of petroleum-associated product on beach, barrier island, and wetland environments of the Gulf of Mexico coastal states. Deployment to sampling sites, site setup, and sample collection in these environments necessitates modifications to standard USGS sampling procedures in order to address the regulatory, logistical, and legal requirements associated with samples collected in oil-impacted coastal areas. This document, therefore, has been written as an addendum to the USGS National Field Manual for the Collection of Water-Quality Data (NFM) (http://pubs.water.usgs.gov/twri9A/), which provides the basis for training personnel in the use of standard USGS sampling protocols. The topics covered in this Gulf of Mexico oil-spill sampling protocol augment NFM protocols for field-deployment preparations, health and safety precautions, sampling and quality-assurance procedures, and decontamination requirements under potentially hazardous environmental conditions. Documentation procedures and maintenance of sample integrity by use of chain-of-custody procedures also are described in this protocol.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stemmer, Kerstin; Ellinger-Ziegelbauer, Heidrun; Lotz, Kerstin
2006-11-15
Laser microdissection in conjunction with microarray technology allows selective isolation and analysis of specific cell populations, e.g., preneoplastic renal lesions. To date, only limited information is available on sample preparation and preservation techniques that result in both optimal histomorphological preservation of sections and high-quality RNA for microarray analysis. Furthermore, amplification of minute amounts of RNA from microdissected renal samples allowing analysis with genechips has only scantily been addressed to date. The objective of this study was therefore to establish a reliable and reproducible protocol for laser microdissection in conjunction with microarray technology using kidney tissue from Eker rats treated p.o. for 7 days and 6 months with 10 and 1 mg Aristolochic acid/kg bw, respectively. Kidney tissues were preserved in RNAlater or snap frozen. Cryosections were cut and stained with either H&E or cresyl violet for subsequent morphological and RNA quality assessment and laser microdissection. RNA quality was comparable in snap-frozen and RNAlater-preserved samples; however, the histomorphological preservation of renal sections was much better following cryopreservation. Moreover, the different staining techniques in combination with sample processing time at room temperature can have an influence on RNA quality. Different RNA amplification protocols were shown to have an impact on gene expression profiles as demonstrated with Affymetrix Rat Genome 230 2.0 arrays. Considering all the parameters analyzed in this study, a protocol for RNA isolation from laser-microdissected samples with subsequent Affymetrix chip hybridization was established that was also successfully applied to preneoplastic lesions laser-microdissected from Aristolochic acid-treated rats.
Evaluation of storage and filtration protocols for alpine/subalpine lake water quality samples
John L. Korfmacher; Robert C. Musselman
2007-01-01
Many government agencies and other organizations sample natural alpine and subalpine surface waters using varying protocols for sample storage and filtration. Simplification of protocols would be beneficial if it could be shown that sample quality is unaffected. In this study, samples collected from low ionic strength waters in alpine and subalpine lake inlets...
7 CFR 301.92-11 - Inspection and sampling protocols.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 5 2010-01-01 2010-01-01 false Inspection and sampling protocols. 301.92-11 Section... Inspection and sampling protocols. Type(s) of plants in the nursery Type(s) of plants shipped interstate... interstate. (1) Annual inspection, sampling, and testing—(i) Inspection. The nursery must be inspected...
7 CFR 301.92-11 - Inspection and sampling protocols.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 5 2011-01-01 2011-01-01 false Inspection and sampling protocols. 301.92-11 Section... Inspection and sampling protocols. Type(s) of plants in the nursery Type(s) of plants shipped interstate... interstate. (1) Annual inspection, sampling, and testing—(i) Inspection. The nursery must be inspected...
NASA Astrophysics Data System (ADS)
Schwab, Michael; Klaus, Julian; Pfister, Laurent; Weiler, Markus
2015-04-01
Over the past decades, stream sampling protocols for environmental tracers were often limited by logistical and technological constraints. Long-term sampling programs would typically rely on weekly sampling campaigns, while high-frequency sampling would remain restricted to a few days or hours at best. We stipulate that the currently predominant sampling protocols are too coarse to capture and understand the full amplitude of rainfall-runoff processes and their relation to water quality fluctuations. Weekly sampling protocols are not suited to get insights into the hydrological system during high flow conditions. Likewise, high-frequency measurements of a few isolated events do not allow grasping inter-event variability in contributions and processes. Our working hypothesis is based on the potential of a new generation of field-deployable instruments for measuring environmental tracers at high temporal frequencies over an extended period. With this new generation of instruments we expect to gain new insights into rainfall-runoff dynamics, both at intra- and inter-event scales. Here, we present the results of one year of DOC and nitrate measurements with the field-deployable UV-Vis spectrometer spectro::lyser (s::can Messtechnik GmbH). The instrument measures the absorption spectrum from 220 to 720 nm in situ and at high frequencies and derives DOC and nitrate concentrations. The measurements were carried out at 15-minute intervals in the Weierbach catchment (0.47 km2) in Luxembourg. This fully forested catchment is characterized by cambisol soils and fractured schist as underlying bedrock. The time series of DOC and nitrate give insights into the high-frequency dynamics of stream water. Peaks in DOC concentrations are closely linked to discharge peaks that occur during or right after a rainfall event. Those first discharge peaks can be linked to fast near-surface runoff processes and are responsible for a remarkable amount of DOC export.
A special characteristic of the Weierbach catchment is the delayed second discharge peak occurring a few days after a rainfall event. Nitrate concentrations follow this second peak. We assume that this delayed response originates from subsurface or upper groundwater flow carrying nitrate-enriched water. On an inter-event scale, during low-flow/base-flow conditions, we observe interesting diurnal patterns in both DOC and nitrate concentrations. Overall, the long-term high-frequency measurements of DOC and nitrate provide the opportunity to separate different rainfall-runoff processes and to attribute DOC and nitrate export to each of them, quantifying the overall relevance of the different processes.
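Flagging concentration peaks in a 15-minute series, as done qualitatively above, can be sketched as simple local-maximum detection; the series below is synthetic and the `min_height` threshold is an assumption, not a value from the study:

```python
def local_peaks(series, min_height):
    """Indices of simple local maxima above min_height in a 1-D series."""
    return [i for i in range(1, len(series) - 1)
            if series[i] > min_height
            and series[i] > series[i - 1] and series[i] > series[i + 1]]

# Synthetic 15-min DOC series (mg/L) with two event-driven peaks
doc = [2.0, 2.1, 2.0, 4.8, 2.2, 2.1, 2.0, 3.9, 2.1, 2.0]
peaks = local_peaks(doc, min_height=3.0)  # -> [3, 7]
```

Matching such concentration peaks against discharge peaks is what allows the first (fast runoff) and delayed (subsurface) responses to be separated.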
Dukić, Lora; Kopčinović, Lara Milevoj; Dorotić, Adrijana; Baršić, Ivana
2016-10-15
Blood gas analysis (BGA) is exposed to risks of errors caused by improper sampling, transport and storage conditions. The Clinical and Laboratory Standards Institute (CLSI) generated documents with recommendations for avoidance of potential errors caused by sample mishandling. Two main documents related to BGA issued by the CLSI are GP43-A4 (former H11-A4) Procedures for the collection of arterial blood specimens; approved standard - fourth edition, and C46-A2 Blood gas and pH analysis and related measurements; approved guideline - second edition. Practices related to processing of blood gas samples are not standardized in the Republic of Croatia. Each institution has its own protocol for ordering, collection and analysis of blood gases. Although many laboratories use state of the art analyzers, still many preanalytical procedures remain unchanged. The objective of the Croatian Society of Medical Biochemistry and Laboratory Medicine (CSMBLM) is to standardize the procedures for BGA based on CLSI recommendations. The Working Group for Blood Gas Testing as part of the Committee for the Scientific Professional Development of the CSMBLM prepared a set of recommended protocols for sampling, transport, storage and processing of blood gas samples based on relevant CLSI documents, relevant literature search and on the results of Croatian survey study on practices and policies in acid-base testing. Recommendations are intended for laboratory professionals and all healthcare workers involved in blood gas processing.
Dukić, Lora; Kopčinović, Lara Milevoj; Dorotić, Adrijana; Baršić, Ivana
2016-01-01
Blood gas analysis (BGA) is exposed to risks of errors caused by improper sampling, transport and storage conditions. The Clinical and Laboratory Standards Institute (CLSI) generated documents with recommendations for avoidance of potential errors caused by sample mishandling. Two main documents related to BGA issued by the CLSI are GP43-A4 (former H11-A4) Procedures for the collection of arterial blood specimens; approved standard – fourth edition, and C46-A2 Blood gas and pH analysis and related measurements; approved guideline – second edition. Practices related to processing of blood gas samples are not standardized in the Republic of Croatia. Each institution has its own protocol for ordering, collection and analysis of blood gases. Although many laboratories use state of the art analyzers, still many preanalytical procedures remain unchanged. The objective of the Croatian Society of Medical Biochemistry and Laboratory Medicine (CSMBLM) is to standardize the procedures for BGA based on CLSI recommendations. The Working Group for Blood Gas Testing as part of the Committee for the Scientific Professional Development of the CSMBLM prepared a set of recommended protocols for sampling, transport, storage and processing of blood gas samples based on relevant CLSI documents, relevant literature search and on the results of Croatian survey study on practices and policies in acid-base testing. Recommendations are intended for laboratory professionals and all healthcare workers involved in blood gas processing. PMID:27812301
NASA Astrophysics Data System (ADS)
Ohlendorf, Christian; Gebhardt, Catalina; Hahn, Annette; Kliem, Pierre; Zolitschka, Bernd
2011-07-01
Using the ICDP (International Continental Scientific Drilling Program) deep lake drilling expedition no. 5022 as an example, we describe core processing and sampling procedures as well as new tools developed for subsampling. A manual core splitter is presented that is (1) mobile, (2) able to cut plastic core liners lengthwise without producing swarf of liner material and (3) consists of off-the-shelf components. In order to improve the sampling of sediment cores, a new device, the core sampling assembly (CSA), was developed that meets the following targets: (1) the partitioning of the sediment into discs of equal thickness is fast and precise, (2) disturbed sediment at the inner surface of the liner is discarded during this sampling process, (3) usage of the available sediment is optimised, (4) subsamples are volumetric and oriented, and (5) identical subsamples are taken. The CSA can be applied to D-shaped split sediment cores of any diameter and consists of a divider and a D-shaped scoop. The sampling plan applied for ICDP expedition 5022 is illustrated and may be used as a guideline for planning the efficient partitioning of sediment amongst different lake research groups involved in multidisciplinary projects. For every subsample, the use of quality flags is suggested (1) to document the sample condition, (2) to give a first sediment classification and (3) to guarantee a precise adjustment of logging and scanning data with data determined on individual samples. Based on this, we propose a protocol that might be applied across lake drilling projects in order to facilitate planning and documentation of sampling campaigns and to ensure a better comparability of results.
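The CSA's volumetric, oriented subsamples come from slicing a D-shaped (half-cylinder) split core into discs of equal thickness. As a rough illustration of the geometry only (the function name and the example dimensions are ours, not values from expedition 5022, and disturbed rim material discarded by the CSA is not accounted for), the volume of one disc can be sketched as:

```python
import math

def d_core_disc_volume_cm3(liner_diameter_cm: float, disc_thickness_cm: float) -> float:
    """Volume of one disc cut from an ideal D-shaped (half-cylinder) split core."""
    radius = liner_diameter_cm / 2.0
    # Half the full-cylinder cross-section, times the disc thickness.
    return 0.5 * math.pi * radius ** 2 * disc_thickness_cm
```

For a hypothetical 2 cm liner diameter and 1 cm disc thickness this gives π/2 ≈ 1.57 cm³ per disc; doubling the disc thickness doubles the subsample volume, which is what makes the equal-thickness partitioning volumetric.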
Hoffmayer, Eric R; Hendon, Jill M; Parsons, Glenn R; Driggers, William B; Campbell, Matthew D
2015-10-01
Elasmobranch stress responses are traditionally measured in the field by either singly or serially sampling an animal after a physiologically stressful event. Although capture and handling techniques are effective at inducing a stress response, differences in protocols could affect the degree of stress experienced by an individual, making meaningful comparisons between the protocols difficult, if not impossible. This study acutely stressed Atlantic sharpnose sharks, Rhizoprionodon terraenovae, by standardized capture (rod and reel) and handling methods and implemented either a single or serial blood sampling protocol to monitor four indicators of the secondary stress response. Single-sampled sharks were hooked and allowed to swim around the boat until retrieved for a blood sample at either 0, 15, 30, 45, or 60 min post-hooking. Serially sampled sharks were retrieved, phlebotomized, released while still hooked, and subsequently resampled at 15, 30, 45, and 60 min intervals post-hooking. Blood was analyzed for hematocrit, and plasma glucose, lactate, and osmolality levels. Although both single and serial sampling protocols resulted in an increase in glucose, no significant difference in glucose level was found between protocols. Serially sampled sharks exhibited cumulatively heightened levels for lactate and osmolality at all time intervals when compared to single-sampled animals at the same time. Maximal concentration differences of 217.5, 9.8, and 41.6 % were reported for lactate, osmolality, and glucose levels, respectively. Hematocrit increased significantly over time for the single sampling protocol but did not change significantly during the serial sampling protocol. The differences in resultant blood chemistry levels between implemented stress protocols and durations are significant and need to be considered when assessing stress in elasmobranchs.
A Draft Test Protocol for Detecting Possible Biohazards in Martian Samples Returned to Earth
NASA Technical Reports Server (NTRS)
Rummel, John D. (Editor); Race, Margaret S.; DeVincenzi, Donald L.; Schad, P. Jackson; Stabekis, Pericles D.; Viso, Michel; Acevedo, Sara E.
2002-01-01
This document presents the first complete draft of a protocol for detecting possible biohazards in Mars samples returned to Earth; it is the final product of the Mars Sample Handling Protocol Workshop Series, convened in 2000-2001 by NASA's Planetary Protection Officer. The goal of the five-workshop Series was to develop a comprehensive protocol by which returned martian sample materials could be assessed for the presence of any biological hazard(s) while safeguarding the purity of the samples from possible terrestrial contamination.
2013-09-01
sequence dataset. All procedures were performed by personnel in the IIMT UT Southwestern Genomics and Microarray Core using standard protocols. … sequencing run, samples were demultiplexed using standard algorithms in the Genomics and Microarray Core and processed into individual sample Illumina single-… Sequencing (RNA-Seq), using Illumina's multiplexing mRNA-Seq to generate full sequence libraries from the poly-A tailed RNA to a read depth of 30
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-03
... assess the performance of an approved sampling protocol and to allow for continued sample collection and... developmental sampling protocol. While this application was being reviewed and was available for public comment, the sampling protocol being tested was adopted into the National Shellfish Sanitation Program by the...
A process for creating multimetric indices for large-scale aquatic surveys
Differences in sampling and laboratory protocols, differences in techniques used to evaluate metrics, and differing scales of calibration and application prohibit the use of many existing multimetric indices (MMIs) in large-scale bioassessments. We describe an approach to develop...
Recommended protocols for sampling macrofungi
Gregory M. Mueller; John Paul Schmit; Sabine M. Hubndorf; Leif Ryvarden; Thomas E. O'Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz
2004-01-01
This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.
National Sample Assessment Protocols
ERIC Educational Resources Information Center
Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012
2012-01-01
These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…
Modeling the Sensitivity of Field Surveys for Detection of Environmental DNA (eDNA)
Schultz, Martin T.; Lance, Richard F.
2015-01-01
The environmental DNA (eDNA) method is the practice of collecting environmental samples and analyzing them for the presence of a genetic marker specific to a target species. Little is known about the sensitivity of the eDNA method. Sensitivity is the probability that the target marker will be detected if it is present in the water body. Methods and tools are needed to assess the sensitivity of sampling protocols, design eDNA surveys, and interpret survey results. In this study, the sensitivity of the eDNA method is modeled as a function of ambient target marker concentration. The model accounts for five steps of sample collection and analysis, including: 1) collection of a filtered water sample from the source; 2) extraction of DNA from the filter and isolation in a purified elution; 3) removal of aliquots from the elution for use in the polymerase chain reaction (PCR) assay; 4) PCR; and 5) genetic sequencing. The model is applicable to any target species. For demonstration purposes, the model is parameterized for bighead carp (Hypophthalmichthys nobilis) and silver carp (H. molitrix) assuming sampling protocols used in the Chicago Area Waterway System (CAWS). Simulation results show that eDNA surveys have a high false negative rate at low concentrations of the genetic marker. This is attributed to processing of water samples and division of the extraction elution in preparation for the PCR assay. Increases in field survey sensitivity can be achieved by increasing sample volume, sample number, and PCR replicates. Increasing sample volume yields the greatest increase in sensitivity. It is recommended that investigators estimate and communicate the sensitivity of eDNA surveys to help facilitate interpretation of eDNA survey results. In the absence of such information, it is difficult to evaluate the results of surveys in which no water samples test positive for the target marker. 
It is also recommended that invasive species managers articulate concentration-based sensitivity objectives for eDNA surveys. In the absence of such information, it is difficult to design appropriate sampling protocols. The model provides insights into how sampling protocols can be designed or modified to achieve these sensitivity objectives. PMID:26509674
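The five-step structure described above lends itself to a simple probabilistic sketch. The following is a minimal illustration, not the authors' model: it assumes markers are captured on the filter as a Poisson process, treats PCR replicates as independent draws from the elution (ignoring depletion), and uses entirely hypothetical parameter values.

```python
import math

def survey_sensitivity(markers_per_l: float, sample_vol_l: float = 2.0,
                       extraction_eff: float = 0.5, elution_ul: float = 100.0,
                       aliquot_ul: float = 5.0, p_amplify: float = 0.8,
                       pcr_replicates: int = 8, n_samples: int = 1) -> float:
    """P(at least one positive PCR replicate over all samples collected)."""
    # Expected marker copies recovered into the elution (Poisson mean).
    mean_copies = markers_per_l * sample_vol_l * extraction_eff
    # Chance a given recovered copy lands in one aliquot and amplifies.
    p_copy_detected = (aliquot_ul / elution_ul) * p_amplify
    # Poisson thinning: a replicate is negative if no copy is detected in it.
    p_replicate_neg = math.exp(-mean_copies * p_copy_detected)
    p_sample_neg = p_replicate_neg ** pcr_replicates
    return 1.0 - p_sample_neg ** n_samples
```

Even in this toy version, sensitivity collapses toward zero at low marker concentrations, mirroring the high false-negative rate the simulations report, and it rises with sample volume, sample number, and PCR replicates.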
Segat, Ludovica; Padovan, Lara; Doc, Darja; Petix, Vincenzo; Morgutti, Marcello; Crovella, Sergio; Ricci, Giuseppe
2012-12-01
We describe a real-time polymerase chain reaction (PCR) protocol based on the fluorescent molecule SYBR Green chemistry, for a low- to medium-throughput analysis of Y-chromosome microdeletions, optimized according to the European guidelines and aimed at making the protocol faster, avoiding post-PCR processing, and simplifying the results interpretation. We screened 156 men from the Assisted Reproduction Unit, Department of Obstetrics and Gynecology, Institute for Maternal and Child Health IRCCS Burlo Garofolo (Trieste, Italy), 150 not presenting Y-chromosome microdeletion, and 6 with microdeletions in different azoospermic factor (AZF) regions. For each sample, the Zinc finger Y-chromosomal protein (ZFY), sex-determining region Y (SRY), sY84, sY86, sY127, sY134, sY254, and sY255 loci were analyzed by performing one reaction for each locus. AZF microdeletions were successfully detected in six individuals, confirming the results obtained with commercial kits. Our real-time PCR protocol proved to be a rapid, safe, and relatively cheap method that was suitable for a low- to medium-throughput diagnosis of Y-chromosome microdeletion, which allows an analysis of approximately 10 samples (with the addition of positive and negative controls) in a 96-well plate format, or approximately 46 samples in a 384-well plate for all markers simultaneously, in less than 2 h without the need of post-PCR manipulation.
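The plate-capacity figures quoted above follow from simple arithmetic: with one reaction per locus per sample and eight loci (ZFY, SRY, sY84, sY86, sY127, sY134, sY254, sY255), the well count fixes how many samples fit alongside the positive and negative controls. A quick check of the numbers (the helper function is ours):

```python
def max_samples_per_plate(plate_wells: int, n_loci: int, n_controls: int = 2) -> int:
    """Samples that fit when every sample (and each control) uses one well per locus."""
    return plate_wells // n_loci - n_controls

# Eight loci, one positive and one negative control: matches the abstract's figures.
print(max_samples_per_plate(96, 8))   # 10 samples in a 96-well plate
print(max_samples_per_plate(384, 8))  # 46 samples in a 384-well plate
```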
High-speed shaking of frozen blood clots for extraction of human and malaria parasite DNA.
Lundblom, Klara; Macharia, Alex; Lebbad, Marianne; Mohammed, Adan; Färnert, Anna
2011-08-08
Frozen blood clots remaining after serum collection are an often disregarded source of host and pathogen DNA because of troublesome handling and suboptimal outcomes. High-speed shaking of clot samples in a cell disruptor manufactured for homogenization of tissue and faecal specimens was evaluated for processing frozen blood clots for DNA extraction. The method was compared to two commercial clot protocols based on a chemical kit and centrifugation through a plastic sieve, followed by the same DNA extraction protocol. Blood clots with different levels of parasitaemia (1-1,000 p/μl) were prepared from parasite cultures to assess the sensitivity of PCR detection. In addition, clots retrieved from serum samples collected within two epidemiological studies in Kenya (n = 630) were processed by high-speed shaking and analysed by PCR for detection of malaria parasites and the human α-thalassaemia gene. High-speed shaking succeeded in fully dispersing the clots and the method generated the highest DNA yield. The level of PCR detection of P. falciparum parasites and the human thalassaemia gene was the same as for samples optimally collected with an anticoagulant. The commercial clot protocol and centrifugation through a sieve failed to fully dissolve the clots and resulted in lower sensitivity of PCR detection. High-speed shaking was a simple and efficacious method for homogenizing frozen blood clots before DNA purification and resulted in PCR templates of high quality both from humans and malaria parasites. This novel method enables genetic studies from stored blood clots.
A new, ultra-low latency data transmission protocol for Earthquake Early Warning Systems
NASA Astrophysics Data System (ADS)
Hill, P.; Hicks, S. P.; McGowan, M.
2016-12-01
One measure used to assess the performance of Earthquake Early Warning Systems (EEWS) is the delay time between earthquake origin and issued alert. EEWS latency is dependent on a number of sources (e.g. P-wave propagation, digitisation, transmission, receiver processing, triggering, event declaration). Many regional seismic networks use the SEEDlink protocol; however, packet size is fixed to 512-byte miniSEED records, resulting in transmission latencies of >0.5 s. Data packetisation is seen as one of the main sources of delays in EEWS (Brown et al., 2011). Optimising data-logger and telemetry configurations is a cost-effective strategy to improve EEWS alert times (Behr et al., 2015). Digitisers with smaller, selectable packets can result in faster alerts (Sokos et al., 2016). We propose a new seismic protocol for regional seismic networks benefiting low-latency applications such as EEWS. The protocol, based on Güralp's existing GDI-link format, is an efficient and flexible method to exchange data between seismic stations and data centers for a range of network configurations. The main principle is to stream data sample-by-sample instead of fixed-length packets to minimise transmission latency. Self-adaptive packetisation with compression maximises available telemetry bandwidth. Highly flexible metadata fields within GDI-link are compatible with existing miniSEED definitions. Data is sent as integers or floats, supporting a wide range of data formats, including discrete parameters such as Pd & τC for on-site earthquake early warning. Other advantages include: streaming station state-of-health information, instrument control, support of backfilling and fail-over strategies during telemetry outages. Based on tests carried out on the Güralp Minimus data-logger, we show our new protocol can reduce transmission latency to as low as 1 ms. The low-latency protocol is currently being implemented with common processing packages.
The results of these tests will help to highlight latency levels that can be achieved with next-generation EEWS.
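The packetisation delay the authors target is easy to quantify. As a rough sketch (the header size and uncompressed 4-byte encoding are our assumptions; real miniSEED records typically use Steim compression, which packs more samples per record and lengthens the wait further):

```python
def packet_fill_latency_s(packet_bytes: int, header_bytes: int,
                          bytes_per_sample: int, sample_rate_hz: float) -> float:
    """Seconds the first sample in a fixed-size packet waits while the
    packet fills, before transmission of the packet can even begin."""
    samples_per_packet = (packet_bytes - header_bytes) // bytes_per_sample
    return samples_per_packet / sample_rate_hz

# 512-byte record, assumed 64-byte header, 4-byte samples, 100 Hz station:
fixed = packet_fill_latency_s(512, 64, 4, 100.0)        # 1.12 s before sending
streamed = packet_fill_latency_s(64 + 4, 64, 4, 100.0)  # one sample per packet: 0.01 s
```

This is why streaming sample-by-sample, as GDI-link does, removes the dominant fixed-packet contribution and leaves only per-sample transmission overhead.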
Performance Characteristics of Plasma Amyloid β 40 and 42 Assays
Okereke, Olivia I.; Xia, Weiming; Irizarry, Michael C.; Sun, Xiaoyan; Qiu, Wei Q.; Fagan, Anne M.; Mehta, Pankaj D.; Hyman, Bradley T.; Selkoe, Dennis J.; Grodstein, Francine
2009-01-01
Background Identifying biomarkers of Alzheimer disease (AD) risk will be critical to effective AD prevention. Levels of circulating amyloid β (Aβ) 40 and 42 may be candidate biomarkers. However, properties of plasma Aβ assays must be established. Methods Using five different protocols, blinded samples were used to assess: intra-assay reproducibility; impact of EDTA vs. heparin anticoagulant tubes; and effect of time-to-blood processing. In addition, percent recovery of known Aβ concentrations in spiked samples was assessed. Results Median intra-assay coefficients of variation (CVs) for the assay protocols ranged from 6–24% for Aβ-40, and 8–14% for Aβ-42. There were no systematic differences in reproducibility by collection method. Plasma concentrations of Aβ (particularly Aβ-42) appeared stable in whole blood kept in ice packs and processed as long as 24 hours after collection. Recovery of expected concentrations was modest, ranging from -24% to 44% recovery of Aβ-40, and 17% to 61% of Aβ-42. Conclusions Across five protocols, plasma Aβ-40 and Aβ-42 levels were measured with generally low error, and measurements appeared similar in blood collected in EDTA vs. heparin. While these preliminary findings suggest that measuring plasma Aβ-40 and Aβ-42 may be feasible in varied research settings, additional work in this area is necessary. PMID:19221417
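The two performance measures reported above, intra-assay coefficient of variation and percent recovery of spiked concentrations, are standard calculations; a minimal sketch (function names and example values are ours, not from the study):

```python
import statistics

def intra_assay_cv_pct(replicates: list) -> float:
    """Coefficient of variation (%): sample SD of replicate measurements over their mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

def percent_recovery(measured_spiked: float, measured_baseline: float,
                     spiked_concentration: float) -> float:
    """Share of a known spiked concentration that the assay actually reads back."""
    return 100.0 * (measured_spiked - measured_baseline) / spiked_concentration
```

A recovery below 100% (or even negative, as reported for one Aβ-40 protocol) means the assay read back less of the added analyte than was spiked in, which is why the authors call recovery "modest" despite the low intra-assay error.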
Glycerolized Reticular Dermis as a New Human Acellular Dermal Matrix: An Exploratory Study
Ferrando, Pietro Maria; Balmativola, Davide; Cambieri, Irene; Scalzo, Maria Stella; Bergallo, Massimiliano; Annaratone, Laura; Casarin, Stefania; Fumagalli, Mara; Stella, Maurizio; Sapino, Anna; Castagnoli, Carlotta
2016-01-01
Human Acellular Dermal Matrices (HADM) are employed in various reconstructive surgery procedures as scaffolds for autologous tissue regeneration. The aim of this project was to develop a new type of HADM for clinical use, composed of glycerolized reticular dermis decellularized through incubation and tilting in Dulbecco's Modified Eagle's Medium (DMEM). This manufacturing method was compared with a decellularization procedure already described in the literature, based on the use of sodium hydroxide (NaOH), on samples from 28 donors. Cell viability was assessed using an MTT assay, and microbiological monitoring was performed on all samples after each processing step. Two surgeons evaluated the biomechanical characteristics of grafts of increasing thickness. The effects of the different decellularization protocols were assessed by means of histological examination and immunohistochemistry, and residual DNA after decellularization was quantified using a real-time TaqMan MGB probe. Finally, we compared the results of the DMEM-based decellularization protocol on reticular dermis-derived samples with the results of the same protocol applied to papillary dermis-derived grafts. Our experimental results indicated that the use of glycerolized reticular dermis after 5 weeks of treatment with DMEM results in an HADM with good handling and biocompatibility properties. PMID:26918526
Schulze, M; Henning, H; Rüdiger, K; Wallner, U; Waberski, D
2013-12-01
Freshly collected boar spermatozoa are sensitive to a fast reduction in temperature because of lipid phase transition and phase separation processes. Temperature management during semen processing may determine the quality of stored samples. The aim of this study was to evaluate the influence of isothermic and hypothermic semen processing protocols on boar sperm quality under laboratory and field conditions. In the laboratory study, ejaculates (n = 12) were first diluted (1:1) with Beltsville Thawing Solution (BTS) at 32 °C, then processed either with isothermic (32 °C) or hypothermic (21 °C) BTS, stored at 17 °C, and assessed on days 1, 3, and 6. Temperature curves showed that 150 minutes after the first dilution, semen doses of both groups reached the same temperature. Two-step hypothermic processing resulted in lower sperm motility on days 1 and 6 (P < 0.05). Concomitantly, hypothermally processed samples contained less membrane intact sperm on days 3 and 6 (P < 0.05). Using AndroStar Plus extender instead of BTS reduced the negative effect of hypothermic processing. In the field study, 15 semen samples from each of 23 European artificial insemination studs were evaluated as part of an external quality control program. Semen quality based on motility, membrane integrity, mitochondrial activity, and a thermoresistance test was higher for stations using one-step isothermic dilutions (n = 7) compared with artificial insemination centers using two-step hypothermic protocols (n = 16). Both studies show that chilling injury associated with hypothermic dilution results in lower quality of stored boar semen compared with isothermic dilution and that the type of semen extender affects the outcomes.
Laser capture microdissection of embryonic cells and preparation of RNA for microarray assays.
Redmond, Latasha C; Pang, Christopher J; Dumur, Catherine; Haar, Jack L; Lloyd, Joyce A
2014-01-01
In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice-isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure(®) LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM.
PMID:24318813
Chapter A5. Processing of Water Samples
Wilde, Franceska D.; Radtke, Dean B.; Gibs, Jacob; Iwatsubo, Rick T.
1999-01-01
The National Field Manual for the Collection of Water-Quality Data (National Field Manual) describes protocols and provides guidelines for U.S. Geological Survey (USGS) personnel who collect data used to assess the quality of the Nation's surface-water and ground-water resources. This chapter addresses methods to be used in processing water samples to be analyzed for inorganic and organic chemical substances, including the bottling of composite, pumped, and bailed samples and subsamples; sample filtration; solid-phase extraction for pesticide analyses; sample preservation; and sample handling and shipping. Each chapter of the National Field Manual is published separately and revised periodically. Newly published and revised chapters will be announced on the USGS Home Page on the World Wide Web under 'New Publications of the U.S. Geological Survey.' The URL for this page is http://water.usgs.gov/lookup/get?newpubs.
Tobe, Shanan S; Bailey, Stuart; Govan, James; Welch, Lindsey A
2013-03-01
Although poaching is a common wildlife crime, the high and prohibitive cost of specialised animal testing means that many cases are left uninvestigated. We previously described a novel approach to wildlife crime investigation that looked at the identification of human DNA on poached animal remains (Tobe, Govan and Welch, 2011). Human DNA was successfully isolated and amplified from simulated poaching incidents; however, a low template protocol was required, which made this method unsuitable for use in many laboratories. We now report on an optimised recovery and amplification protocol which removes the need for low template analysis. Samples from 10 deer (40 samples total - one from each leg) analysed in the original study were re-analysed in the current study with an additional 11 deer samples. Four samples analysed using Chelex did not show any results, and a new method was devised whereby the available DNA was concentrated. By combining the DNA extracts from all tapings of the same deer remains followed by concentration, the recovered quantity of human DNA was found to be 29.5 pg ± 43.2 pg, 31× greater than in the previous study. The Investigator Decaplex SE (QIAGEN) STR kit provided better results, in the form of more complete profiles, than did the AmpFℓSTR® SGM Plus® kit at 30 cycles (Applied Biosystems). Re-analysis of the samples from the initial study using the new, optimised protocol resulted in an average increase of 18% in recovered alleles. Of the 17 samples analysed using the optimised protocol, 71% showed sufficient amplification for comparison to a reference profile and gave match probabilities ranging from 7.7690×10⁻⁵ to 2.2706×10⁻¹⁴. The removal of low template analysis means this optimised method provides evidence of high probative value and is suitable for immediate use in forensic laboratories. All methods and techniques used are standard and are compatible with current SOPs.
As no high-cost non-human DNA analysis is required, the overall process is no more expensive than the investigation of other volume crime samples. The technique is suitable for immediate use in poaching incidents.
A multigear protocol for sampling crayfish assemblages in Gulf of Mexico coastal streams
William R. Budnick; William E. Kelso; Susan B. Adams; Michael D. Kaller
2018-01-01
Identifying an effective protocol for sampling crayfish in streams that vary in habitat and physical/chemical characteristics has proven problematic. We evaluated an active, combined-gear (backpack electrofishing and dipnetting) sampling protocol in 20 Coastal Plain streams in Louisiana. Using generalized linear models and rarefaction curves, we evaluated environmental...
2006-09-30
High-Pressure Waterjet • CO2 Pellet/Turbine Wheel • Ultrahigh-Pressure Waterjet • Process Water Reuse/Recycle • Cross-Flow Microfiltration … documented on a process or laboratory form. Corrective action will involve taking all necessary steps to restore a measuring system to proper working order. In all cases, a nonconformance will be rectified before sample processing and analysis continues. If corrective action does not restore the
Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.
2015-12-17
An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) to respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP.
Because the SCoRR strategy employs a multi-metric approach for sample analyses, this protocol expands upon and reconciles differences in the sample collection protocols outlined in the USGS “National Field Manual for the Collection of Water-Quality Data,” which should be used in conjunction with this SOP. A new data entry and sample tracking system also is presented to ensure all relevant data and metadata are gathered at the sample locations and in the laboratories.
Using semantics for representing experimental protocols.
Giraldo, Olga; García, Alexander; López, Federico; Corcho, Oscar
2017-11-13
An experimental protocol is a sequence of tasks and operations executed to perform experimental research in biological and biomedical areas, e.g. biology, genetics, immunology, neurosciences, virology. Protocols often include references to equipment, reagents, descriptions of critical steps, troubleshooting and tips, as well as any other information that researchers deem important for facilitating the reusability of the protocol. Although experimental protocols are central to reproducibility, the descriptions are often cursory. There is a need for a unified framework with respect to the syntactic structure and the semantics for representing experimental protocols. In this paper we present the SMART Protocols ontology, an ontology for representing experimental protocols. Our ontology represents the protocol as a workflow with domain-specific knowledge embedded within a document. We also present the Sample Instrument Reagent Objective (SIRO) model, which represents the minimal common information shared across experimental protocols. SIRO was conceived in the same realm as the Patient Intervention Comparison Outcome (PICO) model that supports search, retrieval and classification purposes in evidence-based medicine. We evaluate our approach against a set of competency questions modeled as SPARQL queries and processed against a set of published and unpublished protocols modeled with the SP Ontology and the SIRO model. Our approach makes it possible to answer queries such as "Which protocols use tumor tissue as a sample?" Improving reporting structures for experimental protocols requires collective efforts from authors, peer reviewers, editors and funding bodies. The SP Ontology is a contribution towards this goal. We build upon previous experiences and bring together the views of researchers managing protocols in their laboratory work. Website: https://smartprotocols.github.io/
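The SIRO model described above lends itself to a simple structured-query illustration. The following is a minimal, hypothetical Python sketch (the class and field names are invented, not the ontology's actual terms) of how a competency question such as "Which protocols use tumor tissue as a sample?" reduces to filtering on the Sample slot:

```python
from dataclasses import dataclass

# Hypothetical sketch of the SIRO (Sample, Instrument, Reagent, Objective)
# model; field names are illustrative, not the ontology's property names.
@dataclass
class SIRO:
    name: str
    sample: str
    instrument: str
    reagent: str
    objective: str

def protocols_using_sample(protocols, sample_term):
    """Answer a competency question like 'Which protocols use tumor
    tissue as a sample?' by filtering on the Sample slot."""
    return [p.name for p in protocols if sample_term in p.sample.lower()]

protocols = [
    SIRO("P1", "tumor tissue", "microtome", "formalin", "histology"),
    SIRO("P2", "whole blood", "centrifuge", "EDTA", "DNA extraction"),
]

print(protocols_using_sample(protocols, "tumor tissue"))  # ['P1']
```

In the actual framework this filtering is expressed as a SPARQL query over RDF triples rather than over Python objects, but the shape of the question is the same.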
Fonseca, Rochele Paz; Fachel, Jandyra Maria Guimarães; Chaves, Márcia Lorena Fagundes; Liedtke, Francéia Veiga; Parente, Maria Alice de Mattos Pimenta
2007-01-01
Right-brain-damaged individuals may present discursive, pragmatic, lexical-semantic and/or prosodic disorders. The aim was to verify the effect of right-hemisphere damage on communication processing, evaluated by the Brazilian version of the Protocole Montréal d'Évaluation de la Communication (Montreal Communication Evaluation Battery), the Bateria Montreal de Avaliação da Comunicação (Bateria MAC), in Portuguese. A clinical group of 29 right-brain-damaged participants and a control group of 58 non-brain-damaged adults formed the sample. A questionnaire on sociocultural and health aspects, together with the Brazilian MAC Battery, was administered. Significant differences between the clinical and control groups were observed in the following MAC Battery tasks: conversational discourse; unconstrained, semantic and orthographic verbal fluency; linguistic prosody repetition; and emotional prosody comprehension, repetition and production. Moreover, the clinical group was less homogeneous than the control group. A right-brain-damage effect was identified directly on three communication processes (discursive, lexical-semantic and prosodic) and indirectly on the pragmatic process.
Barreto, Goncalo; Soininen, Antti; Sillat, Tarvo; Konttinen, Yrjö T; Kaivosoja, Emilia
2014-01-01
Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is increasingly being used in analysis of biological samples. For example, it has been applied to distinguish healthy and osteoarthritic human cartilage. This chapter discusses ToF-SIMS principle and instrumentation including the three modes of analysis in ToF-SIMS. ToF-SIMS sets certain requirements for the samples to be analyzed; for example, the samples have to be vacuum compatible. Accordingly, sample processing steps for different biological samples, i.e., proteins, cells, frozen and paraffin-embedded tissues and extracellular matrix for the ToF-SIMS are presented. Multivariate analysis of the ToF-SIMS data and the necessary data preprocessing steps (peak selection, data normalization, mean-centering, and scaling and transformation) are discussed in this chapter.
Van Ginderdeuren, Rita; Van Calster, Joachim; Stalmans, Peter; Van den Oord, Joost
2014-08-01
In this prospective study, a universal protocol for sampling and analysing vitreous material was investigated. Vitreous biopsies are difficult to handle because of the paucity of cells and the gelatinous structure of the vitreous. Histopathological analysis of the vitreous is useful in difficult uveitis cases to differentiate uveitis from lymphoma or infection and to define the type of cellular reaction. One hundred consecutive vitreous samples were analysed with the Cellient tissue processor (Hologic), a fully automated processor that takes cells from a specified container with PreservCyt (fixative fluid) through to paraffin. Cytology was compared between the fixatives Cytolyt (which contains a mucolytic agent) and PreservCyt. Routine histochemical and immunostainings were evaluated. In 92% of the cases, sufficient material was found for diagnosis. In 14%, a Cytolyt wash was necessary to prevent clotting of the tubes in the Cellient due to the viscosity of the sample. In 23% of cases the diagnosis was acute inflammation (presence of granulocytes); in 33%, chronic active inflammation (presence of T lymphocytes); in 33%, low-grade inflammation (presence of CD68 cells without T lymphocytes); and in 3%, a malignant process. A standardized protocol for sampling and handling vitreous biopsies, fixing in PreservCyt and processing by the Cellient gives a satisfactory result in morphology, number of cells and possibility of immunohistochemical staining. The diagnosis can be established or confirmed in more than 90% of cases. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
Single-cell transcriptome conservation in cryopreserved cells and tissues.
Guillaumet-Adkins, Amy; Rodríguez-Esteban, Gustavo; Mereu, Elisabetta; Mendez-Lago, Maria; Jaitin, Diego A; Villanueva, Alberto; Vidal, August; Martinez-Marti, Alex; Felip, Enriqueta; Vivancos, Ana; Keren-Shaul, Hadas; Heath, Simon; Gut, Marta; Amit, Ido; Gut, Ivo; Heyn, Holger
2017-03-01
A variety of single-cell RNA preparation procedures have been described. So far, protocols require fresh material, which hinders complex study designs. We describe a sample preservation method that maintains transcripts in viable single cells, allowing one to disconnect time and place of sampling from subsequent processing steps. We sequence single-cell transcriptomes from >1000 fresh and cryopreserved cells using 3'-end and full-length RNA preparation methods. Our results confirm that the conservation process did not alter transcriptional profiles. This substantially broadens the scope of applications in single-cell transcriptomics and could lead to a paradigm shift in future study designs.
NEW APPROACHES TO ESTIMATION OF SOLID-WASTE QUANTITY AND COMPOSITION
Efficient and statistically sound sampling protocols for estimating the quantity and composition of solid waste over a stated period of time in a given location, such as a landfill site or at a specific point in an industrial or commercial process, are essential to the design ...
PROTOCOL - A COMPUTERIZED SOLID WASTE QUANTITY AND COMPOSITION ESTIMATION SYSTEM: OPERATIONAL MANUAL
The assumptions of traditional sampling theory often do not fit the circumstances when estimating the quantity and composition of solid waste arriving at a given location, such as a landfill site, or at a specific point in an industrial or commercial process. The investigator oft...
Protocol for Initial Purification of Bacteriocin
2015-10-01
lysate/extract preparation, column purification, and a desalting step. The peptide was tracked throughout the process using a soft agar overlay activity...tris PAGE. It is necessary to desalt the 150-mM and 1-M fractions, using dialysis or G-10 Sephadex columns, in order to prevent
Yilmaz, Vedat; Ince-Yilmaz, Ebru; Yilmazel, Yasemin Dilsad; Duran, Metin
2014-06-01
In this study, biomass samples were obtained from six municipal and nine industrial full-scale anaerobic processes to investigate whether the aceticlastic methanogen population composition is related to acetate utilization capacity and the nature of the wastewater treated, i.e. municipal sludge or industrial wastewater. Batch serum bottle tests were used to determine the specific acetate utilization rate (AUR), and a quantitative real-time polymerase chain reaction protocol was used to enumerate the acetate-utilizing Methanosaeta and Methanosarcina populations in the biomass samples. Methanosaeta was the dominant aceticlastic methanogen in all samples, except for one industrial wastewater-treating anaerobic process. However, Methanosarcina density in industrial biomass samples was higher than in the municipal samples. The average AUR values of municipal and industrial wastewater treatment plant biomass samples were 10.49 and 10.65 mg CH₃COO⁻ per log(aceticlastic methanogen gene copy) per day, respectively. A one-way ANOVA test and principal component analysis showed that the acetate utilization capacities and aceticlastic methanogen community composition did not show a statistically significant correlation among the municipal digesters and industrial wastewater-treating processes investigated.
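The AUR units reported above (mg CH3COO⁻ per log gene copy per day) imply a simple normalization of acetate consumption by the log10 of the enumerated aceticlastic population. A sketch of that arithmetic, with made-up example inputs rather than the study's data:

```python
import math

# Illustrative normalization: acetate utilization rate expressed per
# log10 of aceticlastic methanogen gene copies per day. The inputs are
# invented example values, not measurements from the study.
def normalized_aur(acetate_consumed_mg, days, gene_copies):
    """mg CH3COO- consumed per log10(gene copies) per day."""
    return acetate_consumed_mg / (math.log10(gene_copies) * days)

rate = normalized_aur(acetate_consumed_mg=420.0, days=5, gene_copies=1e8)
print(round(rate, 2))  # 10.5
```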
Framework for managing mycotoxin risks in the food industry.
Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie
2014-12-01
We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
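Acceptance sampling of the kind mentioned above is commonly analyzed through its operating characteristic: for an attributes plan that tests n samples from a lot and accepts when at most c exceed the mycotoxin limit, the acceptance probability at true over-limit fraction p follows the binomial distribution. The plan below (n = 10, c = 1) is an invented example, not a plan from the framework:

```python
from math import comb

# Operating characteristic of an attributes acceptance-sampling plan:
# P(accept) = sum_{k=0..c} C(n,k) p^k (1-p)^(n-k).
# n and c here are made-up illustrative values.
def prob_accept(n, c, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

for p in (0.05, 0.20):
    print(f"p={p}: P(accept)={prob_accept(10, 1, p):.3f}")
```

A plan is then chosen so that lots with low contamination rates are accepted with high probability while heavily contaminated lots are rejected with high probability.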
A distance limited method for sampling downed coarse woody debris
Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams
2012-01-01
A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...
Blanch, E; Tomás, C; Casares, L; Gómez, E A; Sansano, S; Giménez, I; Mocé, E
2014-06-01
Glycerol (11%, v/v) is the cryoprotectant most often used for the cryopreservation of rooster sperm. However, chicken breeds differ in the resistance of their sperm to the cryopreservation process, and endangered or local breeds usually present low fertilizing ability when conventional sperm cryopreservation protocols are used. The objective of this study was to optimize the protocol for the cryopreservation of sperm from the endangered breed "Gallina Valenciana de Chulilla". For this purpose, 10 pools of semen from 43 roosters of this breed were cryopreserved using 8%, 7%, 6%, or 4% glycerol, and the sperm quality was determined immediately after thawing and in the insemination doses. Lohmann Brown Classic laying hens (n = 40) were used for the insemination trials. The sperm quality after cryopreservation progressively decreased as the glycerol concentration was reduced (P < 0.01); samples frozen using 4% glycerol exhibited the lowest quality (38% total motile sperm and 49% live sperm), and samples frozen using 8% glycerol exhibited the highest quality (67% total motile sperm and 66% live sperm). These differences were also observed after the glycerol was removed (P < 0.01). However, the sperm fertilizing ability was similar for all the treatments (23%-30% fertilized eggs) and increased as the glycerol concentration decreased. In conclusion, semen from roosters frozen using 4% glycerol exhibited lower sperm quality but similar fertilizing ability compared with samples processed using higher glycerol concentrations. These results may provide useful information for developing cryopreservation protocols for other breeds. Copyright © 2014 Elsevier Inc. All rights reserved.
High-speed shaking of frozen blood clots for extraction of human and malaria parasite DNA
2011-01-01
Background: Frozen blood clots remaining after serum collection are an often disregarded source of host and pathogen DNA due to troublesome handling and suboptimal outcome. Methods: High-speed shaking of clot samples in a cell disruptor manufactured for homogenization of tissue and faecal specimens was evaluated for processing frozen blood clots for DNA extraction. The method was compared to two commercial clot protocols based on a chemical kit and centrifugation through a plastic sieve, followed by the same DNA extraction protocol. Blood clots with different levels of parasitaemia (1-1,000 p/μl) were prepared from parasite cultures to assess the sensitivity of PCR detection. In addition, clots retrieved from serum samples collected within two epidemiological studies in Kenya (n = 630) were processed by high-speed shaking and analysed by PCR for detection of malaria parasites and the human α-thalassaemia gene. Results: High-speed shaking succeeded in fully dispersing the clots, and the method generated the highest DNA yield. The level of PCR detection of P. falciparum parasites and the human thalassaemia gene was the same as for samples optimally collected with an anticoagulant. The commercial clot protocol and centrifugation through a sieve failed to fully dissolve the clots and resulted in lower sensitivity of PCR detection. Conclusions: High-speed shaking was a simple and efficacious method for homogenizing frozen blood clots before DNA purification and resulted in PCR templates of high quality from both humans and malaria parasites. This novel method enables genetic studies from stored blood clots. PMID:21824391
Tack, Lois C; Thomas, Michelle; Reich, Karl
2007-03-01
Forensic labs globally face the same problem-a growing need to process a greater number and wider variety of samples for DNA analysis. The same forensic lab can be tasked all at once with processing mixed casework samples from crime scenes, convicted offender samples for database entry, and tissue from tsunami victims for identification. Besides flexibility in the robotic system chosen for forensic automation, there is a need, for each sample type, to develop new methodology that is not only faster but also more reliable than past procedures. FTA is a chemical treatment of paper, unique to Whatman Bioscience, and is used for the stabilization and storage of biological samples. Here, the authors describe optimization of the Whatman FTA Purification Kit protocol for use with the AmpFlSTR Identifiler PCR Amplification Kit.
Ilyin, S E; Plata-Salamán, C R
2000-02-15
Homogenization of tissue samples is a common first step in the majority of current protocols for RNA, DNA, and protein isolation. This report describes a simple device for centrifugation-mediated homogenization of tissue samples. The method presented is applicable to RNA, DNA, and protein isolation, and we show examples where high quality total cell RNA, DNA, and protein were obtained from brain and other tissue samples. The advantages of the approach presented include: (1) a significant reduction in time investment relative to hand-driven or individual motorized-driven pestle homogenization; (2) easy construction of the device from inexpensive parts available in any laboratory; (3) high replicability in the processing; and (4) the capacity for the parallel processing of multiple tissue samples, thus allowing higher efficiency, reliability, and standardization.
Rapid microfluidic analysis of a Y-STR multiplex for screening of forensic samples.
Gibson-Daw, Georgiana; Albani, Patricia; Gassmann, Marcus; McCord, Bruce
2017-02-01
In this paper, we demonstrate a rapid analysis procedure for use with a small set of rapidly mutating Y-chromosomal short tandem repeat (Y-STR) loci that combines both rapid polymerase chain reaction (PCR) and microfluidic separation elements. The procedure involves a high-speed polymerase and a rapid cycling protocol to permit PCR amplification in 16 min. The resultant amplified sample is then analysed using a short 1.8-cm microfluidic electrophoresis system that permits a four-locus Y-STR genotype to be produced in 80 s. The entire procedure takes less than 25 min from sample collection to result. This paper describes the rapid amplification protocol as well as studies of the reproducibility and sensitivity of the procedure and its optimisation. The amplification process utilises a small high-speed thermocycler, a microfluidic device and a compact laptop, making it portable and potentially useful for rapid, inexpensive on-site genotyping. The four loci used for the multiplex were selected for their rapid mutation rates and should prove useful in preliminary screening of samples and suspects. Overall, this technique provides a method for rapid sample screening of suspect and crime scene samples in forensic casework.
Eminaga, O; Semjonow, A; Oezguer, E; Herden, J; Akbarov, I; Tok, A; Engelmann, U; Wille, S
2014-01-01
The integrity of collection protocols in biobanking is essential for a high-quality sample preparation process. However, there is not currently a well-defined universal method for integrating collection protocols in the biobanking information system (BIMS). Therefore, an electronic schema of the collection protocol that is based on Extensible Markup Language (XML) is required to maintain the integrity and enable the exchange of collection protocols. The development and implementation of an electronic specimen collection protocol schema (eSCPS) was performed at two institutions (Muenster and Cologne) in three stages. First, we analyzed the infrastructure that was already established at both the biorepository and the hospital information systems of these institutions and determined the requirements for the sufficient preparation of specimens and documentation. Second, we designed an eSCPS according to these requirements. Finally, a prospective study was conducted to implement and evaluate the novel schema in the current BIMS. We designed an eSCPS that provides all of the relevant information about collection protocols. Ten electronic collection protocols were generated using the supplementary Protocol Editor tool, and these protocols were successfully implemented in the existing BIMS. Moreover, an electronic list of collection protocols for the current studies being performed at each institution was included, new collection protocols were added, and the existing protocols were redesigned to be modifiable. The documentation time was significantly reduced after implementing the eSCPS (5 ± 2 min vs. 7 ± 3 min; p = 0.0002). The eSCPS improves the integrity and facilitates the exchange of specimen collection protocols in the existing open-source BIMS.
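The abstract above describes an XML-based schema for specimen collection protocols. As a minimal sketch of what building and serializing such a protocol document might look like, the following uses Python's standard `xml.etree.ElementTree`; the element and attribute names are invented for illustration and are not the eSCPS's actual vocabulary:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML representation of a specimen collection protocol.
# Element/attribute names below are illustrative, not the eSCPS schema.
proto = ET.Element("collectionProtocol", id="CP-001")
ET.SubElement(proto, "specimenType").text = "serum"
ET.SubElement(proto, "container").text = "cryotube"
step = ET.SubElement(proto, "step", order="1")
step.text = "Centrifuge at 2000 g for 10 min"

# Serialize to a string that could be exchanged between biobanking systems.
xml_text = ET.tostring(proto, encoding="unicode")
print(xml_text)
```

An exchange format of this shape lets each institution validate incoming protocols against a shared schema before loading them into its biobanking information system.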
Dynamics of Nafion membrane swelling in H2O/D2O mixtures as studied using FTIR technique
NASA Astrophysics Data System (ADS)
Bunkin, Nikolai F.; Kozlov, Valeriy A.; Shkirin, Alexey V.; Ninham, Barry W.; Balashov, Anatoliy A.; Gudkov, Sergey V.
2018-03-01
Experiments with Fourier transform spectrometry of Nafion, a water-swollen polymeric membrane, are described. The transmittance spectra of liquid samples, and of Nafion soaked in these samples, were studied as a function of the deuterium content in water in the spectral range 1.8-2.15 μm. The experiments were carried out using two protocols: in the first protocol we studied the dynamics of Nafion swelling in H2O + D2O mixtures for deuterium concentrations 3 < C < 10⁴ ppm, and in the second protocol we studied the dynamics of swelling in pure heavy water (C = 10⁶ ppm). For liquid mixtures in the concentration range 3 < C < 10⁴ ppm, the transmittance spectra are the same, but for Nafion soaked in these fluids, the corresponding spectra are different. It is shown that, in the range of deuterium contents C = 90-500 ppm, the behavior of the transmittance of the polymer membrane is non-monotonic. In experiments using the second protocol, the dynamics of diffusion replacement of residual water, which is always present in the bulk of the polymer membrane inside closed cavities (i.e., without access to atmospheric air), were studied. The experimentally estimated diffusion coefficient for this process is ≈6×10⁻¹¹ cm²/s.
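As a plausibility check on a diffusion coefficient of this magnitude, the characteristic time for diffusive replacement over a length L scales as t ≈ L²/D. The 100 μm length below is an assumed illustrative scale, not a dimension taken from the study:

```python
# Order-of-magnitude timescale for diffusion: t ~ L^2 / D.
D = 6e-11      # cm^2/s, diffusion coefficient from the abstract
L = 100e-4     # cm (100 um, an assumed illustrative length scale)

t_seconds = L**2 / D
print(f"characteristic time: {t_seconds / 86400:.1f} days")
```

A coefficient of ~6×10⁻¹¹ cm²/s thus corresponds to replacement times of weeks over sub-millimetre distances, consistent with describing the process as slow diffusion through closed cavities.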
Ashizawa, Kazuho; Murata, Syota; Terada, Takashi; Ito, Daisuke; Bunya, Masaru; Watanabe, Koji; Teruuchi, Yoko; Tsuchida, Sachio; Satoh, Mamoru; Nishimura, Motoi; Matsushita, Kazuyuki; Sugama, Yuji; Nomura, Fumio
2017-08-01
Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) can be used to identify pathogens in blood culture samples. However, sample pretreatment is needed for direct identification of microbes in blood culture bottles, and conventional protocols are complex and time-consuming. Therefore, in this study, we developed a method for collecting bacteria using a polyallylamine-polystyrene copolymer originally applied in wastewater treatment technology. Using the representative bacterial species Escherichia coli and Staphylococcus capitis, we found that polyallylamine-polystyrene can form visible aggregates with bacteria, which can then be identified using MALDI-TOF MS. The processing time of our protocol was as short as 15 min. Hemoglobin interference in MALDI spectra analysis was significantly decreased with our method compared with the conventional method. In a preliminary experiment, we evaluated the use of our protocol to identify clinical isolates from blood culture bottles. MALDI-TOF MS-based identification of 17 strains from five bacterial species (E. coli, Klebsiella pneumoniae, Enterococcus faecalis, S. aureus, and S. capitis) collected by our protocol was satisfactory. Prospective large-scale studies are needed to further evaluate the clinical application of this novel and simple method of collecting bacteria from blood culture bottles. Copyright © 2017 Elsevier B.V. All rights reserved.
Rahman, Shafiq; Griffin, Michelle; Naik, Anish; Szarko, Matthew; Butler, Peter E M
2018-02-15
Decellularized scaffolds can induce chondrogenic differentiation of stem cells. This study compares different methods to optimise the decellularization of auricular cartilage. The process began with an initial 12-hour dry freeze-thaw, in which cartilage specimens were frozen in an empty tube at -20 °C and allowed to thaw at room temperature, followed by submersion in phosphate buffer solution, freezing at -20 °C for a further 12-hour period, and thawing at room temperature as before. Protocol A subsequently subjected specimens to both deoxyribonuclease and sodium deoxycholate. Protocols B and C were adaptations of this, adding 0.25% trypsin (7 cycles) and a 0.5 M solution of ethylenediaminetetraacetic acid (3 hours per cycle), respectively, as additional steps. Trypsin accelerated the decellularization process, with a reduction in DNA content from 55.4 ng/μL (native) to 17.3 ng/μL (P < 0.05) after 14 days. Protocol B showed a faster reduction in DNA content than protocol A. Compared with protocol C after 14 days, trypsin also showed greater decellularization, with a mean difference of 11.7 ng/μL (P < 0.05). Histological analysis with H&E and DAPI confirmed depletion of cells at 14 days with trypsin.
Documentation of operational protocol for the use of MAMA software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Daniel S.
2016-01-21
Image analysis of Scanning Electron Microscope (SEM) micrographs is a complex process that can vary significantly between analysts. The factors causing the variation are numerous, and the purpose of Task 2b is to develop and test a set of protocols designed to minimize variation in image analysis between different analysts and laboratories, specifically using the MAMA software package, Version 2.1. The protocols were designed to be "minimally invasive", so that expert SEM operators will not be overly constrained in the way they analyze particle samples. The protocols will be tested using a round-robin approach in which results from expert SEM users at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Pacific Northwest National Laboratory, Savannah River National Laboratory, and the National Institute of Standards and Technology will be compared. The variation of the results will be used to quantify uncertainty in the particle image analysis process. The round-robin exercise will proceed with 3 levels of rigor, each with its own set of protocols, as described below in Tasks 2b.1, 2b.2, and 2b.3. The uncertainty will be developed using NIST standard reference material SRM 1984 "Thermal Spray Powder – Particle Size Distribution, Tungsten Carbide/Cobalt (Acicular)" [Reference 1]. Full details are available in the Certificate of Analysis, posted on the NIST website (http://www.nist.gov/srm/).
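One common way to summarize the inter-laboratory variation such a round-robin produces is the relative standard deviation of a measured particle statistic across laboratories. This sketch uses made-up example measurements, not round-robin data:

```python
import statistics

# Hypothetical per-laboratory measurements of the same particle statistic
# (e.g., a mean particle diameter in micrometres). Values are invented.
measured_diameter_um = {
    "LANL": 1.02, "LLNL": 0.97, "PNNL": 1.05, "SRNL": 0.99, "NIST": 1.01,
}

values = list(measured_diameter_um.values())
mean = statistics.mean(values)
# Sample standard deviation across labs, expressed relative to the mean.
rsd_percent = 100 * statistics.stdev(values) / mean

print(f"mean = {mean:.3f} um, inter-lab RSD = {rsd_percent:.1f}%")
```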
Verma, Digvijay; Satyanarayana, T
2011-09-01
An improved single-step protocol has been developed for extracting pure, humic substance-free community DNA from alkaline soils and sediments. The method is based on direct cell lysis in the presence of powdered activated charcoal and polyvinylpolypyrrolidone, followed by precipitation with polyethylene glycol and isopropanol. The strategy allows simultaneous isolation and purification of DNA while minimizing the loss of DNA with respect to other available protocols for metagenomic DNA extraction. Moreover, the purity levels are significant, which is difficult to attain with any of the methods reported in the literature for DNA extraction from soils. The DNA thus extracted was free from humic substances and, therefore, could be processed for restriction digestion and PCR amplification as well as for the construction of metagenomic libraries.
Bagchi, Sourav Kumar; Rao, Pavuluri Srinivasa; Mallick, Nirupama
2015-03-01
Drying of wet algal biomass is a major bottleneck in viable commercial production of microalgal biodiesel. In the present investigation, an oven drying protocol was standardized for drying of wet Scenedesmus biomass at 60, 80 and 100 °C with initial sample thicknesses of 5.0, 7.5 and 10.0 mm. The optimum drying temperature was found to be 80 °C, with a maximum lipid yield of 425.0 ± 5.9 mg g⁻¹ at 15 h drying time for 5.0 mm thick samples with 0.033 kWh power consumption. Partial drying at 80 °C down to 10% residual moisture content was efficient, showing 93% lipid recovery with 8 h drying and a power consumption of 0.017 kWh. Scenedesmus biomass was also found to be rich in saturated and mono-unsaturated fatty acids. Thus, the drying protocol demonstrates its suitability for improving the downstream processing of biodiesel production by significantly lowering the power consumption and the drying time. Copyright © 2014 Elsevier Ltd. All rights reserved.
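The reported figures allow a back-of-envelope comparison of the two drying options in terms of lipid recovered per unit of drying energy; the yields and energies are taken from the abstract, while the ratio itself is our illustration:

```python
# Lipid recovered per unit of drying energy, per gram of biomass.
# Full drying: 425.0 mg/g lipid at 0.033 kWh; partial drying: 93% of
# that yield at 0.017 kWh (figures from the abstract).
full = {"lipid_mg_per_g": 425.0, "energy_kwh": 0.033}
partial = {"lipid_mg_per_g": 425.0 * 0.93, "energy_kwh": 0.017}

for name, d in (("full drying", full), ("partial drying", partial)):
    ratio = d["lipid_mg_per_g"] / d["energy_kwh"]
    print(f"{name}: {ratio:.0f} mg lipid per kWh")
```

The partial-drying option recovers slightly less lipid per gram but roughly twice as much lipid per unit of energy, which is the trade-off the authors highlight.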
Shahsavari, Esmaeil; Aburto-Medina, Arturo; Taha, Mohamed; Ball, Andrew S
2016-01-01
Polycyclic aromatic hydrocarbons (PAHs) are major pollutants globally, and due to their carcinogenic and mutagenic properties their clean-up is paramount. Bioremediation, the use of PAH-degrading microorganisms (mainly bacteria) to degrade the pollutants, represents a cheap, effective method. These PAH degraders harbor functional genes which help microorganisms use PAHs as a source of food and energy. Most probable number (MPN) and plate counting methods are widely used for counting PAH degraders; however, as culture-based methods only count a small fraction (<1%) of microorganisms capable of carrying out PAH degradation, the use of culture-independent methodologies is desirable. This protocol presents a robust, rapid and sensitive qPCR method for the quantification of the functional genes involved in the degradation of PAHs in soil samples. It enables screening of a vast number of PAH-contaminated soil samples in a few hours, provides valuable information about the natural attenuation potential of contaminated soil, and can be used to monitor the bioremediation process.
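qPCR quantification of the kind described above rests on a standard curve relating the threshold cycle (Ct) to gene copy number, Ct = slope · log10(copies) + intercept. A sketch with hypothetical calibration values (a slope of -3.32 corresponds to roughly 100% amplification efficiency; the intercept is invented):

```python
# Convert a measured Ct to functional-gene copies via a standard curve.
# Slope and intercept are hypothetical calibration values, not parameters
# from this protocol.
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    return 10 ** ((ct - intercept) / slope)

ct = 24.72
print(f"Ct {ct} -> {copies_from_ct(ct):.2e} gene copies")
```

In practice the slope and intercept come from a dilution series of a plasmid or amplicon standard run alongside the soil-sample extracts.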
Singh, Satwinder Kaur; Meyering, Maaike; Ramwadhdoebe, Tamara H; Stynenbosch, Linda F M; Redeker, Anke; Kuppen, Peter J K; Melief, Cornelis J M; Welters, Marij J P; van der Burg, Sjoerd H
2012-11-01
The ability to measure antigen-specific T cells at the single-cell level by intracellular cytokine staining (ICS) is a promising immunomonitoring tool and is extensively applied in the evaluation of cancer immunotherapy. The protocols used to detect antigen-specific CD8+ T-cell responses generally work for samples that have undergone at least one round of in vitro pre-stimulation. A common protocol applied with long peptides as antigens, however, was not suitable for simultaneously detecting antigen-specific CD8+ and CD4+ T cells directly ex vivo in cryopreserved samples: CD8+ T-cell reactivity to monocytes pulsed with long peptides ranged between 5 and 25% of that observed against monocytes pulsed with a directly HLA class I-fitting minimal CTL peptide epitope. We therefore adapted our ICS protocol and show that using a tenfold higher concentration of long peptides to load APCs, together with IFN-α and poly(I:C) to promote antigen processing and improve T-cell stimulation, does allow the ex vivo detection of low-frequency antigen-specific CD8+ and CD4+ T cells in an HLA-independent setting. While most of the improvements related to raising measurable CD8+ T-cell reactivity following stimulation with long peptides to at least 50% of the response detected with a minimal peptide epitope, the final analysis of blood samples from vaccinated patients successfully showed that the adapted ICS protocol also increases the ability to detect low-frequency p53-specific CD4+ T-cell responses ex vivo in cryopreserved PBMC samples.
Sauter, Jennifer L; Grogg, Karen L; Vrana, Julie A; Law, Mark E; Halvorson, Jennifer L; Henry, Michael R
2016-02-01
The objective of the current study was to establish a process for validating immunohistochemistry (IHC) protocols for use on the Cellient cell block (CCB) system. Thirty antibodies were initially tested on CCBs using IHC protocols previously validated on formalin-fixed, paraffin-embedded tissue (FFPE). Cytology samples were split to generate thrombin cell blocks (TCB) and CCBs. IHC was performed in parallel. Antibody immunoreactivity was scored, and concordance or discordance in immunoreactivity between the TCBs and CCBs for each sample was determined. Criteria for validation of an antibody were defined as concordant staining in expected positive and negative cells, in at least 5 samples each, and concordance in at least 90% of the samples total. Antibodies that failed initial validation were retested after alterations in IHC conditions. Thirteen of the 30 antibodies (43%) did not meet initial validation criteria. Of those, 8 antibodies (calretinin, clusters of differentiation [CD] 3, CD20, CDX2, cytokeratin 20, estrogen receptor, MOC-31, and p16) were optimized for CCBs and subsequently validated. Despite several alterations in conditions, 3 antibodies (Ber-EP4, D2-40, and paired box gene 8 [PAX8]) were not successfully validated. Nearly one-half of the antibodies tested in the current study failed initial validation using IHC conditions that were established in the study laboratory for FFPE material. Although some antibodies subsequently met validation criteria after optimization of conditions, a few continued to demonstrate inadequate immunoreactivity. These findings emphasize the importance of validating IHC protocols for methanol-fixed tissue before clinical use and suggest that optimization for alcohol fixation may be needed to obtain adequate immunoreactivity on CCBs. © 2016 American Cancer Society.
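The validation rule applied in the study above (concordant staining in at least 5 expected-positive and 5 expected-negative samples, with at least 90 % concordance overall) is easy to express as a check. The tuple layout of the split-sample results is an assumption made for illustration:

```python
def validate_antibody(results):
    """results: list of (expected_positive: bool, concordant: bool), one per
    split sample pair (thrombin vs. Cellient cell block).

    Validation criteria as stated in the study: concordance in >= 5
    expected-positive samples, >= 5 expected-negative samples, and
    concordance in >= 90 % of samples overall.
    """
    pos_concordant = sum(1 for exp_pos, conc in results if exp_pos and conc)
    neg_concordant = sum(1 for exp_pos, conc in results if not exp_pos and conc)
    overall = sum(1 for _, conc in results if conc) / len(results)
    return pos_concordant >= 5 and neg_concordant >= 5 and overall >= 0.90
```

For example, an antibody with 5 concordant positives and 5 concordant negatives out of 10 samples passes, while 11 concordant results out of 13 (84.6 %) fails the 90 % criterion.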
Zhou, Menglan; Yang, Qiwen; Kudinha, Timothy; Sun, Liying; Zhang, Rui; Liu, Chang; Yu, Shuying; Xiao, Meng; Kong, Fanrong; Zhao, Yupei; Xu, Ying-Chun
2017-01-01
Background: Bloodstream infection is a major cause of morbidity and mortality in hospitalized patients worldwide. Delays in the identification of microorganisms often lead to a poor prognosis. The application of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) directly to blood culture (BC) broth can potentially identify bloodstream infections earlier, and facilitate timely management. Methods: We developed an "in-house" (IH) protocol for direct MALDI-TOF MS based identification of organisms in positive BCs. The IH protocol was initially evaluated and improved with spiked BC samples, and its performance was compared with the commercial Sepsityper™ kit using both traditional and modified cut-off values. We then studied in parallel the performance of the IH protocol and the colony MS identifications in positive clinical BC samples using only modified cut-off values. All discrepancies were investigated by the "gold standard" of gene sequencing. Results: In 54 spiked BC samples, the IH method showed comparable results with Sepsityper™ after applying modified cut-off values. Specifically, accurate species and genus level identification was achieved in 88.7 and 3.9% of all the clinical monomicrobial BCs (284/301, 94.4%), respectively. The IH protocol exhibited better performance for Gram-negative bacteria than for Gram-positive bacteria (92.8 vs. 82.4%). For anaerobes and yeasts, accurate species identification was achieved in 80.0 and 90.0% of the cases, respectively. For polymicrobial cultures (17/301, 5.6%), MALDI-TOF MS correctly identified a single species present in all the polymicrobial BCs under the Standard mode, while using the MIXED method, two species were correctly identified in 52.9% of the samples. Comparisons based on BC bottle type showed that the BACTEC™ Lytic/10 Anaerobic/F culture vials performed the best.
Conclusion: Our study provides a novel and effective sample preparation method for MALDI-TOF MS direct identification of pathogens from positive BC vials, with a lower cost ($1.5 vs. $7) albeit a slightly more laborious extraction process (an extra 15 min) compared with the Sepsityper™ kit.
Simões, André E S; Pereira, Diane M; Amaral, Joana D; Nunes, Ana F; Gomes, Sofia E; Rodrigues, Pedro M; Lo, Adrian C; D'Hooge, Rudi; Steer, Clifford J; Thibodeau, Stephen N; Borralho, Pedro M; Rodrigues, Cecília M P
2013-03-15
Simultaneous isolation of nucleic acids and proteins from a single biological sample facilitates meaningful data interpretation and reduces time, cost and sampling errors. This is particularly relevant for rare human and animal specimens, often scarce and/or irreplaceable. TRIzol(®) and TRIzol(®)LS are suitable for simultaneous isolation of RNA, DNA and proteins from the same biological sample. These reagents are widely used for RNA and/or DNA isolation, while reports on their use for protein extraction are limited, attributable to technical difficulties in protein solubilisation. TRIzol(®)LS was used for RNA isolation from 284 human colon cancer samples, including normal colon mucosa, tubulovillous adenomas, and colon carcinomas with proficient and deficient mismatch repair system. TRIzol(®) was used for RNA isolation from human colon cancer cells, from brains of a transgenic Alzheimer's disease mouse model, and from cultured mouse cortical neurons. Following RNA extraction, the TRIzol(®)-chloroform fractions from human colon cancer samples and from mouse hippocampus and frontal cortex were stored for 2 years and 3 months, respectively, at -80°C until used for protein isolation. Simple modifications to the TRIzol(®) manufacturer's protocol, including Urea:SDS solubilization and sonication, allowed improved protein recovery yield compared to the TRIzol(®) manufacturer's protocol. Following SDS-PAGE and Ponceau and Coomassie staining, recovered proteins displayed a wide molecular weight range and a staining pattern comparable to those obtainable with commonly used protein extraction protocols. We also show that nuclear and cytosolic proteins can be easily extracted and detected by immunoblotting, and that posttranslational modifications, such as protein phosphorylation, are detectable in proteins recovered from TRIzol(®)-chloroform fractions stored for up to 2 years at -80°C.
We provide a novel approach to improve protein recovery from samples processed for nucleic acid extraction with TRIzol(®) and TRIzol(®)LS compared to the manufacturer's protocol, allowing downstream immunoblotting and evaluation of steady-state relative protein expression levels. The method was validated in large sets of samples from multiple sources, including human colon cancer and brains of a transgenic Alzheimer's disease mouse model, stored in TRIzol(®)-chloroform for up to two years. Collectively, we provide a faster and cheaper alternative to the TRIzol(®) manufacturer's protein extraction protocol, illustrating the high relevance and wide applicability of the present protein isolation method for the immunoblot evaluation of steady-state relative protein expression levels in samples from multiple sources, and following prolonged storage.
Correlation Between Iron and alpha and pi Glutathione-S-Transferase Levels in Humans
2012-09-01
assays were performed as described in the Biotrin High Sensitivity Alpha GST EIA kit protocol. First, serum samples were diluted 1:10 with wash solution...immunosorbent assays were performed as described in the Biotrin Pi GST EIA kit protocol. First, plasma samples were diluted 1:5 with sample diluent...immunosorbent assays were performed as described in the AssayMax Human Transferrin ELISA kit protocol. First, serum samples were diluted 1:2000 with MIX
A laboratory information management system for the analysis of tritium (3H) in environmental waters.
Belachew, Dagnachew Legesse; Terzer-Wassmuth, Stefan; Wassenaar, Leonard I; Klaus, Philipp M; Copia, Lorenzo; Araguás, Luis J Araguás; Aggarwal, Pradeep
2018-07-01
Accurate and precise measurements of low levels of tritium (³H) in environmental waters are difficult to attain due to the complex steps of sample preparation, electrolytic enrichment, liquid scintillation decay counting, and extensive data processing. We present a Microsoft Access™ relational database application, TRIMS (Tritium Information Management System), to assist with sample and data processing of tritium analysis by managing the processes from sample registration and analysis to reporting and archiving. A complete uncertainty propagation algorithm ensures tritium results are reported with robust uncertainty metrics. TRIMS will help to increase laboratory productivity and improve the accuracy and precision of ³H assays. The software supports several enrichment protocols and LSC counter types. TRIMS is available for download at no cost from the IAEA at www.iaea.org/water. Copyright © 2018 Elsevier Ltd. All rights reserved.
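The abstract credits TRIMS with a complete uncertainty-propagation algorithm. As an illustration of what such propagation involves (not the actual TRIMS code; the function name, parameter names and calibration factor are assumptions), Poisson counting error can be combined in quadrature with the efficiency and enrichment uncertainties:

```python
import math

def tritium_tu(gross_cpm, bkg_cpm, count_min, eff, eff_u, enrich, enrich_u, cal=1.0):
    """Tritium result with a propagated 1-sigma uncertainty (illustrative).

    Value = net count rate / (detection efficiency * enrichment factor),
    scaled by a calibration factor 'cal' to tritium units (TU). Poisson
    counting errors are combined in quadrature with the efficiency and
    enrichment uncertainties.
    """
    net = gross_cpm - bkg_cpm
    # Poisson statistics: variance of a count rate = rate / counting time
    net_u = math.sqrt(gross_cpm / count_min + bkg_cpm / count_min)
    value = net / (eff * enrich) * cal
    rel = math.sqrt((net_u / net) ** 2 + (eff_u / eff) ** 2 + (enrich_u / enrich) ** 2)
    return value, value * rel
```

Each relative uncertainty enters in quadrature, so the dominant error source (typically counting statistics at low tritium levels) controls the reported uncertainty.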
Petersen, James C.; Justus, B.G.; Dodd, H.R.; Bowles, D.E.; Morrison, L.W.; Williams, M.H.; Rowell, G.A.
2008-01-01
Buffalo National River located in north-central Arkansas, and Ozark National Scenic Riverways, located in southeastern Missouri, are the two largest units of the National Park Service in the Ozark Plateaus physiographic province. The purpose of this report is to provide a protocol that will be used by the National Park Service to sample fish communities and collect related water-quality, habitat, and stream discharge data of Buffalo National River and Ozark National Scenic Riverways to meet inventory and long-term monitoring objectives. The protocol includes (1) a protocol narrative, (2) several standard operating procedures, and (3) supplemental information helpful for implementation of the protocol. The protocol narrative provides background information about the protocol such as the rationale of why a particular resource or resource issue was selected for monitoring, information concerning the resource or resource issue of interest, a description of how monitoring results will inform management decisions, and a discussion of the linkages between this and other monitoring projects. The standard operating procedures cover preparation, training, reach selection, water-quality sampling, fish community sampling, physical habitat collection, measuring stream discharge, equipment maintenance and storage, data management and analysis, reporting, and protocol revision procedures. Much of the information in the standard operating procedures was gathered from existing protocols of the U.S. Geological Survey National Water Quality Assessment program or other sources. Supplemental information that would be helpful for implementing the protocol is included. This information includes information on fish species known or suspected to occur in the parks, sample sites, sample design, fish species traits, index of biotic integrity metrics, sampling equipment, and field forms.
Wüppenhorst, N; Consoir, C; Lörch, D; Schneider, C
2012-10-01
Several protocols for direct matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOF MS) from positive blood cultures are currently used to speed up the diagnostic process of bacteraemia. Identification rates are high and results are accurate for the BACTEC™ system and for charcoal-free bottles. Only a few studies have evaluated protocols for charcoal-containing BacT/ALERT bottles, reaching substantially lower identification rates. We established a new protocol for sample preparation from aerobic and anaerobic positive charcoal-containing BacT/ALERT blood culture bottles and measured the protein profiles (n = 167). We then integrated this protocol into the routine workflow of our laboratory (n = 212). During the establishment of our protocol, 74.3 % of bacteria were correctly identified to the species level; in 23.4 % no result and in 2.4 % a false identification were obtained. Reliable criteria for correct species identification were a score value ≥1.400 and a best match on ranks 1-3 of the same species. Identification rates during routine workflow were 77.8 % for correct identification, 20.8 % for not identified samples and 1.4 % for discordant identification. In conclusion, our results indicate that MALDI-TOF MS is possible even from charcoal-containing blood cultures. Reliable criteria for correct species identification are a score value ≥1.400 and a best match on ranks 1-3 of a single species.
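One plausible reading of the acceptance rule stated twice above (score ≥1.400 and a best match of the same species on ranks 1-3) is that the top hit must score at least 1.400 and reappear among the top three ranked matches. A sketch under that reading; the data layout is hypothetical:

```python
def accept_identification(matches):
    """matches: ranked list of (species, score) pairs, best match first.

    Acceptance rule (one interpretation of the study's criteria): the best
    match scores >= 1.400 and the same species also appears on rank 2 or 3.
    Returns the accepted species name, or None if the criteria are not met.
    """
    if not matches or matches[0][1] < 1.400:
        return None
    top_species = matches[0][0]
    if any(species == top_species for species, _ in matches[1:3]):
        return top_species
    return None
```

A hit list whose top three ranks disagree on species, or whose best score falls below 1.400, is reported as not identified rather than risking a false species call.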
Tissue Sampling Guides for Porcine Biomedical Models.
Albl, Barbara; Haesner, Serena; Braun-Reichhart, Christina; Streckel, Elisabeth; Renner, Simone; Seeliger, Frank; Wolf, Eckhard; Wanke, Rüdiger; Blutke, Andreas
2016-04-01
This article provides guidelines for organ and tissue sampling adapted to porcine animal models in translational medical research. Detailed protocols for the determination of sampling locations and numbers as well as recommendations on the orientation, size, and trimming direction of samples from ∼50 different porcine organs and tissues are provided in the Supplementary Material. The proposed sampling protocols include the generation of samples suitable for subsequent qualitative and quantitative analyses, including cryohistology, paraffin, and plastic histology; immunohistochemistry; in situ hybridization; electron microscopy; and quantitative stereology as well as molecular analyses of DNA, RNA, proteins, metabolites, and electrolytes. With regard to the planned extent of sampling efforts, time, and personnel expenses, and dependent upon the scheduled analyses, different protocols are provided. These protocols are adjusted for (I) routine screenings, as used in general toxicity studies or in analyses of gene expression patterns or histopathological organ alterations, (II) advanced analyses of single organs/tissues, and (III) large-scale sampling procedures to be applied in biobank projects. Providing a robust reference for studies of porcine models, the described protocols will ensure the efficiency of sampling, the systematic recovery of high-quality samples representing the entire organ or tissue as well as the intra-/interstudy comparability and reproducibility of results. © The Author(s) 2016.
2016-02-11
Sampling will be conducted as soon as possible after the process (gas/vapor or liquid). Samples will be incubated for 12 to 48 hours (depending on the...). Test Operations Procedure (TOP) 08-2-065, Developmental Testing of Liquid and Gaseous..., describes a biological decontamination protocol to analyze the efficacy of liquid and gaseous/vaporous decontaminants on military-relevant surfaces.
Validating Cognitive Models of Task Performance in Algebra on the SAT®. Research Report No. 2009-3
ERIC Educational Resources Information Center
Gierl, Mark J.; Leighton, Jacqueline P.; Wang, Changjiang; Zhou, Jiawen; Gokiert, Rebecca; Tan, Adele
2009-01-01
The purpose of the study is to present research focused on validating the four algebra cognitive models in Gierl, Wang, et al., using student response data collected with protocol analysis methods to evaluate the knowledge structures and processing skills used by a sample of SAT test takers.
The assumptions of traditional sampling theory often do not fit the circumstances when estimating the quantity and composition of solid waste arriving at a given location, such as a landfill site, or at a specific point in an industrial or commercial process. The investigator oft...
Treweek, Jennifer B; Chan, Ken Y; Flytzanis, Nicholas C; Yang, Bin; Deverman, Benjamin E; Greenbaum, Alon; Lignell, Antti; Xiao, Cheng; Cai, Long; Ladinsky, Mark S; Bjorkman, Pamela J; Fowlkes, Charless C; Gradinaru, Viviana
2015-11-01
To facilitate fine-scale phenotyping of whole specimens, we describe here a set of tissue fixation-embedding, detergent-clearing and staining protocols that can be used to transform excised organs and whole organisms into optically transparent samples within 1-2 weeks without compromising their cellular architecture or endogenous fluorescence. PACT (passive CLARITY technique) and PARS (perfusion-assisted agent release in situ) use tissue-hydrogel hybrids to stabilize tissue biomolecules during selective lipid extraction, resulting in enhanced clearing efficiency and sample integrity. Furthermore, the macromolecule permeability of PACT- and PARS-processed tissue hybrids supports the diffusion of immunolabels throughout intact tissue, whereas RIMS (refractive index matching solution) grants high-resolution imaging at depth by further reducing light scattering in cleared and uncleared samples alike. These methods are adaptable to difficult-to-image tissues, such as bone (PACT-deCAL), and to magnified single-cell visualization (ePACT). Together, these protocols and solutions enable phenotyping of subcellular components and tracing cellular connectivity in intact biological networks.
Matiasek, Kaspar; Pumarola I Batlle, Martí; Rosati, Marco; Fernández-Flores, Francisco; Fischer, Andrea; Wagner, Eva; Berendt, Mette; Bhatti, Sofie F M; De Risio, Luisa; Farquhar, Robyn G; Long, Sam; Muñana, Karen; Patterson, Edward E; Pakozdy, Akos; Penderis, Jacques; Platt, Simon; Podell, Michael; Potschka, Heidrun; Rusbridge, Clare; Stein, Veronika M; Tipold, Andrea; Volk, Holger A
2015-08-28
Traditionally, histological investigations of the epileptic brain are required to identify epileptogenic brain lesions, to evaluate the impact of seizure activity, to search for mechanisms of drug-resistance and to look for comorbidities. In many instances, however, neuropathological studies fail to add substantial data on patients with complete clinical work-up. This may be due to sparse training in epilepsy pathology and/or to a lack of neuropathological guidelines for companion animals. The protocols introduced herein shall facilitate systematic sampling and processing of epileptic brains and therefore increase the efficacy, reliability and reproducibility of morphological studies in animals suffering from seizures. Brain dissection protocols of two neuropathological centres with a research focus in epilepsy have been optimised with regard to their diagnostic yield and accuracy, their practicability and their feasibility concerning clinical research requirements. The recommended guidelines allow for easy, standardised and ubiquitous collection of brain regions relevant for seizure generation. Tissues harvested in the prescribed way will increase the diagnostic efficacy and provide reliable material for scientific investigations.
Lead Sampling Protocols: Why So Many and What Do They Tell You?
Sampling protocols can be broadly categorized based on their intended purpose of 1) Pb regulatory compliance/corrosion control efficacy, 2) Pb plumbing source determination or Pb type identification, and 3) Pb exposure assessment. Choosing the appropriate protocol is crucial to p...
NHEXAS PHASE I MARYLAND STUDY--LIST OF AVAILABLE DOCUMENTS: PROTOCOLS AND SOPS
This document lists available protocols and SOPs for the NHEXAS Phase I Maryland study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis and general laboratory procedures, (3) Data Analysis Proced...
Aïoun, Josiane; Chat, Sophie; Bordat, Christian; Péchoux, Christine
2013-01-01
Most studies using microwave irradiation (MWI) for the preparation of tissue samples have reported an improvement in structural integrity. However, there have been few studies on the effect of microwaves (MW) on antigen preservation during sample preparation prior to immunolocalization. This report documents our experience of specimen preparation using an automatic microwave apparatus to obtain antigen preservation and retrieval. We tested the effects of MW processing vs. conventional procedures on the morphology and antigenicity of two different tissues: the brain and mammary gland, whose chemical composition and anatomical organization are quite different. We chose to locate the transcription factor PPARβ/δ using immunocytochemistry on brain tissue sections from hamsters. Antigen retrieval protocols involving MWI were used to restore immunoreactivity. We also studied the efficiency of the ultrastructural immunolocalization of both PPARγ and caveolin-1 following MWI vs. conventional treatment, on mammary gland tissue from mice at 10 days of lactation. Our findings showed that the treatment of tissue samples with MWI, in the context of a process lasting just a few hours from fixation to immunolocalization, yielded results similar to, or even better than, those of conventional protocols. The quantification of immunolabeling for caveolin-1 indicated an increase in density of up to three-fold in tissues processed in the microwave oven. Furthermore, MW treatment permitted the localization of PPARβ/δ in glutaraldehyde-fixed specimens, which was impossible in the absence of MWI. This study thus showed that techniques involving the use of microwaves can largely improve both ultrastructure and immunodetection. Copyright © 2013 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, Erika J.; Huang, Chao; Hamilton, Julie
Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include a modular device layout and robust fixtures resulting in reliable and flexible world-to-chip connections, and fully automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization and performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.
Sanchez, Gaëtan; Lecaignard, Françoise; Otman, Anatole; Maby, Emmanuel; Mattout, Jérémie
2016-01-01
The relatively young field of Brain-Computer Interfaces has promoted the use of electrophysiology and neuroimaging in real-time. In the meantime, cognitive neuroscience studies, which make extensive use of functional exploration techniques, have evolved toward model-based experiments and fine hypothesis testing protocols. Although these two developments are mostly unrelated, we argue that, brought together, they may trigger an important shift in the way experimental paradigms are being designed, which should prove fruitful to both endeavors. This change simply consists in using real-time neuroimaging in order to optimize advanced neurocognitive hypothesis testing. We refer to this new approach as the instantiation of an Active SAmpling Protocol (ASAP). As opposed to classical (static) experimental protocols, ASAP implements online model comparison, enabling the optimization of design parameters (e.g., stimuli) during the course of data acquisition. This follows the well-known principle of sequential hypothesis testing. What is radically new, however, is our ability to perform online processing of the huge amount of complex data that brain imaging techniques provide. This is all the more relevant at a time when physiological and psychological processes are beginning to be approached using more realistic, generative models which may be difficult to tease apart empirically. Based upon Bayesian inference, ASAP proposes a generic and principled way to optimize experimental design adaptively. In this perspective paper, we summarize the main steps in ASAP. Using synthetic data we illustrate its superiority in selecting the right perceptual model compared to a classical design. Finally, we briefly discuss its future potential for basic and clinical neuroscience as well as some remaining challenges.
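ASAP's core, online Bayesian model comparison with sequential hypothesis testing, can be illustrated with a toy two-model example. The Gaussian observation models and the stopping threshold below are assumptions for illustration; the real method additionally optimizes the stimuli themselves during acquisition:

```python
import math

def loglik(x, mu, sigma=1.0):
    """Log-likelihood of one observation under a Gaussian model."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def sequential_test(data_stream, mu0=0.0, mu1=1.0, log_bf_stop=math.log(100)):
    """Sequential comparison of two fixed hypotheses (a toy stand-in for
    ASAP's online model comparison): accumulate the log Bayes factor after
    each sample and stop acquisition once the evidence threshold is crossed."""
    n = 0
    log_bf = 0.0
    for n, x in enumerate(data_stream, 1):
        log_bf += loglik(x, mu1) - loglik(x, mu0)
        if abs(log_bf) >= log_bf_stop:
            return ("M1" if log_bf > 0 else "M0"), n
    return "undecided", n

decision, n_used = sequential_test([1.0] * 20)  # -> ("M1", 10)
```

The point of the sequential design is visible in the example: with consistent evidence, the threshold is crossed after 10 observations and acquisition can stop early instead of running a fixed-length protocol.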
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
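The trigger logic described, low conductivity after the de-ionized water flush and high conductivity during the acid or base washes, amounts to a small threshold classifier. A sketch; the function name, thresholds and units are illustrative assumptions, and conductivity alone cannot distinguish the acid from the base wash, so that context comes from the step sequence:

```python
def desalting_state(inlet_uS, outlet_uS, low=5.0, high=500.0):
    """Map the two non-contact conductivity readings (µS/cm) to a process
    state used to trigger the next protocol step (thresholds illustrative)."""
    if inlet_uS < low and outlet_uS < low:
        return "neutral"        # DI-water flush complete: start next step
    if outlet_uS > high:
        return "acid_or_base"   # strong-acid or high-pH wash detected at outlet
    return "transitioning"      # media not yet equilibrated: keep waiting
```

An automation loop would poll both probes and advance its step sequence only when the expected state is reached, which is what minimizes wash-water and reagent consumption.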
Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.
Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic
2015-08-01
This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility becomes insignificant. Gy's sampling theory was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.
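The theoretical relative variances mentioned above come from Gy's fundamental sampling error, sigma^2 = f*g*l*c*d^3*(1/Ms - 1/ML). A sketch using the customary default shape (f) and granulometric (g) factors; all numeric inputs in the example are illustrative, not values from the study:

```python
def gy_relative_variance(d_cm, m_sample_g, m_lot_g, c_mineral, f=0.5, g=0.25, l=1.0):
    """Relative variance of Gy's fundamental sampling error.

    sigma^2 = f * g * l * c * d^3 * (1/Ms - 1/ML), with d the top particle
    size (cm), Ms the sample mass (g), ML the lot mass (g), c the
    mineralogical composition factor (g/cm^3), f the shape factor, g the
    granulometric factor and l the liberation factor.
    """
    C = f * g * l * c_mineral  # combined sampling constant, g/cm^3
    return C * d_cm ** 3 * (1.0 / m_sample_g - 1.0 / m_lot_g)

# Illustrative case: 2 mm top size, 50 g subsample from a 5 kg lot
var = gy_relative_variance(0.2, 50.0, 5000.0, c_mineral=5000.0)
rel_sd = var ** 0.5  # about 31 % relative standard deviation
```

The d^3 term is why grinding dominates: reducing the top particle size by a factor of 10 cuts the fundamental-error variance by a factor of 1000 for the same subsample mass.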
Current Protocols in Pharmacology
2016-01-01
Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029
NASA Astrophysics Data System (ADS)
Schwab, M. P.; Klaus, J.; Pfister, L.; Weiler, M.
2016-12-01
Over the past decades, stream sampling protocols for hydro-geochemical parameters were often limited by logistical and technological constraints. While long-term monitoring protocols were typically based on weekly sampling intervals, high-frequency sampling was commonly limited to a few single events. In this contribution, we combined high-frequency and long-term measurements to understand DOC and nitrate dynamics in a forested headwater catchment for different runoff events and seasons. Our study area is the forested Weierbach catchment (0.47 km²) in Luxembourg, where the fractured schist bedrock is covered by cambisol soils. The runoff response is characterized by a double-peak behaviour. The first peak occurs during or right after a rainfall event, triggered by fast near-surface runoff generation processes, while a second delayed peak lasts several days and is generated by subsurface flow. This second peak occurs only if a distinct storage threshold of the catchment is exceeded. Our observations were carried out with a field-deployable UV-Vis spectrometer measuring DOC and nitrate concentrations in situ at 15 min intervals for more than two years. In addition, a long-term validation was carried out with data obtained from the analysis of water collected with grab samples. The long-term, high-frequency measurements allowed us to calculate a complete and detailed balance of DOC and nitrate export over two years. Transport behaviour of DOC and nitrate showed different dynamics between the first and second hydrograph peaks. DOC is mainly exported during the first peaks, while nitrate is mostly exported during the delayed second peaks. Biweekly end-member measurements of soil and groundwater over several years enabled us to link the behaviour of DOC and nitrate export to various end-members in the catchment.
Altogether, the long-term and high-frequency time series provide the opportunity to study DOC and nitrate export processes without having to rely on either a few single-event measurements or coarse sampling protocols.
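The complete export balance described above reduces to summing concentration times discharge over each 15-min measurement interval. A minimal sketch; the function name and unit conventions (mg/L for concentration, L/s for discharge) are assumptions:

```python
def export_load_kg(concs_mg_l, discharges_l_s, dt_s=900):
    """Cumulative solute export (kg) from paired time series of
    concentration (mg/L) and discharge (L/s) sampled every dt_s seconds
    (900 s = 15 min): sum of C * Q * dt, converted from mg to kg."""
    return sum(c * q * dt_s for c, q in zip(concs_mg_l, discharges_l_s)) / 1e6
```

Applying this separately to the event-driven first peaks and the delayed second peaks is what allows the contrasting DOC and nitrate export paths to be quantified.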
Mercury Assessment and Monitoring Protocol for the Bear Creek Watershed, Colusa County, California
Suchanek, Thomas H.; Hothem, Roger L.; Rytuba, James J.; Yee, Julie L.
2010-01-01
This report summarizes the known information on the occurrence and distribution of mercury (Hg) in physical/chemical and biological matrices within the Bear Creek watershed. Based on these data, a matrix-specific monitoring protocol for the evaluation of the effectiveness of activities designed to remediate Hg contamination in the Bear Creek watershed is presented. The monitoring protocol documents procedures for collecting and processing water, sediment, and biota for estimation of total Hg (TotHg) and monomethyl mercury (MMeHg) in the Bear Creek watershed. The concurrent sampling of TotHg and MMeHg in biota as well as water and sediment from 10 monitoring sites is designed to assess the relative bioavailability of Hg released from Hg sources in the watershed and identify environments conducive to Hg methylation. These protocols are designed to assist landowners, land managers, water quality regulators, and scientists in determining whether specific restoration/mitigation actions lead to significant progress toward achieving water quality goals to reduce Hg in Bear and Sulphur Creeks.
Efficient Isolation Protocol for B and T Lymphocytes from Human Palatine Tonsils
Assadian, Farzaneh; Sandström, Karl; Laurell, Göran; Svensson, Catharina; Akusjärvi, Göran; Punga, Tanel
2015-01-01
Tonsils form a part of the immune system, providing the first line of defense against inhaled pathogens. Usually the term “tonsils” refers to the palatine tonsils situated at the lateral walls of the oral part of the pharynx. Surgically removed palatine tonsils provide a convenient, accessible source of B and T lymphocytes for studying the interplay between foreign pathogens and the host immune system. This video protocol describes the dissection and processing of surgically removed human palatine tonsils, followed by the isolation of the individual B and T cell populations from the same tissue sample. We present a method that efficiently separates tonsillar B and T lymphocytes using an antibody-dependent affinity protocol. Further, we use the method to demonstrate that human adenovirus specifically infects the tonsillar T cell fraction. The established protocol is generally applicable for efficiently and rapidly isolating tonsillar B and T cell populations to study the role of different types of pathogens in tonsillar immune responses. PMID:26650582
Sticky trap and stem-tap sampling protocols for the Asian citrus psyllid (Hemiptera: Psyllidae)
USDA-ARS?s Scientific Manuscript database
Sampling statistics were obtained to develop a sampling protocol for estimating numbers of adult Diaphorina citri in citrus using two different sampling methods: yellow sticky traps and stem–tap samples. A 4.0 ha block of mature orange trees was stratified into ten 0.4 ha strata and sampled using...
Bova, G Steven; Eltoum, Isam A; Kiernan, John A; Siegal, Gene P; Frost, Andra R; Best, Carolyn J M; Gillespie, John W; Emmert-Buck, Michael R
2005-01-01
Isolation of well-preserved pure cell populations is a prerequisite for sound studies of the molecular basis of pancreatic malignancy and other biological phenomena. This chapter reviews current methods for obtaining anatomically specific signals from molecules isolated from tissues, a basic requirement for productive linking of phenotype and genotype. The quality of samples isolated from tissue and used for molecular analysis is often glossed over or omitted from publications, making interpretation and replication of data difficult or impossible. Fortunately, recently developed techniques allow life scientists to better document and control the quality of samples used for a given assay, creating a foundation for improvement in this area. Tissue processing for molecular studies usually involves some or all of the following steps: tissue collection, gross dissection/identification, fixation, processing/embedding, storage/archiving, sectioning, staining, microdissection/annotation, and pure analyte labeling/identification. High-quality tissue microdissection does not necessarily yield high-quality samples for analysis: the quality of biomaterials obtained is highly dependent on steps upstream and downstream of microdissection. We provide protocols for each of these steps and encourage you to improve upon them. It is worth the effort of every laboratory to optimize and document its technique at each stage of the process, and we provide a starting point for those willing to spend the time to optimize. In our view, poor documentation of the tissue and cell type of origin and the use of non-optimized protocols are a source of inefficiency in current life science research. Even incremental improvement in this area will increase productivity significantly.
Oliveira, R R; Viana, A J C; Reátegui, A C E; Vincentz, M G A
2015-12-29
Determination of gene expression is an important tool to study biological processes and relies on the quality of the extracted RNA. Changes in gene expression profiles may be directly related to mutations in regulatory DNA sequences or alterations in DNA cytosine methylation, which is an epigenetic mark. Correlation of gene expression with DNA sequence or epigenetic mark polymorphism is often desirable; for this, a robust protocol to isolate high-quality RNA and DNA simultaneously from the same sample is required. Although commercial kits and protocols are available, they are mainly optimized for animal tissues and, in general, restricted to RNA or DNA extraction, not both. In the present study, we describe an efficient and accessible method to extract both RNA and DNA simultaneously from the same sample of various plant tissues, using small amounts of starting material. The protocol was efficient in the extraction of high-quality nucleic acids from several Arabidopsis thaliana tissues (e.g., leaf, inflorescence stem, flower, fruit, cotyledon, seedlings, root, and embryo) and from other tissues of non-model plants, such as Avicennia schaueriana (Acanthaceae), Theobroma cacao (Malvaceae), Paspalum notatum (Poaceae), and Sorghum bicolor (Poaceae). The obtained nucleic acids were used as templates for downstream analyses, such as mRNA sequencing, quantitative real time-polymerase chain reaction, bisulfite treatment, and others; the results were comparable to those obtained with commercial kits. We believe that this protocol could be applied to a broad range of plant species, help avoid technical and sampling biases, and facilitate several RNA- and DNA-dependent analyses.
Foster, Amanda; Laurin, Nancy
2012-03-06
Traditional PCR methods for forensic STR genotyping require approximately 2.5 to 4 hours to complete, contributing a significant portion of the time required to process forensic DNA samples. The purpose of this study was to develop and validate a fast PCR protocol that enabled amplification of the 16 loci targeted by the AmpFℓSTR® Identifiler® primer set, allowing decreased cycling times. Fast PCR conditions were achieved by substituting the traditional Taq polymerase for SpeedSTAR™ HS DNA polymerase which is designed for fast PCR, by upgrading to a thermal cycler with faster temperature ramping rates and by modifying cycling parameters (less time at each temperature) and adopting a two-step PCR approach. The total time required for the optimized protocol is 26 min. A total of 147 forensically relevant DNA samples were amplified using the fast PCR protocol for Identifiler. Heterozygote peak height ratios were not affected by fast PCR conditions, and full profiles were generated for single-source DNA amounts between 0.125 ng and 2.0 ng. Individual loci in profiles produced with the fast PCR protocol exhibited average n-4 stutter percentages ranging from 2.5 ± 0.9% (THO1) to 9.9 ± 2.7% (D2S1338). No increase in non-adenylation or other amplification artefacts was observed. Minor contributor alleles in two-person DNA mixtures were reliably discerned. Low level cross-reactivity (monomorphic peaks) was observed with some domestic animal DNA. The fast PCR protocol presented offers a feasible alternative to current amplification methods and could aid in reducing the overall time in STR profile production or could be incorporated into a fast STR genotyping procedure for time-sensitive situations.
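The two profile-quality metrics quoted above, heterozygote peak height ratio and n-4 stutter percentage, reduce to simple ratios of electropherogram peak heights. A minimal sketch with invented RFU values (the function names and numbers are illustrative, not from the study):

```python
# Hypothetical STR profile QC metrics from relative fluorescence unit (RFU)
# peak heights.

def peak_height_ratio(h1, h2):
    """Heterozygote PHR: smaller allele peak divided by larger, in (0, 1]."""
    return min(h1, h2) / max(h1, h2)

def stutter_percent(stutter_rfu, allele_rfu):
    """n-4 stutter peak height as a percentage of its parent allele peak."""
    return 100.0 * stutter_rfu / allele_rfu

print(peak_height_ratio(1200, 1500))  # -> 0.8
print(stutter_percent(90, 1500))      # -> 6.0
```

Averaging such stutter percentages per locus across many samples gives figures like the 2.5% (THO1) to 9.9% (D2S1338) range reported above.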
Gamma radiation-induced thermoluminescence emission of minerals adhered to Mexican sesame seeds
NASA Astrophysics Data System (ADS)
Rodríguez-Lazcano, Y.; Correcher, V.; Garcia-Guinea, J.; Cruz-Zaragoza, E.
2013-02-01
The thermoluminescence (TL) emission of minerals isolated from Mexican sesame seeds appears to be a good tool for discerning between irradiated and non-irradiated samples. According to the X-ray diffraction (XRD) and environmental scanning electron microscope (ESEM) data, the adhered dust in both samples is mainly composed of different amounts of quartz and feldspars. These mineral phases exhibit (i) sufficient sensitivity to ionizing radiation to induce good TL intensity, (ii) high stability of the TL signal during storage of the material (i.e., low fading), and (iii) thermal and chemical stability. Blind tests, performed under laboratory conditions simulating industrial preservation processes, allowed us to distinguish between 1 kGy gamma-irradiated and non-irradiated sesame samples even 15 months after irradiation, following the EN 1788 European Standard protocol.
21 CFR 660.6 - Samples; protocols; official release.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Hepatitis B Surface Antigen § 660.6 Samples; protocols; official release. (a) Samples. (1) For the purposes... of official release is no longer required under paragraph (c)(2) of this section. (ii) One sample at... required under paragraph (c)(2) of this section. The sample submitted at the 90-day interval shall be from...
Fong, Erika J.; Huang, Chao; Hamilton, Julie; ...
2015-11-23
A major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, robust sample manipulation requires addressing device integration with the macroscale environment. To realize repeatable, sensitive particle separation, this protocol presents a complete, automated, and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples. Important aspects of this system include a modular device layout and robust fixtures, resulting in reliable and flexible world-to-chip connections, and fully automated fluid handling, which accomplishes closed-loop sample collection, system cleaning, and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization and performance optimization, and demonstrate its use for size separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although the architecture requires the integration of multiple pieces of equipment, its advantages include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.
Roberts, J. Scott; Shalowitz, David I.; Christensen, Kurt D.; Everett, Jessica N.; Kim, Scott Y. H.; Raskin, Leon; Gruber, Stephen B.
2011-01-01
The obligations of researchers to disclose clinically and/or personally significant individual research results are highly debated, but few empirical studies have addressed this topic. We describe the development of a protocol for returning research results to participants at one site of a multicenter study of the genetic epidemiology of melanoma. Protocol development involved numerous challenges: (1) deciding whether genotype results merited disclosure; (2) achieving an appropriate format for communicating results; (3) developing education materials; (4) deciding whether to retest samples for additional laboratory validation; (5) identifying and notifying selected participants; and (6) assessing the impact of disclosure. Our experience suggests potential obstacles depending on researcher resources and the design of the parent study, but offers a process by which researchers can responsibly return individual study results and evaluate the impact of disclosure. PMID:20831418
A new real-time PCR protocol for detection of avian haemosporidians.
Bell, Jeffrey A; Weckstein, Jason D; Fecchio, Alan; Tkach, Vasyl V
2015-07-19
Birds possess the most diverse assemblage of haemosporidian parasites, including three genera: Plasmodium, Haemoproteus, and Leucocytozoon. Currently there are over 200 morphologically identified avian haemosporidian species, although true species richness is unknown due to great genetic diversity and insufficient sampling in highly diverse regions. Studies aimed at surveying haemosporidian diversity involve collecting and screening samples from hundreds to thousands of individuals. Currently, screening relies on microscopy and/or single or nested standard PCR. Although effective, these methods are time and resource consuming, and in the case of microscopy require substantial expertise. Here we report a newly developed real-time PCR protocol designed to quickly and reliably detect all three genera of avian haemosporidians in a single biochemical reaction. Using available DNA sequences from avian haemosporidians we designed primers R330F and R480RL, which flank a 182 base pair fragment of conserved mitochondrial rDNA. These primers were initially tested using real-time PCR on samples from Malawi, Africa, previously screened for avian haemosporidians using traditional nested PCR. Our real-time protocol was further tested on 94 samples from the Cerrado biome of Brazil, previously screened using a single PCR assay for haemosporidian parasites. These samples were also amplified using modified nested PCR protocols, allowing for comparisons between the three different screening methods (single PCR, nested PCR, real-time PCR). The real-time PCR protocol successfully identified all three genera of avian haemosporidians from both single and mixed infections previously detected from Malawi. There was no significant difference between the three screening protocols used for the 94 samples from the Brazilian Cerrado (χ² = 0.3429, df = 2, P = 0.842).
After proving effective, the real-time protocol was used to screen 2113 Brazilian samples, identifying 693 positive samples. Our real-time PCR assay proved as effective as two widely used molecular screening techniques, single PCR and nested PCR. However, the real-time protocol has the distinct advantage of detecting all three genera in a single reaction, which significantly increases efficiency by greatly decreasing screening time and cost. Our real-time PCR protocol is therefore a valuable tool in the quickly expanding field of avian haemosporidian research.
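The reported comparison (χ² = 0.3429, df = 2, P = 0.842) is consistent with a chi-square test of positives/negatives across the three screening methods. A stdlib-only sketch with hypothetical counts (the study's actual per-method counts are not given in the abstract); for df = 2 the p-value has the closed form exp(-χ²/2), so no stats library is needed:

```python
# Chi-square test of independence for a 3 (methods) x 2 (pos/neg) table.
import math

def chi_square_3x2(table):
    """table: 3 rows of [positives, negatives]. Returns (chi2, p) for df=2."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(r[j] for r in table) for j in range(2)]
    n = sum(row_totals)
    chi2 = sum((table[i][j] - row_totals[i] * col_totals[j] / n) ** 2
               / (row_totals[i] * col_totals[j] / n)
               for i in range(3) for j in range(2))
    # df = (3 - 1) * (2 - 1) = 2, and the chi-square sf with df=2 is exp(-x/2)
    return chi2, math.exp(-chi2 / 2)

# Invented positive/negative counts for single, nested, and real-time PCR
# on 94 samples each:
chi2, p = chi_square_3x2([[40, 54], [43, 51], [42, 52]])
print(chi2, p)
```

With counts this similar across methods, the test is far from significance, mirroring the study's conclusion that the three protocols perform comparably.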
Rapid screening method for male DNA by using the loop-mediated isothermal amplification assay.
Kitamura, Masashi; Kubo, Seiji; Tanaka, Jin; Adachi, Tatsushi
2017-08-12
Screening for male-derived biological material from collected samples plays an important role in criminal investigations, especially those involving sexual assaults. We have developed a loop-mediated isothermal amplification (LAMP) assay targeting multi-repeat sequences of the Y chromosome for detecting male DNA. Successful amplification occurred with 0.5 ng of male DNA under isothermal conditions of 61 to 67 °C, but no amplification occurred with up to 10 ng of female DNA. Under the optimized conditions, the LAMP reaction initiated amplification within 10 min and proceeded for 20 min. The LAMP reaction was sensitive at levels as low as 1 pg of male DNA, and a quantitative LAMP assay could be developed because of the strong correlation between the reaction time and the amount of template DNA in the range of 10 pg to 10 ng. Furthermore, to apply the LAMP assay to on-site screening for male-derived samples, we evaluated a protocol using a simple DNA extraction method and a colorimetric intercalating dye that allows detection of the LAMP reaction from the change in color of the solution. Using this protocol, samples of male-derived blood and saliva stains were processed in approximately 30 min from DNA extraction to detection. Because our protocol does not require much hands-on time or special equipment, this LAMP assay promises to become a rapid and simple screening method for male-derived samples in forensic investigations.
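The quantitative assay suggested above rests on reaction time varying linearly with log10 of the template amount. A sketch of such a standard curve with invented calibration points (the study's actual times and fit are not given in the abstract):

```python
# Fit a time-to-amplification standard curve against log10(template, pg),
# then invert it to estimate an unknown sample's template amount.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical standards spanning 10 pg to 10 ng (x = log10 pg):
log_pg = [1.0, 2.0, 3.0, 4.0]
minutes = [18.0, 15.0, 12.0, 9.0]
slope, intercept = fit_line(log_pg, minutes)

def estimate_pg(observed_minutes):
    """Invert the calibration line to recover template amount in pg."""
    return 10 ** ((observed_minutes - intercept) / slope)

print(round(estimate_pg(13.5), 1))  # -> 316.2
```

A negative slope is expected: more template means earlier amplification, which is what makes the inversion usable for quantification.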
General introduction for the “National Field Manual for the Collection of Water-Quality Data”
2018-02-28
BackgroundAs part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation’s water resources. A high degree of reliability and standardization of these data are paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. “The National Field Manual for the Collection of Water-Quality Data” (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class, “Field Water-Quality Methods for Groundwater and Surface Water,” and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.
The role of the dorsolateral prefrontal cortex in early threat processing: a TMS study.
Sagliano, Laura; D'Olimpio, Francesca; Panico, Francesco; Gagliardi, Serena; Trojano, Luigi
2016-12-01
Previous studies demonstrated that excitatory (high-frequency) offline transcranial magnetic stimulation (TMS) over the left and right dorsolateral prefrontal cortex (DLPFC) modulates attention allocation to threatening stimuli in non-clinical samples. These studies employed only offline TMS protocols, which did not allow investigation of the effect of the stimulation on the early stage of threat processing. In this study, the role of the right and left DLPFC in early threat processing was investigated in high- and low-anxious individuals by means of an inhibitory single-pulse online TMS protocol. Our results demonstrate the role of the left DLPFC in an early stage of threat processing and show that this effect is modulated by individuals' anxiety level. Inhibitory stimulation of the left DLPFC induced a disengagement bias in high-anxious individuals, while the same stimulation induced attentional avoidance in low-anxious individuals. The findings of the present study suggest that the right and left DLPFC are differently involved in early threat processing in healthy individuals.
Fonseca, Rochele Paz; Fachel, Jandyra Maria Guimarães; Chaves, Márcia Lorena Fagundes; Liedtke, Francéia Veiga; Parente, Maria Alice de Mattos Pimenta
2007-01-01
Right-brain-damaged individuals may present discursive, pragmatic, lexical-semantic and/or prosodic disorders. Objective: To verify the effect of right hemisphere damage on communication processing, as evaluated by the Brazilian version of the Protocole Montréal d’Évaluation de la Communication (Montreal Communication Evaluation Battery), the Bateria Montreal de Avaliação da Comunicação (Bateria MAC) in Portuguese. Methods: The sample comprised a clinical group of 29 right-brain-damaged participants and a control group of 58 non-brain-damaged adults. A questionnaire on sociocultural and health aspects was administered together with the Brazilian MAC Battery. Results: Significant differences between the clinical and control groups were observed in the following MAC Battery tasks: conversational discourse; unconstrained, semantic, and orthographic verbal fluency; linguistic prosody repetition; and emotional prosody comprehension, repetition, and production. Moreover, the clinical group was less homogeneous than the control group. Conclusions: A right-brain-damage effect was identified directly on three communication processes (discursive, lexical-semantic, and prosodic) and indirectly on pragmatic processing. PMID:29213400
Oliveira, Hugo M; Segundo, Marcela A; Lima, José L F C; Miró, Manuel; Cerdà, Victor
2010-05-01
In the present work we propose, for the first time, an on-line automatic renewable molecularly imprinted solid-phase extraction (MISPE) protocol for sample preparation prior to liquid chromatographic analysis. The automatic microscale procedure was based on the bead injection (BI) concept in the lab-on-valve (LOV) format, using a multisyringe burette as the propulsion unit for handling solutions and suspensions. High precision was attained in handling suspensions of irregularly shaped molecularly imprinted polymer (MIP) particles, enabling the use of a commercial MIP as a renewable sorbent. The features of the proposed BI-LOV manifold also allowed strict control of the different steps of the extraction protocol, which is essential for promoting selective interactions in the cavities of the MIP. Using this on-line method, it was possible to extract and quantify riboflavin from different foodstuff samples in the range of 0.450 to 5.00 mg L⁻¹ after processing 1000 µL of sample (infant milk, pig liver extract, and energy drink) without any prior treatment. For milk samples, LOD and LOQ values were 0.05 and 0.17 mg L⁻¹, respectively. The method was successfully applied to the analysis of two certified reference materials (NIST 1846 and BCR 487) with high precision (RSD < 5.5%). Given the downscaling and simplification of the sample preparation protocol and the simultaneous performance of extraction and chromatographic assays, a cost-effective, higher-throughput (six determinations per hour) methodology for the determination of riboflavin in foodstuff samples is presented.
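One common convention for deriving LOD/LOQ figures like those quoted for milk samples is the ICH 3.3σ/slope and 10σ/slope rule; this is a sketch under that assumption, not necessarily the authors' exact procedure, and the sigma and slope values are illustrative only:

```python
# ICH-style detection and quantification limits from the standard deviation
# of the blank/low-level response (sigma) and the calibration-curve slope.

def lod_loq(sigma, slope):
    """Return (LOD, LOQ) = (3.3*sigma/slope, 10*sigma/slope)."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative values only (response units per mg/L slope of 1.0):
lod, loq = lod_loq(sigma=0.015, slope=1.0)
print(lod, loq)
```

Note that by construction LOQ/LOD is fixed at 10/3.3 ≈ 3 under this rule, so figures derived from replicate-based definitions (as the 0.05 and 0.17 mg L⁻¹ above may be) can deviate from that ratio.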
NASA Technical Reports Server (NTRS)
1981-01-01
The purpose of the Orbiting Quarantine Facility is to provide maximum protection of the terrestrial biosphere by ensuring that the returned Martian samples are safe to bring to Earth. The protocol designed to detect the presence of biologically active agents in the Martian soil is described. The protocol determines one of two things about the sample: (1) that it is free from nonterrestrial life forms and can be sent to a terrestrial containment facility where extensive chemical, biochemical, geological, and physical investigations can be conducted; or (2) that it exhibits "biological effects" of the type that dictate second order testing. The quarantine protocol is designed to be conducted on a small portion of the returned sample, leaving the bulk of the sample undisturbed for study on Earth.
COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES
Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...
A Field Comparison of Sampling Protocols for Measuring Lead in Drinking Water
US EPA Region 5 conducted a sampling study that demonstrates existing sampling protocols used for the Lead and Copper Rule (LCR) underestimate peak and probable mass of lead released in a system with lead service lines (LSLs). This comparative stagnation sampling was conducted i...
Cleft Audit Protocol for Speech (CAPS-A): A Comprehensive Training Package for Speech Analysis
ERIC Educational Resources Information Center
Sell, D.; John, A.; Harding-Bell, A.; Sweeney, T.; Hegarty, F.; Freeman, J.
2009-01-01
Background: The previous literature has largely focused on speech analysis systems and ignored process issues, such as the nature of adequate speech samples, data acquisition, recording and playback. Although there has been recognition of the need for training on tools used in speech analysis associated with cleft palate, little attention has been…
Hope Therapy in a Community Sample: A Pilot Investigation
ERIC Educational Resources Information Center
Cheavens, Jennifer S.; Feldman, David B.; Gum, Amber; Michael, Scott T.; Snyder, C. R.
2006-01-01
We report findings from an initial empirical test of a hope-based, group therapy protocol. In this context, hope is defined as a cognitive process through which individuals pursue their goals [Snyder, C. R.: 1994, Free Press, New York]. As such, the eight-session group treatment emphasized building goal-pursuit skills. Findings from a randomized,…
Inspection of care: Findings from an innovative demonstration
Morris, John N.; Sherwood, Clarence C.; Dreyer, Paul
1989-01-01
In this article, information is presented concerning the efficacy of a sample-based approach to completing inspection-of-care reviews of Medicaid-supported nursing home residents. Massachusetts nursing homes were randomly assigned to full (the control group) or sample (the experimental group) review conditions. The primary research focus was to determine whether the proportion of facilities found to be deficient (based on quality-of-care and level-of-care criteria) in the experimental sample was comparable to the proportion in the control sample. The findings supported this hypothesis: deficient facilities appear to be equally identifiable using the random- or full-sampling protocols, and the process can be completed with considerable savings of surveyor time. PMID:10313458
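The study's central comparison, whether the deficiency proportion under sample review matches that under full review, can be framed as a two-proportion z-test. A stdlib-only sketch with invented counts (the article's actual facility counts are not given in the abstract):

```python
# Two-proportion z-test for H0: the deficiency rate is the same under
# sample-based and full review.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return (z, two-sided p) comparing x1/n1 against x2/n2."""
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (x1 / n1 - x2 / n2) / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal p-value
    return z, p

# e.g. 18/60 facilities deficient under sample review vs 21/62 under full:
z, p = two_proportion_z(18, 60, 21, 62)
print(z, p)
```

A large p-value here supports equivalence only informally; a formal equivalence claim would need a test designed for that purpose (e.g. two one-sided tests), which is beyond this sketch.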
Yocgo, Rosita E; Geza, Ephifania; Chimusa, Emile R; Mazandu, Gaston K
2017-11-23
Advances in forward and reverse genetic techniques have enabled the discovery and identification of several plant defence genes based on quantifiable disease phenotypes in mutant populations. Existing models for testing the effect of gene inactivation, or of the genes causing these phenotypes, do not take into account the potential uncertainty of these datasets and the noise inherent in the biological experiments used, which may mask downstream analyses and limit the use of these datasets. Moreover, elucidating the biological mechanisms driving induced disease resistance and influencing these observable disease phenotypes has never been systematically tackled, highlighting the need for an efficient model to characterize completely the gene target under consideration. We developed a post-gene-silencing bioinformatics (post-GSB) protocol which accounts for potential biases related to the disease phenotype datasets in assessing the contribution of the gene target to the plant defence response. The post-GSB protocol uses Gene Ontology semantic similarity and pathway datasets to generate an enriched process regulatory network, based on the functional degeneracy of the plant proteome, to help understand the induced plant defence response. We applied this protocol to investigate the effect of NPR1 gene silencing on changes in Arabidopsis thaliana plants following infection with Pseudomonas syringae pathovar tomato strain DC3000. Results indicated that the presence of a functionally active NPR1 reduced the plant's susceptibility to the infection, with about 99% of variability in Pseudomonas spore growth between npr1 mutant and wild-type samples. Moreover, the post-GSB protocol revealed the coordinated action of target-associated genes and pathways through an enriched process regulatory network, summarizing the potential target-based induced disease resistance mechanism.
This protocol can improve the characterization of the gene target and, potentially, elucidate the induced defence response by more effectively utilizing available phenotype information and functional knowledge of the plant proteome.
Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc
2010-07-01
We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 years) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using small amounts of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore the cost, of generating DNA sequences from mushrooms and other fungi compared with traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field.
Judges' Agreement and Disagreement Patterns When Encoding Verbal Protocols.
ERIC Educational Resources Information Center
Schael, Jocelyne; Dionne, Jean-Paul
The basis of agreement or disagreement among judges/evaluators when applying a coding scheme to concurrent verbal protocols was studied. The sample included 20 university graduates, from varied backgrounds; 10 subjects had and 10 subjects did not have experience in protocol analysis. The total sample was divided into four balanced groups according…
Protocol-based care: the standardisation of decision-making?
Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra
2009-05-01
To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled: a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation.
The successful implementation and judicious use of tools such as protocols and guidelines will likely be dependent on approaches that facilitate the development of nurses' decision-making processes in parallel with paying attention to the influence of context.
Host-Associated Metagenomics: A Guide to Generating Infectious RNA Viromes
Robert, Catherine; Pascalis, Hervé; Michelle, Caroline; Jardot, Priscilla; Charrel, Rémi; Raoult, Didier; Desnues, Christelle
2015-01-01
Background Metagenomic analyses have been widely used in the last decade to describe viral communities in various environments or to identify the etiology of human, animal, and plant pathologies. Here, we present a simple and standardized protocol that allows for the purification and sequencing of RNA viromes from complex biological samples with an important reduction of host DNA and RNA contaminants, while preserving the infectivity of viral particles. Principal Findings We evaluated different viral purification steps, random reverse transcriptions and sequence-independent amplifications of a pool of representative RNA viruses. Viruses remained infectious after the purification process. We then validated the protocol by sequencing the RNA virome of human body lice engorged in vitro with artificially contaminated human blood. The full genomes of the most abundant viruses absorbed by the lice during the blood meal were successfully sequenced. Interestingly, random amplifications differed in the genome coverage of segmented RNA viruses. Moreover, the majority of reads were taxonomically identified, and only 7–15% of all reads were classified as “unknown”, depending on the random amplification method. Conclusion The protocol reported here could easily be applied to generate RNA viral metagenomes from complex biological samples of different origins. Our protocol allows further virological characterizations of the described viral communities because it preserves the infectivity of viral particles and allows for the isolation of viruses. PMID:26431175
EPR study on gamma-irradiated fruits dehydrated via osmosis
NASA Astrophysics Data System (ADS)
Yordanov, N. D.; Aleksieva, K.
2007-06-01
The shape and time stability of the electron paramagnetic resonance (EPR) spectra of non-irradiated and γ-irradiated papaya, melon, cherry and fig samples dehydrated via osmosis are reported. It is shown that non-irradiated samples are generally EPR silent, whereas γ-irradiated samples exhibit "sugar-like" EPR spectra. The recorded EPR spectra were monitored for a period of 7 months after irradiation (samples stored at low humidity and in the dark). The results suggest a long period over which the radiation processing of osmotically dehydrated fruits can be unambiguously identified. Therefore, protocol EN 13708:2001 issued by CEN is fully applicable to the studied fruit samples.
Direct and long-term detection of gene doping in conventional blood samples.
Beiter, T; Zimmermann, M; Fragasso, A; Hudemann, J; Niess, A M; Bitzer, M; Lauer, U M; Simon, P
2011-03-01
The misuse of somatic gene therapy for the purpose of enhancing athletic performance is perceived as a coming threat to the world of sports and categorized as 'gene doping'. This article describes a direct detection approach for gene doping that gives a clear yes-or-no answer based on the presence or absence of transgenic DNA in peripheral blood samples. By exploiting a priming strategy to specifically amplify intronless DNA sequences, we developed PCR protocols allowing the detection of very small amounts of transgenic DNA in genomic DNA samples to screen for six prime candidate genes. Our detection strategy was verified in a mouse model, giving positive signals from minute amounts (20 μl) of blood samples for up to 56 days following intramuscular adeno-associated virus-mediated gene transfer, one of the most likely candidate vector systems to be misused for gene doping. To make our detection strategy amenable for routine testing, we implemented a robust sample preparation and processing protocol that allows cost-efficient analysis of small human blood volumes (200 μl) with high specificity and reproducibility. The practicability and reliability of our detection strategy was validated by a screening approach including 327 blood samples taken from professional and recreational athletes under field conditions.
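The intronless-priming idea described above can be sketched as follows: a primer spanning an exon-exon junction matches the processed (intronless) transgene but not the endogenous genomic copy, where the intron interrupts the junction. The sequences below are toy examples, not the study's actual primers or targets.

```python
# Toy exon/intron sequences (hypothetical, for illustration only).
exon1, intron, exon2 = "ATGGCC", "GTAAGT...AG", "GATTAC"

# A junction primer straddles the exon1/exon2 boundary.
junction_primer = exon1[-3:] + exon2[:3]  # "GCCGAT"

transgene = exon1 + exon2          # intronless (cDNA-derived) copy
genomic = exon1 + intron + exon2   # endogenous gene with intron

# The primer site exists only in the intronless copy, so amplification
# signals the presence of transgenic DNA.
print(junction_primer in transgene)  # True
print(junction_primer in genomic)    # False
```

In the real assay, of course, specificity also depends on primer length, melting temperature, and PCR conditions; this sketch only shows why the junction sequence discriminates the two templates.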
NASA Technical Reports Server (NTRS)
Mahaffy, P. R.
2006-01-01
The Mars Science Laboratory, under development for launch in 2009, is designed to explore and quantitatively assess a local region on Mars as a potential habitat for present or past life. Its ambitious goals are (1) to assess the past or present biological potential of the target environment, (2) to characterize the geology and geochemistry at the MSL landing site, and (3) to investigate planetary processes that influence habitability. The planned capabilities of the rover payload will enable a comprehensive search for organic molecules, a determination of definitive mineralogy of sampled rocks and fines, chemical and isotopic analysis of both atmospheric and solid samples, and precision isotope measurements of several volatile elements. A range of contact and remote surface and subsurface survey tools will establish context for these measurements and will facilitate sample identification and selection. The Sample Analysis at Mars (SAM) suite of MSL addresses several of the mission's core measurement goals. It includes a gas chromatograph, a mass spectrometer, and a tunable laser spectrometer. These instruments will be designed to analyze either atmospheric samples or gases extracted from solid phase samples such as rocks and fines. We will describe the range of measurement protocols under development and study by the SAM engineering and science teams for use on the surface of Mars.
Michaud, Jean-Philippe; Moreau, Gaétan
2013-07-01
Experimental protocols in forensic entomology successional field studies generally involve daily sampling of insects to document temporal changes in species composition on animal carcasses. One challenge with that method has been to adjust the sampling intensity to obtain the best representation of the community present without affecting that community. To date, little is known about how such investigator perturbations affect decomposition-related processes. Here, we investigated how different levels of daily sampling of fly eggs and fly larvae affected, over time, carcass decomposition rate and the carrion insect community. Results indicated that a daily sampling of <5% of the egg and larvae volumes present on a carcass, a sampling intensity believed to be consistent with currently accepted practices in successional field studies, had little effect overall. Higher sampling intensities, however, slowed down carcass decomposition, affected the abundance of certain carrion insects, and caused an increase in the volume of eggs laid by dipterans. This study suggests that the carrion insect community not only has a limited resilience to recurrent perturbations but that a daily sampling intensity equal to or <5% of the egg and larvae volumes appears adequate to ensure that the system is representative of unsampled conditions. Hence we propose that this threshold be accepted as best practice in future forensic entomology successional field studies.
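The 5% threshold proposed above is a simple proportion check. As a minimal sketch (with hypothetical volumes, not data from the study), a planned daily sample can be screened against the threshold like this:

```python
def within_threshold(sample_ml, total_ml, limit=0.05):
    """True if the daily sample is at most `limit` (default 5%) of the
    egg/larva volume present on the carcass."""
    return (sample_ml / total_ml) <= limit

# Illustrative volumes in millilitres.
print(within_threshold(2.0, 50.0))  # True: 4% of the volume present
print(within_threshold(5.0, 50.0))  # False: 10% exceeds the threshold
```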
Thermogravimetric Analysis of Single-Wall Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Arepalli, Sivram; Nikolaev, Pavel; Gorelik, Olga
2010-01-01
An improved protocol for thermogravimetric analysis (TGA) of samples of single-wall carbon nanotube (SWCNT) material has been developed to increase the degree of consistency among results so that meaningful comparisons can be made among different samples. This improved TGA protocol is suitable for incorporation into the protocol for characterization of carbon nanotube material. In most cases, TGA of carbon nanotube materials is performed in gas mixtures that contain oxygen at various concentrations. The improved protocol is summarized.
DNA Extraction from Protozoan Oocysts/Cysts in Feces for Diagnostic PCR
2014-01-01
PCR detection of intestinal protozoa is often restrained by poor DNA recovery or by inhibitors present in feces. The need for an extraction protocol that can overcome these obstacles is therefore clear. The QIAamp® DNA Stool Mini Kit (Qiagen) was evaluated for its ability to recover DNA from oocysts/cysts directly from feces. Twenty-five Giardia-positive, 15 Cryptosporidium-positive, 15 Entamoeba histolytica-positive, and 45 protozoa-free samples were processed as controls by microscopy and immunoassay tests. DNA extracts were amplified using 3 sets of published primers. Following the manufacturer's protocol, the kit showed sensitivity and specificity of 100% towards Giardia and Entamoeba. However, for Cryptosporidium, the sensitivity and specificity were 60% (9/15) and 100%, respectively. A series of optimization experiments involving various steps of the kit's protocol was conducted using Cryptosporidium-positive samples. The best DNA recoveries were obtained by raising the lysis temperature to the boiling point for 10 min and the incubation time of the InhibitEX tablet to 5 min. Using pre-cooled ethanol for nucleic acid precipitation and a small elution volume (50-100 µl) was also valuable. The sensitivity of the amended protocol for Cryptosporidium was raised to 100%. Cryptosporidium DNA was successfully amplified by either the first or the second primer set. When applied to parasite-free feces spiked with variable oocyst/cyst counts, ≈2 oocysts/cysts were theoretically enough for detection by PCR. To conclude, the Qiagen kit with the amended protocol proved suitable for protozoan DNA extraction directly from feces and supports PCR diagnosis. PMID:25031466
DNA extraction from protozoan oocysts/cysts in feces for diagnostic PCR.
Hawash, Yousry
2014-06-01
Sanchez, Sophie; Fernandez, Vincent; Pierce, Stephanie E; Tafforeau, Paul
2013-09-01
Propagation phase-contrast synchrotron radiation microtomography (PPC-SRμCT) has proved to be very successful for examining fossils. Because fossils range widely in taphonomic preservation, size, shape and density, X-ray computed tomography protocols are constantly being developed and refined. Here we present a 1-h procedure that combines a filtered high-energy polychromatic beam with long-distance PPC-SRμCT (sample to detector: 4-16 m) and an attenuation protocol normalizing the absorption profile (tested on 13-cm-thick and 5.242 g cm(-3) locally dense samples but applicable to 20-cm-thick samples). This approach provides high-quality imaging results, which show marked improvement relative to results from images obtained without the attenuation protocol in apparent transmission, contrast and signal-to-noise ratio. The attenuation protocol involves immersing samples in a tube filled with aluminum or glass balls in association with a U-shaped aluminum profiler. This technique therefore provides access to a larger dynamic range of the detector used for tomographic reconstruction. This protocol homogenizes beam-hardening artifacts, thereby rendering it effective for use with conventional μCT scanners.
FIELD SAMPLING PROTOCOLS AND ANALYSIS
I have been asked to speak again to the environmental science class regarding actual research scenarios related to my work at Kerr Lab. I plan to discuss sampling protocols along with various field analyses performed during sampling activities. Many of the students have never see...
da Silva, Fabiana Alves; Vidal, Cláudia Fernanda de Lacerda; de Araújo, Ednaldo Cavalcante
2015-01-01
Abstract Objective: to validate the content of the prevention protocol for early sepsis caused by Streptococcus agalactiae in newborns. Method: a cross-sectional, descriptive and methodological study with a quantitative approach. The sample was composed of 15 judges: 8 obstetricians and 7 pediatricians. The validation occurred through the judges' assessment of the protocol's content; each received the data collection instrument, a checklist containing 7 items representing the requisites to be met by the protocol. Content validation was achieved by applying the Content Validity Index. Result: in the judging process, all items representing requirements addressed by the protocol obtained concordance within the established level (Content Validity Index > 0.75). Of the 7 items, 6 obtained full concordance (Content Validity Index 1.0) and the feasibility item obtained a Content Validity Index of 0.93. The global assessment of the instrument obtained a Content Validity Index of 0.99. Conclusion: the content validation performed was an efficient tool for adjusting the protocol according to the judgment of experienced professionals, which demonstrates the importance of validating instruments beforehand. It is expected that this study will serve as an incentive for the adoption of universal screening by other institutions through validated protocols. PMID:26444165
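The Content Validity Index used above is, at item level, the proportion of judges rating an item as relevant. A minimal sketch (the rating scale and relevance cutoff are assumptions; only the 15-judge panel size and the 0.93 value come from the abstract):

```python
# Item-level CVI: fraction of judges whose rating counts as "relevant"
# (here assumed to be 3 or 4 on a 4-point scale).

def item_cvi(ratings, relevant=(3, 4)):
    return sum(r in relevant for r in ratings) / len(ratings)

def scale_cvi(items):
    """Average of the item-level CVIs (S-CVI/Ave)."""
    return sum(item_cvi(r) for r in items) / len(items)

# Hypothetical panel of 15 judges on one item: 14 rate it relevant, 1 does not.
ratings = [4] * 14 + [2]
print(round(item_cvi(ratings), 2))  # 0.93, matching the feasibility item's CVI
```

One way to read the study's result: 14 of 15 judges agreeing yields 14/15 ≈ 0.93, which clears the 0.75 acceptance level.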
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8% to ±2% on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure of biomass concentration so that errors from morphological changes can be excluded.
The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability, as well as experimental throughput simultaneously minimizing the needed hands-on-time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency for setting up algae lipid production processes.
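The gravimetric calibration step described above amounts to fitting fluorescence readings against lipid content measured by the extractive reference method, then inverting the fit for new samples. A minimal least-squares sketch with made-up numbers (none of these values are from the paper):

```python
# Ordinary least-squares line fit, pure Python for self-containment.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Illustrative calibration: Nile red fluorescence (a.u.) vs. gravimetric
# lipid content (g/L) from a hypothetical extractive reference.
fluor = [100, 200, 300, 400]
lipid = [0.5, 1.0, 1.5, 2.0]
slope, intercept = fit_line(fluor, lipid)

# Convert a new fluorescence reading into an absolute lipid value.
print(round(slope * 250 + intercept, 2))  # 1.25 g/L for a reading of 250
```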
21 CFR 660.46 - Samples; protocols; official release.
Code of Federal Regulations, 2011 CFR
2011-04-01
21 CFR 660.46, Food and Drugs (2011-04-01 edition), Food and Drug Administration, Department of Health and Human Services: Samples; protocols; official release. ...a sample of product not iodinated with 125I means a sample from each filling of each lot packaged as...
do Nascimento, Cássio; dos Santos, Janine Navarro; Pedrazzi, Vinícius; Pita, Murillo Sucena; Monesi, Nadia; Ribeiro, Ricardo Faria; de Albuquerque, Rubens Ferreira
2014-01-01
Molecular diagnostic methods have been largely used in epidemiological or clinical studies to detect and quantify microbial species that may colonize the oral cavity in health or disease. The preservation of genetic material from samples remains the major challenge to ensure the feasibility of these methodologies. Long-term storage may compromise the final result. The aim of this study was to evaluate the effect of storage temperature and time on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization. Saliva and supragingival biofilm were taken from 10 healthy subjects, aliquoted (n=364) and processed according to the proposed protocols: immediate processing and processing after 2 or 4 weeks, and 6 or 12 months of storage at 4°C, -20°C and -80°C. Both total and individual microbial counts were lower for samples processed after 12 months of storage, irrespective of the temperatures tested. Samples stored up to 6 months at cold temperatures showed counts similar to those immediately processed. The microbial incidence was also significantly reduced in samples stored for 12 months at all temperatures. Temperature and time of oral sample storage have a relevant impact on the detection and quantification of bacterial and fungal species by the Checkerboard DNA-DNA hybridization method. Samples should be processed immediately after collection, or within 6 months if conserved at cold temperatures, to avoid false-negative results. Copyright © 2013 Elsevier Ltd. All rights reserved.
King, Brendon; Fanok, Stella; Phillips, Renae; Swaffer, Brooke
2015-01-01
Cryptosporidium continues to be problematic for the water industry, with risk assessments often indicating that treatment barriers may fail under extreme conditions. However, risk analyses have historically used oocyst densities and not considered either oocyst infectivity or species/genotype, which can result in an overestimation of risk if the oocysts are not human infective. We describe an integrated assay for determining oocyst density, infectivity, and genotype from a single-sample concentrate, an important advance that overcomes the need for processing multiple-grab samples or splitting sample concentrates for separate analyses. The assay incorporates an oocyst recovery control and is compatible with standard primary concentration techniques. Oocysts were purified from primary concentrates using immunomagnetic separation prior to processing by an infectivity assay. Plate-based cell culture was used to detect infectious foci, with a monolayer washing protocol developed to allow recovery and enumeration of oocysts. A simple DNA extraction protocol was developed to allow typing of any wells containing infectious Cryptosporidium. Water samples from a variety of source water and wastewater matrices, including a semirural catchment, wastewater, an aquifer recharge site, and storm water, were analyzed using the assay. Results demonstrate that the assay can reliably determine oocyst densities, infectivity, and genotype from single-grab samples for a variety of water matrices and emphasize the varying nature of Cryptosporidium risk extant throughout source waters and wastewaters. This assay should therefore enable a more comprehensive understanding of Cryptosporidium risk for different water sources, assisting in the selection of appropriate risk mitigation measures. PMID:25769833
King, Brendon; Fanok, Stella; Phillips, Renae; Swaffer, Brooke; Monis, Paul
2015-05-15
Droplet-based pyrosequencing using digital microfluidics.
Boles, Deborah J; Benton, Jonathan L; Siew, Germaine J; Levy, Miriam H; Thwar, Prasanna K; Sandahl, Melissa A; Rouse, Jeremy L; Perkins, Lisa C; Sudarsan, Arjun P; Jalili, Roxana; Pamula, Vamsee K; Srinivasan, Vijay; Fair, Richard B; Griffin, Peter B; Eckhardt, Allen E; Pollack, Michael G
2011-11-15
The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., "sample-to-sequence" capability) could eventually be achieved using this low-cost platform.
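The homopolymer linearity noted above is what makes pyrogram decoding possible: the light intensity of each nucleotide flow scales roughly linearly with the number of identical bases incorporated, so each peak can be rounded to the nearest integer multiple of the single-base signal. A minimal sketch with made-up flow intensities (not data from the paper):

```python
def call_bases(flows, unit=1.0):
    """Decode a pyrogram. `flows` is a list of (nucleotide, peak_intensity)
    pairs in flow order; `unit` is the calibrated single-base signal."""
    seq = []
    for base, peak in flows:
        # Peak height ~ homopolymer length; zero-height flows add nothing.
        seq.append(base * round(peak / unit))
    return "".join(seq)

# Hypothetical flow order and intensities.
flows = [("T", 1.02), ("A", 0.05), ("C", 2.10), ("G", 0.97), ("A", 3.05)]
print(call_bases(flows))  # TCCGAAA
```

Real base callers also correct for signal droop and crosstalk between cycles; the rounding step here only illustrates why linear homopolymer response matters for accuracy.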
Droplet-Based Pyrosequencing Using Digital Microfluidics
Boles, Deborah J.; Benton, Jonathan L.; Siew, Germaine J.; Levy, Miriam H.; Thwar, Prasanna K.; Sandahl, Melissa A.; Rouse, Jeremy L.; Perkins, Lisa C.; Sudarsan, Arjun P.; Jalili, Roxana; Pamula, Vamsee K.; Srinivasan, Vijay; Fair, Richard B.; Griffin, Peter B.; Eckhardt, Allen E.; Pollack, Michael G.
2013-01-01
Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis
Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej
2016-01-01
Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297
Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis.
Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej
2016-12-16
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove that these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
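The vertical composition idea can be sketched informally: each layer wraps outgoing messages and unwraps incoming ones, and the stack behaves as the composition of its layers. This is an illustrative toy, not the PVS model from the paper, and the layer names are invented.

```python
class Layer:
    """A protocol layer that tags messages on send and strips the tag,
    after checking it, on receive."""
    def __init__(self, tag):
        self.tag = tag

    def send(self, msg):
        return f"{self.tag}({msg})"

    def receive(self, msg):
        assert msg.startswith(self.tag + "(") and msg.endswith(")")
        return msg[len(self.tag) + 1:-1]

stack = [Layer("transport"), Layer("session")]  # hypothetical layer names
wire = "payload"
for layer in stack:            # outbound: inner layers wrap first
    wire = layer.send(wire)
for layer in reversed(stack):  # inbound: outer layers unwrap first
    wire = layer.receive(wire)
print(wire)  # payload
```

The verification task in the paper is the formal analogue of the round-trip property shown here: proving that each layer's guarantee is preserved when the layers are composed.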
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, Eugene H.
2008-10-01
The origin of the approximate 24-hour urine sampling protocol used at Hanford for routine bioassay is attributed to an informal study done in the mid-1940s. While the actual data were never published and have been lost, anecdotal recollections by staff involved in the initial bioassay program design and administration suggest that the sampling protocol had a solid scientific basis. Numerous alternate methods for normalizing partial day samples to represent a total 24-hour collection have since been proposed and used, but no one method is obviously preferred.
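The idea of normalizing a partial-day sample to represent a 24-hour collection, mentioned above, can be sketched with the simplest candidate method: time-proportional scaling. This is an illustrative assumption only, not the Hanford protocol or any of the proposed alternative normalization methods.

```python
# Hedged sketch: scale the activity found in a partial urine collection to a
# 24-hour equivalent in proportion to the collection time. Real bioassay
# programs may instead normalize by volume, creatinine, or other methods.

def normalize_to_24h(measured_activity, collection_hours):
    """Scale activity in a partial collection to a 24-hour equivalent."""
    if not 0 < collection_hours <= 24:
        raise ValueError("collection period must be within (0, 24] hours")
    return measured_activity * 24.0 / collection_hours

# A 12-hour overnight sample containing 5.0 units would be reported as
# 10.0 units per 24 h under this scheme.
assert normalize_to_24h(5.0, 12) == 10.0
```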
KatharoSeq Enables High-Throughput Microbiome Analysis from Low-Biomass Samples
Minich, Jeremiah J.; Zhu, Qiyun; Janssen, Stefan; Hendrickson, Ryan; Amir, Amnon; Vetter, Russ; Hyde, John; Doty, Megan M.; Stillwell, Kristina; Benardini, James; Kim, Jae H.; Allen, Eric E.
2018-01-01
Microbiome analyses of low-biomass samples are challenging because of contamination and inefficiencies, leading many investigators to employ low-throughput methods with minimal controls. We developed a new automated protocol, KatharoSeq (from the Greek katharos [clean]), that outperforms single-tube extractions while processing samples at least five times faster. KatharoSeq incorporates positive and negative controls to reveal the whole bacterial community from inputs of as few as 50 cells and correctly identifies 90.6% (standard error, 0.013%) of the reads from 500 cells. To demonstrate the broad utility of KatharoSeq, we performed 16S rRNA amplicon and shotgun metagenome analyses of the Jet Propulsion Laboratory spacecraft assembly facility (SAF; n = 192, 96), 52 rooms of a neonatal intensive care unit (NICU; n = 388, 337), and an endangered-abalone-rearing facility (n = 192, 123), obtaining spatially resolved, unique microbiomes reproducible across hundreds of samples. The SAF, our primary focus, contains 32 sOTUs (sub-OTUs, defined as exact sequence matches) and their inferred variants identified by the deblur algorithm, with four (Acinetobacter lwoffii, Paracoccus marcusii, Mycobacterium sp., and Novosphingobium) being present in >75% of the samples. According to microbial spatial topography, the most abundant cleanroom contaminant, A. lwoffii, is related to human foot traffic exposure. In the NICU, we have been able to discriminate environmental exposure related to patient infectious disease, and in the abalone facility, we show that microbial communities reflect the marine environment rather than human input. Consequently, we demonstrate the feasibility and utility of large-scale, low-biomass metagenomic analyses using the KatharoSeq protocol. IMPORTANCE Various indoor, outdoor, and host-associated environments contain small quantities of microbial biomass and represent a niche that is often understudied because of technical constraints.
Many studies that attempt to evaluate these low-biomass microbiome samples are riddled with erroneous results that are typically false positive signals obtained during the sampling process. We have investigated various low-biomass kits and methods to determine the limit of detection of these pipelines. Here we present KatharoSeq, a high-throughput protocol combining laboratory and bioinformatic methods that can differentiate a true positive signal in samples with as few as 50 to 500 cells. We demonstrate the application of this method in three unique low-biomass environments, including a SAF, a hospital NICU, and an abalone-rearing facility. PMID:29577086
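The limit-of-detection reasoning described above can be sketched as follows: given a positive-control dilution series, find the smallest cell input at which the fraction of reads assigned to the control organism meets a threshold. The abstract reports 90.6% correct reads at 500 cells; the series below is invented for illustration and is not KatharoSeq's actual algorithm.

```python
# Illustrative sketch: determine a detection limit from a dilution series of
# positive controls, where each entry pairs the cell input with the fraction
# of sequencing reads correctly assigned to the control organism.

def limit_of_detection(series, threshold=0.9):
    """Smallest cell input whose fraction of correct reads meets the threshold.

    series: iterable of (input_cells, fraction_correct_reads) pairs.
    Returns None if no dilution passes.
    """
    passing = [cells for cells, frac in series if frac >= threshold]
    return min(passing) if passing else None

dilution_series = [(10000, 0.99), (1000, 0.95), (500, 0.91), (50, 0.40)]
assert limit_of_detection(dilution_series) == 500
```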
Feine, Ilan; Shpitzen, Moshe; Geller, Boris; Salmon, Eran; Peleg, Tsach; Roth, Jonathan; Gafny, Ron
2017-07-01
Electrical tapes (ETs) are a common component of improvised explosive devices (IEDs) used by terrorists or criminal organizations and represent a valuable forensic resource for DNA and latent fingerprint recovery. However, DNA recovery rates are typically low and usually below the minimal amount required for amplification. In addition, most DNA extraction methods are destructive and do not allow further latent fingerprint development. In the present study, a cell-culture-based touch DNA model was used to demonstrate a two-step acetone-water DNA recovery protocol from ETs. This protocol involves only the adhesive side of the ET and increases DNA recovery rates by up to 70%. In addition, we demonstrated partially successful latent fingerprint development from the non-sticky side of the ETs. Taken together, this protocol maximizes the forensic examination of ETs and is recommended for routine casework processing.
Robert-Peillard, Fabien; Boudenne, Jean-Luc; Coulomb, Bruno
2014-05-01
This paper presents a simple, accurate, multi-sample method for the determination of proline in wines using a 96-well microplate technique. Proline is the most abundant amino acid in wine and is an important parameter related to wine characteristics and grape maturation processes. In the current study, an improved application of the general method based on sodium hypochlorite oxidation and o-phthaldialdehyde (OPA)-thiol spectrofluorometric detection is described. The main interfering compounds for specific proline detection in wines are strongly reduced by selective reaction with OPA in a preliminary step under well-defined pH conditions. Application of the protocol after a 500-fold dilution of wine samples provides a working range between 0.02 and 2.90 g/L, with a limit of detection of 7.50 mg/L. Comparison and validation on real wine samples by ion-exchange chromatography prove that this procedure yields accurate results. The simplicity of the protocol, which requires no centrifugation, filtration, organic solvents, or high temperature, enables its full implementation in plastic microplates and efficient application to routine analysis of proline in wines.
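The working range quoted above follows from simple dilution arithmetic: measurements are made on a 500-fold dilution, so the concentration in the wine is the measured value times the dilution factor. The function and sample reading below are illustrative, not part of the published method.

```python
# Back-calculate the proline concentration in undiluted wine from a reading
# taken on the 500-fold-diluted sample. Variable names are illustrative.

DILUTION_FACTOR = 500

def wine_proline_g_per_l(measured_mg_per_l):
    """Proline in the undiluted wine, in g/L, from the diluted reading in mg/L."""
    return measured_mg_per_l * DILUTION_FACTOR / 1000.0  # mg/L -> g/L

# A diluted-sample reading of 2.0 mg/L corresponds to 1.0 g/L in the wine,
# comfortably inside the reported 0.02-2.90 g/L working range.
assert wine_proline_g_per_l(2.0) == 1.0
```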
Wetherbee, Gregory A.; Martin, RoseAnn
2017-02-06
The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components which contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two currently (2016) contracted NADP laboratories, the Central Analytical Laboratory and the Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated-sampler program evaluates the overall variability of NTN measurements and the bias between dissimilar precipitation gages and sample collectors. This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.
Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter
2016-10-01
In recent years, microRNA (miRNA) analysis has come into focus in the field of forensic genetics. Yet no standardized, recommendable protocols for co-isolation of miRNA and DNA from forensically relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk, and handling. Overall, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. Also, for aged and genuine samples of forensically relevant traces, the miRNA and DNA yields were sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. In addition, this system enables flexible sample throughput and labor-saving sample processing with reduced risk of cross-contamination.
Evaluation protocol for amusia: Portuguese sample.
Peixoto, Maria Conceição; Martins, Jorge; Teixeira, Pedro; Alves, Marisa; Bastos, José; Ribeiro, Carlos
2012-12-01
Amusia is a disorder that affects the processing of music. Part of this processing happens in the primary auditory cortex, so the study of this condition allows us to evaluate the central auditory pathways. Our objective was to explore diagnostic evaluation tests for amusia. The authors propose an evaluation protocol for patients with suspected amusia (after brain injury or with complaints of poor musical perception), in parallel with the assessment of central auditory processing already implemented in the department. The Montreal Battery of Evaluation of Amusia was the basis for the selection of the tests. From this comprehensive battery we selected musical examples to evaluate different musical aspects, including memory for and perception of music and the ability to recognize and discriminate music. For memory, there is a test assessing delayed memory, adapted to Portuguese culture. This is a prospective study. Although still experimental, with the possibility of adjustments in the assessment, we believe that this evaluation, combined with the study of central auditory processing, will allow us to understand certain central lesions and congenital or acquired limitations of hearing perception.
Colyar, Jessica M; Eggett, Dennis L; Steele, Frost M; Dunn, Michael L; Ogden, Lynn V
2009-09-01
The relative sensitivity of side-by-side and sequential monadic consumer liking protocols was compared. In the side-by-side evaluation, all samples were presented at once and evaluated together 1 characteristic at a time. In the sequential monadic evaluation, 1 sample was presented and evaluated on all characteristics, then returned before panelists received and evaluated another sample. Evaluations were conducted on orange juice, frankfurters, canned chili, potato chips, and applesauce. Five commercial brands, having a broad quality range, were selected as samples for each product category to assure a wide array of consumer liking scores. Without their knowledge, panelists rated the same 5 retail brands by 1 protocol and then 3 wk later by the other protocol. For 3 of the products, both protocols yielded the same order of overall liking. Slight differences in order of overall liking for the other 2 products were not significant. Of the 50 pairwise overall liking comparisons, 44 were in agreement. The different results obtained by the 2 protocols in order of liking and significance of paired comparisons were due to the experimental variation and differences in sensitivity. Hedonic liking scores were subjected to statistical power analyses and used to calculate minimum number of panelists required to achieve varying degrees of sensitivity when using side-by-side and sequential monadic protocols. In most cases, the side-by-side protocol was more sensitive, thus providing the same information with fewer panelists. Side-by-side protocol was less sensitive in cases where sensory fatigue was a factor.
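The power analysis mentioned above can be sketched with the standard normal-approximation formula for the number of panelists needed to detect a given difference in mean hedonic score. The z-values, standard deviation, and effect size below are illustrative assumptions, not figures from the study.

```python
# Hedged sketch: minimum panelists to detect a mean difference `delta` in
# hedonic score with ~80% power at alpha = 0.05 (two-sided), assuming a
# normal approximation with known score standard deviation `sd`.
import math

def min_panelists(sd, delta, z_alpha=1.96, z_beta=0.84):
    """n = ((z_alpha + z_beta) * sd / delta)^2, rounded up."""
    n = ((z_alpha + z_beta) * sd / delta) ** 2
    return math.ceil(n)

# Detecting a 0.5-point difference on a 9-point hedonic scale with sd = 1.8
# requires about 102 panelists under these assumptions:
assert min_panelists(sd=1.8, delta=0.5) == 102
```

A more sensitive protocol effectively lowers `sd` (less residual variation), which shrinks the required panel size quadratically, matching the abstract's observation that the side-by-side protocol provides the same information with fewer panelists.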
GARBIERI, Thais Francini; BROZOSKI, Daniel Thomas; DIONÍSIO, Thiago José; SANTOS, Carlos Ferreira; NEVES, Lucimara Teixeira das
2017-01-01
Compared to blood collection, saliva has the following advantages: it requires no specialized personnel for collection, allows for remote collection by the patient, is painless and well accepted by participants, has decreased risks of disease transmission, does not clot, can be frozen before DNA extraction, and possibly has a longer storage time. Objective and Material and Methods: This study aimed to compare the quantity and quality of human DNA extracted from saliva that was fresh or frozen for three, six and twelve months using five different DNA extraction protocols: protocol 1 – Oragene™ commercial kit, protocol 2 – QIAamp DNA mini kit, protocol 3 – DNA extraction using ammonium acetate, protocol 4 – Instagene™ Matrix and protocol 5 – Instagene™ Matrix diluted 1:1 using proteinase K and 1% SDS. Briefly, DNA was analyzed using spectrophotometry, electrophoresis and PCR. Results: Storage time typically decreased the DNA quantity, with the exception of protocol 1. The purity of DNA was generally not affected by storage time for the commercial protocols, while the purity of DNA samples extracted by the noncommercial protocols typically decreased when the saliva was stored longer. Only protocol 1 consistently extracted unfragmented DNA samples. In general, DNA samples extracted through protocols 1, 2, 3 and 4, regardless of storage time, were amplified by human-specific primers, whereas protocol 5 produced almost no samples that could be amplified by human-specific primers. Depending on the protocol used, it was possible to extract DNA in high quantity and of good quality from whole saliva, and furthermore, for the purposes of DNA extraction, saliva can be reliably stored for relatively long periods.
Conclusions: In summary, a complicated picture emerges when taking into account the extracted DNA's quantity, purity and quality; depending on a given researcher's needs, one protocol's particular strengths and costs might be the deciding factor for its employment. PMID:28403355
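The spectrophotometric analysis mentioned in the abstract above rests on two standard conversions: for double-stranded DNA, an absorbance at 260 nm (A260) of 1.0 corresponds to roughly 50 ng/µL, and an A260/A280 ratio near 1.8 indicates protein-free DNA. The sample readings below are invented for illustration.

```python
# Standard spectrophotometric estimates of DNA quantity and purity.

def dsdna_ng_per_ul(a260, dilution=1.0):
    """dsDNA concentration from A260 (50 ng/uL per absorbance unit)."""
    return a260 * 50.0 * dilution

def purity_ratio(a260, a280):
    """A260/A280; ~1.8 suggests pure DNA, lower values suggest protein."""
    return a260 / a280

assert dsdna_ng_per_ul(0.5) == 25.0
assert round(purity_ratio(0.5, 0.28), 2) == 1.79
```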
USDA-ARS?s Scientific Manuscript database
Native plant biodiversity loss and exotic species invasions are threatening the ability of many ecosystems to maintain key functions and processes. We currently lack detailed plant biodiversity data at a national scale with which to make management decisions and recommendations based on current cons...
This protocol describes how quality control samples should be handled in the field, and was designed as a quick reference source for the field staff. The protocol describes quality control samples for air-VOCs, air-particles, water samples, house dust, soil, urine, blood, hair, a...
Treweek, Jennifer B; Deverman, Benjamin E; Greenbaum, Alon; Lignell, Antti; Xiao, Cheng; Cai, Long; Ladinsky, Mark S; Bjorkman, Pamela J; Fowlkes, Charless C; Gradinaru, Viviana
2016-01-01
To facilitate fine-scale phenotyping of whole specimens, we describe here a set of tissue fixation-embedding, detergent-clearing and staining protocols that can be used to transform excised organs and whole organisms into optically transparent samples within 1–2 weeks without compromising their cellular architecture or endogenous fluorescence. PACT (passive CLARITY technique) and PARS (perfusion-assisted agent release in situ) use tissue-hydrogel hybrids to stabilize tissue biomolecules during selective lipid extraction, resulting in enhanced clearing efficiency and sample integrity. Furthermore, the macromolecule permeability of PACT- and PARS-processed tissue hybrids supports the diffusion of immunolabels throughout intact tissue, whereas RIMS (refractive index matching solution) grants high-resolution imaging at depth by further reducing light scattering in cleared and uncleared samples alike. These methods are adaptable to difficult-to-image tissues, such as bone (PACT-deCAL), and to magnified single-cell visualization (ePACT). Together, these protocols and solutions enable phenotyping of subcellular components and tracing cellular connectivity in intact biological networks. PMID:26492141
Johnson, Steven M.; Swanson, Robert B.
1994-01-01
Prototype stream-monitoring sites were operated during part of 1992 in the Central Nebraska Basins (CNBR) and three other study areas of the National Water-Quality Assessment (NAWQA) Program of the U.S. Geological Survey. Results from the prototype project provide information needed to operate a network of intensive fixed-station stream-monitoring sites. This report evaluates operating procedures for two NAWQA prototype sites at Maple Creek near Nickerson and the Platte River at Louisville, eastern Nebraska. Each site was sampled intensively in the spring and late summer of 1992, with less intensive sampling in midsummer. In addition, multiple samples were collected during two high-flow periods at the Maple Creek site, one early and the other late in the growing season. Water-sample analyses included determination of pesticides, nutrients, major ions, and suspended sediment, and measurements of physical properties. Equipment and protocols for the water-quality sampling procedures were evaluated. Operation of the prototype stream-monitoring sites included development and comparison of onsite and laboratory sample-processing procedures. Onsite processing was labor intensive but allowed for immediate preservation of all sampled constituents. Laboratory processing required less field labor and decreased the risk of contamination, but allowed for no immediate preservation of the samples.
Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael
2017-08-08
Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing have allowed virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers, and it compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol works not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies, providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.
Zou, Lili; Shen, Kaini; Zhong, Dingrong; Zhou, Daobin; Sun, Wei; Li, Jian
2015-01-01
Laser microdissection followed by mass spectrometry has been successfully used for amyloid typing. However, sample contamination can interfere with proteomic analysis, and overnight digestion limits the analytical throughput. Moreover, current quantitative analysis methods are based on the spectrum count, which ignores differences in protein length and may lead to misdiagnoses. Here, we developed a microwave-assisted filter-aided sample preparation (maFASP) method that can efficiently remove contaminants with a 10-kDa cutoff ultrafiltration unit and can accelerate the digestion process with the assistance of a microwave. Additionally, two parameters (P- and D-scores) based on the exponentially modified protein abundance index were developed to define the existence of amyloid deposits and those causative proteins with the greatest abundance. Using our protocol, twenty cases of systemic amyloidosis that were well-typed according to clinical diagnostic standards (training group) and another twenty-four cases without subtype diagnoses (validation group) were analyzed. Using this approach, sample preparation could be completed within four hours. We successfully subtyped 100% of the cases in the training group, and the diagnostic success rate in the validation group was 91.7%. This maFASP-aided proteomic protocol represents an efficient approach for amyloid diagnosis and subtyping, particularly for serum-contaminated samples. PMID:25984759
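The exponentially modified protein abundance index (emPAI) underlying the P- and D-scores described above is a standard length-corrected abundance measure: emPAI = 10^(observed/observable) − 1, where "observed" is the number of distinct peptides detected and "observable" the number of theoretically detectable peptides for that protein. The exact P- and D-score definitions are specific to the paper and not reproduced here; the protein names and counts below are invented.

```python
# emPAI corrects for protein length: a long protein with many observable
# peptides is not over-ranked just because it yields more spectra.

def empai(observed_peptides, observable_peptides):
    """Exponentially modified protein abundance index."""
    return 10 ** (observed_peptides / observable_peptides) - 1

# Ranking candidate amyloid precursor proteins by emPAI rather than raw
# spectrum counts: protein_B has more peptides observed (12 vs 8) but is
# much longer, so its emPAI is lower.
candidates = {"protein_A": empai(8, 10), "protein_B": empai(12, 40)}
assert candidates["protein_A"] > candidates["protein_B"]
```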
Generalized estimators of avian abundance from count survey data
Royle, J. Andrew
2004-01-01
I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture–recapture, multiple-observer, removal sampling, and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
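For the simple point-count case, the hierarchical structure described above can be sketched concretely: local abundance N_i ~ Poisson(λ), each replicate count y_ij ~ Binomial(N_i, p), and the marginal likelihood of a site's counts sums over the unobserved N. The parameter values below are illustrative, and the truncation bound n_max is a computational convenience, not part of the model.

```python
# Illustrative sketch of the site-level marginal likelihood for an
# "N-mixture" model: sum over latent abundance N the product of the
# Poisson prior and the binomial detection terms for each visit.
import math

def site_likelihood(counts, lam, p, n_max=100):
    """P(counts at one site | lambda, p), summing over latent abundance N."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        prior = math.exp(-lam) * lam ** n / math.factorial(n)    # Poisson(N=n)
        detect = 1.0
        for y in counts:                                         # replicate counts
            detect *= math.comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += prior * detect
    return total

# Likelihood of observing counts (3, 2, 4) on three visits to one site:
lik = site_likelihood([3, 2, 4], lam=5.0, p=0.5)
assert 0.0 < lik < 1.0
```

Maximizing this likelihood over (λ, p) across sites is the "conventional methods" analysis the abstract refers to; covariates enter by letting λ or p depend on site-level predictors.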
Computational fragment-based screening using RosettaLigand: the SAMPL3 challenge
NASA Astrophysics Data System (ADS)
Kumar, Ashutosh; Zhang, Kam Y. J.
2012-05-01
The SAMPL3 fragment-based virtual screening challenge provides a valuable opportunity for researchers to test their programs, methods, and screening protocols in a blind testing environment. We participated in the SAMPL3 challenge and evaluated our virtual fragment screening protocol, which involves RosettaLigand as the core component, by screening a 500-fragment Maybridge library against bovine pancreatic trypsin. Our study reaffirmed that the real test for any virtual screening approach comes in a blind testing environment. The analyses presented in this paper also showed that virtual screening performance can be improved if a set of known active compounds is available and the parameters and methods that yield better enrichment are selected. Our study also highlighted that selecting an appropriate method to calculate partial charges is important for achieving accurate orientations and conformations of ligands within a binding site. Another finding is that using multiple receptor ensembles in docking does not always yield better enrichment than individual receptors. On the basis of our results and retrospective analyses from the SAMPL3 fragment screening challenge, we anticipate that the chances of success in a fragment screening process can be increased significantly with careful selection of receptor structures, protein flexibility, sufficient conformational sampling within the binding pocket, and accurate assignment of ligand and protein partial charges.
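The "enrichment" repeatedly compared above is commonly quantified by the enrichment factor: the fraction of known actives recovered in the top x% of the score-ranked library, divided by x%. This sketch uses an invented ranked list and active set; it is the generic metric, not the paper's specific evaluation code.

```python
# Enrichment factor at a given fraction of the ranked screening library.

def enrichment_factor(ranked_ids, actives, top_fraction):
    """EF = (actives found in top slice / total actives) / top_fraction."""
    cutoff = max(1, int(len(ranked_ids) * top_fraction))
    found = sum(1 for mol in ranked_ids[:cutoff] if mol in actives)
    return (found / len(actives)) / top_fraction

ranked = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j"]  # best score first
actives = {"a", "c", "j"}
# Top 20% (2 molecules) recovers 1 of 3 actives: EF = (1/3)/0.2 ~ 1.67,
# better than random screening (EF = 1).
assert round(enrichment_factor(ranked, actives, 0.2), 2) == 1.67
```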
A MORE COST-EFFECTIVE EMAP-ESTUARIES BENTHIC MACROFAUNAL SAMPLING PROTOCOL
The standard benthic macrofaunal sampling protocol in the U.S. Environmental Protection Agency's Pacific Coast Environmental Monitoring and Assessment Program (EMAP) is to collect a minimum of 30 random benthic samples per reporting unit (e.g., estuary) using a 0.1 m2 grab and to...
It's Time to Develop a New "Draft Test Protocol" for a Mars Sample Return Mission (or Two....)
NASA Astrophysics Data System (ADS)
Rummel, J. D.
2018-04-01
A Mars Sample Return (MSR) will involve analysis of those samples in containment, including their safe receiving, handling, testing, and archiving. With an MSR planned for the end of the next decade, it is time to update the existing MSR protocol.
Beno, Sarah M; Stasiewicz, Matthew J; Andrus, Alexis D; Ralyea, Robert D; Kent, David J; Martin, Nicole H; Wiedmann, Martin; Boor, Kathryn J
2016-12-01
Pathogen environmental monitoring programs (EMPs) are essential for food processing facilities of all sizes that produce ready-to-eat food products exposed to the processing environment. We developed, implemented, and evaluated EMPs targeting Listeria spp. and Salmonella in nine small cheese processing facilities, including seven farmstead facilities. Individual EMPs with monthly sample collection protocols were designed specifically for each facility. Salmonella was detected in only one facility, with likely introduction from the adjacent farm indicated by pulsed-field gel electrophoresis data. Listeria spp. were isolated from all nine facilities during routine sampling. The overall Listeria spp. (other than Listeria monocytogenes ) and L. monocytogenes prevalences in the 4,430 environmental samples collected were 6.03 and 1.35%, respectively. Molecular characterization and subtyping data suggested persistence of a given Listeria spp. strain in seven facilities and persistence of L. monocytogenes in four facilities. To assess routine sampling plans, validation sampling for Listeria spp. was performed in seven facilities after at least 6 months of routine sampling. This validation sampling was performed by independent individuals and included collection of 50 to 150 samples per facility, based on statistical sample size calculations. Two of the facilities had a significantly higher frequency of detection of Listeria spp. during the validation sampling than during routine sampling, whereas two other facilities had significantly lower frequencies of detection. This study provides a model for a science- and statistics-based approach to developing and validating pathogen EMPs.
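One common form of the statistical sample-size calculation mentioned above is the number of environmental samples needed to detect at least one positive with a given confidence, assuming independent samples and a known prevalence. The prevalence and confidence values below are illustrative assumptions, not the study's actual calculation.

```python
# Hedged sketch: smallest n such that P(at least one positive) >= confidence
# when each sample is positive independently with probability `prevalence`.
import math

def samples_needed(prevalence, confidence=0.95):
    """Solve 1 - (1 - prevalence)^n >= confidence for the smallest integer n."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))

# At ~6% prevalence (similar to the Listeria spp. rate reported above),
# about 49 samples give 95% confidence of catching at least one positive.
assert samples_needed(0.06) == 49
```

This is why the validation rounds above used 50 to 150 samples per facility: smaller panels would have had little chance of detecting a low-prevalence contaminant.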
Improving Leishmania Species Identification in Different Types of Samples from Cutaneous Lesions
Cruz-Barrera, Mónica L.; Ovalle-Bracho, Clemencia; Ortegon-Vergara, Viviana; Pérez-Franco, Jairo E.
2015-01-01
The discrimination of Leishmania species from patient samples has epidemiological and clinical relevance. In this study, different gene target PCR-restriction fragment length polymorphism (RFLP) protocols were evaluated for their robustness as Leishmania species discriminators in 61 patients with cutaneous leishmaniasis. We modified the hsp70-PCR-RFLP protocol and found it to be the most reliable protocol for species identification. PMID:25609727
Assessment of an improved bone washing protocol for deceased donor human bone.
Eagle, M J; Man, J; Rooney, P; Hogg, P; Kearney, J N
2015-03-01
NHSBT Tissue Services issues bone to surgeons in the UK in two formats: fresh-frozen unprocessed bone from living donors and processed bone from deceased donors. Processed bone may be frozen or freeze-dried, and all processed bone is currently subjected to a washing protocol to remove blood and bone marrow. In this study we improved the current bone washing protocol for cancellous bone and assessed its success by measuring the removal of the bone marrow components (soluble protein, DNA and haemoglobin) at each step in the process, and the residual components in the bone at the end of the process. The bone washing protocol is a combination of sonication, warm water washes, centrifugation and chemical (ethanol and hydrogen peroxide) treatments. We report that the bone washing protocol is capable of removing up to 99.85% of soluble protein, 99.95% of DNA and 100% of haemoglobin from bone. The new bone washing protocol does not render any bone cytotoxic, as shown by contact cytotoxicity assays. No microbiological cell growth was detected in any of the wash steps. This process is now in use for processed cancellous bone issued by NHSBT.
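The removal percentages reported above follow from simple mass-balance arithmetic: if the amount of a marker recovered in each wash step and the residual left in the bone are both measured, percent removal is the washed-out total over the overall total. The per-step values below are invented for illustration.

```python
# Percent removal of a bone-marrow marker (e.g. DNA, in arbitrary units)
# from per-wash-step recoveries plus the residual measured in the bone.

def percent_removed(wash_amounts, residual):
    washed_out = sum(wash_amounts)
    return 100.0 * washed_out / (washed_out + residual)

# Four wash steps recovering progressively less marker, 0.05 units residual:
assert round(percent_removed([60.0, 25.0, 10.0, 4.95], 0.05), 2) == 99.95
```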
Zhang, Heng; Lan, Fang; Shi, Yupeng; Wan, Zhi-Gang; Yue, Zhen-Feng; Fan, Fang; Lin, Yan-Kui; Tang, Mu-Jin; Lv, Jing-Zhang; Xiao, Tan; Yi, Changqing
2014-06-15
VitaFast® test kits, designed for microbiological assays in microtiter plate format, can be applied to the quantitative determination of B-group water-soluble vitamins such as vitamin B12, folic acid and biotin. Compared to traditional microbiological methods, VitaFast® kits significantly reduce sample processing time and provide greater reliability, higher productivity and better accuracy. Recently, simultaneous determination of vitamin B12, folic acid and biotin in one sample became urgently required when evaluating the quality of infant formulae in our practical work. However, the present sample preparation protocols, which were developed for individual test systems, are incompatible with simultaneous determination of several analytes. To solve this problem, a novel "three-in-one" sample preparation method is herein developed for simultaneous determination of B-group water-soluble vitamins using VitaFast® kits. The performance of this novel "three-in-one" sample preparation method was systematically evaluated by comparison with the individual sample preparation protocols. The experimental results of the assays employing the "three-in-one" sample preparation method were in good agreement with those obtained from conventional VitaFast® extraction methods, indicating that the proposed "three-in-one" sample preparation method is applicable to the present three VitaFast® vitamin test systems, thus offering a promising alternative to the three independent sample preparation methods. The proposed new sample preparation method will significantly improve the efficiency of infant formulae inspection.
Salvo-Chirnside, Eliane; Kane, Steven; Kerr, Lorraine E
2011-12-02
Hennig, Bianca P.; Velten, Lars; Racke, Ines; Tu, Chelsea Szu; Thoms, Matthias; Rybin, Vladimir; Besir, Hüseyin; Remans, Kim; Steinmetz, Lars M.
2017-01-01
Efficient preparation of high-quality sequencing libraries that well represent the biological sample is a key step for using next-generation sequencing in research. Tn5 enables fast, robust, and highly efficient processing of limited input material while scaling to the parallel processing of hundreds of samples. Here, we present a robust Tn5 transposase purification strategy based on an N-terminal His6-Sumo3 tag. We demonstrate that libraries prepared with our in-house Tn5 are of the same quality as those processed with a commercially available kit (Nextera XT), while they dramatically reduce the cost of large-scale experiments. We introduce improved purification strategies for two versions of the Tn5 enzyme. The first version carries the previously reported point mutations E54K and L372P, and stably produces libraries of constant fragment size distribution, even if the Tn5-to-input molecule ratio varies. The second Tn5 construct carries an additional point mutation (R27S) in the DNA-binding domain. This construct allows for adjustment of the fragment size distribution based on enzyme concentration during tagmentation, a feature that opens new opportunities for use of Tn5 in customized experimental designs. We demonstrate the versatility of our Tn5 enzymes in different experimental settings, including a novel single-cell polyadenylation site mapping protocol as well as ultralow input DNA sequencing. PMID:29118030
A hybrid approach to device integration on a genetic analysis platform
NASA Astrophysics Data System (ADS)
Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul
2012-10-01
Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms, where discrete components are combined in a single platform, are a suitable approach to integration where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices, each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.
Piroth, Tobias; Pauly, Marie-Christin; Schneider, Christian; Wittmer, Annette; Möllers, Sven; Döbrössy, Máté; Winkler, Christian; Nikkhah, Guido
2014-01-01
Restorative cell therapy concepts in neurodegenerative diseases are aimed at replacing lost neurons. Despite advances in research on pluripotent stem cells, fetal tissue from routine elective abortions is still regarded as the only safe cell source. Progenitor cells isolated from distinct first-trimester fetal CNS regions have already been used in clinical trials and will be used again in a new multicenter trial funded by the European Union (TRANSEURO). Bacterial contamination of human fetal tissue poses a potential risk of causing infections in the brain of the recipient. Thus, effective methods of microbial decontamination, and validation of these methods, are required prior to approval of a neurorestorative cell therapy trial. We have developed a protocol consisting of subsequent washing steps at different stages of tissue processing. The efficacy of microbial decontamination was assessed on rat embryonic tissue incubated with high concentrations of defined microbe solutions, including representative bacterial and fungal species. Experimental microbial contamination was reduced by several orders of magnitude. Subsequently, we analyzed the spectrum of microbial contamination and the effect of the subsequent washing steps on aborted human fetal tissue; 47.7% of the samples taken during human fetal tissue processing were positive for microbial contamination, but after washing, no sample exhibited bacterial growth. Our data suggest that human fetal tissue for neural repair can carry microbes of various species, highlighting the need for decontamination procedures. The decontamination protocol described in this report has been shown to be effective, as no microbes could be detected at the end of the procedure.
Calibrated work function mapping by Kelvin probe force microscopy
NASA Astrophysics Data System (ADS)
Fernández Garrillo, Pablo A.; Grévin, Benjamin; Chevalier, Nicolas; Borowik, Łukasz
2018-04-01
We propose and demonstrate the implementation of an alternative work function tip calibration procedure for Kelvin probe force microscopy under ultrahigh vacuum, using monocrystalline metallic materials with known crystallographic orientation as reference samples instead of the often-used highly oriented pyrolytic graphite calibration sample. The implementation of this protocol allows the acquisition of absolute and reproducible work function values, with improved uncertainty with respect to protocols based on unprepared highly oriented pyrolytic graphite. The developed protocol allows the local investigation of absolute work function values over nanostructured samples and can be applied to the characterization of electronic structures and devices, as demonstrated on a nanostructured semiconductor sample presenting Al0.7Ga0.3As and GaAs layers of variable thickness. Additionally, using our protocol we find that the work function of annealed highly oriented pyrolytic graphite is 4.6 ± 0.03 eV.
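The calibration logic above reduces to simple arithmetic on contact potential difference (CPD) readings. A minimal sketch, assuming the common convention W_sample = W_tip - e*V_CPD (sign conventions differ between instruments, so this must be checked against the specific setup); the HOPG value of 4.6 eV is from the abstract, while all CPD voltages are invented for illustration:

```python
# Hedged sketch: converting KPFM contact potential difference (CPD) readings
# into absolute work function values, assuming W_sample = W_tip - e * V_CPD.
# Working in eV and volts, the elementary charge e is 1 in these units.
E_CHARGE = 1.0

def calibrate_tip(w_reference_eV, v_cpd_reference_V):
    """Recover the tip work function from a reference of known work function."""
    # W_ref = W_tip - e*V_CPD  =>  W_tip = W_ref + e*V_CPD
    return w_reference_eV + E_CHARGE * v_cpd_reference_V

def sample_work_function(w_tip_eV, v_cpd_sample_V):
    """Map a CPD measurement on an unknown sample to an absolute work function."""
    return w_tip_eV - E_CHARGE * v_cpd_sample_V

# Calibrate against annealed HOPG (4.6 eV per the abstract), then map a sample.
w_tip = calibrate_tip(4.6, v_cpd_reference_V=-0.25)                  # 4.35 eV
print(round(sample_work_function(w_tip, v_cpd_sample_V=-0.85), 2))   # 5.2
```

The same two-step pattern (calibrate the tip on a known reference, then invert the CPD on the unknown) applies regardless of which reference material is used.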
The purpose of this project was to investigate the effectiveness of the sample preservation protocol outlined in Method 200.8 in recovering lead from water samples. Lead recoveries were studied in various water samples spiked with lead by evaluating lead sorption and desorption f...
21 CFR 660.36 - Samples and protocols.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Title 21 (Food and Drugs), Volume 7, revised as of 2011-04-01. § 660.36 Samples and protocols. FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS... Research Sample Custodian (ATTN: HFM-672) (see mailing addresses in § 600.2 of this chapter), within 30...
BIASES IN CASTNET FILTER PACK RESULTS ASSOCIATED WITH SAMPLING PROTOCOL
In the current study, single-filter weekly (w) results are compared with weekly results aggregated from day and night (dn) weekly samples. Comparisons of the two sampling protocols for all major constituents (SO₄²⁻, NO₃⁻, NH₄⁺, HNO₃, and SO₂) show median bias (MB) of < 5 nmol m⁻³...
Communication-Gateway Software For NETEX, DECnet, And TCP/IP
NASA Technical Reports Server (NTRS)
Keith, B.; Ferry, D.; Fendler, E.
1990-01-01
Communications gateway software, GATEWAY, provides process-to-process communication between remote applications programs in different protocol domains. Communicating peer processes may be resident on any paired combination of NETEX, DECnet, or TCP/IP hosts. Provides necessary mapping from one protocol to another and facilitates practical intermachine communications in cost-effective manner by eliminating need to standardize on single protocol or to implement multiple protocols in host computers. Written in Ada.
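The core of such a gateway is re-encapsulation: strip one protocol domain's framing from an incoming message and re-frame the payload for the other domain, leaving the payload untouched. A toy sketch of that idea follows; the two framings are hypothetical stand-ins, not the actual NETEX, DECnet, or TCP/IP formats, and this is not the Ada GATEWAY code:

```python
# Toy illustration of the gateway idea: re-encapsulating a payload from one
# (hypothetical) protocol framing into another so peer processes in different
# protocol domains can exchange data.
import struct

def decode_fixed_header(frame: bytes) -> bytes:
    """Hypothetical 'domain A' framing: 4-byte big-endian length + payload."""
    (length,) = struct.unpack(">I", frame[:4])
    return frame[4:4 + length]

def encode_newline_framing(payload: bytes) -> bytes:
    """Hypothetical 'domain B' framing: payload terminated by a newline."""
    return payload + b"\n"

def gateway(frame: bytes) -> bytes:
    """Map a domain-A frame to a domain-B frame without touching the payload."""
    return encode_newline_framing(decode_fixed_header(frame))

msg = struct.pack(">I", 5) + b"hello"
print(gateway(msg))  # b'hello\n'
```

Because only the framing is translated, hosts on either side need to implement just their own protocol, which is the cost saving the abstract describes.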
Roy R. Rosenberger; Carl J. Houtman
2000-01-01
The USPS Image Analysis (IA) protocol recommends the use of hydrophobic dyes to develop contrast between pressure sensitive adhesive (PSA) particles and cellulosic fibers before using a dirt counter to detect all contaminants that have contrast with the handsheet background. Unless the sample contains no contaminants other than those of interest, two measurement steps...
Butler, Ashleigh; Hall, Helen; Copnell, Beverley
2016-06-01
The qualitative systematic review is a rapidly developing area of nursing research. In order to present trustworthy, high-quality recommendations, such reviews should be based on a review protocol to minimize bias and enhance transparency and reproducibility. Although there are a number of resources available to guide researchers in developing a quantitative review protocol, very few resources exist for qualitative reviews. This paper aims to guide researchers through the process of developing a qualitative systematic review protocol, using an example review question. The key elements required in a systematic review protocol are discussed, with a focus on application to qualitative reviews: development of a research question; formulation of key search terms and strategies; designing a multistage review process; critical appraisal of qualitative literature; development of data extraction techniques; and data synthesis. The paper highlights important considerations during the protocol development process, and uses a previously developed review question as a working example. This paper will assist novice researchers in developing a qualitative systematic review protocol. By providing a worked example of a protocol, the paper encourages the development of review protocols, enhancing the trustworthiness and value of the completed qualitative systematic review findings. Qualitative systematic reviews should be based on well planned, peer reviewed protocols to enhance the trustworthiness of results and thus their usefulness in clinical practice. Protocols should outline, in detail, the processes which will be used to undertake the review, including key search terms, inclusion and exclusion criteria, and the methods used for critical appraisal, data extraction and data analysis to facilitate transparency of the review process.
Additionally, journals should encourage and support the publication of review protocols, and should require reference to a protocol prior to publication of the review results. © 2016 Sigma Theta Tau International.
Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio
2014-07-04
In this work, the capabilities of solid-phase microextraction were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds towards five commercially available SPME coatings was evaluated in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode, which allowed high specificity to be achieved by selecting appropriate precursor-product ion pairs, improving analyte identification. The multivariate approach of experimental design was crucial for optimizing the derivatization reaction, the SPME process and the tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L⁻¹, gave satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels was 10.9%, 7.9% and 7.7%, respectively. Limits of detection and quantification of 4.4 and 8.3 ng L⁻¹ were obtained. The reliable application of the proposed protocol to real drinking water samples confirmed its suitability as an analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.
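The accuracy and precision figures quoted in the abstract follow standard definitions: mean recovery relative to the spiked level, and relative standard deviation of replicate measurements. A hedged sketch of those two figures of merit, with invented replicate values (not the study's data):

```python
# Sketch of the accuracy (recovery %) and precision (RSD %) figures of merit
# used in method validation. Replicate values below are made up; only the
# formulas reflect standard practice.
from statistics import mean, stdev

def recovery_percent(measured, spiked):
    """Mean measured concentration as a percentage of the spiked level."""
    return 100.0 * mean(measured) / spiked

def rsd_percent(measured):
    """Relative standard deviation (sample stdev over mean) as a percentage."""
    return 100.0 * stdev(measured) / mean(measured)

replicates_ng_per_L = [62.0, 70.0, 74.0]  # hypothetical results, 60 ng/L spike
print(round(recovery_percent(replicates_ng_per_L, 60.0), 1))  # 114.4
print(round(rsd_percent(replicates_ng_per_L), 1))             # 8.9
```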
An affordable method to obtain cultured endothelial cells from peripheral blood
Bueno-Betí, Carlos; Novella, Susana; Lázaro-Franco, Macarena; Pérez-Cremades, Daniel; Heras, Magda; Sanchís, Juan; Hermenegildo, Carlos
2013-01-01
The culture of endothelial progenitor cells (EPC) provides an excellent tool for research on EPC biology, vascular regeneration and vasculogenesis. The use of different protocols to obtain EPC cultures makes it difficult to obtain comparable results across groups. This work offers a systematic comparison of the main variables of the most commonly used protocols for EPC isolation, culture and functional evaluation. Peripheral blood samples from healthy individuals were collected and mononuclear cells were cultured. Different recovery and culture conditions were tested: blood volume, blood anticoagulant, coating matrix and percentage of foetal bovine serum (FBS) in the culture media. The success rate of the culture procedure, the time to appearance of the first colonies of endothelial cells, the correlation with the number of circulating EPC (cEPC) and a functional comparison with human umbilical vein endothelial cells (HUVEC) were studied. The use of heparin, a minimum blood volume of 30 ml, fibronectin as a coating matrix and endothelial growing media-2 supplemented with 20% FBS increased the success of obtaining EPC cultures up to 80% of the processed samples while reducing the mean time to EPC colony appearance to a minimum of 13 days. Blood samples exhibiting higher cEPC numbers showed reduced mean time to EPC colony appearance. Cells isolated using this combination were morphologically and phenotypically endothelial cell-like EPCs. Functionally, cultured EPC showed decreased growth and vasculogenic capacity compared to HUVEC. Thus, the above-mentioned conditions allow the isolation and culture of EPC with smaller blood volumes and shorter times than currently used protocols. PMID:24118735
Machado, Michely Ediani; Tomazoni, Fernanda; Casarin, Maísa; Ardenghi, Thiago M; Zanatta, Fabricio Batistin
2017-10-01
To compare the performance of partial-mouth periodontal examination (PMPE) protocols with different cut-off points to the full-mouth examination (FME) in the assessment of the prevalence and extent of gingival bleeding in adolescents. A cross-sectional study was conducted involving 12-year-old adolescents. Following a systematic two-stage cluster sampling process, 1134 individuals were evaluated. Different PMPE protocols were compared to the FME with six sites per tooth. Sensitivity, specificity, area under the ROC curve (AUC), intraclass correlation coefficient (ICC), relative and absolute biases and the inflation factor were assessed for each PMPE protocol with different cut-off points for the severity of gingival bleeding. The highest AUC values were found for the six-site two-diagonal quadrant (2-4) (0.97), six-site random half-mouth (0.95) and Community Periodontal Index (0.95) protocols. The assessment of three sites [mesiobuccal (MB), buccal (B) and distolingual (DL)] in two diagonal quadrants and the random half-mouth protocol had higher sensitivity and lower specificity than the same protocols with distobuccal (DB) sites. However, the use of DB sites led to better specificity and improved the balance between sensitivity and specificity, except for the two-diagonal quadrant (1-3) protocol. The ≥1 cut-off point led to the most discrepant results from the FME. Six-site two-diagonal quadrant (2-4) and random half-mouth assessments perform better in the evaluation of gingival bleeding in adolescents. However, when a faster protocol is needed, a two-diagonal quadrant assessment using only MB, B and DL sites can be used with no important loss of information. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
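Sensitivity and specificity here compare each partial-mouth protocol against the full-mouth examination as the reference standard: every subject is classified by both examinations, and the indices are computed from the resulting 2x2 table. A minimal sketch with invented counts (not the study's data):

```python
# Minimal sketch of scoring a partial-mouth protocol against the full-mouth
# examination (FME, the reference standard) using a 2x2 table.
# All counts below are hypothetical.

def sensitivity(tp, fn):
    """Fraction of FME-positive subjects also flagged by the partial exam."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of FME-negative subjects also negative on the partial exam."""
    return tn / (tn + fp)

# Hypothetical: 420 adolescents positive on FME, 380 detected by the partial
# protocol; 700 negative on FME, 630 also negative on the partial exam.
tp, fn = 380, 40
tn, fp = 630, 70
print(round(sensitivity(tp, fn), 3))  # 0.905
print(round(specificity(tn, fp), 3))  # 0.9
```

The trade-off described in the abstract (adding DB sites lowers sensitivity but raises specificity) is exactly a movement between these two quantities.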
A Combined Fabrication and Instrumentation Platform for Sample Preparation.
Guckenberger, David J; Thomas, Peter C; Rothbauer, Jacob; LaVanway, Alex J; Anderson, Meghan; Gilson, Dan; Fawcett, Kevin; Berto, Tristan; Barrett, Kevin; Beebe, David J; Berry, Scott M
2014-06-01
While potentially powerful, access to molecular diagnostics is substantially limited in the developing world. Here we present an approach to reduced cost molecular diagnostic instrumentation that has the potential to empower developing world communities by reducing costs through streamlining the sample preparation process. In addition, this instrument is capable of producing its own consumable devices on demand, reducing reliance on assay suppliers. Furthermore, this instrument is designed with an "open" architecture, allowing users to visually observe the assay process and make modifications as necessary (as opposed to traditional "black box" systems). This open environment enables integration of microfluidic fabrication and viral RNA purification onto an easy-to-use modular system via the use of interchangeable trays. Here we employ this system to develop a protocol to fabricate microfluidic devices and then use these devices to isolate viral RNA from serum for the measurement of human immunodeficiency virus (HIV) viral load. Results obtained from this method show significantly reduced error compared with similar nonautomated sample preparation processes. © 2014 Society for Laboratory Automation and Screening.
Sørbye, Sveinung Wergeland; Pedersen, Mette Kristin; Ekeberg, Bente; Williams, Merete E Johansen; Sauer, Torill; Chen, Ying
2017-01-01
The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine if inadequate ThinPrep samples could be satisfactory by processing them with the SurePath protocol. A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated "gynecologic" application for cervix cytology samples, and 96 (51.3%) were processed with the "nongynecological" automatic program. Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the "gynecology" program and "nongynecology" program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesion, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women got a diagnosis of cervical intraepithelial neoplasia 2 or higher at a later follow-up. Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the "nongynecology" program to ensure an adequate number of cells.
Hodges, Lisa R; Rose, Laura J; O'Connell, Heather; Arduino, Matthew J
2010-05-01
Twelve Laboratory Response Network (LRN)-affiliated laboratories participated in a validation study of a macrofoam swab protocol for the recovery, detection, and quantification of viable B. anthracis (BA) Sterne spores from steel surfaces. CDC personnel inoculated steel coupons (26 cm²) with 1-4 log₁₀ BA spores and recovered them by sampling with pre-moistened macrofoam swabs. Phase 1 (P1) of the study evaluated swabs containing BA only, while dust and background organisms were added to swabs in Phase 2 (P2) to mimic environmental conditions. Laboratories processed swabs and enumerated spores by culturing eluted swab suspensions and counting colonies with morphology consistent with BA. Processed swabs were placed in enrichment broth, incubated 24 h, and cultured by streaking for isolation. Real-time PCR was performed on selected colonies from P2 samples to confirm the identity of BA. Mean percent recovery (%R) of spores from the surface ranged from 15.8 to 31.0% (P1) and from 27.9 to 55.0% (P2). The highest mean percent recovery was 31.0% (SD 10.9%) for P1 (4 log₁₀ inoculum) and 55.0% (SD 27.6%) for P2 (1 log₁₀ inoculum). The overall %R was higher for P2 (44.6%) than P1 (24.1%), but the overall reproducibility (between-lab variability) was lower in P2 than in P1 (25.0 vs. 16.5 %CV, respectively). The overall precision (within-lab variability) was nearly identical for P1 and P2 (44.0 and 44.1 %CV, respectively) but varied greatly between inoculum levels. The protocol demonstrated linearity in %R over the three inoculum levels and can detect between 26 and 5×10⁶ spores/26 cm². Sensitivity as determined by culture was >98.3% for both phases and all inocula, suggesting that the culture method maintains sensitivity in the presence of contaminants. The enrichment broth method alone was less sensitive for sampled swabs (66.4%) during P2, suggesting that the presence of background organisms inhibited growth or isolation of BA from the broth.
The addition of real-time PCR testing to the assay increased specificity from >85.4% to >95.0% in P2. Although precision was low at the 1 log₁₀ inoculum level in both phases (59.0 and 50.2 %CV), this swab processing protocol was sensitive, specific, precise, and reproducible at 2-4 log₁₀/26 cm² spore concentrations. Published by Elsevier B.V.
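The study's three headline metrics (percent recovery, within-lab precision, and between-lab reproducibility, the latter two as %CV) can be sketched as follows; all colony counts and replicate values here are invented for illustration, not the study's data:

```python
# Sketch of the recovery and variability metrics reported in multi-lab swab
# validation studies: %R per swab, within-lab precision (CV of one lab's
# replicates), and between-lab reproducibility (CV of the lab means).
from statistics import mean, stdev

def percent_recovery(cfu_recovered, cfu_inoculated):
    """Recovered spores as a percentage of the inoculated amount."""
    return 100.0 * cfu_recovered / cfu_inoculated

def cv_percent(values):
    """Coefficient of variation (sample stdev over mean) as a percentage."""
    return 100.0 * stdev(values) / mean(values)

lab_replicates = {               # hypothetical %R values for two labs
    "lab1": [24.0, 30.0, 27.0],
    "lab2": [35.0, 41.0, 38.0],
}
within = {lab: round(cv_percent(v), 1) for lab, v in lab_replicates.items()}
between = round(cv_percent([mean(v) for v in lab_replicates.values()]), 1)
print(round(percent_recovery(81.0, 300.0), 1))  # 27.0
print(within)                                   # per-lab precision
print(between)                                  # between-lab reproducibility
```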
The New York Brain Bank of Columbia University: practical highlights of 35 years of experience.
Ramirez, Etty Paola Cortes; Keller, Christian Ernst; Vonsattel, Jean Paul
2018-01-01
The New York Brain Bank processes brains and organs of clinically well-characterized patients with age-related neurodegenerative diseases and, for comparison, from individuals without neurologic or psychiatric impairments. The donors, whether patients or unaffected individuals, were evaluated at healthcare facilities of Columbia University in New York. Each source brain yields four categories of samples: fresh-frozen blocks and crushed parenchyma, and formalin-fixed wet blocks and histology sections. A source brain is thoroughly evaluated to determine, qualitatively and quantitatively, any changes it might harbor using conventional neuropathologic techniques. The clinical and pathologic diagnoses are integrated to determine the distributive diagnosis assigned to the samples obtained from a source brain. The protocol, first developed in 1981 in response to the evolving requirements of basic research on neurodegeneration, has been gradually standardized. The methods assimilate long-standing experience from multiple centers. The resulting, current protocol includes a constant central core applied to all brains, with conditional flexibility around it. The New York Brain Bank is an integral part of the department of pathology, where expertise, teaching duties, and hardware are shared. Since details of the protocols are available online, this chapter focuses on practical issues in professionalizing brain banking. Copyright © 2018 Elsevier B.V. All rights reserved.
Dormeyer, Wilma; van Hoof, Dennis; Mummery, Christine L; Krijgsveld, Jeroen; Heck, Albert J R
2008-10-01
The identification of (plasma) membrane proteins in cells can provide valuable insights into the regulation of their biological processes. Pluripotent cells such as human embryonic stem cells and embryonal carcinoma cells are capable of unlimited self-renewal and share many of the biological mechanisms that regulate proliferation and differentiation. The comparison of their membrane proteomes will help unravel the biological principles of pluripotency, and the identification of biomarker proteins in their plasma membranes is considered a crucial step to fully exploit pluripotent cells for therapeutic purposes. For these tasks, membrane proteomics is the method of choice, but, as indicated by the scarce identification of membrane and plasma membrane proteins in global proteomic surveys, it is not an easy task. In this minireview, we first describe the general challenges of membrane proteomics. We then review current sample preparation steps and discuss protocols that we found particularly beneficial for the identification of large numbers of (plasma) membrane proteins in human tumour- and embryo-derived stem cells. Our optimized assembled protocol led to the identification of a large number of membrane proteins. However, as the composition of cells and membranes is highly variable, we still recommend adapting the sample preparation protocol to each individual system.
Anastario, Michael P; Rodriguez, Hector P; Gallagher, Patricia M; Cleary, Paul D; Shaller, Dale; Rogers, William H; Bogen, Karen; Safran, Dana Gelb
2010-01-01
Objective: To assess the effect of survey distribution protocol (mail versus handout) on data quality and measurement of patient care experiences. Data Sources/Study Setting: Multisite randomized trial of survey distribution protocols. Analytic sample included 2,477 patients of 15 clinicians at three practice sites in New York State. Data Collection/Extraction Methods: Mail and handout distribution modes were alternated weekly at each site for 6 weeks. Principal Findings: Handout protocols yielded an incomplete distribution rate (74 percent) and lower overall response rates (40 percent versus 58 percent) compared with mail. Handout distribution rates decreased over time and resulted in more favorable survey scores compared with mailed surveys. There were significant mode–physician interaction effects, indicating that data cannot simply be pooled and adjusted for mode. Conclusions: In-office survey distribution has the potential to bias measurement and comparison of physicians and sites on patient care experiences. Incomplete distribution rates observed in-office, together with between-office differences in distribution rates and declining rates over time suggest staff may be burdened by the process and selective in their choice of patients. Further testing with a larger physician and site sample is important to definitively establish the potential role for in-office distribution in obtaining reliable, valid assessment of patient care experiences. PMID:20579126
Haas, Brian J; Papanicolaou, Alexie; Yassour, Moran; Grabherr, Manfred; Blood, Philip D; Bowden, Joshua; Couger, Matthew Brian; Eccles, David; Li, Bo; Lieber, Matthias; MacManes, Matthew D; Ott, Michael; Orvis, Joshua; Pochet, Nathalie; Strozzi, Francesco; Weeks, Nathan; Westerman, Rick; William, Thomas; Dewey, Colin N; Henschel, Robert; LeDuc, Richard D; Friedman, Nir; Regev, Aviv
2013-08-01
De novo assembly of RNA-seq data enables researchers to study transcriptomes without the need for a genome sequence; this approach can be usefully applied, for instance, in research on 'non-model organisms' of ecological and evolutionary importance, cancer samples or the microbiome. In this protocol we describe the use of the Trinity platform for de novo transcriptome assembly from RNA-seq data in non-model organisms. We also present Trinity-supported companion utilities for downstream applications, including RSEM for transcript abundance estimation, R/Bioconductor packages for identifying differentially expressed transcripts across samples and approaches to identify protein-coding genes. In the procedure, we provide a workflow for genome-independent transcriptome analysis leveraging the Trinity platform. The software, documentation and demonstrations are freely available from http://trinityrnaseq.sourceforge.net. The run time of this protocol is highly dependent on the size and complexity of data to be analyzed. The example data set analyzed in the procedure detailed herein can be processed in less than 5 h.
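The protocol pairs Trinity with RSEM for transcript abundance estimation. As a hedged illustration of what length-normalized abundance means downstream, here is the standard TPM (transcripts per million) computation; RSEM itself additionally resolves multi-mapped reads probabilistically, which this sketch ignores, and the counts are made up:

```python
# Hedged sketch of TPM (transcripts per million) normalization: read counts
# are first scaled by transcript length, then the length-normalized values
# are rescaled so they sum to one million.

def tpm(counts, lengths_kb):
    """TPM values for a set of transcripts, given raw counts and lengths in kb."""
    rpk = [c / l for c, l in zip(counts, lengths_kb)]  # reads per kilobase
    scale = sum(rpk) / 1_000_000
    return [r / scale for r in rpk]

counts = [100, 300, 600]       # hypothetical read counts per transcript
lengths_kb = [1.0, 2.0, 3.0]   # transcript lengths in kilobases
print([round(v) for v in tpm(counts, lengths_kb)])  # [222222, 333333, 444444]
```

Because TPM values sum to a constant, they are comparable across samples, which is what makes them useful for the differential expression analyses the protocol describes.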
Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga
2015-01-01
Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low-quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations cause substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems, the low quality of low-diversity samples and sample bleeding, are caused by incorrect detection of clusters on the flowcell during the initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low-diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low-diversity samples. First, we discuss how the low-diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer's, for mixing low-diversity samples with diverse samples and for lowering cluster density, which in our experience consistently produce high-quality data from low-diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low-quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol.
Alternatively, we discuss how analysis can be repeated from saved sequencing images using the Long Template Protocol to increase accuracy. PMID:25860802
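As the authors note, barcode-induced low diversity can be avoided at the design stage by balancing the base composition of the barcode set at every sequencing cycle. A hedged sketch of such a balance check, with a hypothetical barcode set and an arbitrary imbalance threshold (not the paper's code or criteria):

```python
# Hedged sketch: check per-cycle base composition of a barcode set and flag
# cycles where one base dominates. The 0.5 threshold is arbitrary.
from collections import Counter

def cycle_composition(barcodes):
    """Fraction of each base observed at every sequencing cycle."""
    n = len(barcodes)
    return [
        {base: count / n for base, count in Counter(cycle).items()}
        for cycle in zip(*barcodes)  # one tuple of bases per cycle
    ]

def unbalanced_cycles(barcodes, max_fraction=0.5):
    """Indices of cycles where a single base exceeds max_fraction."""
    return [
        i for i, comp in enumerate(cycle_composition(barcodes))
        if max(comp.values()) > max_fraction
    ]

barcodes = ["ACGT", "AGTC", "ATCG", "ACTG"]  # hypothetical in-house barcodes
print(unbalanced_cycles(barcodes))  # [0], since every barcode starts with 'A'
```

A flagged cycle means every cluster on the flowcell emits nearly the same base at that position, which is exactly the condition that confuses cluster calling.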
2011-01-01
Background The OPERA trial is a large cluster randomised trial testing a physical activity intervention to address depression amongst people living in nursing and residential homes for older people. A process evaluation was commissioned alongside the trial, and we report the protocol for this process evaluation here. Challenges included the cognitive and physical ability of the participants, the need to respect the privacy of all home residents, including study non-participants, and the physical structure of the homes. Evaluation activity had to be organised around the structured timetable of the homes, leaving limited opportunities for data collection. The aims of this process evaluation are to provide findings that will assist in the interpretation of the clinical trial results, and to inform potential implementation of the physical activity intervention on a wider scale. Methods/design Quantitative data on the recruitment of homes and individuals are being collected. For homes in the intervention arm, data on the dose and fidelity of the intervention delivered, including individual rates of participation in exercise classes, are collected. In the control homes, uptake and delivery of depression awareness training are monitored. These data will be combined with qualitative data from an in-depth study of a purposive sample of eight homes (six intervention and two control). Discussion Although process evaluations are increasingly funded alongside trials, it is still rare to see the findings published, and even rarer to see the protocol for such an evaluation published. Process evaluations have the potential to assist in interpreting and understanding trial results as well as informing future roll-outs of interventions. If such evaluations are funded, they should also be reported and reviewed in a similar way to the trial outcome evaluation. Trial Registration ISRCTN No: ISRCTN43769277 PMID:21288341
Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries
Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.
2015-01-01
PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706
NASA Astrophysics Data System (ADS)
Basak, Jyotirmoy; Maitra, Subhamoy
2018-04-01
In the device-independent (DI) paradigm, trust assumptions about the devices are removed, and a CHSH test is performed to check the functionality of the devices and thereby certify the security of the protocol. The existing DI protocols assume an infinite number of samples from a theoretical point of view, though this is not practically implementable. For a finite-sample analysis of the existing DI protocols, we may also consider strategies for checking device independence other than the CHSH test. In this direction, we present here a comparative analysis between the CHSH test and the three-party Pseudo-telepathy game for the quantum private query protocol in the DI paradigm that appeared very recently in Maitra et al. (Phys Rev A 95:042344, 2017).
Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.
2013-01-01
A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
Fallah, F; Minaei Chenar, H; Amiri, H; Omodipour, S; Shirbande Ghods, F; Kahrizi, D; Sohrabi, M; Ghorbani, T; Kazemi, E
2017-02-28
High-quality DNA is essential for molecular research, and secondary metabolites can affect both the quantity and the quality of extracted DNA. In the current research, two DNA isolation methods, CTAB and Dellaporta (protocols 1 and 2, respectively), were applied to leaf samples from Cotinus coggygria, Citrus sinensis, and the genus Juglans, whose leaves are rich in secondary metabolites. DNA was isolated from C. coggygria, C. sinensis, and Juglans using the two protocols described above. Good-quality DNA was obtained from all three using protocol 1, while protocol 2 failed to produce usable DNA from these sources. The highest amount of DNA (1.3-1.6) was also obtained using protocol 1. We conclude that protocol 1 may work better for plants rich in secondary metabolites.
Kalim, Shahid; Nazir, Shaista; Khan, Zia Ullah
2013-01-01
Protocols based on newer high-sensitivity Troponin T (hsTropT) assays can rule in a suspected Acute Myocardial Infarction (AMI) as early as 3 hours. We conducted this study to audit adherence to our Trust's newly introduced AMI diagnostic protocol based on paired hsTropT testing at 0 and 3 hours. We retrospectively reviewed data for all patients who had an hsTropT test done between 1st and 7th May 2012. Patients' demographics, use of single or paired samples, the time interval between paired samples, presenting symptoms, and ECG findings were noted, and their means, medians, standard deviations, and proportions were calculated. A total of 66 patients had an hsTropT test done during this period. Mean age was 63.30 +/- 17.46 years and 38 (57.57%) were males. Twenty-four (36.36%) patients had only a single hsTropT sample taken, rather than the protocol-recommended paired samples. Among the 42 (63.63%) patients with paired samples, the mean time interval was 4.41 +/- 5.7 hours. Contrary to the recommendations, 15 (22.73%) had a very long and 2 (3.03%) a very short time interval between the two samples. A subgroup analysis of patients with single samples found only 2 (3.03%) patients with ST-segment elevation, for whom single testing was appropriate. Our study confirmed that in a large number of patients the protocol for paired sampling, or the recommended time interval of 3 hours between the two samples, was not being followed.
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted, and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
Norton, D M; McCamey, M; Boor, K J; Wiedmann, M
2000-03-01
The cold-smoked fish industry was used as a model for the development of a system for monitoring Listeria spp. in foods and in the food processing environment. A total of 214 samples including raw fish, fish during the cold-smoking process, finished product, and environmental samples were collected from three processing facilities over two visits to each facility. Samples were screened for Listeria spp. using the BAX for Screening/Genus Listeria polymerase chain reaction (PCR) system and by culture. Listeria spp., confirmed by the API Listeria test strip or by a PCR assay targeting the L. monocytogenes hlyA gene, were isolated from a total of 89 (41.6%) samples. Of these, 80 samples also tested positive for Listeria spp. using the BAX system. Specifically, 42 (55.3%) environmental samples (n = 76), 11 (25.6%) raw materials samples (n = 43), 20 (35.1%) samples from fish in various stages of processing (n = 57), and 7 (18.4%) finished product samples (n = 38) tested positive for Listeria spp. using the BAX system. Five (4.0%) of the 125 culture-negative samples yielded BAX system-positive results. Listeria isolates from each of nine culture-positive/BAX system-negative samples yielded a positive reaction when tested in pure culture by the BAX system, suggesting that our false-negative results were likely due to the presence of low Listeria numbers in the initial enrichment as opposed to nonreacting isolates. The employment of alternative enrichment protocols, such as the two-step enrichment recommended by the manufacturer, may increase the sensitivity of the assay.
Paperless protocoling of CT and MRI requests at an outpatient imaging center.
Bassignani, Matthew J; Dierolf, David A; Roberts, David L; Lee, Steven
2010-04-01
We created our imaging center (IC) to move outpatient imaging from our busy inpatient imaging suite to an off-site location that is more inviting to ambulatory patients. Nevertheless, patients scanned at our IC still represent the depth and breadth of illness complexity seen in our tertiary care population. Thus, we protocol exams on an individualized basis to ensure that the referring clinician's question is fully answered by the exam performed. Previously, paper-based protocoling was a laborious process for all involved: the IC business office would fax requests to the various reading rooms for protocoling by the subspecialist radiologists, who are 3 miles away at the main hospital. Once protocoled, reading room coordinators would fax the protocoled request back to the IC technical area in preparation for the next day's scheduled exams. At any breakdown in this process (e.g., lost paperwork), patient exams were delayed and clinicians and patients became upset. To improve this process, we developed a paper-free workflow whereby protocoling is accomplished by scanning exam requests into our PACS. Using the common worklist functionality found in most PACS, we created "protocoling worklists" that contain these scanned documents. Radiologists protocol these studies in the PACS worklist (with the added benefit of having all imaging and report data available), and subsequently the technologists can see and act on the protocols they find in PACS. This process has significantly decreased interruptions in our busy reading rooms and decreased rework by IC staff.
Data-driven CT protocol review and management—experience from a large academic hospital.
Zhang, Da; Savage, Cristy A; Li, Xinhua; Liu, Bob
2015-03-01
Protocol review plays a critical role in CT quality assurance, but large numbers of protocols and inconsistent protocol names on scanners and in exam records make thorough protocol review formidable. In this investigation, we report on a data-driven cataloging process that can be used to assist in the reviewing and management of CT protocols. We collected lists of scanner protocols, as well as 18 months of recent exam records, for 10 clinical scanners. We developed computer algorithms to automatically deconstruct the protocol names on the scanner and in the exam records into core names and descriptive components. Based on the core names, we were able to group the scanner protocols into a much smaller set of "core protocols," and to easily link exam records with the scanner protocols. We calculated the percentage of usage for each core protocol, from which the most heavily used protocols were identified. From the percentage-of-usage data, we found that, on average, 18, 33, and 49 core protocols per scanner covered 80%, 90%, and 95%, respectively, of all exams. These numbers are one order of magnitude smaller than the typical numbers of protocols that are loaded on a scanner (200-300, as reported in the literature). Duplicated, outdated, and rarely used protocols on the scanners were easily pinpointed in the cataloging process. The data-driven cataloging process can facilitate the task of protocol review. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
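The cataloging idea described above, deconstructing protocol names into a core name plus descriptive components, grouping by core name, and ranking core protocols by usage, can be sketched in a few lines. The component tokens and name formats below are invented for illustration; the abstract does not give the authors' actual deconstruction rules.

```python
import re
from collections import Counter

# Hypothetical descriptive components stripped from protocol names;
# the real algorithm would use a much richer component list.
DESCRIPTIVE_TOKENS = ("W/O CONTRAST", "W/ CONTRAST", "LEFT", "RIGHT", "ROUTINE")

def core_name(protocol_name):
    """Reduce a protocol name to its core name by removing
    descriptive components and collapsing whitespace."""
    name = protocol_name.upper()
    for token in DESCRIPTIVE_TOKENS:
        name = name.replace(token, "")
    return re.sub(r"\s+", " ", name).strip()

def usage_coverage(exam_records, threshold):
    """Return how many core protocols, ranked by usage, are needed
    to cover `threshold` (e.g. 0.80) of all exam records."""
    counts = Counter(core_name(r) for r in exam_records)
    total = sum(counts.values())
    covered, n = 0, 0
    for _, c in counts.most_common():
        covered += c
        n += 1
        if covered / total >= threshold:
            break
    return n
```

With real exam records, calling `usage_coverage` at 0.80, 0.90, and 0.95 would reproduce the kind of percentage-of-usage summary reported in the abstract.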
Leonard, Susan R.; Mammel, Mark K.; Lacher, David W.
2015-01-01
Culture-independent diagnostics reduce the reliance on traditional (and slower) culture-based methodologies. Here we capitalize on advances in next-generation sequencing (NGS) to apply this approach to food pathogen detection utilizing NGS as an analytical tool. In this study, spiking spinach with Shiga toxin-producing Escherichia coli (STEC) following an established FDA culture-based protocol was used in conjunction with shotgun metagenomic sequencing to determine the limits of detection, sensitivity, and specificity levels and to obtain information on the microbiology of the protocol. We show that an expected level of contamination (∼10 CFU/100 g) could be adequately detected (including key virulence determinants and strain-level specificity) within 8 h of enrichment at a sequencing depth of 10,000,000 reads. We also rationalize the relative benefit of static versus shaking culture conditions and the addition of selected antimicrobial agents, thereby validating the long-standing culture-based parameters behind such protocols. Moreover, the shotgun metagenomic approach was informative regarding the dynamics of microbial communities during the enrichment process, including initial surveys of the microbial loads associated with bagged spinach; the microbes found included key genera such as Pseudomonas, Pantoea, and Exiguobacterium. Collectively, our metagenomic study highlights and considers various parameters required for transitioning to such sequencing-based diagnostics for food safety and the potential to develop better enrichment processes in a high-throughput manner not previously possible. Future studies will investigate new species-specific DNA signature target regimens, rational design of medium components in concert with judicious use of additives, such as antibiotics, and alterations in the sample processing protocol to enhance detection. PMID:26386062
NASA Astrophysics Data System (ADS)
Puget, P.
The reliable and fast detection of chemical or biological molecules, or the measurement of their concentrations in a sample, are key problems in many fields such as environmental analysis, medical diagnosis, or the food industry. There are traditionally two approaches to this problem. The first aims to carry out a measurement in situ in the sample using chemical and biological sensors. The constraints imposed by detection limits, specificity, and in some cases stability are entirely imputed to the sensor. The second approach uses so-called total analysis systems to process the sample according to a protocol made up of different steps, such as extractions, purifications, concentrations, and a final detection stage. The latter is made in better conditions than with the first approach, which may justify the greater complexity of the process. It is this approach that is implemented in most methods for identifying pathogens, whether they be in biological samples (especially for in vitro diagnosis) or samples taken from the environment. The instrumentation traditionally used to carry out these protocols comprises a set of bulky benchtop apparatus, which needs to be plugged into the mains in order to function. However, there are many specific applications (to be discussed in this chapter) for which analysis instruments with the following characteristics are needed: (1) possibility of use outside the laboratory, i.e., instruments that are as small as possible, consume little energy, and are largely insensitive to external conditions of temperature, humidity, vibrations, and so on; (2) possibility of use by non-specialised agents, or even unmanned operation; (3) possibility of handling a large number of samples in a limited time, typically for high-throughput screening applications; and (4) possibility of handling small samples.
At the same time, a high level of performance is required, in particular in terms of (1) the detection limit, which must be as low as possible, (2) specificity, i.e., the ability to detect a particular molecule in a complex mixture, and (3) speed.
Characterization of addressability by simultaneous randomized benchmarking.
Gambetta, Jay M; Córcoles, A D; Merkel, S T; Johnson, B R; Smolin, John A; Chow, Jerry M; Ryan, Colm A; Rigetti, Chad; Poletto, S; Ohki, Thomas A; Ketchen, Mark B; Steffen, M
2012-12-14
The control and handling of errors arising from cross talk and unwanted interactions in multiqubit systems is an important issue in quantum information processing architectures. We introduce a benchmarking protocol that provides information about the amount of addressability present in the system and implement it on coupled superconducting qubits. The protocol consists of randomized benchmarking experiments run both individually and simultaneously on pairs of qubits. A relevant figure of merit for the addressability is then related to the differences in the measured average gate fidelities in the two experiments. We present results from two similar samples with differing cross talk and unwanted qubit-qubit interactions. The results agree with predictions based on simple models of the classical cross talk and Stark shifts.
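The addressability figure of merit described above compares average gate fidelities extracted from individual and simultaneous randomized benchmarking runs. A minimal sketch of the standard RB analysis pipeline follows (decay-curve fit, fidelity conversion, fidelity difference); the paper's exact estimators and normalizations may differ, and the SPAM parameters A and B below assume an idealized single-qubit experiment.

```python
import math

def fit_decay_rate(ms, Fs, A=0.5, B=0.5):
    """Fit the RB decay F(m) = A * p**m + B by log-linear least squares
    over sequence lengths ms and measured fidelities Fs, returning the
    depolarizing parameter p. A and B absorb state preparation and
    measurement errors (idealized values assumed here)."""
    ys = [math.log((F - B) / A) for F in Fs]  # log((F-B)/A) = m * log(p)
    n = len(ms)
    mbar = sum(ms) / n
    ybar = sum(ys) / n
    slope = (sum((m - mbar) * (y - ybar) for m, y in zip(ms, ys))
             / sum((m - mbar) ** 2 for m in ms))
    return math.exp(slope)

def avg_gate_fidelity(p, d=2):
    """Average gate fidelity from the RB decay parameter for a
    d-dimensional system (d = 2 for a single qubit)."""
    return 1 - (1 - p) * (d - 1) / d

def addressability(p_individual, p_simultaneous):
    """Drop in average gate fidelity when the neighbouring qubit is
    benchmarked at the same time; larger values indicate cross talk."""
    return avg_gate_fidelity(p_individual) - avg_gate_fidelity(p_simultaneous)
```

In practice `fit_decay_rate` would be applied once to the individual RB data and once to the simultaneous RB data for each qubit, and the two fidelities compared.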
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elkin, Christopher; Kapur, Hitesh; Smith, Troy
2001-09-15
We have developed an automated purification method for terminator sequencing products based on magnetic bead technology. This 384-well protocol generates labeled DNA fragments that are essentially free of contaminants for less than $0.005 per reaction. In comparison to laborious ethanol precipitation protocols, this method increases the phred20 read length by forty bases with various DNA templates such as PCR fragments, plasmids, cosmids, and RCA products. Our method eliminates centrifugation and is compatible with both the MegaBACE 1000 and ABI Prism 3700 capillary instruments. As of September 2001, this method has produced over 1.6 million samples, with 93 percent averaging 620 phred20 bases, as part of the Joint Genome Institute's production process.
Marchi, S; Bonora, M; Patergnani, S; Giorgi, C; Pinton, P
2017-01-01
It is widely acknowledged that mitochondria are highly active structures that rapidly respond to cellular and environmental perturbations by changing their shape, number, and distribution. Mitochondrial remodeling is a key component of diverse biological processes, ranging from cell cycle progression to autophagy. In this chapter, we describe different methodologies for the morphological study of the mitochondrial network. Instructions are given for the preparation of samples for fluorescent microscopy, based on genetically encoded strategies or the employment of synthetic fluorescent dyes. We also propose detailed protocols to analyze mitochondrial morphometric parameters from both three-dimensional and bidimensional datasets. Finally, we describe a protocol for the visualization and quantification of mitochondrial structures through electron microscopy. © 2017 Elsevier Inc. All rights reserved.
Stewart, J B; Hardin, S B; Weinrich, S; McGeorge, S; Lopez, J; Pesut, D
1992-01-01
The literature reports that cognitive understanding and social support can mitigate stress in both adults and adolescents. As a subcomponent of the Carolina Adolescent Health Project (CAHP), this research evaluated the efficacy of a Cognitive Social Support (CSS) group protocol designed to mitigate the disaster stress of adolescents who had been seriously exposed to Hurricane Hugo. A purposive sample of 259 students participated in and evaluated the CSS. This article reports the specific structure, content, process, rationale, and cost of the CSS. Evaluations indicated that 82% of the students rated the small-group component of the CSS as "very good" or "excellent," while 70% rated the large-group component as "very good" or "excellent."
Jeddi, Fakhri; Yapo-Kouadio, Gisèle Cha; Normand, Anne-Cécile; Cassagne, Carole; Marty, Pierre; Piarroux, Renaud
2017-02-01
In cases of fungal infection of the bloodstream, rapid species identification is crucial to provide adapted therapy and thereby ameliorate patient outcome. Currently, the commercial Sepsityper kit and the sodium-dodecyl sulfate (SDS) method coupled with MALDI-TOF mass spectrometry are the most commonly reported lysis protocols for direct identification of fungi from positive blood culture vials. However, the performance of these two protocols has never been compared on clinical samples. Accordingly, we performed a two-step survey on two distinct panels of clinical positive blood culture vials to identify the most efficient protocol, establish an appropriate log score (LS) cut-off, and validate the best method. We first compared the performance of the Sepsityper and the SDS protocols on 71 clinical samples. For 69 monomicrobial samples, mass spectrometry LS values were significantly higher with the SDS protocol than with the Sepsityper method (P < .0001), especially when the best score of four deposited spots was considered. Next, we established the LS cut-off for accurate identification at 1.7, based on specimen DNA sequence data. Using this LS cut-off, 66 (95.6%) and 46 (66.6%) isolates were correctly identified at the species level with the SDS and the Sepsityper protocols, respectively. In the second arm of the survey, we validated the SDS protocol on an additional panel of 94 clinical samples. Ninety-two (98.9%) of 93 monomicrobial samples were correctly identified at the species level (median LS = 2.061). Overall, our data suggest that the SDS method yields more accurate species identification of yeasts, than the Sepsityper protocol. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
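The decision rule in the study above, taking the best MALDI-TOF log score among the four deposited spots and accepting the species identification only at or above the validated cut-off of 1.7, can be sketched directly. The function name and return shape are illustrative.

```python
def identify(spot_scores, cutoff=1.7):
    """Given the MALDI-TOF log scores (LS) of the four deposited spots
    for one blood culture isolate, return the best score and whether
    the species identification is accepted at the validated cut-off
    (LS = 1.7 in the study)."""
    best = max(spot_scores)
    return best, best >= cutoff
```

For example, an isolate whose four spots score 1.45, 1.62, 2.05, and 1.90 would be accepted on the strength of its best spot, whereas one never reaching 1.7 would be rejected.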
Souza, C A; Oliveira, T C; Crovella, S; Santos, S M; Rabêlo, K C N; Soriano, E P; Carvalho, M V D; Junior, A F Caldas; Porto, G G; Campello, R I C; Antunes, A A; Queiroz, R A; Souza, S M
2017-04-28
The use of Y chromosome haplotypes, important for the detection of sexual crimes in forensics, has gained prominence with the use of databases that incorporate these genetic profiles in their systems. Here, we optimized and validated an amplification protocol for Y chromosome profile retrieval in reference samples using less material than commercial kits require. FTA® cards (Flinders Technology Associates) were used to support the oral cells of male individuals, which were amplified directly using the SwabSolution reagent (Promega). First, we optimized and validated the process to define the volume and cycling conditions. Three reference samples and nineteen 1.2-mm-diameter perforated discs per sample were used. Amplification of one or two discs (samples) with the PowerPlex® Y23 kit (Promega) was performed using 25, 26, and 27 thermal cycles. Reagent volumes of 20%, 32%, and 100% were tested, with one disc and 26 cycles used for the control per sample. Thereafter, all samples (N = 270) were amplified using 27 cycles, one disc, and 32% reagents (the optimized conditions). Data were analyzed by studying the balance between fluorophore colors. In the samples analyzed with 20% volume, an imbalance in peak heights was observed, both within and between dyes. In samples amplified with 32% reagents, the intra-color and inter-color balance values used to verify the quality of the analyzed peaks were similar to those of samples amplified with 100% of the recommended volume. The quality of the profiles obtained with 32% reagents was suitable for insertion into databases.
Optimization of Native and Formaldehyde iPOND Techniques for Use in Suspension Cells.
Wiest, Nathaniel E; Tomkinson, Alan E
2017-01-01
The isolation of proteins on nascent DNA (iPOND) technique developed by the Cortez laboratory allows a previously unparalleled ability to examine proteins associated with replicating and newly synthesized DNA in mammalian cells. Both the original, formaldehyde-based iPOND technique and a more recent derivative, accelerated native iPOND (aniPOND), have mostly been performed in adherent cell lines. Here, we describe modifications to both protocols for use with suspension cell lines. These include cell culture, pulse, and chase conditions that optimize sample recovery in both protocols using suspension cells and several key improvements to the published aniPOND technique that reduce sample loss, increase signal to noise, and maximize sample recovery. Additionally, we directly and quantitatively compare the iPOND and aniPOND protocols to test the strengths and limitations of both. Finally, we present a detailed protocol to perform the optimized aniPOND protocol in suspension cell lines. © 2017 Elsevier Inc. All rights reserved.
Endoscope disinfection and its pitfalls--requirement for retrograde surveillance cultures.
Buss, A J; Been, M H; Borgers, R P; Stokroos, I; Melchers, W J; Peters, F T; Limburg, A J; Degener, J E
2008-04-01
Several endoscopy-related outbreaks of infection have been reported in recent years. For early recognition of inadequate disinfection of endoscopes we designed a microbiological surveillance system to evaluate the efficacy of the cleaning and disinfection procedure, and to trace disinfection problems to individual endoscopes or washer-disinfectors. Our surveillance protocol included anterograde and retrograde sampling, a decision algorithm, genetic fingerprinting, and scanning electron microscopy. Over a period of 29 months we found an increasing number of patient-ready endoscopes testing positive for Candida species other than albicans, especially C. parapsilosis. These yeasts were also isolated from the washer-disinfectors. The number of positive tests for Candida species varied from 1 out of 21 to 14 out of 27 samples from nine frequently used endoscopes. The number of colony-forming units per milliliter ranged from 1-10 to 3000 for endoscopes and 0.002 to 0.06 for the washer-disinfectors. DNA fingerprinting was not able to discriminate different strains within C. parapsilosis. Our protocol was able to detect a structural problem in the endoscope disinfection process. Retrograde sampling was crucial for this purpose, because it has much higher sensitivity than anterograde sampling. Endoscopes with damaged working channels are probably the source of the contamination problem with Candida species.
Schmettow, Martin; Schnittker, Raphaela; Schraagen, Jan Maarten
2017-05-01
This paper proposes and demonstrates an extended protocol for usability validation testing of medical devices. A review of currently used methods for the usability evaluation of medical devices revealed two main shortcomings: first, a lack of methods to closely trace interaction sequences and derive performance measures; second, a prevailing focus on cross-sectional validation studies, which ignores the issues of learnability and training. The U.S. Food and Drug Administration's recent proposal for a validation testing protocol for medical devices is then extended to address these shortcomings: (1) a novel process measure, 'normative path deviations', is introduced that is useful for both quantitative and qualitative usability studies, and (2) a longitudinal, completely within-subject study design is presented that assesses learnability and training effects and allows analysis of the diversity of users. A reference regression model is introduced to analyze data from this and similar studies, drawing upon generalized linear mixed-effects models and a Bayesian estimation approach. The extended protocol is implemented and demonstrated in a study comparing a novel syringe infusion pump prototype to an existing design with a sample of 25 healthcare professionals. Strong performance differences between designs were observed with a variety of usability measures, as well as varying training-on-the-job effects. We discuss our findings with regard to validation testing guidelines, reflect on the extensions and discuss the perspectives they add to the validation process. Copyright © 2017 Elsevier Inc. All rights reserved.
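One plausible way to operationalize a 'normative path deviations' measure is as an edit distance between the observed interaction sequence and the normative (ideal) path; the abstract does not give the paper's exact formulation, and the action names below are invented for illustration.

```python
def normative_path_deviation(observed, normative):
    """Levenshtein distance between an observed action sequence and the
    normative path: the number of insertions, deletions, and
    substitutions separating the user's behaviour from the ideal
    sequence. Zero means the user followed the normative path exactly."""
    m, n = len(observed), len(normative)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i  # delete all remaining observed actions
    for j in range(n + 1):
        d[0][j] = j  # insert all remaining normative actions
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == normative[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]
```

A user who skips a confirmation step on an infusion pump, for instance, would score a deviation of 1 against the normative sequence.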
Two Decades into the LCR: What We Do and Still Don’t Know to Solve Lead Problems
Site selection and sampling protocol biases in LCR sampling underestimate peak lead and copper concentrations while missing erratic lead release episodes resulting from distribution system chemical and physical disturbances. Possible site targeting and sampling protocol changes could...
Vascular Blood Collection protocol samples into MELFI
2011-10-18
iss029e028495 (10/18/2011) --- Japan Aerospace Exploration Agency astronaut Satoshi Furukawa, Expedition 29 flight engineer, prepares to put samples from the CSA (Canadian Space Agency) Vascular Blood Collection protocol into the MELFI-1 (Minus Eighty Laboratory Freezer for ISS 1) unit.
NHEXAS PHASE I ARIZONA STUDY--LIST OF STANDARD OPERATING PROCEDURES
This document lists available protocols and SOPs for the NHEXAS Phase I Arizona study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assurance, (...
Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software
NASA Astrophysics Data System (ADS)
Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg
2017-09-01
100 Gbit/s wireless communication protocol processing stresses every part of a communication system to its limits. The efficient use of upcoming 100 Gbit/s and beyond transmission technology requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with very low protocol processing overhead.
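As a back-of-envelope check on the energy figures quoted above, the 1-13 pJ/bit range translates directly into sustained FEC power draw at line rate. The short sketch below is plain arithmetic, not code from the paper.

```python
# Back-of-envelope: sustained FEC power draw at line rate,
# using the 1-13 pJ/bit range quoted in the abstract.

def fec_power_watts(line_rate_bps: float, energy_pj_per_bit: float) -> float:
    """Power = bit rate * energy per bit (pJ converted to J)."""
    return line_rate_bps * energy_pj_per_bit * 1e-12

RATE = 100e9  # 100 Gbit/s

low = fec_power_watts(RATE, 1.0)    # 0.1 W at 1 pJ/bit
high = fec_power_watts(RATE, 13.0)  # 1.3 W at 13 pJ/bit
print(f"FEC power at 100 Gbit/s: {low:.1f} W to {high:.1f} W")
```

So even at the top of the quoted range, error correction at full line rate stays within a watt-scale power budget.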
Developing family planning nurse practitioner protocols.
Hawkins, J W; Roberto, D
1984-01-01
This article focuses on the process of development of protocols for family planning nurse practitioners. A rationale for the use of protocols, a definition of the types and examples, and the pros and cons of practice with protocols are presented. A how-to description for the development process follows, including methods and a suggested tool for critique and evaluation. The aim of the article is to assist nurse practitioners in developing protocols for their practice.
NASA Astrophysics Data System (ADS)
Liu, Yongfang; Zhao, Yu; Chen, Guanrong
2016-11-01
This paper studies the distributed consensus and containment problems for a group of harmonic oscillators with a directed communication topology. First, for consensus without a leader, a class of distributed consensus protocols is designed by using motion planning and Pontryagin's principle. The proposed protocol only requires relative information measurements at the sampling instants, without requiring information exchange over the sampling interval. By using stability theory and the properties of stochastic matrices, it is proved that the distributed consensus problem can be solved in the motion planning framework. Second, for the case with multiple leaders, a class of distributed containment protocols is developed for followers such that their positions and velocities can ultimately converge to the convex hull formed by those of the leaders. Compared with the existing consensus algorithms, a remarkable advantage of the proposed sampled-data-based protocols is that the sampling periods, communication topologies and control gains are all decoupled and can be designed separately, which relaxes many restrictions in controller design. Finally, some numerical examples are given to illustrate the effectiveness of the analytical results.
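The stochastic-matrix convergence argument invoked above can be illustrated with a discrete toy model: at each sampling instant every agent averages its state with a neighbor's via a row-stochastic update. This is only a simplified analogue; the paper's actual protocol is derived for harmonic-oscillator dynamics via motion planning and Pontryagin's principle.

```python
import numpy as np

# Toy sampled-data consensus: at each sampling instant, every agent
# averages its own state with a neighbor's. The update matrix A is
# row-stochastic, so repeated application drives all states to a
# common value: the same stochastic-matrix property used in the proof.

def run_consensus(x0, A, steps):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = A @ x  # one sampled-data update
    return x

# Directed ring of 4 agents, each averaging with its successor.
A = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.5, 0.0, 0.0, 0.5],
])
x = run_consensus([1.0, 2.0, 3.0, 4.0], A, 200)
print(x)  # all entries close to a common consensus value
```

Because this particular A happens to be doubly stochastic, the agents converge to the average of the initial states.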
A Novel Process Audit for Standardized Perioperative Handoff Protocols.
Pallekonda, Vinay; Scholl, Adam T; McKelvey, George M; Amhaz, Hassan; Essa, Deanna; Narreddy, Spurthy; Tan, Jens; Templonuevo, Mark; Ramirez, Sasha; Petrovic, Michelle A
2017-11-01
A perioperative handoff protocol provides a standardized delivery of communication during a handoff that occurs from the operating room to the postanestheisa care unit or ICU. The protocol's success is dependent, in part, on its continued proper use over time. A novel process audit was developed to help ensure that a perioperative handoff protocol is used accurately and appropriately over time. The Audit Observation Form is used for the Audit Phase of the process audit, while the Audit Averages Form is used for the Data Analysis Phase. Employing minimal resources and using quantitative methods, the process audit provides the necessary means to evaluate the proper execution of any perioperative handoff protocol. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.
iSANLA: intelligent sensor and actuator network for life science applications.
Schloesser, Mario; Schnitzer, Andreas; Ying, Hong; Silex, Carmen; Schiek, Michael
2008-01-01
In the fields of neurological rehabilitation and neurophysiological research there is a strong need for miniaturized, multi-channel, battery-driven, wirelessly networked DAQ systems enabling real-time digital signal processing and feedback experiments. For the scientific investigation of the passive auditory-based 3D orientation of barn owls and the scientific research on vegetative locomotor coordination of Parkinson's disease patients during rehabilitation, we developed our 'intelligent Sensor and Actuator Network for Life science Application' (iSANLA) system. Implemented on the ultra-low-power microcontroller MSP430, sample rates up to 96 kHz have been realised for single-channel DAQ. The system includes lossless local data storage up to 4 GB. With its outer dimensions of 20 mm per edge and less than 15 g of weight including the Lithium-Ion battery, our modular sensor node is fully capable of up to eight-channel recordings at 8 kHz per channel and provides sufficient computational power for digital signal processing, ready to start our first mobile experiments. For wireless mobility, a compact communication protocol based on the IEEE 802.15.4 wireless standard with net data rates up to 141 kbit/s has been implemented. To merge the losslessly acquired data of the distributed iNODEs, a causality-preserving time synchronization protocol has been developed. Hence the necessary time-synchronous start of data acquisition inside a network of multiple sensors, with a precision better than one sample period at the highest sample rate, has been realized.
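A quick calculation shows why on-node lossless storage matters here: eight channels at 8 kHz far exceed the radio's 141 kbit/s net rate. The 16-bit sample width below is an assumption; the abstract does not state the ADC resolution.

```python
# Why local storage is needed: raw acquisition rate vs. the radio's
# 141 kbit/s net data rate. The 16-bit sample width is an assumption;
# the abstract does not state the ADC resolution.

CHANNELS = 8
SAMPLE_RATE_HZ = 8_000
BITS_PER_SAMPLE = 16  # assumed

raw_kbps = CHANNELS * SAMPLE_RATE_HZ * BITS_PER_SAMPLE / 1_000
radio_kbps = 141

print(f"raw: {raw_kbps:.0f} kbit/s vs radio: {radio_kbps} kbit/s")
# 1024 kbit/s of raw data against a 141 kbit/s link: roughly a 7x
# shortfall, hence the on-node 4 GB lossless storage.
```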
Christopher W. Woodall; Vicente J. Monleon
2008-01-01
The USDA Forest Service's Forest Inventory and Analysis program conducts an inventory of forests of the United States including down woody materials (DWM). In this report we provide the rationale and context for a national inventory of DWM, describe the components sampled, discuss the sampling protocol used and corresponding estimation procedures, and provide...
Geochemical data for Colorado soils-Results from the 2006 state-scale geochemical survey
Smith, David B.; Ellefsen, Karl J.; Kilburn, James E.
2010-01-01
In 2006, soil samples were collected at 960 sites (1 site per 280 square kilometers) throughout the state of Colorado. These samples were collected from a depth of 0-15 centimeters and, following a near-total multi-acid digestion, were analyzed for a suite of more than 40 major and trace elements. The resulting data set provides a baseline for the natural variation in soil geochemistry for Colorado and forms the basis for detecting changes in soil composition that might result from natural processes or anthropogenic activities. This report describes the sampling and analytical protocols used and makes available all the soil geochemical data generated in the study.
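The stated sampling density can be sanity-checked against Colorado's land area; the area figure (~269,600 km²) is supplied here for the check, not taken from the report.

```python
# Sanity check on the stated sampling density: 960 sites at one site
# per 280 square kilometers should roughly tile Colorado.
# Colorado's area (~269,600 km^2) is supplied here, not in the report.

sites = 960
km2_per_site = 280
covered = sites * km2_per_site  # 268,800 km^2

colorado_km2 = 269_600  # approximate
print(covered, colorado_km2)  # within about 0.3% of each other
```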
Evaluation of four automated protocols for extraction of DNA from FTA cards.
Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels
2013-10-01
Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.
[Application of DNA extraction kit, 'GM quicker' for detection of genetically modified soybeans].
Sato, Noriko; Sugiura, Yoshitsugu; Tanaka, Toshitsugu
2012-01-01
Several DNA extraction methods have been officially introduced to detect genetically modified soybeans, but the choice of DNA extraction kit depends on the nature of the sample, such as grains or processed foods. To overcome this disadvantage, we examined whether the GM quicker kit is suitable for both grains and processed foods. We compared GM quicker with four approved DNA extraction kits with respect to DNA purity, copy numbers of the lectin gene, and working time. We found that the DNA quality of GM quicker was superior to that of the other kits for grains, and the procedure was faster. However, in the case of processed foods, GM quicker was not superior to the other kits. We therefore investigated an unapproved GM quicker 3 kit, which is designed for DNA extraction from processed foods, such as tofu and boiled soybeans. The GM quicker 3 kit provided good DNA quality from both grains and processed foods, so we made a minor modification of the GM quicker-based protocol to suit processed foods, using GM quicker and its reagents. The modified method enhanced the performance of GM quicker with processed foods. We believe that GM quicker with the modified protocol is an excellent tool for obtaining high-quality DNA from grains and processed foods for the detection of genetically modified soybeans.
2014-10-01
number/communicate to site coordinator N/A. Task V.5 (mo 6-47): implement methods to educate/monitor participants on aspects of vit D3 and calcium... potential side effects of vit D3 supplementation. CRFs for adverse event reporting have been developed and included in the protocols submitted for IRB... Task VI.3 (mo 6-47): document acceptance of storage sample in the CGRP database and vit D3 study database. The process for storage of sample in the
Analysis of translation using polysome profiling
Chassé, Héloïse; Boulben, Sandrine; Costache, Vlad; Cormier, Patrick
2017-01-01
During the past decade, there has been growing interest in the role of translational regulation of gene expression in many organisms. Polysome profiling has been developed to infer the translational status of a specific mRNA species or to analyze the translatome, i.e. the subset of mRNAs actively translated in a cell. Polysome profiling is especially suitable for emergent model organisms for which genomic data are limited. In this paper, we describe an optimized protocol for the purification of sea urchin polysomes and highlight the critical steps involved in polysome purification. We applied this protocol to obtain experimental results on translational regulation of mRNAs following fertilization. Our protocol should prove useful for integrating the study of the role of translational regulation in gene regulatory networks in any biological model. In addition, we demonstrate how to carry out high-throughput processing of polysome gradient fractions, for the simultaneous screening of multiple biological conditions and large-scale preparation of samples for next-generation sequencing. PMID:28180329
Star, Bastiaan; Nederbragt, Alexander J.; Hansen, Marianne H. S.; Skage, Morten; Gilfillan, Gregor D.; Bradbury, Ian R.; Pampoulie, Christophe; Stenseth, Nils Chr; Jakobsen, Kjetill S.; Jentoft, Sissel
2014-01-01
Degradation-specific processes and variation in laboratory protocols can bias the DNA sequence composition from samples of ancient or historic origin. Here, we identify a novel artifact in sequences from historic samples of Atlantic cod (Gadus morhua), which forms interrupted palindromes consisting of reverse complementary sequence at the 5′ and 3′-ends of sequencing reads. The palindromic sequences themselves have specific properties – the bases at the 5′-end align well to the reference genome, whereas extensive misalignments exists among the bases at the terminal 3′-end. The terminal 3′ bases are artificial extensions likely caused by the occurrence of hairpin loops in single stranded DNA (ssDNA), which can be ligated and amplified in particular library creation protocols. We propose that such hairpin loops allow the inclusion of erroneous nucleotides, specifically at the 3′-end of DNA strands, with the 5′-end of the same strand providing the template. We also find these palindromes in previously published ancient DNA (aDNA) datasets, albeit at varying and substantially lower frequencies. This artifact can negatively affect the yield of endogenous DNA in these types of samples and introduces sequence bias. PMID:24608104
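The artifact described above, a 3'-end that is an artificial reverse-complement copy of the read's own 5'-start, can be flagged with a simple screen. The function names and k-mer length below are illustrative choices, not taken from the paper.

```python
# Sketch of flagging the interrupted-palindrome artifact described
# above: the 3'-end of a read is an artificial reverse-complement
# copy of its own 5'-end. Function names and the k-mer length are
# illustrative choices, not from the paper.

COMP = str.maketrans("ACGTacgt", "TGCAtgca")

def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence."""
    return seq.translate(COMP)[::-1]

def has_terminal_palindrome(read: str, k: int = 8) -> bool:
    """True if the last k bases reverse-complement the first k bases."""
    if len(read) < 2 * k:
        return False
    return read[-k:] == revcomp(read[:k])

# The 3'-end below is the reverse complement of the 5'-start ACGTTGCA.
artifact_read = "ACGTTGCA" + "GGGTTTCCCAAA" + revcomp("ACGTTGCA")
clean_read = "ACGTTGCAGGGTTTCCCAAAACGTACGT"

print(has_terminal_palindrome(artifact_read))  # True
print(has_terminal_palindrome(clean_read))     # False
```

In practice one would tune k and allow a few mismatches, since the paper reports extensive misalignments within the artificial 3' bases.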
1992-12-21
in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O'Reilly, R. C. (1991). X3DNet: An X-Based Neural Network... 2.2.3 Trace based protocol analysis; 2.2.4 Summary of important data features; 2.3 Tools related to process model testing; 2.3.1 Tools for building... algorithm; 3. Requirements for testing process models using trace based protocol analysis; 3.1 Definition of trace based protocol analysis (TBPA)
Johanson, Helene C; Hyland, Valentine; Wicking, Carol; Sturm, Richard A
2009-04-01
We describe here a method for DNA elution from buccal cells and whole blood both collected onto Whatman FTA technology, using methanol fixation followed by an elution PCR program. Extracted DNA is comparable in quality to published Whatman FTA protocols, as judged by PCR-based genotyping. Elution of DNA from the dried sample is a known rate-limiting step in the published Whatman FTA protocol; this method enables the use of each 3-mm punch of sample for several PCR reactions instead of the standard, one PCR reaction per sample punch. This optimized protocol therefore extends the usefulness and cost effectiveness of each buccal swab sample collected, when used for nucleic acid PCR and genotyping.
Parrilla, Inma; del Olmo, David; Sijses, Laurien; Martinez-Alborcia, María J; Cuello, Cristina; Vazquez, Juan M; Martinez, Emilio A; Roca, Jordi
2012-05-01
The present study aimed to evaluate the ability of spermatozoa from individual boar ejaculates to withstand different semen-processing techniques. Eighteen sperm-rich ejaculate samples from six boars (three per boar) were diluted in Beltsville Thawing Solution and split into three aliquots. The aliquots were (1) further diluted to 3×10⁷ sperm/mL and stored as a liquid at 17°C for 72 h, (2) frozen-thawed (FT) at 1×10⁹ sperm/mL using standard 0.5-mL straw protocols, or (3) sex-sorted with subsequent liquid storage (at 17°C for 6 h) or FT (2×10⁷ sperm/mL using a standard 0.25-mL straw protocol). The sperm quality was evaluated based on total sperm motility (the CASA system), viability (plasma membrane integrity assessed using flow cytometry and the LIVE/DEAD Sperm Viability Kit), lipid peroxidation (assessed via indirect measurement of the generation of malondialdehyde (MDA) using the BIOXYTECH MDA-586 Assay Kit) and DNA fragmentation (sperm chromatin dispersion assessed using the Sperm-Sus-Halomax® test). Data were normalized to the values assessed for the fresh (for liquid-stored and FT samples) or the sorted semen samples (for liquid stored and the FT sorted spermatozoa). All of the four sperm-processing techniques affected sperm quality (P<0.01), regardless of the semen donor, with reduced percentages of motile and viable sperm and increased MDA generation and percentages of sperm with fragmented DNA. Significant (P<0.05) inter-boar (effect of boars within each semen-processing technique) and intra-boar (effect of semen-processing techniques within each boar) differences were evident for all of the sperm quality parameters assessed, indicating differences in the ability of spermatozoa from individual boars to withstand the semen-processing techniques. These results are the first evidence that ejaculate spermatozoa from individual boars can respond in a boar-dependent manner to different semen-processing techniques. Copyright © 2012 Elsevier B.V. All rights reserved.
Beikircher, Barbara; Mayr, Stefan
2016-01-01
A prerequisite for reliable hydraulic measurements is an accurate collection of the plant material. Thereby, the native hydraulic state of the sample has to be preserved during harvesting (i.e., cutting the plant or plant parts) and preparation (i.e., excising the target section). This is particularly difficult when harvesting has to be done under transpiring conditions. In this article, we present a harvesting and sampling protocol designed for hydraulic measurements on Malus domestica Borkh. and checked for possible sampling artefacts. To test for artefacts, we analysed the percentage loss of hydraulic conductivity, maximum specific conductivity and water contents of bark and wood of branches, taking into account conduit length, time of day of harvesting, different shoot ages and seasonal effects. Our results prove that use of appropriate protocols can avoid artefactual embolization or refilling even when the xylem is under tension at harvest. The presented protocol was developed for Malus but may also be applied for other angiosperms with similar anatomy and refilling characteristics. PMID:26705311
FISH-in-CHIPS: A Microfluidic Platform for Molecular Typing of Cancer Cells.
Perez-Toralla, Karla; Mottet, Guillaume; Tulukcuoglu-Guneri, Ezgi; Champ, Jérôme; Bidard, François-Clément; Pierga, Jean-Yves; Klijanienko, Jerzy; Draskovic, Irena; Malaquin, Laurent; Viovy, Jean-Louis; Descroix, Stéphanie
2017-01-01
Microfluidics offer powerful tools for the control, manipulation, and analysis of cells, in particular for the assessment of cell malignancy or the study of cell subpopulations. However, implementing complex biological protocols on chip remains a challenge. Sample preparation is often performed off chip using multiple manually performed steps, and protocols usually include different dehydration and drying steps that are not always compatible with a microfluidic format.Here, we report the implementation of a Fluorescence in situ Hybridization (FISH) protocol for the molecular typing of cancer cells in a simple and low-cost device. The geometry of the chip allows integrating the sample preparation steps to efficiently assess the genomic content of individual cells using a minute amount of sample. The FISH protocol can be fully automated, thus enabling its use in routine clinical practice.
Near-optimal protocols in complex nonequilibrium transformations
Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...
2016-08-29
The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
Exploring the Implementation of Steganography Protocols on Quantum Audio Signals
NASA Astrophysics Data System (ADS)
Chen, Kehan; Yan, Fei; Iliyasu, Abdullah M.; Zhao, Jianping
2018-02-01
Two quantum audio steganography (QAS) protocols are proposed, each of which manipulates or modifies the least significant qubit (LSQb) of the host quantum audio signal that is encoded as FRQA (flexible representation of quantum audio) content. The first protocol (i.e. the conventional LSQb QAS protocol, or simply the cLSQ stego protocol) is built on exchanges between qubits encoding the quantum audio message and the LSQb of the amplitude information in the host quantum audio samples. In the second protocol, the embedding procedure implants information from a quantum audio message deep into the constraint-imposed most significant qubit (MSQb) of the host quantum audio samples; we refer to it as the pseudo-MSQb QAS protocol or simply the pMSQ stego protocol. The cLSQ stego protocol is designed to guarantee high imperceptibility between the host quantum audio and its stego version, whereas the pMSQ stego protocol ensures that the resulting stego quantum audio signal is better immune to illicit tampering and copyright violations (a.k.a. robustness). Built on the circuit model of quantum computation, the circuit networks to execute the embedding and extraction algorithms of both QAS protocols are determined, and simulation-based experiments are conducted to demonstrate their implementation. Outcomes attest that both protocols offer promising trade-offs in terms of imperceptibility and robustness.
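For intuition, the cLSQ idea has a well-known classical counterpart: plain least-significant-bit embedding in integer audio samples. The sketch below is only this classical analogue; the paper's protocols operate on FRQA-encoded quantum states and circuits.

```python
# Classical least-significant-bit (LSB) embedding, the classical
# counterpart of the cLSQ stego protocol: message bits replace the
# LSB of each host audio sample. The quantum protocols operate on
# FRQA-encoded states; this sketch is only the classical intuition.

def embed(samples, bits):
    """Replace each sample's LSB with one message bit."""
    assert len(bits) <= len(samples)
    out = list(samples)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b  # clear LSB, then set it to b
    return out

def extract(samples, n):
    """Read back the first n embedded bits."""
    return [s & 1 for s in samples[:n]]

host = [100, 101, 102, 103, 104, 105, 106, 107]
msg = [1, 0, 1, 1, 0, 1, 0, 0]
stego = embed(host, msg)
print(extract(stego, 8))  # recovers the message bits
# Each sample changes by at most 1, hence the high imperceptibility.
```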
Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi
2013-08-01
Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
High-Throughput Cryopreservation of Plant Cell Cultures for Functional Genomics
Ogawa, Yoichi; Sakurai, Nozomu; Oikawa, Akira; Kai, Kosuke; Morishita, Yoshihiko; Mori, Kumiko; Moriya, Kanami; Fujii, Fumiko; Aoki, Koh; Suzuki, Hideyuki; Ohta, Daisaku; Saito, Kazuki; Shibata, Daisuke
2012-01-01
Suspension-cultured cell lines from plant species are useful for genetic engineering. However, maintenance of these lines is laborious, involves routine subculturing and hampers wider use of transgenic lines, especially when many lines are required for a high-throughput functional genomics application. Cryopreservation of these lines may reduce the need for subculturing. Here, we established a simple protocol for cryopreservation of cell lines from five commonly used plant species, Arabidopsis thaliana, Daucus carota, Lotus japonicus, Nicotiana tabacum and Oryza sativa. The LSP solution (2 M glycerol, 0.4 M sucrose and 86.9 mM proline) protected cells from damage during freezing and was only mildly toxic to cells kept at room temperature for at least 2 h. More than 100 samples were processed for freezing simultaneously. Initially, we determined the conditions for cryopreservation using a programmable freezer; we then developed a modified simple protocol that did not require a programmable freezer. In the simple protocol, a thick expanded polystyrene (EPS) container containing the vials with the cell–LSP solution mixtures was kept at −30°C for 6 h to cool the cells slowly (pre-freezing); samples from the EPS containers were then plunged into liquid nitrogen before long-term storage. Transgenic Arabidopsis cells were subjected to cryopreservation, thawed and then re-grown in culture; transcriptome and metabolome analyses indicated that there was no significant difference in gene expression or metabolism between cryopreserved cells and control cells. The simplicity of the protocol will accelerate the pace of research in functional plant genomics. PMID:22437846
Laurin, Nancy; Frégeau, Chantal
2012-01-01
The goal of this work was to optimize and validate a fast amplification protocol for the multiplex amplification of the STR loci included in AmpFlSTR(®) Profiler Plus(®) to expedite human DNA identification. By modifying the cycling conditions and by combining the use of a DNA polymerase optimized for high speed PCR (SpeedSTAR™ HS) and a more efficient thermal cycler instrument (Bio-RAD C1000™), we were able to reduce the amplification process from 4h to 26 min. No modification to the commercial AmpFlSTR(®) Profiler Plus(®) primer mix was required. When compared to the current Royal Canadian Mounted Police (RCMP) amplification protocol, no differences with regards to specificity, sensitivity, heterozygote peak height ratios and overall profile balance were noted. Moreover, complete concordance was obtained with profiles previously generated with the standard amplification protocol and minor alleles in mixture samples were reliably typed. An increase in n-4 stutter ratios (2.2% on average for all loci) was observed for profiles amplified with the fast protocol compared to the current procedure. Our results document the robustness of this rapid amplification protocol for STR profiling using the AmpFlSTR(®) Profiler Plus(®) primer set and demonstrate that comparable data can be obtained in substantially less time. This new approach could provide an alternative option to current multiplex STR typing amplification protocols in order to increase throughput or expedite time-sensitive cases. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Moulton, Stephen R.; Carter, James L.; Grotheer, Scott A.; Cuffney, Thomas F.; Short, Terry M.
2000-01-01
Qualitative and quantitative methods to process benthic macroinvertebrate (BMI) samples have been developed and tested by the U.S. Geological Survey's National Water Quality Laboratory Biological Group. The qualitative processing method is based on visually sorting a sample for up to 2 hours. Sorting focuses on attaining organisms that are likely to result in taxonomic identifications to lower taxonomic levels (for example, Genus or Species). Immature and damaged organisms are also sorted when they are likely to result in unique determinations. The sorted sample remnant is scanned briefly by a second person to determine if obvious taxa were missed. The quantitative processing method is based on a fixed-count approach that targets some minimum count, such as 100 or 300 organisms. Organisms are sorted from randomly selected 5.1- by 5.1-centimeter parts of a gridded subsampling frame. The sorted remnant from each sample is resorted by a second individual for at least 10 percent of the original sort time. A large-rare organism search is performed on the unsorted remnant to sort BMI taxa that were not likely represented in the sorted grids. After either qualitatively or quantitatively sorting the sample, BMIs are identified by using one of three different types of taxonomic assessment. The Standard Taxonomic Assessment is comparable to the U.S. Environmental Protection Agency Rapid Bioassessment Protocol III and typically provides Genus- or Species-level taxonomic resolution. The Rapid Taxonomic Assessment is comparable to the U.S. Environmental Protection Agency Rapid Bioassessment Protocol II and provides Family-level and higher taxonomic resolution. The Custom Taxonomic Assessment provides Species-level resolution whenever possible for groups identified to higher taxonomic levels by using the Standard Taxonomic Assessment. The consistent use of standardized designations and notes facilitates the interpretation of BMI data within and among water-quality studies.
Taxonomic identifications are quality assured by verifying all referenced taxa and randomly reviewing 10 percent of the taxonomic identifications performed weekly by Biological Group taxonomists. Taxonomic errors discovered during this review are corrected. BMI data are reviewed for accuracy and completeness prior to release. BMI data are released phylogenetically in spreadsheet format and unprocessed abundances are corrected for laboratory and field subsampling when necessary.
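The fixed-count subsampling step described above can be sketched as drawing random grid cells until the target organism count is reached. The grid size and simulated counts below are illustrative stand-ins, not values from the USGS procedure.

```python
import random

# Sketch of the fixed-count subsampling step: grid cells are drawn at
# random from the gridded subsampling frame and their organisms are
# accumulated until a minimum target count (e.g. 300) is reached.
# Grid geometry and counts here are illustrative, not from the USGS
# procedure.

def fixed_count_subsample(grid_counts, target, rng):
    """Return the indices of randomly chosen cells whose summed
    organism counts first reach the target, plus the total sorted."""
    order = list(range(len(grid_counts)))
    rng.shuffle(order)
    chosen, total = [], 0
    for idx in order:
        chosen.append(idx)
        total += grid_counts[idx]
        if total >= target:
            break
    return chosen, total

rng = random.Random(42)
# Simulated organism counts for a 6 x 6 gridded frame (36 cells).
counts = [rng.randint(10, 40) for _ in range(36)]
cells, n = fixed_count_subsample(counts, target=300, rng=rng)
print(f"sorted {len(cells)} cells, {n} organisms (target 300)")
```

Because sorting stops at the first cell that reaches the target, the final count slightly overshoots 300, which matches the "some minimum count" phrasing above.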
Purifying, Separating, and Concentrating Cells From a Sample Low in Biomass
NASA Technical Reports Server (NTRS)
Benardini, James N.; LaDuc, Myron T.; Diamond, Rochelle
2012-01-01
Frequently there is an inability to process and analyze samples of low biomass due to limiting amounts of relevant biomaterial in the sample. Furthermore, molecular biological protocols geared towards increasing the density of recovered cells and biomolecules of interest, by their very nature, also concentrate unwanted inhibitory humic acids and other particulates that have an adversarial effect on downstream analysis. A novel and robust fluorescence-activated cell-sorting (FACS)-based technology has been developed for purifying (removing cells from sampling matrices), separating (based on size, density, morphology), and concentrating cells (spores, prokaryotic, eukaryotic) from a sample low in biomass. The technology capitalizes on fluorescent cell-sorting technologies to purify and concentrate bacterial cells from a low-biomass, high-volume sample. Over the past decade, cell-sorting detection systems have undergone enhancements and increased sensitivity, making bacterial cell sorting a feasible concept. Although there are many unknown limitations with regard to the applicability of this technology to environmental samples (smaller cells, few cells, mixed populations), dogmatic principles support the theoretical effectiveness of this technique upon thorough testing and proper optimization. Furthermore, the pilot study from which this report is based proved effective and demonstrated this technology capable of sorting and concentrating bacterial endospore and bacterial cells of varying size and morphology. Two commercial off-the-shelf bacterial counting kits were used to optimize a bacterial stain/dye FACS protocol. A LIVE/DEAD BacLight Viability and Counting Kit was used to distinguish between the live and dead cells. A Bacterial Counting Kit comprising SYTO BC (mixture of SYTO dyes) was employed as a broad-spectrum bacterial counting agent. Optimization using epifluorescence microscopy was performed with these two dye/stains. 
This refined protocol was further validated using varying ratios and mixtures of cells to ensure homogenous staining compared to that of individual cells, and were utilized for flow analyzer and FACS labeling. This technology focuses on the purification and concentration of cells from low-biomass spacecraft assembly facility samples. Currently, purification and concentration of low-biomass samples plague planetary protection downstream analyses. Having a capability to use flow cytometry to concentrate cells out of low-biomass, high-volume spacecraft/ facility sample extracts will be of extreme benefit to the fields of planetary protection and astrobiology. Successful research and development of this novel methodology will significantly increase the knowledge base for designing more effective cleaning protocols, and ultimately lead to a more empirical and true account of the microbial diversity present on spacecraft surfaces. Refined cleaning and an enhanced ability to resolve microbial diversity may decrease the overall cost of spacecraft assembly and/or provide a means to begin to assess challenging planetary protection missions.
Rapid assessment of forest canopy and light regime using smartphone hemispherical photography.
Bianchi, Simone; Cahalan, Christine; Hale, Sophie; Gibbons, James Michael
2017-12-01
Hemispherical photography (HP), implemented with cameras equipped with "fisheye" lenses, is a widely used method for describing forest canopies and light regimes. A promising technological advance is the availability of low-cost fisheye lenses for smartphone cameras. However, smartphone camera sensors cannot record a full hemisphere. We investigate whether smartphone HP is a cheaper and faster but still adequate operational alternative to traditional cameras for describing forest canopies and light regimes. We collected hemispherical pictures with both smartphone and traditional cameras at 223 forest sample points, across different overstory species and canopy densities. The smartphone image acquisition followed a faster and simpler protocol than that for the traditional camera. We automatically thresholded all images. We processed the traditional camera images for Canopy Openness (CO) and Site Factor estimation. For smartphone images, we took two pictures with different orientations per point and used two processing protocols: (i) we estimated and averaged total canopy gap from the two single pictures, and (ii) merging the two pictures together, we formed images closer to full hemispheres and estimated CO and Site Factors from them. We compared the same parameters obtained from the different cameras and estimated generalized linear mixed models (GLMMs) between them. Total canopy gap estimated from the first processing protocol for smartphone pictures was on average significantly higher than CO estimated from traditional camera images, although with a consistent bias. Canopy Openness and Site Factors estimated from the merged smartphone pictures of the second processing protocol were on average significantly higher than those from traditional camera images, although with relatively small absolute differences and scatter. Smartphone HP is an acceptable alternative to HP using traditional cameras, providing similar results with a faster and cheaper methodology.
Smartphone outputs can be directly used as they are for ecological studies, or converted with specific models for a better comparison to traditional cameras.
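The core image-processing step above (automatic thresholding followed by a gap estimate) can be sketched minimally: classify bright pixels as sky and take their proportion as the total canopy gap. The fixed threshold and toy image below are hypothetical; the study's automatic thresholding algorithm is not reproduced here.

```python
import numpy as np

def gap_fraction(gray, threshold=128):
    """Proportion of 'sky' (gap) pixels after a simple binary threshold.

    gray: 2-D array of 8-bit grayscale values from a hemispherical photo.
    threshold: hypothetical fixed cut-off; real workflows estimate it
    automatically (e.g. from the image histogram).
    """
    sky = gray >= threshold  # bright pixels classified as canopy gap
    return float(sky.mean())

# Toy 4x4 "image" with 6 bright (sky) pixels out of 16
img = np.array([[255, 255, 10, 10],
                [255, 255, 10, 10],
                [255, 255, 10, 10],
                [10,  10,  10, 10]], dtype=np.uint8)
print(gap_fraction(img))  # 0.375
```

Averaging this value over the two differently oriented smartphone pictures per point corresponds to the first processing protocol described above.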
Jadidi, Masoud; Båth, Magnus; Nyrén, Sven
2018-04-09
To compare the quality of images obtained with two different protocols with different acquisition times, and the influence of image post-processing, in a chest digital tomosynthesis (DTS) system. 20 patients with suspected lung cancer were imaged with chest X-ray equipment with a tomosynthesis option. Two examination protocols with different acquisition times (6.3 and 12 s) were performed on each patient. Both protocols were presented with two different types of image post-processing (standard DTS processing and a more advanced processing optimised for chest radiography). Thus, 4 series from each patient, altogether 80 series, were presented anonymously and in random order. Five observers rated the quality of the reconstructed section images according to predefined quality criteria in three different classes. Visual grading characteristics (VGC) was used to analyse the data, and the area under the VGC curve (AUC_VGC) was used as the figure-of-merit. The 12 s protocol and the standard DTS processing were used as references in the analyses. The protocol with 6.3 s acquisition time had a statistically significant advantage over the vendor-recommended protocol with 12 s acquisition time for the classes of criteria Demarcation (AUC_VGC = 0.56, p = 0.009) and Disturbance (AUC_VGC = 0.58, p < 0.001). A similar value of AUC_VGC was found for the class Structure (definition of bone structures in the spine) (0.56), but it could not be statistically separated from 0.5 (p = 0.21). For the image processing, the VGC analysis showed a small but statistically significant advantage for the standard DTS processing over the more advanced processing for the classes of criteria Demarcation (AUC_VGC = 0.45, p = 0.017) and Disturbance (AUC_VGC = 0.43, p = 0.005). A similar value of AUC_VGC was found for the class Structure (0.46), but it could not be statistically separated from 0.5 (p = 0.31).
The study indicates that the protocol with 6.3 s acquisition time yields slightly better image quality than the vendor-recommended protocol with acquisition time 12 s for several anatomical structures. Furthermore, the standard gradation processing (the vendor-recommended post-processing for DTS) shows a modest advantage over the gradation processing/multiobjective frequency processing/flexible noise control processing in terms of image quality for all classes of criteria. Advances in knowledge: The study shows that image quality may be strongly affected by the selection of DTS protocol and that the vendor-recommended protocol may not always be the optimal choice.
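The AUC_VGC figure-of-merit used above is the area under the visual grading characteristics curve, a nonparametric measure of how often one protocol's ratings exceed the reference protocol's (0.5 means no difference). A minimal sketch, assuming a Mann-Whitney-style pairwise estimate with ties counted as half, over illustrative ratings rather than the study's data:

```python
def auc_vgc(test_ratings, ref_ratings):
    """Pairwise estimate of P(test rated above reference); ties count 0.5.

    Values above 0.5 favour the test protocol, below 0.5 the reference.
    """
    wins = ties = 0
    for t in test_ratings:
        for r in ref_ratings:
            if t > r:
                wins += 1
            elif t == r:
                ties += 1
    n_pairs = len(test_ratings) * len(ref_ratings)
    return (wins + 0.5 * ties) / n_pairs

# Illustrative ordinal image-quality ratings (not the study's data)
print(auc_vgc([2, 3], [1, 2]))  # 0.875
```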
USGS/EPA collection protocol for bacterial pathogens in soil
Griffin, Dale W.; Shaefer, F.L.; Bowling, Charlena; Mattorano, Dino; Nichols, Tonya; Silvestri, Erin
2014-01-01
This Sample Collection Procedure (SCP) describes the activities and considerations for the collection of bacterial pathogens from representative surface soil samples (0-5 cm). This sampling depth can be reached without the use of a drill rig, direct-push technology, or other mechanized equipment. This procedure can be used in most soil types but is limited to sampling at or near the ground surface. This protocol has components for two different types of sampling applications: (1) typical sampling, when there is no suspicion of contamination (e.g., surveillance or background studies); and (2) in response to known or suspected accidental contamination (e.g., the presence of animal carcasses). This protocol does not cover sampling in response to a suspected bioterrorist or intentional release event. Surface material is removed to the required depth (0-5 cm), and a clean trowel or 50 ml sample tube is used to collect the sample. Sample containers are sealed, bagged, and shipped to the laboratory for analysis. Associated documentation, including a Field Data Log and a Chain-of-Custody form, is also included in this document.
Keiter, David A.; Cunningham, Fred L.; Rhodes, Olin E.; Irwin, Brian J.; Beasley, James
2016-01-01
Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa) scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling) and scat characteristics (fecal pellet size and number) on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. 
Knowledge of relationships between environmental variables and scat detection may allow researchers to optimize sampling protocols to maximize utility of noninvasive sampling for wild pigs and other social ungulates.
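The significant predictors reported above suggest a logistic model of scat detection probability. A minimal sketch with invented coefficients (the study's fitted values are not reproduced here) shows how such a model combines pellet size, pellet number, ground cover, and recent rain:

```python
import math

def detection_prob(pellet_size_cm, n_pellets, pct_cover, recent_rain):
    """Hypothetical logistic detection model; all coefficients are illustrative.

    recent_rain: 1 if rain occurred immediately prior to sampling, else 0.
    """
    logit = (-1.0
             + 0.8 * pellet_size_cm   # larger pellets are easier to see
             + 0.05 * n_pellets       # more pellets, higher detection odds
             - 0.03 * pct_cover       # vegetative cover hides scat
             - 0.6 * recent_rain)     # recent rain degrades scat
    return 1.0 / (1.0 + math.exp(-logit))

# Large scat in open ground vs. small scat under dense cover after rain
print(detection_prob(3.0, 10, 10, 0) > detection_prob(1.0, 3, 80, 1))  # True
```

Under such a model, juveniles (small pellets) would have systematically lower detection probabilities, which is the bias the authors caution about.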
An Organic Decontamination Method for Sampling Devices used in Life-detection Studies
NASA Technical Reports Server (NTRS)
Eigenbrode, Jennifer; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E.F.
2008-01-01
Organic decontamination of sampling and storage devices is a crucial step for life-detection, habitability, and ecological investigations of extremophiles living in the most inhospitable niches of Earth, Mars and elsewhere. However, one of the main stumbling blocks for Mars-analogue life-detection studies at terrestrial remote field sites is the capability to clean instruments and sampling devices to organic levels consistent with null values. Here we present a new seven-step, multi-reagent cleaning and decontamination protocol that was adapted and tested on a glacial ice-coring device and on a rover-guided scoop used for sediment sampling, both deployed multiple times during two field seasons of the Arctic Mars Analog Svalbard Expedition (AMASE). The effectiveness of the protocols for both devices was tested by (1) in situ metabolic measurements via ATP assay, (2) in situ lipopolysaccharide (LPS) quantification via low-level endotoxin assays, and (3) laboratory-based molecular detection via gas chromatography-mass spectrometry. Our results show that the combination and step-wise application of disinfectants with oxidative and solvation properties for sterilization are effective at removing cellular remnants and other organic traces to levels necessary for molecular organic- and life-detection studies. The validation of this seven-step protocol, specifically for ice sampling, allows us to proceed with confidence in Mars-analogue investigations of icy environments. Moreover, results from the rover scoop test showed that this protocol is also suitable for null-level decontamination of sample acquisition devices. Thus, this protocol may be applicable to a variety of sampling devices and analytical instrumentation used for future astrobiology missions to Enceladus and Europa, as well as for sample-return missions.
Nohara, Kazunari; Chen, Zheng; Yoo, Seung-Hee
2017-07-06
Chromatin immunoprecipitation (ChIP) is a powerful method to determine protein binding to chromatin DNA. Fiber-rich skeletal muscle, however, has been a challenge for ChIP due to the technical difficulty of isolating high-quality nuclei with minimal contamination of myofibrils. Previous protocols have attempted to purify nuclei before cross-linking, which incurs the risk of altered DNA-protein interactions during the prolonged nuclei preparation process. In the current protocol, we first cross-linked the skeletal muscle tissue collected from mice, and the tissues were then minced and sonicated. Since we found that ultracentrifugation was not able to separate nuclei from myofibrils using cross-linked muscle tissue, we devised a sequential filtration procedure to obtain high-quality nuclei devoid of significant myofibril contamination. We subsequently prepared chromatin using an ultrasonicator, and ChIP assays with anti-BMAL1 antibody revealed a robust circadian binding pattern of BMAL1 to target gene promoters. This filtration protocol constitutes an easily applicable method to isolate high-quality nuclei from cross-linked skeletal muscle tissue, allowing consistent sample processing for circadian and other time-sensitive studies. In combination with next-generation sequencing (NGS), our method can be deployed for various mechanistic and genomic studies focusing on skeletal muscle function.
U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--LIST OF STANDARD OPERATING PROCEDURES
This document lists available protocols and SOPs for the U.S.-Mexico Border Program study. It identifies protocols and SOPs for the following study components: (1) Sample collection and field operations, (2) Sample analysis, (3) General laboratory procedures, (4) Quality Assuranc...
21 CFR 660.46 - Samples; protocols; official release.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Samples; protocols; official release. 660.46 Section 660.46 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS ADDITIONAL STANDARDS FOR DIAGNOSTIC SUBSTANCES FOR LABORATORY TESTS Hepatitis B Surface...
Preparation of Total RNA from Fission Yeast.
Bähler, Jürg; Wise, Jo Ann
2017-04-03
Treatment with hot phenol breaks open fission yeast cells and begins to strip away bound proteins from RNA. Deproteinization is completed by multiple extractions with chloroform/isoamyl alcohol and separation of the aqueous and organic phases using MaXtract gel, an inert material that acts as a physical barrier between the phases. The final step is concentration of the RNA by ethanol precipitation. The protocol can be used to prepare RNA from several cultures grown in parallel, but it is important not to process too many samples at once because delays can be detrimental to RNA quality. A reasonable number of samples to process at once would be three to four for microarray or RNA sequencing analyses and six for preliminary investigations of mutants implicated in RNA metabolism. © 2017 Cold Spring Harbor Laboratory Press.
Samimi, Goli; Trabert, Britton; Duggan, Máire A; Robinson, Jennifer L; Coa, Kisha I; Waibel, Elizabeth; Garcia, Edna; Minasian, Lori M; Sherman, Mark E
2018-03-01
Many high-grade serous carcinomas initiate in the fallopian tubes as serous tubal intraepithelial carcinoma (STIC), a microscopic lesion identified with specimen processing according to the Sectioning and Extensive Examination of the Fimbria (SEE-Fim) protocol. Given that the tubal origin of these cancers was only recently recognized, we conducted a survey of pathology practices to assess the processing protocols applied to gynecologic surgical pathology specimens in clinical contexts in which finding STIC might have different implications. We distributed a survey electronically to the American Society for Clinical Pathology listserv to determine practice patterns and compared results between practice types by chi-square (χ2) tests for categorical variables. Free-text comments were qualitatively reviewed. Survey responses were received from 159 laboratories (72 academic, 87 non-academic), which reported diverse specimen volumes and percentages of gynecologic samples. Overall, 74.1% of laboratories reported performing SEE-Fim for risk-reducing surgical specimens (82.5% academic versus 65.7% non-academic, p < 0.05). In specimens from surgery for benign indications in which initial microscopic sections showed an unanticipated suspicious finding, 75.9% of laboratories reported using SEE-Fim to process the remainder of the specimen (94.8% academic versus 76.4% non-academic, p < 0.01), and 84.6% submitted the entire fimbriae. Changes in the theories of pathogenesis of high-grade serous carcinoma have led to implementation of pathology specimen processing protocols that include detailed analysis of the fallopian tubes. These results have implications for interpreting trends in cancer incidence data and for considering the feasibility of developing a bank of gynecologic tissues containing STIC or early cancer precursors. Published by Elsevier Inc.
Yan, Zhinong; Vorst, Keith L; Zhang, Lei; Ryser, Elliot T
2007-05-01
A novel one-ply composite tissue (CT) method using the Soleris (formerly BioSys) optical analysis system was compared with the conventional U.S. Department of Agriculture (USDA) environmental sponge enrichment method for recovery of Listeria from food contact surfaces and poultry-processing environments. Stainless steel and high-density polyethylene plates were inoculated to contain a six-strain L. monocytogenes cocktail at 10^4, 10^2, and 10 CFU per plate, whereas samples from naturally contaminated surfaces and floor drains from a poultry-processing facility were collected with CTs and environmental sponges. CT samples were transferred into Soleris system vials, and presumptive-positive samples were further confirmed. Sponge samples were processed for Listeria using the USDA culture method. L. monocytogenes recovery rates from inoculated stainless steel and polyethylene surfaces were then compared for the two methods in terms of sensitivity, specificity, and positive and negative predictive values. No significant differences (P > 0.05) were found between the two methods for recovery of L. monocytogenes from any of the inoculated stainless steel and polyethylene surfaces or environmental samples. Sensitivity, specificity, and overall accuracy of the CT-Soleris method for recovery of Listeria from environmental samples were 83, 97, and 95%, respectively. Listeria was detected 2 to 3 days sooner with the CT-Soleris method than with the USDA culture method, thus supporting the increased efficacy of this new protocol for environmental sampling.
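The sensitivity, specificity, and accuracy figures above follow from standard confusion-matrix definitions. A minimal sketch, using hypothetical counts chosen only so that the rounded results resemble the reported 83, 97, and 95% (the study's raw counts are not given here):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard performance measures for a detection method vs. a reference."""
    return {
        "sensitivity": tp / (tp + fn),               # true-positive rate
        "specificity": tn / (tn + fp),               # true-negative rate
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "ppv": tp / (tp + fp),                       # positive predictive value
        "npv": tn / (tn + fn),                       # negative predictive value
    }

# Hypothetical counts, not the study's data
m = diagnostic_metrics(tp=10, fp=3, tn=87, fn=2)
print(round(m["sensitivity"], 2), round(m["specificity"], 2),
      round(m["accuracy"], 2))  # 0.83 0.97 0.95
```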
Holmberg, Rebecca C; Gindlesperger, Alissa; Stokes, Tinsley; Brady, Dane; Thakore, Nitu; Belgrader, Philip; Cooney, Christopher G; Chandler, Darrell P
2013-06-11
TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols using Eppendorf epMotion 5070, Hamilton STAR, and Hamilton STARplus liquid-handling robots: RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma, respectively.
Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2017-07-01
There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography has the capability to work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates (under visible light, fluorescence, and fluorescence-quenching conditions, and using the visualization reagent phosphomolybdic acid) enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool.
The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.
2012-01-01
Background Variability among stallions in terms of semen cryopreservation quality renders it difficult to arrive at a standardized cryopreservation method. Different extenders and processing techniques (such as colloidal centrifugation) are used in order to optimize post-thaw sperm quality. Sperm chromatin integrity analysis is an effective tool for assessing such quality. The aim of the present study was to compare the effect of two single layer colloidal centrifugation protocols (prior to cryopreservation) in combination with three commercial freezing extenders on the post-thaw chromatin integrity of equine sperm samples at different post-thaw incubation (37°C) times (i.e., their DNA fragmentation dynamics). Results Post-thaw DNA fragmentation levels in semen samples subjected to either of the colloidal centrifugation protocols were significantly lower (p<0.05) immediately after thawing and after 4 h of incubation at 37°C compared to samples that underwent standard (control) centrifugation. The use of InraFreeze® extender was associated with significantly less DNA fragmentation than the use of Botu-Crio® extender at 6 h of incubation, and than the use of either Botu-Crio® or Gent® extender at 24 h of incubation (p<0.05). Conclusions These results suggest that single layer colloidal centrifugation performed with extended or raw semen prior to cryopreservation reduces DNA fragmentation during the first four hours after thawing. Further studies are needed to determine the influence of freezing extenders on equine sperm DNA fragmentation dynamics. PMID:23217215
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-03
... Collection; Comment Request; Protocol for Access to Tissue Specimen Samples From the National Marine Mammal Tissue Bank AGENCY: National Oceanic and Atmospheric Administration (NOAA), Commerce. ACTION: Notice... National Marine Mammal Tissue Bank (NMMTB) was established by the National Marine Fisheries Service (NMFS...
Development of reagents for immunoassay of Phytophthora ramorum in nursery water samples
Douglas G. Luster; Timothy Widmer; Michael McMahon; C. André Lévesque
2017-01-01
Current regulations under the August 6, 2014 USDA APHIS Official Regulatory Protocol (Confirmed Nursery Protocol: Version 8.2) for Nurseries Containing Plants Infected with Phytophthora ramorum mandate the sampling of water in affected nurseries to demonstrate that they are free of P. ramorum. Currently, detection of
A MORE COST-EFFECTIVE EMAP BENTHIC MACROFAUNAL SAMPLING PROTOCOL
Benthic macrofaunal sampling protocols in the U.S. Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP) are to collect 30 to 50 random benthic macrofauna [defined as animals retained on a 0.5 mm (East and Gulf Coasts, USA) or a 1.0 mm mesh siev...
21 CFR 660.6 - Samples; protocols; official release.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Research, determines that the reliability and consistency of the finished product can be assured with a... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Samples; protocols; official release. 660.6 Section 660.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
21 CFR 660.6 - Samples; protocols; official release.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Research, determines that the reliability and consistency of the finished product can be assured with a... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Samples; protocols; official release. 660.6 Section 660.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED...
STANDARD MEASUREMENT PROTOCOLS - FLORIDA RADON RESEARCH PROGRAM
The manual, in support of the Florida Radon Research Program, contains standard protocols for key measurements where data quality is vital to the program. It contains two sections. The first section, soil measurements, contains field sampling protocols for soil gas permeability and...
Sperm Cell Population Dynamics in Ram Semen during the Cryopreservation Process
Ramón, Manuel; Pérez-Guzmán, M. Dolores; Jiménez-Rabadán, Pilar; Esteso, Milagros C.; García-Álvarez, Olga; Maroto-Morales, Alejandro; Anel-López, Luis; Soler, Ana J.; Fernández-Santos, M. Rocío; Garde, J. Julián
2013-01-01
Background Sperm cryopreservation has become an indispensable tool in biology. Initially, studies were aimed towards the development of efficient freezing protocols in different species that would allow for efficient storage of semen samples for long periods of time, ensuring their viability. Nowadays, it is widely known that an important individual component exists in the cryoresistance of semen, and efforts are aimed at identifying those sperm characteristics that may allow us to predict this cryoresistance. This knowledge would lead, ultimately, to the design of freezing protocols optimized for the sperm characteristics of each male. Methodology/Principal Findings We have evaluated the changes that occur in sperm head dimensions throughout the cryopreservation process. We found three different patterns of response, each related to a different sperm quality at thawing. We were able to characterize males based on these patterns. For each male, the pattern remained constant among different ejaculates. This implies that males always respond in the same way to freezing, giving even more importance to this sperm feature. Conclusions/Significance Changes in the sperm head during the cryopreservation process have proved useful for identifying the freezability of each male's semen. We suggest that analysis of these response patterns would represent an important tool for characterizing the cryoresistance of males when implemented within breeding programs. We also propose follow-up experiments to examine the outcomes of using different freezing protocols depending on each male's pattern of response. PMID:23544054
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (R_dark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of R_dark is essential for agronomic and ecological studies. However, current methods used to measure R_dark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of R_dark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than the other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of R_dark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant R_dark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute R_dark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of R_dark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of R_dark on multiple samples simultaneously, irrespective of plant or tissue type.
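Whatever the instrument, a dark respiration rate is ultimately recovered as the rate of O2 drawdown over time. A minimal sketch, assuming a linear least-squares fit to a hypothetical O2 trace (units and values illustrative only):

```python
import numpy as np

def respiration_rate(time_min, o2_umol):
    """O2 consumption rate (umol/min) as the negative slope of a
    least-squares line fitted to the O2 drawdown trace."""
    slope, _intercept = np.polyfit(time_min, o2_umol, 1)
    return -slope

# Hypothetical trace: O2 falls by 2 umol per minute
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
o2 = np.array([250.0, 248.0, 246.0, 244.0, 242.0])
print(respiration_rate(t, o2))  # ~2.0
```

Fitting a slope rather than differencing endpoints damps sensor noise, which matters for the multi-hour stability claims above.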
Román, Belén; González-Verdejo, Clara I; Peña, Francisco; Nadal, Salvador; Gómez, Pedro
2012-01-01
Quality and integrity of RNA are critical for transcription studies in plant molecular biology. In squash fruit and other high-water-content crops, grinding tissue with mortar and pestle in liquid nitrogen fails to produce the homogeneous, finely powdered sample desirable to ensure good penetration of the extraction reagent. To develop an improved pulverisation method to facilitate the homogenisation of squash fruit tissue prior to RNA extraction without reducing the quality and yield of the extracted RNA. Three methods of pulverisation, each followed by the same extraction protocol, were compared. The first approach consisted of lyophilisation of the sample in order to remove the excess water before grinding, the second used a cryogenic mill, and the control used mortar grinding of frozen tissue. The quality of the isolated RNA was tested by carrying out quantitative real-time downstream amplification. In the three situations considered, mean values for A260/A280 indicated minimal interference by proteins, and RNA quality indicator (RQI) values were considered appropriate for quantitative real-time polymerase chain reaction (qRT-PCR) amplification. Successful qRT-PCR amplifications were obtained with cDNA isolated with the three protocols. Both apparatus can improve and facilitate the grinding step in the RNA extraction process in zucchini, resulting in isolated RNA of high quality and integrity as revealed by the qRT-PCR downstream application. This is apparently the first time that a cryogenic mill has been used to prepare fruit samples for RNA extraction, thereby improving the sampling strategy because the fine powder obtained represents a homogeneous mix of the organ tissue. Copyright © 2012 John Wiley & Sons, Ltd.
Representing the work of medical protocols for organizational simulation.
Fridsma, D. B.
1998-01-01
Developing and implementing patient care protocols within a specific organizational setting requires knowledge of the protocol, the organization, and the way in which the organization does its work. Computer-based simulation tools have been used in many industries to give managers prospective insight into mismatches between work processes and organization design. Many of these simulation tools are designed for well-understood, routine work processes with few contingent tasks. In this paper, we describe theoretical extensions that make it possible to simulate medical protocols within an information-processing theory framework. These simulations will allow medical administrators to test different protocol and organizational designs before actually using them within a particular clinical setting. PMID:9929231
Brandão, Marcelo L L; Almeida, Davi O; Bispo, Fernanda C P; Bricio, Silvia M L; Marin, Victor A; Miagostovich, Marize P
2014-05-01
This study aimed to assess the microbiological contamination of lettuces commercialized in Rio de Janeiro, Brazil, by investigating the detection of norovirus genogroup II (NoV GII), Salmonella spp., total and fecal coliforms, and Escherichia coli. For NoV detection, samples were processed using the adsorption-elution concentration method associated with real-time quantitative polymerase chain reaction (qPCR). A total of 90 samples of lettuce, including 30 whole fresh lettuces, 30 minimally processed (MP) lettuces, and 30 raw ready-to-eat (RTE) lettuce salads, were randomly collected from different supermarkets (fresh and MP lettuce samples), food services, and self-service restaurants (RTE lettuce salads), all located in Rio de Janeiro, Brazil, from October 2010 to December 2011. NoV GII was not detected, and the PP7 bacteriophage used as an internal control process (ICP) was recovered in 40.0%, 86.7%, and 76.7% of those samples, respectively. Salmonella spp. was not detected, although fecal contamination was observed, with fecal coliform concentrations higher than 10² most probable number/g. E. coli was detected in 70.0%, 6.7%, and 30.0% of fresh, MP, and RTE samples, respectively. This study highlights the need to improve hygiene procedures at all stages of vegetable production and demonstrates the utility of the PP7 bacteriophage as an ICP for methods recovering RNA viruses from MP and RTE lettuce samples, encouraging the evaluation of new protocols that facilitate the establishment of methodologies for NoV detection in a greater number of food microbiology laboratories. The PP7 bacteriophage can be used as an internal control process in methods for recovering RNA viruses from minimally processed and ready-to-eat lettuce samples. © 2014 Institute of Food Technologists®
Microbial Groundwater Sampling Protocol for Fecal-Rich Environments
Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William
2014-01-01
Inherently, confined animal farming operations (CAFOs) and other fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges, in addition to economic constraints, when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows a large number of samples to be collected quickly, relatively inexpensively, and under field conditions with limited access to equipment-sterilization capacity. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging successfully controls not only for equipment cross-contamination but also for significant contamination at the well-head, within the well casing, and within the immediate aquifer vicinity of the well screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization; they require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186
All Plasma Products Are Not Created Equal: Characterizing Differences Between Plasma Products
2015-06-01
2011;6(4):e18812. 24. Chandler WL. Microparticle counts in platelet-rich and platelet-free plasma, effect of centrifugation and sample-processing protocols...used throughout the article for this product. Laboratory Methods: Platelet-Poor Plasma Preparation. Platelet-poor plasma (PPP) was prepared by centrifuga...platelets, respectively. Flow cytometry was performed as described by Matijevic et al.4 Briefly, 10 µL of each plasma product was incubated with
EDRN Standard Operating Procedures (SOP) — EDRN Public Portal
The NCI’s Early Detection Research Network is developing a number of standard operating procedures for assays, methods, and protocols for the collection and processing of biological samples, as well as other reference materials, to help investigators conduct experiments in a consistent, reliable manner. These SOPs are established by the investigators of the Early Detection Research Network to maintain consistency throughout the Network. These SOPs neither represent a consensus nor are they recommendations of the NCI.
Sun, Ruoyu; Enrico, Maxime; Heimbürger, Lars-Eric; Scott, Clint; Sonke, Jeroen E
2013-08-01
High-precision mercury (Hg) stable isotopic analysis requires relatively large amounts of Hg (>10 ng). Consequently, the extraction of Hg from natural samples with low Hg concentrations (<1-20 ng/g) by wet chemistry is challenging. Combustion-trapping techniques have been shown to be an appropriate alternative. Here, we detail a modified off-line Hg pre-concentration protocol based on combustion and trapping. Hg in solid samples is thermally reduced and volatilized in a pure O2 stream using a temperature-programmed combustion furnace. A second furnace, kept at 1,000 °C, decomposes combustion products into H2O, CO2, SO2, etc. The O2 carrier gas, including combustion products and elemental Hg, is then purged into a 40% (v/v) acid-trapping solution. The method was optimized by assessing the variations in Hg pre-concentration efficiency and Hg isotopic composition as a function of acid ratio, gas flow rate, and temperature ramp rate for two certified reference materials of bituminous coals. An acid ratio of 2HNO3/1HCl (v/v), a 25 mL/min O2 flow rate, and a dynamic temperature ramp rate (15 °C/min for 25-150 and 600-900 °C; 2.5 °C/min for 150-600 °C) were found to give optimal results. Hg step-release experiments indicated that significant Hg isotopic fractionation occurred during sample combustion. However, no systematic dependence of Hg isotopic composition on Hg recovery (81-102%) was observed. Tests on 340 samples, including coal, coal-associated rocks, fly ash, bottom ash, peat, and black shale sediments with Hg concentrations varying from <5 ng/g to 10 μg/g, showed that most Hg recoveries were within the acceptable range of 80-120%. This protocol has the advantages of a short sample processing time (∼3.5 h) and limited transfer of residual sample matrix into the Hg trapping solution. This in turn limits matrix interferences on the Hg reduction efficiency of the cold vapor generator used for Hg isotopic analysis.
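The stated ∼3.5 h processing time is consistent with the dynamic temperature program alone; a quick check of the segment durations (boundaries and rates taken from the abstract, the script itself being only an illustrative sketch):

```python
# Total ramp time for the dynamic combustion temperature program:
# 15 °C/min for 25-150 °C and 600-900 °C; 2.5 °C/min for 150-600 °C.
segments = [
    (25, 150, 15.0),   # start °C, end °C, rate °C/min
    (150, 600, 2.5),
    (600, 900, 15.0),
]

total_min = sum((hi - lo) / rate for lo, hi, rate in segments)
print(f"ramp time: {total_min:.1f} min ({total_min / 60:.2f} h)")
# ramp time: 208.3 min (3.47 h)
```

The slow 2.5 °C/min segment through 150-600 °C dominates the schedule, which explains why the whole run fits in roughly three and a half hours.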
Ahn, Jae-Jun; Sanyal, Bhaskar; Akram, Kashif; Kwon, Joong-Ho
2014-11-19
Different spices such as turmeric, oregano, and cinnamon were γ-irradiated at 1 and 10 kGy. The electron paramagnetic resonance (EPR) spectra of the nonirradiated samples were characterized by a single central signal (g = 2.006), the intensity of which was significantly enhanced upon irradiation. The EPR spectra of the irradiated spice samples were characterized by an additional triplet signal at g = 2.006 with a hyperfine coupling constant of 3 mT, associated with the cellulose radical. EPR analysis on various sample pretreatments in the irradiated spice samples demonstrated that the spectral features of the cellulose radical varied on the basis of the pretreatment protocol. Alcoholic extraction pretreatment produced considerable improvements of the EPR signals of the irradiated spice samples relative to the conventional oven and freeze-drying techniques. The alcoholic extraction process is therefore proposed as the most suitable sample pretreatment for unambiguous detection of irradiated spices by EPR spectroscopy.
A novel method of genomic DNA extraction for Cactaceae1
Fehlberg, Shannon D.; Allen, Jessica M.; Church, Kathleen
2013-01-01
• Premise of the study: Genetic studies of Cactaceae can at times be impeded by difficult sampling logistics and/or high mucilage content in tissues. Simplifying sampling and DNA isolation through the use of cactus spines has not previously been investigated. • Methods and Results: Several protocols for extracting DNA from spines were tested and modified to maximize yield, amplification, and sequencing. Sampling of and extraction from spines resulted in a simplified protocol overall and complete avoidance of mucilage as compared to typical tissue extractions. Sequences from one nuclear and three plastid regions were obtained across eight genera and 20 species of cacti using DNA extracted from spines. • Conclusions: Genomic DNA useful for amplification and sequencing can be obtained from cactus spines. The protocols described here are valuable for any cactus species, but are particularly useful for investigators interested in sampling living collections, extensive field sampling, and/or conservation genetic studies. PMID:25202521
A General Purpose Connections type CTI Server Based on SIP Protocol and Its Implementation
NASA Astrophysics Data System (ADS)
Watanabe, Toru; Koizumi, Hisao
In this paper, we propose a general-purpose connection-type CTI (Computer Telephony Integration) server that provides various CTI services, such as voice logging, in which the CTI server communicates with an IP-PBX using SIP (Session Initiation Protocol) and accumulates the voice packets of external-line telephone calls flowing between an extension IP telephone and a VoIP gateway connected to outside line networks. The CTI server realizes CTI services such as voice logging, telephone conferencing, and IVR (interactive voice response) by accumulating and processing the sampled voice packets. Furthermore, the CTI server incorporates a web server function that can provide various CTI services, such as a web telephone directory, via a web browser to PCs, cellular telephones, or smartphones in mobile environments.
Study of microtip-based extraction and purification of DNA from human samples for portable devices
NASA Astrophysics Data System (ADS)
Fotouhi, Gareth
DNA sample preparation is essential for genetic analysis. However, rapid and easy-to-use methods are a major challenge to obtaining genetic information. Furthermore, DNA sample preparation technology must follow the growing need for point-of-care (POC) diagnostics. The current use of centrifuges, large robots, and laboratory-intensive protocols has to be minimized to meet the global challenge of limited access healthcare by bringing the lab to patients through POC devices. To address these challenges, a novel extraction method of genomic DNA from human samples is presented by using heat-cured polyethyleneimine-coated microtips generating a high electric field. The microtip extraction method is based on recent work using an electric field and capillary action integrated into an automated device. The main challenges to the method are: (1) to obtain a stable microtip surface for the controlled capture and release of DNA and (2) to improve the recovery of DNA from samples with a high concentration of inhibitors, such as human samples. The present study addresses these challenges by investigating the heat curing of polyethyleneimine (PEI) coated on the surface of the microtip. Heat-cured PEI-coated microtips are shown to control the capture and release of DNA. Protocols are developed for the extraction and purification of DNA from human samples. Heat-cured PEI-coated microtip methods of DNA sample preparation are used to extract genomic DNA from human samples. It is discovered through experiment that heat curing of a PEI layer on a gold-coated surface below 150°C could inhibit the signal of polymerase chain reaction (PCR). Below 150°C, the PEI layer is not completely cured and dissolved off the gold-coated surface. Dissolved PEI binds with DNA to inhibit PCR. Heat curing of a PEI layer above 150°C on a gold-coated surface prevents inhibition to PCR and gel electrophoresis. 
In comparison to gold-coated microtips, the 225°C-cured PEI-coated microtips improve the recovery of DNA to 45% efficiency. Furthermore, the 225°C-cured PEI-coated microtips recover more DNA than gold-coated microtips when the surface is washed. Heat-cured (225°C) PEI-coated microtips are used for the recovery of human genomic DNA from whole blood. A washing protocol is developed to remove inhibiting particles bound to the PEI-coated microtip surface after DNA extraction. From 1.25 µL of whole blood, an average of 1.83 ng of human genomic DNA is captured, purified, and released using a 225°C-cured PEI-coated microtip in less than 30 minutes. The extracted DNA is profiled by short tandem repeat (STR) analysis. For forensic and medical applications, genomic DNA is extracted from dried samples using heat-cured PEI-coated microtips integrated into an automated device. DNA extraction from dried samples is critical for forensics, and the use of dried samples in the medical field is increasing because they are convenient for storage, biosafety, and contamination control. The main challenge is the time required to properly extract DNA in purified form. Typically, a 1 hour incubation period is required to complete this process, and overnight incubation is sometimes necessary. To address this challenge, a pre-extraction washing step is investigated to remove inhibiting particles from dried blood spots (DBS) before DNA is released from the dried form into solution for microtip extraction. The developed protocol is expanded to extract DNA from a variety of dried samples, including nasal swabs, buccal swabs, and other forensic samples. In comparison to a commercial kit, microtip-based extraction reduced the processing time from 1.5 hours to 30 minutes or less with an equivalent concentration of extracted DNA from dried blood spots. The developed assay will benefit genetic studies on newborn screening, forensic investigation, and POC diagnostics.
Kopek, Benjamin G.; Paez-Segala, Maria G.; Shtengel, Gleb; Sochacki, Kem A.; Sun, Mei G.; Wang, Yalin; Xu, C. Shan; van Engelenburg, Schuyler B.; Taraska, Justin W.; Looger, Loren L.; Hess, Harald F.
2017-01-01
Our groups have recently developed related approaches to sample preparation for super-resolution imaging within endogenous cellular environments using correlative light and electron microscopy (CLEM). Four distinct techniques for preparing and acquiring super-resolution CLEM datasets on aldehyde-fixed specimens are provided, including Tokuyasu cryosectioning, whole-cell mount, cell unroofing and platinum replication, and resin embedding and sectioning. The choice of the best protocol for a given application depends on a number of criteria that are discussed in detail. Tokuyasu cryosectioning is relatively rapid but is limited to small, delicate specimens. Whole-cell mount has the simplest sample preparation but is restricted to surface structures. Cell unroofing and platinum replication creates high-contrast, three-dimensional images of the cytoplasmic surface of the plasma membrane, but is more challenging than whole-cell mount. Resin embedding permits serial sectioning of large samples, but is limited to osmium-resistant probes and is technically difficult. Expected results from these protocols include super-resolution localization (~10–50 nm) of fluorescent targets within the context of electron microscopy ultrastructure, which can help address cell biological questions. These protocols can be completed in 2–7 days, are compatible with a number of super-resolution imaging protocols, and are broadly applicable across biology. PMID:28384138
A rapid and efficient DNA extraction protocol from fresh and frozen human blood samples.
Guha, Pokhraj; Das, Avishek; Dutta, Somit; Chaudhuri, Tapas Kumar
2018-01-01
Different methods available for the extraction of human genomic DNA suffer from one or more drawbacks, including low yield, compromised quality, cost, time consumption, and the use of toxic organic solvents. Herein, we aimed to develop a method to extract DNA from 500 μL of fresh or frozen human blood. Five hundred microliters of fresh and frozen human blood samples were used for standardization of the extraction procedure. Absorbance at 260 and 280 nm (A260/A280) was measured to check the quality and quantity of the extracted DNA sample. Qualitative assessment of the extracted DNA was performed by polymerase chain reaction and double digestion of the DNA sample. Our protocol resulted in average yields of 22±2.97 μg and 20.5±3.97 μg from 500 μL of fresh and frozen blood, respectively, which were comparable to many reference protocols and kits. Besides yielding a bulk amount of DNA, our protocol is rapid, economical, and avoids toxic organic solvents such as phenol. Because its quality is unaffected, the DNA is suitable for downstream applications. The protocol may also be useful for basic molecular research in laboratories with limited funds. © 2017 Wiley Periodicals, Inc.
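The A260/A280 quality check and yield estimate described above follow standard UV spectrophotometry conventions; a minimal sketch, assuming the usual 50 ng/µL-per-A260-unit factor for double-stranded DNA (the function names and the absorbance readings below are illustrative, not from the study):

```python
def dna_concentration_ng_per_ul(a260: float, dilution: float = 1.0) -> float:
    """Estimate dsDNA concentration: one A260 unit ≈ 50 ng/µL (standard convention)."""
    return a260 * 50.0 * dilution

def purity_ratio(a260: float, a280: float) -> float:
    """A260/A280 ratio; values near 1.8 suggest minimal protein contamination."""
    return a260 / a280

# Illustrative readings for a 10x-diluted aliquot (hypothetical values):
a260, a280 = 0.44, 0.24
conc = dna_concentration_ng_per_ul(a260, dilution=10.0)
ratio = purity_ratio(a260, a280)
print(f"{conc:.0f} ng/µL, A260/A280 = {ratio:.2f}")
# 220 ng/µL, A260/A280 = 1.83
```

At such a concentration, a ~100 µL eluate would carry on the order of the 20 µg yields reported, so the two figures of merit are easy to cross-check in routine use.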
Use of a Filter Cartridge for Filtration of Water Samples and Extraction of Environmental DNA.
Miya, Masaki; Minamoto, Toshifumi; Yamanaka, Hiroki; Oka, Shin-Ichiro; Sato, Keiichi; Yamamoto, Satoshi; Sado, Tetsuya; Doi, Hideyuki
2016-11-25
Recent studies have demonstrated that environmental DNA (eDNA) from fishes is appropriate as a non-invasive monitoring tool. Most of these studies employed disk fiber filters to collect eDNA from water samples, although a number of microbial studies in aquatic environments have employed filter cartridges, because the cartridge has the advantage of accommodating large water volumes and of overall ease of use. Here we provide a protocol for the filtration of water samples using the filter cartridge and the extraction of eDNA from the filter without having to cut open the housing. The main portions of this protocol consist of (1) filtration of water samples (water volumes ≤4 L or >4 L); (2) extraction of DNA on the filter using a roller shaker placed in a preheated incubator; and (3) purification of DNA using a commercial kit. Using this and previously used protocols, we perform metabarcoding analysis of eDNA taken from a huge aquarium tank (7,500 m³) with known species composition, and show the number of detected species per library from the two protocols as the representative results. This protocol has been developed for metabarcoding eDNA from fishes, but is also applicable to eDNA from other organisms.
Lo Torto, Federico; Relucenti, Michela; Familiari, Giuseppe; Vaia, Nicola; Casella, Donato; Matassa, Roberto; Miglietta, Selenia; Marinozzi, Franco; Bini, Fabiano; Fratoddi, Ilaria; Sciubba, Fabio; Cassese, Raffaele; Tombolini, Vincenzo; Ribuffo, Diego
2018-05-17
The pathogenic mechanism underlying capsular contracture is still unknown. It is certainly a multifactorial process, resulting from the human body's reaction, biofilm activation, bacteremic seeding, or silicone exposure. The scope of the present article is to investigate the effect of a hypofractionated radiotherapy protocol (2.66 Gy × 16 sessions) on both silicone and polyurethane breast implants. Silicone and polyurethane implants underwent irradiation according to a hypofractionated radiotherapy protocol for the treatment of breast cancer. After irradiation, the implant shells underwent mechanical, chemical, and microstructural evaluation by means of tensile testing, infrared spectroscopy in attenuated total reflectance mode, nuclear magnetic resonance, and field emission scanning electron microscopy. On superficial analysis, irradiated silicone samples show several visible secondary and tertiary blebs. Polyurethane implants showed an open-cell structure that closely resembles a sponge. Morphological observation of struts from the treated polyurethane sample shows a more compact structure, with significantly shorter and thicker struts compared with the untreated sample. The infrared spectra in attenuated total reflectance mode of irradiated and control samples were compared for both silicone and polyurethane samples. In the case of silicone-based membranes, treated and control specimens showed similar bands, with small differences in the treated ones. Nuclear magnetic resonance spectra of the fraction soluble in CDCl3 support these observations. Tensile tests on silicone samples showed softer behavior in the treated ones, whereas tensile tests on polyurethane samples showed no significant differences. Polyurethane implants seem to be more resistant to radiotherapy damage, whereas silicone prostheses showed more structural, mechanical, and chemical modifications.
Nagy, Bálint; Bán, Zoltán; Papp, Zoltán
2005-10-01
The quality and quantity of isolated DNA affect PCR amplification. The authors studied the effect of three DNA isolation protocols (a resin-binding method using fresh and frozen amniotic fluid samples, and a silica adsorption method using fresh samples) on the quantity and quality of the isolated DNA. Amniotic fluid samples were obtained from 20 pregnant women. The isolated DNA concentrations were determined by real-time fluorimetry using the SYBR Green I method. Each sample was studied for the presence of 8 STR markers. The authors compared the number of detected alleles, the electrophoretograms, and the peak areas. There were significant differences in the concentration of the obtained DNA and in the peak areas among the three isolation protocols. The numbers of detected alleles also differed: the most allele dropouts were observed with the resin-type DNA isolation protocol on fresh samples (182 alleles detected), followed by the resin-binding protocol on frozen samples (243 alleles detected) and the silica adsorption method (264 alleles detected). The authors demonstrated that the DNA isolation method affects the quantity and quality of the isolated DNA, and hence subsequent PCR amplifications.
Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.
Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María
2017-01-01
This study aims to provide recommendations concerning the validation of analytical protocols using routine samples. It is intended as a case study on how to validate analytical methods in different environmental matrices. To analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols through the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for water samples was estimated through a conventional approach, whereas for the sediment matrices the estimation of proportional/constant bias is also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Routine samples are rarely used to assess the trueness of novel analytical methods, and until now this methodology had not been applied to organochlorine compounds in environmental matrices.
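A combined uncertainty of the magnitude reported (20-30% for the water matrix) is conventionally obtained by root-sum-of-squares combination of relative standard uncertainty components, as in the GUM approach; a minimal sketch with hypothetical component values (the specific components and their sizes are illustrative, not taken from the study):

```python
import math

def combined_relative_uncertainty(components: list[float]) -> float:
    """Root-sum-of-squares combination of relative standard uncertainties (GUM-style)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical components (fractions, not from the study):
# intermediate-precision repeatability, calibration, recovery correction
u = combined_relative_uncertainty([0.15, 0.10, 0.12])
print(f"combined relative uncertainty: {u:.1%}")
# combined relative uncertainty: 21.7%
```

Because the components add in quadrature, the largest term dominates: here the 15% precision term alone accounts for most of the 21.7% result, which is one reason intermediate-condition variability is the focus of the validation.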
Evaluation of Patient Handoff Methods on an Inpatient Teaching Service
Craig, Steven R.; Smith, Hayden L.; Downen, A. Matthew; Yost, W. John
2012-01-01
Background: The patient handoff process can be a highly variable and unstructured period at risk for communication errors. The morning sign-in process used by resident physicians at teaching hospitals typically involves less rigorous handoff protocols than the resident evening sign-out process. Little research has been conducted on best practices for handoffs during morning sign-in exchanges between resident physicians, and studies are needed to evaluate optimal protocols for the resident morning sign-in process. Methods: Three morning handoff protocols consisting of written, electronic, and face-to-face methods were implemented over 3 study phases during an academic year. Study participants included all interns covering the internal medicine inpatient teaching service at a tertiary hospital. Study measures entailed intern survey-based interviews analyzed for failures in handoff protocols with or without missed pertinent information. Descriptive and comparative analyses examined study phase differences. Results: A scheduled face-to-face handoff process had the fewest protocol deviations and demonstrated the best communication of essential patient care information between cross-covering teams compared to the written and electronic sign-in protocols. Conclusion: Intern patient handoffs were more reliable when the sign-in protocol included scheduled face-to-face meetings. This method provided the best communication of patient care information and allowed for open exchanges of information. PMID:23267259
Crystallization of Macromolecules
Friedmann, David; Messick, Troy; Marmorstein, Ronen
2014-01-01
X-ray crystallography has evolved into a very powerful tool to determine the three-dimensional structure of macromolecules and macromolecular complexes. The major bottleneck in structure determination by X-ray crystallography is the preparation of suitable crystalline samples. This unit outlines steps for the crystallization of a macromolecule, starting with a purified, homogeneous sample. The first protocols describe preparation of the macromolecular sample (i.e., proteins, nucleic acids, and macromolecular complexes). The preparation and assessment of crystallization trials is then described, along with a protocol for confirming whether the crystals obtained are composed of macromolecule as opposed to a crystallization reagent. Next, the optimization of crystallization conditions is presented. Finally, protocols that facilitate the growth of larger crystals through seeding are described. PMID:22045560
Palacio-Bielsa, Ana; Cubero, Jaime; Cambra, Miguel A; Collados, Raquel; Berruete, Isabel M; López, María M
2011-01-01
Xanthomonas arboricola pv. pruni, the causal agent of bacterial spot disease of stone fruit, is considered a quarantine organism by the European Union and the European and Mediterranean Plant Protection Organization (EPPO). The bacterium can undergo an epiphytic phase and/or be latent and can be transmitted by plant material, but currently, only visual inspections are used to certify plants as being X. arboricola pv. pruni free. A novel and highly sensitive real-time TaqMan PCR detection protocol was designed based on a sequence of a gene for a putative protein related to an ABC transporter ATP-binding system in X. arboricola pv. pruni. Pathogen detection can be completed within a few hours with a sensitivity of 10² CFU mL⁻¹, thus surpassing the sensitivity of the existing conventional PCR. Specificity was assessed for X. arboricola pv. pruni strains from different origins as well as for closely related Xanthomonas species, non-Xanthomonas species, saprophytic bacteria, and healthy Prunus samples. The efficiency of the developed protocol was evaluated with field samples of 14 Prunus species and rootstocks. For symptomatic leaf samples, the protocol was very efficient even when washed tissues of the leaves were directly amplified without any previous DNA extraction. For samples of 117 asymptomatic leaves and 285 buds, the protocol was more efficient after a simple DNA extraction, and X. arboricola pv. pruni was detected in 9.4% and 9.1% of the 402 samples analyzed, respectively, demonstrating its frequent epiphytic or endophytic phase. This newly developed real-time PCR protocol can be used as a quantitative assay, offers a reliable and sensitive test for X. arboricola pv. pruni, and is suitable as a screening test for symptomatic as well as asymptomatic plant material.
Consensus for second-order multi-agent systems with position sampled data
NASA Astrophysics Data System (ADS)
Wang, Rusheng; Gao, Lixin; Chen, Wenhai; Dai, Dameng
2016-10-01
In this paper, the consensus problem with position sampled data for second-order multi-agent systems is investigated. The interaction topology among the agents is depicted by a directed graph. Full-order and reduced-order observers with position sampled data are proposed, by which two kinds of sampled-data-based consensus protocols are constructed. With the proposed sampled protocols, the consensus convergence analysis of a continuous-time multi-agent system is equivalently transformed into that of a discrete-time system. Then, by using matrix theory and a sampled control analysis method, some sufficient and necessary consensus conditions based on the coupling parameters, the spectrum of the Laplacian matrix, and the sampling period are obtained. As the sampling period tends to zero, the established necessary and sufficient conditions degenerate to the continuous-time protocol case, consistent with the existing result for the continuous-time case. Finally, the effectiveness of the established results is illustrated by a simple simulation example. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LY13F030005) and the National Natural Science Foundation of China (Grant No. 61501331).
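The idea of driving double-integrator agents to consensus using only periodically sampled positions can be illustrated with a minimal simulation. This sketch deliberately simplifies the paper's setting: it uses an undirected ring graph rather than a directed one, a static protocol with absolute velocity damping instead of the observer-based protocols, and parameter values chosen only for illustration.

```python
import numpy as np

# Double-integrator agents: x' = v, v' = u, with positions sampled at period h.
# Simplified protocol: u_i = -sum_j a_ij * (x_i[k] - x_j[k]) - gamma * v_i
n = 4
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)    # undirected ring graph
L = np.diag(A.sum(axis=1)) - A                # graph Laplacian

gamma, h, dt, T = 2.0, 0.1, 0.01, 30.0        # damping, sampling period, step, horizon
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, n)                     # initial positions
v = np.zeros(n)                               # initial velocities
xs = x.copy()                                 # most recent sampled positions

for step in range(round(T / dt)):
    if step % round(h / dt) == 0:
        xs = x.copy()                         # sample positions every h seconds
    u = -L @ xs - gamma * v                   # control uses sampled positions only
    x, v = x + dt * v, v + dt * u             # Euler integration

print("final position spread:", x.max() - x.min())
```

For a sufficiently small sampling period the positions collapse to a common value, in line with the paper's result that the sampled conditions recover the continuous-time case as h tends to zero; increasing h eventually destabilizes the loop, which is exactly the trade-off the derived conditions quantify.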
Robust DNA Isolation and High-throughput Sequencing Library Construction for Herbarium Specimens.
Saeidi, Saman; McKain, Michael R; Kellogg, Elizabeth A
2018-03-08
Herbaria are an invaluable source of plant material that can be used in a variety of biological studies. The use of herbarium specimens is associated with a number of challenges including sample preservation quality, degraded DNA, and destructive sampling of rare specimens. In order to more effectively use herbarium material in large sequencing projects, a dependable and scalable method of DNA isolation and library preparation is needed. This paper demonstrates a robust, beginning-to-end protocol for DNA isolation and high-throughput library construction from herbarium specimens that does not require modification for individual samples. This protocol is tailored for low quality dried plant material and takes advantage of existing methods by optimizing tissue grinding, modifying library size selection, and introducing an optional reamplification step for low yield libraries. Reamplification of low yield DNA libraries can rescue samples derived from irreplaceable and potentially valuable herbarium specimens, negating the need for additional destructive sampling and without introducing discernible sequencing bias for common phylogenetic applications. The protocol has been tested on hundreds of grass species, but is expected to be adaptable for use in other plant lineages after verification. This protocol can be limited by extremely degraded DNA, where fragments do not exist in the desired size range, and by secondary metabolites present in some plant material that inhibit clean DNA isolation. Overall, this protocol introduces a fast and comprehensive method that allows for DNA isolation and library preparation of 24 samples in less than 13 h, with only 8 h of active hands-on time with minimal modifications.
NASA Technical Reports Server (NTRS)
Pang, Jackson; Pingree, Paula J.; Torgerson, J. Leigh
2006-01-01
We present the Telecommunications protocol processing subsystem using Reconfigurable Interoperable Gate Arrays (TRIGA), a novel approach that unifies fault tolerance, error correction coding and interplanetary communication protocol off-loading to implement CCSDS File Delivery Protocol and Datalink layers. The new reconfigurable architecture offers more than one order of magnitude throughput increase while reducing footprint requirements in memory, command and data handling processor utilization, communication system interconnects and power consumption.
A Family of ACO Routing Protocols for Mobile Ad Hoc Networks.
Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-Hoon
2017-05-22
In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are derived by successive refinements of the base protocol. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming based on shared memory, PAntOR allows running tasks in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR that uses more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads.
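The ACO core named in this abstract can be sketched in a few lines. The neighbor names, evaporation rate, and deposit rule below are illustrative assumptions, not AntOR's actual data structures: pheromone evaporates every round, ants that returned over shorter routes deposit more, and next-hop selection probabilities therefore concentrate on the best route.

```python
# Pheromone-table maintenance and next-hop bias (all values assumed).
rho, beta = 0.1, 2.0                      # evaporation rate, selection exponent
pheromone = {"n1": 1.0, "n2": 1.0, "n3": 1.0}

for _ in range(50):
    for n in pheromone:                   # evaporation on every entry
        pheromone[n] *= (1 - rho)
    pheromone["n1"] += rho * (1.0 / 2)    # ant returned via a 2-hop route
    pheromone["n2"] += rho * (1.0 / 5)    # ant returned via a 5-hop route

weights = {n: t ** beta for n, t in pheromone.items()}
total = sum(weights.values())
probs = {n: w / total for n, w in weights.items()}

assert pheromone["n1"] > pheromone["n2"] > pheromone["n3"]
assert probs["n1"] > 0.8                  # selection strongly favors n1
```

The hop-count-weighted deposit mirrors the abstract's "number of hops in the best routes" criterion; AntOR additionally keeps regular and virtual pheromone tables separate, which this sketch collapses into one.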
Jong, Stephanie T; Brown, Helen Elizabeth; Croxson, Caroline H D; Wilkinson, Paul; Corder, Kirsten L; van Sluijs, Esther M F
2018-05-21
Process evaluations are critical for interpreting and understanding outcome trial results. By understanding how interventions function across different settings, process evaluations have the capacity to inform future dissemination of interventions. The complexity of Get Others Active (GoActive), a 12-week, school-based physical activity intervention implemented in eight schools, highlights the need to investigate how implementation is achieved across a variety of school settings. This paper describes the mixed-methods GoActive process evaluation protocol that is embedded within the outcome evaluation. In this detailed process evaluation protocol, we describe the flexible and pragmatic methods that will be used for capturing the process evaluation data. A mixed-methods design will be used for the process evaluation, including quantitative data collected in both the control and intervention arms of the GoActive trial, and qualitative data collected in the intervention arm. Data collection methods will include purposively sampled semi-structured interviews and focus group interviews, direct observation, and participant questionnaires (completed by students, teachers, older adolescent mentors, and local authority-funded facilitators). Data will be analysed thematically within and across datasets. Overall synthesis of findings will address the process of GoActive implementation, and the mechanisms through which this process affects outcomes, with careful attention to the context of the school environment. This process evaluation will explore the experience of participating in GoActive from the perspectives of key groups, providing a greater understanding of the acceptability and process of implementation of the intervention across the eight intervention schools. This will allow for appraisal of the intervention's conceptual base, inform potential dissemination, and help optimise post-trial sustainability.
The process evaluation will also assist in contextualising the trial effectiveness results with respect to how the intervention may or may not have worked and, if it was found to be effective, what might be required for it to be sustained in the 'real world'. Furthermore, it will offer suggestions for the development and implementation of future initiatives to promote physical activity within schools. ISRCTN, ISRCTN31583496 . Registered on 18 February 2014.
Development of an HPV Educational Protocol for Adolescents
Wetzel, Caitlin; Tissot, Abbigail; Kollar, Linda M.; Hillard, Paula A.; Stone, Rachel; Kahn, Jessica A.
2007-01-01
Study Objectives: To develop an educational protocol about HPV and Pap tests for adolescents, to evaluate the protocol for understandability and clarity, and to evaluate the protocol for its effectiveness in increasing knowledge about HPV. Design: In phase 1, investigators and adolescents developed the protocol. In phase 2, adolescents evaluated the protocol qualitatively, investigators evaluated its effectiveness in increasing HPV knowledge in a sample of adolescents, and the protocol was revised. In phase 3, investigators evaluated the effectiveness of the revised protocol in an additional adolescent sample. Setting: Urban, hospital-based teen health center. Participants: A total of 252 adolescent girls and boys in the three study phases. Main Outcome Measures: Pre- and post-protocol knowledge about HPV, measured using a 10- or 11-item scale. Results: Scores on the HPV knowledge scale increased significantly (p<.0001) among adolescents who participated in phases 2 and 3 after they received the protocol. Initial differences in scores based on race, insurance type and condom use were not noted post-protocol. Conclusion: The protocol significantly increased knowledge scores about HPV in this population, regardless of sociodemographic characteristics and risk behaviors. Effective, developmentally appropriate educational protocols about HPV and Pap tests are particularly important in clinical settings as cervical cancer screening guidelines evolve, HPV DNA testing is integrated into screening protocols, and HPV vaccines become available. In-depth, one-on-one education about HPV may also prevent adverse psychosocial responses and promote healthy sexual and Pap screening behaviors in adolescents with abnormal HPV or Pap test results. Synopsis: The investigators developed an educational protocol about HPV and Pap tests and evaluated its effectiveness in increasing knowledge about HPV among adolescents. PMID:17868894
Rezk, Amgad R; Ramesan, Shwathy; Yeo, Leslie Y
2018-01-30
The microarray titre plate remains a fundamental workhorse in genomic, proteomic and cellomic analyses that underpin the drug discovery process. Nevertheless, liquid handling technologies for sample dispensing, processing and transfer have not progressed significantly beyond conventional robotic micropipetting techniques, which are not only at their fundamental sample size limit, but are also prone to mechanical failure and contamination. This is because alternative technologies to date suffer from a number of constraints, mainly their limitation to carry out only a single liquid operation such as dispensing or mixing at a given time, and their inability to address individual wells, particularly at high throughput. Here, we demonstrate the possibility for true sequential or simultaneous single- and multi-well addressability in a 96-well plate using a reconfigurable modular platform from which MHz-order hybrid surface and bulk acoustic waves can be coupled to drive a variety of microfluidic modes including mixing, sample preconcentration and droplet jetting/ejection in individual or multiple wells on demand, thus constituting a highly versatile yet simple setup capable of improving the functionality of existing laboratory protocols and processes.
Tran, Duc T; Banerjee, Sambuddha; Alayash, Abdu I; Crumbliss, Alvin L; Fitzgerald, Michael C
2012-02-07
Described here is a mass spectrometry-based protocol to study the thermodynamic stability of proteins and protein-ligand complexes using the chemical denaturant dependence of the slow H/D exchange reaction of the imidazole C(2) proton in histidine side chains. The protocol is developed using several model protein systems including: ribonuclease (Rnase) A, myoglobin, bovine carbonic anhydrase (BCA) II, hemoglobin (Hb), and the hemoglobin-haptoglobin (Hb-Hp) protein complex. Folding free energies consistent with those previously determined by other more conventional techniques were obtained for the two-state folding proteins, Rnase A and myoglobin. The protocol successfully detected a previously observed partially unfolded intermediate stabilized in the BCA II folding/unfolding reaction, and it could be used to generate a K(d) value of 0.24 nM for the Hb-Hp complex. The compatibility of the protocol with conventional mass spectrometry-based proteomic sample preparation and analysis methods was also demonstrated in an experiment in which the protocol was used to detect the binding of zinc to superoxide dismutase in the yeast cell lysate sample. The yeast cell sample analyses also helped define the scope of the technique, which requires the presence of globally protected histidine residues in a protein's three-dimensional structure for successful application. © 2011 American Chemical Society
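The thermodynamic analysis behind such denaturant-dependence measurements can be illustrated with a linear-extrapolation fit. The numbers below are synthetic, not data from the paper: for a two-state protein the folding free energy varies linearly with denaturant concentration, dG([D]) = dG0 - m*[D], so the zero-denaturant stability dG0 and the midpoint Cm = dG0/m follow from a straight-line fit.

```python
import numpy as np

# Linear extrapolation of two-state folding free energy (synthetic data).
denaturant = np.array([1.0, 2.0, 3.0, 4.0])   # [GdmCl] in M (synthetic)
dG = np.array([4.0, 2.5, 1.0, -0.5])          # kcal/mol (synthetic)

m_neg, dG0 = np.polyfit(denaturant, dG, 1)    # slope = -m, intercept = dG0
Cm = dG0 / -m_neg                             # denaturation midpoint, in M

assert abs(dG0 - 5.5) < 1e-9 and abs(m_neg + 1.5) < 1e-9
```

In the paper's protocol the per-concentration dG values would come from the extent of slow histidine C(2) H/D exchange rather than from optical unfolding curves, but the extrapolation step is the same.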
NASA Astrophysics Data System (ADS)
Migliozzi, D.; Nguyen, H. T.; Gijs, M. A. M.
2018-02-01
Immunohistochemistry (IHC) is one of the main techniques currently used in the clinic for biomarker characterization. It consists of colorimetric labeling with specific antibodies followed by microscopy analysis. The results are then used for diagnosis and therapeutic targeting. Well-known drawbacks of such protocols are their limited accuracy and precision, which prevent clinicians from obtaining quantitative and robust IHC results. In our work, we combined rapid microfluidic immunofluorescent staining with efficient image-based cell segmentation and signal quantification to increase the robustness of both the experimental and analytical protocols. The experimental protocol is very simple and based on fast fluidic exchange in a microfluidic chamber created on top of the formalin-fixed paraffin-embedded (FFPE) slide by clamping a silicon chip onto it with a polydimethylsiloxane (PDMS) sealing ring. The image-processing protocol is based on enhancement and subsequent thresholding of the local contrast of the obtained fluorescence image. As a case study, given that the human epidermal growth factor receptor 2 (HER2) protein is often used as a biomarker for breast cancer, we applied our method to HER2+ and HER2- cell lines. We report very fast (5 minutes) immunofluorescence staining of both HER2 and cytokeratin (a marker used to define the tumor region) on FFPE slides. The image-processing program can segment cells correctly and give a cell-based quantitative immunofluorescent signal. With this method, we found a reproducible, well-defined separation in the HER2-to-cytokeratin ratio between positive and negative control samples.
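The thresholding step of such an image-processing pipeline can be sketched compactly. The image below is synthetic and plain Otsu thresholding stands in for the paper's local-contrast enhancement, so everything here is an assumption for illustration: threshold the fluorescence image, and the resulting mask supports per-cell signal quantification.

```python
import numpy as np

# Otsu-style global threshold on a synthetic fluorescence image.
rng = np.random.default_rng(0)
img = rng.normal(10.0, 2.0, (64, 64))     # background fluorescence
img[20:40, 20:40] += 30.0                 # one bright stained "cell"

def otsu_threshold(im, nbins=64):
    hist, edges = np.histogram(im, bins=nbins)
    mids = 0.5 * (edges[:-1] + edges[1:])
    p = hist / hist.sum()
    best_t, best_var = mids[0], -1.0
    for i in range(1, nbins):
        w0, w1 = p[:i].sum(), p[i:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue
        m0 = (p[:i] * mids[:i]).sum() / w0
        m1 = (p[i:] * mids[i:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2    # between-class variance
        if var > best_var:
            best_var, best_t = var, mids[i]
    return best_t

mask = img > otsu_threshold(img)
assert mask[20:40, 20:40].mean() > 0.9    # cell region segmented
assert mask[:10, :10].mean() < 0.1        # background rejected
```

A per-cell HER2-to-cytokeratin ratio would then be the ratio of masked mean intensities in the two fluorescence channels.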
Protocols for second-generation business satellites systems
NASA Astrophysics Data System (ADS)
Evans, B. G.; Coakley, F. P.; El Amin, M. H. M.
The paper discusses the nature and mix of traffic in business satellite systems and describes the limitations on the protocol imposed by the differing impairments of speech, video, and data. A simple TDMA system protocol is presented which meets the requirements of mixed-service operation. The efficiency of the protocol together with implications for allocation, scheduling and synchronisation are discussed. Future-generation satellites will probably use on-board processing. Some initial work on protocols that make use of on-board processing and the implications for satellite and earth-station equipment are presented.
Development of bull trout sampling protocols
R. F. Thurow; J. T. Peterson; J. W. Guzevich
2001-01-01
This report describes results of research conducted in Washington in 2000 through Interagency Agreement #134100H002 between the U.S. Fish and Wildlife Service (USFWS) and the U.S. Forest Service Rocky Mountain Research Station (RMRS). The purpose of this agreement is to develop a bull trout (Salvelinus confluentus) sampling protocol by integrating...
Soil Geochemical Data for the Wyoming Landscape Conservation Initiative Study Area
Smith, David B.; Ellefsen, Karl J.
2010-01-01
In 2008, soil samples were collected at 139 sites throughout the Wyoming Landscape Conservation Initiative study area in southwest Wyoming. These samples, representing a density of 1 site per 440 square kilometers, were collected from a depth of 0-5 cm and analyzed for a suite of more than 40 major and trace elements following a near-total multi-acid extraction. In addition, soil pH, electrical conductivity, total nitrogen, total and organic carbon, and sodium adsorption ratio were determined. The resulting data set provides a baseline for detecting changes in soil composition that might result from natural processes or anthropogenic activities. This report describes the sampling and analytical protocols used, and makes available all the soil geochemical data generated in the study.
Chapter A6. Section 6.6. Alkalinity and Acid Neutralizing Capacity
Rounds, Stewart A.; Wilde, Franceska D.
2002-01-01
Alkalinity (determined on a filtered sample) and Acid Neutralizing Capacity (ANC) (determined on a whole-water sample) are measures of the ability of a water sample to neutralize strong acid. Alkalinity and ANC provide information on the suitability of water for uses such as irrigation, determining the efficiency of wastewater processes, determining the presence of contamination by anthropogenic wastes, and maintaining ecosystem health. In addition, alkalinity is used to gain insights on the chemical evolution of an aqueous system. This section of the National Field Manual (NFM) describes the USGS field protocols for alkalinity/ANC determination using either the inflection-point or Gran function plot methods, including calculation of carbonate species, and provides guidance on equipment selection.
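The Gran-function method mentioned above can be sketched with a synthetic titration; the sample volume and acid normality are assumed values. Past the equivalence point the Gran function F = (V0 + V) * 10**(-pH) is linear in titrant volume V, its x-intercept is the equivalence volume Ve, and alkalinity follows as Ca * Ve / V0.

```python
import numpy as np

# Gran-function determination of alkalinity (synthetic titration data).
V0, Ca, Ve_true = 100.0, 0.1, 2.0        # mL sample, acid normality, mL

V = np.linspace(2.2, 3.0, 9)             # titrant volumes past equivalence
H = Ca * (V - Ve_true) / (V0 + V)        # excess strong-acid [H+]
pH = -np.log10(H)                        # what the meter would read

F = (V0 + V) * 10.0 ** (-pH)             # Gran function from measured pH
slope, intercept = np.polyfit(V, F, 1)
Ve = -intercept / slope                  # x-intercept = equivalence volume
alkalinity_meq_L = Ca * Ve / V0 * 1000.0

assert abs(Ve - Ve_true) < 1e-9
assert abs(alkalinity_meq_L - 2.0) < 1e-6
```

The inflection-point method instead locates the maximum of dpH/dV; the Gran fit is preferred for low-alkalinity waters where the inflection is poorly defined.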
Automated monitoring of medical protocols: a secure and distributed architecture.
Alsinet, T; Ansótegui, C; Béjar, R; Fernández, C; Manyà, F
2003-03-01
Ensuring the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, and so specialized domain agents are independent of negotiation processes and autonomous system agents perform monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
Fabrication of Ultra-thin Color Films with Highly Absorbing Media Using Oblique Angle Deposition.
Yoo, Young Jin; Lee, Gil Ju; Jang, Kyung-In; Song, Young Min
2017-08-29
Ultra-thin film structures have been studied extensively for use as optical coatings, but performance and fabrication challenges remain. We present an advanced method for fabricating ultra-thin color films with improved characteristics. The proposed process addresses several fabrication issues, including large area processing. Specifically, the protocol describes a process for fabricating ultra-thin color films using an electron beam evaporator for oblique angle deposition of germanium (Ge) and gold (Au) on silicon (Si) substrates. Film porosity produced by the oblique angle deposition induces color changes in the ultra-thin film. The degree of color change depends on factors such as deposition angle and film thickness. Fabricated samples of the ultra-thin color films showed improved color tunability and color purity. In addition, the measured reflectance of the fabricated samples was converted into chromatic values and analyzed in terms of color. Our ultra-thin film fabricating method is expected to be used for various ultra-thin film applications such as flexible color electrodes, thin film solar cells, and optical filters. Also, the process developed here for analyzing the color of the fabricated samples is broadly useful for studying various color structures.
Assessment of levels of bacterial contamination of large wild game meat in Europe.
Membré, Jeanne-Marie; Laroche, Michel; Magras, Catherine
2011-08-01
The variations in prevalence and levels of pathogens and fecal contamination indicators in large wild game meat were studied to assess their potential impact on consumers. This analysis was based on hazard analysis, data generation and statistical analysis. A total of 2919 meat samples from three species (red deer, roe deer, wild boar) were collected at French game meat traders' facilities using two sampling protocols. Information was gathered on the types of meat cuts (forequarter or haunch; first sampling protocol) or type of retail-ready meat (stewing meat or roasting meat; second protocol), and also on the meat storage conditions (frozen or chilled), country of origin (eight countries) and shooting season (autumn, winter, spring). The samples in both protocols were analyzed for detection and enumeration of Escherichia coli, coagulase-positive staphylococci and Clostridium perfringens. In addition, detection and enumeration of thermotolerant coliforms and Listeria monocytogenes were performed for samples collected in the first and second protocols, respectively. The levels of bacterial contamination of the raw meat were determined by statistical analysis involving probabilistic techniques and Bayesian inference. Of the organisms examined, C. perfringens, one of the three indicators of microbial quality, hygiene and good handling, was found in the highest numbers, and L. monocytogenes in the lowest. Differences in contamination levels between game species and between meats distributed as chilled or frozen products were not significant. These results might be included in quantitative exposure assessments. Copyright © 2011 Elsevier Ltd. All rights reserved.
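The kind of Bayesian inference mentioned above can be illustrated for presence/absence data; the counts below are invented, not results from the study. With a uniform Beta(1, 1) prior, n samples of which k are positive give a Beta(k + 1, n - k + 1) posterior for the contamination prevalence.

```python
# Beta-Binomial posterior for contamination prevalence (assumed counts).
k, n = 120, 1000                       # positive detections, total samples
a, b = 1 + k, 1 + (n - k)              # posterior Beta(a, b) parameters

posterior_mean = a / (a + b)
posterior_var = a * b / ((a + b) ** 2 * (a + b + 1))

assert abs(posterior_mean - 121 / 1002) < 1e-12
```

Enumeration data (counts per gram) would instead use a Poisson or lognormal likelihood, but the prior-to-posterior update follows the same pattern.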
Hubbard, Laura E; Kolpin, Dana W; Fields, Chad L; Hladik, Michelle L; Iwanowicz, Luke R
2017-10-01
The highly pathogenic avian influenza (H5N2) outbreak in the Midwestern United States (US) in 2015 was historic due to the number of birds and poultry operations impacted and the corresponding economic loss to the poultry industry and was the largest animal health emergency in US history. The U.S. Geological Survey (USGS), with the assistance of several state and federal agencies, aided the response to the outbreak by developing a study to determine the extent of virus transport in the environment. The study goals were to: develop the appropriate sampling methods and protocols for measuring avian influenza virus (AIV) in groundwater, provide the first baseline data on AIV and outbreak- and poultry-related contaminant occurrence and movement into groundwater, and document climatological factors that may have affected both survival and transport of AIV to groundwater during the months of the 2015 outbreak. While site selection was expedient, there were often delays in sample response times due to both relationship building between agencies, groups, and producers and logistical time constraints. This study's design and sampling process highlights the unpredictable nature of disease outbreaks and the corresponding difficulty in environmental sampling of such events. The lessons learned, including field protocols and approaches, can be used to improve future research on AIV in the environment. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Vaishampayan, Parag; Osman, Shariff; Andersen, Gary; Venkateswaran, Kasthuri
2010-06-01
The bacterial diversity and comparative community structure of a clean room used for assembling the Phoenix spacecraft were characterized throughout the spacecraft assembly process by using 16S rRNA gene cloning/sequencing and DNA microarray (PhyloChip) technologies. Samples were collected from several locations of the clean room at three time points: before Phoenix's arrival (PHX-B), during hardware assembly (PHX-D), and after the spacecraft was removed for launch (PHX-A). The bacterial diversity of PHX-B, comprising all major bacterial phyla, was found to be statistically different from that of the PHX-D and PHX-A samples. Due to stringent cleaning and decontamination protocols during assembly, PHX-D bacterial diversity was dramatically reduced compared to the PHX-B and PHX-A samples. Comparative community analysis based on PhyloChip results revealed overall trends similar to those seen in the clone libraries, but the high-density phylogenetic microarray detected greater diversity in all sampling events. The decrease in community complexity in PHX-D compared to PHX-B, and the subsequent recurrence of these organisms in PHX-A, speaks to the effectiveness of NASA cleaning protocols. However, the persistence of a subset of bacterial signatures throughout all spacecraft assembly phases underscores the need for continued refinement of sterilization technologies and the implementation of safeguards that monitor and inventory microbial contaminants.
A Method for Identification and Analysis of Non-Overlapping Myeloid Immunophenotypes in Humans
Gustafson, Michael P.; Lin, Yi; Maas, Mary L.; Van Keulen, Virginia P.; Johnston, Patrick B.; Peikert, Tobias; Gastineau, Dennis A.; Dietz, Allan B.
2015-01-01
The development of flow cytometric biomarkers in human studies and clinical trials has been slowed by inconsistent sample processing, use of cell surface markers, and reporting of immunophenotypes. Additionally, the function(s) of distinct cell types as biomarkers cannot be accurately defined without the proper identification of homogeneous populations. As such, we developed a method for the identification and analysis of human leukocyte populations by the use of eight 10-color flow cytometric protocols in combination with novel software analyses. This method utilizes un-manipulated biological sample preparation that allows for the direct quantitation of leukocytes and non-overlapping immunophenotypes. We specifically designed myeloid protocols that enable us to define distinct phenotypes that include mature monocytes, granulocytes, circulating dendritic cells, immature myeloid cells, and myeloid derived suppressor cells (MDSCs). We also identified CD123 as an additional distinguishing marker for the phenotypic characterization of immature LIN-CD33+HLA-DR- MDSCs. Our approach permits the comprehensive analysis of all peripheral blood leukocytes and yields data that is highly amenable for standardization across inter-laboratory comparisons for human studies. PMID:25799053
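The non-overlapping gating described in this abstract can be sketched with boolean masks; the marker names and positivity calls below are simplified assumptions, not the paper's eight-protocol panel. Each immunophenotype is a boolean gate over per-cell marker positivity, and gates built on complementary conditions cannot double-count a cell.

```python
import numpy as np

# Mutually exclusive myeloid gates over synthetic per-cell marker calls.
rng = np.random.default_rng(1)
n = 1000
lin = rng.random(n) < 0.5       # LIN+ events
cd33 = rng.random(n) < 0.6      # CD33+ events
hladr = rng.random(n) < 0.4     # HLA-DR+ events

mdsc = ~lin & cd33 & ~hladr     # LIN- CD33+ HLA-DR- (MDSC-like gate)
mono = ~lin & cd33 & hladr      # LIN- CD33+ HLA-DR+ (monocyte-like gate)

assert not np.any(mdsc & mono)  # non-overlapping by construction
assert mdsc.sum() > 0 and mono.sum() > 0
```

Adding CD123, as the authors do for immature MDSCs, would simply extend each gate with one more boolean condition while preserving disjointness.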
Study of a scanning HIFU therapy protocol, Part II: Experiment and results
NASA Astrophysics Data System (ADS)
Andrew, Marilee A.; Kaczkowski, Peter; Cunitz, Bryan W.; Brayman, Andrew A.; Kargl, Steven G.
2003-04-01
Instrumentation and protocols for creating scanned HIFU lesions in freshly excised bovine liver were developed in order to study the in vitro HIFU dose response and validate models. Computer control of the HIFU transducer and 3-axis positioning system provided precise spatial placement of the thermal lesions. Scan speeds were selected in the range of 1 to 8 mm/s, and the applied electrical power was varied from 20 to 60 W. These parameters were chosen to hold the thermal dose constant. A total of six valid scans of 15 mm length were created in each sample; a 3.5 MHz single-element, spherically focused transducer was used. Treated samples were frozen, then sliced in 1.27 mm increments. Digital photographs of the slices were downloaded to a computer for image processing and analysis. Lesion characteristics, including the depth within the tissue, axial length, and radial width, were computed. Results were compared with those generated from modified KZK and BHTE models, and include a comparison of the statistical variation in the across-scan lesion radial width. [Work supported by USAMRMC.]
Crannell, Zachary A; Rohrman, Brittany; Richards-Kortum, Rebecca
2015-03-30
It was recently demonstrated that recombinase polymerase amplification (RPA), an isothermal amplification platform for pathogen detection, may be used to quantify DNA sample concentration using a standard curve. In this manuscript, a detailed protocol for developing and implementing a real-time quantitative recombinase polymerase amplification assay (qRPA assay) is provided. Using HIV-1 DNA quantification as an example, the assembly of real-time RPA reactions, the design of an internal positive control (IPC) sequence, and co-amplification of the IPC and target of interest are all described. Instructions and data processing scripts for the construction of a standard curve using data from multiple experiments are provided, which may be used to predict the concentration of unknown samples or assess the performance of the assay. Finally, an alternative method for collecting real-time fluorescence data with a microscope and a stage heater as a step towards developing a point-of-care qRPA assay is described. The protocol and scripts provided may be used for the development of a qRPA assay for any DNA target of interest.
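The standard-curve step of such a quantitative assay can be sketched briefly; the threshold-crossing times below are synthetic, not assay data. A real-time qRPA standard curve relates log10 starting copies to the time at which fluorescence crosses a threshold, and unknown samples are quantified by inverting the fitted line.

```python
import numpy as np

# Standard-curve fit and inversion for a real-time amplification assay.
log_copies = np.array([2.0, 3.0, 4.0, 5.0])     # standards, log10 copies
cross_time = np.array([14.0, 11.5, 9.0, 6.5])   # minutes to threshold (synthetic)

slope, intercept = np.polyfit(log_copies, cross_time, 1)

def quantify(t_cross):
    """Estimate log10 starting copies from a threshold-crossing time."""
    return (t_cross - intercept) / slope

assert abs(quantify(11.5) - 3.0) < 1e-9
```

The internal positive control described in the abstract guards this inversion: if the IPC fails to amplify, the crossing time cannot be trusted and the sample is not quantified.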
Sørbye, Sveinung Wergeland; Pedersen, Mette Kristin; Ekeberg, Bente; Williams, Merete E. Johansen; Sauer, Torill; Chen, Ying
2017-01-01
Background: The Norwegian Cervical Cancer Screening Program recommends screening every 3 years for women between 25 and 69 years of age. There is a large difference in the percentage of unsatisfactory samples between laboratories that use different brands of liquid-based cytology. We wished to examine if inadequate ThinPrep samples could be satisfactory by processing them with the SurePath protocol. Materials and Methods: A total of 187 inadequate ThinPrep specimens from the Department of Clinical Pathology at University Hospital of North Norway were sent to Akershus University Hospital for conversion to SurePath medium. Ninety-one (48.7%) were processed through the automated “gynecologic” application for cervix cytology samples, and 96 (51.3%) were processed with the “nongynecological” automatic program. Results: Out of 187 samples that had been unsatisfactory by ThinPrep, 93 (49.7%) were satisfactory after being converted to SurePath. The rate of satisfactory cytology was 36.6% and 62.5% for samples run through the “gynecology” program and “nongynecology” program, respectively. Of the 93 samples that became satisfactory after conversion from ThinPrep to SurePath, 80 (86.0%) were screened as normal while 13 samples (14.0%) were given an abnormal diagnosis, which included 5 atypical squamous cells of undetermined significance, 5 low-grade squamous intraepithelial lesion, 2 atypical glandular cells not otherwise specified, and 1 atypical squamous cells cannot exclude high-grade squamous intraepithelial lesion. A total of 2.1% (4/187) of the women got a diagnosis of cervical intraepithelial neoplasia 2 or higher at a later follow-up. Conclusions: Converting cytology samples from ThinPrep to SurePath processing can reduce the number of unsatisfactory samples. The samples should be run through the “nongynecology” program to ensure an adequate number of cells. PMID:28900466
PNNI Performance Validation Test Report
NASA Technical Reports Server (NTRS)
Dimond, Robert P.
1999-01-01
Two Private Network-Network Interface (PNNI) neighboring peers were monitored with a protocol analyzer to understand and document how PNNI works with regard to initialization and recovery processes. With the processes documented, pertinent events were identified and measured to determine the protocol's behavior in several environments involving congestion and/or delay. Subsequent testing of the protocol in these environments was conducted to determine the protocol's suitability for use in satellite-terrestrial network architectures.
A Novel Re-keying Function Protocol (NRFP) For Wireless Sensor Network Security
Abdullah, Maan Younis; Hua, Gui Wei; Alsharabi, Naif
2008-01-01
This paper describes a novel re-keying function protocol (NRFP) for wireless sensor network security. A re-keying process management system for sensor networks is designed to support in-network processing. The design of the protocol is motivated by decentralized key management for wireless sensor networks (WSNs), covering key deployment, key refreshment, and key establishment. NRFP supports the establishment of novel administrative functions for sensor nodes that derive/re-derive a session key for each communication session. The protocol proposes direct connection, indirect connection, and hybrid connection. NRFP also includes an efficient protocol for local broadcast authentication based on the use of one-way key chains. A salient feature of the authentication protocol is that it supports source authentication without precluding in-network processing. Security and performance analysis shows that NRFP is very efficient in computation, communication, and storage, and that it is also effective in defending against many sophisticated attacks. PMID:27873963
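The one-way key chains used for broadcast authentication can be illustrated with a short sketch. This is a generic hash-chain construction, not NRFP's exact scheme; the function names are illustrative:

```python
import hashlib

def make_key_chain(seed: bytes, length: int) -> list:
    """Build a one-way key chain by repeated hashing. Keys are disclosed
    in reverse order of generation, so a forger cannot derive later keys."""
    chain = [seed]
    for _ in range(length - 1):
        chain.append(hashlib.sha256(chain[-1]).digest())
    chain.reverse()  # chain[0] is the commitment, distributed first
    return chain

def verify(disclosed: bytes, commitment: bytes, max_steps: int) -> bool:
    """A receiver holding an earlier commitment checks a newly disclosed
    key by hashing it forward until the commitment is reproduced."""
    k = disclosed
    for _ in range(max_steps):
        if k == commitment:
            return True
        k = hashlib.sha256(k).digest()
    return k == commitment

chain = make_key_chain(b"secret-seed", 5)
assert verify(chain[3], chain[0], 5)  # a later key authenticates against the commitment
```

The security argument is the usual one for one-way chains: hashing forward is cheap, but inverting SHA-256 to anticipate an undisclosed key is infeasible.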
Emwas, Abdul-Hamid; Luchinat, Claudio; Turano, Paola; Tenori, Leonardo; Roy, Raja; Salek, Reza M; Ryan, Danielle; Merzaban, Jasmeen S; Kaddurah-Daouk, Rima; Zeri, Ana Carolina; Nagana Gowda, G A; Raftery, Daniel; Wang, Yulan; Brennan, Lorraine; Wishart, David S
The metabolic composition of human biofluids can provide important diagnostic and prognostic information. Among the biofluids most commonly analyzed in metabolomic studies, urine appears to be particularly useful. It is abundant, readily available, easily stored and can be collected by simple, noninvasive techniques. Moreover, given its chemical complexity, urine is particularly rich in potential disease biomarkers. This makes it an ideal biofluid for detecting or monitoring disease processes. Among the metabolomic tools available for urine analysis, NMR spectroscopy has proven to be particularly well-suited, because the technique is highly reproducible and requires minimal sample handling. As it permits the identification and quantification of a wide range of compounds, independent of their chemical properties, NMR spectroscopy has been frequently used to detect or discover disease fingerprints and biomarkers in urine. Although protocols for NMR data acquisition and processing have been standardized, no consensus on protocols for urine sample selection, collection, storage and preparation in NMR-based metabolomic studies has been developed. This lack of consensus may be leading to spurious biomarkers being reported and may account for a general lack of reproducibility between laboratories. Here, we review a large number of published studies on NMR-based urine metabolic profiling with the aim of identifying key variables that may affect the results of metabolomics studies. From this survey, we identify a number of issues that require either standardization or careful accounting in experimental design and provide some recommendations for urine collection, sample preparation and data acquisition.
Purifying Nucleic Acids from Samples of Extremely Low Biomass
NASA Technical Reports Server (NTRS)
La Duc, Myron; Osman, Shariff; Venkateswaran, Kasthuri
2008-01-01
A new method is able to circumvent the bias to which one commercial DNA extraction method falls prey with regard to the lysing of certain types of microbial cells, resulting in a truncated spectrum of microbial diversity. By prefacing the protocol with glass-bead-beating agitation (mechanically lysing a much more encompassing array of cell types and spores), the resulting microbial diversity detection is greatly enhanced. In preliminary studies, a commercially available automated DNA extraction method was effective at delivering total DNA yield, but only the non-hardy members of the bacterial community were represented in clone libraries, suggesting that this method was ineffective at lysing the hardier cell types. To circumvent this cell-lysis bias, yet another extraction method was devised. In this technique, samples are first subjected to a stringent bead-beating step, and then are processed via standard protocols. Prior to being loaded into extraction vials, samples are placed in micro-centrifuge bead tubes containing 50 µL of commercially produced lysis solution. After inverting several times, tubes are agitated at maximum speed for two minutes. Following agitation, tubes are centrifuged at 10,000 x g for one minute. At this time, the aqueous volumes are removed from the bead tubes and are loaded into extraction vials to be further processed via the extraction regime. The new method couples two independent methodologies in such a way as to yield the highest concentration of PCR-amplifiable DNA with consistent and reproducible results and with the most accurate and encompassing report of species richness.
Campoy, Irene; Lanau, Lucia; Altadill, Tatiana; Sequeiros, Tamara; Cabrera, Silvia; Cubo-Abert, Montserrat; Pérez-Benavente, Assumpción; Garcia, Angel; Borrós, Salvador; Santamaria, Anna; Ponce, Jordi; Matias-Guiu, Xavier; Reventós, Jaume; Gil-Moreno, Antonio; Rigau, Marina; Colas, Eva
2016-06-18
Uterine aspirates are used in the diagnostic process of endometrial disorders, yet further applications could emerge if their complex milieu were simplified. Exosome-like vesicles isolated from uterine aspirates could become an attractive source of biomarkers, but there is a need to standardize isolation protocols. The objective of the study was to determine whether exosome-like vesicles exist in the fluid fraction of uterine aspirates and to compare protocols for their isolation, characterization, and analysis. We collected uterine aspirates from 39 pre-menopausal women suffering from benign gynecological diseases. The fluid fraction of 27 of those aspirates was pooled and split into equal volumes to evaluate three differential centrifugation-based procedures: (1) a standard protocol, (2) a filtration protocol, and (3) a sucrose cushion protocol. Isolated vesicles were characterized by electron microscopy, nanoparticle tracking analysis, and immunoblot. Specifically for RNA material, we evaluated the effect of sonication and RNase A treatment at different steps of the protocol. We finally confirmed the efficiency of the selected methods in non-pooled samples. All protocols were useful to isolate exosome-like vesicles. However, the Standard procedure was the best-performing protocol for isolating exosome-like vesicles from uterine aspirates: nanoparticle tracking analysis revealed a higher concentration of vesicles with a mode of 135 ± 5 nm, and immunoblot showed a higher expression of exosome-related markers (CD9, CD63, and CD81), thus verifying an enrichment in this type of vesicles. RNA contained in exosome-like vesicles was successfully extracted without sonication treatment and with digestion of exogenous nucleic acids by RNase A, allowing the analysis of the specific inner cargo by Real-Time qPCR. We confirmed the existence of exosome-like vesicles in the fluid fraction of uterine aspirates.
They were successfully isolated by differential centrifugation, yielding sufficient proteomic and transcriptomic material for further analyses. The Standard protocol was the best-performing procedure, since the other two tested protocols improved neither the yield nor the purity of exosome-like vesicles. This study contributes to establishing the basis for future comparative studies to foster the field of biomarker research in gynecology.
Campbell, Stephen M; Kontopantelis, Evangelos; Hannon, Kerin; Burke, Martyn; Barber, Annette; Lester, Helen E
2011-08-10
Quality measures should be subjected to a testing protocol before being used in practice, assessing key attributes such as acceptability, feasibility and reliability, as well as identifying issues derived from actual implementation and unintended consequences. We describe the methodologies and results of an indicator testing protocol (ITP) using data from proposed quality indicators for the United Kingdom Quality and Outcomes Framework (QOF). The indicator testing protocol involved a multi-step methodological process: 1) the RAND/UCLA Appropriateness Method, to test clarity and necessity, 2) data extraction from patients' medical records, to test technical feasibility and reliability, 3) diaries, to test workload, 4) cost-effectiveness modelling, and 5) semi-structured interviews, to test acceptability, implementation issues and unintended consequences. Testing was conducted in a sample of representative family practices in England. These methods were combined into an overall recommendation for each tested indicator. Using an indicator testing protocol as part of piloting was seen as a valuable way of testing potential indicators in 'real world' settings. Pilot 1 (October 2009-March 2010) involved thirteen indicators across six clinical domains, and twelve indicators passed the indicator testing protocol. However, the indicator testing protocol identified a number of implementation issues and unintended consequences that can be rectified or removed prior to national rollout. A palliative care indicator is used as an exemplar of the value of piloting using a multiple-attribute indicator testing protocol: while technically feasible and reliable, it was unacceptable to practice staff and raised concerns about potentially causing actual patient harm.
This indicator testing protocol is one example of a protocol that may be useful in assessing potential quality indicators when adapted to specific country health care settings, and it may be of use to policy-makers and researchers worldwide to test the likely effect of implementing indicators prior to rollout. It builds on and codifies existing literature and other testing protocols to create a field-testing methodology that can be used to produce country-specific quality indicators for pay-for-performance or quality improvement schemes.
Rothrock, Michael J.; Hiett, Kelli L.; Gamble, John; Caudill, Andrew C.; Cicconi-Hogan, Kellie M.; Caporaso, J. Gregory
2014-01-01
The efficacy of DNA extraction protocols can be highly dependent upon both the type of sample being investigated and the types of downstream analyses performed. Considering that the use of new bacterial community analysis techniques (e.g., microbiomics, metagenomics) is becoming more prevalent in the agricultural and environmental sciences and many environmental samples within these disciplines can be physicochemically and microbiologically unique (e.g., fecal and litter/bedding samples from the poultry production spectrum), appropriate and effective DNA extraction methods need to be carefully chosen. Therefore, a novel semi-automated hybrid DNA extraction method was developed specifically for use with environmental poultry production samples. This method is a combination of the two major types of DNA extraction: mechanical and enzymatic. A two-step intense mechanical homogenization step (using bead-beating specifically formulated for environmental samples) was added to the beginning of the “gold standard” enzymatic DNA extraction method for fecal samples to enhance the removal of bacteria and DNA from the sample matrix and improve the recovery of Gram-positive bacterial community members. Once the enzymatic extraction portion of the hybrid method was initiated, the remaining purification process was automated using a robotic workstation to increase sample throughput and decrease sample processing error. In comparison to the strict mechanical and enzymatic DNA extraction methods, this novel hybrid method provided the best overall combined performance when considering quantitative (using 16S rRNA qPCR) and qualitative (using microbiomics) estimates of the total bacterial communities when processing poultry feces and litter samples. PMID:25548939
A Mobile Satellite Experiment (MSAT-X) network definition
NASA Technical Reports Server (NTRS)
Wang, Charles C.; Yan, Tsun-Yee
1990-01-01
The network architecture development of the Mobile Satellite Experiment (MSAT-X) project for the past few years is described, and the results and findings of the network research activities carried out under the MSAT-X project are summarized. A framework is presented upon which Mobile Satellite System (MSS) operators can design a commercial network. A sample network configuration and its capability are also included under the projected scenario. The communication interconnection aspect of the MSAT-X network is discussed. In the MSAT-X network structure, two basic protocols are presented: the channel access protocol and the link connection protocol. The error-control techniques used in the MSAT-X project and the packet structure are also discussed. A description of two testbeds, developed for experimentally simulating the channel access protocol and the link control protocol, respectively, is presented, along with some future network activities of the MSAT-X project.
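The abstract mentions error-control techniques without detailing them. As a generic illustration (not the MSAT-X scheme itself), the simplest ARQ error-control discipline, stop-and-wait over a lossy link, can be sketched as:

```python
import random

def stop_and_wait(frames, loss_rate=0.3, seed=7):
    """Illustrative stop-and-wait ARQ: each frame is retransmitted until
    acknowledged; a lossy link is simulated with a fixed-seed RNG."""
    rng = random.Random(seed)
    delivered, transmissions = [], 0
    for frame in frames:
        while True:
            transmissions += 1
            if rng.random() > loss_rate:  # frame (and its ACK) got through
                delivered.append(frame)
                break
    return delivered, transmissions

data = ["pkt0", "pkt1", "pkt2"]
delivered, tx = stop_and_wait(data)
assert delivered == data  # every frame is eventually delivered
assert tx >= len(data)    # losses only ever add retransmissions
```

On a high-delay satellite hop, stop-and-wait wastes most of the round trip waiting for each ACK, which is why windowed (go-back-N or selective-repeat) schemes are the usual choice in such architectures.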
Speech and language disorders in children from public schools in Belo Horizonte
Rabelo, Alessandra Terra Vasconcelos; Campos, Fernanda Rodrigues; Friche, Clarice Passos; da Silva, Bárbara Suelen Vasconcelos; Friche, Amélia Augusta de Lima; Alves, Claudia Regina Lindgren; Goulart, Lúcia Maria Horta de Figueiredo
2015-01-01
Objective: To investigate the prevalence of oral language, orofacial motor skill, and auditory processing disorders in children aged 4-10 years and to verify their association with age and gender. Methods: Cross-sectional study with a stratified, random sample consisting of 539 students. The evaluation consisted of three protocols: an orofacial motor skill protocol, adapted from the Myofunctional Evaluation Guidelines; the Child Language Test ABFW - Phonology; and a simplified auditory processing evaluation. Descriptive and associative statistical analyses were performed using Epi Info software, release 6.04. The chi-square test was applied to compare proportions of events, and analysis of variance was used to compare mean values. Significance was set at p≤0.05. Results: Of the studied subjects, 50.1% had at least one of the assessed disorders; of those, 33.6% had an oral language disorder, 17.1% had orofacial motor skill impairment, and 27.3% had an auditory processing disorder. There were significant associations between age and both auditory processing impairment and oral language impairment, suggesting a decrease in the number of disorders with increasing age. Similarly, the variable "one or more speech, language and hearing disorders" was also associated with age. Conclusions: The prevalence of speech, language and hearing disorders in children was high, indicating the need for research and public health efforts to cope with this problem. PMID:26300524
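The chi-square test used here for the associative analysis can be illustrated with a minimal hand-rolled version for a 2x2 table; the counts below are hypothetical, not taken from the study:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (list of two [a, b] rows), as used to compare proportions."""
    (a, b), (c, d) = table
    n = a + b + c + d
    expected = [[(a + b) * (a + c) / n, (a + b) * (b + d) / n],
                [(c + d) * (a + c) / n, (c + d) * (b + d) / n]]
    observed = [[a, b], [c, d]]
    return sum((observed[i][j] - expected[i][j]) ** 2 / expected[i][j]
               for i in range(2) for j in range(2))

# Hypothetical counts: disorder present/absent in younger vs. older children.
stat = chi_square_2x2([[60, 40], [40, 60]])
assert round(stat, 2) == 8.0  # matches n(ad-bc)^2 / ((a+b)(c+d)(a+c)(b+d))
```

The statistic is then compared against the chi-square distribution with 1 degree of freedom to obtain the p-value; in practice a library routine (e.g., `scipy.stats.chi2_contingency`) would be used instead.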
Mars Sample Quarantine Protocol Workshop
NASA Technical Reports Server (NTRS)
DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)
1999-01-01
The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.
Adaptive Peer Sampling with Newscast
NASA Astrophysics Data System (ADS)
Tölgyesi, Norbert; Jelasity, Márk
The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good-quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate, then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates, which can result in very poor random sampling, especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters: without involving failure detectors, nodes passively monitor local protocol events and use them as feedback in a local control loop for self-tuning the protocol parameters. The proposed solution is evaluated by simulation experiments.
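The descriptor-exchange mechanism at the heart of Newscast-style peer sampling can be sketched as follows. This is a simplified single exchange with illustrative names; the real protocol also handles descriptor aging, churn, and the adaptive parameter control the paper proposes:

```python
CACHE = 3  # cache size c (kept small for illustration)

def gossip(view_a, view_b, a, b, clock):
    """One Newscast-style exchange between nodes a and b: each contributes
    its view plus a fresh self-descriptor (peer, timestamp); both sides
    then keep only the c freshest descriptors, excluding themselves."""
    merged = {}
    for peer, ts in view_a + [(a, clock)] + view_b + [(b, clock)]:
        if peer not in merged or ts > merged[peer]:
            merged[peer] = ts  # keep the freshest timestamp per peer
    keep = sorted(merged.items(), key=lambda kv: -kv[1])[:CACHE + 1]
    new_a = [(p, t) for p, t in keep if p != a][:CACHE]
    new_b = [(p, t) for p, t in keep if p != b][:CACHE]
    return new_a, new_b

va, vb = [("n1", 0)], [("n2", 0)]
va, vb = gossip(va, vb, "a", "b", clock=1)
assert ("b", 1) in va and ("a", 1) in vb  # each node learns a fresh descriptor of the other
```

Sampling a peer then amounts to picking a random entry from the local cache; the paper's observation is that message drops and uneven request rates skew this sample, motivating the adaptive extensions.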
2006-08-01
…and the regulation of the timing of initial seedling growth. The evolution of flowering plants extended the potential for regulating growth and improved the efficiency of gamete transfer via pollination (Willis and …). [Figure 1: A one-gram plant sample of R. maritima seeds.] …uniformity of plant growth and development is contrary to the goals of ecological restoration, where the objective is the successful establishment of…
A process-based hierarchical framework for monitoring glaciated alpine headwaters
Weekes, Anne A.; Torgersen, Christian E.; Montgomery, David R.; Woodward, Andrea; Bolton, Susan M.
2012-01-01
Recent studies have demonstrated the geomorphic complexity and wide range of hydrologic regimes found in alpine headwater channels that provide complex habitats for aquatic taxa. These geohydrologic elements are fundamental to a better understanding of patterns in species assemblages and indicator taxa, and they are necessary for aquatic monitoring protocols that aim to track changes in physical conditions. Complex physical variables shape many biological and ecological traits, including life history strategies, but these mechanisms can only be understood if critical physical variables are adequately represented within the sampling framework. To better align sampling design protocols with current geohydrologic knowledge, we present a conceptual framework that incorporates regional-scale conditions, basin-scale longitudinal profiles, valley-scale glacial macroform structure, valley segment-scale units (i.e., colluvial, alluvial, and bedrock), and reach-scale channel types. At the valley segment and reach scales, these hierarchical levels are associated with differences in streamflow and sediment regime, water source contribution, and water temperature. Examples of linked physical-ecological hypotheses placed in a landscape context and a case study using the proposed framework are presented to demonstrate the usefulness of this approach for monitoring complex temporal and spatial patterns and processes in glaciated basins. This approach is meant to aid in comparisons between mountain regions on a global scale and to improve management of potentially endangered alpine species affected by climate change and other stressors.
Refinement of NMR structures using implicit solvent and advanced sampling techniques.
Chen, Jianhan; Im, Wonpil; Brooks, Charles L
2004-12-15
NMR biomolecular structure calculations exploit simulated annealing methods for conformational sampling and require a relatively high level of redundancy in the experimental restraints to determine quality three-dimensional structures. Recent advances in generalized Born (GB) implicit solvent models should make it possible to combine information from both experimental measurements and accurate empirical force fields to improve the quality of NMR-derived structures. In this paper, we study the influence of implicit solvent on the refinement of protein NMR structures and identify an optimal protocol of utilizing these improved force fields. To do so, we carry out structure refinement experiments for model proteins with published NMR structures using full NMR restraints and subsets of them. We also investigate the application of advanced sampling techniques to NMR structure refinement. Similar to the observations of Xia et al. (J.Biomol. NMR 2002, 22, 317-331), we find that the impact of implicit solvent is rather small when there is a sufficient number of experimental restraints (such as in the final stage of NMR structure determination), whether implicit solvent is used throughout the calculation or only in the final refinement step. The application of advanced sampling techniques also seems to have minimal impact in this case. However, when the experimental data are limited, we demonstrate that refinement with implicit solvent can substantially improve the quality of the structures. In particular, when combined with an advanced sampling technique, the replica exchange (REX) method, near-native structures can be rapidly moved toward the native basin. The REX method provides both enhanced sampling and automatic selection of the most native-like (lowest energy) structures. 
An optimal protocol based on our studies first generates an ensemble of initial structures that maximally satisfy the available experimental data with conventional NMR software using a simplified force field and then refines these structures with implicit solvent using the REX method. We systematically examine the reliability and efficacy of this protocol using four proteins of various sizes ranging from the 56-residue B1 domain of Streptococcal protein G to the 370-residue Maltose-binding protein. Significant improvement in the structures was observed in all cases when refinement was based on low-redundancy restraint data. The proposed protocol is anticipated to be particularly useful in early stages of NMR structure determination where a reliable estimate of the native fold from limited data can significantly expedite the overall process. This refinement procedure is also expected to be useful when redundant experimental data are not readily available, such as for large multidomain biomolecules and in solid-state NMR structure determination.
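The replica exchange (REX) idea invoked above can be illustrated on a toy energy function. This is a generic sketch, not the authors' NMR refinement protocol; the potential, temperatures, and step counts are made up:

```python
import math
import random

def energy(x):
    return (x * x - 1.0) ** 2  # double-well potential with minima at x = +/-1

def replica_exchange(temps, steps=2000, seed=1):
    """Toy replica-exchange Monte Carlo: each replica runs Metropolis moves
    at its own temperature; neighbouring replicas periodically attempt a
    configuration swap with the standard acceptance criterion."""
    rng = random.Random(seed)
    xs = [2.0] * len(temps)  # all replicas start far from the minima
    for step in range(steps):
        for i, T in enumerate(temps):  # local Metropolis moves
            trial = xs[i] + rng.uniform(-0.5, 0.5)
            dE = energy(trial) - energy(xs[i])
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                xs[i] = trial
        if step % 10 == 0:  # attempt swaps between temperature neighbours
            for i in range(len(temps) - 1):
                d = (1 / temps[i] - 1 / temps[i + 1]) * (energy(xs[i + 1]) - energy(xs[i]))
                if d <= 0 or rng.random() < math.exp(-d):
                    xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return xs

xs = replica_exchange([0.05, 0.2, 1.0])
assert energy(xs[0]) < 1.0  # the coldest replica has relaxed toward a minimum
```

The hot replicas cross barriers that would trap the cold one, and swaps carry those escapes down the temperature ladder, which is the same mechanism the refinement protocol exploits to move near-native structures toward the native basin.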
Russell, Rachel; Ormerod, Marcus; Newton, Rita
2018-01-01
Modifying the home environments of older people as they age in place is a well-established health and social care intervention. Using design and construction methods to redress any imbalance caused by the ageing process or disability within the home environment, occupational therapists are seen as the experts in this field of practice. However, the process used by occupational therapists when modifying home environments has been criticised for being disorganised and not founded on theoretical principles and concepts underpinning the profession. To address this issue, research was conducted to develop a design and construction process protocol specifically for home modifications. A three-stage approach was taken for the analysis of qualitative data generated from an online survey, completed by 135 occupational therapists in the UK. Using both the existing occupational therapy intervention process model and the design and construction process protocol as the theoretical frameworks, a 4-phase, 9-subphase design and construction process protocol for home modifications was developed. Overall, the study is innovative in developing the first process protocol for home modifications, potentially providing occupational therapists with a systematic and effective approach to the design and delivery of home modification services for older and disabled people. PMID:29682348
[Change of care model in natural childbirth: Implementation in La Ribera delivery room].
Camacho-Morell, F; Romero-Martín, M J
To assess knowledge of, wish for inclusion in, and implementation of normal childbirth care protocols at La Ribera University Hospital, the reasons why they are not applied, and attendance at antepartum training activities. Cross-sectional descriptive study. A total of 186 surveys were administered, by convenience sampling, to pregnant women attending fetal well-being monitoring at the hospital between 2014 and 2015. Data were collected on knowledge, wish for inclusion, compliance with protocols and reasons for non-compliance, and attendance at antepartum training activities. Percentages and confidence intervals were calculated, and the chi-square test was used to compare categorical variables. Knowledge was reported by 77% (95% CI: 75.5-78.5) and wish for inclusion by 84.6% (95% CI: 82.5-86.7). Protocol compliance ranged from 6% (nitrous oxide administration) to 91% (skin-to-skin contact). The main reasons for non-compliance were circumstances of the childbirth process (56.3%, 95% CI: 51.1-61.5). Attendance at maternal education classes was 62%, mainly primiparous women (p=0.0001) with a medium or high education level (p=0.001). Pregnant women have a high level of knowledge of, and wish for inclusion in, normal childbirth care protocols. Attendance at antepartum training activities could be improved; the main reason for non-attendance is lack of information. Compliance is good in most protocols; when they are not applied, it is due to childbirth circumstances. Remaining tasks include the introduction of additional protocols and involving pregnant women in decision-making. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.
An efficacious oral health care protocol for immunocompromised patients.
Solomon, C S; Shaikh, A B; Arendorf, T M
1995-01-01
A twice-weekly oral and perioral examination was provided to 120 patients receiving antineoplastic therapy. Sixty patients were monitored while following the traditional hospital oral care protocol (chlorhexidine, hydrogen peroxide, sodium bicarbonate, thymol glycol, benzocaine mouthrinse, and nystatin). The mouth care protocol was then changed (experimental protocol = chlorhexidine, benzocaine lozenges, amphotericin B lozenges), and patients were monitored until the sample size matched that of the hospital mouth care regime. There was a statistically significant reduction in oral complications upon introduction and maintenance of the experimental protocol.
PROTOCOL FOR EXAMINATION OF THE INNER CAN CLOSURE WELD REGION FOR 3013 DE CONTAINERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mickalonis, J.
2014-09-16
The protocol for the examination of the inner can closure weld region (ICCWR) for 3013 DE containers is presented within this report. The protocol includes sectioning of the inner can lid section, documenting the surface condition, measuring corrosion parameters, and storing of samples. This protocol may change as the investigation develops since findings may necessitate additional steps be taken. Details of the previous analyses, which formed the basis for this protocol, are also presented.
On fixed-area plot sampling for downed coarse woody debris
Jeffrey H. Gove; Paul C. Van Deusen
2011-01-01
The use of fixed-area plots for sampling downed coarse woody debris is reviewed. A set of clearly defined protocols for two previously described methods is established, and a new method, which we call the 'sausage' method, is developed. All methods (protocols) are shown to be unbiased for volume estimation, but not necessarily for estimation of population...
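The design-unbiasedness claim rests on the usual fixed-area-plot expansion estimator: each plot's tallied volume is scaled by the inverse of the plot area and the plot estimates are averaged. A minimal sketch with made-up numbers (not data from the paper):

```python
def volume_per_hectare(plot_volumes, plot_area_m2):
    """Design-unbiased fixed-area-plot estimator: expand each plot's
    tallied downed-wood volume (m^3) to a per-hectare value, then
    average across plots."""
    expansion = 10_000 / plot_area_m2  # plots per hectare
    estimates = [v * expansion for v in plot_volumes]
    return sum(estimates) / len(estimates)

# Three hypothetical 500 m^2 plots with tallied downed-wood volumes (m^3):
assert volume_per_hectare([0.8, 1.2, 1.0], 500.0) == 20.0  # m^3 per hectare
```

The unbiasedness follows from each piece of debris being included with probability proportional to plot area under random plot placement; the protocols reviewed in the paper differ in *which* pieces a plot is deemed to include (e.g., how boundary-straddling pieces are tallied), which is where bias can enter.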
The purpose of this protocol is to provide guidelines for the analysis of hair samples for total mercury by cold vapor atomic fluorescence (CVAFS) spectrometry. This protocol describes the methodology and all other analytical aspects involved in the analysis. Keywords: hair; s...
21 CFR 610.2 - Requests for samples and protocols; official release.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Biologics Evaluation and Research, a manufacturer shall not distribute a lot of a product until the lot is... Evaluation and Research, a manufacturer shall not distribute a lot of a biological product until the lot is... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Requests for samples and protocols; official...
21 CFR 610.2 - Requests for samples and protocols; official release.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Biologics Evaluation and Research, a manufacturer shall not distribute a lot of a product until the lot is... Evaluation and Research, a manufacturer shall not distribute a lot of a biological product until the lot is... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Requests for samples and protocols; official...
Justin D. Waskiewicz; Laura S. Kenefic; Nicole S. Rogers; Joshua J. Puhlick; John C. Brissette; Richard J. Dionne
2015-01-01
The U.S. Forest Service, Northern Research Station has been conducting research on the silviculture of northern conifers on the Penobscot Experimental Forest (PEF) in Maine since 1950. Formal study plans provide guidance and specifications for the experimental treatments, but documentation is also needed to ensure consistency in data collection and sampling protocols....
Twenty-first century brain banking. Processing brains for research: the Columbia University methods
del Amaya, Maria Pilar; Keller, Christian E.
2007-01-01
Carefully categorized postmortem human brains are crucial for research. The lack of generally accepted methods for processing human postmortem brains for research persists. Thus, brain banking is essential; however, it cannot be achieved at the cost of the teaching mission of the academic institution by routing brains away from residency programs, particularly when the autopsy rate is steadily decreasing. A consensus must be reached whereby a brain can be utilizable for diagnosis, research, and teaching. The best diagnostic categorization possible must be secured and the yield of samples for basic investigation maximized. This report focuses on integrated, novel methods currently applied at the New York Brain Bank, Columbia University, New York, which are designed to reach accurate neuropathological diagnosis, optimize the yield of samples, and process fresh-frozen samples suitable for a wide range of modern investigations. The brains donated for research are processed as soon as possible after death. The prosector must have a good command of the neuroanatomy, neuropathology, and the protocol. One half of each brain is immersed in formalin for performing the thorough neuropathologic evaluation, which is combined with the teaching task. The contralateral half is extensively dissected at the fresh state. The anatomical origin of each sample is recorded using the map of Brodmann for the cortical samples. The samples are frozen at −160°C, barcode labeled, and ready for immediate disbursement once categorized diagnostically. A rigorous organization of freezer space, coupled to an electronic tracking system with its attached software, fosters efficient access for retrieval within minutes of any specific frozen samples in storage. This report describes how this achievement is feasible with emphasis on the actual processing of brains donated for research. PMID:17985145
Staples, Emily; Ingram, Richard James Michael; Atherton, John Christopher; Robinson, Karen
2013-01-01
Sensitive measurement of multiple cytokine profiles from small mucosal tissue biopsies, for example human gastric biopsies obtained through an endoscope, is technically challenging. Multiplex methods such as Luminex assays offer an attractive solution but standard protocols are not available for tissue samples. We assessed the utility of three commercial Luminex kits (VersaMAP, Bio-Plex and MILLIPLEX) to measure interleukin-17A (IL-17) and interferon-gamma (IFNγ) concentrations in human gastric biopsies and we optimised preparation of mucosal samples for this application. First, we assessed the technical performance, limits of sensitivity and linear dynamic ranges for each kit. Next we spiked human gastric biopsies with recombinant IL-17 and IFNγ at a range of concentrations (1.5 to 1000 pg/mL) and assessed kit accuracy for spiked cytokine recovery and intra-assay precision. We also evaluated the impact of different tissue processing methods and extraction buffers on our results. Finally we assessed recovery of endogenous cytokines in unspiked samples. In terms of sensitivity, all of the kits performed well within the manufacturers' recommended standard curve ranges but the MILLIPLEX kit provided most consistent sensitivity for low cytokine concentrations. In the spiking experiments, the MILLIPLEX kit performed most consistently over the widest range of concentrations. For tissue processing, manual disruption provided significantly improved cytokine recovery over automated methods. Our selected kit and optimised protocol were further validated by measurement of relative cytokine levels in inflamed and uninflamed gastric mucosa using Luminex and real-time polymerase chain reaction. In summary, with proper optimisation Luminex kits (and for IL-17 and IFNγ the MILLIPLEX kit in particular) can be used for the sensitive detection of cytokines in mucosal biopsies. 
Our results should help other researchers seeking to quantify multiple low concentration cytokines in small tissue samples. PMID:23644159
Laforest, Brandon J; Winegardner, Amanda K; Zaheer, Omar A; Jeffery, Nicholas W; Boyle, Elizabeth E; Adamowicz, Sarah J
2013-04-04
Biodiversity surveys have long depended on traditional methods of taxonomy to inform sampling protocols and to determine when a representative sample of a given species pool of interest has been obtained. Questions remain as to how to design appropriate sampling efforts to accurately estimate total biodiversity. Here we consider the biodiversity of freshwater ostracods (crustacean class Ostracoda) from the region of Churchill, Manitoba, Canada. Through an analysis of observed species richness and complementarity, accumulation curves, and richness estimators, we conduct an a posteriori analysis of five bioblitz-style collection strategies that differed in terms of total duration, number of sites, protocol flexibility to heterogeneous habitats, sorting of specimens for analysis, and primary purpose of collection. We used DNA barcoding to group specimens into molecular operational taxonomic units for comparison. Forty-eight provisional species were identified through genetic divergences, up from the 30 species previously known and documented in literature from the Churchill region. We found differential sampling efficiency among the five strategies, with liberal sorting of specimens for molecular analysis, protocol flexibility (and particularly a focus on covering diverse microhabitats), and a taxon-specific focus to collection having strong influences on garnering more accurate species richness estimates. Our findings have implications for the successful design of future biodiversity surveys and citizen-science collection projects, which are becoming increasingly popular and have been shown to produce reliable results for a variety of taxa despite relying on largely untrained collectors. 
We propose that efficiency of biodiversity surveys can be increased by non-experts deliberately selecting diverse microhabitats; by conducting two rounds of molecular analysis, with the numbers of samples processed during round two informed by the singleton prevalence during round one; and by having sub-teams (even if all non-experts) focus on select taxa. Our study also provides new insights into subarctic diversity of freshwater Ostracoda and contributes to the broader "Barcoding Biotas" campaign at Churchill. Finally, we comment on the associated implications and future research directions for community ecology analyses and biodiversity surveys through DNA barcoding, which we show here to be an efficient technique enabling rapid biodiversity quantification in understudied taxa.
NEON Data Products: Supporting the Validation of GCOS Essential Climate Variables
NASA Astrophysics Data System (ADS)
Petroy, S. B.; Fox, A. M.; Metzger, S.; Thorpe, A.; Meier, C. L.
2014-12-01
The National Ecological Observatory Network (NEON) is a continental-scale ecological observation platform designed to collect and disseminate data that contribute to understanding and forecasting the impacts of climate change, land use change, and invasive species on ecology. NEON will collect in-situ and airborne data over 60 sites across the US, including Alaska, Hawaii, and Puerto Rico. The NEON Biomass, Productivity, and Biogeochemistry protocols currently direct the collection of samples from distributed, gradient, and tower plots at each site, with sampling occurring either multiple times during the growing season, annually, or on three- or five-year centers (e.g. for coarse woody debris). These data are processed into a series of field-derived data products (e.g. Biogeochemistry, LAI, above-ground Biomass, etc.), and when combined with the NEON airborne hyperspectral and LiDAR imagery, are used to support validation efforts of algorithms for deriving vegetation characteristics from the airborne data. Sites are further characterized using airborne data combined with in-situ tower measurements to create additional data products of interest to the GCOS community, such as Albedo and fPAR. Presented here are a summary of tower/field/airborne sampling and observation protocols and examples of provisional datasets collected at NEON sites that may be used to support the ongoing validation of GCOS Essential Climate Variables.
RNA-seq mixology: designing realistic control experiments to compare protocols and analysis methods
Holik, Aliaksei Z.; Law, Charity W.; Liu, Ruijie; Wang, Zeya; Wang, Wenyi; Ahn, Jaeil; Asselin-Labat, Marie-Liesse; Smyth, Gordon K.
2017-01-01
Carefully designed control experiments provide a gold standard for benchmarking different genomics research tools. A shortcoming of many gene expression control studies is that replication involves profiling the same reference RNA sample multiple times. This leads to low, pure technical noise that is atypical of regular studies. To achieve a more realistic noise structure, we generated an RNA-sequencing mixture experiment using two cell lines of the same cancer type. Variability was added by extracting RNA from independent cell cultures and degrading particular samples. The systematic gene expression changes induced by this design allowed benchmarking of different library preparation kits (standard poly-A versus total RNA with Ribo-Zero depletion) and analysis pipelines. Data generated using the total RNA kit had more signal for introns and various RNA classes (ncRNA, snRNA, snoRNA) and less variability after degradation. For differential expression analysis, voom with quality weights marginally outperformed other popular methods, while for differential splicing, DEXSeq was simultaneously the most sensitive and the most inconsistent method. For sample deconvolution analysis, DeMix outperformed IsoPure convincingly. Our RNA-sequencing data set provides a valuable resource for benchmarking different protocols and data pre-processing workflows. The extra noise mimics routine lab experiments more closely, ensuring any conclusions are widely applicable. PMID:27899618
Freitas, R; Nero, L A; Carvalho, A F
2009-07-01
Enumeration of mesophilic aerobes (MA) is the main quality and hygiene parameter for raw and pasteurized milk. High levels of these microorganisms indicate poor conditions in production, storage, and processing of milk, and also the presence of pathogens. Fifteen raw and 15 pasteurized milk samples were submitted for MA enumeration by a conventional plating method (using plate count agar) and Petrifilm Aerobic Count plates (3M, St. Paul, MN), followed by incubation according to 3 official protocols: IDF/ISO (incubation at 30 degrees C for 72 h), American Public Health Association (32 degrees C for 48 h), and Brazilian Ministry of Agriculture (36 degrees C for 48 h). The results were compared by linear regression and ANOVA. Considering the results from conventional methodology, good correlation indices and absence of significant differences between mean counts were observed, independent of type of milk sample (raw or pasteurized) and incubation conditions (IDF/ISO, American Public Health Association, or Ministry of Agriculture). Considering the results from Petrifilm Aerobic Count plates, good correlation indices and absence of significant differences were only observed for raw milk samples. The microbiota of pasteurized milk interfered negatively with the performance of Petrifilm Aerobic Count plates, probably because of the presence of microorganisms that poorly reduce the dye indicator of this system.
Xiao, Yongli; Sheng, Zong-Mei; Taubenberger, Jeffery K.
2015-01-01
The vast majority of surgical biopsy and post-mortem tissue samples are formalin-fixed and paraffin-embedded (FFPE), but this process leads to RNA degradation that limits gene expression analysis. As an example, the viral RNA genome of the 1918 pandemic influenza A virus was previously determined in a 9-year effort by overlapping RT-PCR from post-mortem samples. Using the protocols described here, the full genome of the 1918 virus at high coverage was determined in one high-throughput sequencing run of a cDNA library derived from total RNA of a 1918 FFPE sample after duplex-specific nuclease treatments. This basic methodological approach should assist in the analysis of FFPE tissue samples isolated over the past century from a variety of infectious diseases. PMID:26344216
Calvano, Cosima Damiana; van der Werf, Inez Dorothé; Palmisano, Francesco; Sabbatini, Luigia
2015-01-01
Direct on-target plate processing of small (ca. 100 μg) fragments of paint samples for MALDI-MS identification of lipid- and protein-based binders is described. Fragments were fixed on a conventional stainless steel target plate by colloidal graphite followed by in situ fast tryptic digestion and matrix addition. The new protocol was first developed on paint replicas composed of chicken egg, collagen, and cow milk mixed with inorganic pigments and then successfully applied on historical paint samples taken from a fifteenth century Italian panel painting. The present work contributes a step forward in the simplification of binder identification in very small paint samples since no conventional solvent extraction is required, speeding up the whole sample preparation to 10 min and reducing lipid/protein loss.
Evaluation of whole genome amplified DNA to decrease material expenditure and increase quality.
Bækvad-Hansen, Marie; Bybjerg-Grauholm, Jonas; Poulsen, Jesper B; Hansen, Christine S; Hougaard, David M; Hollegaard, Mads V
2017-06-01
The overall aim of this study is to evaluate whole genome amplification of DNA extracted from dried blood spot samples. We wish to explore ways of optimizing the amplification process while decreasing the amount of input material and hence the cost. Our primary focus of optimization is on the amount of input material, the amplification reaction volume, the number of replicates, and amplification time and temperature. Increasing the quality of the amplified DNA and the subsequent results of array genotyping is a secondary aim of this project. This study is based on DNA extracted from dried blood spot samples. The extracted DNA was subsequently whole genome amplified using the REPLI-g kit and genotyped on the PsychArray BeadChip (assessing > 570,000 SNPs genome wide). We used Genome Studio to evaluate the quality of the genotype data by call rates and log R ratios. The whole genome amplification process is robust and does not vary between replicates. Altering amplification time, temperature, or number of replicates did not affect our results. We found that spot size, i.e., the amount of input material, could be reduced without compromising the quality of the array genotyping data. We also showed that whole genome amplification reaction volumes can be reduced by a factor of 4 without compromising the DNA quality. Whole genome amplified DNA samples from dried blood spots are well suited for array genotyping and produce robust and reliable genotype data. However, the amplification process introduces additional noise to the data, making detection of structural variants such as copy number variants difficult. With this study, we explore ways of optimizing the amplification protocol in order to reduce noise and increase data quality. We found that the amplification process was very robust and that changes in amplification time or temperature did not alter the genotyping calls or quality of the array data.
Adding additional replicates of each sample also led to insignificant changes in the array data. Thus, the amount of noise introduced by the amplification process was consistent regardless of changes made to the amplification protocol. We also explored ways of decreasing material expenditure by reducing the spot size or the amplification reaction volume. The reduction did not affect the quality of the genotyping data.
A Family of ACO Routing Protocols for Mobile Ad Hoc Networks
Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-hoon
2017-01-01
In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. Like its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements of both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are its disjoint-link and disjoint-node routes, the separation between the regular pheromone and the virtual pheromone in the diffusion process, and the exploration of routes taking into consideration the number of hops in the best routes. A family of ACO routing protocols based on AntOR is also specified, obtained by successive refinements of the protocol. We also present a parallelized version of AntOR that we call PAntOR. Using multiprocessor programming architectures based on shared memory, PAntOR allows tasks to run in parallel using threads. This parallelization is applicable in the route setup phase, the route local repair process, and link failure notification. In addition, a variant of PAntOR with more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads. PMID:28531159
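The pheromone-guided route choice described in the abstract can be illustrated with a minimal sketch. This is not the AntOR implementation; the pheromone-table shape, the `beta` exponent, and the hop-count reinforcement rule are illustrative assumptions about how ACO next-hop selection typically works.

```python
import random

def choose_next_hop(pheromone, beta=2.0, rng=random.random):
    """Probabilistically pick a next hop, weighting each neighbour by its
    pheromone value raised to beta (higher beta favours the strongest
    route more sharply)."""
    weights = {n: p ** beta for n, p in pheromone.items()}
    total = sum(weights.values())
    r = rng() * total
    acc = 0.0
    for node, w in weights.items():
        acc += w
        if acc >= r:
            return node
    return node  # numerical fallback

def reinforce(pheromone, node, hops, gamma=0.7):
    """Blend the old pheromone with a deposit that rewards shorter
    routes (fewer hops) more strongly."""
    pheromone[node] = gamma * pheromone[node] + (1 - gamma) * (1.0 / hops)
    return pheromone
```

With two neighbours holding pheromone 1.0 and 2.0 and beta=2, the second is chosen four times as often as the first, which is the exploitation/exploration trade-off ACO protocols tune.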
A high-throughput semi-automated preparation for filtered synaptoneurosomes.
Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A
2014-09-30
Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Vismarra, Alice; Barilli, Elena; Miceli, Maura; Mangia, Carlo; Bacci, Cristina; Brindani, Franco; Kramer, Laura
2017-01-24
Toxoplasmosis is a zoonotic disease caused by the protozoan Toxoplasma gondii. Ingestion of raw milk has been suggested as a risk for transmission to humans. Here the authors evaluated pre-treatment protocols for DNA extraction on T. gondii tachyzoite-spiked sheep milk with the aim of identifying the method that resulted in the most rapid and reliable polymerase chain reaction (PCR) positivity. This protocol was then used to analyse milk samples from sheep of three different farms in Southern Italy, including real time PCR for DNA quantification and PCR-restriction fragment length polymorphism for genotyping. The pre-treatment protocol using ethylenediaminetetraacetic acid and Tris-HCl to remove casein gave the best results in the least amount of time compared to the others on spiked milk samples. One sample of 21 collected from sheep farms was positive on one-step PCR, real time PCR and resulted in a Type I genotype at one locus (SAG3). Milk usually contains a low number of tachyzoites and this could be a limiting factor for molecular identification. Our preliminary data has evaluated a rapid, cost-effective and sensitive protocol to treat milk before DNA extraction. The results of the present study also confirm the possibility of T. gondii transmission through consumption of raw milk and its unpasteurised derivatives.
A simplified field protocol for genetic sampling of birds using buccal swabs
Vilstrup, Julia T.; Mullins, Thomas D.; Miller, Mark P.; McDearman, Will; Walters, Jeffrey R.; Haig, Susan M.
2018-01-01
DNA sampling is an essential prerequisite for conducting population genetic studies. For many years, blood sampling has been the preferred method for obtaining DNA in birds because of their nucleated red blood cells. Nonetheless, use of buccal swabs has been gaining favor because they are less invasive yet still yield adequate amounts of DNA for amplifying mitochondrial and nuclear markers; however, buccal swab protocols often include steps (e.g., extended air-drying and storage under frozen conditions) not easily adapted to field settings. Furthermore, commercial extraction kits and swabs for buccal sampling can be expensive for large population studies. We therefore developed an efficient, cost-effective, and field-friendly protocol for sampling wild birds after comparing DNA yield among 3 inexpensive buccal swab types (2 with foam tips and 1 with a cotton tip). Extraction and amplification success was high (100% and 97.2%, respectively) using inexpensive generic swabs. We found foam-tipped swabs provided higher DNA yields than cotton-tipped swabs. We further determined that omitting a drying step and storing swabs in Longmire buffer increased efficiency in the field while still yielding sufficient amounts of DNA for detailed population genetic studies using mitochondrial and nuclear markers. This new field protocol allows time- and cost-effective DNA sampling of juveniles or small-bodied birds for which drawing blood may cause excessive stress to birds and technicians alike.
NASA Astrophysics Data System (ADS)
Cheng, Yuan; Duan, Feng-kui; He, Ke-bin; Du, Zhen-yu; Zheng, Mei; Ma, Yong-liang
2012-12-01
Three temperature protocols with different peak inert-mode temperatures (Tpeak-inert) were compared based on source and ambient samples (both untreated and extracted using a mixture of hexane, methylene chloride, and acetone) collected in Beijing, China. The ratio of EC580 (elemental carbon measured by the protocol with a Tpeak-inert of 580 °C; similar hereinafter) to EC850 could be as high as 4.8 for biomass smoke samples, whereas the ratio was about 1.0 for diesel and gasoline exhaust samples. The EC580 to EC850 ratio averaged 1.95 ± 0.89 and 1.13 ± 0.20 for the untreated and extracted ambient samples, whereas the EC580 to EC650 ratio of ambient samples was 1.22 ± 0.10 and 1.20 ± 0.12 before and after extraction. It was suggested that there are two competing mechanisms for the effects of Tpeak-inert on the EC results: when Tpeak-inert is increased, one mechanism tends to decrease EC by increasing the amount of charring, whereas the other tends to increase EC by promoting more charring to evolve before native EC. Results from this study showed that EC does not always decrease when the peak inert-mode temperature is increased. Moreover, reducing the amount of charring could improve agreement among the protocols on EC measurements, whereas the temperature protocol would not influence the EC results if no charring is formed. This study also demonstrated the benefits of allowing the OC and EC split to occur in the inert mode when a high Tpeak-inert is used (e.g., 850 °C).
Han, Yongming; Chen, Antony; Cao, Junji; Fung, Kochy; Ho, Fai; Yan, Beizhan; Zhan, Changlin; Liu, Suixin; Wei, Chong; An, Zhisheng
2013-01-01
Quantifying elemental carbon (EC) content in geological samples is challenging due to interferences from crustal, salt, and organic material. Thermal/optical analysis, combined with acid pretreatment, represents a feasible approach. However, the consistency of various thermal/optical analysis protocols for this type of sample has never been examined. In this study, urban street dust and soil samples from Baoji, China, were pretreated with acids and analyzed with four thermal/optical protocols to investigate how analytical conditions and optical correction affect EC measurement. The EC values measured with reflectance correction (ECR) were consistently higher and less sensitive to the temperature program than the EC values measured with transmittance correction (ECT). A high-temperature method with extended heating times (STN120) showed the highest ECT/ECR ratio (0.86), while a low-temperature protocol (IMPROVE-550), with heating time adjusted for sample loading, showed the lowest (0.53). STN ECT was higher than IMPROVE ECT, in contrast to results from aerosol samples. A higher peak inert-mode temperature and extended heating times can elevate ECT/ECR ratios for pretreated geological samples by promoting pyrolyzed organic carbon (PyOC) removal over EC under trace levels of oxygen. Considering that PyOC within the filter increases ECR and decreases ECT relative to the actual EC levels, simultaneous ECR and ECT measurements would constrain the range of EC loading and provide information on method performance. Further testing with standard reference materials of common environmental matrices supports the findings. Char and soot fractions of EC can be further separated using the IMPROVE protocol. The char/soot ratio was lower in street dusts (2.2 on average) than in soils (5.2 on average), most likely reflecting motor vehicle emissions. The soot concentrations agreed with EC from CTO-375, a pure thermal method.
Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.
2009-01-01
In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality so that no more than 10 field samples are analyzed between consecutive QC samples. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%), likely because of residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds.
A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography. Only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add a great degree of complication to the sampling and preservation protocols and a significant increase to the cost for a continental-scale survey. Both these issues must be considered carefully prior to adopting these parameters as part of the soil geochemical survey of North America.
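The QC acceptance rule described above (percent recovery between 85% and 115%, with no more than 10 field samples between QC samples) can be sketched as a small check. The function names and batching layout are illustrative assumptions, not the USGS procedure.

```python
def recovery_ok(measured, certified, low=85.0, high=115.0):
    """Percent recovery of a QC standard against its certified value;
    accept results that fall within the 85-115% window."""
    pct = 100.0 * measured / certified
    return low <= pct <= high, round(pct, 1)

def qc_positions(n_field, interval=10):
    """Positions at which to insert a QC sample so that no more than
    `interval` field samples are analyzed between consecutive QC samples."""
    return list(range(interval, n_field + 1, interval))
```

For example, an element recovered at 77% of its certified value (as reported for Cr) fails the window, and a 45-sample batch would carry QC samples after positions 10, 20, 30, and 40.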
Onda, Yuichi; Kato, Hiroaki; Hoshi, Masaharu; Takahashi, Yoshio; Nguyen, Minh-Long
2015-01-01
The Fukushima Dai-ichi Nuclear Power Plant (FDNPP) accident resulted in extensive radioactive contamination of the environment via deposited radionuclides such as radiocesium and (131)I. Evaluating the extent and level of environmental contamination is critical to protecting citizens in affected areas and to planning decontamination efforts. However, a standardized soil sampling protocol is needed in such emergencies to facilitate the collection of large, tractable samples for measuring gamma-emitting radionuclides. In this study, we developed an emergency soil sampling protocol based on preliminary sampling from the FDNPP accident-affected area. We also present the results of a preliminary experiment aimed at evaluating the influence of various procedures (e.g., mixing, number of samples) on measured radioactivity. Results show that sample mixing strongly affects measured radioactivity in soil samples. Furthermore, for homogenization, shaking the plastic sample container at least 150 times or disaggregating the soil by hand-rolling in a disposable plastic bag is required. Finally, we determined that five soil samples within a 3 m × 3 m area are the minimum number required to reduce measurement uncertainty in the emergency soil sampling protocol proposed here. Copyright © 2014 Elsevier Ltd. All rights reserved.
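The recommendation of a minimum number of samples per plot follows the usual standard-error argument: averaging n replicates shrinks the uncertainty of the plot mean by roughly a factor of sqrt(n). A minimal sketch, with illustrative values rather than data from the study:

```python
import statistics

def plot_activity(samples):
    """Combine replicate soil measurements (e.g. Bq/kg) from one plot.

    Returns the plot mean and its standard error; collecting n replicates
    shrinks the standard error by sqrt(n), which is why the protocol sets
    a minimum number of samples per plot."""
    n = len(samples)
    mean = statistics.fmean(samples)
    se = statistics.stdev(samples) / n ** 0.5 if n > 1 else float("nan")
    return mean, se
```

With five illustrative replicates, `plot_activity([100, 110, 90, 105, 95])` gives a mean of 100.0 with a standard error of about 3.5, versus a single-sample standard deviation of about 7.9.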
Konstan, Joseph; Iantaffi, Alex; Wilkerson, J. Michael; Galos, Dylan; Simon Rosser, B. R.
2017-01-01
Researchers use protocols to screen for suspicious survey submissions in online studies. We evaluated how well a de-duplication and cross-validation process detected invalid entries. Data were from the Sexually Explicit Media Study, an Internet-based HIV prevention survey of men who have sex with men. Using our protocol, 146 (11.6 %) of 1254 entries were identified as invalid. Most indicated changes to the screening questionnaire to gain entry (n = 109, 74.7 %), matched other submissions’ payment profiles (n = 56, 41.8 %), or featured an IP address that was recorded previously (n = 43, 29.5 %). We found few demographic or behavioral differences between valid and invalid samples, however. Invalid submissions had lower odds of reporting HIV testing in the past year (OR 0.63), and higher odds of requesting no payment compared to check payments (OR 2.75). Thus, rates of HIV testing would have been underestimated if invalid submissions had not been removed, and payment may not be the only incentive for invalid participation. PMID:25805443
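The de-duplication checks listed in the abstract (repeated IP addresses, matching payment profiles) can be sketched as a single pass over submissions. The field names and the rule that the earliest entry is treated as valid are illustrative assumptions, not the study's actual screening protocol.

```python
def screen_submissions(entries):
    """Flag entries whose IP address or payment profile matches an
    earlier submission; earlier entries are treated as valid."""
    seen_ips, seen_payments, flagged = set(), set(), []
    for entry in entries:
        reasons = []
        if entry["ip"] in seen_ips:
            reasons.append("repeat IP")
        if entry["payment"] in seen_payments:
            reasons.append("matching payment profile")
        if reasons:
            flagged.append((entry["id"], reasons))
        seen_ips.add(entry["ip"])
        seen_payments.add(entry["payment"])
    return flagged
```

A real protocol would combine such flags with screening-questionnaire changes and manual review, as the study did, rather than excluding on any single signal.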
A Constrained and Versioned Data Model for TEAM Data
NASA Astrophysics Data System (ADS)
Andelman, S.; Baru, C.; Chandra, S.; Fegraus, E.; Lin, K.
2009-04-01
The objective of the Tropical Ecology Assessment and Monitoring Network (www.teamnetwork.org) is "To generate real time data for monitoring long-term trends in tropical biodiversity through a global network of TEAM sites (i.e. field stations in tropical forests), providing an early warning system on the status of biodiversity to effectively guide conservation action". To achieve this, the TEAM Network operates by collecting data via standardized protocols at TEAM Sites. The standardized TEAM protocols include the Climate, Vegetation and Terrestrial Vertebrate Protocols. Some sites also implement additional protocols. There are currently 7 TEAM Sites with plans to grow the network to 15 by June 30, 2009 and 50 TEAM Sites by the end of 2010. At each TEAM Site, data is gathered as defined by the protocols and according to a predefined sampling schedule. The TEAM data is organized and stored in a database based on the TEAM spatio-temporal data model. This data model is at the core of the TEAM Information System: it executes the spatio-temporal queries and analytical functions performed on TEAM data, and defines the object data types, relationships and operations that maintain database integrity. The TEAM data model contains object types including types for observation objects (e.g., birds, butterflies and trees), sampling unit, person, role, protocol, site, and the relationships among these object types. Each observation data record is a set of attribute values of an observation object and is always associated with a sampling unit, an observation timestamp or time interval, a versioned protocol and data collectors. The operations on the TEAM data model can be classified as read operations, insert operations and update operations.
Following are some typical operations: The operation get(site, protocol, [sampling unit block, sampling unit,] start time, end time) returns all data records using the specified protocol and collected at the specified site, block, sampling unit and time range. The operation insertSamplingUnit(sampling unit, site, protocol) saves a new sampling unit into the data model and links it with the site and protocol. The operation updateSamplingUnit(sampling_unit_id, attribute, value) changes the attribute (e.g. latitude or longitude) of the sampling unit to the specified value. The operation insertData(observation record, site, protocol, sampling unit, timestamps, data collectors) saves a new observation record into the database and associates it with specified objects. The operation updateData(protocol, data_id, attribute, value) modifies the attribute of an existing observation record to the specified value. All the insert or update operations require: 1) authorization to ensure the user has necessary privileges to perform the operation; 2) timestamp validation to ensure the observation timestamps are in the designated time range specified in the sampling schedule; 3) data validation to check that the data records use correct taxonomy terms and data values. No authorization is performed for get operations, but under some specific conditions, a username may be required for the purpose of authentication. Along with the validations above, the TEAM data model also supports human-based validation of observed data through the Data Review subsystem to ensure data quality. The data review is implemented by adding two attributes, review_tag and review_comment, to each observation data record. The attribute review_tag is used by a reviewer to specify the quality of data, and the attribute review_comment is for reviewers to give more information when a problem is identified.
The review_tag attribute can be populated by either the system conducting QA/QC tests or by pre-specified scientific experts. The following is the review operation, which is actually a special case of the operation updateData: The operation updateReview(protocol, data_id, judgment, comment) sets the attributes review_tag and review_comment to the specified values. By systematically tracking every step, the TEAM data model can roll back to any previous state. This is achieved by introducing a historical data container for each editable object type. When the operation updateData is applied to an object to modify its attribute, the object will be tagged with the current timestamp and the name of the user who conducted the operation, the tagged object will then be moved into the historical data container, and finally a new object will be created with the new value for the specified attribute. The diagram illustrates the architecture of the TEAM data management system. A data collector can use the Data Ingestion subsystem to load new data records into the TEAM data model. The system establishes a first level of review (i.e. meets minimum data standards via QA/QC tests). Further review is done via experts, who can verify and provide their comments on data records through the Data Review subsystem. The data editor can then address data records based on the reviewer's comments. Users can use the Data Query and Download application to find data by sites, protocols and time ranges. The Data Query and Download system packages selected data with the data license and important metadata information into a single package and delivers it to the user.
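The roll-back mechanism described above (tag the outgoing object with a timestamp and user, move it into a historical container, then create the new version) might be sketched as follows; the class and field names are illustrative, not taken from the TEAM system:

```python
import copy
from datetime import datetime, timezone

class VersionedStore:
    """Sketch of the update-with-history pattern described for the
    TEAM data model: updates never overwrite, they archive."""
    def __init__(self):
        self.current = {}   # object_id -> attribute dict
        self.history = []   # archived prior versions

    def insert(self, object_id, attributes):
        self.current[object_id] = dict(attributes)

    def update(self, object_id, attribute, value, user):
        old = self.current[object_id]
        # Tag the outgoing version and move it into the history container.
        archived = copy.deepcopy(old)
        archived["_archived_at"] = datetime.now(timezone.utc).isoformat()
        archived["_archived_by"] = user
        archived["_object_id"] = object_id
        self.history.append(archived)
        # Create the new current version with the modified attribute.
        new = dict(old)
        new[attribute] = value
        self.current[object_id] = new

store = VersionedStore()
store.insert("su-1", {"latitude": -1.5, "longitude": 36.8})
store.update("su-1", "latitude", -1.6, user="editor1")
print(store.current["su-1"]["latitude"], len(store.history))
```

Rolling back to any previous state then reduces to replaying or restoring entries from the history list.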
Flow cytometry for enrichment and titration in massively parallel DNA sequencing
Sandberg, Julia; Ståhl, Patrik L.; Ahmadian, Afshin; Bjursell, Magnus K.; Lundeberg, Joakim
2009-01-01
Massively parallel DNA sequencing is revolutionizing genomics research throughout the life sciences. However, the reagent costs and labor requirements in current sequencing protocols are still substantial, although improvements are continuously being made. Here, we demonstrate an effective alternative to existing sample titration protocols for the Roche/454 system using Fluorescence Activated Cell Sorting (FACS) technology to determine the optimal DNA-to-bead ratio prior to large-scale sequencing. Our method, which eliminates the need for the costly pilot sequencing of samples during titration, is capable of rapidly providing accurate DNA-to-bead ratios that are not biased by the quantification and sedimentation steps included in current protocols. Moreover, we demonstrate that FACS sorting can be readily used to highly enrich fractions of beads carrying template DNA, with near total elimination of empty beads and no downstream sacrifice of DNA sequencing quality. Automated enrichment by FACS is a simple approach to obtain pure samples for bead-based sequencing systems, and offers an efficient, low-cost alternative to current enrichment protocols. PMID:19304748
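The DNA-to-bead ratio being titrated here is governed by Poisson loading statistics: at an average of λ templates per bead, a fraction e^(-λ) of beads remain empty and λ·e^(-λ) carry exactly one template. A small sketch of those fractions (not part of the published protocol):

```python
import math

def bead_fractions(lam: float):
    """Poisson loading: fractions of beads with 0 and exactly 1 template
    at an average DNA-to-bead ratio lam."""
    empty = math.exp(-lam)
    single = lam * math.exp(-lam)
    return empty, single

# Single-template occupancy peaks at lam = 1, but roughly 37% of beads
# then remain empty, which is why enrichment of template-carrying
# beads matters downstream.
empty, single = bead_fractions(1.0)
print(round(empty, 3), round(single, 3))  # → 0.368 0.368
```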
Jablonski, Rita A; Winstead, Vicki; Azuero, Andres; Ptacek, Travis; Jones-Townsend, Corteza; Byrd, Elizabeth; Geisinger, Maria L; Morrow, Casey
2017-09-01
Individuals with dysphagia who reside in nursing homes often receive inadequate mouth care and experience poor oral health. From a policy perspective, the combination of absent evidence-based mouth care protocols coupled with insufficient dental coverage create a pool of individuals at great risk for preventable infectious illnesses that contribute to high health care costs. The purpose of the current study was to determine (a) the safety of a mouth care protocol tailored for individuals with dysphagia residing in nursing homes without access to suction equipment, and (b) the feasibility of collecting oral and fecal samples for microbiota analyses. The mouth care protocol resulted in improved oral hygiene without aspiration, and oral and fecal samples were safely collected from participants. Policies supporting ongoing testing of evidence-based mouth care protocols for individuals with dysphagia are important to improve quality, demonstrate efficacy, and save health care costs. [Journal of Gerontological Nursing, 43(9), 9-15.]. Copyright 2017, SLACK Incorporated.
METHOD FOR MICRORNA ISOLATION FROM CLINICAL SERUM SAMPLES
Li, Yu; Kowdley, Kris V.
2012-01-01
MicroRNAs are a group of intracellular non-coding RNA molecules that have been implicated in a variety of human diseases. Due to their high stability in blood, microRNAs released into circulation could be potentially utilized as non-invasive biomarkers for diagnosis or prognosis. Current microRNA isolation protocols are specifically designed for solid tissues and are impractical for biomarker development utilizing small-volume serum samples on a large scale. Thus, a protocol for microRNA isolation from serum is needed to accommodate these conditions in biomarker development. To establish such a protocol, we developed a simplified approach to normalize sample input by using a single synthetic spike-in microRNA. We evaluated three commonly used commercial microRNA isolation kits for the best performance by comparing RNA quality and yield. The manufacturer’s protocol was further modified to improve the microRNA yield from 200 μL of human serum. MicroRNAs isolated from a large set of clinical serum samples were tested on the miRCURY LNA real-time PCR panel and confirmed to be suitable for high-throughput microRNA profiling. In conclusion, we have established a proven method for microRNA isolation from clinical serum samples suitable for microRNA biomarker development. PMID:22982505
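Normalizing to a single synthetic spike-in microRNA, as described above, typically amounts to a ΔCt calculation. A minimal sketch under the usual assumption of ideal (2-fold per cycle) amplification; the Ct values are hypothetical:

```python
def relative_abundance(ct_target: float, ct_spike: float) -> float:
    """Abundance of a target microRNA relative to a synthetic spike-in,
    assuming ideal doubling per PCR cycle: 2^-(Ct_target - Ct_spike)."""
    return 2.0 ** -(ct_target - ct_spike)

# Hypothetical Ct values: target crosses threshold 6 cycles after
# the spike-in, i.e. it is 2^-6 as abundant.
print(relative_abundance(28.0, 22.0))  # → 0.015625
```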
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-18
...] Solicitation of Information and Recommendations for Revising OIG's Provider Self-Disclosure Protocol AGENCY... Register notice informs the public that OIG: (1) Intends to update the Provider Self-Disclosure Protocol... Provider Self-Disclosure Protocol (the Protocol) to establish a process for health care providers to...
NASA Astrophysics Data System (ADS)
Lane, Rebecca E.; Korbie, Darren; Anderson, Will; Vaidyanathan, Ramanathan; Trau, Matt
2015-01-01
Exosomes are vesicles which have garnered interest due to their diagnostic and therapeutic potential. Isolation of pure yields of exosomes from complex biological fluids whilst preserving their physical characteristics is critical for downstream applications. In this study, we use 100 nm liposomes of 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC) and cholesterol as a model system to assess the effect of exosome isolation protocols on vesicle recovery and size distribution using a single-particle analysis method. We demonstrate that liposome size distribution and ζ-potential are comparable to extracted exosomes, making them an ideal model for comparison studies. Four different purification protocols were evaluated, with liposomes robustly isolated by three of them. Recovered yields varied and liposome size distribution was unaltered during processing, suggesting that these protocols do not induce particle aggregation. This leads us to conclude that the size distribution profile and characteristics of vesicles are stably maintained during processing and purification, suggesting that reports detailing how exosomes derived from tumour cells differ in size to those from normal cells are reporting a real phenomenon. However, we hypothesize that larger particles present in most purified exosome samples represent co-purified contaminating non-exosome debris. These isolation techniques are therefore likely nonspecific and may co-isolate non-exosome material of similar physical properties.
Sato, Takahiro; Orai, Yoshihisa; Suzuki, Yuya; Ito, Hiroyuki; Isshiki, Toshiyuki; Fukui, Munetoshi; Nakamura, Kuniyasu; Schamp, C T
2017-10-01
To improve the reliability of silicon carbide (SiC) electronic power devices, the characteristics of various kinds of crystal defects should be precisely understood. Of particular importance is understanding the correlation between the surface morphology and the near-surface dislocations. In order to analyze the dislocations near the surface of 4H-SiC wafers, a dislocation analysis protocol has been developed. This protocol consists of the following steps: (1) inspection of surface defects using low energy scanning electron microscopy (LESEM), (2) identification of small and shallow etch pits using KOH low temperature etching, (3) classification of etch pits using LESEM, (4) preparation of a several-hundred-nanometer-thick specimen using the in-situ focused ion beam micro-sampling® technique, (5) crystallographic analysis using the selected diffraction mode of the scanning transmission electron microscope (STEM), and (6) determination of the Burgers vector using multi-directional STEM (MD-STEM). The results show a correlation between triangular terrace-shaped surface defects and hexagonal etch pits arising from threading dislocations, and between linear-shaped surface defects and elliptical etch pits arising from basal plane dislocations. Through observation of the sample from two orthogonal directions via the MD-STEM technique, a basal plane dislocation is found to dissociate into an extended dislocation bound by two partial dislocations. The protocol developed and presented in this paper enables one to correlate near-surface defects of a 4H-SiC wafer with the root-cause dislocations giving rise to those surface defects. © The Author 2017. Published by Oxford University Press on behalf of The Japanese Society of Microscopy. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Attrition of limestone by impact loading in fluidized beds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fabrizio Scala; Fabio Montagnaro; Piero Salatino
2007-09-15
The present study addresses limestone attrition and fragmentation associated with impact loading, a process which may occur extensively in various regions of fluidized bed (FB) combustors/gasifiers, primarily the jetting region of the bottom bed, the exit region of the riser, and the cyclone. An experimental protocol for the characterization of the propensity of limestone to undergo attrition/fragmentation by impact loading is reported. The application of the protocol is demonstrated with reference to an Italian limestone whose primary fragmentation and attrition by surface wear have already been characterized in previous studies. The experimental procedure is based on the characterization of the amount and particle size distribution of the debris generated upon the impact of samples of sorbent particles against a target. Experiments were carried out at a range of particle impact velocities between 10 and 45 m/s, consistent with jet velocities corresponding to typical pressure drops across FB gas distributors. The protocol has been applied to either raw or preprocessed limestone samples. In particular, the effect of calcination, sulfation, and calcination/recarbonation cycles on the impact damage suffered by sorbent particles has been assessed. The measurement of particle voidage and pore size distribution by mercury intrusion was also accomplished to correlate fragmentation with the structural properties of the sorbent samples. Fragmentation by impact loading of the limestone is significant. Lime displays the largest propensity to undergo impact damage, followed by the sorbent sulfated to exhaustion, the recarbonated sorbent, and the raw limestone. Fragmentation of the raw limestone and of the sulfated lime follows a pattern typical of the failure of brittle materials. The fragmentation behavior of lime and recarbonated lime better conforms to a disintegration failure mode, with an extensive generation of very fine fragments. 27 refs., 9 figs., 1 tab.
Aloisio, Michelangelo; Bortot, Barbara; Gandin, Ilaria; Severini, Giovanni Maria; Athanasakis, Emmanouil
2017-02-01
Chimerism status evaluation of post-allogeneic hematopoietic stem cell transplantation samples is essential to predict post-transplant relapse. The most commonly used technique capable of detecting small increments of chimerism is quantitative real-time PCR. Although this method is already used in several laboratories, previously described protocols often lack sensitivity and the amount of DNA required for each chimerism analysis is too high. In the present study, we compared a novel semi-nested allele-specific real-time PCR (sNAS-qPCR) protocol with our in-house standard allele-specific real-time PCR (gAS-qPCR) protocol. We selected two genetic markers and analyzed the technical parameters (slope, y-intercept, R2, and standard deviation) used to determine the performance of the two protocols. The sNAS-qPCR protocol showed better sensitivity and precision. Moreover, the sNAS-qPCR protocol requires, as input, only 10 ng of DNA, which is at least 10-fold less than the gAS-qPCR protocols described in the literature. Finally, the proposed sNAS-qPCR protocol could prove very useful for performing chimerism analysis with a small amount of DNA, as in the case of blood cell subsets.
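The standard-curve slope, one of the parameters compared above, maps to amplification efficiency through the widely used relation E = 10^(-1/slope) - 1; an ideal doubling reaction has slope -1/log10(2) ≈ -3.32. A minimal sketch:

```python
def amplification_efficiency(slope: float) -> float:
    """PCR amplification efficiency from a standard-curve slope
    (Ct plotted against log10 of input quantity). A perfectly
    doubling reaction has slope -1/log10(2), about -3.32,
    giving an efficiency of 1.0 (100%)."""
    return 10.0 ** (-1.0 / slope) - 1.0

ideal_slope = -3.3219  # ~= -1 / log10(2)
print(round(amplification_efficiency(ideal_slope), 3))  # → 1.0
```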
Kizilbash, Quratulain; Jost, Kenneth; Armitige, Lisa; Griffith, David E; Dunbar, Denise; Seaworth, Barbara
2017-01-01
Abstract Background Non tuberculous mycobacteria (NTM) are widely distributed in soil and water. NTM/Mycobacterium tuberculosis complex (MTBC) mixes may yield positive AFB smears falsely attributed to tuberculosis (TB) and false-resistance profiles for TB due to contaminated diagnostic samples. This, as well as isolation of NTM, may pose diagnostic and management problems. Texas Center for Infectious Disease (TCID) is a hospital for patients with confirmed TB. After a cluster of isolates of Mycobacterium gordonae was identified, a quality assurance review found inadequate collection protocols, including patients eating and drinking prior to collection. Changes made to the sputum collection protocol included reeducation of respiratory therapists and a sterile saline rinse intervention prior to sputum collection. Methods All sputa collected for AFB culture from diagnosed TB patients at TCID from January 1st, 2014 to December 31st, 2014 prior to the intervention and from August 1st, 2016 to January 31st, 2017, the 6 months following the quality assurance intervention, were included. Sputum samples were processed at the Texas Department of State and Health Services (DSHS) Laboratory. Results A total of 1,853 sputum samples were processed; 1,288 from 2014 and 565 following the intervention. NTM decreased from 56 (4.3%) to 7 (1.2%) after the quality assurance intervention was instituted, for an NTM decrease of 75.0%. M. gordonae decreased by 78.6%. No patients had evidence of NTM disease. Conclusion A breach in sputum collection protocols at TCID accounted for the increase in NTM isolation in 2014, half of which were M. gordonae. The reeducation of respiratory therapy staff and initiation of sterile saline rinse prior to sputum collection resulted in a significant reduction in the overall NTM rate. M. gordonae was isolated only three times following the intervention.
At TCID, a location where tap water and bottled water contains NTM, drinking these prior to sputum collection possibly contributed to the cluster for NTM, especially M. gordonae. We recommend rinsing the mouth with sterile saline or water prior to sputum collection to decrease isolation of rarely pathogenic NTM. Disclosures All authors: No reported disclosures.
Chabirand, Aude; Loiseau, Marianne; Renaudin, Isabelle; Poliakoff, Françoise
2017-01-01
A working group established in the framework of the EUPHRESCO European collaborative project aimed to compare and validate diagnostic protocols for the detection of “Flavescence dorée” (FD) phytoplasma in grapevines. Seven molecular protocols were compared in an interlaboratory test performance study where each laboratory had to analyze the same panel of samples consisting of DNA extracts prepared by the organizing laboratory. The tested molecular methods consisted of universal and group-specific real-time and end-point nested PCR tests. Different statistical approaches were applied to this collaborative study. Firstly, there was the standard statistical approach consisting in analyzing samples which are known to be positive and samples which are known to be negative and reporting the proportion of false-positive and false-negative results to respectively calculate diagnostic specificity and sensitivity. This approach was supplemented by the calculation of repeatability and reproducibility for qualitative methods based on the notions of accordance and concordance. Other new approaches were also implemented, based, on the one hand, on the probability of detection model, and, on the other hand, on Bayes’ theorem. These various statistical approaches are complementary and give consistent results. Their combination, and in particular, the introduction of new statistical approaches give overall information on the performance and limitations of the different methods, and are particularly useful for selecting the most appropriate detection scheme with regards to the prevalence of the pathogen. Three real-time PCR protocols (methods M4, M5 and M6 respectively developed by Hren (2007), Pelletier (2009) and under patent oligonucleotides) achieved the highest levels of performance for FD phytoplasma detection. This paper also addresses the issue of indeterminate results and the identification of outlier results. 
The statistical tools presented in this paper and their combination can be applied to many other studies concerning plant pathogens and other disciplines that use qualitative detection methods. PMID:28384335
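The standard statistical approach described above (reporting false-positive and false-negative proportions to derive diagnostic specificity and sensitivity) and the Bayes'-theorem extension can be sketched in a few lines; the panel counts below are hypothetical, not the study's data:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of known-positive samples the test detects."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of known-negative samples the test clears."""
    return tn / (tn + fp)

def ppv(sens: float, spec: float, prevalence: float) -> float:
    """Bayes' theorem: probability that a positive result is a true
    positive, given the prevalence of the pathogen."""
    true_pos = sens * prevalence
    false_pos = (1.0 - spec) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical panel: 50 known-positive and 50 known-negative extracts.
sens = sensitivity(tp=48, fn=2)   # 0.96
spec = specificity(tn=49, fp=1)   # 0.98
# At low prevalence, even a good test yields many false positives,
# which is why prevalence matters when selecting a detection scheme.
print(round(ppv(sens, spec, prevalence=0.05), 3))
```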
DeRose, Yoko S.; Gligorich, Keith M.; Wang, Guoying; Georgelas, Ann; Bowman, Paulette; Courdy, Samir J.; Welm, Alana L.; Welm, Bryan E.
2013-01-01
Research models that replicate the diverse genetic and molecular landscape of breast cancer are critical for developing the next generation therapeutic entities that can target specific cancer subtypes. Patient-derived tumorgrafts, generated by transplanting primary human tumor samples into immune-compromised mice, are a valuable method to model the clinical diversity of breast cancer in mice, and are a potential resource in personalized medicine. Primary tumorgrafts also enable in vivo testing of therapeutics and make possible the use of patient cancer tissue for in vitro screens. Described in this unit are a variety of protocols including tissue collection, biospecimen tracking, tissue processing, transplantation, and 3-dimensional culturing of xenografted tissue, that enable use of bona fide uncultured human tissue in designing and validating cancer therapies. PMID:23456611
A quarantine protocol for analysis of returned extraterrestrial samples
NASA Technical Reports Server (NTRS)
Bagby, J. R.; Sweet, H. C.; Devincenzi, D. L.
1983-01-01
A protocol is presented for the analysis at an earth-orbiting quarantine facility of return samples of extraterrestrial material that might contain (nonterrestrial) life forms. The protocol consists of a series of tests designed to determine whether the sample, conceptualized as a 1-kg sample of Martian soil, is free from nonterrestrial biologically active agents and so may safely be sent to a terrestrial containment facility, or it exhibits biological activity requiring further (second-order) testing outside the biosphere. The first-order testing procedure seeks to detect the presence of any replicating organisms or toxic substances through a series of experiments including gas sampling, analysis of radioactivity, stereomicroscopic inspection, chemical analysis, microscopic examination, the search for metabolic products under growth conditions, microbiologicl assays, and the challenge of cultured cells with any agents found or with the extraterrestrial material as is. Detailed plans for the second-order testing would be developed in response to the actual data received from primary testing.
A Field-Based Cleaning Protocol for Sampling Devices Used in Life-Detection Studies
NASA Astrophysics Data System (ADS)
Eigenbrode, Jennifer; Benning, Liane G.; Maule, Jake; Wainwright, Norm; Steele, Andrew; Amundsen, Hans E. F.
2009-06-01
Analytical approaches to extant and extinct life detection involve molecular detection often at trace levels. Thus, removal of biological materials and other organic molecules from the surfaces of devices used for sampling is essential for ascertaining meaningful results. Organic decontamination to levels consistent with null values on life-detection instruments is particularly challenging at remote field locations where Mars analog field investigations are carried out. Here, we present a seven-step, multi-reagent decontamination method that can be applied to sampling devices while in the field. In situ lipopolysaccharide detection via low-level endotoxin assays and molecular detection via gas chromatography-mass spectrometry were used to test the effectiveness of the decontamination protocol for sampling of glacial ice with a coring device and for sampling of sediments with a rover scoop during deployment at Arctic Mars-analog sites in Svalbard, Norway. Our results indicate that the protocols and detection technique sufficiently remove and detect low levels of molecular constituents necessary for life-detection tests.
Use of Electronic Hand-held Devices for Collection of Savannah River Site Environmental Data - 13329
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marberry, Hugh; Moore, Winston
2013-07-01
Savannah River Nuclear Solutions has begun using Xplore Tablet PCs to collect data in the field for soil samples, groundwater samples, air samples and round sheets at the Savannah River Site (SRS). EPA guidelines for groundwater sampling are incorporated into the application to ensure the sample technician follows the proper protocol. The sample technician is guided through the process for sampling and round sheet data collection by a series of menus and input boxes. Field measurements and well stabilization information are entered into the tablet for uploading into the Environmental Restoration Data Management System (ERDMS). The process helps to eliminate input errors and provides data integrity. A soil sample technician has the ability to collect information about the location of the sample and field parameters, describe the soil sample, print bottle labels, and print a chain of custody for the samples collected. An air sample technician has the ability to record flow, pressure and hours of operation, and to print bottle labels and chain of custody for samples they collect. Round sheets are collected using the information provided in the various procedures. The data are collected and uploaded into ERDMS. The equipment used is weatherproof and hardened for field use. Global Positioning System (GPS) capabilities are integrated into the applications to provide the location where samples were collected and to help sample technicians locate wells that are not visited often. (authors)
Frazier, Melanie; Miller, A. Whitman; Lee, Henry; Reusser, Deborah A.
2013-01-01
Discharge from the ballast tanks of ships is one of the primary vectors of nonindigenous species in marine environments. To mitigate this environmental and economic threat, international, national, and state entities are establishing regulations to limit the concentration of living organisms that may be discharged from the ballast tanks of ships. The proposed discharge standards have ranged from zero detectable organisms to 3. If standard sampling methods are used, verifying whether ballast discharge complies with these stringent standards will be challenging due to the inherent stochasticity of sampling. Furthermore, at low concentrations, very large volumes of water must be sampled to find enough organisms to accurately estimate concentration. Despite these challenges, adequate sampling protocols comprise a critical aspect of establishing standards because they help define the actual risk level associated with a standard. A standard that appears very stringent may be effectively lax if it is paired with an inadequate sampling protocol. We describe some of the statistical issues associated with sampling at low concentrations to help regulators understand the uncertainties of sampling as well as to inform the development of sampling protocols that ensure discharge standards are adequately implemented.
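The sampling stochasticity discussed above is commonly modelled as a Poisson process: the probability of capturing at least one organism in a sampled volume V at true concentration c is 1 - e^(-cV). A minimal sketch of the implied sample-volume requirement (illustrative values, not the regulatory standards):

```python
import math

def p_detect(concentration_per_m3: float, volume_m3: float) -> float:
    """Probability of capturing at least one organism in the sampled
    volume, assuming organisms are Poisson-distributed in the water."""
    return 1.0 - math.exp(-concentration_per_m3 * volume_m3)

def volume_for_confidence(concentration_per_m3: float,
                          confidence: float) -> float:
    """Volume that must be sampled to detect at least one organism
    with the given probability at a true concentration."""
    return -math.log(1.0 - confidence) / concentration_per_m3

# At 10 organisms/m^3, roughly 0.3 m^3 must be sampled for a 95%
# chance of seeing even one organism; at lower concentrations the
# required volume grows proportionally.
print(round(volume_for_confidence(10, 0.95), 3))
```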
Johnsen, Hege Mari; Slettebø, Åshild; Fossum, Mariann
2016-05-01
The home healthcare context can be unpredictable and complex, and requires registered nurses with a high level of clinical reasoning skills and professional autonomy. Thus, additional knowledge about registered nurses' clinical reasoning performance during patient home care is required. The aim of this study is to describe the cognitive processes and thinking strategies used by recently graduated registered nurses while caring for patients in home healthcare clinical practice. An exploratory qualitative think-aloud design with protocol analysis was used. The setting comprised home healthcare visits to patients with stroke, diabetes, and chronic obstructive pulmonary disease in seven healthcare districts in southern Norway; the participants were a purposeful sample of eight registered nurses with one year of experience. Each nurse was interviewed using the concurrent think-aloud technique in three different patient home healthcare clinical practice visits, for a total of 24 home healthcare visits. Follow-up interviews were conducted with each participant. The think-aloud sessions were transcribed and analysed using three-step protocol analysis. Recently graduated registered nurses focused on both general nursing concepts and concepts specific to the domains and tasks of home healthcare services, as well as to different patient groups. Additionally, participants used several assertion types, cognitive processes, and thinking strategies. Our results showed that recently graduated registered nurses used both simple and complex cognitive processes involving both inductive and deductive reasoning. However, their reasoning was more reactive than proactive. The results may contribute to nursing practice in terms of developing effective nursing education programmes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle
2014-09-27
Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. 
Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.
Microstructural controls on the macroscopic behavior of geo-architected rock samples
NASA Astrophysics Data System (ADS)
Mitchell, C. A.; Pyrak-Nolte, L. J.
2017-12-01
Reservoir caprocks are known to span a range of mechanical behavior, from elastic granitic units to visco-elastic shale units. Whether a rock behaves elastically, visco-elastically, or plastically depends on both the compositional and the textural or microstructural components of the rock, and on how these components are spatially distributed. In this study, geo-architected caprock fabrication was performed to develop synthetic rock for studying the role of rock rheology in fracture deformation, fluid flow, and geochemical alteration. Samples were geo-architected with Portland Type II cement, Ottawa sand, and different clays (kaolinite, illite, and montmorillonite). The relative percentages of these mineral components were manipulated to generate different rock types. With set protocols, the mineralogical content, texture, and certain structural aspects of the rock were controlled. These protocols ensure that identical samples with the same morphological and mechanical characteristics are constructed, thus overcoming issues that may arise from the heterogeneity and high anisotropy of natural rock samples. Several types of homogeneous geo-architected rock samples were created, and in some cases the methods were varied to manipulate the physical parameters of the rocks. Characterization showed that the samples exhibit good repeatability: rocks with the same mineralogical content generally yielded similar compressional and shear wave velocities, UCS, and densities. Geo-architected rocks with 10% clay in the matrix had lower moisture content and effective porosities than rocks with no clay. The process by which clay is added to the matrix can strongly affect the resulting compressive strength and physical properties of the geo-architected sample. Acknowledgment: This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Geosciences Research Program under Award Number (DE-FG02-09ER16022).
Rapid Microbial Sample Preparation from Blood Using a Novel Concentration Device
Boardman, Anna K.; Campbell, Jennifer; Wirz, Holger; Sharon, Andre; Sauer-Budge, Alexis F.
2015-01-01
Appropriate care for bacteremic patients is dictated by the amount of time needed for an accurate diagnosis. However, the concentration of microbes in the blood is extremely low in these patients (1–100 CFU/mL), traditionally requiring growth (blood culture) or amplification (e.g., PCR) for detection. Current culture-based methods can take a minimum of two days, while faster methods like PCR require a sample free of inhibitors (i.e., blood components). Though commercial kits exist for the removal of blood from these samples, they typically capture only DNA, thereby necessitating the use of blood culture for antimicrobial testing. Here, we report a novel, scaled-up sample preparation protocol carried out in a new microbial concentration device. The process can efficiently lyse 10 mL of bacteremic blood while maintaining the microorganisms’ viability, giving a 30-μL final output volume. A suite of six microorganisms (Staphylococcus aureus, Streptococcus pneumoniae, Escherichia coli, Haemophilus influenzae, Pseudomonas aeruginosa, and Candida albicans) at a range of clinically relevant concentrations was tested. All of the microorganisms had recoveries greater than 55% at the highest tested concentration of 100 CFU/mL, with three of them having over 70% recovery. At the lowest tested concentration of 3 CFU/mL, two microorganisms had recoveries of ca. 40–50% while the other four gave recoveries greater than 70%. Using a Taqman assay for methicillin-sensitive S. aureus (MSSA) to prove the feasibility of downstream analysis, we show that our microbial pellets are clean enough for PCR amplification. PCR testing of 56 spiked-positive and negative samples gave a specificity of 0.97 and a sensitivity of 0.96, showing that our sample preparation protocol holds great promise for the rapid diagnosis of bacteremia directly from a primary sample. PMID:25675242
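The reported specificity (0.97) and sensitivity (0.96) follow from the standard confusion-matrix definitions. The sketch below uses a hypothetical split of the 56 samples, since the abstract does not give the per-cell counts; the numbers are chosen only to reproduce metrics of the same magnitude.

```python
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: fraction of spiked-positive samples correctly detected."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: fraction of negative samples correctly called negative."""
    return tn / (tn + fp)

# Hypothetical split of the 56 samples (actual counts are not given in the abstract):
tp, fn = 24, 1   # spiked-positive samples: detected vs. missed
tn, fp = 30, 1   # negative samples: correctly negative vs. false positive
print(f"sensitivity={sensitivity(tp, fn):.2f}, specificity={specificity(tn, fp):.2f}")
```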
Lippi, Giuseppe; Montagnana, Martina; Giavarina, Davide
2006-01-01
Owing to remarkable advances in automation, laboratory technology and informatics, the pre-analytical phase has become the major source of variability in laboratory testing. The present survey investigated the development of several pre-analytical processes within a representative cohort of Italian clinical laboratories. A seven-point questionnaire was designed to investigate the following issues: 1a) the mean outpatient waiting time before check-in and 1b) the mean time from check-in to sample collection; 2) the mean time from sample collection to analysis; 3) the type of specimen collected for clinical chemistry testing; 4) the degree of pre-analytical automation; 5a) the number of samples shipped to other laboratories and 5b) the availability of standardised protocols for transportation; 6) the conditions for specimen storage; and 7) the availability and type of guidelines for management of unsuitable specimens. The questionnaire was administered to 150 laboratory specialists attending the SIMEL (Italian Society of Laboratory Medicine) National Meeting in June 2006. 107 questionnaires (71.3%) were returned. Data analysis revealed a high degree of variability among laboratories in the time required for check-in, outpatient sampling, sample transportation to the referral laboratory, and analysis upon arrival. Only 31% of laboratories have automated some pre-analytical steps. Of the 87% of laboratories that ship specimens to other facilities without sample preparation, 19% have no standardised protocol for transportation. For conventional clinical chemistry testing, 74% of the laboratories use serum evacuated tubes (59% with and 15% without serum separator), whereas the remaining 26% use lithium-heparin evacuated tubes (11% with and 15% without plasma separator). The storage period and conditions for rerun/retest vary widely.
Only 63% of laboratories have a codified procedure for the management of unsuitable specimens, which are recognised by visual inspection (69%) or automatic detection (29%). Only 56% of the laboratories have standardised procedures for the management of unsuitable specimens, which vary widely on a local basis. The survey highlights broad heterogeneity in several pre-analytical processes among Italian laboratories. The lack of reliable guidelines encompassing evidence-based practice is a major problem for the standardisation of this crucial part of the testing process and represents a major challenge for laboratory medicine in the 2000s.
Directed Diffusion Modelling for Tesso Nilo National Parks Case Study
NASA Astrophysics Data System (ADS)
Yasri, Indra; Safrianti, Ery
2018-01-01
Directed Diffusion (DD) can achieve energy efficiency in Wireless Sensor Networks (WSNs). This paper proposes a Directed Diffusion model for a Tesso Nilo National Park (TNNP) case study. The modelling involves four scenario stages. The first stage designates the sampling area from GPS coordinates; the sampling area is optimized from 500 m x 500 m up to 1000 m x 1000 m in 100 m increments. The second stage is sensor node placement: sensor nodes are distributed across the sampling area in three different quantities (20, 30, and 40 nodes), and one of these quantities is chosen as the optimized placement. The third stage implements all scenarios from stages 1 and 2 in the DD model. In the final stage, an evaluation identifies the most energy-efficient combination of sampling area and sensor node placement under the Directed Diffusion (DD) routing protocol. The results show that the combination of a 500 m x 500 m sampling area and 20 nodes achieves the energy efficiency needed to support a forest fire prevention system at Tesso Nilo National Park.
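The four-stage evaluation amounts to a parameter sweep over area sizes and node counts. The sketch below enumerates the same scenario grid; the energy function is a toy stand-in placeholder, not the paper's actual Directed Diffusion simulation.

```python
from itertools import product

# Candidate sampling areas (metres per side) and sensor counts from the scenarios.
areas = range(500, 1001, 100)   # 500 m to 1000 m in 100 m steps
node_counts = (20, 30, 40)

def estimated_energy(area_m: int, nodes: int) -> float:
    """Toy proxy only: assume energy cost grows with area (longer hops)
    and with the number of active nodes."""
    return nodes * (area_m / 500.0) ** 2

# Evaluate every (area, nodes) combination and keep the most efficient one.
best = min(product(areas, node_counts), key=lambda s: estimated_energy(*s))
print(best)
```

With this monotone proxy the sweep selects (500, 20), which happens to match the combination the study reports as most efficient; a real simulation could of course rank scenarios differently.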
Knirsch, Charles; Alemayehu, Demissie; Botgros, Radu; Comic-Savic, Sabrina; Friedland, David; Holland, Thomas L; Merchant, Kunal; Noel, Gary J; Pelfrene, Eric; Reith, Christina; Santiago, Jonas; Tiernan, Rosemary; Tenearts, Pamela; Goldsack, Jennifer C; Fowler, Vance G
2016-08-15
Hospital-acquired and ventilator-associated bacterial pneumonia (HABP/VABP) are often caused by multidrug-resistant pathogens. The evaluation of new antibacterial drugs for efficacy in this population is important, as many antibacterial drugs have demonstrated limitations when studied in this population. HABP/VABP trials are expensive and challenging to conduct due to protocol complexity and low patient enrollment, among other factors. The Clinical Trials Transformation Initiative (CTTI) seeks to advance antibacterial drug development by streamlining HABP/VABP clinical trials to improve efficiency and feasibility while maintaining ethical rigor, patient safety, information value, and scientific validity. In 2013, CTTI engaged a multidisciplinary group of experts to discuss challenges impeding the conduct of HABP/VABP trials. Separate workstreams identified challenges associated with HABP/VABP protocol complexity. The Project Team developed potential solutions to streamline HABP/VABP trials using a Quality by Design approach. CTTI recommendations focus on 4 key areas to improve HABP/VABP trials: informed consent processes/practices, protocol design, choice of an institutional review board (IRB), and trial outcomes. Informed consent processes should include legally authorized representatives. Protocol design decisions should focus on eligibility criteria, prestudy antibacterial therapy considerations, use of new diagnostics, and sample size. CTTI recommends that sponsors use a central IRB and discuss trial endpoints with regulators, including defining a clinical failure and evaluating the impact of concomitant antibacterial drugs. Streamlining HABP/VABP trials by addressing key protocol elements can improve trial startup and patient recruitment/retention, reduce trial complexity and costs, and ensure patient safety while advancing antibacterial drug development. © The Author 2016. 
Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.
E-novo: an automated workflow for efficient structure-based lead optimization.
Pearce, Bradley C; Langley, David R; Kang, Jia; Huang, Hongwei; Kulkarni, Amit
2009-07-01
An automated E-Novo protocol designed as a structure-based lead optimization tool was prepared through Pipeline Pilot with existing CHARMm components in Discovery Studio. A scaffold core having 3D binding coordinates of interest is generated from a ligand-bound protein structural model. Ligands of interest are generated from the scaffold using an R-group fragmentation/enumeration tool within E-Novo, with their cores aligned. The ligand side chains are conformationally sampled and are subjected to core-constrained protein docking, using a modified CHARMm-based CDOCKER method to generate top poses along with CDOCKER energies. In the final stage of E-Novo, a physics-based binding energy scoring function ranks the top ligand CDOCKER poses using a more accurate Molecular Mechanics-Generalized Born with Surface Area method. Correlation of the calculated ligand binding energies with experimental binding affinities were used to validate protocol performance. Inhibitors of Src tyrosine kinase, CDK2 kinase, beta-secretase, factor Xa, HIV protease, and thrombin were used to test the protocol using published ligand crystal structure data within reasonably defined binding sites. In-house Respiratory Syncytial Virus inhibitor data were used as a more challenging test set using a hand-built binding model. Least squares fits for all data sets suggested reasonable validation of the protocol within the context of observed ligand binding poses. The E-Novo protocol provides a convenient all-in-one structure-based design process for rapid assessment and scoring of lead optimization libraries.
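The validation step in the protocol above, correlating calculated binding energies with experimental affinities via least-squares fits, can be sketched as follows. The energies and affinities here are hypothetical illustrative values, not E-Novo results; only the fitting procedure itself is being shown.

```python
import numpy as np

# Hypothetical MM-GBSA binding energies (kcal/mol) and experimental affinities
# (pKd) for a small ligand series; more negative energy ~ tighter binding.
predicted = np.array([-45.2, -52.1, -38.7, -60.3, -48.9])
experimental_pkd = np.array([6.1, 7.0, 5.2, 8.1, 6.5])

# Least-squares line and correlation, as used to validate protocol performance.
slope, intercept = np.polyfit(predicted, experimental_pkd, 1)
r = np.corrcoef(predicted, experimental_pkd)[0, 1]
print(f"slope={slope:.3f}, intercept={intercept:.3f}, r^2={r**2:.3f}")
```

A negative slope is expected here because lower (more negative) computed binding energies should correspond to higher experimental affinities; the r² value summarizes how well the scoring function ranks the series.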
A proposed group management scheme for XTP multicast
NASA Technical Reports Server (NTRS)
Dempsey, Bert J.; Weaver, Alfred C.
1990-01-01
The purpose of a group management scheme is to enable its associated transfer layer protocol to be responsive to user determined reliability requirements for multicasting. Group management (GM) must assist the client process in coordinating multicast group membership, allow the user to express the subset of the multicast group that a particular multicast distribution must reach in order to be successful (reliable), and provide the transfer layer protocol with the group membership information necessary to guarantee delivery to this subset. GM provides services and mechanisms that respond to the need of the client process or process level management protocols to coordinate, modify, and determine attributes of the multicast group, especially membership. XTP GM provides a link between process groups and their multicast groups by maintaining a group membership database that identifies members in a name space understood by the underlying transfer layer protocol. Other attributes of the multicast group useful to both the client process and the data transfer protocol may be stored in the database. Examples include the relative dispersion, most recent update, and default delivery parameters of a group.
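The membership database described above can be sketched as a small data structure. This is a minimal illustrative sketch of the idea, with invented names; it is not the XTP GM specification: each group records its members in a transfer-layer name space, the user-designated subset that a distribution must reach to count as reliable, and other group attributes.

```python
from dataclasses import dataclass, field

@dataclass
class MulticastGroup:
    name: str
    members: set[str] = field(default_factory=set)          # transfer-layer addresses
    required_subset: set[str] = field(default_factory=set)  # must-reach members
    attributes: dict = field(default_factory=dict)          # e.g. default delivery parameters

    def delivery_successful(self, reached: set[str]) -> bool:
        """Reliable per the user's definition: the required subset was reached."""
        return self.required_subset <= reached

g = MulticastGroup("sensors", members={"a", "b", "c"}, required_subset={"a", "b"})
print(g.delivery_successful({"a", "b"}))   # required subset reached
print(g.delivery_successful({"a", "c"}))   # member b was not reached
```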
Adamowicz, Michael S.; Stasulli, Dominique M.; Sobestanovich, Emily M.; Bille, Todd W.
2014-01-01
Samples for forensic DNA analysis are often collected from a wide variety of objects using cotton or nylon tipped swabs. Testing has shown that significant quantities of DNA are retained on the swab, however, and subsequently lost. When processing evidentiary samples, the recovery of the maximum amount of available DNA is critical, potentially dictating whether a usable profile can be derived from a piece of evidence or not. The QIAamp DNA Investigator extraction kit was used with its recommended protocol for swabs (one hour incubation at 56°C) as a baseline. Results indicate that over 50% of the recoverable DNA may be retained on the cotton swab tip, or otherwise lost, for both blood and buccal cell samples when using this protocol. The protocol’s incubation time and temperature were altered, as was incubating while shaking or stationary to test for increases in recovery efficiency. An additional step was then tested that included periodic re-suspension of the swab tip in the extraction buffer during incubation. Aliquots of liquid blood or a buccal cell suspension were deposited and dried on cotton swabs and compared with swab-less controls. The concentration of DNA in each extract was quantified and STR analysis was performed to assess the quality of the extracted DNA. Stationary incubations and those performed at 65°C did not result in significant gains in DNA yield. Samples incubated for 24 hours yielded less DNA. Increased yields were observed with three and 18 hour incubation periods. Increases in DNA yields were also observed using a swab re-suspension method for both cell types. The swab re-suspension method yielded an average two-fold increase in recovered DNA yield with buccal cells and an average three-fold increase with blood cells. These findings demonstrate that more of the DNA collected on swabs can be recovered with specific protocol alterations. PMID:25549111
Snyder-Mackler, Noah; Majoros, William H.; Yuan, Michael L.; Shaver, Amanda O.; Gordon, Jacob B.; Kopp, Gisela H.; Schlebusch, Stephen A.; Wall, Jeffrey D.; Alberts, Susan C.; Mukherjee, Sayan; Zhou, Xiang; Tung, Jenny
2016-01-01
Research on the genetics of natural populations was revolutionized in the 1990s by methods for genotyping noninvasively collected samples. However, these methods have remained largely unchanged for the past 20 years and lag far behind the genomics era. To close this gap, here we report an optimized laboratory protocol for genome-wide capture of endogenous DNA from noninvasively collected samples, coupled with a novel computational approach to reconstruct pedigree links from the resulting low-coverage data. We validated both methods using fecal samples from 62 wild baboons, including 48 from an independently constructed extended pedigree. We enriched fecal-derived DNA samples up to 40-fold for endogenous baboon DNA and reconstructed near-perfect pedigree relationships even with extremely low-coverage sequencing. We anticipate that these methods will be broadly applicable to the many research systems for which only noninvasive samples are available. The lab protocol and software (“WHODAD”) are freely available at www.tung-lab.org/protocols-and-software.html and www.xzlab.org/software.html, respectively. PMID:27098910
An objective protocol for comparing the noise performance of silver halide film and digital sensor
NASA Astrophysics Data System (ADS)
Cao, Frédéric; Guichard, Frédéric; Hornung, Hervé; Tessière, Régis
2012-01-01
Digital sensors have clearly taken over the photography mass market. However, some photographers with very high expectations still use silver halide film. Are they merely nostalgic users reluctant to adopt new technology, or is there more than meets the eye? The answer is not so easy if we remark that, at the end of the golden age, films were actually scanned before development. Nowadays film users have adopted digital technology and scan their film to take advantage of digital processing afterwards. It is therefore legitimate to evaluate silver halide film "with a digital eye," with the assumption that processing can be applied as for a digital camera. This article describes in detail the operations needed to treat the film as a RAW digital sensor. In particular, we have to account for the film characteristic curve, the autocorrelation of the noise (related to film grain), and the sampling of the digital sensor (related to the Bayer filter array). We also describe the protocol that was established, from shooting to scanning. We then present and interpret the results for sensor response, signal-to-noise ratio, and dynamic range.
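The signal-to-noise measurement underlying such a comparison can be sketched on simulated data. This toy example computes patch SNR as mean over standard deviation on uniform gray patches; it deliberately ignores the noise autocorrelation (grain) the article accounts for, and the noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def patch_snr(patch: np.ndarray) -> float:
    """SNR of a uniform patch: mean signal divided by noise standard deviation."""
    return float(patch.mean() / patch.std())

# Simulated uniform gray patches of the same scene at the same exposure:
# one low-noise capture and one with heavier, grain-like noise.
low_noise = 100 + rng.normal(0, 2, size=(64, 64))
grainy = 100 + rng.normal(0, 8, size=(64, 64))

print(f"low-noise SNR ~ {patch_snr(low_noise):.1f}")
print(f"grainy SNR   ~ {patch_snr(grainy):.1f}")
```

A full film-versus-sensor protocol would measure such patches across the exposure range (to trace the characteristic curve and dynamic range) rather than at a single level.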
Chandarana, Keval; Drew, Megan E; Emmanuel, Julian; Karra, Efthimia; Gelegen, Cigdem; Chan, Philip; Cron, Nicholas J; Batterham, Rachel L
2009-06-01
Gut hormones represent attractive therapeutic targets for the treatment of obesity and type 2 diabetes. However, controversy surrounds the effects that adiposity, dietary manipulations, and bariatric surgery have on their circulating concentrations. We sought to determine whether these discrepancies are due to methodologic differences. Ten normal-weight males participated in a 4-way crossover study investigating whether fasting appetite scores, plasma acyl-ghrelin, active glucagon-like peptide-1 (GLP-1), and peptide YY3-36 (PYY3-36) levels are altered by study-induced stress, prior food consumption, and sample processing. Study visit order affected anxiety, plasma cortisol, and temporal profiles of appetite and plasma PYY3-36, with increased anxiety and cortisol concentrations on the first study day. Plasma cortisol area under the curve (AUC) correlated positively with plasma PYY3-36 AUC. Despite a 14-hour fast, baseline hunger, PYY3-36 concentrations, temporal appetite profiles, PYY3-36 AUC, and active GLP-1 were affected by the previous evening's meal. Sample processing studies revealed that sample acidification and esterase inhibition are required when measuring acyl-ghrelin and dipeptidyl-peptidase IV inhibitor addition for active GLP-1. However, plasma PYY3-36 concentrations were unaffected by addition of dipeptidyl-peptidase IV. Accurate assessment of appetite, feeding behavior, and gut hormone concentrations requires standardization of prior food consumption and subject acclimatization to the study protocol. Moreover, because of the labile nature of acyl-ghrelin and active GLP-1, specialized sample processing needs to be undertaken.
Palmer, L; Farrar, A R; Valle, M; Ghahary, N; Panella, M; DeGraw, D
2000-05-01
Identification and evaluation of child sexual abuse is an integral task for clinicians. To aid these processes, it is necessary to have reliable and valid psychological measures. This is an investigation of the clinical validity and use of the House-Tree-Person (HTP) projective drawing, a widely used diagnostic tool, in the assessment of child sexual abuse. HTP drawings were collected archivally from a sample of sexually abused children (n = 47) and a nonabused comparison sample (n = 82). The two samples were broadly matched for gender, ethnicity, age, and socioeconomic status. The protocols were scored using a quantitative scoring system. The data were analyzed using a discriminant function analysis. Group membership could not be predicted based on a total HTP score.
Forensic DNA typing from teeth using demineralized root tips.
Corrêa, Heitor Simões Dutra; Pedro, Fabio Luis Miranda; Volpato, Luiz Evaristo Ricci; Pereira, Thiago Machado; Siebert Filho, Gilberto; Borges, Álvaro Henrique
2017-11-01
Teeth are widely used samples in forensic human genetic identification due to their persistence and practical sampling and processing. Their processing, however, has changed very little in the last 20 years, usually including powdering or pulverization of the tooth. The objective of this study was to present demineralized root tips as DNA sources while, at the same time, not involving powdering the samples or expensive equipment for teeth processing. One to five teeth from each of 20 unidentified human bodies recovered from midwest Brazil were analyzed. Whole teeth were demineralized in EDTA solution with daily solution change. After a maximum of approximately seven days, the final millimeters of the root tip was excised. This portion of the sample was used for DNA extraction through a conventional organic protocol. DNA quantification and STR amplification were performed using commercial kits followed by capillary electrophoresis on 3130 or 3500 genetic analyzers. For 60% of the unidentified bodies (12 of 20), a full genetic profile was obtained from the extraction of the first root tip. By the end of the analyses, full genetic profiles were obtained for 85% of the individuals studied, of which 80% were positively identified. This alternative low-tech approach for postmortem teeth processing is capable of extracting DNA in sufficient quantity and quality for forensic casework, showing that root tips are viable nuclear DNA sources even after demineralization. Copyright © 2017 Elsevier B.V. All rights reserved.
A Taxonomy of Attacks on the DNP3 Protocol
NASA Astrophysics Data System (ADS)
East, Samuel; Butts, Jonathan; Papa, Mauricio; Shenoi, Sujeet
Distributed Network Protocol (DNP3) is the predominant SCADA protocol in the energy sector - more than 75% of North American electric utilities currently use DNP3 for industrial control applications. This paper presents a taxonomy of attacks on the protocol. The attacks are classified based on targets (control center, outstation devices and network/communication paths) and threat categories (interception, interruption, modification and fabrication). To facilitate risk analysis and mitigation strategies, the attacks are associated with the specific DNP3 protocol layers they exploit. Also, the operational impact of the attacks is categorized in terms of three key SCADA objectives: process confidentiality, process awareness and process control. The attack taxonomy clarifies the nature and scope of the threats to DNP3 systems, and can provide insights into the relative costs and benefits of implementing mitigation strategies.
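The taxonomy's classification axes (target, threat category, exploited layer, and operational impact) can be encoded directly as a data structure. This is an illustrative sketch; the example attack entry is hypothetical and is not drawn from the paper's catalogue.

```python
from dataclasses import dataclass

# The taxonomy's classification axes, as described in the abstract.
TARGETS = {"control center", "outstation", "network path"}
THREATS = {"interception", "interruption", "modification", "fabrication"}
IMPACTS = {"process confidentiality", "process awareness", "process control"}

@dataclass(frozen=True)
class Dnp3Attack:
    name: str
    target: str
    threat: str
    layer: str    # DNP3 protocol layer exploited (e.g. data link, transport, application)
    impact: str

    def __post_init__(self):
        # Reject entries that fall outside the taxonomy's axes.
        assert self.target in TARGETS and self.threat in THREATS and self.impact in IMPACTS

# Hypothetical entry for illustration only.
attack = Dnp3Attack("spoofed response", "control center", "fabrication",
                    "application", "process awareness")
print(attack.threat)
```

Classifying each catalogued attack this way supports the risk-analysis use the paper describes, e.g. filtering all fabrication attacks that affect process control.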
Reliable communication in the presence of failures
NASA Technical Reports Server (NTRS)
Birman, Kenneth P.; Joseph, Thomas A.
1987-01-01
The design and correctness of a communication facility for a distributed computer system are reported on. The facility provides support for fault-tolerant process groups in the form of a family of reliable multicast protocols that can be used in both local- and wide-area networks. These protocols attain high levels of concurrency while respecting application-specific delivery ordering constraints, and have varying cost and performance depending on the degree of ordering desired. In particular, a protocol that enforces causal delivery orderings is introduced and shown to be a valuable alternative to conventional asynchronous communication protocols. The facility also ensures that the processes belonging to a fault-tolerant process group will observe consistent orderings of events affecting the group as a whole, including process failures, recoveries, migration, and dynamic changes to group properties like member rankings. A review of several uses of the protocols in the ISIS system, which supports fault-tolerant resilient objects and bulletin boards, illustrates the significant simplification of higher-level algorithms made possible by our approach.
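The causal-delivery idea highlighted above can be sketched with vector clocks, a standard mechanism for this purpose; the paper's actual protocols are not reproduced here. A received message is deliverable only when it is the next expected message from its sender and every message that causally precedes it has already been delivered locally.

```python
def can_deliver(msg_vc: dict, sender: str, local_vc: dict) -> bool:
    """msg_vc: the vector clock the sender stamped on the message.
    Deliverable when (1) it is the next message from that sender, and
    (2) the sender had seen nothing from other processes that we have
    not yet delivered ourselves."""
    if msg_vc[sender] != local_vc.get(sender, 0) + 1:
        return False
    return all(msg_vc.get(p, 0) <= local_vc.get(p, 0)
               for p in msg_vc if p != sender)

local = {"A": 1, "B": 0}
print(can_deliver({"A": 2, "B": 0}, "A", local))           # next message from A
print(can_deliver({"A": 1, "B": 1, "C": 1}, "B", local))   # depends on an undelivered message from C
```

Messages failing the check are buffered and retried as earlier messages arrive, which yields the causal ordering guarantee without global synchronization.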
Preventing disease transmission by deceased tissue donors by testing blood for viral nucleic acid.
Strong, D Michael; Nelson, Karen; Pierce, Marge; Stramer, Susan L
2005-01-01
Nucleic acid testing (NAT) has reduced the risk of transmitting infectious disease through blood transfusion. Currently NAT for HIV-1 and HCV are FDA licensed and performed by nearly all blood collection facilities, but HBV NAT is performed under an investigational study protocol. Residual risk estimates indicate that NAT could potentially reduce disease transmission through transplanted tissue. However, tissue donor samples obtained post-mortem have the potential to produce an invalid NAT result due to inhibition of amplification reactions by hemolysis and other factors. The studies reported here summarize the development of protocols to allow NAT of deceased donor samples with reduced rates of invalid results. Using these protocols, inventories from two tissue centers were tested with greater than 99% of samples producing a valid test result.
Geochemical and mineralogical data for soils of the conterminous United States
Smith, David B.; Cannon, William F.; Woodruff, Laurel G.; Solano, Federico; Kilburn, James E.; Fey, David L.
2013-01-01
In 2007, the U.S. Geological Survey initiated a low-density (1 site per 1,600 square kilometers, 4,857 sites) geochemical and mineralogical survey of soils of the conterminous United States as part of the North American Soil Geochemical Landscapes Project. Sampling and analytical protocols were developed at a workshop in 2003, and pilot studies were conducted from 2004 to 2007 to test and refine these recommended protocols. The final sampling protocol for the national-scale survey included, at each site, a sample from a depth of 0 to 5 centimeters, a composite of the soil A horizon, and a deeper sample from the soil C horizon or, if the top of the C horizon was at a depth greater than 1 meter, from a depth of approximately 80–100 centimeters. The <2-millimeter fraction of each sample was analyzed for a suite of 45 major and trace elements by methods that yield the total or near-total elemental content. The major mineralogical components in the samples from the soil A and C horizons were determined by a quantitative X-ray diffraction method using Rietveld refinement. Sampling in the conterminous United States was completed in 2010, with chemical and mineralogical analyses completed in May 2013. The resulting dataset provides an estimate of the abundance and spatial distribution of chemical elements and minerals in soils of the conterminous United States and represents a baseline for soil geochemistry and mineralogy against which future changes may be recognized and quantified. This report (1) describes the sampling, sample preparation, and analytical methods used; (2) gives details of the quality control protocols used to monitor the quality of chemical and mineralogical analyses over approximately six years; and (3) makes available the soil geochemical and mineralogical data in downloadable tables.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kafka, Kyle R. P.; Hoffman, Brittany N.; Papernov, Semyon
The laser-induced damage threshold of fused-silica samples processed via magnetorheological finishing is investigated for polishing compounds depending on the type of abrasive material and the post-polishing surface roughness. The effectiveness of laser conditioning is examined using a ramped pre-exposure with the same 351-nm, 3-ns Gaussian pulses. Lastly, we examine chemical etching of the surface and correlate the resulting damage threshold to the etching protocol. A combination of etching and laser conditioning is found to improve the damage threshold by a factor of ~3, while maintaining <1-nm surface roughness.
Pre-Mission Input Requirements to Enable Successful Sample Collection by A Remote Field/EVA Team
NASA Technical Reports Server (NTRS)
Cohen, B. A.; Lim, D. S. S.; Young, K. E.; Brunner, A.; Elphic, R. E.; Horne, A.; Kerrigan, M. C.; Osinski, G. R.; Skok, J. R.; Squyres, S. W.;
2016-01-01
The FINESSE (Field Investigations to Enable Solar System Science and Exploration) team, part of the Solar System Exploration Research Virtual Institute (SSERVI), is a field-based research program aimed at generating strategic knowledge in preparation for human and robotic exploration of the Moon, near-Earth asteroids, Phobos and Deimos, and beyond. In contrast to other technology-driven NASA analog studies, the FINESSE WCIS activity is science-focused and, moreover, sampling-focused, with the explicit intent to return the best samples for geochronology studies in the laboratory. We used the FINESSE field excursion to the West Clearwater Lake Impact structure (WCIS) as an opportunity to test factors related to sampling decisions. We examined the in situ sample characterization and real-time decision-making process of the astronauts, with a guiding hypothesis that pre-mission training including detailed background information on the analytical fate of a sample would better enable future astronauts to select samples that best meet science requirements. We conducted three tests of this hypothesis over several days in the field. Our investigation was designed to document processes, tools, and procedures for crew sampling of planetary targets. This was not meant to be a blind, controlled test of crew efficacy, but rather an effort to explicitly recognize the relevant variables that enter into sampling protocols and to develop recommendations for crew and backroom training in future endeavors.
Anti-malarial drug quality in Lagos and Accra - a comparison of various quality assessments
2010-01-01
Background Two major cities in West Africa, Accra, the capital of Ghana, and Lagos, the largest city of Nigeria, have significant problems with substandard pharmaceuticals. Both have actively combated the problem in recent years, particularly by screening products on the market using the Global Pharma Health Fund e.V. Minilab® protocol. Random sampling of medicines from the two cities at least twice over the past 30 months allows a tentative assessment of whether improvements in drug quality have occurred. Since intelligence provided by investigators indicates that some counterfeit producers may be adapting products to pass Minilab tests, the results are compared with those from a Raman spectrometer and discrepancies are discussed. Methods Between mid-2007 and early-2010, samples of anti-malarial drugs were bought covertly from pharmacies in Lagos on three different occasions (October 2007, December 2008, February 2010), and from pharmacies in Accra on two different occasions (October 2007, February 2010). All samples were tested using the Minilab® protocol, which includes disintegration and active ingredient assays as well as visual inspection, and most samples were also tested by Raman spectrometry. Results In Lagos, the failure rate in the 2010 sampling fell to 29% of the 2007 finding using the Minilab® protocol, 53% using Raman spectrometry, and 46% using visual inspection. In Accra, the failure rate in the 2010 sampling fell to 54% of the 2007 finding using the Minilab® protocol, 72% using Raman spectrometry, and 90% using visual inspection. Conclusions The evidence presented shows that drug quality is probably improving in both cities, especially Lagos, since major reductions of failure rates over time occur with all means of assessment. Many more samples failed when examined by Raman spectrometry than by Minilab® protocol. 
The discrepancy is most likely caused by the two techniques measuring different aspects of the medication; hence, it may simply reflect natural variation between the techniques. Other explanations are possible and are discussed. PMID:20537190
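The "fell to X% of the 2007 finding" statistic above is a ratio of failure rates between sampling rounds. A minimal sketch of that arithmetic, using hypothetical counts for illustration only (the numbers below are not taken from the study):

```python
def failure_rate(failed, sampled):
    """Fraction of sampled medicines that failed quality testing."""
    return failed / sampled

# Hypothetical counts for illustration only (not the study's data)
rate_2007 = failure_rate(35, 100)  # 35% failure rate in 2007
rate_2010 = failure_rate(10, 100)  # 10% failure rate in 2010

# The 2010 rate expressed as a fraction of the 2007 finding
ratio = rate_2010 / rate_2007
print(round(ratio, 2))  # → 0.29, i.e. "fell to 29% of the 2007 finding"
```

Note that such a ratio compares rates, not absolute counts, so it remains meaningful even when the number of samples purchased differs between rounds.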
HEALTH-SCREENING PROTOCOLS FOR VINACEOUS AMAZONS (AMAZONA VINACEA) IN A REINTRODUCTION PROJECT.
Saidenberg, André B S; Zuniga, Eveline; Melville, Priscilla A; Salaberry, Sandra; Benites, Nilson R
2015-12-01
Reintroduction is a growing field in the conservation of endangered species. The vinaceous Amazon parrot (Amazona vinacea) is extinct in several areas, and a project to release confiscated individuals to their former range is currently underway. The objective of this study was to evaluate and improve the selection and treatment of individual release candidates by detecting possible pathogen carriers using samples taken before and during release. As part of prerelease health protocols, samples were obtained from 29 parrots on three different occasions while in captivity and once after their release. Samples were screened for paramyxovirus type 1, avian influenza, poxvirus, coronavirus, psittacine herpesvirus 1, Chlamydia psittaci, enteropathogenic Escherichia coli (EPEC), Salmonella spp., and endoparasites. The majority of samples returned negative results, with the exception of two individuals that tested positive for C. psittaci in the first sampling and for Ascaridia spp. in the second pooled sampling. Treatments for C. psittaci and endoparasites were administered prior to release, and negative results were obtained in subsequent exams. The number of positive results for E. coli (non-EPEC) decreased during the rehabilitation period. Adequate quarantine procedures and health examinations greatly minimize disease risks. The protocols employed in this study resulted in acceptable health status in accordance with current environmental legislation in Brazil. Additionally, the protocols allowed informed decisions about release candidates, minimized risks, and favored the selection of healthy individuals, thereby contributing to the recovery of this species. It is important to determine appropriate minimum health-screening protocols when advanced diagnostics may not be available or high costs make the tests prohibitive in countries where confiscations occur.
We hypothesize that a minimum panel of tests of pooled samples can serve as an alternative approach that minimizes costs and overall workload and supports projects intended to restore and promote flagship species and hamper their illegal trade.
Human immunodeficiency virus bDNA assay for pediatric cases.
Avila, M M; Liberatore, D; Martínez Peralta, L; Biglione, M; Libonatti, O; Coll Cárdenas, P; Hodara, V L
2000-01-01
Techniques to quantify plasma HIV-1 RNA viral load (VL) are commercially available, and they are adequate for monitoring adults infected by HIV and treated with antiretroviral drugs. Little experience with HIV VL has been reported in pediatric cases. In Argentina, the evaluation of several assays for VL in pediatrics is now being considered. To evaluate the pediatric protocol for the bDNA assay in HIV-infected children, 25 samples from HIV-infected children (according to CDC criteria for pediatric AIDS) were analyzed using the Quantiplex HIV RNA 2.0 Assay (Chiron Corporation), following the manufacturer's recommendations in a protocol that uses 50 microliters of patient's plasma (sensitivity: 10,000 copies/ml). When HIV-RNA was not detected, samples were run with the 1 ml standard bDNA protocol (sensitivity: 500 HIV-RNA c/ml). Nine samples belonged to infants under 12 months of age (group A) and 16 to children over 12 months (group B). All infants under one year of age had high HIV-RNA copies in plasma. VL ranged from 30,800 to 2,560,000 RNA copies/ml (median = 362,000 c/ml) for group A and < 10,000 to 554,600 c/ml (median = < 10,000) for group B. Only 25% of children in group B had detectable HIV-RNA. Using the standard test of quantification, no patient had undetectable HIV-RNA; values for group B ranged between 950 and 226,200 c/ml (median = 23,300 RNA c/ml). The suggested pediatric protocol could be useful in children under 12 months of age, but the 1 ml standard protocol must be used for older children. Samples with undetectable results from children under one year of age should be retested using the standard protocol.
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
Determination of the magnetocaloric entropy change by field sweep using a heat flux setup
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monteiro, J. C. B., E-mail: jolmiui@gmail.com; Reis, R. D. dos; Mansanares, A. M.
2014-08-18
We report on a simple setup using a heat flux sensor adapted to a Quantum Design Physical Property Measurement System to determine the magnetocaloric entropy change (ΔS). The major differences from existing setups are the simplicity of this assembly and the ease of obtaining the isothermal entropy change by either a field sweep or a temperature sweep process. We discuss the use of these two processes applied to Gd and Gd₅Ge₂Si₂ samples. The results are compared to the temperature sweep measurements and show the advantages of this setup and of the field sweep procedure. We found a significant reduction of ΔS and of the refrigerating cooling power (RCP) at low field changes in a field sweep process when the sample is not driven to the same initial state for each temperature. We show that the field sweep process, without any special measuring protocol, is the only correct way to experimentally determine ΔS and RCP for a practical regenerative refrigerator.
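The setup above extracts ΔS from a heat flux measurement; the conventional point of comparison is the Maxwell-relation route, ΔS(T) = ∫₀^Hmax (∂M/∂T)_H dH, evaluated from a grid of magnetization data. A minimal numerical sketch of that conventional calculation (an illustration under this standard assumption, not the authors' heat-flux method):

```python
import numpy as np

def entropy_change(T, H, M):
    """Isothermal entropy change ΔS(T) from magnetization data M[i, j]
    measured at temperatures T[i] and fields H[j], via the Maxwell
    relation ΔS = ∫ (∂M/∂T)_H dH: finite differences along the
    temperature axis, trapezoidal integration along the field axis."""
    dMdT = np.gradient(M, T, axis=0)          # (∂M/∂T)_H on the grid
    mid = 0.5 * (dMdT[:, 1:] + dMdT[:, :-1])  # trapezoid midpoints
    return (mid * np.diff(H)).sum(axis=1)     # integrate over field

# Toy check: for M = -T*H, (∂M/∂T)_H = -H, so ΔS = -Hmax²/2
T = np.linspace(290.0, 310.0, 21)
H = np.linspace(0.0, 1.0, 101)
M = -np.outer(T, H)
dS = entropy_change(T, H, M)   # ≈ -0.5 at every temperature
```

The paper's point is precisely that this indirect route (and temperature-sweep protocols) can misstate ΔS when the sample's initial state is not controlled, which the direct field-sweep heat-flux measurement avoids.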
HSRP and HSRP Partner Analytical Methods and Protocols
HSRP has worked with various partners to develop and test analytical methods and protocols for use by laboratories charged with analyzing environmental and/or building material samples following a contamination incident.
Universal Linear Optics: An implementation of Boson Sampling on a Fully Reconfigurable Circuit
NASA Astrophysics Data System (ADS)
Harrold, Christopher; Carolan, Jacques; Sparrow, Chris; Russell, Nicholas J.; Silverstone, Joshua W.; Marshall, Graham D.; Thompson, Mark G.; Matthews, Jonathan C. F.; O'Brien, Jeremy L.; Laing, Anthony; Martín-López, Enrique; Shadbolt, Peter J.; Matsuda, Nobuyuki; Oguma, Manabu; Itoh, Mikitaka; Hashimoto, Toshikazu
Linear optics has paved the way for fundamental tests of quantum mechanics and has gone on to enable a broad range of quantum information processing applications for quantum technologies. We demonstrate an integrated photonics processor that is universal for linear optics. The device is a silica-on-silicon planar lightwave circuit (PLC) comprising a cascade of 15 Mach-Zehnder interferometers, with 30 directional couplers and 30 tunable thermo-optic phase shifters that are electrically interfaced for the arbitrary setting of a phase. We input ensembles of up to six photons and monitor the output with a 12-single-photon-detector system. The calibrated device is capable of implementing any linear optical protocol, enabling the implementation of new quantum information processing tasks in seconds where previous approaches took months to realise. We demonstrate 100 instances of the boson sampling problem with verification tests, as well as six-dimensional complex Hadamard matrices.
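The boson sampling instances demonstrated above derive their presumed classical hardness from matrix permanents: for a collision-free event, the probability of an output photon pattern is |Perm(A)|², where A is the submatrix of the interferometer unitary selected by the input and output modes. A minimal sketch of that calculation using Ryser's formula (an illustrative implementation, not the authors' verification code):

```python
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula, O(2^n * n^2) time.
    This exponential cost is the basis of boson sampling's
    presumed classical hardness."""
    n = len(a)
    total = 0.0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

def output_probability(submatrix):
    """Probability of a collision-free boson sampling outcome:
    |Perm(A)|^2 for the selected submatrix A of the unitary."""
    return abs(permanent(submatrix)) ** 2
```

Ryser's formula works for complex entries as well, so it applies directly to submatrices of the (complex) interferometer unitary; verification tests in such experiments compare sampled frequencies against these computed probabilities for small photon numbers.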
DOE Office of Scientific and Technical Information (OSTI.GOV)
Letant, S E; Kane, S R; Murphy, G A
2008-05-30
This note presents a comparison of Most-Probable-Number Rapid Viability (MPN-RV) PCR and traditional culture methods for the quantification of Bacillus anthracis Sterne spores in macrofoam swabs generated by the Centers for Disease Control and Prevention (CDC) for a multi-center validation study aimed at testing environmental swab processing methods for recovery, detection, and quantification of viable B. anthracis spores from surfaces. Results show that spore numbers provided by the MPN-RV PCR method were in statistical agreement with the CDC conventional culture method for all three levels of spores tested (10⁴, 10², and 10 spores), even in the presence of dirt. In addition to detecting low levels of spores in environmental conditions, the MPN-RV PCR method is specific and compatible with automated high-throughput sample processing and analysis protocols.
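The Most-Probable-Number estimate underlying MPN-RV PCR is a maximum-likelihood fit to the pattern of positive and negative reactions across a dilution series, assuming organisms are Poisson-distributed among aliquots. A minimal sketch of that estimator (the function and the bisection solver are illustrative assumptions, not the validated CDC/laboratory procedure):

```python
import math

def mpn(volumes, tubes, positives, lo=1e-6, hi=1e6):
    """Most-probable-number estimate (organisms per unit volume).

    For each dilution i, `tubes[i]` aliquots of volume `volumes[i]`
    are tested and `positives[i]` come up positive. Under a Poisson
    model, P(positive) = 1 - exp(-lam * v); the MLE for lam is the
    root of the score function, found here by geometric bisection.
    (All-negative or all-positive series drive the estimate to the
    bracket edges; real protocols handle those cases via tables.)"""
    def score(lam):
        s = 0.0
        for v, n, p in zip(volumes, tubes, positives):
            q = math.exp(-lam * v)
            s += p * v * q / (1.0 - q) - (n - p) * v
        return s

    for _ in range(200):                 # score decreases in lam
        mid = math.sqrt(lo * hi)         # bisect on a log scale
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)
```

As a sanity check: a single dilution of unit volume with 5 of 10 tubes positive implies 1 − e^(−λ) = 0.5, so the estimate is ln 2 ≈ 0.69 organisms per unit volume.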
Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy
2017-08-15
The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein through enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceuticals are usually performed at the intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide-, subunit- and glycan-level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction, as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by the Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts develop sample preparation methods in the field of therapeutic protein analysis.
Lever, Mark A.; Torti, Andrea; Eickenbusch, Philip; Michaud, Alexander B.; Šantl-Temkiv, Tina; Jørgensen, Bo Barker
2015-01-01
A method for the extraction of nucleic acids from a wide range of environmental samples was developed. This method consists of several modules, which can be individually modified to maximize yields in extractions of DNA and RNA or separations of DNA pools. Modules were designed based on elaborate tests, in which permutations of all nucleic acid extraction steps were compared. The final modular protocol is suitable for extractions from igneous rock, air, water, and sediments. Sediments range from high-biomass, organic rich coastal samples to samples from the most oligotrophic region of the world's oceans and the deepest borehole ever studied by scientific ocean drilling. Extraction yields of DNA and RNA are higher than with widely used commercial kits, indicating an advantage to optimizing extraction procedures to match specific sample characteristics. The ability to separate soluble extracellular DNA pools without cell lysis from intracellular and particle-complexed DNA pools may enable new insights into the cycling and preservation of DNA in environmental samples in the future. A general protocol is outlined, along with recommendations for optimizing this general protocol for specific sample types and research goals. PMID:26042110
Schneider, Dominik; Wemheuer, Franziska; Pfeiffer, Birgit; Wemheuer, Bernd
2017-01-01
Microbial communities play an important role in marine ecosystem processes. Although the number of studies targeting marker genes such as the 16S rRNA gene has increased in recent years, the vast majority of marine diversity remains unexplored. Moreover, most studies have focused on the entire bacterial community and thus disregarded the active microbial community players. Here, we describe a detailed protocol for the simultaneous extraction of DNA and RNA from marine water samples and for the generation of cDNA from the isolated RNA, which can be used as a universal template in various marker gene studies.