Science.gov

Sample records for assurance sampling method

  1. Chesapeake Bay coordinated split sample program annual report, 1990-1991: Analytical methods and quality assurance workgroup of the Chesapeake Bay program monitoring subcommittee

    SciTech Connect

    Not Available

    1991-01-01

The Chesapeake Bay Program is a federal-state partnership with a goal of restoring the Chesapeake Bay. Its ambient water quality monitoring programs, started in 1984, sample over 150 monitoring stations once or twice a month. Due to the size of the Bay watershed (64,000 square miles) and the cooperative nature of the CBP, these monitoring programs involve 10 different analytical laboratories. The Chesapeake Bay Coordinated Split Sample Program (CSSP), initiated in 1988, assesses the comparability of the water quality results from these laboratories. The report summarizes CSSP results for 1990 and 1991, its second and third full years of operation. The CSSP has two main objectives: identifying parameters with low inter-organization agreement, and estimating measurement system variability. The identification of parameters with low agreement is used as part of the overall Quality Assurance program. Laboratory and program personnel use the information to investigate possible causes of the differences and take action to increase agreement if possible. Later CSSP results will document any improvements in inter-organization agreement. The variability estimates are most useful to data analysts and modelers who need confidence estimates for monitoring data.

  2. Measurement assurance program for LSC analyses of tritium samples

    SciTech Connect

    Levi, G.D. Jr.; Clark, J.P.

    1997-05-01

Liquid Scintillation Counting (LSC) for tritium is done on 600 to 800 samples daily as part of a contamination control program at the Savannah River Site's Tritium Facilities. The tritium results from the LSCs are used: to release items as radiologically clean; to establish radiological control measures for workers; and to characterize waste. The following sample matrices are analyzed for tritium: filter paper smears, aqueous, oil, oily rags, ethylene glycol, ethyl alcohol, freon and mercury. Routine and special causes of variation in standards, counting equipment, environment, operators, counting times, samples, activity levels, etc. produce uncertainty in the LSC measurements. A comprehensive analytical process measurement assurance program such as JTIPMAP™ has been implemented. The process measurement assurance program is being used to quantify and control many of the sources of variation and provide accurate estimates of the overall measurement uncertainty associated with the LSC measurements. The paper will describe LSC operations, process improvements, quality control and quality assurance programs along with future improvements associated with the implementation of the process measurement assurance program.

  3. Sampling quality assurance guidance in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

This document introduces quality assurance guidance pertaining to the design and implementation of sampling procedures and processes for collecting environmental data for DOE's Office of EM (Environmental Restoration and Waste Management).

  4. Extending cluster lot quality assurance sampling designs for surveillance programs.

    PubMed

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
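The two ingredients summarized above, a binomial classification rule and a design-effect inflation of the sample size for clustering, can be illustrated with a small Python sketch. The parameter values and the exhaustive search are illustrative only; the authors' actual procedure is nonparametric and accommodates finite numbers of clusters.

```python
import math

def binom_tail(n, d, p):
    # P(X <= d) for X ~ Binomial(n, p)
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d + 1))

def lqas_design(p_accept, p_reject, alpha=0.10, beta=0.10):
    """Smallest (n, d) such that an acceptable area (prevalence p_accept)
    is wrongly rejected with probability <= alpha, and a poor area
    (prevalence p_reject) is wrongly accepted with probability <= beta.
    Convention: reject ('poor') when observed cases exceed d."""
    for n in range(1, 500):
        for d in range(n + 1):
            a = 1 - binom_tail(n, d, p_accept)  # false-rejection risk
            b = binom_tail(n, d, p_reject)      # false-acceptance risk
            if a <= alpha and b <= beta:
                return n, d
    return None

def inflate_for_clustering(n_srs, cluster_size, icc):
    # design effect for two-stage cluster sampling
    deff = 1 + (cluster_size - 1) * icc
    return math.ceil(n_srs * deff)

# e.g. classify 10% prevalence as acceptable vs 30% as poor,
# then inflate for clusters of 10 with intracluster correlation 0.05
n, d = lqas_design(p_accept=0.10, p_reject=0.30)
n_cluster = inflate_for_clustering(n, cluster_size=10, icc=0.05)
```

The design effect `1 + (m - 1) * ICC` is the standard survey-sampling inflation factor; the LQAS search simply scans (n, d) pairs until both misclassification risks are met.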

  5. Assure

    Integrated Risk Information System (IRIS)

Assure; CASRN 76578-14-8. Human health assessment information on a chemical substance is included in the IRIS database only after a comprehensive review of toxicity data, as outlined in the IRIS assessment development process. Sections I (Health Hazard Assessments for Noncarcinogenic Effects…

  6. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, to document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and to supply tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  7. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. The resin matrix formulation was correlated with mechanical properties, and dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs; the liquid chromatography method was the most sensitive to changes in resin formulation. The methods were also successfully applied to a second prepreg system.

  8. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    PubMed Central

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-01-01

    OBJECTIVE: To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. METHODS: The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. FINDINGS: Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. CONCLUSION: When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease. PMID:16283052
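The commune-level classification step can be illustrated with the operating characteristic of a generic LQAS decision rule: the probability that a commune is flagged as high-prevalence rises with the true prevalence. The sample size n=50 and threshold d=7 below are illustrative placeholders, not the values used in the Viet Nam surveys.

```python
import math

def prob_flagged(n, d, prevalence):
    """Probability a commune is classified as high-prevalence, i.e. the
    number of active-trachoma cases in a sample of n children exceeds d."""
    p_at_most_d = sum(
        math.comb(n, k) * prevalence**k * (1 - prevalence)**(n - k)
        for k in range(d + 1)
    )
    return 1 - p_at_most_d

# operating characteristic: flagging probability at increasing true prevalence
oc = [prob_flagged(50, 7, p) for p in (0.05, 0.10, 0.20, 0.30)]
```

Plotting `oc` against prevalence gives the familiar S-shaped LQAS operating-characteristic curve that survey designers use to pick n and d.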

  9. Using lot quality assurance sampling to improve immunization coverage in Bangladesh.

    PubMed Central

    Tawfik, Y.; Hoque, S.; Siddiqi, M.

    2001-01-01

    OBJECTIVE: To determine areas of low vaccination coverage in five cities in Bangladesh (Chittagong, Dhaka, Khulna, Rajshahi, and Syedpur). METHODS: Six studies using lot quality assurance sampling were conducted between 1995 and 1997 by Basic Support for Institutionalizing Child Survival and the Bangladesh National Expanded Programme on Immunization. FINDINGS: BCG vaccination coverage was acceptable in all lots studied; however, the proportion of lots rejected because coverage of measles vaccination was low ranged from 0% of lots in Syedpur to 12% in Chittagong and 20% in Dhaka's zones 7 and 8. The proportion of lots rejected because an inadequate number of children in the sample had been fully vaccinated varied from 11% in Syedpur to 30% in Dhaka. Additionally, analysis of aggregated, weighted immunization coverage showed that there was a high BCG vaccination coverage (the first administered vaccine) and a low measles vaccination coverage (the last administered vaccine) indicating a high drop-out rate, ranging from 14% in Syedpur to 36% in Dhaka's zone 8. CONCLUSION: In Bangladesh, where resources are limited, results from surveys using lot quality assurance sampling enabled managers of the National Expanded Programme on Immunization to identify areas with poor vaccination coverage. Those areas were targeted to receive focused interventions to improve coverage. Since this sampling method requires only a small sample size and was easy for staff to use, it is feasible for routine monitoring of vaccination coverage. PMID:11436470

  10. [Quality assurance in geriatric rehabilitation--approaches and methods].

    PubMed

    Deckenbach, B; Borchelt, M; Steinhagen-Thiessen, E

    1997-08-01

Quality assurance had gained significance in the field of geriatric rehabilitation even before the provisions of the 5th Book of the Social Code took effect. While the surgical specialties have already gathered several years of experience, particularly with external quality assurance, suitable concepts and methods for the new geriatric rehabilitation specialty are still in the early stages of development. Proven methods from the industrial and service sectors, such as auditing, monitoring, and quality circles, can in principle be drawn on for devising geriatric rehabilitation quality assurance schemes. These need to take into account the multiple factors influencing the course and outcome of rehabilitation entailed by multimorbidity and multi-drug use; the eminent role of the social environment; therapeutic interventions by a multidisciplinary team; and the multi-dimensional nature of rehabilitation outcomes. Moreover, the specific conditions of geriatric rehabilitation require the development not only of quality standards unique to this domain but also of quality assurance procedures specific to geriatrics. Along with a number of other methods, standardized geriatric assessment will play a crucial role in this respect. PMID:9411627

  11. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  12. Chapter 5: Quality assurance/quality control in stormwater sampling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Sampling the quality of stormwater presents unique challenges because stormwater flow is relatively short-lived with drastic variability. Furthermore, storm events often occur with little advance warning, outside conventional work hours, and under adverse weather conditions. Therefore, most stormwat...

  13. Quality assurance of analytical methods developed for analysis of environmentally significant species

    SciTech Connect

    Smith, R.E.

    1992-05-01

    A quality assurance program for trace analyses of environmentally significant species has begun. In the first stage, methods to analyze environmental samples for a variety of components have been developed and documented. Techniques include visual inspection, gravimetric analysis, ion chromatography (IC), inductively coupled plasma (ICP) emission spectrometry, ICP-mass spectrometry (ICP/MS), atomic absorption (AA) (flame, furnace, and mercury cold vapor techniques), gas chromatography (GC), potentiometry, and visible spectrophotometry. Industrial sites are analyzed for contamination by methylene dianiline (MDA). Precious metal waste sludges are analyzed for cyanide, halogens, mercury, and precious metals. Paint samples are analyzed for volatile organic compounds by GC and gravimetric analysis. Polychlorinated biphenyls (PCBs) also are determined in oil samples. In the second stage of quality assurance, methods are validated by accuracy and precision studies and by determination of detection limits and ranges. Improved methods provide additional information about key substances targeted by EPA. Furthermore, quality assurance data on IC and GC analyses are presented. IC methods simultaneously determine five anions in one run and four cations in another. Results on EPA-sponsored round robin tests indicate that IC can accurately determine the concentrations of anions and cations. Spiked samples analyzed by both GC and IC methods gave recoveries very close to 100%.
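The spike-recovery check mentioned at the end of the abstract is a standard QC calculation: the measured concentration of a spiked sample, less the native concentration, divided by the known amount added. A minimal sketch with illustrative concentrations (not data from the study):

```python
def percent_recovery(measured, native, spiked):
    """Spike recovery as a percentage of the known amount added."""
    return 100.0 * (measured - native) / spiked

# e.g. a sample with 5.0 mg/L native analyte, spiked with 10.0 mg/L,
# measured at 14.8 mg/L after spiking: recovery is close to 100%
recovery = percent_recovery(14.8, 5.0, 10.0)  # ~98%
```

Recoveries near 100%, as reported for the GC and IC methods here, indicate that neither matrix effects nor losses during preparation are biasing the result.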

  14. 42 CFR 440.260 - Methods and standards to assure quality of services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Methods and standards to assure quality of services... and Limits Applicable to All Services § 440.260 Methods and standards to assure quality of services. The plan must include a description of methods and standards used to assure that services are of...

  15. Evaluation of a clinically intuitive quality assurance method

    PubMed Central

    Norris, H; Thomas, A; Oldham, M

    2013-01-01

There is a pressing need for clinically intuitive quality assurance methods that report metrics of relevance to the likely impact on tumor control and normal tissue injury. This paper presents a preliminary investigation into the accuracy of a novel “transform method” which enables a clinically relevant analysis through dose-volume histograms (DVHs) and dose overlays on the patient’s CT data. The transform method was tested by inducing a series of known mechanical and delivery errors onto simulated 3D dosimetry measurements of six different head-and-neck IMRT treatment plans. Accuracy was then examined through the comparison of the transformed patient dose distributions and the known actual patient dose distributions through dose-volume histograms and normalized dose difference analysis. Through these metrics, the transform method was found to be highly accurate in predicting measured patient dose distributions for these types of errors. PMID:24454519
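A cumulative DVH of the kind used in this analysis reduces a 3D dose distribution to, for each dose level, the fraction of the structure's volume receiving at least that dose. A minimal pure-Python sketch with toy voxel doses (not the study's data or its transform method):

```python
def cumulative_dvh(dose_voxels, bin_width=1.0):
    """Cumulative dose-volume histogram: for each dose level, the fraction
    of the structure's voxels receiving at least that dose."""
    n = len(dose_voxels)
    n_bins = int(max(dose_voxels) / bin_width) + 1
    levels = [i * bin_width for i in range(n_bins + 1)]
    volume_fraction = [sum(1 for d in dose_voxels if d >= lv) / n
                       for lv in levels]
    return levels, volume_fraction

# four toy voxels at 1-4 Gy: 100% of volume gets >= 1 Gy, 25% gets >= 4 Gy
levels, vf = cumulative_dvh([1.0, 2.0, 3.0, 4.0])
```

Comparing the DVH of a transformed (predicted) dose distribution against that of the actual distribution is what turns a QA measurement into a clinically interpretable metric.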

  16. Evaluation of a Standardized Method of Quality Assurance in Mental Health Records: A Pilot Study

    ERIC Educational Resources Information Center

    Bradshaw, Kelsey M.; Donohue, Bradley; Fayeghi, Jasmine; Lee, Tiffany; Wilks, Chelsey R.; Ross, Brendon

    2016-01-01

    The widespread adoption of research-supported treatments by mental health providers has facilitated empirical development of quality assurance (QA) methods. Research in this area has focused on QA systems aimed at assuring the integrity of research-supported treatment implementation, while examination of QA systems to assure appropriate…

  17. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  18. QUALITY ASSURANCE PROGRAM FOR WET DEPOSITION SAMPLING AND CHEMICAL ANALYSES FOR THE NATIONAL TRENDS NETWORK.

    USGS Publications Warehouse

    Schroder, LeRoy J.; Malo, Bernard A.; ,

    1985-01-01

    The purpose of the National Trends Network is to delineate the major inorganic constituents in the wet deposition in the United States. The approach chosen to monitor the Nation's wet deposition is to install approximately 150 automatic sampling devices with at least one collector in each state. Samples are collected at one week intervals, removed from collectors, and transported to an analytical laboratory for chemical analysis. The quality assurance program has divided wet deposition monitoring into 5 parts: (1) Sampling site selection, (2) sampling device, (3) sample container, (4) sample handling, and (5) laboratory analysis. Each of these five components is being examined using existing designs or new designs. Each existing or proposed sampling site is visited and a criteria audit is performed.

  19. Sampling system and method

    DOEpatents

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together; the tubing bundle is periodically secured to the wireline using a clamp.

  20. Data quality assessment in the routine health information system: an application of the Lot Quality Assurance Sampling in Benin.

    PubMed

    Glèlè Ahanhanzo, Yolaine; Ouendo, Edgard-Marius; Kpozèhouen, Alphonse; Levêque, Alain; Makoutodé, Michel; Dramaix-Wilmet, Michèle

    2015-09-01

    Health information systems in developing countries are often faulted for the poor quality of the data generated and for the insufficient means implemented to improve system performance. This study examined data quality in the Routine Health Information System in Benin in 2012 and carried out a cross-sectional evaluation of the quality of the data using the Lot Quality Assurance Sampling method. The results confirm the insufficient quality of the data based on three criteria: completeness, reliability and accuracy. However, differences can be seen as the shortcomings are less significant for financial data and for immunization data. The method is simple, fast and can be proposed for current use at operational level as a data quality control tool during the production stage.

  1. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C, Quality Assurance (QA), defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well-established and management-supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  2. Sediment laboratory quality-assurance project: studies of methods and materials

    USGS Publications Warehouse

    Gordon, J.D.; Newland, C.A.; Gray, J.R.

    2001-01-01

In August 1996 the U.S. Geological Survey initiated the Sediment Laboratory Quality-Assurance project. The Sediment Laboratory Quality-Assurance project is part of the National Sediment Laboratory Quality-Assurance program. This paper addresses the findings of the sand/fine separation analysis completed for the single-blind reference sediment-sample project and differences in reported results between two different analytical procedures. From the results it is evident that an incomplete separation of fine- and sand-size material commonly occurs, resulting in the classification of some of the fine-size material as sand-size material. Electron microscopy analysis supported the hypothesis that the negative bias for fine-size material and the positive bias for sand-size material is largely due to aggregation of some of the fine-size material into sand-size particles and adherence of fine-size material to the sand-size grains. Electron microscopy analysis showed that preserved river water that was low in dissolved solids and specific conductance, with a neutral pH, showed less aggregation and adhesion than preserved river water that was higher in dissolved solids and specific conductance with a basic pH. Bacteria were also found growing in the matrix, which may enhance fine-size material aggregation through their adhesive properties. Differences between sediment-analysis methods were also investigated as part of this study. Suspended-sediment concentration results obtained from one participating laboratory that used a total-suspended solids (TSS) method had greater variability and larger negative biases than results obtained when this laboratory used a suspended-sediment concentration method. When TSS methods were used to analyze the reference samples, the median suspended-sediment concentration percent difference was -18.04 percent. When the laboratory used a suspended-sediment concentration method, the median suspended-sediment concentration percent difference was -2
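The bias statistic quoted above is a median percent difference of laboratory results from reference values. A minimal sketch of that computation, with illustrative concentrations rather than the study's data:

```python
def percent_difference(reported, reference):
    """Percent difference of a laboratory result from the reference value."""
    return 100.0 * (reported - reference) / reference

def median(values):
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2.0

# e.g. (reported, reference) pairs from a set of blind reference samples (mg/L)
pairs = [(95.0, 100.0), (80.0, 100.0), (70.0, 100.0)]
diffs = [percent_difference(r, ref) for r, ref in pairs]
bias = median(diffs)  # -20.0: results run 20 percent low at the median
```

The median is preferred over the mean here because single gross analytical errors would otherwise dominate the bias estimate.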

  3. Sampling methods for phlebotomine sandflies.

    PubMed

    Alexander, B

    2000-06-01

    A review is presented of methods for sampling phlebotomine sandflies (Diptera: Psychodidae). Among approximately 500 species of Phlebotominae so far described, mostly in the New World genus Lutzomyia and the Old World genus Phlebotomus, about 10% are known vectors of Leishmania parasites or other pathogens. Despite being small and fragile, sandflies have a wide geographical range with species occupying a considerable diversity of ecotopes and habitats, from deserts to humid forests, so that suitable methods for collecting them are influenced by environmental conditions where they are sought. Because immature phlebotomines occupy obscure terrestrial habitats, it is difficult to find their breeding sites. Therefore, most trapping methods and sampling procedures focus on sandfly adults, whether resting or active. The diurnal resting sites of adult sandflies include tree holes, buttress roots, rock crevices, houses, animal shelters and burrows, from which they may be aspirated directly or trapped after being disturbed. Sandflies can be collected during their periods of activity by interception traps, or by using attractants such as bait animals, CO2 or light. The method of trapping used should: (a) be suited to the habitat and area to be surveyed, (b) take into account the segment of the sandfly population to be sampled (species, sex and reproduction condition) and (c) yield specimens of appropriate condition for the study objectives (e.g. identification of species present, population genetics or vector implication). Methods for preservation and transportation of sandflies to the laboratory also depend on the objectives of a particular study and are described accordingly. PMID:10872855

  4. Elaborating transition interface sampling methods

    SciTech Connect

Erp, Titus S. van

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.
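The core TIS identity, a rate constant written as the effective flux through the first interface times a product of conditional interface-crossing probabilities, can be written down directly. The numbers below are illustrative, not results from the paper:

```python
def tis_rate(flux_0, crossing_probs):
    """k_AB = phi_{A,0} * prod_i P(lambda_{i+1} | lambda_i):
    the effective flux through the first interface times the product of
    conditional probabilities of reaching each successive interface."""
    rate = flux_0
    for p in crossing_probs:
        rate *= p
    return rate

# e.g. a flux of 2.0 ns^-1 through the first interface and three
# further interfaces with crossing probabilities 0.5, 0.2, 0.1
k = tis_rate(2.0, [0.5, 0.2, 0.1])  # ~0.02 ns^-1
```

The factorization is what makes rare events tractable: each crossing probability is large enough to estimate by path sampling even when their product, and hence the rate, is vanishingly small.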

  5. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
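The correlation coefficients quoted here are sample Pearson correlations between model predictions and observations. A minimal sketch with toy fuel-flow values (not FOQA data):

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# a model whose predictions track observed fuel flow closely scores near 1.0
observed = [2100.0, 2300.0, 2550.0, 2700.0]
predicted = [2105.0, 2295.0, 2560.0, 2690.0]
r = pearson_r(observed, predicted)
```

A correlation of .99 versus .91 is the kind of gap the abstract reports between the neural-network models and the tree-based models.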

  6. Experience with Formal Methods techniques at the Jet Propulsion Laboratory from a quality assurance perspective

    NASA Technical Reports Server (NTRS)

    Kelly, John C.; Covington, Rick

    1993-01-01

    Recent experience with Formal Methods (FM) in the Software Quality Assurance Section at the Jet Propulsion Lab is presented. An integrated Formal Method process is presented to show how related existing requirements analysis and FM techniques complement one another. Example application of FM techniques such as formal specifications and specification animators are presented. The authors suggest that the quality assurance organization is a natural home for the Formal Methods specialist, whose expertise can then be used to best advantage across a range of projects.

  7. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    SciTech Connect

    Grabbe, R.R.

    1995-03-02

The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 US Code of Federal Regulations (CFR) 830.120] QA requirements that HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal or state requirements, states the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. This document also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAPPs) (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Product Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA index is provided in Appendix A.

  8. Quality assurance flood source and method of making

    SciTech Connect

    Fisher, Darrell R; Alexander, David L; Satz, Stanley

    2002-12-03

Disclosed is an improved flood source, and a method of making the same, which emits an evenly distributed flow of energy from a gamma-emitting radionuclide dispersed throughout the volume of the flood source. The flood source is formed by filling a bottom pan with a mix of epoxy resin and cobalt-57, preferably at 10 to 20 millicuries, and then adding a hardener. The pan is secured to a flat, level surface to prevent the pan from warping and to act as a heat sink for removal of heat from the pan during the curing of the resin-hardener mixture.

  9. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, David R.

    1998-01-01

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis.

  10. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, D.R.

    1998-02-03

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis. 3 figs.

  11. Quality assurance manual plutonium liquid scintillation methods and procedures

    SciTech Connect

    Romero, L.

    1997-01-01

    Nose swipe analysis is a very important tool for Radiation Protection personnel. It is a fast and accurate method for (1) determining whether a worker has been exposed to airborne plutonium contamination and (2) identifying the area where a plutonium release may have occurred. Liquid scintillation analysis techniques have been effectively applied to accurately determine the plutonium alpha activity on nose swipe media. Whatman-40 paper and Q-Tips are the only two media that have been evaluated and can be used for nose swipe analysis. Presently, only Q-Tips are used by Group HSE-1 Radiation Protection personnel; however, both swipe media are discussed in this report.

  12. Assessing Local Risk of Rifampicin-Resistant Tuberculosis in KwaZulu-Natal, South Africa Using Lot Quality Assurance Sampling

    PubMed Central

    Heidebrecht, Christine L.; Podewils, Laura J.; Pym, Alexander; Mthiyane, Thuli; Cohen, Ted

    2016-01-01

    Background: KwaZulu-Natal (KZN) has the highest burden of notified multidrug-resistant tuberculosis (MDR TB) and extensively drug-resistant (XDR) TB cases in South Africa. A better understanding of spatial heterogeneity in the risk of drug resistance may help to prioritize local responses. Methods: Between July 2012 and June 2013, we conducted a two-way Lot Quality Assurance Sampling (LQAS) study to classify the burden of rifampicin (RIF)-resistant TB among incident TB cases notified within the catchment areas of seven laboratories in two northern districts and one southern district of KZN. Decision rules for classifying areas as having either a high or low risk of RIF-resistant TB (based on the proportion of RIF resistance among all TB cases) were developed in consultation with local policy makers. Results: We classified five areas as high-risk and two as low-risk. High-risk areas were identified in both the southern and northern districts, with the greatest proportion of RIF resistance observed in the northernmost area, the Manguzi community situated on the Mozambique border. Conclusion: Our study revealed heterogeneity in the risk of RIF-resistant disease among incident TB cases in KZN. It demonstrates the potential for LQAS to detect geographic heterogeneity in areas where access to drug susceptibility testing is limited. PMID:27050561
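    The two-way LQAS decision rule described above reduces to a count-against-threshold comparison, and the binomial tail gives the misclassification risk for any candidate threshold. The sketch below is illustrative only; the lot size and decision threshold are hypothetical, not the values negotiated with KZN policy makers.

```python
from math import comb

def binom_tail(n, d, p):
    """P(X >= d) for X ~ Binomial(n, p): the chance that d or more of n
    sampled TB cases are RIF-resistant when the true prevalence is p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d, n + 1))

def classify_area(n_resistant, decision_threshold):
    """Two-way LQAS classification: at or above the threshold -> high-risk."""
    return "high-risk" if n_resistant >= decision_threshold else "low-risk"

# Hypothetical lot: 50 sampled cases, decision threshold d = 5.
# Misclassification risks at assumed low (2%) and high (15%) prevalence:
alpha = binom_tail(50, 5, 0.02)     # classify high-risk when truly low
beta = 1 - binom_tail(50, 5, 0.15)  # classify low-risk when truly high
```

    In practice n and d are chosen so that both error probabilities fall below agreed limits.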

  13. Quality assurance

    SciTech Connect

    Gillespie, B.M.; Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  14. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring the mixture's components. The apparatus includes two sampling containers connected in series, serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. The second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving the noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. The first and second containers are then separated from one another so that the amount of noncondensable gases and the amount of condensable gases in the sample can be determined separately.

  15. A method for critical software event execution reliability in high assurance systems

    SciTech Connect

    Kidd, M.E.C.

    1997-03-01

    This paper presents a method for Critical Software Event Execution Reliability (Critical SEER). The Critical SEER method is intended for high assurance software that operates in an environment where transient upsets could occur, causing a disturbance of the critical software event execution order, which could cause safety or security hazards. The method has a finite automata based module that watches (hence SEER) and tracks the critical events and ensures they occur in the proper order or else a fail safe state is forced. This method is applied during the analysis, design and implementation phases of software engineering.
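    The watcher can be pictured as a finite automaton that advances only on the next expected critical event and latches a fail-safe state on any out-of-order occurrence. The following is a minimal sketch of that idea, not the paper's implementation; the class and event names are invented for illustration.

```python
class CriticalEventWatcher:
    """Finite-automaton sketch of the Critical SEER idea: watch critical
    events and force a fail-safe state if they occur out of order."""

    def __init__(self, expected_order):
        self.expected = list(expected_order)
        self.state = 0          # index of the next expected event
        self.fail_safe = False

    def observe(self, event):
        if self.fail_safe:
            return              # already latched; ignore further events
        if self.state < len(self.expected) and event == self.expected[self.state]:
            self.state += 1     # correct event: advance the automaton
        else:
            self.fail_safe = True  # out-of-order event: latch fail-safe

# Invented event names, for illustration only.
w = CriticalEventWatcher(["arm", "verify", "fire"])
for e in ["arm", "verify", "fire"]:
    w.observe(e)                # in-order: no fail-safe

w2 = CriticalEventWatcher(["arm", "verify", "fire"])
w2.observe("fire")              # out of order: fail-safe forced
```

    A latched fail-safe state, rather than a recoverable error, matches the safety-first intent described in the abstract.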

  16. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  17. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method is disclosed for taking a soil sample using these devices so as to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having its outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similarly soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member adapted to attach to an analytical apparatus that analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing their loss prior to analysis. 11 figures.

  18. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method is disclosed for taking a soil sample using these devices so as to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having its outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similarly soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member adapted to attach to an analytical apparatus that analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing their loss prior to analysis.

  19. Paediatric Rehabilitation Treatment Standards: A Method for Quality Assurance in Germany

    PubMed Central

    Ahnert, Jutta; Löffler, Stefan; Müller, Jochen; Lukasczik, Matthias; Brüggemann, Silke; Vogel, Heiner

    2014-01-01

    Over the last few years, the German Pension Insurance has implemented a new method of quality assurance for the inpatient rehabilitation of children and adolescents diagnosed with bronchial asthma, obesity, or atopic dermatitis: the so-called rehabilitation treatment standards (RTS). They aim at promoting comprehensive, evidence-based care in rehabilitation. Furthermore, they are intended to make the therapeutic processes in medical rehabilitation, as well as potential deficits, more transparent. The development of the RTS comprised five phases, during which current scientific evidence, expert knowledge, and patient expectations were incorporated. Their core element is the specification of evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Opportunities and limitations of the RTS as a tool for quality assurance are discussed. Significance for public health: The German Pension Insurance's rehabilitation treatment standards (RTS) for the inpatient rehabilitation of children and adolescents aim at contributing to comprehensive, evidence-based care in paediatric rehabilitation. As a core element, they comprise evidence-based treatment modules that describe a good rehabilitation standard for children diagnosed with bronchial asthma, obesity, or atopic dermatitis. Although the RTS were developed for the specific context of the German health care system, they may serve as a more general starting point for developing health care and quality assurance standards in child and adolescent medical rehabilitative care. PMID:25343137

  20. A Novel Method for Quality Assurance of the Cyberknife Iris Variable Aperture Collimator

    PubMed Central

    Kremer, Nikolaus; Fürweger, Christoph

    2016-01-01

    Objective: To characterize a novel method for field-size quality assurance of a variable, approximately circular aperture collimator by means of dose-area product measurements, and to validate its practical use over two years of clinical application. Methods: To assess methodical limitations, we analyze measurement errors due to changes in linac output, beam tuning, uncertainty in MU delivery, daily factors, the inherent uncertainty of the large-area parallel-plate ionisation chamber, and misalignment of the chamber relative to the primary beam axis. To establish a baseline for quality assurance, the dose-area product is measured with the large-area parallel-plate ionisation chamber for all 12 clinical Iris apertures in relation to the 60 mm fixed reference aperture. To evaluate the long-term stability of the Iris collimation system, deviation from the baseline data is assessed monthly and compared to a priori derived tolerance levels. Results: Only chamber misalignment, variation in output, and uncertainty in MU delivery contribute to a combined error, which is estimated at 0.2% of the nominal field size. This is equivalent to a resolution of 0.005 mm for the 5 mm field and 0.012 mm for the 60 mm field. The method offers ease of use, a small measurement time commitment, and independence from most error sources. Over the observed period, the Iris accuracy remained within the tolerance levels. Conclusions: The method is an advantageous alternative to film-based quality assurance, with high reliability, short measurement time, and superior accuracy in field-size determination. PMID:27382526
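    Because each Iris aperture's dose-area product is taken relative to the 60 mm fixed reference aperture measured in the same session, day-to-day output variation largely cancels. A sketch of that ratio check follows; the function names and the 1% tolerance are illustrative assumptions, not the authors' software or clinical values.

```python
def dap_deviation(dap_iris, dap_ref, baseline_ratio):
    """Relative deviation of the Iris/reference dose-area-product ratio
    from its commissioning baseline."""
    return (dap_iris / dap_ref - baseline_ratio) / baseline_ratio

def within_tolerance(dap_iris, dap_ref, baseline_ratio, tol=0.01):
    """Monthly QA check: flag an aperture whose ratio has drifted beyond
    tol. The 1% default is an illustrative placeholder."""
    return abs(dap_deviation(dap_iris, dap_ref, baseline_ratio)) <= tol
```

    Normalizing to a fixed reference measured in the same session is what makes the method insensitive to linac output drift.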

  1. Duplex sampling apparatus and method

    SciTech Connect

    Brown, P.E.; Lloyd, R.

    1992-07-07

    This patent describes a method of measuring the condensable vapor content and the noncondensable gaseous content of a mixture of condensable vapors and noncondensable gases. It comprises collecting a quantity of such a mixture in a first container, cooling the first container so that the condensable vapors in the mixture condense, transferring the noncondensable gases from the first container to a downstream second container, determining the quantity of condensable vapors retained in the first container, measuring the pressure and temperature of the noncondensable gases in the second container, and determining the quantity of noncondensable gases in the second container from the temperature, pressure, and volume using the ideal gas law, PV = nRT, whereby the ratio of condensable vapors to noncondensable gases in the mixture is determined.
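    The final step of the claim is a direct ideal-gas computation: once the pressure and temperature are measured and the container volume is known, the amount of noncondensable gas follows from PV = nRT. A sketch, assuming SI units:

```python
R = 8.314  # universal gas constant, J/(mol*K)

def moles_noncondensable(pressure_pa, volume_m3, temperature_k):
    """n = PV / (RT): moles of noncondensable gas in the second container."""
    return pressure_pa * volume_m3 / (R * temperature_k)

# Sanity check: one molar volume at standard conditions is about 1 mol.
n = moles_noncondensable(101_325.0, 0.02241, 273.15)
```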

  2. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components. PMID:21626190

  3. Direct method for second-order sensitivity analysis of modal assurance criterion

    NASA Astrophysics Data System (ADS)

    Lei, Sheng; Mao, Kuanmin; Li, Li; Xiao, Weiwei; Li, Bin

    2016-08-01

    A Lagrange direct method is proposed to calculate the second-order sensitivity of modal assurance criterion (MAC) values of undamped systems. The eigenvalue problem and normalizations of eigenvectors, which augmented by using some Lagrange multipliers, are used as the constraints of the Lagrange functional. Once the Lagrange multipliers are determined, the sensitivities of MAC values can be evaluated directly. The Lagrange direct method is accurate, efficient and easy to implement. A simply supported beam is utilized to check the accuracy of the proposed method. A frame is adopted to validate the predicting capacity of the first- and second-order sensitivities of MAC values. It is shown that the computational costs of the proposed method can be remarkably reduced in comparison with those of the indirect method without loss of accuracy.
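    For readers unfamiliar with the quantity being differentiated: the MAC between two mode shapes is their normalized squared inner product, equal to 1 for collinear shapes and 0 for orthogonal ones. A minimal sketch in plain Python, assuming real-valued mode shapes (the abstract's undamped-system setting):

```python
def dot(a, b):
    """Inner product of two real-valued mode-shape vectors."""
    return sum(x * y for x, y in zip(a, b))

def mac(phi_a, phi_b):
    """Modal assurance criterion: (phi_a . phi_b)^2 normalized by the
    self inner products; 1 for identical shapes, 0 for orthogonal."""
    return dot(phi_a, phi_b) ** 2 / (dot(phi_a, phi_a) * dot(phi_b, phi_b))
```

    The sensitivities studied in the paper are derivatives of this scalar with respect to design parameters.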

  4. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
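    One widely used subrandom (low-discrepancy) generator is the additive golden-ratio recurrence, which needs no seed and is therefore exactly reproducible across runs. The sketch below maps it onto a one-dimensional grid; it illustrates the seed-independence property only and is not the paper's exact weighted scheme.

```python
def subrandom_schedule(grid_size, n_points):
    """Pick n_points distinct indices from a grid of grid_size points
    using the seed-free additive golden-ratio sequence."""
    phi = (5 ** 0.5 - 1) / 2        # fractional part of the golden ratio
    chosen, x = [], 0.0
    for _ in range(50 * grid_size):  # generous iteration cap
        x = (x + phi) % 1.0          # next subrandom point in [0, 1)
        idx = int(x * grid_size)
        if idx not in chosen:
            chosen.append(idx)
        if len(chosen) == n_points:
            break
    return sorted(chosen)
```

    Calling the function twice with the same arguments yields the identical schedule, which is the practical advantage over pseudorandom selection.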

  5. Subrandom methods for multidimensional nonuniform sampling

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.

  6. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high-speed sampling of the detector output, conversion of the samples to digital values, and discrimination of the digital values so that digital values representing detected events are determined. The high-speed sampling and digital conversion are performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provide for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
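    The discrimination step can be pictured as a threshold pass over the digitized waveform: contiguous runs of samples above threshold are kept as candidate events and everything else is discarded. A minimal sketch; the threshold logic is an assumption for illustration, as the patent does not specify an algorithm at this level.

```python
def discriminate(samples, threshold):
    """Group digitized detector samples into candidate events: each
    event is a contiguous run of samples exceeding the threshold."""
    events, current = [], []
    for s in samples:
        if s > threshold:
            current.append(s)       # still inside a pulse
        elif current:
            events.append(current)  # pulse ended; store the event
            current = []
    if current:                     # waveform ended mid-pulse
        events.append(current)
    return events
```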

  7. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high-speed sampling of the detector output, conversion of the samples to digital values, and discrimination of the digital values so that digital values representing detected events are determined. The high-speed sampling and digital conversion are performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provide for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.

  8. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  9. Analytical laboratory quality assurance guidance in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document introduces QA guidance pertaining to design and implementation of laboratory procedures and processes for collecting DOE Environmental Restoration and Waste Management (EM) ESAA (environmental sampling and analysis activities) data. It addresses several goals: identifying key laboratory issues and program elements to EM HQ and field office managers; providing non-prescriptive guidance; and introducing environmental data collection program elements for EM-263 assessment documents and programs. The guidance describes the implementation of laboratory QA elements within a functional QA program (development of the QA program and data quality objectives are not covered here).

  10. Private sector delivery of health services in developing countries: a mixed-methods study on quality assurance in social franchises

    PubMed Central

    2013-01-01

    Background Across the developing world health care services are most often delivered in the private sector and social franchising has emerged, over the past decade, as an increasingly popular method of private sector health care delivery. Social franchising aims to strengthen business practices through economies of scale: branding clinics and purchasing drugs in bulk at wholesale prices. While quality is one of the established goals of social franchising, there is no published documentation of how quality levels might be set in the context of franchised private providers, nor what quality assurance measures can or should exist within social franchises. The aim of this study was to better understand the quality assurance systems currently utilized in social franchises, and to determine if there are shared standards for practice or quality outcomes that exist across programs. Methods The study included three data sources and levels of investigation: 1) Self-reported program data; 2) Scoping telephone interviews; and 3) In-depth field interviews and clinic visits. Results Social Franchises conceive of quality assurance not as an independent activity, but rather as a goal that is incorporated into all areas of franchise operations, including recruitment, training, monitoring of provider performance, monitoring of client experience and the provision of feedback. Conclusions These findings are the first evidence to support the 2002 conceptual model of social franchising which proposed that the assurance of quality was one of the three core goals of all social franchises. However, while quality is important to franchise programs, quality assurance systems overall are not reflective of the evidence to-date on quality measurement or quality improvement best practices. Future research in this area is needed to better understand the details of quality assurance systems as applied in social franchise programs, the process by which quality assurance becomes a part of the

  11. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    SciTech Connect

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  12. Assuring Quality in Education Evaluation.

    ERIC Educational Resources Information Center

    Trochim, William M. K.; Visco, Ronald J.

    1986-01-01

    A number of quality assurance educational evaluation methods are illustrated. Evaluation data obtained from the Providence, Rhode Island, school district are used. The methods are: (1) from auditing, internal control; (2) from accounting, double bookkeeping; and (3) from industrial quality control, acceptance sampling and cumulative percentage…

  13. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis), and data analysis (e.g., feature extraction and quantification). Each of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature, and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
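    The simplest of the normalization strategies such reviews discuss is total-intensity (constant-sum) normalization, which removes differences in overall sample amount before individual metabolites are compared. A sketch, for illustration only:

```python
def sum_normalize(intensities):
    """Scale one sample's metabolite intensities to unit total, so that
    samples with different total amounts become directly comparable."""
    total = sum(intensities)
    return [x / total for x in intensities]

# Two samples with a 2x difference in total amount but the same
# underlying metabolite profile normalize to identical vectors:
a = sum_normalize([10.0, 30.0, 60.0])
b = sum_normalize([20.0, 60.0, 120.0])
```

    Constant-sum normalization is only one option; the review also covers approaches that are more robust when a few abundant metabolites dominate the total signal.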

  14. A method of setting limits for the purpose of quality assurance

    NASA Astrophysics Data System (ADS)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd

    2013-10-01

    The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure for setting these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach is based on techniques of quality engineering using control charts and a process capability index. The method differs for tolerance limits and action limits, with action limits being categorized as either specified or unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index, with the requirement that the process be in control. The limits from the proposed procedure are compared to an existing, or conventional, method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for the VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and provides a systematic guide to setting tolerance and action limits for different equipment and processes.
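    The first steps of the proposed procedure (verify statistical control with an I-MR chart, then take the I-chart limits as tolerance limits, then gauge capability with Cpm) can be sketched as below. The 2.66 factor is the standard individuals-chart constant 3/d2 with d2 = 1.128; the example data are invented.

```python
from math import sqrt

def i_chart_limits(values):
    """Individuals-chart control limits, mean +/- 2.66 * average moving
    range; per the proposed procedure these become the tolerance limits."""
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    mean = sum(values) / len(values)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def cpm(values, lsl, usl, target):
    """Taguchi process capability index, used in the paper to derive
    action limits once the process is in control."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return (usl - lsl) / (6 * sqrt(var + (mean - target) ** 2))

# Invented QA measurements (e.g., percent dose deviations).
vals = [10.0, 12.0, 11.0, 13.0, 11.0]
lo, hi = i_chart_limits(vals)
```

    Cpm penalizes both spread and off-target centering, which is why the paper requires the process to be in control before the index is interpreted.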

  15. Dynamic Method for Identifying Collected Sample Mass

    NASA Technical Reports Server (NTRS)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
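    Under Gaussian force-sensor noise and known thruster-induced accelerations, a maximum-likelihood mass estimate reduces to a least-squares fit of F ≈ m·a. The one-axis sketch below is a drastic simplification of G-Sample, which uses a full spacecraft dynamics model with flexible modes and center-of-mass offsets; the function is an illustration, not the flight algorithm.

```python
def estimate_mass(forces, accelerations):
    """Least-squares (Gaussian maximum-likelihood) mass estimate from
    paired force-sensor readings and known accelerations, F ~= m * a."""
    num = sum(f * a for f, a in zip(forces, accelerations))
    den = sum(a * a for a in accelerations)
    return num / den
```

    The one-to-one sensitivity to thrust-profile error noted in the abstract is visible here: scaling every acceleration by (1 + e) scales the estimate by roughly 1/(1 + e).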

  16. Photographic sampling: a photographic sampling method for mites on plants.

    PubMed

    Sircom, J

    2000-01-01

    A photographic sampling method for mites on plants was evaluated using Tetranychus urticae and Phytoseiulus persimilis on pepper plants. It was found to be 92% accurate for T. urticae eggs and 98% accurate for P. persimilis eggs at densities up to 45 eggs per cm2 for T. urticae, and up to 3 eggs per cm2 for P. persimilis. The motiles of the two species were not confused, nor were they confused with exuviae or other matter.

  17. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    PubMed

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDS™ for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Food and Environmental Surfaces for the detection of selected foods and environmental surfaces (Official Method of Analysis℠ 2009.03, Performance Tested Method℠ No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity and exclusivity studies, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method for selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella. PMID:26268975

  19. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  20. Systems and methods for sample analysis

    DOEpatents

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  1. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  2. Monitoring maternal, newborn, and child health interventions using lot quality assurance sampling in Sokoto State of northern Nigeria

    PubMed Central

    Abegunde, Dele; Orobaton, Nosa; Shoretire, Kamil; Ibrahim, Mohammed; Mohammed, Zainab; Abdulazeez, Jumare; Gwamzhi, Ringpon; Ganiyu, Akeem

    2015-01-01

    Background Maternal mortality ratio and infant mortality rate are as high as 1,576 per 100,000 live births and 78 per 1,000 live births, respectively, in Nigeria's northwestern region, where Sokoto State is located. Using applicable monitoring indicators for tracking progress in the UN/WHO framework on continuum of maternal, newborn, and child health care, this study evaluated the progress of Sokoto toward achieving the Millennium Development Goals (MDGs) 4 and 5 by December 2015. The changes in outcomes in 2012–2013 associated with maternal and child health interventions were assessed. Design We used baseline and follow-up lot quality assurance sampling (LQAS) data obtained in 2012 and 2013, respectively. In each of the surveys, data were obtained from 437 households sampled from 19 LQAS locations in each of the 23 local government areas (LGAs). The composite state-level coverage estimates of the respective indicators were aggregated from estimated LGA coverage estimates. Results None of the nine indicators associated with the continuum of maternal, neonatal, and child care satisfied the recommended 90% coverage target for achieving MDGs 4 and 5. Similarly, the average state coverage estimates were lower than national coverage estimates. Marginal improvements in coverage were obtained in the demand for family planning satisfied, antenatal care visits, postnatal care for mothers, and exclusive breast-feeding. Antibiotic treatment for acute pneumonia increased significantly by 12.8 percentage points. The majority of the LGAs were classifiable as low-performing, high-priority areas for intensified program intervention. Conclusions Despite the limited time left in the countdown to December 2015, Sokoto State, Nigeria, is not on track to achieving the MDG 90% coverage of indicators tied to the continuum of maternal and child care, to reduce maternal and childhood mortality by a third by 2015. Targeted health system investments at the primary care level remain a

  3. A Method of Separation Assurance for Instrument Flight Procedures at Non-Radar Airports

    NASA Technical Reports Server (NTRS)

    Conway, Sheila R.; Consiglio, Maria

    2002-01-01

    A method to provide automated air traffic separation assurance services during approach to or departure from a non-radar, non-towered airport environment is described. The method is constrained by provision of these services without radical changes or ambitious investments in current ground-based technologies. The proposed procedures are designed to grant access to a large number of airfields that currently have no or very limited access under Instrument Flight Rules (IFR), thus increasing mobility with minimal infrastructure investment. This paper primarily addresses a low-cost option for airport and instrument approach infrastructure, but is designed to be an architecture from which a more efficient, albeit more complex, system may be developed. A functional description of the capabilities in the current NAS infrastructure is provided. Automated terminal operations and procedures are introduced. Rules of engagement and the operations are defined. Results of preliminary simulation testing are presented. Finally, application of the method to more terminal-like operations, and major research areas, including necessary piloted studies, are discussed.

  4. Associations with HIV testing in Uganda: an analysis of the Lot Quality Assurance Sampling database 2003-2012.

    PubMed

    Jeffery, Caroline; Beckworth, Colin; Hadden, Wilbur C; Ouma, Joseph; Lwanga, Stephen K; Valadez, Joseph J

    2016-01-01

    Beginning in 2003, Uganda used Lot Quality Assurance Sampling (LQAS) to assist district managers collect and use data to improve their human immunodeficiency virus (HIV)/AIDS program. Uganda's LQAS-database (2003-2012) covers up to 73 of 112 districts. Our multidistrict analysis of the LQAS data-set at 2003-2004 and 2012 examined gender variation among adults who ever tested for HIV over time, and attributes associated with testing. Conditional logistic regression matched men and women by community with seven model effect variables. HIV testing prevalence rose from 14% (men) and 12% (women) in 2003-2004 to 62% (men) and 80% (women) in 2012. In 2003-2004, knowing the benefits of testing (Odds Ratio [OR] = 6.09, 95% CI = 3.01-12.35), knowing where to get tested (OR = 2.83, 95% CI = 1.44-5.56), and secondary education (OR = 3.04, 95% CI = 1.19-7.77) were significantly associated with HIV testing. By 2012, knowing the benefits of testing (OR = 3.63, 95% CI = 2.25-5.83), where to get tested (OR = 5.15, 95% CI = 3.26-8.14), primary education (OR = 2.01, 95% CI = 1.39-2.91), being female (OR = 3.03, 95% CI = 2.53-3.62), and being married (OR = 1.81, 95% CI = 1.17-2.8) were significantly associated with HIV testing. HIV testing prevalence in Uganda has increased dramatically, more for women than men. Our results concurred with other authors that education, knowledge of HIV, and marriage (women only) are associated with testing for HIV and suggest that couples testing is more prevalent than other authors.
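
The adjusted odds ratios above come from conditional logistic regression, which requires a model-fitting library; the basic quantity being reported, an odds ratio with a Wald 95% confidence interval, can be illustrated on a single hypothetical 2×2 table (the counts below are invented, not from the Uganda data set):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for the 2x2 table [[a, b], [c, d]]
    (a = exposed & tested, b = exposed & not tested, etc.),
    with a Wald 95% confidence interval computed on the log-OR scale."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts: HIV testing by secondary education
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
```

An interval that excludes 1 indicates significance at the 5% level, which is how the confidence intervals quoted in the abstract should be read.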

  5. Are Patent Medicine Vendors Effective Agents in Malaria Control? Using Lot Quality Assurance Sampling to Assess Quality of Practice in Jigawa, Nigeria

    PubMed Central

    Berendes, Sima; Adeyemi, Olusegun; Oladele, Edward Adekola; Oresanya, Olusola Bukola; Okoh, Festus; Valadez, Joseph J.

    2012-01-01

    Background Patent medicine vendors (PMV) provide antimalarial treatment and care throughout Sub-Saharan Africa, and can play an important role in the fight against malaria. Their close-to-client infrastructure could enable lifesaving artemisinin-based combination therapy (ACT) to reach patients in time. However, systematic assessments of drug sellers’ performance quality are crucial if their role is to be managed within the health system. Lot quality assurance sampling (LQAS) could be an efficient method to monitor and evaluate PMV practice, but has so far never been used for this purpose. Methods In support of the Nigeria Malaria Booster Program we assessed PMV practices in three Senatorial Districts (SDs) of Jigawa, Nigeria. A two-stage LQAS assessed whether at least 80% of PMV stores in SDs used national treatment guidelines. Acceptable sampling errors were set in consultation with government officials (alpha and beta <0.10). The hypergeometric formula determined sample sizes and cut-off values for SDs. A structured assessment tool identified high and low performing SDs for quality of care indicators. Findings Drug vendors performed poorly in all SDs of Jigawa for all indicators. For example, all SDs failed for stocking and selling first-line antimalarials. PMV sold no longer recommended antimalarials, such as Chloroquine, Sulfadoxine-Pyrimethamine and oral Artesunate monotherapy. Most PMV were ignorant of and lacked training about new treatment guidelines that had endorsed ACTs as first-line treatment for uncomplicated malaria. Conclusion There is urgent need to regularly monitor and improve the availability and quality of malaria treatment provided by medicine sellers in Nigeria; the irrational use of antimalarials in the ACT era revealed in this study bears a high risk of economic loss, death and development of drug resistance. LQAS has been shown to be a suitable method for monitoring malaria-related indicators among PMV, and should be applied in Nigeria
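
The design described in the abstract, classifying a Senatorial District as acceptable only if enough sampled PMV meet the 80% standard with alpha and beta both below 0.10, uses the hypergeometric distribution to choose the sample size and cut-off (decision value). A pure-Python sketch of that calculation (the population size and the 50% lower threshold below are illustrative assumptions, not figures from the study):

```python
from math import comb

def hyper_cdf(k, N, K, n):
    """P(X <= k) for a hypergeometric draw of n items from a population
    of N containing K 'successes'."""
    return sum(comb(K, i) * comb(N - K, n - i) for i in range(k + 1)) / comb(N, n)

def lqas_design(N, p_hi=0.80, p_lo=0.50, max_n=60):
    """Smallest (n, d) such that a lot is accepted iff >= d of n sampled
    items are adequate, with alpha = P(reject | true coverage p_hi) and
    beta = P(accept | true coverage p_lo) both below 0.10."""
    K_hi, K_lo = round(p_hi * N), round(p_lo * N)
    for n in range(1, max_n + 1):
        for d in range(n + 1):
            alpha = hyper_cdf(d - 1, N, K_hi, n)       # reject a good lot
            beta = 1 - hyper_cdf(d - 1, N, K_lo, n)    # accept a bad lot
            if alpha < 0.10 and beta < 0.10:
                return n, d, alpha, beta
    return None

n, d, alpha, beta = lqas_design(N=500)
```

For an 80% upper / 50% lower threshold this typically lands near the classic LQAS choice of n = 19, d = 13.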

  6. An EPID based method for efficient and precise asymmetric jaw alignment quality assurance

    SciTech Connect

    Clews, Luke; Greer, Peter B.

    2009-12-15

    Purpose: The aim of this work was to investigate the use of amorphous silicon electronic portal imaging devices (EPIDs) for regular quality assurance of linear accelerator asymmetric jaw junctioning. Methods: The method uses the beam central axis position on the EPID, measured to subpixel accuracy from two EPID images acquired with collimator angles 180° apart. Individual zero-jaw-position ("half-beam blocked") images are then acquired and the jaw position precisely determined for each using penumbra interpolation. The accuracy of determining jaw position with the EPID method was measured by translating a block (simulating a jaw) by known distances, using a translation stage, and then measuring each translation distance with the EPID. To establish the utility of EPID-based junction dose measurements, radiographic film measurements of junction dose maxima/minima as a function of jaw gap/overlap were made and compared to EPID measurements. Using the method, the long-term stability of zero jaw positioning was assessed for four linear accelerators over a 1-1.5 yr time period. The stability at nonzero gantry angles was assessed over a shorter time period. Results: The accuracy of determining jaw translations with the method was within 0.14 mm [standard deviation (SD) 0.037 mm], as found using the translation stage. The junction doses measured with the EPID differed from film because of the non-water-equivalent scattering properties of the EPID and hence its different penumbra profile. The doses were approximately linear with gap or overlap, and a correction factor was derived to convert EPID-measured junction dose to the film-measured equivalent. Over a 1 yr period, the zero jaw positions at the zero gantry position were highly reproducible, with an average SD of 0.07 mm for the 16 collimator jaws examined. However, the average jaw positions ranged from -0.7 to 0.9 mm relative to the central axis for the different jaws. The zero jaw position was also reproducible at gantry 90
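
The penumbra-interpolation step in the abstract, locating a jaw edge to sub-pixel accuracy on an EPID profile, can be sketched as linear interpolation of the 50%-of-maximum crossing (the profile shape and pixel pitch below are invented for illustration; the paper's actual algorithm may differ):

```python
def edge_position(profile, positions):
    """Sub-pixel edge of a rising penumbra: linearly interpolate the
    point where the profile crosses 50% of its maximum value."""
    half = max(profile) / 2.0
    for i in range(len(profile) - 1):
        lo, hi = profile[i], profile[i + 1]
        if lo < half <= hi:  # rising edge assumed
            t = (half - lo) / (hi - lo)
            return positions[i] + t * (positions[i + 1] - positions[i])
    raise ValueError("no 50% crossing found")

# Synthetic profile: linear penumbra from 8.9 mm to 10.9 mm, 0.25 mm pixels
positions = [0.25 * i for i in range(100)]
profile = [min(max((x - 8.9) / 2.0, 0.0), 1.0) for x in positions]
edge = edge_position(profile, positions)  # ~9.9 mm, between pixel centers
```

With two opposed-collimator images, the midpoint of the two measured edge positions gives the beam central axis, and each jaw offset is its edge position minus that midpoint, consistent with the opposed-image idea described above.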

  7. Bayesian individualization via sampling-based methods.

    PubMed

    Wakefield, J

    1996-02-01

    We consider the situation where we wish to adjust the dosage regimen of a patient based on (in general) sparse concentration measurements taken on-line. A Bayesian decision theory approach is taken which requires the specification of an appropriate prior distribution and loss function. A simple method for obtaining samples from the posterior distribution of the pharmacokinetic parameters of the patient is described. In general, these samples are used to obtain a Monte Carlo estimate of the expected loss which is then minimized with respect to the dosage regimen. Some special cases which yield analytic solutions are described. When the prior distribution is based on a population analysis then a method of accounting for the uncertainty in the population parameters is described. Two simulation studies showing how the methods work in practice are presented. PMID:8827585
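
The core loop described in the abstract (draw posterior samples, form a Monte Carlo estimate of the expected loss, minimize over the dosage regimen) can be sketched with a deliberately simple model: a lognormal posterior for clearance, steady-state concentration Css = dose rate / CL, and squared-error loss. All numbers are illustrative, not from the paper:

```python
import random

random.seed(1)
# Stand-ins for posterior samples of clearance CL (L/h) from a Bayesian fit
cl_samples = [random.lognormvariate(1.6, 0.3) for _ in range(2000)]

TARGET = 15.0  # desired steady-state concentration (mg/L), illustrative

def expected_loss(dose_rate):
    """Monte Carlo estimate of E[(Css - target)^2], with Css = dose_rate / CL
    averaged over the posterior samples of clearance."""
    return sum((dose_rate / cl - TARGET) ** 2 for cl in cl_samples) / len(cl_samples)

# Minimize the estimated expected loss over a grid of dose rates (mg/h)
doses = [float(d) for d in range(20, 201)]
best_dose = min(doses, key=expected_loss)
```

Because the loss is squared error in concentration, the optimum is pulled below target × median clearance to protect patients with low clearance; an asymmetric loss penalizing toxicity more than underdosing would shift it further.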

  8. Alternate calibration method of radiochromic EBT3 film for quality assurance verification of clinical radiotherapy treatments

    NASA Astrophysics Data System (ADS)

    Park, Soah; Kang, Sei-Kwon; Cheong, Kwang-Ho; Hwang, Taejin; Yoon, Jai-Woong; Koo, Taeryool; Han, Tae Jin; Kim, Haeyoung; Lee, Me Yeon; Bae, Hoonsik; Kim, Kyoung Ju

    2016-07-01

    EBT3 film is utilized as a dosimetry quality assurance tool for the verification of clinical radiotherapy treatments. In this work, we suggest a percentage-depth-dose (PDD) calibration method that can calibrate several EBT3 film pieces together at different dose levels, because photon beams deliver different dose levels at different depths along the beam axis. We investigated the feasibility of the film PDD calibration method based on PDD data and compared the results with those from the traditional film calibration method. Photon beams at 6 MV were delivered to EBT3 film pieces for both calibration methods. For the PDD-based calibration, the film pieces were placed on solid phantoms at the depth of maximum dose (dmax) and at depths of 3, 5, 8, 12, 17, and 22 cm, and a photon beam was delivered twice, at 100 cGy and 400 cGy, to extend the calibration dose range under the same conditions. Fourteen film pieces, to maintain their consistency, were irradiated at doses ranging from approximately 30 to 400 cGy for both film calibrations. The film pieces were located at the center position on the scan bed of an Epson 1680 flatbed scanner in the parallel direction. Intensity-modulated radiation therapy (IMRT) plans were created, and their dose distributions were delivered to the film. The dose distributions for the traditional method and those for the PDD-based calibration method were evaluated using a Gamma analysis. The PDD dose values measured at the depths of interest with a CC13 ion chamber and with an FC65-G Farmer chamber were very similar. With the objective test criterion of a 1% dose agreement at 1 mm, the passing rates for the four cases of the three IMRT plans were essentially identical. The traditional and the PDD-based calibrations provided similar plan verification results.
We also describe another alternative for calibrating EBT3 films, i.e., a PDD-based calibration method that provides an easy and time-saving approach
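
The time saving in the PDD-based approach comes from one delivery exposing many film pieces at once: each piece receives the delivered dmax dose scaled by the PDD at its depth. A sketch of how the calibration dose levels are generated (the PDD percentages below are rough generic 6 MV values assumed for illustration, not the paper's measured data):

```python
# Assumed 6 MV PDD (% of dmax dose) at the depths listed in the abstract
PDD = {1.5: 100.0, 3: 93.0, 5: 86.0, 8: 76.0, 12: 64.0, 17: 52.0, 22: 43.0}

def calibration_doses(delivered_cgy):
    """Dose absorbed by the film piece at each depth for one delivery
    of `delivered_cgy` at dmax."""
    return {depth: delivered_cgy * p / 100.0 for depth, p in PDD.items()}

# Two deliveries (100 and 400 cGy at dmax) yield 14 calibration points
levels = sorted(list(calibration_doses(100.0).values())
                + list(calibration_doses(400.0).values()))
```

Each resulting (dose, film response) pair then feeds the usual net-optical-density calibration fit.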

  9. Actinide recovery method -- Large soil samples

    SciTech Connect

    Maxwell, S.L. III

    2000-04-25

    There is a need to measure actinides in environmental samples with lower and lower detection limits, requiring larger sample sizes. This analysis is adversely affected by sample-matrix interferences, which make analyzing soil samples above five grams very difficult. A new Actinide-Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides from large soil samples. Diphonix Resin (Eichrom Industries), a 1994 R and D 100 winner, is used to preconcentrate the actinides from large soil samples, which are bound powerfully to the resin's diphosphonic acid groups. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin, which effectively eliminates interfering matrix components from the soil matrix. The microwave-digestion technique is more effective and less tedious than catalyzed hydrogen peroxide digestions of the resin or digestion of diphosphonic stripping agents such as HEDPA. After resin digestion, the actinides are recovered in a small volume of nitric acid which can be loaded onto small extraction chromatography columns, such as TEVA Resin, U-TEVA Resin or TRU Resin (Eichrom Industries). Small, selective extraction columns do not generate large volumes of liquid waste and provide consistent tracer recoveries after soil matrix elimination.

  10. A comparison of two ozone sampling methods

    SciTech Connect

    Downey, E.B.; Buchan, R.M.; Blehm, K.D.; Gunter, B.J.

    1983-05-01

    A study was conducted to compare the alkaline potassium iodide (AKI) impinger method versus a direct-reading chemiluminescent monitor for determining ozone concentrations. Comparisons were made in both a controlled laboratory situation and in the field during MIG welding. Laboratory results indicated that the accuracy of the AKI procedure is affected by sample size. In the field, AKI impinger samples seemed to give very low estimations of the true ozone concentration. The direct-reading chemiluminescent monitor performed excellently in both the laboratory and field, and exhibited its merit as an industrial hygiene field instrument.

  11. Chemicals of emerging concern in water and bottom sediment in Great Lakes areas of concern, 2010 to 2011-Collection methods, analyses methods, quality assurance, and data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Foreman, William T.; Furlong, Edward T.; Smith, Steven G.

    2012-01-01

    The U.S. Geological Survey (USGS) cooperated with the U.S. Environmental Protection Agency and the U.S. Fish and Wildlife Service on a study to identify the occurrence of chemicals of emerging concern (CECs) in water and bottom-sediment samples collected during 2010–11 at sites in seven areas of concern (AOCs) throughout the Great Lakes. Study sites include tributaries to the Great Lakes in AOCs located near Duluth, Minn.; Green Bay, Wis.; Rochester, N.Y.; Detroit, Mich.; Toledo, Ohio; Milwaukee, Wis.; and Ashtabula, Ohio. This report documents the collection methods, analyses methods, quality-assurance data and analyses, and provides the data for this study. Water and bottom-sediment samples were analyzed at the USGS National Water Quality Laboratory in Denver, Colo., for a broad suite of CECs. During this study, 135 environmental and 23 field duplicate samples of surface water and wastewater effluent, 10 field blank water samples, and 11 field spike water samples were collected and analyzed. Sixty-one of the 69 wastewater indicator chemicals (laboratory method 4433) analyzed were detected at concentrations ranging from 0.002 to 11.2 micrograms per liter. Twenty-eight of the 48 pharmaceuticals (research method 8244) analyzed were detected at concentrations ranging from 0.0029 to 22.0 micrograms per liter. Ten of the 20 steroid hormones and sterols analyzed (research method 4434) were detected at concentrations ranging from 0.16 to 10,000 nanograms per liter. During this study, 75 environmental, 13 field duplicate samples, and 9 field spike samples of bottom sediment were collected and analyzed for a wide variety of CECs. Forty-seven of the 57 wastewater indicator chemicals (laboratory method 5433) analyzed were detected at concentrations ranging from 0.921 to 25,800 nanograms per gram. Seventeen of the 20 steroid hormones and sterols (research method 6434) analyzed were detected at concentrations ranging from 0.006 to 8,921 nanograms per gram. Twelve of

  12. Actinide Recovery Method for Large Soil Samples

    SciTech Connect

    Maxwell, S.L. III; Nichols, S.

    1998-11-01

    A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides in very large soil samples. Diphonix Resin® is used to eliminate soil matrix interferences and preconcentrate actinides after soil leaching or soil fusion. A rapid microwave digestion technique is used to remove the actinides from the Diphonix Resin®. After the resin digestion, the actinides are recovered in a small volume of nitric acid which can be easily loaded onto small extraction-chromatography columns, such as TEVA Resin®, U-TEVA Resin® or TRU Resin® (Eichrom Industries). This method enables the application of small, selective extraction columns to recover actinides from very large soil samples with high selectivity, consistent tracer recoveries, and minimal liquid waste.

  13. Methods for Sampling of Airborne Viruses

    PubMed Central

    Verreault, Daniel; Moineau, Sylvain; Duchaine, Caroline

    2008-01-01

    Summary: To better understand the underlying mechanisms of aerovirology, accurate sampling of airborne viruses is fundamental. The sampling instruments commonly used in aerobiology have also been used to recover viruses suspended in the air. We reviewed over 100 papers to evaluate the methods currently used for viral aerosol sampling. Differentiating infections caused by direct contact from those caused by airborne dissemination can be a very demanding task given the wide variety of sources of viral aerosols. While epidemiological data can help to determine the source of the contamination, direct data obtained from air samples can provide very useful information for risk assessment purposes. Many types of samplers have been used over the years, including liquid impingers, solid impactors, filters, electrostatic precipitators, and many others. The efficiencies of these samplers depend on a variety of environmental and methodological factors that can affect the integrity of the virus structure. The aerodynamic size distribution of the aerosol also has a direct effect on sampler efficiency. Viral aerosols can be studied under controlled laboratory conditions, using biological or nonbiological tracers and surrogate viruses, which are also discussed in this review. Lastly, general recommendations are made regarding future studies on the sampling of airborne viruses. PMID:18772283

  14. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA, on the 5' end.

  15. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  16. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    SciTech Connect

    Oldham, Mark; Thomas, Andrew; O'Daniel, Jennifer; Juang, Titania; Ibbott, Geoffrey; Adamovics, John; Kirkpatrick, John P.

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these are presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. 
Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient
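
The 3%, 2 mm gamma criterion used above combines a dose-difference tolerance and a distance-to-agreement (DTA) tolerance into a single index per point. A minimal 1D global-gamma sketch (real QA systems work on 2D/3D grids with sub-voxel interpolation; this simplified version is for illustration only):

```python
from math import exp, sqrt

def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=2.0):
    """Fraction of measured points with gamma <= 1, global normalization:
    gamma_i = min_j sqrt((dist_ij / dta)^2 + (dose_diff_ij / (dd * max(ref)))^2)."""
    d_norm = dd * max(ref)
    passed = 0
    for i, dm in enumerate(meas):
        g2 = min(((i - j) * spacing_mm / dta_mm) ** 2
                 + ((dm - dr) / d_norm) ** 2
                 for j, dr in enumerate(ref))
        if sqrt(g2) <= 1.0:
            passed += 1
    return passed / len(meas)

# Gaussian reference profile vs. the same profile shifted by 1 mm: every
# point finds agreement within the 2 mm DTA, so the pass rate is 100%.
ref = [100.0 * exp(-((i - 25) / 8.0) ** 2) for i in range(51)]
meas = [100.0 * exp(-((i - 26) / 8.0) ** 2) for i in range(51)]
rate = gamma_pass_rate(ref, meas, spacing_mm=1.0)  # 1.0
```

The global normalization to the maximum reference dose is one common choice; local-dose normalization, as used by some systems, is stricter in low-dose regions.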

  17. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  18. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  19. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent) that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes to avoid analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as a source of test compounds. Statistical tests of the comparability of methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.

  20. A Review of Quality Assurance Methods to Assist Professional Record Keeping: Implications for Providers of Interpersonal Violence Treatment

    PubMed Central

    Bradshaw, Kelsey M.; Donohue, Brad; Wilks, Chelsey

    2014-01-01

    Errors have been found to frequently occur in the management of case records within mental health service systems. In cases involving interpersonal violence, such errors have been found to negatively impact service implementation and lead to significant trauma and fatalities. In an effort to ensure adherence to specified standards of care, quality assurance programs (QA) have been developed to monitor and enhance service implementation. These programs have generally been successful in facilitating record management. However, these systems are rarely disseminated, and not well integrated. Therefore, within the context of interpersonal violence, we provide an extensive review of evidence supported record keeping practices, and methods to assist in assuring these practices are implemented with adherence. PMID:24976786

  1. System and Method for Isolation of Samples

    NASA Technical Reports Server (NTRS)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)

    2014-01-01

    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  2. Methods, quality assurance, and data for assessing atmospheric deposition of pesticides in the Central Valley of California

    USGS Publications Warehouse

    Zamora, Celia; Majewski, Michael S.; Foreman, William T.

    2013-01-01

    The U.S. Geological Survey monitored atmospheric deposition of pesticides in the Central Valley of California during two studies in 2001 and 2002–04. The 2001 study sampled wet deposition (rain) and storm-drain runoff in the Modesto, California, area during the orchard dormant-spray season to examine the contribution of pesticide concentrations to storm runoff from rainfall. In the 2002–04 study, the number and extent of collection sites in the Central Valley were increased to determine the areal distribution of organophosphate insecticides and other pesticides, and five more sample types were collected: dry deposition, bulk deposition, and three sample types collected from a soil box (aqueous phase in runoff, suspended sediment in runoff, and surficial-soil samples). This report provides concentration data and describes methods and quality assurance of sample collection and laboratory analysis for pesticide compounds in all samples collected from 16 sites. Each sample was analyzed for 41 currently used pesticides and 23 pesticide degradates, including oxygen analogs (oxons) of 9 organophosphate insecticides. Analytical results are presented by sample type and study period. The median concentrations of both chlorpyrifos and diazinon sampled at four urban (0.067 micrograms per liter [μg/L] and 0.515 μg/L, respectively) and four agricultural sites (0.079 μg/L and 0.583 μg/L, respectively) during a January 2001 storm event in and around Modesto, Calif., were nearly identical, indicating that the overall atmospheric burden in the region was fairly similar during the sampling event. Comparisons of median concentrations in the rainfall to those in the McHenry storm-drain runoff showed that, for some compounds, rainfall contributed a substantial percentage of the concentration in the runoff; for other compounds, the concentrations in rainfall were much greater than in the runoff. For example, diazinon concentrations in rainfall were about

  3. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    PubMed

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to standardize the quality assurance and quality control methods for background CO2 sampling and analysis. Drawing on the greenhouse gas sampling and observation experience of the CMA, and taking portable sampling and WS-CRDS analysis as an example, this paper systematically introduces the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass-bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, as well as the data-grading quality markers and the data fitting and interpolation method. Finally, using this method, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that the method can capture the influences of regional and local environmental factors on the observation results and reflect the characteristics of natural and human activities in an objective and accurate way.
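
    The data-grading step described above can be illustrated with a small sketch. This is not the CMA procedure; the function name, the running-median criterion, and the 2 ppm threshold are illustrative assumptions for separating "background" records from locally influenced ones.

```python
import statistics

def grade_co2_records(values, window=5, max_dev=2.0):
    """Assign a quality grade to each CO2 record (in ppm): 'background' if it
    lies within max_dev ppm of the running median of its neighbourhood,
    otherwise 'local' (possibly influenced by local sources or sinks).

    Illustrative sketch only; real data-grading schemes use more criteria.
    """
    grades = []
    for i, v in enumerate(values):
        lo = max(0, i - window // 2)
        hi = min(len(values), i + window // 2 + 1)
        med = statistics.median(values[lo:hi])
        grades.append("background" if abs(v - med) <= max_dev else "local")
    return grades
```

    For example, `grade_co2_records([400.0, 400.5, 401.0, 410.0, 400.8, 400.2, 400.6])` grades only the 410 ppm record as "local".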

  4. A robust adaptive sampling method for faster acquisition of MR images.

    PubMed

    Vellagoundar, Jaganathan; Machireddy, Ramasubba Reddy

    2015-06-01

    A robust adaptive k-space sampling method is proposed for faster acquisition and reconstruction of MR images. In this method, undersampling patterns are generated based on the magnitude profile of a fully acquired 2-D k-space data set. Images are reconstructed using a compressive sampling reconstruction algorithm. Simulation experiments were done to assess the performance of the proposed method under various signal-to-noise ratio (SNR) levels. The performance of the method is better than that of a non-adaptive variable-density sampling method when k-space SNR is greater than 10 dB. The method was implemented on fully acquired multi-slice raw k-space data and on quality assurance phantom data. Data reduction of up to 60% is achieved in the multi-slice imaging data and 75% in the phantom imaging data. The results show that reconstruction accuracy is improved over the non-adaptive, conventional variable-density sampling method. The proposed sampling method is signal dependent, and the estimation of sampling locations is robust to noise. As a result, it eliminates the need for a mathematical model and parameter tuning to compute k-space sampling patterns, as required in non-adaptive sampling methods.
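
    A minimal sketch of a magnitude-driven undersampling pattern follows. The paper's method is paired with compressive sampling reconstruction and operates on 2-D k-space; this simplified 1-D version just keeps the largest-magnitude locations, and the function name and the keep-the-top-fraction rule are assumptions, not the published algorithm.

```python
def adaptive_sampling_mask(magnitude, keep_fraction):
    """Build a boolean undersampling mask that keeps the keep_fraction of
    k-space locations with the largest magnitude. Because the pattern is
    derived from the data itself, no density model or hand-tuned
    parameters are needed (the signal-dependent idea in the abstract)."""
    n_keep = max(1, round(keep_fraction * len(magnitude)))
    # Indices sorted by descending magnitude; ties keep original order.
    order = sorted(range(len(magnitude)), key=lambda i: magnitude[i], reverse=True)
    keep = set(order[:n_keep])
    return [i in keep for i in range(len(magnitude))]
```

    A 75% data reduction, as reported for the phantom data, would correspond to `keep_fraction=0.25` in this sketch.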

  5. Quality Assurance.

    ERIC Educational Resources Information Center

    Massachusetts Career Development Inst., Springfield.

    This booklet is one of six texts from a workplace literacy curriculum designed to assist learners in facing the increased demands of the workplace. The booklet contains five sections that cover the following topics: (1) importance of reliability; (2) meaning of quality assurance; (3) historical development of quality assurance; (4) statistical…

  6. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    USGS Publications Warehouse

    Wegner, S.J.

    1989-01-01

    Multiple water samples from 115 wells and 3 surface-water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates, and blank-sample analyses because analytical uncertainties were not consistently reported; however, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement of 95% between all usable data pairs. (USGS)
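
    One common way to operationalize "agreement" between paired laboratory results, assuming each laboratory reports an analytical uncertainty, is to compare the difference against the combined expanded uncertainty. The sketch below is illustrative and is not the USGS procedure; the function names and the coverage factor k = 2 are assumptions.

```python
def results_agree(x1, u1, x2, u2, k=2.0):
    """Two reported results agree when their difference is no larger than
    the combined expanded uncertainty (coverage factor k). This is why
    agreement cannot be computed when uncertainties are not reported."""
    return abs(x1 - x2) <= k * (u1 ** 2 + u2 ** 2) ** 0.5

def agreement_rate(pairs, k=2.0):
    """Fraction of result pairs (x1, u1, x2, u2) that agree."""
    agreed = sum(1 for p in pairs if results_agree(*p, k=k))
    return agreed / len(pairs)
```

    A median of per-constituent agreement rates computed this way would correspond to the 95% summary figure quoted in the abstract.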

  7. Medicaid Program; Methods for Assuring Access to Covered Medicaid Services. Final rule with comment period.

    PubMed

    2015-11-01

    This final rule with comment period provides for a transparent data-driven process for states to document whether Medicaid payments are sufficient to enlist providers to assure beneficiary access to covered care and services consistent with section 1902(a)(30)(A) of the Social Security Act (the Act) and to address issues raised by that process. The final rule with comment period also recognizes electronic publication as an optional means of providing public notice of proposed changes in rates or ratesetting methodologies that the state intends to include in a Medicaid state plan amendment (SPA). We are providing an opportunity for comment on whether future adjustments would be warranted to the provisions setting forth requirements for ongoing state reviews of beneficiary access. PMID:26524772

  8. Conflict Prevention and Separation Assurance Method in the Small Aircraft Transportation System

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Carreno, Victor A.; Williams, Daniel M.; Munoz, Cesar

    2005-01-01

    A multilayer approach to the prevention of conflicts due to the loss of aircraft-to-aircraft separation, which relies on procedures and on-board automation, was implemented as part of the SATS HVO Concept of Operations. The multilayer system gives pilots support and guidance during the execution of normal operations and advance warning for procedure deviations or off-nominal operations. This paper describes the major concept elements of this multilayer approach to separation assurance and conflict prevention and provides the rationale for its design. All the algorithms and functionality described in this paper have been implemented in an aircraft simulation in the NASA Langley Research Center's Air Traffic Operations Lab and on the NASA Cirrus SR22 research aircraft.

  10. Guidance for characterizing explosives contaminated soils: Sampling and selecting on-site analytical methods

    SciTech Connect

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.

    1996-09-01

    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling due to the detonation potential. Characterization of explosives-contaminated sites is particularly difficult due to the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist, including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of samples, and extracting larger samples. On-site analytical methods are essential to more economical and improved characterization. On-site methods might suffer in terms of precision and accuracy, but this is more than offset by the increased number of samples that can be run. While verification using a standard analytical procedure should be part of any quality assurance program, reducing the number of samples analyzed by the more expensive methods can result in significantly reduced costs. Often 70 to 90% of the soil samples analyzed during an explosives site investigation do not contain detectable levels of contamination. Two basic types of on-site analytical methods are in wide use for explosives in soil: colorimetric and immunoassay. Colorimetric methods generally detect broad classes of compounds such as nitroaromatics or nitramines, while immunoassay methods are more compound specific. Since TNT or RDX is usually present in explosives-contaminated soils, the use of procedures designed to detect only these or similar compounds can be very effective.

  11. Quality assurance of metabolomics.

    PubMed

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas

    2015-01-01

    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, and metabolite identification to bioinformatics data-mining - is urgently needed to assure both the quality of metabolomics data and the reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in the safety sciences, and even proper scientific use of these technologies, demands quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals of the workshop were (1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology, and (2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools.

  12. Clustered lot quality assurance sampling: a tool to monitor immunization coverage rapidly during a national yellow fever and polio vaccination campaign in Cameroon, May 2009.

    PubMed

    Pezzoli, L; Tchio, R; Dzossa, A D; Ndjomo, S; Takeu, A; Anya, B; Ticha, J; Ronveaux, O; Lewis, R F

    2012-01-01

    We used the clustered lot quality assurance sampling (clustered-LQAS) technique to identify districts with low immunization coverage and guide mop-up actions during the last 4 days of a combined oral polio vaccine (OPV) and yellow fever (YF) vaccination campaign conducted in Cameroon in May 2009. We monitored 17 pre-selected districts at risk for low coverage. We designed LQAS plans to reject districts with YF vaccination coverage <90% and with OPV coverage <95%. In each lot the sample size was 50 (five clusters of 10) with decision values of 3 for assessing OPV and 7 for YF coverage. We 'rejected' 10 districts for low YF coverage and 14 for low OPV coverage. Hence we recommended a 2-day extension of the campaign. Clustered-LQAS proved to be useful in guiding the campaign vaccination strategy before the completion of the operations.
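
    The lot classification rule lends itself to a compact sketch. Assuming the usual LQAS reading of the decision value (reject the lot, i.e. flag the district for mop-up, when the count of unvaccinated children in the 50-child sample exceeds it), the decision looks like this; the function name is hypothetical and the rule interpretation is an assumption, not the authors' exact protocol.

```python
def lqas_classify(sample, decision_value):
    """Classify a lot (district): 'reject' when the number of unvaccinated
    children in the sample exceeds the decision value, else 'accept'.
    In the campaign described, n = 50 (five clusters of 10), with
    decision values 7 for yellow fever and 3 for OPV."""
    unvaccinated = sum(1 for vaccinated in sample if not vaccinated)
    return "reject" if unvaccinated > decision_value else "accept"
```

    With the yellow fever plan (decision value 7), a sample with 8 unvaccinated children rejects the district; with the stricter OPV plan (decision value 3), 4 unvaccinated children suffice.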

  13. System and method for extracting a sample from a surface

    DOEpatents

    Van Berkel, Gary; Covey, Thomas

    2015-06-23

    A system and method is disclosed for extracting a sample from a sample surface. A sample is provided and a sample surface receives the sample which is deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid on the sample, the liquid dissolving the sample to form a dissolved sample material, and the one or more devices are configured to extract the dissolved sample material from the sample surface.

  14. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  15. Transuranic Waste Characterization Quality Assurance Program Plan

    SciTech Connect

    1995-04-30

    This quality assurance plan identifies the data necessary, and techniques designed to attain the required quality, to meet the specific data quality objectives associated with the DOE Waste Isolation Pilot Plant (WIPP). This report specifies sampling, waste testing, and analytical methods for transuranic wastes.

  16. Quality assurance in the pre-analytical phase of human urine samples by (1)H NMR spectroscopy.

    PubMed

    Budde, Kathrin; Gök, Ömer-Necmi; Pietzner, Maik; Meisinger, Christine; Leitzmann, Michael; Nauck, Matthias; Köttgen, Anna; Friedrich, Nele

    2016-01-01

    Metabolomic approaches investigate changes in metabolite profiles, which may reflect changes in metabolic pathways and provide information correlated with a specific biological process or pathophysiology. High-resolution (1)H NMR spectroscopy is used to identify metabolites in biofluids and tissue samples qualitatively and quantitatively. This pre-analytical study evaluated the effects of storage time and temperature on (1)H NMR spectra from human urine in two settings: first, to evaluate short-term effects likely caused by delays in sample handling, and second, to evaluate the effect of prolonged storage of up to one month in order to find markers of sample mishandling. A number of statistical procedures were used to assess the differences between samples stored under different conditions, including Projection to Latent Structures Discriminant Analysis (PLS-DA), non-parametric testing, and mixed-effect linear regression analysis. The results indicate that human urine samples can be stored at 10 °C for 24 h or at -80 °C for 1 month, as no relevant changes in (1)H NMR fingerprints were observed during these time periods and temperature conditions. However, some metabolites, most likely of microbial origin, showed alterations during prolonged storage, but without facilitating classification. In conclusion, the presented protocol for urine sample handling and semi-automatic metabolite quantification is suitable for large-scale epidemiological studies. PMID:26264917

  17. Potassium bromide method of infrared sampling

    USGS Publications Warehouse

    Milkey, R.G.

    1958-01-01

    In the preparation of potassium bromide pressed windows for use in the infrared analysis of solids, severe grinding of the potassium bromide powder may produce strong absorption bands that could interfere seriously with the spectra of the sample. These absorption bands appear to be due to some crystal alteration of the potassium bromide as a result of the grinding process. They were less apt to occur when the coarser powder, which had received a relatively gentle grinding, was used. Window blanks prepared from the coarser powders showed smaller adsorbed-water peaks and generally higher overall transmittance readings than windows pressed from the very fine powders.

  18. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle to finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization's Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested: the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples' statistical properties. PMID:22582004
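
    The proximity-selection mechanism tested in the simulations can be sketched in one dimension. This is a simplification of EPI sampling (which walks to nearest households from a random starting point), not the authors' simulation code; the function names and the 1-D geometry are assumptions.

```python
import random

def epi_proximity_sample(locations, n, rng):
    """EPI-style proximity selection, simplified to 1-D: start at a random
    unit, then repeatedly visit the nearest not-yet-sampled unit. Units
    that cluster together tend to be sampled together, which is the
    source of the method's potential bias."""
    current = rng.randrange(len(locations))
    chosen = [current]
    remaining = set(range(len(locations))) - {current}
    while len(chosen) < n and remaining:
        here = locations[chosen[-1]]
        nxt = min(remaining, key=lambda i: abs(locations[i] - here))
        chosen.append(nxt)
        remaining.remove(nxt)
    return chosen

def probability_sample(locations, n, rng):
    """Simple random sample of n units: the probability-sampling benchmark."""
    return rng.sample(range(len(locations)), n)
```

    With units in two distant clusters, a proximity sample of size 3 stays inside one cluster, whereas a simple random sample can span both; this is the intuition behind comparing the two designs' statistical properties.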

  19. A method for sampling waste corn

    USGS Publications Warehouse

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability is essential to studies of food density and the food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) are discussed.

  20. [Quality assurance in food microbiology laboratories].

    PubMed

    Cwiek-Ludwicka, K; Windyga, B; Karłowski, K

    1996-01-01

    In this paper, the quality assurance system used in food microbiology laboratories to ensure the reliability of analytical data is discussed. To introduce a quality assurance system in the laboratory, all activities that affect the results must be documented and controlled: sampling, method selection, laboratory environment, equipment, reagents and media, staff, reference materials, internal quality control, and external quality control (proficiency testing). The kind of food sample, the conditions and time of storage before analysis, and the proper selection of methodology have a significant influence on the result of the microbiological analysis. Equipment used to carry out the tests must work properly. Implementation of internal and external quality control in the routine work of the food microbiology laboratory means that the production of results is under control and that the data are reliable. A properly implemented and well-documented quality assurance system provides the basis for the laboratory to obtain accreditation.

  1. Evaluation of Sampling Methods and Development of Sample Plans for Estimating Predator Densities in Cotton

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The cost-reliability of five sampling methods (visual search, drop cloth, beat bucket, shake bucket and sweep net) was determined for predatory arthropods on cotton plants. The beat bucket sample method was the most cost-reliable while the visual sample method was the least cost-reliable. The beat ...

  2. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontally configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This loading method, used in conjunction with horizontal microchannels, allows much-reduced sample volumes and provides a means of sample stacking to greatly reduce the amount of sample required. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of this method lies in the preparation of the input of the separation channel, the physical sample introduction, and the subsequent removal of excess material. By this method, sample volumes of 100 nanoliters to 2 microliters have been used successfully, compared to the typical 5 microliters of sample required by the prior separation/analysis method.

  3. Quality assurance of solar spectral UV-measurements: methods and use of the SHICrivm software tool

    NASA Astrophysics Data System (ADS)

    Williams, J. E.; den Outer, P. N.; Slaper, H.

    2003-04-01

    Ground-based UV-irradiance measurements are crucial for determining the long-term changes and trends in biologically and/or photo-chemically relevant solar UV-radiation reaching the Earth's surface. Such changes in UV-radiation levels have probably occurred and/or are expected due to ozone depletion and climate change. In order to analyse UV-irradiance levels in relation to atmospheric parameters and to facilitate an assessment of the European UV-climate, a European database (EUVDatabase) has been set up within the EDUCE project (EC contract EVK2-CT-1999-00028). High-quality UV data sets from across the continent are accessible from the EUVDatabase (http://uv.fmi.fi/uvdb/). An accurate analysis of the UV-climate and long-term changes therein requires quality assurance of the spectral data. The SHICrivm software tool (http://www.rivm.nl/shicrivm) was developed to analyse several quality aspects of measured UV-spectra. The SHICrivm tool has been applied to over one million spectra from the EUVDatabase and detects, for each measured spectrum: the accuracy of the wavelength calibration from 290 up to 500 nm, the lowest detectable irradiance level, the occurrence of non-natural spikes, deviations in spectral shape, and possible irradiance-scale errors in the UV range. In addition, the SHICrivm package can be used to correct wavelength-scale errors and non-natural spectral spikes. A deconvolution and convolution algorithm is included to improve the comparability of spectra obtained with different instruments, and to allow a fully comparable analysis of biologically weighted UV doses for instruments with various spectral characteristics. Within the context of the EDUCE project, data from over 20 UV-monitoring stations were retrieved from the database and a quality assessment was performed using the SHICrivm tool. The quality parameters are presented by means of a simple scheme of coloured quality flags. Spectra that meet the WMO criteria for spectral measurements are
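
    Spike detection of the kind SHICrivm performs can be illustrated with a toy rule: flag any point that towers over its spectral neighbours. SHICrivm's actual criteria are more sophisticated; the function name and the factor-of-five threshold here are assumptions for illustration only.

```python
def flag_spikes(irradiance, factor=5.0):
    """Flag candidate non-natural spikes in a measured spectrum: a point is
    flagged when it exceeds `factor` times the mean of its two neighbours.
    Endpoints have only one neighbour and are never flagged."""
    flags = [False] * len(irradiance)
    for i in range(1, len(irradiance) - 1):
        neighbour_mean = 0.5 * (irradiance[i - 1] + irradiance[i + 1])
        if neighbour_mean > 0 and irradiance[i] > factor * neighbour_mean:
            flags[i] = True
    return flags
```

    A flagged point could then either be marked with a quality flag or replaced by interpolation from its neighbours, analogous to the correction capability described for the SHICrivm package.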

  4. U.S. Geological Survey nutrient preservation experiment; nutrient concentration data for surface-, ground-, and municipal-supply water samples and quality-assurance samples

    USGS Publications Warehouse

    Patton, Charles J.; Truitt, Earl P.

    1995-01-01

    This report is a compilation of analytical results from a study conducted at the U.S. Geological Survey National Water Quality Laboratory (NWQL) in 1992 to assess the effectiveness of three field-treatment protocols to stabilize nutrient concentrations in water samples stored for about 1 month at 4°C. The field treatments tested were chilling; adjusting sample pH to less than 2 with sulfuric acid and chilling; and adding 52 milligrams of mercury(II) chloride per liter of sample and chilling. Field treatments of samples collected for determination of ammonium, nitrate plus nitrite, nitrite, dissolved Kjeldahl nitrogen, orthophosphate, and dissolved phosphorus included 0.45-micrometer membrane filtration. Only total Kjeldahl nitrogen and total phosphorus were determined in unfiltered samples. Data reported here pertain to water samples collected in April and May 1992 from 15 sites within the continental United States. Also included in this report are analytical results for nutrient concentrations in synthetic reference samples that were analyzed concurrently with the real samples.

  5. Quality assurance.

    PubMed

    Hannan, T

    1991-01-01

    Modern health care is changing--we have more chronic disease, increasing demands for documentation from government and legal bodies, and greater emphasis on disease screening and prevention. Quality assurance is important in developing health care services to meet these changing needs because it provides standards by which we can measure the activities involved in the delivery of health care, and quality assurance programmes are more likely to ensure that the predefined standards of health care are being met. This paper provides: (i) an acceptable definition of quality assurance (QA); (ii) an explanation of why we need it; (iii) evidence that the medical decision-making process is failing under the modern technological advances; and (iv) guidelines for meeting future health care standards by using the modern technological tools of computers and computer software to support the beleaguered clinical decision-making process.

  6. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  7. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  8. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  9. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  10. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  11. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  12. Time efficient methods for scanning a fluorescent membrane with a fluorescent microscopic imager for the quality assurance of food

    NASA Astrophysics Data System (ADS)

    Lerm, Steffen; Holder, Silvio; Schellhorn, Mathias; Brückner, Peter; Linß, Gerhard

    2013-05-01

    An important part of the quality assurance of meat is the estimation of germs in the meat exudate. The kind and the number of germs in the meat affect the medical risk for the consumer of the meat. State-of-the-art analyses of meat are incubator test procedures. The main disadvantages of such incubator tests are the time consumption, the necessary equipment and the need for specially skilled employees. These facts result in high inspection costs. For this reason a new method for quality assurance is necessary, one that combines low detection limits with reduced time consumption. One approach for such a new method is fluorescence microscopic imaging. The germs in the meat exudate are caught in special membranes by antigen-antibody reactions. The germ-typical signature can be enhanced with fluorescent chemical markers instead of culturing the germs. Each fluorescent marker either binds to a free germ or runs off the membrane. An image processing system is used to detect the number of fluorescent particles; each fluorescent spot should be a marker bound to a germ. Because of the small size of germs, the image processing system needs a high optical magnification of the camera. However, this leads to a small field of view and a small depth of focus. For these reasons the whole area of the membrane has to be scanned in three dimensions. To minimize the time consumption, the optimal path has to be found. This optimization problem is influenced by features of the hardware and is presented in this paper. The traversing range in each direction, the step width, the velocity, the shape of the inspection volume and the field of view all influence the optimal path for scanning the membrane.
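
    The serpentine (boustrophedon) raster is a common baseline for this kind of scan-path problem, because reversing alternate rows removes the fly-back travel of a plain raster. A minimal sketch under assumed hardware parameters (function names and units are illustrative, not from the paper):

    ```python
    def serpentine_scan(width_mm, height_mm, fov_mm, step_mm):
        """Generate a boustrophedon raster path covering a width_mm x height_mm
        membrane, advancing one field of view (fov_mm) per row and stepping
        step_mm between positions along each row.  Returns (x, y) stage stops."""
        positions = []
        y, row = 0.0, 0
        while y < height_mm:
            xs = []
            x = 0.0
            while x < width_mm:
                xs.append(round(x, 6))
                x += step_mm
            if row % 2 == 1:          # reverse every other row: no fly-back travel
                xs.reverse()
            positions.extend((xv, round(y, 6)) for xv in xs)
            y += fov_mm
            row += 1
        return positions

    def travel_length(path):
        """Total XY travel distance, a proxy for scan time at constant velocity."""
        return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (x1, y1), (x2, y2) in zip(path, path[1:]))
    ```

    Comparing `travel_length` for a serpentine path against a plain left-to-right raster over the same grid shows the saving that row reversal buys; the real optimization would add the z (focus) axis and the hardware constraints named in the abstract.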

  13. Microfluidic DNA sample preparation method and device

    DOEpatents

    Krulevitch, Peter A.; Miles, Robin R.; Wang, Xiao-Bo; Mariella, Raymond P.; Gascoyne, Peter R. C.; Balch, Joseph W.

    2002-01-01

    Manipulation of DNA molecules in solution has become an essential aspect of genetic analyses used for biomedical assays, the identification of hazardous bacterial agents, and in decoding the human genome. Currently, most of the steps involved in preparing a DNA sample for analysis are performed manually and are time, labor, and equipment intensive. These steps include extraction of the DNA from spores or cells, separation of the DNA from other particles and molecules in the solution (e.g. dust, smoke, cell/spore debris, and proteins), and separation of the DNA itself into strands of specific lengths. Dielectrophoresis (DEP), a phenomenon whereby polarizable particles move in response to a gradient in electric field, can be used to manipulate and separate DNA in an automated fashion, considerably reducing the time and expense involved in DNA analyses, as well as allowing for the miniaturization of DNA analysis instruments. These applications include direct transport of DNA, trapping of DNA to allow for its separation from other particles or molecules in the solution, and the separation of DNA into strands of varying lengths.
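
    The DEP behavior described here is conventionally modelled with the Clausius-Mossotti factor, whose sign determines whether a polarizable particle is drawn toward high-field regions (positive DEP) or pushed away (negative DEP). A sketch of the standard dipole-approximation formula (a textbook model, not taken from the patent itself):

    ```python
    import math

    def clausius_mossotti(eps_p, sigma_p, eps_m, sigma_m, freq_hz):
        """Real part of the Clausius-Mossotti factor Re[K(omega)] for a
        particle (eps_p, sigma_p) in a medium (eps_m, sigma_m) at freq_hz.
        Uses complex permittivities eps* = eps - j*sigma/omega; positive
        values mean positive DEP, negative values mean negative DEP."""
        omega = 2 * math.pi * freq_hz
        ep = complex(eps_p, -sigma_p / omega)
        em = complex(eps_m, -sigma_m / omega)
        return ((ep - em) / (ep + 2 * em)).real
    ```

    Re[K] is bounded between -0.5 and 1.0, which is why sweeping the drive frequency lets a device trap one particle type while releasing another.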

  14. Quality assurance and quality control in light stable isotope laboratories: a case study of Rio Grande, Texas, water samples.

    PubMed

    Coplen, Tyler B; Qi, Haiping

    2009-06-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition within analytical uncertainty for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) for each medium to be analysed, preparing two local reference materials substantially different in isotopic composition to encompass the range in isotopic composition expected in the laboratory, and calibrating them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing among sample unknowns local laboratory isotopic reference materials daily (internationally distributed isotopic reference materials can be ordered at three-year intervals, and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material) - this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, (8) providing an audit trail in the laboratory for analytical results - this trail commonly will be in electronic format and might include a laboratory

  16. Quality assurance and quality control in light stable isotope laboratories: A case study of Rio Grande, Texas, water samples

    USGS Publications Warehouse

    Coplen, T.B.; Qi, H.

    2009-01-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition within analytical uncertainty for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) for each medium to be analysed, preparing two local reference materials substantially different in isotopic composition to encompass the range in isotopic composition expected in the laboratory, and calibrating them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing among sample unknowns local laboratory isotopic reference materials daily (internationally distributed isotopic reference materials can be ordered at three-year intervals, and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material) - this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, (8) providing an audit trail in the laboratory for analytical results - this trail commonly will be in electronic format and might include a laboratory
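
    Step (6), normalisation to two calibrated local reference materials, is in practice a linear shift-and-stretch of the measured delta values. A minimal sketch (function and variable names are illustrative, not from the paper):

    ```python
    def normalize_delta(delta_measured, low_meas, low_true, high_meas, high_true):
        """Two-point normalisation of a measured delta value (per mil) to the
        international scale: low/high are the measured and calibrated ('true')
        values of two local reference materials bracketing the unknowns."""
        slope = (high_true - low_true) / (high_meas - low_meas)
        return low_true + (delta_measured - low_meas) * slope
    ```

    With hypothetical values, a reference measured at -400.0 per mil and calibrated at -428.0 maps exactly to -428.0, and unknowns between the two anchors are rescaled proportionally.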

  17. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    PubMed

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-01

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods, when possible, and alternate methodologies only after careful consideration with method validation studies; (2) lot control and prescreening of sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping procedures to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.

  18. Method Description, Quality Assurance, Environmental Data, and other Information for Analysis of Pharmaceuticals in Wastewater-Treatment-Plant Effluents, Streamwater, and Reservoirs, 2004-2009

    USGS Publications Warehouse

    Phillips, Patrick J.; Smith, Steven G.; Kolpin, Dana W.; Zaugg, Steven D.; Buxton, Herbert T.; Furlong, Edward T.

    2010-01-01

    Wastewater-treatment-plant (WWTP) effluents are a demonstrated source of pharmaceuticals to the environment. During 2004-09, a study was conducted to identify pharmaceutical compounds in effluents from WWTPs (including two that receive substantial discharges from pharmaceutical formulation facilities), streamwater, and reservoirs. The methods used to determine and quantify concentrations of seven pharmaceuticals are described. In addition, the report includes information on pharmaceuticals formulated or potentially formulated at the two pharmaceutical formulation facilities that provide substantial discharge to two of the WWTPs, and potential limitations to these data are discussed. The analytical methods used to provide data on the seven pharmaceuticals (including opioids, muscle relaxants, and other pharmaceuticals) in filtered water samples also are described. Data are provided on method performance, including spike data, method detection limit results, and an estimation of precision. Quality-assurance data for sample collection and handling are included. Quantitative data are presented for the seven pharmaceuticals in water samples collected at WWTP discharge points, from streams, and at reservoirs. Occurrence data also are provided for 19 pharmaceuticals that were qualitatively identified. Flow data at selected WWTPs and streams are presented. During 2004-09, 35-38 effluent samples were collected from each of three WWTPs in New York and analyzed for seven pharmaceuticals. Two WWTPs (NY2 and NY3) receive substantial inflows (greater than 20 percent of plant flow) from pharmaceutical formulation facilities (PFF) and one (NY1) receives no PFF flow. Samples of effluents from 23 WWTPs across the United States were analyzed once for these pharmaceuticals as part of a national survey. Maximum pharmaceutical effluent concentrations for the national survey and NY1 effluent samples were generally less than 1 ug/L. Four pharmaceuticals (methadone, oxycodone

  19. Understanding and Evaluating Assurance Cases

    NASA Technical Reports Server (NTRS)

    Rushby, John; Xu, Xidong; Rangarajan, Murali; Weaver, Thomas L.

    2015-01-01

    Assurance cases are a method for providing assurance for a system by giving an argument to justify a claim about the system, based on evidence about its design, development, and tested behavior. In comparison with assurance based on guidelines or standards (which essentially specify only the evidence to be produced), the chief novelty in assurance cases is provision of an explicit argument. In principle, this can allow assurance cases to be more finely tuned to the specific circumstances of the system, and more agile than guidelines in adapting to new techniques and applications. The first part of this report (Sections 1-4) provides an introduction to assurance cases. Although this material should be accessible to all those with an interest in these topics, the examples focus on software for airborne systems, traditionally assured using the DO-178C guidelines and its predecessors. A brief survey of some existing assurance cases is provided in Section 5. The second part (Section 6) considers the criteria, methods, and tools that may be used to evaluate whether an assurance case provides sufficient confidence that a particular system or service is fit for its intended use. An assurance case cannot provide unequivocal "proof" for its claim, so much of the discussion focuses on the interpretation of such less-than-definitive arguments, and on methods to counteract confirmation bias and other fallibilities in human reasoning.

  20. Method for sampling sub-micron particles

    DOEpatents

    Gay, Don D.; McMillan, William G.

    1985-01-01

    Apparatus and method steps for collecting sub-micron sized particles include a collection chamber and cryogenic cooling. The cooling is accomplished by coil tubing carrying nitrogen in liquid form, with the liquid nitrogen changing to the gas phase before exiting from the collection chamber in the tubing. Standard filters are used to filter out particles of diameter greater than or equal to 0.3 microns; however the present invention is used to trap particles of less than 0.3 micron in diameter. A blower draws air to said collection chamber through a filter which filters particles with diameters greater than or equal to 0.3 micron. The air is then cryogenically cooled so that moisture and sub-micron sized particles in the air condense into ice on the coil. The coil is then heated so that the ice melts, and the liquid is then drawn off and passed through a Buchner funnel where the liquid is passed through a Nuclepore membrane. A vacuum draws the liquid through the Nuclepore membrane, with the Nuclepore membrane trapping sub-micron sized particles therein. The Nuclepore membrane is then covered on its top and bottom surfaces with sheets of Mylar® and the assembly is then crushed into a pellet. This effectively traps the sub-micron sized particles for later analysis.

  1. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W [West Richland, WA; Wise, Barry M [Manson, WA

    2002-01-01

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  2. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W.; Wise, Barry M.

    2003-08-12

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  3. SU-E-T-438: Commissioning of An In-Vivo Quality Assurance Method Using the Electronic Portal Imaging Device

    SciTech Connect

    Morin, O; Held, M; Pouliot, J

    2014-06-01

    Purpose: Patient-specific pre-treatment quality assurance (QA) using arrays of detectors or film has been the standard approach to assure that the correct treatment is delivered to the patient. This QA approach is expensive, labor intensive and does not guarantee or document that all remaining fractions were treated properly. The purpose of this abstract is to commission and evaluate the performance of a commercially available in-vivo QA software package that uses the electronic portal imaging device (EPID) to record the daily treatments. Methods: The platform EPIgray V2.0.2 (Dosisoft), whose machine model compares ratios of TMR with EPID signal to predict dose, was commissioned for an Artiste (Siemens Oncology Care Systems) and a Truebeam (Varian Medical Systems) linear accelerator following the given instructions. The system was then tested on three different phantoms (a homogeneous stack of solid water, an anthropomorphic head, and a pelvis) and on a library of patient cases. Simple and complex fields were delivered at different exposures and different gantry angles. The effects of table attenuation and EPID sagging were evaluated. Gamma analysis compared the measured dose to the predicted dose for complex clinical IMRT cases. Results: Commissioning of the EPIgray system for two photon energies took 8 hours. The difference between the planned dose and the dose measured with EPIgray was better than 3% for all phantom scenarios tested. Preliminary results on patients demonstrate that an accuracy of 5% is achievable in high-dose regions for both 3DCRT and IMRT. Large discrepancies (>5%) were observed near metallic structures or air cavities and in low-dose areas. Flat-panel sagging was visible and accounted for in the EPIgray model. Conclusion: The accuracy achieved by EPIgray is sufficient to document the safe delivery of complex IMRT treatments. Future work will evaluate EPIgray for VMAT and high-dose-rate deliveries.
This work is supported by Dosisoft, Cachan, France.
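
    The gamma analysis used in such comparisons combines a dose-difference tolerance with a distance-to-agreement (DTA) tolerance. A simplified 1-D sketch of the standard gamma index (global normalisation; the 3%/3 mm defaults are conventional choices, and nothing here reflects EPIgray's internals):

    ```python
    import math

    def gamma_index(ref_positions, ref_doses, meas_positions, meas_doses,
                    dose_tol=0.03, dta_tol=3.0):
        """1-D gamma analysis: for each measured point, search all reference
        points for the minimum combined dose-difference / DTA metric.
        dose_tol is fractional of the global maximum, dta_tol in mm.
        Returns per-point gamma values; gamma <= 1 means the point passes."""
        dose_norm = dose_tol * max(ref_doses)
        gammas = []
        for xm, dm in zip(meas_positions, meas_doses):
            best = min(math.sqrt(((xr - xm) / dta_tol) ** 2
                                 + ((dr - dm) / dose_norm) ** 2)
                       for xr, dr in zip(ref_positions, ref_doses))
            gammas.append(best)
        return gammas

    def pass_rate(gammas):
        """Fraction of points with gamma <= 1."""
        return sum(g <= 1.0 for g in gammas) / len(gammas)
    ```

    Identical measured and predicted profiles give gamma = 0 everywhere, while an isolated 10% dose error at matching positions fails the 3%/3 mm criterion.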

  4. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study; they include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of the research literature and provide a reference point for nurses who engage in cardiovascular research.
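
    Three of the probability sampling methods listed above can be illustrated with short stdlib sketches (the populations, stratum names, and sample sizes are hypothetical):

    ```python
    import random

    def simple_random_sample(population, n, seed=0):
        """Every element has an equal, independent chance of selection."""
        rng = random.Random(seed)
        return rng.sample(population, n)

    def systematic_sample(population, n):
        """Select every k-th element from a fixed start, with k = N // n."""
        k = len(population) // n
        return population[::k][:n]

    def stratified_sample(strata, per_stratum, seed=0):
        """Draw a simple random sample of the same size from each stratum,
        e.g. strata = {'inpatient': [...], 'outpatient': [...]}."""
        rng = random.Random(seed)
        return {name: rng.sample(members, per_stratum)
                for name, members in strata.items()}
    ```

    Cluster and multi-stage sampling extend the same idea: randomly select whole clusters (e.g. clinics), then optionally sample participants within the chosen clusters.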

  5. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  6. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  7. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  8. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  9. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  10. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  11. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 2 2014-04-01 2014-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  12. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  13. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  14. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland: Volume 2, Quality Assurance Project Plan

    SciTech Connect

    Prasad, S.; Martino, L.; Patton, T.

    1995-03-01

    J-Field encompasses about 460 acres at the southern end of the Gunpowder Neck Peninsula in the Edgewood Area of APG (Figure 2.1). Since World War II, the Edgewood Area of APG has been used to develop, manufacture, test, and destroy chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). For the purposes of this project, J-Field has been divided into eight geographic areas or facilities that are designated as areas of concern (AOCs): the Toxic Burning Pits (TBP), the White Phosphorus Burning Pits (WPP), the Riot Control Burning Pit (RCP), the Robins Point Demolition Ground (RPDG), the Robins Point Tower Site (RPTS), the South Beach Demolition Ground (SBDG), the South Beach Trench (SBT), and the Prototype Building (PB). The scope of this project is to conduct a remedial investigation/feasibility study (RI/FS) and ecological risk assessment to evaluate the impacts of past disposal activities at the J-Field site. Sampling for the RI will be carried out in three stages (I, II, and III) as detailed in the FSP. A phased approach will be used for the J-Field ecological risk assessment (ERA).

  15. Quality-assurance audits for the ARB (Air Resources Board)-sponsored Carbonaceous Species Methods Comparison Study at Citrus College, Glendora, California, August 12-21, 1986. Final report

    SciTech Connect

    Countess, R.J.

    1987-09-21

    A series of quality-assurance tasks was performed in support of the Air Resources Board-sponsored Carbonaceous Species Methods Comparison Study, conducted at Citrus College in Glendora, CA, in August 1986. This report summarizes the quality assurance efforts for the study, which included: (1) flow-rate audits for all samplers deployed in the nine-day field study; (2) preparation and supply of carbonaceous reference materials for an interlaboratory round-robin study; and (3) analysis of the reference materials, as well as 20% of the ambient particulate samples collected by each of the study participants, for both organic and elemental carbon. The final task was done in order to assess the influence of the samplers on the collected particulate carbon.

  16. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles

    NASA Astrophysics Data System (ADS)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods for the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method, supervised classification, was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers, namely linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC), were trained using the designed features and evaluated using leave-one-out cross-validation. Results show that LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
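
    The box-whisker screening step amounts to flagging feature values that fall outside the usual 1.5-IQR fences. A minimal sketch in Python; the feature values and the fence factor are illustrative assumptions, not the paper's actual features:

```python
import numpy as np

def iqr_outliers(values, k=1.5):
    """Flag values outside the box-whisker fences [Q1 - k*IQR, Q3 + k*IQR]."""
    values = np.asarray(values, dtype=float)
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return (values < q1 - k * iqr) | (values > q3 + k * iqr)

# Hypothetical per-subject feature (e.g., a volume-overlap score); one clear failure.
flags = iqr_outliers([0.91, 0.93, 0.90, 0.92, 0.94, 0.31])
print(flags)  # only the last subject is flagged
```

    A supervised classifier would replace the fixed fences with a decision boundary learned from the labeled failures.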

  17. 76 FR 26341 - Medicaid Program; Methods for Assuring Access to Covered Medicaid Services

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ..., telemedicine and telehealth, nurse help lines, health information technology and other methods for providing... years of research and consulted extensively with key stakeholders to develop a recommendation on how to... standards and are considering future proposals to address access issues under managed care delivery...

  18. Fast sample preparation for analysis of tablets and capsules: the ball-mill extraction method.

    PubMed

    Kok, S J; Debets, A J

    2001-11-01

    A new ball-mill extraction method for solid dosage forms was developed. It was applied to tablets and compared with the conventional (powdering and sonication) method used in pharmaceutical analysis of solid dosage forms. The ball-mill sample preparation procedure is both quantitative and fast. No powdering, weighing, or sonication steps are needed in the sample preparation. The complete procedure takes 2 min for milling and extraction plus 5 min for centrifugation, much less than the conventional method, in which sample preparation takes approximately 45-90 min. The samples are centrifuged in the mill vial, which saves time and avoids evaporation of solvent. Stainless steel extraction vials with different diameters were fabricated to enable the use of various extraction volumes. The extraction recovery was tested using various types of tablets (small, large, and extended-release tablets) with active compounds at low and higher concentrations; recoveries were comparable with the conventional method. The relatively small investment and the simplicity of the method make it well suited for use in various pharmaceutical (development and quality assurance) laboratories. PMID:11516911

  19. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  20. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and on methods and equipment for qualitative multihabitat sampling and semi-quantitative single-habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  1. Sci—Sat AM: Stereo — 05: The Development of Quality Assurance Methods for Trajectory based Cranial SRS Treatments

    SciTech Connect

    Wilson, B; Duzenli, C; Gete, E; Teke, T

    2014-08-15

    The goal of this work was to develop and validate non-planar linac beam trajectories defined by the dynamic motion of the gantry, couch, jaws, collimator, and MLCs. This was conducted on the Varian TrueBeam linac by taking advantage of the linac's advanced control features in a non-clinical mode (termed developer mode). In this work, we present quality assurance methods that we have developed to test the positional and temporal accuracy of the linac's moving components. The first QA method focuses on the coordination of couch and gantry. For this test, we developed a cylindrical phantom with a film insert. Using this phantom, we delivered a plan with dynamic motion of the couch and gantry. We found the mean absolute deviation of the entrance position from its expected value to be 0.5 mm, with a standard deviation of 0.5 mm. This was within the tolerances set by the machine's mechanical accuracy and the setup accuracy of the phantom. We also present an altered picket fence test with added dynamic and simultaneous rotations of the couch and the collimator. While the test was shown to be sensitive enough to discern errors of 1° and greater, we were unable to identify any errors in the coordination of the linac's collimator and couch. When operating under normal conditions, the Varian TrueBeam linac passed both tests and is within tolerances acceptable for complex trajectory-based treatments.

  2. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
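
    In software terms, the core idea is to lock the sample grid to shaft rotation rather than to wall-clock time. A hedged sketch of that idea (the function name and the 64-samples-per-revolution figure are assumptions; the patent describes a hardware synchronizer rather than post-hoc resampling):

```python
import numpy as np

def resample_synchronous(t, x, rev_times, samples_per_rev=64):
    """Resample signal x(t) onto a grid with samples_per_rev samples per
    shaft revolution, using measured revolution timestamps rev_times."""
    grids = [np.linspace(t0, t1, samples_per_rev, endpoint=False)
             for t0, t1 in zip(rev_times[:-1], rev_times[1:])]
    sync_t = np.concatenate(grids)
    return sync_t, np.interp(sync_t, t, x)

# A 5 Hz rotation with a vibration locked to the shaft: every resampled
# revolution should contain (nearly) identical samples.
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 5 * t)
sync_t, xs = resample_synchronous(t, x, rev_times=np.linspace(0.0, 1.0, 6))
```

    With the sample grid tied to the rotational frequency signal, later revolutions line up sample-for-sample, which is what makes order analysis of rotating machinery straightforward.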

  3. A method for selecting training samples based on camera response

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Li, Bei; Pan, Zilan; Liang, Dong; Kang, Yi; Zhang, Dawei; Ma, Xiuhua

    2016-09-01

    In the process of spectral reflectance reconstruction, sample selection plays an important role in the accuracy of the constructed model and in reconstruction effects. In this paper, a method for training sample selection based on camera response is proposed. It has been proved that the camera response value has a close correlation with the spectral reflectance. Consequently, in this paper we adopt the technique of drawing a sphere in camera response value space to select the training samples which have a higher correlation with the test samples. In addition, the Wiener estimation method is used to reconstruct the spectral reflectance. Finally, we find that the method of sample selection based on camera response value has the smallest color difference and root mean square error after reconstruction compared to the method using the full set of Munsell color charts, the Mohammadi training sample selection method, and the stratified sampling method. Moreover, the goodness of fit coefficient of this method is also the highest among the four sample selection methods. Taking all the factors mentioned above into consideration, the method of training sample selection based on camera response value enhances the reconstruction accuracy from both the colorimetric and spectral perspectives.
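
    The sphere-selection step can be sketched as follows; the radius and the toy RGB triples are placeholders standing in for measured camera responses of training color charts:

```python
import numpy as np

def select_training_samples(train_rgb, test_rgb, radius):
    """Indices of training samples whose camera response falls inside a
    sphere of the given radius centred on the test sample's response."""
    d = np.linalg.norm(np.asarray(train_rgb, float) - np.asarray(test_rgb, float),
                       axis=1)
    return np.flatnonzero(d <= radius)

train = [[10, 10, 10], [200, 200, 200], [12, 11, 10]]
print(select_training_samples(train, [11, 10, 10], radius=5.0))  # [0 2]
```

    The selected subset would then feed the Wiener-estimation reconstruction in place of the full chart set.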

  4. Sampling bee communities using pan traps: alternative methods increase sample size

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  5. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...

  6. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling-site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis. PMID:26525264
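
    Gray correlation (grey relational) analysis ranks each candidate method against a reference by its grey relational grade. A minimal sketch with the customary distinguishing coefficient ρ = 0.5, assuming the evaluation indexes have already been normalized to comparable scales:

```python
import numpy as np

def grey_relational_grades(reference, candidates, rho=0.5):
    """Grey relational grade of each candidate sequence vs. the reference."""
    ref = np.asarray(reference, float)
    delta = np.abs(np.asarray(candidates, float) - ref)   # (n_candidates, n_indexes)
    dmin, dmax = delta.min(), delta.max()
    coeff = (dmin + rho * dmax) / (delta + rho * dmax)    # relational coefficients
    return coeff.mean(axis=1)                             # grade = mean coefficient

# A candidate identical to the reference earns the maximum grade of 1.
grades = grey_relational_grades([1.0, 2.0, 3.0], [[1.0, 2.0, 3.0], [2.0, 3.0, 5.0]])
```

    A higher grade indicates closer overall agreement with the standard method across the evaluation indexes.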

  7. Developing a novel method to analyse Gafchromic EBT2 films in intensity modulated radiation therapy quality assurance.

    PubMed

    Hu, Yunfei; Wang, Yang; Fogarty, Gerald; Liu, Guilin

    2013-12-01

    Recently, individual intensity-modulated radiation therapy quality assurance (IMRT QA) has increasingly been performed with Gafchromic™ EBT-series films processed in the red-green-blue (R-G-B) channels because of their extremely high spatial resolution. However, the efficiency of this method is relatively low, as for each box of film a calibration curve must be established before the film can be used for measurement. In this study, the authors present a novel method to process the Gafchromic™ EBT series: using the 16-bit greyscale channel to process the exposed film rather than the conventional 48-bit R-G-B channels, which greatly increases the efficiency and even the accuracy of the whole IMRT QA procedure. The main advantage is that, when processed in the greyscale channel, Gafchromic™ EBT2 films exhibit a linear relationship between the net pixel value and the dose delivered. This linear relationship firstly reduces the error in calibration-curve fitting and secondly removes the need to establish a calibration curve for each box of films if it is only to be used for relative measurements. Clinical testing of this novel method was carried out in two radiation therapy centres on a total of 743 IMRT cases, and 740 cases passed the 3 mm/3% gamma analysis criteria. The cases were also tested with small ionization chambers (CC13) and the results were convincing. Consequently, the authors recommend the use of this novel method to improve the accuracy and efficiency of the individual IMRT QA procedure using Gafchromic EBT2 films.
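
    The reported linearity reduces greyscale calibration to a two-parameter fit. A sketch with invented calibration points (the dose and net-pixel values below are illustrative, not measured EBT2 data):

```python
import numpy as np

# Hypothetical calibration: delivered dose (cGy) vs. net 16-bit greyscale pixel value.
dose = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
net_pixel = np.array([0.0, 2100.0, 4150.0, 6300.0, 8350.0])

# Linear model dose = a * NPV + b; a first-order polyfit is all that is needed.
a, b = np.polyfit(net_pixel, dose, 1)

def pixel_to_dose(npv):
    """Convert a net pixel value to dose (cGy) via the fitted line."""
    return a * npv + b
```

    For purely relative measurements the intercept drops out, which is why a per-box calibration curve becomes unnecessary under the linear model.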

  8. SU-E-J-126: Respiratory Gating Quality Assurance: A Simple Method to Achieve Millisecond Temporal Resolution

    SciTech Connect

    McCabe, B; Wiersma, R

    2014-06-01

    Purpose: Low temporal latency between a gating on/off signal and linac beam on/off during respiratory gating is critical for patient safety. Although a measurement of temporal lag is recommended by AAPM Task Group 142 for commissioning and annual quality assurance, there currently exists no published method. Here we describe a simple, inexpensive, and reliable method to precisely measure gating lag at millisecond resolution. Methods: A Varian Real-time Position Management™ (RPM) gating simulator with rotating disk was modified with a resistive flex sensor (Spectra Symbol) attached to the gating box platform. A photon diode was placed at machine isocenter. Output signals of the flex sensor and diode were monitored with a multichannel oscilloscope (Tektronix™ DPO3014). Qualitative inspection of gating window/beam-on synchronicity was made by setting the linac to beam on/off at end-expiration and setting the oscilloscope's temporal window to 100 ms to visually examine whether the on/off timing was within the recommended 100-ms tolerance. Quantitative measurements were made by saving the signal traces and analyzing them in MATLAB™. The on and off of the beam signal were located and compared to the expected gating window (e.g., 40% to 60%). Four gating cycles were measured and compared. Results: On a Varian TrueBeam™ STx linac with RPM gating software, the average difference in synchronicity at beam on and off for four cycles was 14 ms (3 to 30 ms) and 11 ms (2 to 32 ms), respectively. For a Varian Clinac™ 21EX the average difference at beam on and off was 127 ms (122 to 133 ms) and 46 ms (42 to 49 ms), respectively. The uncertainty in the synchrony difference was estimated at ±6 ms. Conclusion: This new gating QA method is easy to implement and allows for fast qualitative inspection and quantitative measurements for commissioning and TG-142 annual QA.
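
    The quantitative analysis step (locating beam on/off edges in the saved traces and comparing them to the gating window) can be sketched with simple threshold-crossing detection; the synthetic traces and thresholds below are illustrative, not the study's MATLAB code:

```python
import numpy as np

def rising_edges(t, signal, threshold=0.5):
    """Times where the signal first crosses the threshold upward."""
    above = signal > threshold
    return t[np.flatnonzero(~above[:-1] & above[1:]) + 1]

def beam_on_latencies(t, gate, beam, threshold=0.5):
    """Delay from each gate-open edge to the next beam-on edge."""
    g = rising_edges(t, gate, threshold)
    b = rising_edges(t, beam, threshold)
    return np.array([b[b >= tg][0] - tg for tg in g if (b >= tg).any()])

# Synthetic traces sampled at 1 kHz: gate opens at 0.200 s, beam turns on 14 ms later.
t = np.arange(0.0, 1.0, 0.001)
gate = ((t >= 0.200) & (t < 0.400)).astype(float)
beam = ((t >= 0.214) & (t < 0.400)).astype(float)
```

    The same edge detector applied to falling edges yields the beam-off latency, and averaging over several gating cycles gives the reported mean and range.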

  9. Methods for sampling and inorganic analysis of coal

    USGS Publications Warehouse

    Golightly, D. W.; Simon, Frederick Otto

    1989-01-01

    Methods used by the U.S. Geological Survey for the sampling, comminution, and inorganic analysis of coal are summarized in this bulletin. Details, capabilities, and limitations of the methods are presented.

  10. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the U.S. Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency (EPA), or others.

  11. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large-order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls-related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large-order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigenanalysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not-so-generally-available capabilities: 1) reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) solution of the generalized eigenvalue problem with balancing and grading, 3) computation of all zeros of the determinant of a matrix of polynomials, 4) matrix exponentiation and the evaluation of integrals involving the matrix exponential, with the option to first block diagonalize, 5) root locus and frequency response for single-variable transfer functions in the S, Z, and W domains, 6) several methods of computing zeros for linear systems, and 7) the ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double-precision elements. There is no fixed size limit on any matrix in any

  12. In-depth analysis of sampling optimization methods

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Kim, Myoungsoo; Habets, Boris; Buhl, Stefan; Guhlemann, Steffen; Rößiger, Martin; Bellmann, Enrico; Kim, Seop

    2016-03-01

    High order overlay and alignment models require good coverage of overlay or alignment marks on the wafer. But dense sampling plans are not possible for throughput reasons. Therefore, sampling plan optimization has become a key issue. We analyze the different methods for sampling optimization and discuss the different knobs to fine-tune the methods to constraints of high volume manufacturing. We propose a method to judge sampling plan quality with respect to overlay performance, run-to-run stability and dispositioning criteria using a number of use cases from the most advanced lithography processes.
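
    One generic heuristic for spreading a limited mark-sampling budget across the wafer is greedy farthest-point selection. This is a sketch of that general idea only, not the proprietary optimizer evaluated in the paper; site coordinates and the budget k are illustrative:

```python
import numpy as np

def greedy_sample_plan(sites, k):
    """Pick k site indices that spread coverage: start at site 0, then
    repeatedly add the site farthest from everything chosen so far."""
    sites = np.asarray(sites, float)
    chosen = [0]
    d = np.linalg.norm(sites - sites[0], axis=1)   # distance to nearest chosen site
    while len(chosen) < k:
        nxt = int(np.argmax(d))
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(sites - sites[nxt], axis=1))
    return chosen

# Toy wafer with five candidate mark positions; budget of four measurements.
chosen = greedy_sample_plan([(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)], k=4)
```

    In practice the plan would additionally be scored against model-term estimability and run-to-run stability, as the abstract describes.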

  13. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    2001-01-01

    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation, which is sized by excitation optics into a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  14. Rapid method for sampling metals for materials identification

    NASA Technical Reports Server (NTRS)

    Higgins, L. E.

    1971-01-01

    Nondamaging process similar to electrochemical machining is useful in obtaining metal samples from places inaccessible to conventional sampling methods or where methods would be hazardous or contaminating to specimens. Process applies to industries where metals or metal alloys play a vital role.

  15. Engineering Study of 500 ML Sample Bottle Transportation Methods

    SciTech Connect

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates the available methods for transporting 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules, and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  16. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  17. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. © 1976.

  18. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  19. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

  20. Simple quality assurance method of dynamic tumor tracking with the gimbaled linac system using a light field.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Hayata, Masahiro; Tsuda, Shintaro; Yamada, Kiyoshi; Nagata, Yasushi

    2016-01-01

    We proposed a simple visual method for evaluating the dynamic tumor tracking (DTT) accuracy of a gimbal mechanism using a light field. A single photon beam was set with a field size of 30 × 30 mm² at a gantry angle of 90°. The center of a cube phantom was set up at the isocenter of a motion table, and 4D modeling was performed based on the tumor and infrared (IR) marker motion. After 4D modeling, the cube phantom was replaced with a sheet of paper, which was placed perpendicularly, and a light field was projected on the sheet of paper. The light field was recorded using a web camera in a treatment room that was as dark as possible. Calculated images from each image obtained using the camera were summed to compose a total summation image. Sinusoidal motion sequences were produced by moving the phantom with a fixed amplitude of 20 mm and different breathing periods of 2, 4, 6, and 8 s. The light field was projected on the sheet of paper under three conditions: with the moving phantom and DTT based on the motion of the phantom, with the moving phantom and non-DTT, and with a stationary phantom for comparison. The values of tracking errors using the light field were 1.12 ± 0.72, 0.31 ± 0.19, 0.27 ± 0.12, and 0.15 ± 0.09 mm for breathing periods of 2, 4, 6, and 8 s, respectively. The tracking accuracy showed dependence on the breathing period. We proposed a simple quality assurance (QA) process for the tracking accuracy of a gimbal mechanism system using a light field and web camera. Our method can assess the tracking accuracy using a light field without irradiation and clearly visualize distributions like film dosimetry. PMID:27685142

  2. Application of the SAMR method to high magnetostrictive samples

    NASA Astrophysics Data System (ADS)

    Sanchez, P.; Lopez, E.; Trujillo, M. C. Sanchez; Aroca, C.

    1988-12-01

    Magnetostriction measurement using the small-angle magnetization rotation (SAMR) method has been performed on highly magnetostrictive amorphous samples. To apply the SAMR method to these samples, a theoretical model of the influence of the internal stresses and magnetization distribution has been proposed. The dependence of the magnetostriction, λs, on the temperature and applied stress was measured in as-cast and in differently annealed samples. In the as-cast samples, the existence of a stray field and a dependence of λs on the applied stress were observed.

  3. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  4. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments.
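
    The qualitative biases such simulations quantify (partial-interval recording overestimates, whole-interval recording underestimates, and momentary time sampling is roughly unbiased) can be reproduced with a small sketch; all parameter values below are illustrative, not those of the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(method, n_intervals=60, interval=10.0, rate=0.05, event_dur=3.0):
    """One session: returns (estimated % of session with behavior, true %)."""
    T = n_intervals * interval
    starts = rng.uniform(0, T, rng.poisson(rate * T))      # random event onsets
    occupied = np.zeros(int(T * 100), dtype=bool)          # 10 ms resolution grid
    for s in starts:
        occupied[int(s * 100):int(min(s + event_dur, T) * 100)] = True
    true_pct = occupied.mean() * 100
    hits = 0
    for i in range(n_intervals):
        a, b = int(i * interval * 100), int((i + 1) * interval * 100)
        if method == "momentary":
            hits += occupied[b - 1]                        # sample at interval end
        elif method == "partial":
            hits += occupied[a:b].any()                    # any occurrence scores
        elif method == "whole":
            hits += occupied[a:b].all()                    # must fill the interval
    return hits / n_intervals * 100, true_pct
```

    Averaging (estimate − truth) over repeated simulated sessions exposes each method's bias as a function of interval duration and event duration, which is exactly the kind of error table the study reports.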

  5. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances which play a crucial role in controlling plant development, growth, and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used approach for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, namely acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extraction materials, and derivatization reagents for sample preparation in phytohormone analysis, including some related work by our group. Finally, future developments in this field are also discussed.

  6. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
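
    The core of such a design can be sketched as follows (a minimal illustration assuming a rectangular study area split into a grid of strata; the actual method draws points within GIS-delineated boundaries and then selects households near each point):

```python
import random

def stratified_spatial_sample(lat_range, lon_range, rows, cols,
                              per_stratum, seed=0):
    """Split a rectangular study area into a rows x cols grid of strata and
    draw `per_stratum` uniform random (lat, lon) points in each stratum,
    mimicking GIS-based random point generation for household selection."""
    rng = random.Random(seed)
    lat0, lat1 = lat_range
    lon0, lon1 = lon_range
    dlat = (lat1 - lat0) / rows
    dlon = (lon1 - lon0) / cols
    points = []
    for r in range(rows):
        for c in range(cols):
            for _ in range(per_stratum):
                # Uniform draw inside stratum cell (r, c)
                points.append((lat0 + (r + rng.random()) * dlat,
                               lon0 + (c + rng.random()) * dlon))
    return points
```

    In the field, each generated coordinate would be loaded into a GPS unit and the nearest eligible household surveyed, which is where the density- and boundary-related biases discussed above arise.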

  7. Assessment of Flatbed Scanner Method for Quality Assurance Testing of Air Content and Spacing Factor in Concrete

    NASA Astrophysics Data System (ADS)

    Nezami, Sona

    The flatbed scanner method for air void analysis of concrete is investigated through a comparison study with the standard ASTM C457 manual and Rapid Air 457 test methods. Air void parameters, including air content and spacing factor, are determined by image analysis of a large population of scanned samples through contrast enhancement and threshold determination procedures. The flatbed scanner method is shown to give results comparable to the manual and Rapid Air 457 methods. Furthermore, the air void chord length distributions obtained from the flatbed scanner and Rapid Air 457 methods are compared in this research. The effect of different settings in the scanning process is also investigated. Moreover, a threshold study showed that the flatbed scanner method can be employed in combination with the manual and Rapid Air 457 methods as a time- and cost-saving strategy.
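
    The image-analysis step described above can be illustrated with a toy linear-traverse routine (a hypothetical sketch, not the thesis software): after thresholding, each scan line of the image is a sequence of air (1) and solid (0) pixels, from which air content and chord lengths follow directly.

```python
def chord_lengths(scan_line):
    """Linear-traverse analysis of one thresholded scan line
    (1 = air void pixel, 0 = paste/aggregate pixel).
    Returns (air content as pixel fraction, list of chord lengths in pixels)."""
    chords, run = [], 0
    for px in scan_line:
        if px:
            run += 1          # extend the current air void chord
        elif run:
            chords.append(run)  # chord ended at a solid pixel
            run = 0
    if run:
        chords.append(run)      # chord running to the edge of the line
    air_content = sum(scan_line) / len(scan_line)
    return air_content, chords
```

    Aggregating chords over many scan lines, scaled by the pixel size, yields the chord length distribution compared between the scanner and Rapid Air 457 methods.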

  8. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time average of two or more Legendre polynomials is zero.
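
    The angles involved can be computed directly (an illustrative sketch of the underlying math, not the patent's mechanism): spinning at the magic angle zeroes the second Legendre polynomial P2(cos θ), while averaging both P2 and P4 requires the axis angle to vary between the two roots of P4(cos θ) = 0.

```python
import math

# Legendre polynomials governing orientational averaging in spinning samples
def p2(x):
    return 0.5 * (3 * x ** 2 - 1)

def p4(x):
    return 0.125 * (35 * x ** 4 - 30 * x ** 2 + 3)

# Magic angle: P2(cos theta) = 0  ->  cos theta = 1/sqrt(3)
magic = math.degrees(math.acos(1 / math.sqrt(3)))

# The two positive roots of P4(cos theta) = 0, solving the quadratic
# 35 c^2 - 30 c + 3 = 0 in c = cos^2(theta); these are the angle pair
# used in double-rotation schemes.
c2 = [(30 + sgn * math.sqrt(30 ** 2 - 4 * 35 * 3)) / (2 * 35) for sgn in (1, -1)]
dor = [math.degrees(math.acos(math.sqrt(c))) for c in c2]
```

    These evaluate to roughly 54.7° for the magic angle and about 30.6° and 70.1° for the P4 roots.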

  9. Methods for collection and analysis of water samples

    USGS Publications Warehouse

    Rainwater, Frank Hays; Thatcher, Leland Lincoln

    1960-01-01

    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  10. Marshall Island radioassay quality assurance program an overview

    SciTech Connect

    Conrado, C.L.; Hamilton, T.F.; Kehl, S.R.; Robison, W.L.; Stoker, A.C.

    1998-09-01

    The Lawrence Livermore National Laboratory has developed an extensive quality assurance program to provide high quality data and assessments in support of the Marshall Islands Dose Assessment and Radioecology Program. Our quality assurance objectives begin with the premise of providing integrated and cost-effective program support (to meet wide-ranging programmatic needs, scientific peer review, litigation defense, and public confidence) and continue through design and implementation of large-scale field programs, sampling and sample preparation, radiometric and chemical analyses, documentation of quality assurance/quality control practices, exposure assessments, and dose/risk assessments until publication. The basic structure of our radioassay quality assurance/quality control program can be divided into four essential elements: (1) sample and data integrity control; (2) instrument validation and calibration; (3) method performance testing, validation, development and documentation; and (4) periodic peer review and on-site assessments. While our quality assurance objectives are tailored towards a single research program and the evaluation of major exposure pathways/critical radionuclides pertinent to the Marshall Islands, we have attempted to develop quality assurance practices that are consistent with proposed criteria designed for laboratory accreditation.

  11. Nominal Weights Mean Equating: A Method for Very Small Samples

    ERIC Educational Resources Information Center

    Babcock, Ben; Albano, Anthony; Raymond, Mark

    2012-01-01

    The authors introduced nominal weights mean equating, a simplified version of Tucker equating, as an alternative for dealing with very small samples. The authors then conducted three simulation studies to compare nominal weights mean equating to six other equating methods under the nonequivalent groups anchor test design with sample sizes of 20,…

  12. Isolation of Legionella from water samples using various culture methods.

    PubMed

    Kusnetsov, J M; Jousimies-Somer, H R; Nevalainen, A I; Martikainen, P J

    1994-02-01

    The efficacy of a non-selective medium and two selective media was compared for the isolation of legionellas from water samples. The effect of acid-wash treatment for decontamination of the water samples on the isolation frequency of legionellas was also studied. The 236 samples were taken from cooling, humidifying and drinking water systems; 21% were legionella-positive when inoculated directly on modified Wadowsky-Yee (MWY) medium and 26% were positive when concentrated (×200) before cultivation on MWY or CCVC media. Inoculation on MWY medium after concentration followed by decontamination by the acid-wash technique gave the highest isolation frequency (31%). The lowest frequency (8%) was found with the non-selective BCYE alpha medium. An isolation frequency of 28% was achieved with the BCYE alpha medium after concentration and acid-wash treatment of the samples. Forty per cent of the samples were positive for legionellas when the results from all the culture methods were combined. Not all the legionella-positive samples were identified by a single culture method. Ninety-three of the 95 positive samples were detected with the two best combinations of three culture methods. The best culture method for detecting legionellas depended on the source of the water sample. Some water quality characteristics, like temperature and organic matter content, affected the isolation frequency of Legionella spp.

  13. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  14. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. Such populations are difficult to approach survey-methodologically because the response rate is low and members are not entirely honest in their responses when probability sampling is used. The only alternative known to address the problems of earlier methods such as snowball sampling is respondent-driven sampling (RDS), developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent, which allows for probability sampling when surveying a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the bias of RDS's chain-referral sampling tends to diminish as the sample grows, and the sample composition stabilizes as the waves progress. Therefore, the final sample can be completely independent of the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key informant sampling and ethnographic surveys, and it should be applied to various cases domestically as well. PMID:26107223
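
    The seed-independence property can be demonstrated with a toy model (a hypothetical sketch, not the study's simulation): treat recruitment as a two-state Markov chain over two equal subgroups, where each respondent recruits the next from their own subgroup with probability `p_stay`. After enough waves the composition of the final wave approaches the stationary distribution (here 0.5) no matter which subgroup the seeds came from.

```python
import random

def final_wave_composition(seed_group, p_stay=0.7, waves=12,
                           n_chains=1000, rng_seed=7):
    """Fraction of group-A respondents in the final wave of n_chains
    referral chains started from seeds in `seed_group` ("A" or "B")."""
    rng = random.Random(rng_seed)
    a_count = 0
    for _ in range(n_chains):
        g = seed_group
        for _ in range(waves):
            if rng.random() >= p_stay:
                # recruit crosses to the other subgroup
                g = "B" if g == "A" else "A"
        a_count += g == "A"
    return a_count / n_chains
```

    Running the model with all-A seeds and all-B seeds gives nearly identical final-wave compositions, illustrating why convenience-sampled seeds need not bias a sufficiently deep RDS sample in this idealized setting.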

  16. Methods of human body odor sampling: the effect of freezing.

    PubMed

    Lenochova, Pavlina; Roberts, S Craig; Havlicek, Jan

    2009-02-01

    Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing. Although these differences might have a critical impact on results obtained, almost no studies test validity of current methods. Here, we focused on the effect of freezing samples between collection and use in experiments involving body odor perception. In 2 experiments, we tested whether axillary odors were perceived differently by raters when presented fresh or having been frozen and whether several freeze-thaw cycles affected sample quality. In the first experiment, samples were frozen for 2 weeks, 1 month, or 4 months. We found no differences in ratings of pleasantness, attractiveness, or masculinity between fresh and frozen samples. Similarly, almost no differences between repeatedly thawed and fresh samples were found. We found some variations in intensity; however, this was unrelated to length of storage. The second experiment tested differences between fresh samples and those frozen for 6 months. Again no differences in subjective ratings were observed. These results suggest that freezing has no significant effect on perceived odor hedonicity and that samples can be reliably used after storage for relatively long periods.

  17. Soil separator and sampler and method of sampling

    DOEpatents

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  18. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    1996-01-01

    The present invention provides methods and systems for detecting a labeled marker on a sample located on a support. The imaging system comprises a body for immobilizing the support, an excitation radiation source and excitation optics to generate and direct the excitation radiation at the sample. In response, labeled material on the sample emits radiation which has a wavelength that is different from the excitation wavelength, which radiation is collected by collection optics and imaged onto a detector which generates an image of the sample.

  19. System and method for measuring fluorescence of a sample

    SciTech Connect

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
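
    The integrator stage can be sketched in discrete time (an illustrative model with made-up voltages and timing, not the patented circuit): subtract the electronic offset from each photodiode sample and accumulate over the measurement window.

```python
def integrate_fluorescence(samples, offset, dt):
    """Rectangle-rule analogue of the offset-subtracted integrator:
    accumulate (photodiode output - electronic offset) over the window."""
    return sum(v - offset for v in samples) * dt

# Hypothetical trace: 0.2 V background exactly matched by the offset,
# plus a 0.5 V fluorescence signal present for 10 of 20 samples taken
# 1 ms apart; only the fluorescence contributes to the integral.
trace = [0.2] * 10 + [0.7] * 10
area = integrate_fluorescence(trace, offset=0.2, dt=0.001)
```

    Matching the offset to the background means the integral grows only with genuine fluorescence, which is the noise-rejection benefit the abstract describes.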

  20. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  1. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  2. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and freezing at -20°C. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  3. Comparison of surface sampling methods for virus recovery from fomites.

    PubMed

    Julian, Timothy R; Tamayo, Francisco J; Leckie, James O; Boehm, Alexandria B

    2011-10-01

    The role of fomites in infectious disease transmission relative to other exposure routes is difficult to discern due, in part, to the lack of information on the level and distribution of virus contamination on surfaces. Comparisons of studies intending to fill this gap are difficult because multiple different sampling methods are employed and authors rarely report their method's lower limit of detection. In the present study, we compare a subset of sampling methods identified from a literature review to demonstrate that sampling method significantly influences study outcomes. We then compare a subset of methods identified from the review to determine the most efficient methods for recovering virus from surfaces in a laboratory trial using MS2 bacteriophage as a model virus. Recoveries of infective MS2 and MS2 RNA are determined using both a plaque assay and quantitative reverse transcription-PCR, respectively. We conclude that the method that most effectively recovers virus from nonporous fomites uses polyester-tipped swabs prewetted in either one-quarter-strength Ringer's solution or saline solution. This method recovers a median fraction for infective MS2 of 0.40 and for MS2 RNA of 0.07. Use of the proposed method for virus recovery in future fomite sampling studies would provide opportunities to compare findings across multiple studies.

  4. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method than with the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  5. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined per 1,500-milliliter sample ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  6. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  7. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  8. Do Too Many Rights Make a Wrong? A Qualitative Study of the Experiences of a Sample of Malaysian and Singapore Private Higher Education Providers in Transnational Quality Assurance

    ERIC Educational Resources Information Center

    Lim, Fion Choon Boey

    2010-01-01

    Assuring the quality of transnational education has been an endeavour of increasing importance in the internationalisation of higher education but is also increasingly challenging given the involvement of many stakeholders. This paper focuses on the experiences of and challenges faced by private tertiary education providers in Malaysia and…

  9. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    SciTech Connect

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and by methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which removed about twice as much residue as dry wipes. Criteria at 10 CFR 850.30 and .31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of appropriate wetting agents for the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced removal efficiency such as methanol when surface contamination includes oil mist residue.

  10. SU-E-I-60: Quality Assurance Testing Methods and Customized Phantom for Magnetic Resonance Imaging and Spectroscopy

    SciTech Connect

    Song, K-H; Lee, D-W; Choe, B-Y

    2015-06-15

    Purpose: The objectives of this study are to develop a magnetic resonance imaging and spectroscopy (MRI-MRS) fused phantom along with the inserts for metabolite quantification and to conduct quantitative analysis and evaluation of the layered vials of brain-mimicking solution for quality assurance (QA) performance, according to the localization sequence. Methods: The outer cylindrical phantom body is made of acrylic materials. The section other than where the inner vials are located was filled with copper sulfate diluted with water so as to reduce the T1 relaxation time. Sodium chloride was included to provide conductivity similar to the human body. All measurements of MRI and MRS were made using a 3.0 T scanner (Achiva Tx 3.0 T; Philips Medical Systems, Netherlands). The MRI scan parameters were as follows: (1) spin echo (SE) T1-weighted image: repetition time (TR), 500 ms; echo time (TE), 20 ms; matrix, 256×256; field of view (FOV), 250 mm; gap, 1 mm; number of signal averages (NSA), 1; (2) SE T2-weighted image: TR, 2,500 ms; TE, 80 ms; matrix, 256×256; FOV, 250 mm; gap, 1 mm; NSA, 1; 23 slice images were obtained with a slice thickness of 5 mm. The water signal of each volume of interest was suppressed by variable pulse power and optimized relaxation delays (VAPOR) applied before the scan. By applying a point-resolved spectroscopy sequence, the MRS scan parameters were as follows: voxel size, 0.8×0.8×0.8 cm³; TR, 2,000 ms; TE, 35 ms; NSA, 128. Results: Using the fused phantom, the results of measuring MRI factors were: geometric distortion, <2% and ±2 mm; image intensity uniformity, 83.09±1.33%; percent-signal ghosting, 0.025±0.004; low-contrast object detectability, 27.85±0.80. In addition, the signal-to-noise ratio of N-acetyl-aspartate was consistently high (42.00±5.66). Conclusion: The MRI-MRS QA factors obtained simultaneously using the phantom can facilitate evaluation of both images and spectra, and provide guidelines for obtaining MRI and MRS QA

  11. College Quality Assurance Assurances. Mendip Papers 020.

    ERIC Educational Resources Information Center

    Sallis, E.; Hingley, P.

    This paper discusses the increasing interest in quality assurance in British education including its measurement and management through the introduction of a quality assurance system. The reasons and benefits of beginning a quality assurance system are discussed, and questions of what constitutes quality, whether it is quality in fact…

  12. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality-assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data is entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)
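
    The control-chart output described above can be sketched as follows (a hypothetical Shewhart-style helper with made-up determinations, not the QADATA program's actual report format): compute a center line and ±k·sigma limits from a baseline of blind-sample results for one constituent, then flag determinations outside the limits.

```python
from statistics import mean, stdev

def control_limits(determinations, k=3):
    """Center line and +/- k sigma control limits for one constituent's
    blind-sample determinations."""
    center = mean(determinations)
    spread = stdev(determinations)
    return center - k * spread, center, center + k * spread

# Hypothetical determinations (mg/L); the last one is out of control.
results = [10.1, 9.8, 10.3, 9.9, 10.0, 10.2, 12.9]
lo, center, hi = control_limits(results[:-1])   # baseline from earlier data
outliers = [x for x in results if not lo <= x <= hi]
```

    In practice each constituent and concentration level would get its own chart, which is how such a project documents laboratory performance over time.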

  13. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    SciTech Connect

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured, while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done in a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed and mixed with mineral grains under controlled conditions, and then compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g. packed moist sand, drained sand, moistened silica gel, other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005].
In another version of this technique, a specified quantity of gas

  14. Fluidics platform and method for sample preparation and analysis

    SciTech Connect

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  15. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  16. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  17. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  18. Comparison of pigment content of paint samples using spectrometric methods.

    PubMed

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-15

The aim of the paper was to evaluate the influence of pigment concentration and its distribution in a polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in the concentration of pigment. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Coloured and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared by Vis spectrometry according to colour theory. It was found that the samples are homogeneous (the parameter measuring colour similarity ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each following one, a logarithmic function.
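The ΔE colour-difference comparison described above reduces to a Euclidean distance in CIE L*a*b* space (the CIE76 formulation). A minimal sketch, with hypothetical L*a*b* coordinates for two measurements of the same paint sample:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIE L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical L*a*b* measurements of two spots on one paint sample.
spot_1 = (52.0, 41.5, 28.3)
spot_2 = (51.2, 42.0, 27.9)

de = delta_e_cie76(spot_1, spot_2)
print(f"dE = {de:.2f}, homogeneous: {de < 3}")  # ΔE < 3 is the homogeneity threshold used above
```

Later ΔE formulations (CIE94, CIEDE2000) weight the lightness and chroma axes differently, but the abstract's ΔE < 3 criterion is stated in this simple Euclidean form.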

  19. Off-axis angular spectrum method with variable sampling interval

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Hae; Byun, Chun-Won; Oh, Himchan; Pi, Jae-Eun; Choi, Ji-Hun; Kim, Gi Heon; Lee, Myung-Lae; Ryu, Hojun; Hwang, Chi-Sun

    2015-08-01

We proposed a novel off-axis angular spectrum method (ASM) for simulating free-space wave propagation with a largely shifted destination plane. The off-axis simulation treats wave propagation between parallel source and destination planes, with the destination plane laterally shifted from the source plane. The shifted angular spectrum method was previously proposed for diffraction simulation with a shifted destination plane and satisfied the Nyquist sampling condition by limiting the bandwidth of the propagation field to avoid aliasing errors due to undersampling. However, the effective sampling number of the shifted ASM decreased when the shift of the destination plane was large, which caused numerical error in the diffraction simulation. To compensate for the decrease in effective sampling number for a largely shifted destination plane, we used a variable sampling interval in Fourier space to maintain the same effective sampling number independent of the shift of the destination plane. As a result, our proposed off-axis ASM with a variable sampling interval can produce simulation results with high accuracy for nearly every shift of the destination plane when the off-axis angle is less than 75°. We compared the performance of the off-axis ASM using the chirp Z-transform and the non-uniform FFT for implementing a variable spatial frequency in Fourier space.
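The baseline (on-axis) angular spectrum method that the off-axis variant builds on can be sketched with a pair of FFTs; function and parameter names here are illustrative, and the paper's shifted-plane and variable-sampling-interval machinery is omitted:

```python
import numpy as np

def asm_propagate(field, wavelength, dx, z):
    """Propagate a 2D complex field a distance z using the angular
    spectrum transfer function; evanescent components are suppressed."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies (1/m)
    fxx, fyy = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - fxx**2 - fyy**2  # (k_z / 2*pi)^2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)   # band-limit to propagating waves
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Sanity check: a uniform plane wave keeps unit amplitude after propagation.
u0 = np.ones((64, 64), dtype=complex)
u1 = asm_propagate(u0, wavelength=633e-9, dx=10e-6, z=1e-3)
```

The shifted ASM adds a linear phase ramp to the transfer function and restricts the bandwidth further; the variable sampling interval then keeps the effective sample count constant as that shift grows.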

  20. Comparison of pigment content of paint samples using spectrometric methods

    NASA Astrophysics Data System (ADS)

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-01

The aim of the paper was to evaluate the influence of pigment concentration and its distribution in a polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in the concentration of pigment. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Coloured and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared by Vis spectrometry according to colour theory. It was found that the samples are homogeneous (the parameter measuring colour similarity ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each following one, a logarithmic function.

  1. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    PubMed

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed. PMID:26753274

  2. RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-07-17

A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  3. Chemicals of emerging concern in water and bottom sediment in the Great Lakes Basin, 2012: collection methods, analytical methods, quality assurance, and study data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Hansen, Donald S.; Foreman, William T.; Furlong, Edward T.; Jorgenson, Zachary G.; Choy, Steven J.; Moore, Jeremy N.; Banda, JoAnn; Gefell, Daniel J.

    2015-01-01

During this study, 53 environmental samples, 4 field duplicate samples, and 8 field spike samples of bottom sediment and laboratory matrix-spike samples were analyzed for a wide variety of CECs at the USGS National Water Quality Laboratory using laboratory schedule 5433 for wastewater indicators; research method 6434 for steroid hormones, sterols, and bisphenol A; and research method 9008 for human-use pharmaceuticals and antidepressants. Forty of the 57 chemicals analyzed using laboratory schedule 5433 had detectable concentrations ranging from 1 to 49,000 micrograms per kilogram. Fourteen of the 20 chemicals analyzed using research method 6434 had detectable concentrations ranging from 0.04 to 24,940 nanograms per gram. Ten of the 20 chemicals analyzed using research method 9008 had detectable concentrations ranging from 0.59 to 197.5 micrograms per kilogram.

  4. Tomotherapy treatment plan quality assurance: The impact of applied criteria on passing rate in gamma index method

    SciTech Connect

    Bresciani, Sara; Di Dia, Amalia; Maggio, Angelo; Cutaia, Claudia; Miranti, Anna; Infusino, Erminia; Stasi, Michele

    2013-12-15

Purpose: Pretreatment patient plan verification with gamma index (GI) metric analysis is standard procedure for intensity modulated radiation therapy (IMRT) treatment. The aim of this paper is to evaluate the variability of the local and global gamma index obtained during standard pretreatment quality assurance (QA) measurements for plans performed with a Tomotherapy unit. The QA measurements were performed with a 3D diode array, using variable passing criteria: 3%/3 mm, 2%/2 mm, 1%/1 mm, each with both local and global normalization. Methods: The authors analyzed the pretreatment QA results for 73 verifications; 37 were prostate cancer plans, 16 were head and neck plans, and 20 were other clinical sites. All plans were treated using the Tomotherapy Hi-Art System. Pretreatment QA plans were performed with the commercially available 3D diode array ArcCHECK™. This device has 1386 diodes arranged in a helical geometry spaced 1 cm apart. The dose measurements were acquired on the ArcCHECK™ and then compared with the calculated dose using the standard gamma analysis method. The gamma passing rate (%GP), defined as the percentage of points satisfying the condition GI < 1, was calculated for different criteria (3%/3 mm, 2%/2 mm, 1%/1 mm) and for both global and local normalization. In the case of the local normalization method, the authors set three dose difference threshold (DDT) values of 2, 3, and 5 cGy. Dose difference threshold is defined as the minimum absolute dose error considered in the analysis when using local normalization. Low-dose thresholds (TH) of 5% and 10% were also applied and analyzed. Results: Performing a paired t-test, the authors determined that the gamma passing rate is independent of the threshold values for all of the adopted criteria (5%TH vs 10%TH, p > 0.1). Our findings showed that mean %GPs for local (or global) normalization for the entire study group were 93% (98%), 84% (92%), and 66% (61%) for 3%/3 mm, 2%/2 mm, and 1%/1 mm criteria
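The gamma passing rate (%GP) metric can be illustrated in one dimension. This is a simplified sketch of the standard gamma comparison with hypothetical dose profiles, not the ArcCHECK analysis; the dose-difference-threshold (DDT) and low-dose-threshold handling described above is omitted:

```python
import numpy as np

def gamma_pass_rate(measured, calculated, dx, dose_crit, dist_crit, local=True):
    """Percentage of points with gamma index < 1 (1D illustration).

    dose_crit is fractional (0.03 for 3%); dist_crit and dx share units.
    Local normalization scales the dose error by the local calculated
    dose, global normalization by the plan maximum."""
    pos = np.arange(len(calculated)) * dx
    d_max = calculated.max()
    n_pass = 0
    for i, dm in enumerate(measured):
        norm = calculated[i] if local else d_max
        dose_err = (dm - calculated) / (dose_crit * norm)
        dist_err = (pos[i] - pos) / dist_crit
        # Gamma index: minimum combined dose/distance error over all reference points.
        gamma = np.sqrt(dose_err**2 + dist_err**2).min()
        n_pass += gamma < 1.0
    return 100.0 * n_pass / len(measured)

calc = np.array([10.0, 50.0, 100.0, 50.0, 10.0])  # toy dose profile
meas = calc * 1.02                                # uniform 2% delivery error
rate_3mm3 = gamma_pass_rate(meas, calc, dx=1.0, dose_crit=0.03, dist_crit=3.0)
```

A uniform 2% error passes the 3%/3 mm criterion everywhere but fails 1%/1 mm everywhere, mirroring why %GP drops sharply as the criteria tighten.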

  5. NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES

    SciTech Connect

Maxwell, S.; Culligan, B.

    2007-08-28

    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  6. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
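One simple way to exploit a cheap sensitivity derivative is as a control variate: the first-order Taylor expansion of the output has a known mean, and subtracting it removes most of the sampling noise. A hedged sketch of that idea for a scalar normal input (an illustration of the variance-reduction principle, not the paper's actual scheme):

```python
import math
import random

def mc_with_derivative_cv(f, dfdx, mu, sigma, n, seed=0):
    """Monte Carlo estimate of E[f(X)], X ~ N(mu, sigma^2).

    Uses the first-order Taylor expansion of f around mu (built from a
    cheap sensitivity derivative) as a control variate with known mean
    f(mu). Returns (plain estimate, variance-reduced estimate)."""
    rng = random.Random(seed)
    plain_sum = cv_sum = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        taylor = f(mu) + dfdx(mu) * (x - mu)   # E[taylor] = f(mu), known exactly
        plain_sum += f(x)
        cv_sum += f(x) - taylor + f(mu)        # unbiased, much lower variance
    return plain_sum / n, cv_sum / n

# f(x) = exp(x): both the value and the derivative are exp.
est_plain, est_cv = mc_with_derivative_cv(math.exp, math.exp, mu=0.0,
                                          sigma=0.1, n=2000)
```

The residual f(x) − f(mu) − f'(mu)(x − mu) is second order in (x − mu), which is the source of the order-of-magnitude accuracy gains reported above; the same correction applies sample-by-sample in stratified schemes.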

  7. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-08-27

The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and {sup 90}Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of {sup 90}Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a {approx}100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and {sup 90}Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of {sup 210}Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced {sup 210}Po removal step, which will be described.

  8. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  9. Software Assurance Using Structured Assurance Case Models

    PubMed Central

    Rhodes, Thomas; Boland, Frederick; Fong, Elizabeth; Kass, Michael

    2010-01-01

    Software assurance is an important part of the software development process to reduce risks and ensure that the software is dependable and trustworthy. Software defects and weaknesses can often lead to software errors and failures and to exploitation by malicious users. Testing, certification and accreditation have been traditionally used in the software assurance process to attempt to improve software trustworthiness. In this paper, we examine a methodology known as a structured assurance model, which has been widely used for assuring system safety, for its potential application to software assurance. We describe the structured assurance model and examine its application and use for software assurance. We identify strengths and weaknesses of this approach and suggest areas for further investigation and testing. PMID:27134787

  10. Exposure to airborne allergens: a review of sampling methods.

    PubMed

    Renström, Anne

    2002-10-01

A number of methods are used to assess exposure to high-molecular-weight allergens. In the occupational setting, airborne dust is often collected on filters using pumps, the filters are eluted and the allergen content in the eluate analysed using immunoassays. Collecting inhalable dust using person-carried pumps may be considered the gold standard. Other allergen sampling methods are available. Recently, a method that collects nasally inhaled dust on adhesive surfaces within nasal samplers has been developed. Allergen content can be analysed in eluates using sensitive enzyme immunoassays, or allergen-bearing particles can be immunostained using antibodies and studied under the microscope. Settling airborne dust can be collected in petri dishes, a cheap and simple method that has been utilised in large-scale exposure studies. Collection of reservoir dust from surfaces using vacuum cleaners with a dust collector is commonly used to measure pet or mite allergens in homes. The sampling methods differ in properties and relevance to personal allergen exposure. Since methods for all steps from sampling to analysis differ between laboratories, determining occupational exposure limits for protein allergens is currently unfeasible. A general standardisation of methods is needed.

  11. Method and apparatus for sampling low-yield wells

    DOEpatents

    Last, George V.; Lanigan, David C.

    2003-04-15

An apparatus and method for collecting a sample from a low-yield well or perched aquifer includes a pump and a controller responsive to water level sensors for filling a sample reservoir. The controller activates the pump to fill the reservoir when the water level in the well reaches a high level as indicated by the sensor. The controller deactivates the pump when the water level reaches a lower level as indicated by the sensors. The controller repeatedly activates and deactivates the pump until the sample reservoir is filled with a desired volume, as indicated by a reservoir sensor. At the beginning of each activation cycle, the controller can optionally purge an initial quantity of water prior to filling the sample reservoir. The reservoir can be substantially devoid of air and the pump is a low volumetric flow rate pump. Both the pump and the reservoir can be located either inside or outside the well.
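The fill/stop cycling described above is a simple hysteresis controller. The sketch below simulates that decision logic on a stream of level readings; all thresholds, units and names are hypothetical, and the optional purge step is omitted:

```python
HIGH, LOW = 2, 1   # hypothetical water-level sensor thresholds

def controller_step(level, pumping):
    """One control decision: start pumping at the high mark, stop at the low mark."""
    if not pumping and level >= HIGH:
        return True
    if pumping and level <= LOW:
        return False
    return pumping

def collect(levels, target):
    """Run the controller over level readings until the reservoir sensor
    reports the target volume; one volume unit is pumped per active step."""
    pumping, volume = False, 0
    for level in levels:
        pumping = controller_step(level, pumping)
        if pumping:
            volume += 1
        if volume >= target:   # reservoir sensor trips
            return volume
    return volume
```

The hysteresis band (start high, stop low) is what lets a slowly recharging well refill between pumping cycles instead of the pump chattering at a single threshold.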

  12. Sampling methods for monitoring changes in gonococcal populations.

    PubMed Central

    Bindayna, K. M.; Ison, C. A.

    1989-01-01

    A total of 160 consecutive isolates of Neisseria gonorrhoeae was collected over a 3-month period. They were tested for their susceptibility to penicillin, erythromycin and spectinomycin and the auxotype and the serotype determined. We have evaluated two sampling methods, the collection of every fifth isolate and the first 20 isolates (10 male and 10 female) each month, to determine whether either is representative of the total population. There was no significant difference between either method of sampling and the total for detecting the predominant auxotypes and serovars or the distributions in antibiotic susceptibility. It is possible to monitor major changes in a gonococcal population, particularly susceptibility to antibiotics, using a sample of the total population. PMID:2528473

  13. A New GP Recombination Method Using Random Tree Sampling

    NASA Astrophysics Data System (ADS)

    Tanji, Makoto; Iba, Hitoshi

We propose a new program evolution method named PORTS (Program Optimization by Random Tree Sampling), motivated by the idea of preservation and control of tree fragments in GP (Genetic Programming). We assume that, to recombine genetic materials efficiently, tree fragments of any size should be preserved into the next generation. PORTS samples tree fragments and concatenates them by traversing and transitioning between promising trees, instead of using subtree crossover and mutation. Because the size of a fragment preserved during a generation update follows a geometric distribution, merits of the method are that it is relatively easy to predict the behavior of tree fragments over time and to control the sampling size by changing a single parameter. From experimental results on the RoyalTree, Symbolic Regression and 6-Multiplexer problems, we observed that the performance of PORTS is competitive with Simple GP. Furthermore, the average node size of optimal solutions obtained by PORTS was smaller than Simple GP's result.

  14. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method for the clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, it would be more important to find the links between samples than the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method used a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic Application (VBA) modules were developed to handle these data processing steps. This module collected the results from the data into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
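The similarity assessment reduces to a Pearson correlation over the aligned impurity-peak responses of two samples. A minimal sketch with hypothetical normalized peak areas (the r > 0.99 same-case figure comes from the abstract):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two impurity-peak profiles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical normalized peak areas for two seizures of common origin,
# after retention-time alignment and response correction.
batch_a = [0.12, 0.40, 0.05, 0.91, 0.33]
batch_b = [0.11, 0.42, 0.05, 0.89, 0.35]
r = pearson(batch_a, batch_b)
print(f"r = {r:.4f}")
```

The retention-time-shift and response-deviation corrections described above matter precisely because Pearson correlation assumes the peak vectors are aligned element-by-element.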

  15. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method for the clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, it would be more important to find the links between samples than the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method used a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic Application (VBA) modules were developed to handle these data processing steps. This module collected the results from the data into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.

  16. A General Linear Method for Equating with Small Samples

    ERIC Educational Resources Information Center

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…
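    The trade-off the abstract describes (stronger assumptions and fewer estimated parameters giving less small-sample error) can be seen by comparing classical mean equating with linear equating. The sketch below illustrates those two standard equating functions generically; it is not the article's general linear method.

```python
import statistics

def linear_equate(x, scores_x, scores_y):
    """Linear observed-score equating: match standardized positions,
    l(x) = mu_y + (sigma_y / sigma_x) * (x - mu_x).
    Estimates four parameters (two means, two SDs)."""
    mu_x, mu_y = statistics.mean(scores_x), statistics.mean(scores_y)
    sd_x, sd_y = statistics.pstdev(scores_x), statistics.pstdev(scores_y)
    return mu_y + (sd_y / sd_x) * (x - mu_x)

def mean_equate(x, scores_x, scores_y):
    """Mean equating: assumes equal SDs, so only the two means are
    estimated -- fewer estimates, hence less sampling error when n is small."""
    return statistics.mean(scores_y) + (x - statistics.mean(scores_x))

form_x = [10, 12, 14, 16, 18]   # hypothetical scores on form X
form_y = [11, 14, 17, 20, 23]   # hypothetical scores on form Y
print(linear_equate(14, form_x, form_y))  # 17.0 (mean maps to mean)
print(mean_equate(14, form_x, form_y))    # 17.0
```

    Away from the mean the two functions diverge whenever the forms' SDs differ, which is exactly where the assumed-versus-estimated choice matters.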

  17. Periodicity detection method for small-sample time series datasets.

    PubMed

    Tominaga, Daisuke

    2010-01-01

    Time series of gene expression often exhibit periodic behavior under the influence of multiple signal pathways, and are represented by a model that incorporates multiple harmonics and noise. Most of these data, which are observed using DNA microarrays, consist of few sampling points in time, but most periodicity detection methods require a relatively large number of sampling points. We have previously developed a detection algorithm based on the discrete Fourier transform and Akaike's information criterion. Here we demonstrate the performance of the algorithm for small-sample time series data through a comparison with conventional and newly proposed periodicity detection methods based on a statistical analysis of the power of harmonics. We show that this method has higher sensitivity for data consisting of multiple harmonics, and is more robust against noise than other methods. Although "combinatorial explosion" occurs for large datasets, the computational time is not a problem for small-sample datasets. The MATLAB/GNU Octave script of the algorithm is available on the author's web site: http://www.cbrc.jp/%7Etominaga/piccolo/. PMID:21151841
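    The combination the abstract refers to, a discrete Fourier basis searched exhaustively with Akaike's information criterion selecting the harmonic subset, can be sketched as follows. This is a simplified reconstruction of the general idea, not the published piccolo script; the exhaustive subset search is also the source of the "combinatorial explosion" mentioned above.

```python
import numpy as np
from itertools import combinations

def detect_periodicity(y, max_harmonics=2):
    """Select the Fourier-harmonic subset minimizing AIC = n*ln(RSS/n) + 2p,
    where p counts fitted coefficients. Returns the chosen frequencies;
    an empty tuple means no periodicity is supported by the data."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    t = np.arange(n)
    freqs = range(1, n // 2 + 1)
    best_subset, best_aic = None, np.inf
    # Exhaustively evaluate every subset of Fourier frequencies up to
    # max_harmonics -- feasible only for small-sample series.
    for k in range(0, max_harmonics + 1):
        for subset in combinations(freqs, k):
            cols = [np.ones(n)]
            for f in subset:
                cols.append(np.cos(2 * np.pi * f * t / n))
                cols.append(np.sin(2 * np.pi * f * t / n))
            X = np.column_stack(cols)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = float(np.sum((y - X @ beta) ** 2))
            aic = n * np.log(max(rss, 1e-12) / n) + 2 * X.shape[1]
            if aic < best_aic:
                best_subset, best_aic = subset, aic
    return best_subset

# A noiseless 12-point series with one full cycle: frequency 1 should win.
y = np.sin(2 * np.pi * np.arange(12) / 12)
print(detect_periodicity(y))
```

    The AIC penalty (2 per coefficient) is what stops the search from always preferring larger harmonic sets.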

  18. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  19. Comparison of aquatic macroinvertebrate samples collected using different field methods

    USGS Publications Warehouse

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

    Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey-National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine if the four different field methods that were used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  20. A proficiency test system to improve performance of milk analysis methods and produce reference values for component calibration samples for infrared milk analysis.

    PubMed

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M

    2016-08-01

    Our goal was to determine the feasibility of combining proficiency testing, analytical method quality-assurance system, and production of reference samples for calibration of infrared milk analyzers to achieve a more efficient use of resources and reduce costs while maximizing analytical accuracy within and among milk payment-testing laboratories. To achieve this, we developed and demonstrated a multilaboratory combined proficiency testing and analytical method quality-assurance system as an approach to evaluate and improve the analytical performance of methods. A set of modified milks was developed and optimized to serve multiple purposes (i.e., proficiency testing, quality-assurance and method improvement, and to provide reference materials for calibration of secondary testing methods). Over a period of years, the approach has enabled the group of laboratories to document improved analytical performance (i.e., reduced within- and between-laboratory variation) of chemical reference methods used as the primary reference for calibration of high-speed electronic milk-testing equipment. An annual meeting of the laboratory technicians allows for review of results and discussion of each method and provides a forum for communication of experience and techniques that are of value to new analysts in the group. The monthly proficiency testing sample exchanges have the added benefit of producing all-laboratory mean reference values for a set of 14 milks that can be used for calibration, evaluation, and troubleshooting of calibration adjustment issues on infrared milk analyzers.
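    The quantities the exchange tracks, the all-laboratory mean used as a reference value plus within- and between-laboratory variation, can be computed directly from the exchanged results. The sketch below uses hypothetical fat values and a deliberately simple variance split; it is illustrative, not the group's actual statistical protocol.

```python
import statistics

def consensus_stats(results_by_lab):
    """results_by_lab: {lab_name: [replicate results for one exchange milk]}.
    Returns (all-laboratory mean, within-lab SD, between-lab SD)."""
    lab_means = {lab: statistics.mean(v) for lab, v in results_by_lab.items()}
    grand_mean = statistics.mean(lab_means.values())
    # Within-lab: average spread of each lab's replicates.
    within = statistics.mean(statistics.pstdev(v) for v in results_by_lab.values())
    # Between-lab: spread of the lab means around the consensus value.
    between = statistics.stdev(lab_means.values())
    return grand_mean, within, between

# Hypothetical duplicate fat results (g/100 g) for one proficiency milk:
labs = {"A": [3.51, 3.49], "B": [3.55, 3.57], "C": [3.50, 3.52]}
print(consensus_stats(labs))
```

    Tracking these two SDs over monthly exchanges is what lets a group document reduced within- and between-laboratory variation over time.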

  2. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
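    The idea of prescribing per-category sample densities while keeping the estimate unbiased can be sketched with a toy two-category version (the paper uses eight). The category probabilities, sampling fractions, and process-rate functions below are invented for illustration; reweighting each point by p/q is what preserves the grid-box mean.

```python
import random

def categorized_importance_mean(categories, n_samples, rng=random):
    """Estimate a grid-box-mean process rate E[f(X)].
    categories: list of dicts with
      'p'    -- probability of the category under the subgrid PDF,
      'q'    -- prescribed fraction of sample points drawn from it,
      'draw' -- function returning one sample point within the category,
      'f'    -- process rate evaluated at a sample point.
    Weighting each point by p/q keeps the estimator unbiased:
    sum_c q_c * (p_c/q_c) * E_c[f] = sum_c p_c * E_c[f]."""
    total = 0.0
    for _ in range(n_samples):
        u, acc = rng.random(), 0.0
        for c in categories:          # pick a category with probability q
            acc += c['q']
            if u <= acc:
                total += (c['p'] / c['q']) * c['f'](c['draw']())
                break
    return total / n_samples

# Toy example: the "raining" 20% of the grid box gets 80% of the points.
rng = random.Random(0)
cats = [
    {'p': 0.2, 'q': 0.8, 'draw': lambda: rng.uniform(0, 1), 'f': lambda x: 10 * x},
    {'p': 0.8, 'q': 0.2, 'draw': lambda: rng.uniform(0, 1), 'f': lambda x: 0.0},
]
est = categorized_importance_mean(cats, 10000, rng)
print(est)  # should be near 0.2 * E[10x] = 1.0
```

    Oversampling the category where the process rate actually varies is what cuts the sampling error, mirroring the paper's result for the rain-evaporation region.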

  3. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, analyze samples of interest for geomicrobiology, and support the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices following manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials for APFIM. Femtosecond laser micromachining is also suitable for preparing posts. With the dicing saw, the FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated.

  4. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  5. A method for sampling microbial aerosols using high altitude balloons.

    PubMed

    Bryan, N C; Stewart, M; Granger, D; Guzik, T G; Christner, B C

    2014-12-01

    Owing to the challenges posed to microbial aerosol sampling at high altitudes, very little is known about the abundance, diversity, and extent of microbial taxa in the Earth-atmosphere system. To directly address this knowledge gap, we designed, constructed, and tested a system that passively samples aerosols during ascent through the atmosphere while tethered to a helium-filled latex sounding balloon. The sampling payload is ~2.7 kg and comprises an electronics box and three sampling chambers (one serving as a procedural control). Each chamber is sealed with retractable doors that can be commanded to open and close at designated altitudes. The payload is deployed together with radio beacons that transmit GPS coordinates (latitude, longitude and altitude) in real time for tracking and recovery. A cut mechanism separates the payload string from the balloon at any desired altitude, returning all equipment safely to the ground on a parachute. When the chambers are opened, aerosol sampling is performed using the Rotorod® collection method (40 rods per chamber), with each rod passing through 0.035 m3 per km of altitude sampled. Based on quality control measurements, the collection of ~100 cells rod(-1) provided a 3-sigma confidence level of detection. The payload system described can be mated with any type of balloon platform and provides a tool for characterizing the vertical distribution of microorganisms in the troposphere and stratosphere. PMID:25455021
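    The stated figures imply a minimum detectable concentration per rod; the arithmetic below is a back-of-envelope illustration derived from the abstract's numbers, not a calculation from the paper itself.

```python
def min_detectable_concentration(km_sampled, volume_per_km=0.035, cells_for_3sigma=100):
    """Cells per cubic metre needed on one rod to reach the ~100-cell,
    3-sigma detection level, given the altitude range sampled (km)."""
    swept_volume_m3 = volume_per_km * km_sampled  # each rod sweeps 0.035 m^3 per km
    return cells_for_3sigma / swept_volume_m3

# A chamber held open over a 20 km ascent sweeps 0.7 m^3 per rod,
# so ~100 cells corresponds to roughly 143 cells per cubic metre.
print(round(min_detectable_concentration(20)))  # 143
```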

  6. Genomic DNA microextraction: a method to screen numerous samples.

    PubMed

    Ramírez-Solis, R; Rivera-Pérez, J; Wallace, J D; Wims, M; Zheng, H; Bradley, A

    1992-03-01

    Many experimental designs require the analysis of genomic DNA from a large number of samples. Although the polymerase chain reaction (PCR) can be used, the Southern blot is preferred for many assays because of its inherent reliability. The rapid acceptance of PCR, despite a significant rate of false positive/negative results, is partly due to the disadvantages of the sample preparation process for Southern blot analysis. We have devised a rapid protocol to extract high-molecular-weight genomic DNA from a large number of samples. It involves the use of a single 96-well tissue culture dish to carry out all the steps of the sample preparation. This, coupled with the use of a multichannel pipette, facilitates the simultaneous analysis of multiple samples. The procedure may be automated since no centrifugation, mixing, or transferring of the samples is necessary. The method has been used to screen embryonic stem cell clones for the presence of targeted mutations at the Hox-2.6 locus and to obtain data from human blood.

  7. Comparison of several analytical methods for the determination of tin in geochemical samples as a function of tin speciation

    USGS Publications Warehouse

    Kane, J.S.; Evans, J.R.; Jackson, J.C.

    1989-01-01

    Accurate and precise determinations of tin in geological materials are needed for fundamental studies of tin geochemistry, and for tin prospecting purposes. Achieving the required accuracy is difficult because of the different matrices in which Sn can occur (i.e. sulfides, silicates and cassiterite), and because of the variability of literature values for Sn concentrations in geochemical reference materials. We have evaluated three methods for the analysis of samples for Sn concentration: graphite furnace atomic absorption spectrometry (HGA-AAS) following iodide extraction, inductively coupled plasma atomic emission spectrometry (ICP-OES), and energy-dispersive X-ray fluorescence (EDXRF) spectrometry. Two of these methods (HGA-AAS and ICP-OES) required sample decomposition either by acid digestion or fusion, while the third (EDXRF) was performed directly on the powdered sample. Analytical details of all three methods, their potential errors, and the steps necessary to correct these errors were investigated. Results showed that similar accuracy was achieved from all methods for unmineralized samples, which contain no known Sn-bearing phase. For mineralized samples, which contain Sn-bearing minerals, either cassiterite or stannous sulfides, only EDXRF and fusion ICP-OES methods provided acceptable accuracy. This summary of our study provides information which helps to assure correct interpretation of data bases for underlying geochemical processes, regardless of method of data collection and its inherent limitations. © 1989.

  8. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY SOIL SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.; Noyes, G.

    2009-11-09

    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides in soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinides in soil results were reported within 4-5 hours with excellent quality.

  9. Comparison of clinical samples and methods in chronic cutaneous leishmaniasis.

    PubMed

    Eroglu, Fadime; Uzun, Soner; Koltas, Ismail Soner

    2014-11-01

    This study aimed at finding out the most effective clinical samples and methods in chronic cutaneous leishmaniasis (CCL). Smear, aspiration fluid, and filter paper samples were taken from 104 skin lesions of suspected cases with CCL, and they were compared using microscopic examination, culture, and molecular methods. We characterized four different forms of CCL and identified the causative agents in CCL forms using high-resolution melting curve real-time polymerase chain reaction assay. We observed that smear was detected to be the most sensitive (63.5%) among clinical samples, and real-time polymerase chain reaction method was the most sensitive (96.8%) among the methods used in diagnosis of CCL. We identified 68.8% Leishmania tropica and 31.2% L. infantum in papular lesions, 69.2% L. infantum and 30.8% L. tropica in nodular lesions, 57.9% L. tropica and 42.1% L. major in ulcerating plaque lesions, and 55.5% L. tropica and 44.5% L. major in noduloulcerative lesions in CCL patients.

  10. Spanish Multicenter Normative Studies (NEURONORMA Project): methods and sample characteristics.

    PubMed

    Peña-Casanova, Jordi; Blesa, Rafael; Aguilar, Miquel; Gramunt-Fombuena, Nina; Gómez-Ansón, Beatriz; Oliva, Rafael; Molinuevo, José Luis; Robles, Alfredo; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Alfonso, Verónica; Sol, Josep M

    2009-06-01

    This paper describes the methods and sample characteristics of a series of Spanish normative studies (The NEURONORMA project). The primary objective of our research was to collect normative and psychometric information on a sample of people aged over 49 years. The normative information was based on a series of selected, but commonly used, neuropsychological tests covering attention, language, visuo-perceptual abilities, constructional tasks, memory, and executive functions. A sample of 356 community dwelling individuals was studied. Demographics, socio-cultural, and medical data were collected. Cognitive normality was validated via informants and a cognitive screening test. Norms were calculated for midpoint age groups. Effects of age, education, and sex were determined. The use of these norms should improve neuropsychological diagnostic accuracy in older Spanish subjects. These data may also be of considerable use for comparisons with other normative studies. Limitations of these normative data are also commented on.

  11. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and achieving a degree of excellence and refinement in a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  12. Evaluation of sample preservation methods for poultry manure.

    PubMed

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P < 0.0001) of time for refrigeration was found on uric acid nitrogen and ammonia nitrogen. In experiment 2, the total Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen analyses.

  14. WHO informal consultation on the application of molecular methods to assure the quality, safety and efficacy of vaccines, Geneva, Switzerland, 7-8 April 2005.

    PubMed

    Shin, Jinho; Wood, David; Robertson, James; Minor, Philip; Peden, Keith

    2007-03-01

    In April 2005, the World Health Organization convened an informal consultation on molecular methods to assure the quality, safety and efficacy of vaccines. The consultation was attended by experts from national regulatory authorities, vaccine industry and academia. Crosscutting issues on the application of molecular methods for a number of vaccines that are currently in use or under development were presented, and specific methods for further collaborative studies were discussed and identified. The main points of recommendation from meeting participants were fourfold: (i) that molecular methods should be encouraged; (ii) that collaborative studies are needed for many methods/applications; (iii) that basic science should be promoted; and (iv) that investment for training, equipment and facilities should be encouraged.

  15. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  16. A direct method for e-cigarette aerosol sample collection.

    PubMed

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need of intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53 mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of the likely exact substance that is being delivered to the lungs.

  18. Methods for the analysis of carpet samples for asbestos

    SciTech Connect

    Millette, J.R.; Clark, P.J.; Brackett, K.A.; Wheeles, R.K.

    1993-01-01

    Because of the nature of carpet pile, no samples can be directly prepared from carpet for analysis by transmission electron microscopy (TEM). Two indirect methods are currently used by laboratories when preparing samples for measuring the amount of asbestos present in carpet material. One is an ultrasonic shaking technique which requires that a portion of the carpet be cut and sent to the laboratory. The other is a micro-vacuuming technique which has been used generally in the assessment of asbestos in settled dust in buildings. It is not destructive to the carpet. Both methods utilize TEM to identify, measure and count the asbestos fibers found. Each can provide important but different information when an assessment of the level of contamination of carpeting is being made.

  19. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  20. Sampling and analytical methods of stable isotopes and dissolved inorganic carbon from CO2 injection sites

    NASA Astrophysics Data System (ADS)

    van Geldern, Robert; Myrttinen, Anssi; Becker, Veith; Barth, Johannes A. C.

    2010-05-01

    enough to cover the sample concentrations. In order to assure methodological reproducibility, this 'calibration set' should be included in every sequence analysed with the Gasbench CF-IRMS system. The standards, therefore, should also be treated in the same way as the samples. For accurate determination, it is essential to know the exact amount of water in the vial and the density of the sample. This requires weighing of each vial before and after injection of the water sample. For stable isotope analysis, the required signal height can be adjusted by the sample amount. Therefore this method is suitable for analysing samples with highly differing DIC concentrations. Reproducibility and accuracy of the quantitative analysis of the dissolved inorganic carbon need to be verified by independent control standards, treated as samples. This study was conducted as a part of the R&D programme CLEAN, which is funded by the German Federal Ministry of Education in the framework of the programme GEOTECHNOLOGIEN. We would like to thank GDF SUEZ for permitting us to conduct sampling campaigns at their site.

  1. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  2. Chemicals of emerging concern in water and bottom sediment in the Great Lakes Basin, 2012: collection methods, analytical methods, quality assurance, and study data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Hansen, Donald S.; Foreman, William T.; Furlong, Edward T.; Jorgenson, Zachary G.; Choy, Steven J.; Moore, Jeremy N.; Banda, JoAnn; Gefell, Daniel J.

    2015-01-01

    During this study, 53 environmental samples, 4 field duplicate samples, and 8 field spike samples of bottom sediment and laboratory matrix-spike samples were analyzed for a wide variety of CECs at the USGS National Water Quality Laboratory using laboratory schedule 5433 for wastewater indicators; research method 6434 for steroid hormones, sterols, and bisphenol A; and research method 9008 for human-use pharmaceuticals and antidepressants. Forty of the 57 chemicals analyzed using laboratory schedule 5433 had detectable concentrations ranging from 1 to 49,000 micrograms per kilogram. Fourteen of the 20 chemicals analyzed using research method 6434 had detectable concentrations ranging from 0.04 to 24,940 nanograms per gram. Ten of the 20 chemicals analyzed using research method 9008 had detectable concentrations ranging from 0.59 to 197.5 micrograms per kilogram. Five of the 11 chemicals analyzed using research method 9008 had detectable concentrations ranging from 1.16 to 25.0 micrograms per kilogram.

  3. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  4. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less well known. Unlike the investigation of flow processes, sediment-related research is scarce, partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. For the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at expert forums, that the river's sediment balance has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development along the river changed and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  6. Measurement of radon potential from soil using a special method of sampling

    NASA Astrophysics Data System (ADS)

    Cosma, Constantin; Papp, Botond; Moldovan, Mircea; Cosma, Victor; Cindea, Ciprian; Suciu, Liviu; Apostu, Adelina

    2010-10-01

    Soil radon gas and/or its exhalation rate are used as indicators in several applications, such as uranium exploration, indoor radon concentration, seismic activity, location of subsurface faults, etc., and also in studies whose main interest is the field verification of radon transport models. This work proposes a versatile method for soil radon sampling using a special manner of pumping. The soil gas is passed through a column of charcoal by passive pumping: a plastic bottle filled with water is coupled to an activated charcoal column, and the flow of water through an adjustable hole made at the bottom of the bottle assures a controlled gas flow from the soil. The results obtained for the activity of the activated charcoal are in the range of 20-40 kBq/m3 for a depth of approximately 0.8 m. The results obtained by this method were confirmed by simultaneous measurements using the LUK 3C device for soil radon measurements. Possible applications for estimating soil radon potential are discussed.
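
    The passive pumping principle above lends itself to a quick back-of-the-envelope check: the gas flow drawn through the charcoal column equals the water outflow through the hole. A minimal sketch, assuming Torricelli outflow; the hole diameter, water head, and discharge coefficient are illustrative assumptions, not values from the paper:

```python
from math import pi, sqrt

G = 9.81  # gravitational acceleration, m/s^2

def sampling_flow_lpm(hole_diameter_mm, head_m, discharge_coeff=0.6):
    """Gas flow through the charcoal column, equal to the water outflow
    through the hole at the bottom of the bottle (Torricelli outflow,
    with a typical sharp-orifice discharge coefficient of ~0.6)."""
    area = pi * (hole_diameter_mm / 2000.0) ** 2      # hole area, m^2
    q_m3s = discharge_coeff * area * sqrt(2 * G * head_m)
    return q_m3s * 60_000.0                           # m^3/s -> L/min

# Hypothetical 1 mm hole under 0.2 m of water head:
print(f"{sampling_flow_lpm(1.0, 0.2):.3f} L/min")
```

    Because the head drops as the bottle drains, the flow decreases slowly over the sampling period; an adjustable hole, as the authors describe, lets the operator tune the initial rate.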

  7. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    SciTech Connect

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides-and-strontium-in-air-filters method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified ⁹⁰Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide and ⁹⁰Sr air filter results were reported in ~4 hours with excellent quality.

  8. Method for microRNA isolation from clinical serum samples.

    PubMed

    Li, Yu; Kowdley, Kris V

    2012-12-01

    MicroRNAs are a group of intracellular noncoding RNA molecules that have been implicated in a variety of human diseases. Because of their high stability in blood, microRNAs released into circulation could potentially be utilized as noninvasive biomarkers for diagnosis or prognosis. Current microRNA isolation protocols are specifically designed for solid tissues and are impractical for biomarker development utilizing small-volume serum samples on a large scale. Thus, a protocol for microRNA isolation from serum is needed to accommodate these conditions in biomarker development. To establish such a protocol, we developed a simplified approach to normalize sample input by using a single synthetic spike-in microRNA. We evaluated three commonly used commercial microRNA isolation kits for the best performance by comparing RNA quality and yield. The manufacturer's protocol was further modified to improve the microRNA yield from 200 μl of human serum. MicroRNAs isolated from a large set of clinical serum samples were tested on the miRCURY LNA real-time PCR panel and confirmed to be suitable for high-throughput microRNA profiling. In conclusion, we have established a proven method for microRNA isolation from clinical serum samples suitable for microRNA biomarker development.
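
    Normalizing sample input to a single synthetic spike-in, as described above, is typically carried through to downstream qPCR with delta-Ct arithmetic. A minimal sketch of that standard calculation; the function name and the perfect-doubling efficiency default are illustrative assumptions, not the authors' exact pipeline:

```python
def normalize_to_spike_in(ct_target, ct_spike, efficiency=2.0):
    """Relative abundance of a target microRNA, normalized to a
    synthetic spike-in added in a fixed amount per sample:
    efficiency ** -(Ct_target - Ct_spike)."""
    return efficiency ** -(ct_target - ct_spike)

# A target crossing threshold 5 cycles after the spike-in is 1/32 as
# abundant, assuming perfect doubling per cycle:
print(normalize_to_spike_in(30.0, 25.0))  # -> 0.03125
```

    Because every sample receives the same spike-in amount, this ratio cancels out differences in serum input volume and extraction yield between samples.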

  9. Hand held sample tube manipulator, system and method

    DOEpatents

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH

    2001-01-01

    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and an outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, and the inlet end of the plunger mechanism is adapted for connection to a gas supply; a first seal disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  10. Intervene before leaving: clustered lot quality assurance sampling to monitor vaccination coverage at health district level before the end of a yellow fever and measles vaccination campaign in Sierra Leone in 2009

    PubMed Central

    2012-01-01

    Background In November 2009, Sierra Leone conducted a preventive yellow fever (YF) vaccination campaign targeting individuals aged nine months and older in six health districts. The campaign was integrated with a measles follow-up campaign throughout the country targeting children aged 9–59 months. For both campaigns, the operational objective was to reach 95% of the target population. During the campaign, we used clustered lot quality assurance sampling (C-LQAS) to identify areas of low coverage and recommend timely mop-up actions. Methods We divided the country into 20 non-overlapping lots. Twelve lots were targeted by both vaccinations, while eight were targeted by measles only. In each lot, five clusters of ten eligible individuals were selected for each vaccine. The upper threshold (UT) was set at 90% and the lower threshold (LT) at 75%. A lot was rejected for low vaccination coverage if more than 7 unvaccinated individuals (not presenting a vaccination card) were found. After the campaign, we plotted the C-LQAS results against the post-campaign coverage estimates to assess whether early interventions were successful enough to increase coverage in the lots that were at the level of rejection before the end of the campaign. Results During the last two days of the campaign, based on card-confirmed vaccination status, five lots out of 20 (25.0%) failed for low measles vaccination coverage and three lots out of 12 (25.0%) for low YF coverage. In one district, the estimated post-campaign vaccination coverage for both vaccines was still not significantly above the minimum acceptable level (LT = 75%), even after vaccination mop-up activities. Conclusion C-LQAS during the vaccination campaign was informative for identifying areas requiring mop-up activities to reach the coverage target prior to leaving the region. The only district where mop-up activities seemed to be unsuccessful might have had logistical difficulties that should be further investigated and resolved. PMID:22676225
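
    The decision rule above (reject a lot if more than 7 of the 5 × 10 = 50 sampled individuals are unvaccinated) has an operating characteristic that can be checked directly with the binomial distribution. A minimal sketch, ignoring the clustered design (clustering widens the error probabilities relative to this simple calculation):

```python
from math import comb

def prob_reject(coverage, n=50, d=7):
    """Probability that a lot with the given true vaccination coverage
    is rejected under the rule 'reject if more than d of n sampled
    individuals are unvaccinated' (simple binomial sampling)."""
    p_unvac = 1.0 - coverage
    # P(reject) = 1 - P(at most d unvaccinated among n)
    p_accept = sum(comb(n, k) * p_unvac**k * (1 - p_unvac)**(n - k)
                   for k in range(d + 1))
    return 1.0 - p_accept

# Lots at the lower threshold (75% coverage) should usually be
# rejected; lots at the upper threshold (90%) should usually pass.
print(f"reject at 75% coverage: {prob_reject(0.75):.2f}")
print(f"reject at 90% coverage: {prob_reject(0.90):.2f}")
```

    Scanning `prob_reject` over a range of coverage values traces the operating characteristic curve and shows how the thresholds at 75% and 90% bound the misclassification risk.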

  11. A time domain sampling method for inverse acoustic scattering problems

    NASA Astrophysics Data System (ADS)

    Guo, Yukun; Hömberg, Dietmar; Hu, Guanghui; Li, Jingzhi; Liu, Hongyu

    2016-06-01

    This work concerns the inverse scattering problems of imaging unknown/inaccessible scatterers by transient acoustic near-field measurements. Based on the analysis of the migration method, we propose efficient and effective sampling schemes for imaging small and extended scatterers from knowledge of time-dependent scattered data due to incident impulsive point sources. Though the inverse scattering problems are known to be nonlinear and ill-posed, the proposed imaging algorithms are totally "direct" involving only integral calculations on the measurement surface. Theoretical justifications are presented and numerical experiments are conducted to demonstrate the effectiveness and robustness of our methods. In particular, the proposed static imaging functionals enhance the performance of the total focusing method (TFM) and the dynamic imaging functionals show analogous behavior to the time reversal inversion but without solving time-dependent wave equations.

  12. A Novel Method for Sampling Alpha-Helical Protein Backbones

    DOE R&D Accomplishments Database

    Fain, Boris; Levitt, Michael

    2001-01-01

    We present a novel technique for sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition, we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%–82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.
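
    The closeness-to-native figures quoted above (structures within 6 Å) are conventionally measured as coordinate RMSD after optimal superposition. A sketch of that generic metric using the standard Kabsch algorithm, assuming matched coordinate sets (e.g. C-alpha atoms) as n × 3 NumPy arrays; this illustrates the measure, not the authors' exact evaluation code:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two conformations (n x 3 coordinate arrays) after
    optimal superposition via the Kabsch algorithm."""
    P = P - P.mean(axis=0)                 # remove translation
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)      # covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])             # guard against reflections
    R = U @ D @ Vt                         # optimal rotation
    diff = P @ R - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))
```

    The sign correction on the smallest singular direction prevents an improper rotation (a reflection) from being returned when the point sets are near-degenerate.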

  13. Field evaluation of endotoxin air sampling assay methods.

    PubMed

    Thorne, P S; Reynolds, S J; Milton, D K; Bloebaum, P D; Zhang, X; Whitten, P; Burmeister, L F

    1997-11-01

    This study tested the importance of filter media, extraction and assay protocol, and bioaerosol source on the determination of endotoxin under field conditions in swine and poultry confinement buildings. Multiple simultaneous air samples were collected using glass fiber (GF) and polycarbonate (PC) filters, and these were assayed using two methods in two separate laboratories: an endpoint chromogenic Limulus amebocyte lysate (LAL) assay (QCL) performed in water, and a kinetic chromogenic LAL assay (KQCL) performed in buffer with resistant parallel-line estimation analysis (KLARE). In addition, two aqueous filter extraction methods were compared in the QCL assay: 120 min extraction at 22 degrees C with vigorous shaking, and 30 min extraction at 68 degrees C with gentle rocking. These extraction methods yielded endotoxin activities that were not significantly different and were very highly correlated. Reproducibility of endotoxin determinations from duplicate air sampling filters was very high (Cronbach alpha all > 0.94). When analyzed by the QCL method, GF filters yielded significantly higher endotoxin activity than PC filters. QCL and KLARE methods gave similar estimates for endotoxin activity from PC filters; however, GF filters analyzed by the QCL method yielded significantly higher endotoxin activity estimates, suggesting enhancement of the QCL assay or inhibition of the KLARE assay with GF filters. Correlation between QCL-GF and QCL-PC was high (r = 0.98), while that between KLARE-GF and KLARE-PC was moderate (r = 0.68). Analysis of variance demonstrated that assay methodology, filter type, barn type, and interactions between assay and filter type and between assay and barn type were important factors influencing endotoxin exposure assessment.
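
    The duplicate-filter reproducibility reported above (Cronbach alpha > 0.94) can be computed by treating the two duplicate filters as parallel "items" across sampling locations. A minimal stdlib sketch; the endotoxin values shown are hypothetical, purely to illustrate the arithmetic:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for k parallel measurements; each element of
    `items` is the list of values one filter/item produced across the
    sampled locations."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]      # per-location sums
    item_var = sum(pvariance(vals) for vals in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical duplicate endotoxin activities at six locations:
dup_a = [120, 310, 95, 440, 260, 180]
dup_b = [130, 300, 90, 455, 250, 175]
print(f"alpha = {cronbach_alpha([dup_a, dup_b]):.3f}")
```

    With only two items, alpha is closely related to the Spearman-Brown-adjusted correlation between the duplicates; values near 1 indicate the duplicate filters rank and scale the locations almost identically.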

  14. Well fluid isolation and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.

    1995-01-01

    The present invention permits purging and/or sampling of a well while removing, at most, about 25% of the fluid volume removed by conventional methods and, at a minimum, removing no fluid from the well at all. The invention is an isolation assembly that is inserted into the well. The isolation assembly is designed so that only the volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well, over a fluid column height from the bottom of the well to the top of the active portion (the lower annulus), is removed. A seal may be positioned above the active portion, thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Purged well fluid is stored in a riser above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.
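
    The reduced purge volume follows directly from annulus geometry: only the ring of fluid between the assembly's outside diameter and the well's inside diameter, over the active height, is removed. A sketch of that calculation with hypothetical dimensions (none of these numbers come from the patent):

```python
from math import pi

def purge_volume_liters(well_id_mm, assembly_od_mm, column_height_m):
    """Fluid volume in the annulus between the well's inside diameter
    and the isolation assembly's outside diameter, over the height of
    the active (lower annulus) portion."""
    r_well = well_id_mm / 2000.0       # mm diameter -> m radius
    r_asm = assembly_od_mm / 2000.0
    volume_m3 = pi * (r_well**2 - r_asm**2) * column_height_m
    return volume_m3 * 1000.0          # m^3 -> liters

# Hypothetical 100 mm ID well with a 90 mm OD assembly over a 3 m column:
print(f"{purge_volume_liters(100, 90, 3.0):.2f} L")
```

    With these example dimensions the purge is roughly 4.5 L against about 23.6 L for the full water column over the same height, i.e. under 20% of the conventional volume, consistent with the "at most, about 25%" figure above.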

  16. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics. PMID:27497161

  17. Survey of predators and sampling method comparison in sweet corn.

    PubMed

    Musser, Fred R; Nyrop, Jan P; Shelton, Anthony M

    2004-02-01

    Natural predation is an important component of integrated pest management that is often overlooked because it is difficult to quantify and perceived to be unreliable. To begin incorporating natural predation into sweet corn, Zea mays L., pest management, a predator survey was conducted and then three sampling methods were compared for their ability to accurately monitor the most abundant predators. A predator survey on sweet corn foliage in New York between 1999 and 2001 identified 13 species. Orius insidiosus (Say), Coleomegilla maculata (De Geer), and Harmonia axyridis (Pallas) were the most numerous predators in all years. To determine the best method for sampling adult and immature stages of these predators, comparisons were made among nondestructive field counts, destructive counts, and yellow sticky cards. Field counts were correlated with destructive counts for all populations, but field counts of small insects were biased. Sticky cards underrepresented immature populations. Yellow sticky cards were more attractive to C. maculata adults than H. axyridis adults, especially before pollen shed, making coccinellid population estimates based on sticky cards unreliable. Field counts were the most precise method for monitoring adult and immature stages of the three major predators. Future research on predicting predation of pests in sweet corn should be based on field counts of predators because these counts are accurate, have no associated supply costs, and can be made quickly. PMID:14998137

  18. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    NASA Astrophysics Data System (ADS)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agriculturally applied chemicals through the vadose zone is a major cause of the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly structured silty clay loam (Hudson series) soil. The soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters, all installed 45 to 105 cm below the ground surface. A reasonable worst-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. The herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichlorophenoxyacetic acid) were chosen as model compounds. A chloride (KCl) tracer was used to determine the spatial and temporal distribution of non-reactive solute and water, as well as to provide a basis for determining the retardation of pesticide movement. Results show that the observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity-driven samplers when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns similar to those of the pan samplers. At the small plot scale, tile line samplers tended to underestimate solute concentrations because of water dilution around the samplers. Porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. The wick sampler had the least

  19. Quality Assurance Project Plan

    SciTech Connect

    Holland, R. C.

    1998-06-01

    This Quality Assurance Project Plan documents the quality assurance activities for the Wastewater/Stormwater/Groundwater and Environmental Surveillance Programs. This QAPP was prepared in accordance with DOE guidance on compliance with 10CFR830.120.

  20. Artifact free denuder method for sampling of carbonaceous aerosols

    NASA Astrophysics Data System (ADS)

    Mikuška, P.; Vecera, Z.; Broškovicová, A.

    2003-04-01

    Over the past decade, growing attention has been focused on carbonaceous aerosols. Although they may account for 30–60% of the total fine aerosol mass, their concentration and formation mechanisms are not well understood, particularly in comparison with the major fine-particle inorganic species. The deficiency in knowledge of carbonaceous aerosols results from their complexity and from problems associated with their collection. Conventional sampling techniques for carbonaceous aerosols, which utilize filters and backup adsorbents, suffer from sampling artifacts. Positive artifacts are mainly due to adsorption of gas-phase organic compounds by the filter material or by the already collected particles, whereas negative artifacts arise from the volatilization of already collected organic compounds from the filter. Furthermore, in the course of sampling, the composition of the collected organic compounds may be modified by oxidants (O3, NO2, PAN, peroxides) present in the air passing through the sampler. It is clear that a new, artifact-free method for sampling carbonaceous aerosols is needed. A combination of a diffusion denuder and a filter in series is very promising in this respect. The denuder is expected to collect gaseous oxidants and gas-phase organic compounds from the sample air stream prior to the collection of aerosol particles on filters, thus eliminating both positive and negative sampling artifacts for carbonaceous aerosols. This combination is the subject of the presentation. Several designs of diffusion denuders (cylindrical, annular, parallel plate, multi-channel) in combination with various types of wall coatings (dry, liquid) were examined. Special attention was given to preserving long-term collection efficiency. Different adsorbents (activated charcoal, molecular sieve, porous polymers) and sorbents coated with various chemical reagents (KI, Na2SO3, MnO2, ascorbic acid) or chromatographic stationary phases (silicon oils

  1. Drum plug piercing and sampling device and method

    DOEpatents

    Counts, Kevin T.

    2011-04-26

    An apparatus and method for piercing a drum plug of a drum in order to sample and/or vent gases that may accumulate in a space of the drum is provided. The drum is not damaged and can be reused since the pierced drum plug can be subsequently replaced. The apparatus includes a frame that is configured for engagement with the drum. A cylinder actuated by a fluid is mounted to the frame. A piercer is placed into communication with the cylinder so that actuation of the cylinder causes the piercer to move in a linear direction so that the piercer may puncture the drum plug of the drum.

  2. Methods of scaling threshold color difference using printed samples

    NASA Astrophysics Data System (ADS)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples on a semi-gloss paper substrate, with color differences near the perceptibility threshold, was prepared for scaling visual color difference and evaluating the performance of different methods. The probabilities of perceptibility were converted to Z-scores, and the different color-difference formulas were scaled against these Z-scores. The resulting visual color differences were checked with the STRESS factor. The results indicated that only the absolute scales changed; the relative scales between pairs in the data were preserved.
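The two computational steps this abstract describes can be sketched in Python: mapping perceptibility probabilities to Z-scores via the inverse normal CDF, and checking agreement between computed and visual differences with a STRESS index. The function names and the particular STRESS formulation below are assumptions for illustration, not taken from the paper.

```python
import math
from statistics import NormalDist


def z_scores(probabilities):
    """Map perceptibility probabilities (strictly between 0 and 1) to Z-scores."""
    nd = NormalDist()
    return [nd.inv_cdf(p) for p in probabilities]


def stress(dE, dV):
    """STRESS index (0-100) between computed differences dE and visual scales dV.

    One common formulation: a least-squares scaling factor F aligns dE to dV,
    and the residual is normalized by the magnitude of dV.
    """
    F = sum(e * v for e, v in zip(dE, dV)) / sum(e * e for e in dE)
    num = sum((F * e - v) ** 2 for e, v in zip(dE, dV))
    den = sum(v * v for v in dV)
    return 100.0 * math.sqrt(num / den)
```

STRESS = 0 when the two datasets are proportional, which is exactly the abstract's point that rescaling changes only absolute, not relative, scales.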

  3. Generalized Jones matrix method for homogeneous biaxial samples.

    PubMed

    Ortega-Quijano, Noé; Fade, Julien; Alouini, Mehdi

    2015-08-10

    The generalized Jones matrix (GJM) is a recently introduced tool for describing linear transformations of three-dimensional light fields. Based on this framework, a specific method for obtaining the GJM of uniaxial anisotropic media was recently presented. However, the GJM of biaxial media had not been tackled so far, as the previous method made use of a simplified rotation matrix that lacks a degree of freedom in the three-dimensional rotation and is therefore not suitable for calculating the GJM of biaxial media. In this work we propose a general method to derive the GJM of arbitrarily oriented homogeneous biaxial media. It is based on the differential generalized Jones matrix (dGJM), which is the three-dimensional counterpart of the conventional differential Jones matrix. We show that the dGJM provides a simple and elegant way to describe uniaxial and biaxial media, with the capacity to model multiple simultaneous optical effects. The practical usefulness of this method is illustrated by GJM modeling of the polarimetric properties of a negative uniaxial KDP crystal and a biaxial KTP crystal for any three-dimensional sample orientation. The results show that this method constitutes an advantageous and straightforward way to model biaxial media, which are of growing relevance for many applications.
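The differential-matrix idea the abstract builds on can be illustrated in the conventional 2×2 Jones calculus (the paper's GJM and dGJM are 3×3; this sketch shows only the shared principle that a homogeneous element's matrix is the exponential of its differential matrix). The retardance value and helper names are assumptions for illustration.

```python
import numpy as np


def expm_series(A, terms=30):
    """Matrix exponential via a truncated Taylor series (adequate for small matrices)."""
    out = np.eye(A.shape[0], dtype=complex)
    term = np.eye(A.shape[0], dtype=complex)
    for k in range(1, terms):
        term = term @ A / k
        out = out + term
    return out


# Differential Jones matrix of a homogeneous linear retarder with total
# retardance delta accumulated over the element: N = i*(delta/2)*diag(1, -1).
delta = np.pi / 2  # assumed quarter-wave retardance for the example
N = 1j * (delta / 2) * np.diag([1.0, -1.0]).astype(complex)

# Jones matrix of the whole homogeneous element: J = exp(N)
J = expm_series(N)
```

For this generator the result is the familiar diagonal retarder matrix diag(e^{iδ/2}, e^{-iδ/2}); the paper's contribution is carrying the same exponential relationship into the 3×3 case with full rotational freedom.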

  4. A microRNA isolation method from clinical samples

    PubMed Central

    Zununi Vahed, Sepideh; Barzegari, Abolfazl; Rahbar Saadat, Yalda; Mohammadi, Somayeh; Samadi, Nasser

    2016-01-01

    Introduction: microRNAs (miRNAs) are considered to be novel molecular biomarkers that could be exploited in the diagnosis and treatment of different diseases. The present study aimed to develop an efficient miRNA isolation method for different clinical specimens. Methods: Total RNAs were isolated with Trizol reagent, followed by precipitation of the large RNAs with potassium acetate (CH3COOK), polyethylene glycol (PEG) 4000 and 6000, and lithium chloride (LiCl). Then, small RNAs were enriched and recovered from the supernatants by applying a combination of LiCl and ethanol. The efficiency of the method was evaluated through the quality, quantity, and integrity of the recovered RNAs using the A260/280 absorbance ratio, reverse transcription PCR (RT-PCR), and quantitative real-time PCR (q-PCR). Results: Comparison of different RNA isolation methods based on the precipitation of DNA and large RNAs, high miRNA recovery, and PCR efficiency revealed that applying potassium acetate with final precipitation of small RNAs using 2.5 M LiCl plus ethanol can provide high-yield, high-quality small RNAs that can be exploited for clinical purposes. Conclusion: The current isolation method can be applied to most clinical samples, including cells, formalin-fixed and paraffin-embedded (FFPE) tissues, and even body fluids, with wide applicability in molecular biology investigations. PMID:27340621

  5. Eigenvector method for umbrella sampling enables error analysis

    NASA Astrophysics Data System (ADS)

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-08-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.
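The "combining window data as an eigenproblem" step can be caricatured as finding the stationary left eigenvector of a row-stochastic window-overlap matrix: the eigenvector components play the role of window weights. The matrix values and function below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np


def stationary_weights(F, iters=1000, tol=1e-12):
    """Left eigenvector of row-stochastic F with eigenvalue 1, by power iteration."""
    n = F.shape[0]
    z = np.full(n, 1.0 / n)  # start from uniform weights
    for _ in range(iters):
        z_new = z @ F
        z_new /= z_new.sum()  # keep the weights normalized
        if np.max(np.abs(z_new - z)) < tol:
            return z_new
        z = z_new
    return z


# Toy 3-window overlap matrix: rows sum to 1, neighboring windows overlap
# (the numbers are made up for illustration).
F = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.4, 0.6]])
z = stationary_weights(F)
```

Framing the combination step this way is what makes error analysis tractable: perturbations to the entries of F propagate to the eigenvector through standard linear-algebra sensitivity results.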

  6. Eigenvector method for umbrella sampling enables error analysis.

    PubMed

    Thiede, Erik H; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R

    2016-08-28

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912

  7. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, John F.

    1996-01-01

    Provided is a method for testing earth samples for contamination by organic contaminants, particularly aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars, and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic, and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants.
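The UV absorbance measurement described here rests on the Beer-Lambert relation. A minimal sketch (function names, units, and parameter values are assumptions, not from the patent):

```python
import math


def absorbance(incident, transmitted):
    """Beer-Lambert absorbance from measured intensities: A = log10(I0 / I)."""
    return math.log10(incident / transmitted)


def concentration(A, epsilon, path_cm=1.0):
    """Invert A = epsilon * c * l for concentration c.

    epsilon: molar absorptivity in L/(mol*cm); path_cm: cell path length in cm.
    """
    return A / (epsilon * path_cm)
```

In the patent's setting, absorbance at the chosen aromatic wavelength serves as a yes/no screening indicator; the drying step matters because co-extracted humic material would otherwise inflate A and give a false positive.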

  8. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, J.F.

    1996-10-01

    Provided is a method for testing earth samples for contamination by organic contaminants, particularly aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars, and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic, and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants. 2 figs.

  9. Multinational Quality Assurance

    ERIC Educational Resources Information Center

    Kinser, Kevin

    2011-01-01

    Multinational colleges and universities pose numerous challenges to the traditional models of quality assurance that are designed to validate domestic higher education. When institutions cross international borders, at least two quality assurance protocols are involved. To guard against fraud and abuse, quality assurance in the host country is…

  10. Quality assurance in the ambulatory care setting.

    PubMed

    Tyler, R D

    1989-01-01

    One of the most utilitarian developments in the field of quality assurance in health care has been the introduction of industrial concepts of quality management. These concepts, coupled with buyer demand for accountability, are bringing new perspectives to health care quality assurance. These perspectives provide a new view of quality assurance as a major responsibility and strategic opportunity for management; a competitive and marketable commodity; and a method of improving safety, effectiveness, and satisfaction with medical care.

  11. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition, and in accordance with the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g⁻¹ DW).

  12. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments under development that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid chromatography-mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds for natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e., salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  13. Riverland ERA cleanup sampling and analysis plan

    SciTech Connect

    Heiden, C.E.

    1993-07-01

    This report describes the Riverland Expedited Response Action taking place at the Hanford Reservation. Characterization of potential waste sites within the Riverland ERA boundaries was conducted in October and November 1992. This sampling and analysis plan contains two parts: The field sampling plan (Part 1) and the quality assurance project plan (Part 2). The field sampling plan describes the activities to be performed, defines sample designation, and identifies sample analysis to be performed. The quality assurance project plan establishes data quality objectives, defines analytical methods and procedures and documentation requirements, and provides established technical procedures to be used for field sampling and measurement. The quality assurance project plan details all quality assurance/quality control procedures to be followed to ensure that usable and defensible data are collected.

  14. Applicability Comparison of Methods for Acid Generation Assessment of Rock Samples

    NASA Astrophysics Data System (ADS)

    Oh, Chamteut; Ji, Sangwoo; Yim, Giljae; Cheong, Youngwook

    2014-05-01

    Minerals containing various forms of sulfur can generate AMD (Acid Mine Drainage) or ARD (Acid Rock Drainage), which can have serious effects on the ecosystem and even on humans when exposed to air and/or water. To minimize the hazards of acid drainage, it is necessary to assess in advance the acid generation possibility of rocks and estimate the amount of acid generation. Because of its relatively simple and effective experimental procedure, the method of combining the results of ABA (Acid Base Accounting) and NAG (Net Acid Generation) tests has been commonly used in determining acid drainage conditions. The simplicity and effectiveness of this method, however, derive from sweeping simplifications of the underlying chemical reactions, which often leads to samples being classified as UC (Uncertain); these would then require additional experimental or field data to be reclassified properly. This paper therefore attempts to find the reasons that cause samples to be classified as UC and to suggest a new series of experiments by which such samples can be reclassified appropriately. Study precedents on evaluating potential acid generation and neutralization capacity were reviewed, and as a result three individual experiments were selected in the light of applicability and compatibility, minimizing unnecessary interference among experiments. The proposed experiments include sulfur speciation, ABCC (Acid Buffering Characteristic Curve), and Modified NAG, which are improved versions of the existing Total S, ANC (Acid Neutralizing Capacity), and NAG experiments, respectively. To assure the applicability of the experiments, 36 samples from 19 sites with diverse geologies, field properties, and weathering conditions were collected. The samples were then subjected to the existing experiments, and as a result, 14 samples which either were classified as UC or could be used as a comparison group were selected. Afterwards, the selected samples were used to conduct the suggested
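The ABA/NAG screen the abstract refers to is commonly implemented with the net acid producing potential, NAPP = MPA − ANC, where MPA = 30.6 × total S%. The sketch below uses the commonly quoted classification thresholds; the function names and cutoff values are standard-practice assumptions, not taken from this paper.

```python
def napp(total_s_pct, anc):
    """Net acid producing potential (kg H2SO4/t): MPA - ANC, with MPA = 30.6 * %S."""
    return 30.6 * total_s_pct - anc


def classify(total_s_pct, anc, nag_ph):
    """ABA/NAG screen: classify a sample as acid forming, non-acid forming, or uncertain."""
    n = napp(total_s_pct, anc)
    if n > 0 and nag_ph < 4.5:
        return "PAF"  # potentially acid forming: both tests agree
    if n <= 0 and nag_ph >= 4.5:
        return "NAF"  # non-acid forming: both tests agree
    return "UC"       # uncertain: ABA and NAG disagree
```

The UC branch is exactly the disagreement case the paper targets: a sample lands there when the static acid-base balance and the oxidative NAG test point in opposite directions, which is why finer-grained tests (sulfur speciation, ABCC, Modified NAG) are needed to resolve it.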

  15. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  16. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  17. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  18. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  19. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  20. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... form and consistency of the waste materials to be sampled. Samples collected using the sampling... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  1. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... form and consistency of the waste materials to be sampled. Samples collected using the sampling... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  2. Method of remotely characterizing thermal properties of a sample

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Inventor); Heath, D. Michele (Inventor); Welch, Christopher (Inventor); Winfree, William P. (Inventor); Miller, William E. (Inventor)

    1992-01-01

    A sample in a wind tunnel is radiated from a thermal energy source outside of the wind tunnel. A thermal imager system, also located outside of the wind tunnel, reads surface radiations from the sample as a function of time. The produced thermal images are characteristic of the heat transferred from the sample to the flow across the sample. In turn, the measured rates of heat loss of the sample are characteristic of the flow and the sample.

  3. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
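The first study's simulation design — binary code profiles with noise, clustered and scored for recovery accuracy — can be sketched with a plain k-means. This is a minimal numpy implementation under assumed sizes and noise level, not the authors' actual simulation code.

```python
import numpy as np

rng = np.random.default_rng(0)


def kmeans(X, k, iters=50):
    """Plain k-means; with 0/1 data the centroids are code-endorsement proportions."""
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each profile to its nearest centroid (squared Euclidean distance)
        d = ((X[:, None, :] - centers[None]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels


# Simulate 50 participants x 20 binary codes from two prototype profiles + noise
proto = rng.integers(0, 2, size=(2, 20))
truth = rng.integers(0, 2, size=50)
X = proto[truth] ^ (rng.random((50, 20)) < 0.1)  # 10% bit-flip noise

labels = kmeans(X.astype(float), 2)
# Cluster labels are arbitrary, so score accuracy up to a label swap
acc = max(np.mean(labels == truth), np.mean(labels != truth))
```

With well-separated prototypes and modest noise, recovery stays high even at n = 50, which mirrors the paper's finding that accuracy does not collapse for small samples of binary coded data.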

  4. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  5. Markov chain Monte Carlo posterior sampling with the Hamiltonian method.

    SciTech Connect

    Hanson, Kenneth M.

    2001-01-01

    A major advantage of Bayesian data analysis is that it provides a characterization of the uncertainty in the model parameters estimated from a given set of measurements, in the form of a posterior probability distribution. When the analysis involves a complicated physical phenomenon, the posterior may not be available in analytic form, but only calculable by means of a simulation code. In such cases, the uncertainty in inferred model parameters requires characterization of a calculated functional. An appealing way to explore the posterior, and hence characterize the uncertainty, is to employ the Markov chain Monte Carlo (MCMC) technique. The goal of MCMC is to generate a random sequence of parameter samples x from a target pdf (probability density function), {pi}(x). In Bayesian analysis, this sequence corresponds to a set of model realizations that follow the posterior distribution. There are two basic MCMC techniques. In Gibbs sampling, typically one parameter at a time is drawn from its conditional pdf, holding all others fixed. In the Metropolis algorithm, all the parameters can be varied at once: the parameter vector is perturbed from the current sequence point by adding a trial step drawn randomly from a symmetric pdf, and the trial position is either accepted or rejected on the basis of the probability at the trial position relative to the current one. The Metropolis algorithm is often employed because of its simplicity. The aim of this work is to develop MCMC methods that are useful for large numbers of parameters n, say hundreds or more. In this regime the Metropolis algorithm can be unsuitable, because its efficiency drops as 0.3/n. The efficiency is defined as the reciprocal of the number of steps in the sequence needed to effectively provide a statistically independent sample from {pi}.
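The Metropolis algorithm as described — a symmetric random-walk proposal, accepted or rejected on the probability ratio — can be written in a few lines. The target density and step size below are assumptions for illustration; a real application would replace `logpi` with the log-posterior evaluated by the simulation code.

```python
import numpy as np

rng = np.random.default_rng(1)


def metropolis(logpi, x0, n_steps, step=0.5):
    """Random-walk Metropolis: Gaussian proposal, accept with prob min(1, pi'/pi)."""
    x = np.asarray(x0, dtype=float)
    lp = logpi(x)
    samples = np.empty((n_steps, x.size))
    accepted = 0
    for i in range(n_steps):
        x_try = x + step * rng.standard_normal(x.shape)  # symmetric proposal
        lp_try = logpi(x_try)
        if np.log(rng.random()) < lp_try - lp:  # accept on the probability ratio
            x, lp = x_try, lp_try
            accepted += 1
        samples[i] = x  # a rejected step repeats the current state
    return samples, accepted / n_steps


# Target pdf pi(x): standard normal in 2 dimensions (log form avoids underflow)
chain, acc_rate = metropolis(lambda x: -0.5 * np.sum(x * x), np.zeros(2), 5000)
```

The 0.3/n efficiency scaling quoted in the abstract is visible in this sketch: as the dimension of `x0` grows, the step size must shrink to keep acceptances, so more steps are needed per independent sample.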

  6. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree ), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal ) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree ) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID
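Under the equilibrium assumption, Cfree follows directly from the sampler uptake and the polymer-water partition coefficient. A minimal sketch (function name, units, and example numbers are assumptions, not from the paper):

```python
def cfree_equilibrium(n_analyte_ng, k_pw, v_polymer_mL):
    """Freely dissolved concentration (ng/mL) from an equilibrated passive sampler.

    Cfree = Cpolymer / Kpw, with Cpolymer = N / Vpolymer, where N is the analyte
    mass accumulated in the sampler and Kpw the polymer-water partition coefficient.
    """
    c_polymer = n_analyte_ng / v_polymer_mL
    return c_polymer / k_pw
```

Because Kpw for hydrophobic contaminants is large (often 10^4 to 10^6), a measurable mass in the polymer corresponds to a very low Cfree in the water, which is what makes PSMs sensitive to the bioavailable fraction rather than to Ctotal.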

  7. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  8. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and by considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitoring remedy effectiveness, and evaluating risk reduction after sediment treatment, disposal, or beneficial reuse following management actions. Examples are provided illustrating why PSMs and Cfree should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  9. Measurement of atmospheric mercury species with manual sampling and analysis methods in a case study in Indiana

    USGS Publications Warehouse

    Risch, M.R.; Prestbo, E.M.; Hawkins, L.

    2007-01-01

    Ground-level concentrations of three atmospheric mercury species were measured using manual sampling and analysis to provide data for estimates of mercury dry deposition. Three monitoring stations were operated simultaneously during winter, spring, and summer 2004, adjacent to three mercury wet-deposition monitoring stations in northern, central, and southern Indiana. The monitoring locations differed in land-use setting and annual mercury-emissions level from nearby sources. A timer-controlled air-sampling system that contained a three-part sampling train was used to isolate reactive gaseous mercury, particulate-bound mercury, and elemental mercury. The sampling trains were exchanged every 6 days, and the mercury species were quantified in a laboratory. A quality-assurance study indicated the sampling trains could be held at least 120 h without a significant change in reactive gaseous or particulate-bound mercury concentrations. The manual sampling method was able to provide valid mercury concentrations in 90 to 95% of samples. Statistical differences in mercury concentrations were observed during the project. Concentrations of reactive gaseous and elemental mercury were higher in the daytime samples than in the nighttime samples. Concentrations of reactive gaseous mercury were higher in winter than in summer and were highest at the urban monitoring location. The results of this case study indicated manual sampling and analysis could be a reliable method for measurement of atmospheric mercury species and has the capability for supplying representative concentrations in an effective manner from a long-term deposition-monitoring network. ?? 2007 Springer Science+Business Media B.V.

  10. Evaluation of field sampling and preservation methods for strontium-90 in ground water at the Idaho National Engineering Laboratory, Idaho

    USGS Publications Warehouse

    Cecil, L.D.; Knobel, L.L.; Wegner, S.J.; Moore, L.L.

    1989-01-01

    Water from four wells completed in the Snake River Plain aquifer was sampled as part of the U.S. Geological Survey's quality assurance program to evaluate the effect of filtration and preservation methods on strontium-90 concentrations in groundwater at the Idaho National Engineering Laboratory. Water from each well was filtered through either a 0.45-micrometer or a 0.1-micrometer membrane filter; unfiltered samples also were collected. Two sets of filtered and two sets of unfiltered samples were collected from each well; one set of each pair was preserved in the field with reagent-grade hydrochloric acid and the other set was not acidified. For water from wells with strontium-90 concentrations at or above the reporting level, 94% or more of the strontium-90 is in true solution or in colloidal particles smaller than 0.1 micrometer. These results suggest that within-laboratory reproducibility for strontium-90 in groundwater at the INEL is not significantly affected by changes in filtration and preservation methods used for sample collection. (USGS)

  11. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of these two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
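    The statistical tolerance-interval idea mentioned above can be made concrete for the normal-theory case. The sketch below uses Howe's approximation for a two-sided interval; it is only an illustration of the technique, not the report's own implementation:

    ```python
    import numpy as np
    from scipy import stats

    def normal_tolerance_interval(x, coverage=0.95, confidence=0.95):
        """Two-sided normal-theory tolerance interval (Howe's approximation).

        Returns bounds that, with probability `confidence`, contain at least
        the central `coverage` fraction of the population the sparse sample
        x was drawn from -- deliberately conservative for small n.
        """
        n = len(x)
        mean, s = np.mean(x), np.std(x, ddof=1)
        z = stats.norm.ppf((1 + coverage) / 2)
        chi2 = stats.chi2.ppf(1 - confidence, n - 1)
        k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)   # k > z for small n
        return mean - k * s, mean + k * s

    rng = np.random.default_rng(0)
    x = rng.normal(10.0, 2.0, size=8)                   # very sparse sample
    lo, hi = normal_tolerance_interval(x)
    # (lo, hi) is wider than a naive mean +/- 1.96*s percentile estimate,
    # reflecting the conservatism objective discussed in the abstract
    ```

    The widening factor k shrinks toward the plain normal quantile z as n grows, which is exactly the trade-off between the two opposing objectives described above.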

  12. A Method For Parallel, Automated, Thermal Cycling of Submicroliter Samples

    PubMed Central

    Nakane, Jonathan; Broemeling, David; Donaldson, Roger; Marziali, Andre; Willis, Thomas D.; O'Keefe, Matthew; Davis, Ronald W.

    2001-01-01

    A large fraction of the cost of DNA sequencing and other DNA-analysis processes results from the reagent costs incurred during cycle sequencing or PCR. In particular, the high cost of the enzymes and dyes used in these processes often results in thermal cycling costs exceeding $0.50 per sample. In the case of high-throughput DNA sequencing, this is a significant and unnecessary expense. Improved detection efficiency of new sequencing instrumentation allows the reaction volumes for cycle sequencing to be scaled down to one-tenth of presently used volumes, resulting in at least a 10-fold decrease in the cost of this process. However, commercially available thermal cyclers and automated reaction setup devices have inherent design limitations which make handling volumes of <1 μL extremely difficult. In this paper, we describe a method for thermal cycling aimed at reliable, automated cycling of submicroliter reaction volumes. PMID:11230168

  13. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use when selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  14. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use when selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low-flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  16. Data-collection methods, quality-assurance data, and site considerations for total dissolved gas monitoring, lower Columbia River, Oregon and Washington, 2000

    USGS Publications Warehouse

    Tanner, Dwight Q.; Johnston, Matthew W.

    2001-01-01

    Excessive total dissolved gas pressure can cause gas-bubble trauma in fish downstream from dams on the Columbia River. In cooperation with the U.S. Army Corps of Engineers, the U.S. Geological Survey collected data on total dissolved gas pressure, barometric pressure, water temperature, and probe depth at eight stations on the lower Columbia River from the John Day forebay (river mile 215.6) to Camas (river mile 121.7) in water year 2000 (October 1, 1999, to September 30, 2000). These data are in the databases of the U.S. Geological Survey and the U.S. Army Corps of Engineers. Methods of data collection, review, and processing, and quality-assurance data are presented in this report.

  17. MARKOV CHAIN MONTE CARLO POSTERIOR SAMPLING WITH THE HAMILTONIAN METHOD

    SciTech Connect

    Hanson, K.

    2001-02-01

    The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is introduced for each parameter of the target pdf. In analogy to a physical system, a Hamiltonian H is defined as a kinetic energy involving the momenta plus a potential energy φ, where φ is minus the logarithm of the target pdf. Hamiltonian dynamics allows one to move along trajectories of constant H, taking large jumps in the parameter space with relatively few evaluations of φ and its gradient. The Hamiltonian algorithm alternates between picking a new momentum vector and following such trajectories. The efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs is shown to remain constant at around 7% for up to several hundred dimensions. The Hamiltonian method handles correlations among the variables much better than the standard Metropolis algorithm. A new test, based on the gradient of φ, is proposed to measure the convergence of the MCMC sequence.
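    The alternation described above (refresh the momentum, follow a near-constant-H leapfrog trajectory, then accept or reject) can be sketched for an isotropic Gaussian target. The step size and trajectory length below are arbitrary illustrative choices, not the paper's settings:

    ```python
    import numpy as np

    def hmc(phi, grad_phi, x0, n_iter=3000, eps=0.2, n_leap=15, seed=1):
        """Minimal Hamiltonian MCMC: phi(x) = -log target, H = p.p/2 + phi(x)."""
        rng = np.random.default_rng(seed)
        x = np.array(x0, dtype=float)
        out = []
        for _ in range(n_iter):
            p = rng.standard_normal(x.size)            # fresh momentum vector
            H0 = 0.5 * p @ p + phi(x)
            xn, pn = x.copy(), p.copy()
            # leapfrog integration along an (approximately) constant-H trajectory
            pn -= 0.5 * eps * grad_phi(xn)
            for _ in range(n_leap - 1):
                xn += eps * pn
                pn -= eps * grad_phi(xn)
            xn += eps * pn
            pn -= 0.5 * eps * grad_phi(xn)
            H1 = 0.5 * pn @ pn + phi(xn)
            # Metropolis correction for the residual integration error
            if rng.random() < np.exp(min(0.0, H0 - H1)):
                x = xn
            out.append(x.copy())
        return np.array(out)

    # isotropic Gaussian target: phi(x) = ||x||^2 / 2, grad phi = x
    chain = hmc(lambda x: 0.5 * x @ x, lambda x: x, np.zeros(5))
    ```

    For this target the chain's marginal mean and standard deviation should approach 0 and 1; correlated targets would show the method's advantage over the standard Metropolis algorithm.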

  18. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.

    2005-01-01

    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in the retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our ongoing research related to the radiative properties of the martian polar caps, we have begun the development of a martian OSS model with the goal of using it to perform the self-consistent atmospheric corrections necessary to retrieve cap emissivity from Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  19. Quality assurance program plan for radionuclide airborne emissions monitoring

    SciTech Connect

    Boom, R.J.

    1995-12-01

    This Quality Assurance Program Plan identifies quality assurance program requirements and addresses the various Westinghouse Hanford Company organizations and their particular responsibilities with regard to sample and data handling of radiological airborne emissions. This Quality Assurance Program Plan is prepared in accordance with applicable written requirements.

  20. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then, depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed-size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed-size scheme but with much smaller average sample sizes.
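    The two-stage 'basic' scheme can be illustrated with a toy simulation. The decisiveness margin, herd composition, and pass/fail threshold below are invented for illustration and are not the authors' calibrated decision rules:

    ```python
    import numpy as np

    def sequential_classify(herd, full_n, threshold, rng):
        """Two-stage ('basic') scheme sketch: score half the planned sample,
        stop early if the interim prevalence estimate is decisively above or
        below the pass/fail threshold, otherwise score the second half."""
        half = full_n // 2
        first = rng.choice(herd, size=half, replace=False)
        p1 = first.mean()
        margin = 1.0 / np.sqrt(half)          # crude decisiveness margin (assumption)
        if p1 > threshold + margin:
            return "fail", half               # early stop: clearly bad
        if p1 < threshold - margin:
            return "pass", half               # early stop: clearly good
        rest = rng.choice(herd, size=half, replace=False)
        p = (first.sum() + rest.sum()) / (2 * half)
        return ("fail" if p > threshold else "pass"), 2 * half

    rng = np.random.default_rng(0)
    herd = np.zeros(200)
    herd[:10] = 1                             # 5% lameness prevalence
    results = [sequential_classify(herd, 60, 0.2, rng) for _ in range(1000)]
    pass_rate = np.mean([v == "pass" for v, n in results])
    avg_n = np.mean([n for v, n in results])  # < 60: early stopping saves samples
    ```

    As in the abstract, the interesting comparison is accuracy versus average sample size against the fixed-size scheme (here, always scoring all 60 animals).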

  1. Probing methane hydrate nucleation through the forward flux sampling method.

    PubMed

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of a topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows explicitly computing hydrate nucleation rates and obtaining an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate. PMID:24849698
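    The staged structure of FFS (an initial flux multiplied by a product of conditional interface-crossing probabilities) can be illustrated on a toy 1-D double-well system. The potential, dynamics, interfaces, and parameters below are invented for illustration and are unrelated to the mW model or the paper's cage-based order parameter; here the order parameter is simply the coordinate x:

    ```python
    import numpy as np

    def step(x, rng, dt=1e-3, beta=5.0):
        """One overdamped Langevin step in the double well V(x) = (x^2 - 1)^2."""
        force = -4.0 * x * (x * x - 1.0)
        return x + dt * force + np.sqrt(2.0 * dt / beta) * rng.standard_normal()

    def ffs_crossing_product(interfaces, seeds, n_trials, rng, basin_a=-0.9):
        """Estimate prod_i P(lambda_{i+1} | lambda_i) by staged shooting:
        from configurations stored at each interface, fire trials that run
        until they reach the next interface (success, config stored) or
        fall back into basin A (failure)."""
        probs, configs = [], list(seeds)
        for lam_next in interfaces[1:]:
            hits = []
            for _ in range(n_trials):
                x = configs[rng.integers(len(configs))]
                while basin_a < x < lam_next:
                    x = step(x, rng)
                if x >= lam_next:
                    hits.append(x)
            if not hits:                  # stage extinguished; estimate is zero
                return 0.0, probs
            probs.append(len(hits) / n_trials)
            configs = hits
        return float(np.prod(probs)), probs

    rng = np.random.default_rng(3)
    p_total, stage_probs = ffs_crossing_product(
        interfaces=[-0.5, 0.0, 0.5], seeds=[-0.5] * 20, n_trials=200, rng=rng)
    # the rate analogue would be (flux through the first interface) * p_total
    ```

    The dependence of the estimate on the seed configurations collected at the first interface mirrors the convergence issue the abstract highlights for the spatial distribution of hydrate seeds.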

  2. COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES

    EPA Science Inventory

    Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...

  3. Sampling and Decontamination Method for Culture of Nontuberculous Mycobacteria in Respiratory Samples of Cystic Fibrosis Patients

    PubMed Central

    De Geyter, Deborah; De Schutter, Iris; Mouton, Christine; Wellemans, Isabelle; Hanssens, Laurence; Schelstraete, Petra; Malfroot, Anne; Pierard, Denis

    2013-01-01

    We confirmed that chlorhexidine decontamination yielded more nontuberculous mycobacteria than did the N-acetyl-l-cysteine-NaOH-oxalic acid procedure from respiratory samples of cystic fibrosis patients on solid cultures. However, this improved recovery is largely offset if the latter procedure is combined with liquid culture. Furthermore, none of the 145 cough swabs, used to sample young children, cultured positive, suggesting that swabs are low-quality samples. PMID:24048532

  4. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  5. A comparison of four gravimetric fine particle sampling methods.

    PubMed

    Yanosky, J D; MacIntosh, D L

    2001-06-01

    A study was conducted to compare four gravimetric methods of measuring fine particle (PM2.5) concentrations in air: the BGI, Inc. PQ200 Federal Reference Method PM2.5 (FRM) sampler; the Harvard-Marple Impactor (HI); the BGI, Inc. GK2.05 KTL Respirable/Thoracic Cyclone (KTL); and the AirMetrics MiniVol (MiniVol). Pairs of FRM, HI, and KTL samplers and one MiniVol sampler were collocated and 24-hr integrated PM2.5 samples were collected on 21 days from January 6 through April 9, 2000. The mean and standard deviation of PM2.5 levels from the FRM samplers were 13.6 and 6.8 microg/m3, respectively. Significant systematic bias was found between mean concentrations from the FRM and the MiniVol (1.14 microg/m3, p = 0.0007), the HI and the MiniVol (0.85 microg/m3, p = 0.0048), and the KTL and the MiniVol (1.23 microg/m3, p = 0.0078) according to paired t test analyses. Linear regression on all pairwise combinations of the sampler types was used to evaluate measurements made by the samplers. None of the regression intercepts was significantly different from 0, and only two of the regression slopes were significantly different from 1, that for the FRM and the MiniVol [beta1 = 0.91, 95% CI (0.83-0.99)] and that for the KTL and the MiniVol [beta1 = 0.88, 95% CI (0.78-0.98)]. Regression R2 terms were 0.96 or greater between all pairs of samplers, and regression root mean square error terms (RMSE) were 1.65 microg/m3 or less. These results suggest that the MiniVol will underestimate measurements made by the FRM, the HI, and the KTL by an amount proportional to PM2.5 concentration. Nonetheless, these results indicate that all of the sampler types are comparable if approximately 10% variation on the mean levels and on individual measurement levels is considered acceptable and the actual concentration is within the range of this study (5-35 microg/m3).
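    The comparison workflow for collocated samplers (paired t tests for systematic bias, then pairwise linear regression) can be sketched with synthetic data. The sample size of 21, the proportional bias, and the noise level below merely mimic the study design and are not the study's measurements:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    # hypothetical collocated 24-hr PM2.5 concentrations (microg/m3)
    frm = rng.normal(13.6, 6.8, size=21).clip(min=2.0)     # reference sampler
    minivol = 0.91 * frm + rng.normal(0.0, 1.2, size=21)   # proportional underestimate

    # paired t test: is there systematic bias between the two samplers?
    t_stat, p_val = stats.ttest_rel(frm, minivol)
    bias = np.mean(frm - minivol)

    # pairwise linear regression, as used to compare sampler types
    fit = stats.linregress(minivol, frm)   # slope != 1 indicates proportional bias
    ```

    With a slope below 1 and a near-zero intercept, the synthetic "MiniVol" underestimates the reference by an amount proportional to concentration, the same pattern the study reports.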

  6. [Current methods for preparing samples on working with hematology analyzers].

    PubMed

    Tsyganova, A V; Pogorelov, V M; Naumova, I N; Kozinets, G I; Antonov, V S

    2011-03-01

    The paper raises the problem of preparing samples in hematology. It considers whether the preanalytical stage is of importance in hematological studies. The use of disposable vacuum blood collection systems is shown to solve the problem of standardizing the blood sampling procedure. The benefits of the use of closed-tube hematology analyzers are also considered. PMID:21584966

  7. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    PubMed

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence. PMID:18596335

  8. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  9. Alternative methods for determining the electrical conductivity of core samples.

    PubMed

    Lytle, R J; Duba, A G; Willows, J L

    1979-05-01

    Electrode configurations are described that can be used in measuring the electrical conductivity of a core sample and that do not require access to the core end faces. The use of these configurations eliminates the need for machining the core ends for placement of end electrodes, because the conductivity in the cases described is relatively insensitive to the length of the sample. We validated the measurement technique by comparing mathematical models with actual measurements made perpendicular and parallel to the core axis of granite samples.

  10. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    SciTech Connect

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.

    1994-10-01

    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (Kd values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.
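    The partitioning quantity at the heart of this study is the distribution coefficient, the ratio of sorbed to aqueous concentration in a paired soil/porewater sample. A minimal illustration, with hypothetical values:

    ```python
    def kd(c_soil, c_porewater):
        """Distribution coefficient K_d = C_soil / C_porewater (L/kg).

        c_soil      -- sorbed concentration on the soil (mg/kg dry weight)
        c_porewater -- aqueous concentration in the associated porewater (mg/L)
        Higher K_d means stronger retention on the soil phase, and hence
        slower contaminant transport relative to the groundwater.
        """
        return c_soil / c_porewater

    # hypothetical paired measurements downgradient of a seepage basin
    kd_value = kd(12.0, 0.4)   # about 30 L/kg
    ```

    Collecting such pairs across a range of pH, conductivity, and contaminant concentration is what lets the study express Kd as a site-specific function of geochemical conditions rather than a single constant.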

  11. Processes and procedures for a worldwide biological samples distribution; product assurance and logistic activities to support the mice drawer system tissue sharing event

    NASA Astrophysics Data System (ADS)

    Benassai, Mario; Cotronei, Vittorio

    The Mice Drawer System (MDS) is a scientific payload developed by the Italian Space Agency (ASI); it hosted 6 mice on the International Space Station (ISS) and returned to ground on November 28, 2009 with STS-129 at KSC. Linked to the MDS experiment, a Tissue Sharing Program (TSP) was developed in order to make the biological samples coming from the mice available to 16 Payload Investigators (PIs) located in the USA, Canada, the EU (Italy, Belgium and Germany) and Japan. ALTEC SpA (a PPP owned by ASI, TAS-I and local institutions) was responsible for supporting the logistics of the MDS samples for the first MDS mission, in the frame of the Italian Space Agency (ASI) OSMA program (OSteoporosis and Muscle Atrophy). The TSP resulted in a complex scenario, as ASI progressively extended the original OSMA Team to researchers from other ASI programs and from other Agencies (ESA, NASA, JAXA). The science coordination was performed by the University of Genova (UNIGE). ALTEC managed the whole logistic process with the support of a specialized freight forwarder agent during all shipping operation phases. ALTEC formalized all the steps from the handover of samples by the dissection Team to the packaging and shipping process in a dedicated procedure. ALTEC approached the work in a structured way, performing: A study of the aspects connected to international shipments of biological samples. A cooperative effort with UNIGE/ASI/PIs to identify the needs of the various researchers and their compatibility. A complete revision and integration of shipment requirements (addresses, temperatures, samples, materials and so on). A complete definition of the final shipment scenario in terms of boxes, content, refrigerant and requirements. A formal approach to identification and selection of the most suited and specialized freight forwarder. A clear identification of all the processes from sample dissection by the PI Team, sample processing, freezing, tube preparation

  12. Assuring NASA's Safety and Mission Critical Software

    NASA Technical Reports Server (NTRS)

    Deadrick, Wesley

    2015-01-01

    What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety and mission critical software processes and products. Independence: 3 Key parameters: Technical Independence; Managerial Independence; Financial Independence. NASA IV&V perspectives: Will the system's software: Do what it is supposed to do?; Not do what it is not supposed to do?; Respond as expected under adverse conditions?. Systems Engineering: Determines if the right system has been built and that it has been built correctly. IV&V Technical Approaches: Aligned with IEEE 1012; Captured in a Catalog of Methods; Spans the full project lifecycle. IV&V Assurance Strategy: The IV&V Project's strategy for providing mission assurance; Assurance Strategy is driven by the specific needs of an individual project; Implemented via an Assurance Design; Communicated via Assurance Statements.

  13. Field sampling method for quantifying odorants in humid environments.

    PubMed

    Trabue, Steven L; Scoggin, Kenwood D; Li, Hong; Burns, Robert; Xin, Hongwei

    2008-05-15

    Most air quality studies in agricultural environments use thermal desorption analysis for quantifying semivolatile organic compounds (SVOCs) associated with odor. The objective of this study was to develop a robust sampling technique for measuring SVOCs in humid environments. Test atmospheres were generated at ambient temperature (23 +/- 1.5 degrees C) and 25, 50, and 80% relative humidity (RH). Sorbent materials used included Tenax, graphitized carbon, and carbon molecular sieve (CMS). Sorbent tubes were challenged with 2, 4, 8, 12, and 24 L of air at the various RHs. Sorbent tubes with CMS material performed poorly at both 50 and 80% RH due to excessive sorption of water. Heating CMS tubes during sampling or dry-purging them after sampling effectively reduced water sorption, with heating preferred for its higher recovery and reproducibility. Tenax tubes showed breakthrough of the more volatile compounds and tended to form artifacts with increasing volumes of air sampled. Graphitized carbon sorbent tubes containing Carbopack X and Carbopack C performed best, with quantitative recovery of all compounds at all RHs and sampling volumes tested. The graphitized carbon tubes were taken to the field for further testing. Field samples taken inside swine feeding operations showed that butanoic acid, 4-methylphenol, 4-ethylphenol, indole, and 3-methylindole were the compounds detected most often above their odor threshold values. Field samples taken from a poultry facility demonstrated that butanoic acid, 3-methylbutanoic acid, and 4-methylphenol were the compounds most often detected above their odor threshold values. Keywords: relative humidity, CAFO, VOC, SVOC, thermal desorption, swine, poultry, air quality, odor. PMID:18546717

  14. Photoacoustic spectroscopy sample array vessel and photoacoustic spectroscopy method for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.; Green, David

    2005-03-29

    Methods and apparatus for analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically coupled with the vessel body. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  15. Quality Assurance in Higher Education: Proposals for Consultation.

    ERIC Educational Resources Information Center

    Higher Education Funding Council for England, Bristol.

    This document sets out for consultation proposals for a revised method for quality assurance of teaching and learning in higher education. The proposals cover: (1) the objectives and principles of quality assurance; (2) an approach to quality assurance based on external audit principles; (3) the collection and publication of information; (4)…

  16. Internal Quality Assurance System and Its Implementation in Kaunas College

    ERIC Educational Resources Information Center

    Misiunas, Mindaugas

    2007-01-01

    The article discusses the internal system of quality assurance and its implementation methods in Kaunas College. The issues of quality assurance are reviewed in the context of the European higher education area covering the three levels: European, national and institutional. The importance of quality assurance and its links with external…

  17. Method for preconcentrating a sample for subsequent analysis

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for analysis of trace concentration of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  18. Analytical instrument with apparatus and method for sample concentrating

    DOEpatents

    Zaromb, S.

    1986-08-04

    A system for analysis of trace concentrations of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  19. SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches

    SciTech Connect

    Cheon, W; Cho, J; Ahn, S; Han, Y; Choi, D

    2015-06-15

    Purpose: To accommodate geometrically accurate patient positioning, robotic couches capable of 6 degrees of freedom have been introduced. However, conventional couch QA methods are not sufficient to enable the necessary accuracy of tests. Therefore, we have developed a camera-based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual-Tracking System (VTS, BonitaB10, Vicon, UK) which tracks infrared-reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm{sup 3} cube with IR markers attached at each corner. After positioning the robotic couch at the origin with the cube on the table top, the 3D coordinates of the cube's eight corners were acquired by the VTS in the VTS coordinate system. Next, positions in reference (room) coordinates were assigned using the known relation between the points. Finally, camera calibration was completed by finding a transformation matrix between the VTS and reference coordinate systems using a pseudo-inverse matrix method. After calibration, the accuracy of linear and rotational motions as well as couch sagging could be measured by analyzing continuously acquired data of the cube while the couch moved to a designated position. The accuracy of the developed software was verified through comparison with measurement data from a laser tracker (FARO, Lake Mary, USA) for a robotic couch installed for proton therapy. Results: The VTS tracked couch motion accurately and measured positions in room coordinates. The VTS measurements and laser tracker data agreed within 1% for linear and rotational motions. Because the program analyzes motion in three dimensions, it can also compute couch sagging. Conclusion: The developed QA system provides submillimeter/degree accuracy, fulfilling high-end couch QA requirements. This work was supported by the National Research Foundation of Korea funded by the Ministry of Science, ICT & Future Planning. (2013M2A2A
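The calibration step described above, finding a transformation between the VTS and room coordinate systems from the cube's eight corners via a pseudo-inverse, can be sketched in a few lines of NumPy. This is an illustrative least-squares affine fit under assumed cube geometry, not the authors' software; all function names are hypothetical:

```python
import numpy as np

def fit_affine_transform(src, dst):
    """Least-squares affine transform mapping src (N,3) onto dst (N,3).

    Solves dst ~ [src | 1] @ M via the Moore-Penrose pseudo-inverse,
    mirroring the pseudo-inverse calibration step described above.
    """
    src_h = np.hstack([src, np.ones((src.shape[0], 1))])  # (N, 4) homogeneous
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)       # (4, 3)
    return M

def apply_transform(M, pts):
    """Map points through the fitted affine transform."""
    pts_h = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return pts_h @ M

# Eight corners of a 5.7 cm cube in "room" coordinates
corners = np.array([[x, y, z] for x in (0.0, 5.7)
                              for y in (0.0, 5.7)
                              for z in (0.0, 5.7)])
# Simulated VTS measurements: rotated 90 degrees about z and shifted
R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
vts = corners @ R.T + np.array([10.0, -3.0, 1.5])

M = fit_affine_transform(vts, corners)
recovered = apply_transform(M, vts)
print(np.allclose(recovered, corners, atol=1e-9))  # True
```

With the transform in hand, every subsequent cube measurement can be mapped into room coordinates, which is what enables the linear, rotational, and sagging comparisons against the planned couch position.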

  20. Quality assurance planning and structure.

    PubMed

    Jackman, W; Brown, L D; Al-assaf, A F; Reinke, J M; Abubaker, W; Winter, L; Murphy, G; Blumenfeld, S

    1995-01-01

    Planning for the introduction, implementation, and conduct of quality assurance activities has been the key issue from the outset of the project. Despite the various approaches to planning, no single quality assurance (QA) plan can be universally adopted by developing countries, owing to variations in the socioeconomic, cultural and political makeup of individual countries. This paper summarizes the lessons learned from the Quality Assurance Project in planning a QA program: 1) the need to understand organizational strengths and weaknesses to develop appropriate strategies for QA skills training and organizational change; 2) the need to build on existing systems or activities that support the objectives of the organization and provide an adequate foundation for the QA program; 3) the need to assign responsibility for quality assurance through the creation of QA councils and committees and the assignment of coordinators and other individuals; 4) the need to secure top-level management support to legitimize any changes; 5) the need to determine the method of introducing innovations into organizations, either by a top-down or bottom-up approach; 6) the need for well-defined priorities and objectives in the plan, while retaining flexibility as projects evolve and grow over time.

  1. Metrology: Measurement Assurance Program Guidelines

    NASA Technical Reports Server (NTRS)

    Eicke, W. G.; Riley, J. P.; Riley, K. J.

    1995-01-01

    The 5300.4 series of NASA Handbooks for Reliability and Quality Assurance Programs have provisions for the establishment and utilization of a documented metrology system to control measurement processes and to provide objective evidence of quality conformance. The intent of these provisions is to assure consistency and conformance to specifications and tolerances of equipment, systems, materials, and processes procured and/or used by NASA, its international partners, contractors, subcontractors, and suppliers. This Measurement Assurance Program (MAP) guideline has the specific objectives to: (1) ensure the quality of measurements made within NASA programs; (2) establish realistic measurement process uncertainties; (3) maintain continuous control over the measurement processes; and (4) ensure measurement compatibility among NASA facilities. The publication addresses MAP methods as applied within and among NASA installations and serves as a guide to: control measurement processes at the local level (one facility); conduct measurement assurance programs in which a number of field installations are joint participants; and conduct measurement integrity (round robin) experiments in which a number of field installations participate to assess the overall quality of particular measurement processes at a point in time.

  2. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    SciTech Connect

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  3. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  4. Apparatus and method for centrifugation and robotic manipulation of samples

    NASA Technical Reports Server (NTRS)

    Vellinger, John C. (Inventor); Ormsby, Rachel A. (Inventor); Kennedy, David J. (Inventor); Thomas, Nathan A. (Inventor); Shulthise, Leo A. (Inventor); Kurk, Michael A. (Inventor); Metz, George W. (Inventor)

    2007-01-01

    A device for centrifugation and robotic manipulation of specimen samples, including incubating eggs, and uses thereof are provided. The device may advantageously be used for the incubation of avian, reptilian or any type of vertebrate eggs. The apparatus comprises a mechanism for holding samples individually, rotating them individually, rotating them on a centrifuge collectively, injecting them individually with a fixative or other chemical reagent, and maintaining them at controlled temperature, relative humidity and atmospheric composition. The device is applicable to experiments involving entities other than eggs, such as invertebrate specimens, plants, microorganisms and molecular systems.

  5. [Novel quality assurance method in oncology: the two-level, multi-disciplinary and oncotherapy oncology team system].

    PubMed

    Mangel, László; Kövér, Erika; Szilágyi, István; Varga, Zsuzsanna; Bércesi, Eva; Nagy, Zsuzsanna; Holcz, Tibor; Karádi, Oszkár; Farkas, Róbert; Csák, Szilvia; Csere, Tibor; Kásler, Miklós

    2012-12-16

    By now, therapy decisions taken by a multi-disciplinary oncology team have become routine in cancer care worldwide. However, multi-disciplinary oncology teams face more and more difficulties in keeping abreast of the fast development of oncology science, increasing expectations, and financial considerations. Inadequately controlled decision mechanisms, a permanent lack of time and a shortage of professionals are also hindering factors. A way out might be to transform the staff meetings and discussions of physicians in the oncology departments and provide them with administrative, legal and decision credentials corresponding to those of the multi-disciplinary oncology team. This new form, the oncotherapy oncoteam, would be able to decide on the optimal, individualized treatment after prior consultation with the patient. The oncotherapy oncoteam is also suitable for carrying out training and the tasks of a cancer centre, and by diminishing the psychological burden on doctors it contributes to improved patient care. This study presents the two-level multi-disciplinary and oncotherapy oncology team system at the University of Pécs, including a detailed analysis of the considerations above.

  6. Improved sample management in the cylindrical-tube microelectrophoresis method

    NASA Technical Reports Server (NTRS)

    Smolka, A. J. K.

    1980-01-01

    A modification to an analytical microelectrophoresis system is described that improves the manipulation of the sample particles and fluid. The apparatus modification and improved operational procedure should yield more accurate measurements of particle mobilities and permit less skilled operators to use the apparatus.

  7. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... protocols listed below, for sampling waste with properties similar to the indicated materials, will be considered by the Agency to be representative of the waste. Extremely viscous liquid—ASTM Standard D140-70...-like material—ASTM Standard D1452-65 Fly Ash-like material—ASTM Standard D2234-76 Containerized...

  8. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... protocols listed below, for sampling waste with properties similar to the indicated materials, will be considered by the Agency to be representative of the waste. Extremely viscous liquid—ASTM Standard D140-70...-like material—ASTM Standard D1452-65 Fly Ash-like material—ASTM Standard D2234-76 Containerized...

  9. Investigation of Presage 3D Dosimetry as a Method of Clinically Intuitive Quality Assurance and Comparison to a Semi-3D Delta4 System

    NASA Astrophysics Data System (ADS)

    Crockett, Ethan Van

    The need for clinically intuitive metrics for patient-specific quality assurance in radiation therapy has been well-documented (Zhen, Nelms et al. 2011). A novel transform method has been shown to be effective at converting full-density 3D dose measurements made in a phantom to dose values in the patient geometry, enabling comparisons using clinically intuitive metrics such as dose-volume histograms (Oldham et al. 2011). This work investigates the transform method and compares its calculated dose-volume histograms (DVHs) to DVH values calculated by a Delta4 QA device (ScandiDos), marking the first comparison of a true 3D system to a semi-3D device using clinical metrics. Measurements were made using Presage 3D dosimeters, which were read out by an in-house optical-CT scanner. Three patient cases were chosen for the study: one head-and-neck VMAT treatment and two spine IMRT treatments. The transform method showed good agreement with the planned dose values for all three cases. Furthermore, the transformed DVHs adhered to the planned dose more accurately than the Delta4 DVHs. The similarity between the Delta4 DVHs and the transformed DVHs, however, was greater for one of the spine cases than for the head-and-neck case, implying that the accuracy of the Delta4 Anatomy software may vary from one treatment site to another. Overall, the transform method, which incorporates data from full-density 3D dose measurements, provides clinically intuitive results that are more accurate and consistent than the corresponding results from a semi-3D Delta4 system.
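A cumulative DVH of the kind compared in this study is straightforward to compute once a 3D dose array and a structure mask are available. The sketch below is a generic illustration of the metric itself, not the Presage or Delta4 analysis software; the function name and toy data are hypothetical:

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative dose-volume histogram: for each dose level, the fraction
    of the structure volume receiving at least that dose."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    frac = np.array([(d >= lv).mean() for lv in levels])
    return levels, frac

# Toy 10x10x10 dose grid with a linear gradient; whole grid as "structure"
dose = np.linspace(0.0, 4.0, 1000).reshape(10, 10, 10)
mask = np.ones_like(dose, dtype=bool)
levels, frac = cumulative_dvh(dose, mask)
# frac starts at 1.0 (every voxel gets >= 0 Gy) and is non-increasing
```

Comparing two such curves, one from the transformed measurement and one from the treatment plan, is what makes the metric clinically intuitive: disagreements appear directly as dose or volume shifts for a named structure.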

  10. Process measurement assurance program

    SciTech Connect

    Pettit, R.B.

    1996-05-01

    This paper describes a new method for determining, improving, and controlling the measurement process errors (or measurement uncertainty) of a measurement system used to monitor product as it is manufactured. The method is called the Process Measurement Assurance Program (PMAP). It integrates metrology early into the product realization process and is a step beyond statistical process control (SPC), which monitors only the product. In this method, a control standard is used to continuously monitor the status of the measurement system. Analysis of the control standard data allow the determination of the measurement error inherent in the product data and allow one to separate the variability in the manufacturing process from variability in the measurement process. These errors can be then associated with either the measurement equipment, variability of the measurement process, operator bias, or local environmental effects. Another goal of PMAP is to determine appropriate re-calibration intervals for the measurement system, which may be significantly longer or shorter than the interval typically assigned by the calibration organization.

  11. Quality Assurance Manual

    SciTech Connect

    McGarrah, J.E.

    1995-05-01

    In order to provide clients with quality products and services, Pacific Northwest Laboratory (PNL) has established and implemented a formal quality assurance program. These management controls are documented in this manual (PNL-MA-70) and its accompanying standards and procedures. The QA Program meets the basic requirements and supplements of ANSI/ASME NQA-1, Quality Assurance Program Requirements for Nuclear Facilities, as interpreted for PNL activities. Additionally, the quality requirements are augmented to include the Total Quality approach defined in Department of Energy Order 5700.6C, Quality Assurance. This manual provides requirements and an overview of the administrative procedures that apply to projects and activities.

  12. Software quality assurance handbook

    SciTech Connect

    Not Available

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  13. Exploring biomolecular dynamics and interactions using advanced sampling methods

    NASA Astrophysics Data System (ADS)

    Luitz, Manuel; Bomblies, Rainer; Ostermeir, Katja; Zacharias, Martin

    2015-08-01

    Molecular dynamics (MD) and Monte Carlo (MC) simulations have emerged as valuable tools to investigate the statistical mechanics and kinetics of biomolecules and synthetic soft matter materials. However, major limitations for routine applications stem from the accuracy of the molecular mechanics force field and from the maximum simulation time that can be achieved in current simulation studies. To improve sampling, a number of advanced sampling approaches have been designed in recent years. In particular, variants of the parallel tempering replica-exchange methodology are widely used in many simulation studies. Recent methodological advancements and a discussion of specific aims and advantages are given. This includes improved free energy simulation approaches and conformational search applications.
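The core of the parallel tempering replica-exchange method mentioned above is a Metropolis test for swapping configurations between replicas running at different temperatures. A minimal sketch of the standard acceptance criterion follows; it is not tied to any particular MD package, and the function name is hypothetical:

```python
import math
import random

def attempt_swap(beta_i, beta_j, energy_i, energy_j, rng=random.random):
    """Metropolis acceptance test for exchanging configurations between two
    replicas at inverse temperatures beta_i and beta_j with potential
    energies E_i and E_j:
        accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])
    """
    delta = (beta_i - beta_j) * (energy_i - energy_j)
    return delta >= 0.0 or rng() < math.exp(delta)

# The cold replica (beta=1.0) holds the higher-energy configuration, so the
# swap is always accepted (delta >= 0):
print(attempt_swap(1.0, 0.5, -5.0, -10.0))  # True
```

Repeated swaps of this kind let configurations trapped in local minima at low temperature escape via the high-temperature replicas, which is the sampling improvement the methodology targets.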

  14. Comparison of methods for sampling plant bugs on cotton in South Texas (2010)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A total of 26 cotton fields were sampled by experienced and inexperienced samplers at 3 growth stages using 5 methods to compare the most efficient and accurate method for sampling plant bugs in cotton. Each of the 5 methods had its own distinct advantages and disadvantages as a sampling method (too...

  15. A new method of snowmelt sampling for water stable isotopes

    USGS Publications Warehouse

    Penna, D.; Ahmad, M.; Birks, S. J.; Bouchaou, L.; Brencic, M.; Butt, S.; Holko, L.; Jeelani, G.; Martinez, D. E.; Melikadze, G.; Shanley, J.B.; Sokratov, S. A.; Stadnyk, T.; Sugimoto, A.; Vreca, P.

    2014-01-01

    We modified a passive capillary sampler (PCS) to collect snowmelt water for isotopic analysis. Past applications of PCSs have been to sample soil water, but the novel aspect of this study was the placement of the PCSs at the ground-snowpack interface to collect snowmelt. We deployed arrays of PCSs at 11 sites in ten partner countries on five continents representing a range of climate and snow cover worldwide. The PCS reliably collected snowmelt at all sites and caused negligible evaporative fractionation effects in the samples. PCS is low-cost, easy to install, and collects a representative integrated snowmelt sample throughout the melt season or at the melt event scale. Unlike snow cores, the PCS collects the water that would actually infiltrate the soil; thus, its isotopic composition is appropriate to use for tracing snowmelt water through the hydrologic cycle. The purpose of this Briefing is to show the potential advantages of PCSs and recommend guidelines for constructing and installing them based on our preliminary results from two snowmelt seasons.

  16. Statistical Methods and Sampling Design for Estimating Step Trends in Surface-Water Quality

    USGS Publications Warehouse

    Hirsch, Robert M.

    1988-01-01

    This paper addresses two components of the problem of estimating the magnitude of step trends in surface water quality. The first is finding a robust estimator appropriate to the data characteristics expected in water-quality time series. The J. L. Hodges-E. L. Lehmann class of estimators is found to be robust in comparison to other nonparametric and moment-based estimators. A seasonal Hodges-Lehmann estimator is developed and shown to have desirable properties. Second, the effectiveness of various sampling strategies is examined using Monte Carlo simulation coupled with application of this estimator. The simulation is based on a large set of total phosphorus data from the Potomac River. To assure that the simulated records have realistic properties, the data are modeled in a multiplicative fashion incorporating flow, hysteresis, seasonal, and noise components. The results demonstrate the importance of balancing the length of the two sampling periods and balancing the number of data values between the two periods.
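The Hodges-Lehmann step-trend estimator discussed above is the median of all pairwise differences between observations from the two sampling periods. The sketch below illustrates that idea plus a simplified seasonal variant that pools within-season differences; it is an illustration of the estimator class, not Hirsch's exact formulation, and the function names are hypothetical:

```python
import numpy as np

def hodges_lehmann_step(before, after):
    """Hodges-Lehmann estimate of a step change between two periods:
    the median of all pairwise differences (after_j - before_i)."""
    return np.median(np.subtract.outer(after, before))

def seasonal_hodges_lehmann(before, after, seasons_before, seasons_after):
    """Seasonal variant (simplified sketch): pairwise differences are formed
    only within matching seasons, then the median is taken over the pool."""
    sb, sa = np.asarray(seasons_before), np.asarray(seasons_after)
    diffs = [np.subtract.outer(after[sa == s], before[sb == s]).ravel()
             for s in np.intersect1d(sb, sa)]
    return np.median(np.concatenate(diffs))

before = np.array([1.0, 2.0, 3.0, 4.0])
after = np.array([3.0, 4.0, 5.0, 6.0])
print(hodges_lehmann_step(before, after))  # 2.0
```

Because it works on medians of differences rather than means, the estimator is robust to the heavy-tailed noise typical of water-quality records, which is the property the paper exploits.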

  17. Stability of nitrate-ion concentrations in simulated deposition samples used for quality-assurance activities by the U.S. Geological Survey

    USGS Publications Warehouse

    Willoughby, T.C.; See, R.B.; Schroder, L.J.

    1989-01-01

    Three experiments were conducted to determine the stability of nitrate-ion concentrations in simulated deposition samples. In the four experiment-A solutions, nitric acid provided nitrate-ion concentrations ranging from 0.6 to 10.0 mg/L and that had pH values ranging from 3.8 to 5.0. In the five experiment-B solutions, sodium nitrate provided nitrate-ion concentrations ranging from 0.5 to 3.0 mg/L. The pH was adjusted to about 4.5 for each of the solutions by addition of sulfuric acid. In the four experiment-C solutions, nitric acid provided nitrate-ion concentrations ranging from 0.5 to 3.0 mg/L. Major cation and anion concentrations were added to each solution to simulate natural deposition. Aliquots were removed from the 13 original solutions and analyzed by ion chromatography about once a week for 100 days to determine if any changes occurred in nitrate-ion concentrations throughout the study period. No substantial changes were observed in the nitrate-ion concentrations in solutions that had initial concentrations below 4.0 mg/L in experiments A and B, although most of the measured nitrate-ion concentrations for the 100-day study were below the initial concentrations. In experiment C, changes in nitrate-ion concentrations were much more pronounced; the measured nitrate-ion concentrations for the study period were less than the initial concentrations for 62 of the 67 analyses. (USGS)

  18. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. Developing new extension of GafChromic RTQA2 film to patient quality assurance field using a plan-based calibration method

    NASA Astrophysics Data System (ADS)

    Peng, Jiayuan; Zhang, Zhen; Wang, Jiazhou; Xie, Jiang; Chen, Junchao; Hu, Weigang

    2015-10-01

    GafChromic RTQA2 film is a type of radiochromic film designed for light field and radiation field alignment. The aim of this study is to extend the application of RTQA2 film to the measurement of patient-specific quality assurance (QA) fields as a 2D relative dosimeter. Pre-irradiated and post-irradiated RTQA2 films were scanned in reflection mode using a flatbed scanner. A plan-based calibration (PBC) method utilized the mapping information of the calculated dose image and the film grayscale image to create a dose versus pixel value calibration model. This model was used to calibrate the film grayscale image into a film relative dose image. The dose agreement between calculated and film dose images was analyzed by gamma analysis. To evaluate the feasibility of this method, eight clinically approved RapidArc cases (one abdomen cancer and seven head-and-neck cancer patients) were tested. Moreover, three MLC gap errors and two MLC transmission errors were introduced into the eight RapidArc cases to test the robustness of the method. The PBC method could overcome the film lot and post-exposure time variations of RTQA2 film to achieve a good 2D relative dose calibration. The mean gamma passing rate of the eight patients was 97.90%  ±  1.7%, showing good dose consistency between calculated and film dose images. In the error test, the PBC method could over-calibrate the film, meaning some dose error in the film would be falsely corrected to keep the film dose consistent with the calculated dose image. This would then lead to a false negative result in the gamma analysis. In these cases, the derivative of the dose calibration curve would be non-monotonic, which would expose the dose abnormality. By using the PBC method, we extended the application of the more economical RTQA2 film to patient-specific QA. The robustness of the PBC method has been improved by analyzing the monotonicity of the derivative of the

  20. Capture-recapture and removal methods for sampling closed populations

    USGS Publications Warehouse

    White, Gary C.; Anderson, David R.; Burnham, Kenneth P.; Otis, David L.

    1982-01-01

    The problem of estimating animal abundance is common in wildlife management and environmental impact assessment. Capture-recapture and removal methods are often used to estimate population size. Statistical Inference From Capture Data On Closed Animal Populations, a monograph by Otis et al. (1978), provides a comprehensive synthesis of much of the wildlife and statistical literature on the methods, as well as some extensions of the general theory. In our primer, we focus on capture-recapture and removal methods for trapping studies in which a population is assumed to be closed and do not treat open-population models, such as the Jolly-Seber model, or catch-effort methods in any detail. The primer, written for students interested in population estimation, is intended for use with the more theoretical monograph.
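    For a closed population sampled on two occasions, the classic Lincoln-Petersen estimator (and Chapman's bias-corrected variant) illustrates the capture-recapture idea; a minimal sketch, not taken from the monograph itself:

```python
def lincoln_petersen(n1, n2, m2):
    """Naive Lincoln-Petersen estimate of closed-population size N.
    n1: animals marked on occasion 1; n2: animals captured on occasion 2;
    m2: marked animals among the n2 (recaptures)."""
    if m2 == 0:
        raise ValueError("no recaptures: estimate undefined")
    return n1 * n2 / m2

def chapman(n1, n2, m2):
    """Chapman's bias-corrected variant, defined even when m2 == 0."""
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
```

    Marking 100 animals and then capturing 80, of which 20 carry marks, gives N-hat = 400 under the naive estimator and about 389 under Chapman's correction.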

  1. A method for reducing sampling jitter in digital control systems

    NASA Technical Reports Server (NTRS)

    Anderson, T. O.; Hurd, W. J.

    1969-01-01

    A digital phase-locked loop system is designed in which the proportional control term is smoothed with a low-pass filter. This method does not significantly affect the loop dynamics when the smoothing filter bandwidth is wide compared to the loop bandwidth.
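    The idea can be sketched as a proportional-plus-integral loop filter whose proportional path is pre-smoothed by a one-pole low-pass; the class name, gains, and smoothing constant below are illustrative assumptions, not values from the report:

```python
class SmoothedLoopFilter:
    """PI loop filter with the proportional term smoothed by a
    first-order (one-pole) low-pass to reduce sampling jitter.
    alpha near 1 keeps the smoothing bandwidth wide relative to the
    loop bandwidth, as the method requires."""

    def __init__(self, kp=1.0, ki=0.0, alpha=0.5):
        self.kp, self.ki, self.alpha = kp, ki, alpha
        self.lp = 0.0      # low-pass state (smoothed phase error)
        self.integ = 0.0   # integral accumulator

    def step(self, phase_error):
        # one-pole IIR low-pass on the proportional input
        self.lp += self.alpha * (phase_error - self.lp)
        self.integ += self.ki * phase_error
        return self.kp * self.lp + self.integ
```

    With a constant phase error the smoothed proportional output converges to the unfiltered value, so the steady-state loop behavior is unchanged; only fast sample-to-sample jitter is attenuated.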

  2. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. 
Third, studies relying on method-of-moments based variogram estimation may have to employ at least

  3. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
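    The method-of-moments estimator that both throughfall studies above evaluate is the Matheron (empirical) variogram, which can be sketched in a few lines; the function name, bin width, and maximum lag below are arbitrary illustrative choices:

```python
import math
from collections import defaultdict

def empirical_variogram(coords, values, bin_width=5.0, max_lag=50.0):
    """Matheron method-of-moments variogram estimator (sketch).
    coords: list of (x, y) sampling locations; values: measurements.
    Returns {lag-bin midpoint: semivariance}, where the semivariance of
    a bin is half the mean squared difference over all point pairs
    whose separation distance falls in that bin."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    n = len(coords)
    for i in range(n):
        for j in range(i + 1, n):
            h = math.dist(coords[i], coords[j])
            if h <= max_lag:
                b = int(h // bin_width)
                sums[b] += (values[i] - values[j]) ** 2
                counts[b] += 1
    return {(b + 0.5) * bin_width: sums[b] / (2 * counts[b])
            for b in sorted(counts)}
```

    Because each squared difference enters the bin average directly, a single heavy outlier inflates every bin it touches, which is exactly why the non-Gaussian fields in these studies favor robust or likelihood-based estimators.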

  4. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance are discussed, and the necessity for enhancements to the current processes is highlighted. PMID:17238324

  5. Ant colony optimization as a method for strategic genotype sampling.

    PubMed

    Spangler, M L; Robbins, K R; Bertrand, J K; Macneil, M; Rekaya, R

    2009-06-01

    A simulation study was carried out to develop an alternative method of selecting animals to be genotyped. Simulated pedigrees included 5000 animals, each assigned genotypes for a bi-allelic single nucleotide polymorphism (SNP) based on assumed allelic frequencies of 0.7/0.3 and 0.5/0.5. In addition to simulated pedigrees, two beef cattle pedigrees, one from field data and the other from a research population, were used to test selected methods using simulated genotypes. The proposed method of ant colony optimization (ACO) was evaluated based on the number of alleles correctly assigned to ungenotyped animals (AK(P)), the probability of assigning true alleles (AK(G)) and the probability of correctly assigning genotypes (APTG). ACO was compared to selection using the diagonal elements of the inverse of the relationship matrix (A(-1)). Comparisons of these two methods showed that ACO yielded an increase in AK(P) ranging from 4.98% to 5.16% and an increase in APTG from 1.6% to 1.8% using simulated pedigrees. Gains in field data and research pedigrees were slightly lower. These results suggest that ACO can provide a better genotyping strategy, when compared to A(-1), with different pedigree sizes and structures. PMID:19220227

  6. RAVEN Quality Assurance Activities

    SciTech Connect

    Cogliati, Joshua Joseph

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. To improve quality, reviews of code changes have been instituted, more of the testing has been automated, and improved packaging has been created. To upgrade the quality level, requirements have been created and the workflow has been improved.

  7. A Study of Tapered Beard Sampling Method as Used in HVI

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Beard method is used for sampling cotton fibers to generate fibrograms from which length parameters can be obtained. It is the sampling method used by HVI. HVI uses a fiber comb to sample cotton fibers and form a fiber beard for measuring fiber length parameters. A fundamental issue about this sampl...

  8. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    SciTech Connect

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  9. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  10. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  11. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  12. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  13. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  14. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  15. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  16. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  17. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  18. Software Safety Assurance of Programmable Logic

    NASA Technical Reports Server (NTRS)

    Berens, Kalynnda

    2002-01-01

    Programmable Logic (PLC, FPGA, ASIC) devices are hybrids - hardware devices that are designed and programmed like software. As such, they fall in an assurance gray area. Programmable Logic is usually tested and verified as hardware, and the software aspects are ignored, potentially leading to safety or mission success concerns. The objective of this proposal is first to determine where and how Programmable Logic (PL) is used within NASA and to document the current methods of assurance. Once that is known, the goal is to raise awareness of the PL software aspects within the NASA engineering community and provide guidance for the use and assurance of PL from a software perspective.

  19. Several methods for concentrating bacteria in fluid samples

    NASA Technical Reports Server (NTRS)

    Thomas, R. R.

    1976-01-01

    The sensitivities of the firefly luciferase-ATP flow system and the luminol flow system were established as 300,000 E. coli per milliliter and 10,000 E. coli per milliliter, respectively. To achieve the detection limit of 1,000 bacteria per milliliter previously established, a method of concentrating microorganisms using a Sartorius membrane filter system was investigated. Catalase in 50% ethanol was found to be a stable luminol standard that can be used for up to 24 hours with only a 10% loss of activity. The luminol reagent was also stable over a 24 hour period. A method of preparing relatively inexpensive luciferase from desiccated firefly tails was developed.

  20. Performance assurance program plan

    SciTech Connect

    Rogers, B.H.

    1997-11-06

    B and W Protec, Inc. (BWP) is responsible for implementing the Performance Assurance Program for the Project Hanford Management Contract (PHMC) in accordance with DOE Order 470.1, Safeguards and Security Program (DOE 1995a). The Performance Assurance Program applies to safeguards and security (SAS) systems and their essential components (equipment, hardware, administrative procedures, Protective Force personnel, and other personnel) in direct support of Category I and II special nuclear material (SNM) protection. Performance assurance includes several Hanford Site activities that conduct performance, acceptance, operability, effectiveness, and validation tests. These activities encompass areas of training, exercises, quality assurance, conduct of operations, total quality management, self assessment, classified matter protection and control, emergency preparedness, and corrective actions tracking and trending. The objective of the Performance Assurance Program is to capture the critical data of the tests, training, etc., in a cost-effective, manageable program that reflects the overall effectiveness of the program while minimizing operational impacts. To aid in achieving this objective, BWP will coordinate the Performance Assurance Program for Fluor Daniel Hanford, Inc. (FDH) and serve as the central point for data collection.

  1. TASK TECHNICAL AND QUALITY ASSURANCE PLAN FOR THE CHARACTERIZATION AND LEACHING OF A THERMOWELL AND CONDUCTIVITY PROBE PIPE SAMPLE FROM TANK 48H

    SciTech Connect

    Fondeur, F

    2005-11-02

    has been issued. This task plan outlines the proposed method of analysis and testing to estimate (1) the thickness of the solid deposit, (2) chemical composition of the deposits and (3) the leaching behavior of the solid deposits in inhibited water (IW) and in Tank 48H aggregate solution.

  2. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  3. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  4. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  5. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  6. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  7. TESTING METHODS FOR DETECTION OF CRYPTOSPORIDIUM SPP. IN WATER SAMPLES

    EPA Science Inventory

    A large waterborne outbreak of cryptosporidiosis in Milwaukee, Wisconsin, U.S.A. in 1993 prompted a search for ways to prevent large-scale waterborne outbreaks of protozoan parasitoses. Methods for detecting Cryptosporidium parvum play an integral role in strategies that lead to...

  8. A New IRT-Based Small Sample DIF Method.

    ERIC Educational Resources Information Center

    Tang, Huixing

    This paper describes an item response theory (IRT) based method of differential item functioning (DIF) detection that involves neither separate calibration nor ability grouping. IRT is used to generate residual scores, scores free of the effects of person or group ability and item difficulty. Analysis of variance is then used to test the group…
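    Under a Rasch model, the residual-score idea can be sketched as follows; the helper names and the comparison of group mean residuals (which would feed the ANOVA) are illustrative assumptions, since the paper's exact procedure is not reproduced here:

```python
import math

def rasch_prob(theta, b):
    """P(correct) under the Rasch model for ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def group_mean_residuals(responses, thetas, difficulty, groups):
    """Residual = observed response minus model-expected probability,
    which removes person-ability and item-difficulty effects; the group
    means of these residuals can then be compared (e.g. by ANOVA) to
    flag DIF for the item."""
    sums, counts = {}, {}
    for x, th, g in zip(responses, thetas, groups):
        r = x - rasch_prob(th, difficulty)
        sums[g] = sums.get(g, 0.0) + r
        counts[g] = counts.get(g, 0) + 1
    return {g: sums[g] / counts[g] for g in sums}
```

    A residual near zero on average in every group indicates no DIF; a systematic gap between the reference and focal group means is the DIF signal the ANOVA tests.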

  9. COMPARISON OF LARGE RIVER SAMPLING METHODS ON ALGAL METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  10. COMPARISON OF LARGE RIVER SAMPLING METHOD USING DIATOM METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  11. High-throughput liquid-absorption preconcentrator sampling methods

    DOEpatents

    Zaromb, Solomon

    1994-01-01

    A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis.

  12. Importance Sampling Approach for the Nonstationary Approximation Error Method

    NASA Astrophysics Data System (ADS)

    Huttunen, J. M. J.; Lehikoinen, A.; Hämäläinen, J.; Kaipio, J. P.

    2010-09-01

    The approximation error approach has been earlier proposed to handle modelling, numerical and computational errors in inverse problems. The idea of the approach is to include the errors in the forward model and compute the approximate statistics of the errors using Monte Carlo sampling. This can be a computationally tedious task but the key property of the approach is that the approximate statistics can be calculated off-line before the measurement process takes place. In nonstationary problems, however, information is accumulated over time, and the initial uncertainties may turn out to have been exaggerated. In this paper, we propose an importance weighting algorithm with which the approximation error statistics can be updated during the accumulation of measurement information. As a computational example, we study an estimation problem that is related to a convection-diffusion problem in which the velocity field is not accurately specified.
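    The reweighting step can be sketched with self-normalized importance weights: the off-line Monte Carlo draws are kept, but each error sample is reweighted by the ratio of the updated density to the original sampling density. The function name and log-density interface are illustrative assumptions:

```python
import math

def reweighted_error_stats(samples, errors, log_old, log_new):
    """Update off-line approximation-error statistics by self-normalized
    importance weighting when the distribution of the unknowns changes.
    samples: the original Monte Carlo draws; errors: the corresponding
    modelling-error samples; log_old / log_new: log-densities of the
    original (sampling) and updated distributions."""
    logw = [log_new(x) - log_old(x) for x in samples]
    m = max(logw)                        # stabilize the exponentials
    w = [math.exp(l - m) for l in logw]
    total = sum(w)
    w = [wi / total for wi in w]         # self-normalized weights
    mean = sum(wi * e for wi, e in zip(w, errors))
    var = sum(wi * (e - mean) ** 2 for wi, e in zip(w, errors))
    return mean, var
```

    When the two densities coincide the weights are uniform and the plain Monte Carlo mean and variance are recovered, so the update is consistent with the original off-line statistics.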

  13. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.8... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each...

  14. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.8... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each...

  15. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.8... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each...

  16. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.8... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each...

  17. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... MINING PRODUCTS REQUIREMENTS FOR THE APPROVAL OF FLAME-RESISTANT CONVEYOR BELTS General Provisions § 14.8... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each...

  18. Tissue sampling methods and standards for vertebrate genomics

    PubMed Central

    2012-01-01

    The recent rise in speed and efficiency of new sequencing technologies has facilitated high-throughput sequencing, assembly and analyses of genomes, advancing ongoing efforts to analyze genetic sequences across major vertebrate groups. Standardized procedures in acquiring high quality DNA and RNA and establishing cell lines from target species will facilitate these initiatives. We provide a legal and methodological guide according to four standards of acquiring and storing tissue for the Genome 10K Project and similar initiatives as follows: four-star (banked tissue/cell cultures, RNA from multiple types of tissue for transcriptomes, and sufficient flash-frozen tissue for 1 mg of DNA, all from a single individual); three-star (RNA as above and frozen tissue for 1 mg of DNA); two-star (frozen tissue for at least 700 μg of DNA); and one-star (ethanol-preserved tissue for 700 μg of DNA or less of mixed quality). At a minimum, all tissues collected for the Genome 10K and other genomic projects should consider each species’ natural history and follow institutional and legal requirements. Associated documentation should detail as much information as possible about provenance to ensure representative sampling and subsequent sequencing. Hopefully, the procedures outlined here will not only encourage success in the Genome 10K Project but also inspire the adaptation of standards by other genomic projects, including those involving other biota. PMID:23587255

  19. Assuring Life in Composite Systems

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A computational simulation method is presented to assure life in composite systems by using dynamic buckling of smart composite shells as an example. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enables the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 9% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivity results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load. The uncertainties in the electric field strength and smart material volume fraction have moderate effects on the buckling load, and thereby on the assured life of the shell.

  20. A sampling method for conducting relocation studies with freshwater mussels

    USGS Publications Warehouse

    Waller, D.L.; Rach, J.J.; Cope, W.G.; Luoma, J.A.

    1993-01-01

    Low recovery of transplanted mussels often prevents accurate estimates of survival. We developed a method that provided a high recovery of transplanted mussels and allowed for a reliable assessment of mortality. A 3 x 3 m polyvinyl chloride (PVC) pipe grid was secured to the sediment with iron reinforcing bars. The grid was divided into nine 1-m² segments, and each treatment segment was stocked with 100 marked mussels. The recovery of mussels after six months exceeded 80% in all but one treatment group.

  1. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  2. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  3. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    Health risk analysis methodology is now in demand at all levels of government in Russia. Combined with mathematical modeling, spatial-temporal analysis, and economic tools, risk assessment makes it possible to determine the level of safety of the population, workers, and consumers, and to identify the priority resources and threat factors on which to concentrate effort. At the planning stage, risk assessment provides a basis for selecting the most effective measures to minimize hazards; at the implementation stage, it allows the efficiency of those measures to be estimated; and at the control and supervision stage, it permits priorities to be set so that effort is concentrated on the objects posing the greatest health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated into the system of state hygienic regulation, the formation of the evidence base for harm to health, and the organization of control and supervisory activities. This harmonizes the domestic legal framework with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products, and services. Further tasks include extending the methodology of health risk analysis to the assurance of sanitary and epidemiological well-being and occupational health; developing the informational and analytical base, particularly exposure-response models for different types and levels of exposure and risk contingents; improving the accuracy of exposure estimates; and refining the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  4. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    Health risk analysis methodology is now in demand at all levels of government in Russia. Combined with mathematical modeling, spatial-temporal analysis, and economic tools, risk assessment makes it possible to determine the level of safety of the population, workers, and consumers, and to identify the priority resources and threat factors on which to concentrate effort. At the planning stage, risk assessment provides a basis for selecting the most effective measures to minimize hazards; at the implementation stage, it allows the efficiency of those measures to be estimated; and at the control and supervision stage, it permits priorities to be set so that effort is concentrated on the objects posing the greatest health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated into the system of state hygienic regulation, the formation of the evidence base for harm to health, and the organization of control and supervisory activities. This harmonizes the domestic legal framework with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products, and services. Further tasks include extending the methodology of health risk analysis to the assurance of sanitary and epidemiological well-being and occupational health; developing the informational and analytical base, particularly exposure-response models for different types and levels of exposure and risk contingents; improving the accuracy of exposure estimates; and refining the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens.

  5. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly to numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices, and we note an overly fast concentration of measure in the quantum state space that appears with this parametrization.
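
The standard (spectral) and Ginibre constructions reviewed above can be sketched in a few lines of numpy. This is a generic illustration of the two techniques, not the authors' code; the dimensions and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)

def ginibre_density_matrix(d, rng):
    """Overparametrized/Ginibre route: rho = G G† / tr(G G†) is automatically
    Hermitian, positive semidefinite, and of unit trace."""
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    m = g @ g.conj().T
    return m / np.trace(m)

def standard_density_matrix(d, rng):
    """Standard route: a random spectrum (flat Dirichlet, an unbiased DPD)
    rotated by a Haar-distributed unitary obtained from QR of a Ginibre matrix."""
    p = rng.dirichlet(np.ones(d))                      # random eigenvalues, sum to 1
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(g)
    q = q @ np.diag(np.diag(r) / np.abs(np.diag(r)))   # phase fix -> Haar measure
    return q @ np.diag(p + 0j) @ q.conj().T

rho = ginibre_density_matrix(4, rng)
# Valid quantum state: Hermitian, unit trace, no negative eigenvalues.
assert np.allclose(rho, rho.conj().T)
assert np.isclose(np.trace(rho).real, 1.0)
assert np.linalg.eigvalsh(rho).min() > -1e-12
```

Induced measures differ between the two routes, which is precisely the kind of sampling-bias question the survey addresses.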

  6. Determination of methylmercury in marine biota samples: method validation.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but determining the total amount alone is not sufficient to fully judge the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low-level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step were successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH were used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800 pg), recovery (97%), precision, traceability, limit of detection (0.45 pg), limit of quantification (0.85 pg) and expanded uncertainty (15.86%, k=2) were assessed with the Fish protein DORM-3 Certified Reference Material.

  7. Determination of optimal sampling times for a two blood sample clearance method using (51)Cr-EDTA in cats.

    PubMed

    Vandermeulen, Eva; De Sadeleer, Carlos; Piepsz, Amy; Ham, Hamphrey R; Dobbeleir, André A; Vermeire, Simon T; Van Hoek, Ingrid M; Daminet, Sylvie; Slegers, Guido; Peremans, Kathelijne Y

    2010-08-01

    Estimation of the glomerular filtration rate (GFR) is a useful tool in the evaluation of kidney function in feline medicine. GFR can be determined by measuring the rate of tracer disappearance from the blood, and although these measurements are generally performed by multi-sampling techniques, simplified methods are more convenient in clinical practice. The optimal times for a simplified sampling strategy with two blood samples (2BS) for GFR measurement in cats using plasma (51)chromium ethylene diamine tetra-acetic acid ((51)Cr-EDTA) clearance were investigated. After intravenous administration of (51)Cr-EDTA, seven blood samples were obtained in 46 cats (19 euthyroid and 27 hyperthyroid cats, none with previously diagnosed chronic kidney disease (CKD)). The plasma clearance was then calculated from the seven-point blood kinetics (7BS) and used as the reference to define the optimal sampling strategy by correlating different pairs of time points with the reference method. Mean GFR estimation for the reference method was 3.7+/-2.5 ml/min/kg (mean+/-standard deviation (SD)). Several pairs of sampling times were highly correlated with this reference method (r(2) > or = 0.980), with the best results when the first sample was taken 30 min after tracer injection and the second sample between 198 and 222 min after injection, or with the first sample at 36 min and the second at 234 or 240 min (r(2) for both combinations=0.984). Because of the similarity of GFR values obtained with the 2BS method to those obtained with the 7BS reference method, the simplified method may offer an alternative for GFR estimation. Although a wide range of GFR values was found in the included group of cats, the applicability should be confirmed in cats suspected of renal disease and with confirmed CKD. Furthermore, although no indication of an age-related effect was found in this study, a possible influence of age should be included in future studies. PMID:20452793
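
The two-sample strategy rests on fitting a mono-exponential C(t) = C0·exp(−kt) through the two points and taking clearance as dose divided by the area under the curve. A minimal sketch with invented numbers (real protocols apply further corrections, e.g. for the fast distribution phase, which a pure two-sample mono-exponential ignores):

```python
import math

def two_sample_clearance(dose, t1, c1, t2, c2, body_weight_kg):
    """Slope-intercept plasma clearance from two blood samples, assuming
    mono-exponential tracer disappearance C(t) = c0 * exp(-k * t).
    Dose and concentrations must use consistent units; times in minutes."""
    k = math.log(c1 / c2) / (t2 - t1)     # elimination rate constant (1/min)
    c0 = c1 * math.exp(k * t1)            # back-extrapolated concentration at t = 0
    auc = c0 / k                          # area under the curve from 0 to infinity
    return (dose / auc) / body_weight_kg  # clearance in ml/min/kg

# Hypothetical example: 20 MBq injected in a 4 kg cat, samples at 30 and
# 210 min with plasma activities of 0.012 and 0.004 MBq/ml (values invented).
gfr = two_sample_clearance(dose=20.0, t1=30.0, c1=0.012, t2=210.0, c2=0.004,
                           body_weight_kg=4.0)  # ~2.1 ml/min/kg
```

The sampling-time optimization in the abstract amounts to choosing (t1, t2) so that this two-point estimate tracks the seven-point reference as closely as possible.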

  8. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    PubMed

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice assigns each outlier a rank in the hierarchy, which relates to its sparsity in the distribution. In this study, we define lower-rank (first-ranked), medium-rank (second-ranked), and highest-rank (third-ranked) outliers. For instance, the first-ranked outliers are located in regions of conformational space far from the clusters (highly sparse distribution), whereas the third-ranked outliers lie near the clusters (moderately sparse distribution). To make the conformational search efficient, resampling is performed from the outliers of a given rank. As demonstrations, this method was applied to several model systems: alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD greatly accelerated the exploration of conformational space by expanding its edges. In contrast, the third-ranked OFLOOD intensively reproduced local transitions among neighboring metastable states. For quantitative evaluation of the sampled snapshots, free energy calculations were performed in combination with umbrella sampling, providing rigorous landscapes of the biomolecules.
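
The resampling idea, restarting simulations from conformations in sparsely populated regions, can be caricatured without FlexDice. The grid-density stand-in below is an invented toy, not the authors' hierarchical clustering, and it has no notion of outlier ranks:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_outlier_seeds(samples, bins=10, n_seeds=5):
    """Return the samples sitting in the most sparsely populated occupied
    grid cells; these play the role of OFLOOD's restart structures."""
    hist, edges = np.histogramdd(samples, bins=bins)
    # Grid-cell index of every sample along each dimension.
    idx = np.stack([np.clip(np.searchsorted(e, samples[:, i], side="right") - 1,
                            0, bins - 1)
                    for i, e in enumerate(edges)], axis=1)
    occupancy = hist[tuple(idx.T)]       # how crowded each sample's cell is
    return samples[np.argsort(occupancy)[:n_seeds]]

# Two dense "metastable states" plus a few stragglers between them.
points = np.concatenate([rng.normal(0.0, 0.05, size=(200, 2)),
                         rng.normal(1.0, 0.05, size=(200, 2)),
                         rng.uniform(0.3, 0.7, size=(5, 2))])
seeds = sparse_outlier_seeds(points, bins=10, n_seeds=5)  # restart candidates
```

In the real method the restart structures seed new molecular dynamics trajectories, which is what drives transitions between the metastable basins.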

  9. [Comparative Analysis of Spectrophotometric Methods of the Protein Measurement in the Pectic Polysaccharide Samples].

    PubMed

    Ponomareva, S A; Golovchenko, V V; Patova, O A; Vanchikova, E V; Ovodov, Y S

    2015-01-01

    To assess the reliability of determining the protein content of pectic polysaccharide samples by absorbance in the ultraviolet and visible regions of the spectrum, eleven techniques were compared: the Flores, Lowry, Bradford, Sedmak, and Ruhemann (ninhydrin reaction) methods, ultraviolet spectrophotometry, and the Benedict's reagent, Nessler's reagent, amido black, bicinchoninic acid, and biuret methods. The data obtained show that seven of these techniques are insufficiently sensitive for determining the protein content of pectic polysaccharide samples. The Lowry, Bradford, and Sedmak methods and the Nessler's reagent method, however, may be used; the Bradford method is advisable for determining protein contaminant content when the protein content is below 15%, and the Lowry method when it is above 15%. PMID:26165122

  10. Method for sequential injection of liquid samples for radioisotope separations

    DOEpatents

    Egorov, Oleg B.; Grate, Jay W.; Bray, Lane A.

    2000-01-01

    The present invention is a method of separating a short-lived daughter isotope from a longer lived parent isotope, with recovery of the parent isotope for further use. Using a system with a bi-directional pump and one or more valves, a solution of the parent isotope is processed to generate two separate solutions, one of which contains the daughter isotope, from which the parent has been removed with a high decontamination factor, and the other of which contains the recovered parent isotope. The process can be repeated on this solution of the parent isotope. The system with the fluid drive and one or more valves is controlled by a program on a microprocessor executing a series of steps to accomplish the operation. In one approach, the "cow" (parent isotope) solution is passed through a separation medium that selectively retains the desired daughter isotope, while the parent isotope and the matrix pass through the medium. After washing this medium, the daughter is released from the separation medium using another solution. With the automated generator of the present invention, all solution handling steps necessary to perform a daughter/parent radionuclide separation, e.g. Bi-213 from an Ac-225 "cow" solution, are performed in a consistent, enclosed, and remotely operated format. Operator exposure and spread of contamination are greatly minimized compared to the manual generator procedure described in U.S. patent application Ser. No. 08/789,973, now U.S. Pat. No. 5,749,042, herein incorporated by reference. Using 16 mCi of Ac-225 there was no detectable external contamination of the instrument components.

  11. Flight Dynamics Mission Support and Quality Assurance Process

    NASA Technical Reports Server (NTRS)

    Oh, InHwan

    1996-01-01

    This paper summarizes the Computer Sciences Corporation Flight Dynamics Operation (FDO) quality assurance approach to supporting the National Aeronautics and Space Administration Goddard Space Flight Center Flight Dynamics Support Branch. Historically, a strong need has existed for systematic quality assurance using methods that account for the unique nature and environment of satellite Flight Dynamics mission support. Over the past few years FDO has developed and implemented proactive quality assurance processes applied to each of the six phases of the Flight Dynamics mission support life cycle: systems and operations concept, system requirements and specifications, software development support, operations planning and training, launch support, and on-orbit mission operations. Rather than performing quality assurance as a final step after work is completed, quality assurance has been built in as work progresses in the form of process assurance. Process assurance activities occur throughout the Flight Dynamics mission support life cycle. The FDO Product Assurance Office developed process checklists for prephase process reviews, mission team orientations, in-progress reviews, and end-of-phase audits. This paper will outline the evolving history of FDO quality assurance approaches, discuss the tailoring of Computer Sciences Corporation's process assurance cycle procedures, describe some of the quality assurance approaches that have been or are being developed, and present some of the successful results.

  12. Assuring eating quality of meat.

    PubMed

    Dalen, G A

    1996-01-01

    The way of assuring quality has changed over the years, from inspection of the end product to quality management systems and on-line process control. The latter concepts have had a great impact on many industries during the last decades. But the concept of Total Quality is continuous improvement, so it is time to take advantage of the next generation of quality assurance tools: Quality by Design, the most powerful instrument in quality assurance today. Quality by Design has been used with outstanding results in many industries, such as the automobile and electronics industries. Perhaps the meat industry will be next? To succeed, the eating-quality attributes that matter most to the customer must be brought into focus. The challenge to the meat research scientist is to design products and processes that meet customer needs despite variation in the raw material and the consumer's rough handling. The Quality Management Standards are helpful in conducting the design and production process, but to focus on the right aspects there is also a need for suitable methods such as Quality Function Deployment. Customer needs change and new research overturns old 'truths'. This requires an organisation, a quality system, and a culture that can handle rapid change and a diversity of customer needs.

  13. Initial evaluation of Centroidal Voronoi Tessellation method for statistical sampling and function integration.

    SciTech Connect

    Romero, Vicente Jose; Peterson, Janet S.; Burkhardt, John V.; Gunzburger, Max Donald

    2003-09-01

    A recently developed Centroidal Voronoi Tessellation (CVT) unstructured sampling method is investigated here to assess its suitability for use in statistical sampling and function integration. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-dimensional parameter spaces. It has recently been shown on several 2-D test problems to provide superior point distributions for generating locally conforming response surfaces. In this paper, its performance as a statistical sampling and function integration method is compared to that of Latin-Hypercube Sampling (LHS) and Simple Random Sampling (SRS) Monte Carlo methods, and Halton and Hammersley quasi-Monte Carlo sequence methods. Specifically, sampling efficiencies are compared for function integration and for resolving various statistics of response in a 2-D test problem. It is found that on balance CVT performs best of all these sampling methods on our test problems.
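
A CVT point set can be approximated with plain Lloyd iterations against a dense uniform probe cloud (essentially k-means on the unit hypercube). This is a generic sketch of the technique, not the implementation evaluated in the report; the probe count and iteration budget are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def cvt_sample(n_points, dim, n_probe=20000, iters=50):
    """Approximate Centroidal Voronoi Tessellation sampling of [0, 1]^dim:
    repeatedly move each generator to the centroid of its Voronoi cell,
    with the cell geometry estimated from a dense uniform probe cloud."""
    probes = rng.uniform(size=(n_probe, dim))
    gens = rng.uniform(size=(n_points, dim))  # initial (random) generators
    for _ in range(iters):
        # Nearest generator for every probe point = discrete Voronoi cells.
        d2 = ((probes[:, None, :] - gens[None, :, :]) ** 2).sum(axis=-1)
        cell = d2.argmin(axis=1)
        for k in range(n_points):
            members = probes[cell == k]
            if len(members):               # empty cells keep their generator
                gens[k] = members.mean(axis=0)
    return gens

pts = cvt_sample(16, 2)  # 16 highly uniform sample points in the unit square
```

Compared with simple random sampling, the generators end up almost evenly spread, which is the property behind the integration-efficiency comparison in the abstract.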

  14. A Typology of Mixed Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2007-01-01

    This paper provides a framework for developing sampling designs in mixed methods research. First, we present sampling schemes that have been associated with quantitative and qualitative research. Second, we discuss sample size considerations and provide sample size recommendations for each of the major research designs for quantitative and…

  15. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extremum points of the metamodels and the minimum points of a density function. More accurate metamodels can then be constructed by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
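
The two ingredients, an RBF metamodel and the selection of the next sample at a model extremum, can be sketched generically. The Gaussian kernel, test function, and parameters below are invented for illustration; the paper's density-function criterion is not reproduced:

```python
import numpy as np

def fit_rbf(x, y, eps=4.0):
    """Interpolating Gaussian RBF metamodel through samples x (n x d), y (n,)."""
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)
    w = np.linalg.solve(np.exp(-eps * d2), y)   # interpolation weights
    def model(xq):
        q2 = ((xq[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-eps * q2) @ w
    return model

rng = np.random.default_rng(2)
f = lambda p: np.sin(3 * p[:, 0]) + np.cos(2 * p[:, 1])  # expensive "simulation"
x = rng.uniform(size=(20, 2))                            # initial design
model = fit_rbf(x, f(x))

# Sequential step: take the metamodel's extremum over a candidate pool
# as the next sampling point, evaluate f there, and refit.
pool = rng.uniform(size=(500, 2))
x_next = pool[model(pool).argmin()]
```

Each added point sharpens the metamodel exactly where the current surrogate predicts interesting behavior, which is the core of sequential optimization sampling.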

  16. Comparison of Assurance GDS(®) MPX ID for Top STEC with Reference Culture Methods for the Detection of E. coli Top 6 STEC; Direct Confirmation of Top 6 STEC from Isolation Plates and Determination of Equivalence of PickPen(®) and FSIS OctoMACS™ Concentration Protocols.

    PubMed

    Feldsine, Philip; Lienau, Andrew H; Shah, Khyati; Immermann, Amy; Soliven, Khanh; Kaur, Mandeep; Kerr, David E; Jucker, Markus; Hammack, Tom; Brodsky, Michael; Agin, James

    2016-01-01

    Assurance GDS(®) MPX ID for Top Shiga toxin-producing Escherichia coli (STEC; MPX ID) was validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for Foods and Environmental Surfaces as (1) a secondary screening method for specific detection of the Top 6 STEC serogroups (O26, O45, O103, O111, O121, and O145) in raw beef trim, raw ground beef, raw spinach, and on stainless steel; and (2) a confirmatory method for the identification of pure culture isolates as Top 6 STEC. MPX ID is used in conjunction with the upfront BCS Assurance GDS MPX Top 7 STEC assay. This Performance Tested Method(SM) validation has two main parts: Method Developer studies and the Independent Laboratory study. A total of 180 samples and controls were analyzed. Results showed that MPX ID had no statistically significant differences from the reference culture methods for the detection of Top 6 STEC in the food matrices (raw beef trim, raw ground beef, and raw spinach) and environmental sponges (stainless steel) studied. Inclusivity/exclusivity studies were also conducted. One hundred percent inclusivity among the 50 Top 6 STEC serovars tested and 100% exclusivity for the 30 non-Top 6 STEC organisms tested were demonstrated. For validation of MPX ID as a confirmatory method for isolated colonies, all inclusivity and exclusivity organisms were streaked for isolation onto five STEC plating media: modified rainbow agar, Levine's eosin-methylene blue (L-EMB) agar, rainbow agar with novobiocin and cefixime, and enterohemolysin agar with selective agents, as well as trypticase soy agar with yeast extract. These isolated colonies were suspended and analyzed by Assurance GDS MPX Top 7 STEC and MPX ID. MPX ID was able to correctly confirm all inclusivity organisms from all plate types, except two STEC isolates from L-EMB agar plates only in the Independent Laboratory study. All exclusivity organisms were correctly determined by MPX ID to be non-Top 6 STEC.

  17. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    PubMed

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-01

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
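
The BAR estimate itself reduces to a one-dimensional self-consistent equation in the free energy difference. A generic sketch (equal forward/reverse sample sizes, β = 1, Crooks-consistent Gaussian test data; this is not the CHARMM implementation):

```python
import numpy as np

def bar(w_f, w_r, beta=1.0, tol=1e-9):
    """Bennett acceptance ratio free energy difference from forward work
    values w_f (state 0 -> 1) and reverse work values w_r (state 1 -> 0),
    assuming equal sample sizes. Solved by bisection: the Fermi-weighted
    forward and reverse sums must balance at the true dF."""
    def imbalance(df):
        fwd = 1.0 / (1.0 + np.exp(beta * (w_f - df)))
        rev = 1.0 / (1.0 + np.exp(beta * (w_r + df)))
        return fwd.sum() - rev.sum()     # monotonically increasing in df
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (lo, mid) if imbalance(mid) > 0 else (mid, hi)
    return 0.5 * (lo + hi)

# Gaussian work distributions consistent with the Crooks relation for
# a true dF = 1 and unit work variance (beta = 1).
rng = np.random.default_rng(3)
w_f = rng.normal(1.0 + 0.5, 1.0, size=20000)
w_r = rng.normal(-1.0 + 0.5, 1.0, size=20000)
df_est = bar(w_f, w_r)  # close to the true value of 1
```

With SGLD the work values must first be reweighted to remove the guiding-force bias, which is the correction the paper derives; the estimator above then applies unchanged.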

  18. Comparing Respondent-Driven Sampling and Targeted Sampling Methods of Recruiting Injection Drug Users in San Francisco

    PubMed Central

    Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N.; Lorvick, Jennifer; McFarland, Willi; Raymond, H. Fisher

    2010-01-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the proportion of IDUs distributed across zip codes were similar for the TS and RDS sample, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS. PMID:20582573
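
The confidence-interval comparison for a single variable is easy to reproduce. The sketch below uses plain Wald intervals with counts back-calculated from the reported percentages (illustrative only; actual RDS intervals are wider because RDS estimators carry design effects, which is why the paper's RDS interval reaches down to 29%):

```python
import math

def prop_ci(k, n, z=1.96):
    """Wald 95% confidence interval for a proportion k/n."""
    p = k / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

# African American race, the one variable whose CIs did not overlap.
ts_lo, ts_hi = prop_ci(round(0.49 * 651), 651)    # targeted sampling, n = 651
rds_lo, rds_hi = prop_ci(round(0.36 * 534), 534)  # respondent-driven, n = 534
overlap = ts_lo <= rds_hi and rds_lo <= ts_hi     # False: intervals are disjoint
```

Non-overlapping intervals are the paper's criterion for declaring a difference between the two sampling methods on a given variable.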

  19. Comparing respondent-driven sampling and targeted sampling methods of recruiting injection drug users in San Francisco.

    PubMed

    Kral, Alex H; Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N; Lorvick, Jennifer; McFarland, Willi; Raymond, H Fisher

    2010-09-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the proportion of IDUs distributed across zip codes were similar for the TS and RDS sample, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS.

  20. {sup 222}Rn in water: A comparison of two sample collection methods and two sample transport methods, and the determination of temporal variation in North Carolina ground water

    SciTech Connect

    Hightower, J.H. III

    1994-12-31

    Objectives of this field experiment were: (1) determine whether there was a statistically significant difference between the radon concentrations of samples collected by EPA's standard method, using a syringe, and an alternative, slow-flow method; (2) determine whether there was a statistically significant difference between the measured radon concentrations of samples mailed vs samples not mailed; and (3) determine whether there was a temporal variation of water radon concentration over a 7-month period. The field experiment was conducted at 9 sites (5 private wells and 4 public wells) at various locations in North Carolina. Results showed that a syringe is not necessary for sample collection, there was generally no significant radon loss due to mailing samples, and there was statistically significant evidence of temporal variation in water radon concentrations.
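
Objectives (1) and (2) are paired comparisons of split samples from the same wells, for which a paired t statistic suffices. The concentrations below are invented for illustration:

```python
import math

def paired_t(x, y):
    """Paired t statistic and degrees of freedom for matched measurements,
    e.g. mailed vs. unmailed splits of the same water sample."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # variance of the differences
    return mean / math.sqrt(var / n), n - 1

# Hypothetical radon concentrations (pCi/L) for split samples at the 9 sites.
unmailed = [1200, 3400, 560, 2100, 880, 4500, 1500, 760, 2900]
mailed   = [1180, 3350, 540, 2150, 860, 4420, 1490, 750, 2880]
t_stat, dof = paired_t(unmailed, mailed)  # compare |t| to t_crit(8 df) ~ 2.31
```

A |t| below the critical value, as with these invented numbers, corresponds to the study's finding of no significant radon loss due to mailing.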

  1. Sample Size Determination: A Comparison of Attribute, Continuous Variable, and Cell Size Methods.

    ERIC Educational Resources Information Center

    Clark, Philip M.

    1984-01-01

    Describes three methods of sample size determination, each having its use in the investigation of social science problems: the Attribute method, the Continuous Variable method, and Galtung's Cell Size method. Statistical generalization, benefits of the cell size method (ease of use, trivariate analysis, and trichotomized variables), and choice of method are…

  2. A method for sampling halothane and enflurane present in trace amounts in ambient air.

    PubMed

    Burm, A G; Spierdijk, J

    1979-03-01

    A method for the sampling of small amounts of halothane and enflurane in ambient air is described. Sampling is performed by drawing air through a sampling tube packed with Porapak Q, which absorbs the anesthetic agent. The amount absorbed is determined by gas chromatography after thermal desorption. This method can be used for "spot" or personal sampling or for determining mean whole-room concentrations over relatively long periods (several hours).

  3. Quality Assurance for All

    ERIC Educational Resources Information Center

    Cheung, Peter P. T.; Tsui, Cecilia B. S.

    2010-01-01

    For higher education reform, most decision-makers aspire to achieving a higher participation rate and a respectable degree of excellence with diversity at the same time. But very few know exactly how. External quality assurance is a fair basis for differentiation but there can be doubt and resistance in some quarters. Stakeholder interests differ…

  4. Mission Operations Assurance

    NASA Technical Reports Server (NTRS)

    Faris, Grant

    2012-01-01

    Integrate the mission operations assurance function into the flight team by: (1) providing value-added support in identifying, mitigating, and communicating the project's risks and (2) serving as an essential member of the team during test activities, training exercises, and critical flight operations.

  5. Quality assurance in Zambia.

    PubMed

    Reinke, J; Tembo, J; Limbambala, M F; Chikuta, S; Zaenger, D

    1996-01-01

    Primary health care reforms in Zambia have focused on the themes of effective leadership, community involvement, and improved service quality. To achieve these goals, the Ministry of Health's structure has been decentralized and a Health Reforms Implementation Team (including a Quality Assurance Unit) has been established. This unit collaborates with government and private sector organizations and professional groups in areas such as strategic planning, problem solving, facility assessment, standards setting, and indicator development. Each province has two linkage facilitators who provide district-level training and support to quality assurance coaches. As part of this process, staff at Nanga Rural Health Center in Mazabuka District selected patient privacy as a priority quality assurance issue and established an enclosed area for patient interviews. This measure facilitated increased patient disclosure about and comfort with discussing sensitive medical issues such as family planning and sexually transmitted diseases. Next, the health center staff examined the problem of pharmaceutical shortages, and user fees were identified as a means of purchasing commonly unavailable drugs. At the Magoye Rural Health Center, quality assurance assessment led to the consolidation of services such as infant weighing and immunization at the same location, thereby significantly increasing service utilization.

  6. Field Methods and Quality-Assurance Plan for Quality-of-Water Activities, U.S. Geological Survey, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Knobel, LeRoy L.; Tucker, Betty J.; Rousseau, Joseph P.

    2008-01-01

    Water-quality activities conducted by the staff of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation's water resources. The activities are conducted in cooperation with the U.S. Department of Energy's (DOE) Idaho Operations Office. Results of the water-quality investigations are presented in various USGS publications or in refereed scientific journals. The results of the studies are highly regarded, and they are used with confidence by researchers, regulatory and managerial agencies, and interested civic groups. In its broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the 'state-of-the-art' technology, and quality assurance ensures that quality control is maintained within specified limits.

  7. Field methods and quality-assurance plan for water-quality activities and water-level measurements, U.S. Geological Survey, Idaho National Laboratory, Idaho

    USGS Publications Warehouse

    Bartholomay, Roy C.; Maimer, Neil V.; Wehnke, Amy J.

    2014-01-01

    Water-quality activities and water-level measurements by the personnel of the U.S. Geological Survey (USGS) Idaho National Laboratory (INL) Project Office coincide with the USGS mission of appraising the quantity and quality of the Nation’s water resources. The activities are carried out in cooperation with the U.S. Department of Energy (DOE) Idaho Operations Office. Results of the water-quality and hydraulic head investigations are presented in various USGS publications or in refereed scientific journals and the data are stored in the National Water Information System (NWIS) database. The results of the studies are used by researchers, regulatory and managerial agencies, and interested civic groups. In the broadest sense, quality assurance refers to doing the job right the first time. It includes the functions of planning for products, review and acceptance of the products, and an audit designed to evaluate the system that produces the products. Quality control and quality assurance differ in that quality control ensures that things are done correctly given the “state-of-the-art” technology, and quality assurance ensures that quality control is maintained within specified limits.

  8. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

    Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data are collected for interpolation to form field distributions, the sampling locations (the locations of the point sensors) have a significant effect on the time invested, labor costs, and accuracy of the field interpolation. This investigation compared two different methods for determining sampling locations: the grid method and the gradient-based method. The two methods were applied to obtain point air parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained were then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method has a 32.6% smaller interpolation error than the grid sampling method. We derived the function relating interpolation error to sampling size (the number of sampling points). According to this function, the sampling size has an optimal value, and the maximum sampling size can be determined from the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
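
The abstract does not spell out the gradient-based selection algorithm. One plausible minimal sketch, assuming a coarse prior estimate of the field is available (e.g. from CFD), concentrates sensors where the local gradient magnitude is largest:

```python
import numpy as np

def gradient_based_locations(field: np.ndarray, n_samples: int):
    """Pick the n grid cells with the largest local gradient magnitude.

    `field` is a coarse 2-D prior estimate of the parameter
    (e.g. air temperature) on a regular grid."""
    gy, gx = np.gradient(field)          # finite-difference gradients
    mag = np.hypot(gx, gy)               # gradient magnitude per cell
    flat = np.argsort(mag, axis=None)[::-1][:n_samples]
    return [tuple(idx) for idx in np.array(np.unravel_index(flat, field.shape)).T]

# Illustrative: a field with a steep change near its last column.
field = np.zeros((5, 5))
field[:, 4] = 10.0
locs = gradient_based_locations(field, 3)
print(locs)  # all selected cells sit on the steep edge
```

A grid method, by contrast, would spread the same three sensors uniformly regardless of where the field actually varies.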

  9. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating the sampling uncertainty of a data acquisition board is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation theory for evaluating data acquisition board sampling uncertainty based on the Monte Carlo method, and puts forward a model relating the sampling uncertainty results, sampling numbers, and simulation times. For different sample numbers and different signal scopes, the authors establish a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, demonstrating the validity of the Monte Carlo method.
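
The record does not reproduce the simulation itself. A hedged, generic sketch of Monte Carlo evaluation of sampling uncertainty follows, combining additive noise with ADC quantization; the noise level, LSB size, and counts are illustrative, not the PCI-6024E's specifications:

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_sampling_uncertainty(true_value=1.0, lsb=0.01, noise_sd=0.005,
                            n_samples=100, n_trials=10_000):
    """Monte Carlo estimate of the standard uncertainty of an averaged
    DAQ reading: each trial simulates n_samples noisy, quantized
    acquisitions and records their mean."""
    means = np.empty(n_trials)
    for i in range(n_trials):
        readings = true_value + rng.normal(0.0, noise_sd, n_samples)
        quantized = np.round(readings / lsb) * lsb   # ideal ADC quantizer
        means[i] = quantized.mean()
    return means.std(ddof=1)

u = mc_sampling_uncertainty()
print(f"u ≈ {u:.2e}")
```

For these assumed figures, the result should sit near the GUM-style analytic value sqrt(σ² + LSB²/12)/√n, which is the kind of agreement the paper reports.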

  10. A comparison of swab and maceration methods for bacterial sampling of pig carcasses.

    PubMed Central

    Morgan, I. R.; Krautil, F.; Craven, J. A.

    1985-01-01

    A swabbing technique was compared with an excision and maceration technique for bacteriological sampling of pig carcass skin surfaces. Total viable counts at 37 degrees C obtained by swabbing were 46% of those obtained by maceration. At 21 degrees C, swabbing gave total viable counts which were 54% of the counts obtained from excision samples. Escherichia coli counts showed wide variation with both sampling methods. Neither method was more efficient than the other in recovering E. coli, although excision sampling gave generally higher counts. Both methods were equally effective at recovering salmonellae from carcass surfaces. There was no significant difference between the methods in recovering particular Salmonella serotypes. PMID:3905957

  11. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  12. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
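
The algorithms are described only at a high level in the abstract. Their common ingredient is the Metropolis criterion for exchanging configurations between two temperatures; a minimal sketch with illustrative values (β = 1/kT), not the STDR/VREX implementations themselves:

```python
import math
import random

random.seed(1)

def accept_swap(beta_i: float, beta_j: float, e_i: float, e_j: float) -> bool:
    """Metropolis swap criterion between replicas at inverse temperatures
    beta_i, beta_j with potential energies e_i, e_j:
    p = min(1, exp[(beta_i - beta_j)(e_i - e_j)])."""
    delta = (beta_i - beta_j) * (e_i - e_j)
    return delta >= 0 or random.random() < math.exp(delta)

# Illustrative: cold replica (beta=1.0) holds a high-energy state while the
# hot replica (beta=0.5) holds a low-energy one; delta = 0.5 * 5 = 2.5 >= 0,
# so the swap is always accepted.
print(accept_swap(1.0, 0.5, -5.0, -10.0))  # True
```

Methods like ST and STDR apply the same kind of criterion to temperature moves of a single replica, which is why they avoid the synchronization that standard RE requires.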

  13. 40 CFR Appendix F to Part 60 - Quality Assurance Procedures

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 60—Quality Assurance Procedures Procedure 1. Quality Assurance Requirements for Gas Continuous... 40 CFR part 60. Procedure 1 also requires the analysis of the EPA audit samples concurrent with....1Continuous Emission Monitoring System. The total equipment required for the determination of a...

  14. 40 CFR Appendix F to Part 60 - Quality Assurance Procedures

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 60—Quality Assurance Procedures Procedure 1. Quality Assurance Requirements for Gas Continuous... 40 CFR part 60. Procedure 1 also requires the analysis of the EPA audit samples concurrent with....1Continuous Emission Monitoring System. The total equipment required for the determination of a...

  15. 40 CFR Appendix F to Part 60 - Quality Assurance Procedures

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 60—Quality Assurance Procedures Procedure 1. Quality Assurance Requirements for Gas Continuous... 40 CFR part 60. Procedure 1 also requires the analysis of the EPA audit samples concurrent with....1Continuous Emission Monitoring System. The total equipment required for the determination of a...

  16. Operational Environmental Monitoring Program Quality Assurance Project Plan

    SciTech Connect

    Perkins, C.J.

    1994-08-01

    This Quality Assurance Project Plan addresses the quality assurance requirements for the activities associated with the preoperational and operational environmental monitoring performed by Westinghouse Hanford Company as it implements the Operational Environmental Monitoring program. This plan applies to all sampling and monitoring activities performed by Westinghouse Hanford Company in implementing the Operational Environmental Monitoring program at the Hanford Site.

  17. Nursing Quality Assurance: The Wisconsin System

    ERIC Educational Resources Information Center

    Hover, Julie; Zimmer, Marie J.

    1978-01-01

    Evaluation model guidelines for hospital departments of nursing to use in their nursing quality assurance programs are presented as developed in Wisconsin. Four essential components of the Wisconsin outcome evaluation system are criteria, assessment, standards, and improvement of care. Sample tests and charts are included in the article. (MF)

  18. A Novel Method of Failure Sample Selection for Electrical Systems Using Ant Colony Optimization

    PubMed Central

    Tian, Shulin; Yang, Chenglin; Liu, Cheng

    2016-01-01

    The influence of failure propagation is ignored in failure sample selection based on the traditional testability demonstration experiment method. Traditional failure sample selection generally omits some failures during selection, and these omissions pose serious risks in use because the omitted failures can lead to severe propagation failures. This paper proposes a new failure sample selection method to solve the problem. First, the method uses a directed graph and ant colony optimization (ACO) to obtain a subsequent failure propagation set (SFPS) based on a failure propagation model; we then propose a new failure sample selection method on the basis of the size of the SFPS. Compared with the traditional sampling plan, this method improves the coverage of testing failure samples, increases diagnostic capacity, and decreases the risk of use. PMID:27738424
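
The paper builds the SFPS with ACO; as a simplified stand-in, the set itself is just reachability in the failure-propagation digraph. A hedged sketch with hypothetical failure names:

```python
from collections import deque

def sfps(graph: dict, failure: str) -> set:
    """Subsequent failure propagation set: every failure reachable from
    `failure` in the directed propagation graph (plain BFS). The paper
    ranks candidate failure samples by the size of this set."""
    seen, queue = set(), deque([failure])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical propagation model: F1 triggers F2 and F3; F2 triggers F4.
g = {"F1": ["F2", "F3"], "F2": ["F4"], "F3": [], "F4": []}
print(sorted(sfps(g, "F1")))  # ['F2', 'F3', 'F4']
```

Under this ranking, F1 (three downstream failures) would be prioritized over F3 or F4 (none), which is the intuition behind weighting selection by propagation impact.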

  19. [Study on a method of selecting calibration samples in NIR spectral analysis].

    PubMed

    Qin, Chong; Chen, Wen-Wen; He, Xiong-Kui; Zhang, Lu-Da; Ma, Xiang

    2009-10-01

    In the present paper, a simple but novel method based on the maximum linearly independent group was introduced into near-infrared (NIR) spectral analysis for selecting representative calibration samples. The experimental materials comprised 2,652 tobacco powder samples, with 1,001 samples randomly selected as the prediction set and the remainder as the candidate set from which the calibration sample set was selected. The method of locating maximum linearly independent vectors was used to select representative samples from the spectral vectors of the candidate set. The arithmetic was accomplished by the function rref(X,q) in Matlab. The maximum linearly independent spectral vectors were treated as the calibration sample set. When different calculating precisions q were given, different numbers of representative samples were acquired. The selected calibration sample set was used to build a PLS regression model to predict the total sugar of tobacco powder samples. The model was used to analyze the 1,001 samples in the prediction set. When 32 representative samples were selected, the model presented good predictive accuracy, with a mean relative error of prediction of 3.6210% and a correlation coefficient of 0.9643. By a paired-samples t-test, we found that the difference between the predictions of the model built from 32 samples and those of the model built from 146 samples was not significant (alpha=0.05). We also compared random selection of calibration samples with maximum linearly independent selection by the predictive performance of the resulting models. In the experiment, six calibration sample sets were selected, containing 28, 32, 41, 76, 146, and 163 samples respectively. Maximum linearly independent selection proved clearly better than random selection. The result indicated that the proposed method can not only effectively enhance the cost-effectiveness of NIR
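
The record cites Matlab's rref(X, q) but not the selection loop. A hedged NumPy analogue picks rows greedily by Gram-Schmidt residual, with the tolerance playing roughly the role of the calculating precision q; this is a sketch of the idea, not the paper's exact procedure:

```python
import numpy as np

def select_independent_samples(spectra: np.ndarray, tol: float = 1e-8):
    """Greedy selection of a maximal linearly independent subset of rows:
    a spectrum is kept if its residual, after projecting onto the span of
    the spectra already kept, exceeds `tol` (cf. the pivot columns that
    Matlab's rref(X, q) would report)."""
    kept, basis = [], []
    for i, row in enumerate(spectra.astype(float)):
        r = row.copy()
        for b in basis:
            r -= (r @ b) * b          # remove component along each basis vector
        norm = np.linalg.norm(r)
        if norm > tol:
            basis.append(r / norm)
            kept.append(i)
    return kept

# Toy spectra: row 2 is the sum of rows 0 and 1, so it is skipped.
X = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])
print(select_independent_samples(X))  # [0, 1, 3]
```

Raising the tolerance discards nearly dependent spectra as well, which is how a single precision parameter can control the calibration-set size.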

  20. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  1. Practical method for extraction of PCR-quality DNA from environmental soil samples.

    PubMed

    Fitzpatrick, Kelly A; Kersh, Gilbert J; Massung, Robert F

    2010-07-01

    Methods for the extraction of PCR-quality DNA from environmental soil samples by using pairs of commercially available kits were evaluated. Coxiella burnetii DNA was detected in spiked soil samples at <1,000 genome equivalents per gram of soil and in 12 (16.4%) of 73 environmental soil samples.

  2. Comparison of uranium determination in some Syrian geologic samples using three reactor based methods

    PubMed

    Jubeli

    2000-04-01

    A set of 25 samples of soil, sediments, carbonate and phosphate rocks from Syria were analysed for uranium using three reactor-based methods: instrumental neutron activation analysis (INAA), delayed neutron counting (DNC), and one cycle of irradiation utilizing the cyclic activation system (CAS). Although all three methods are capable of irradiating samples, the last is the least established for U determination in rocks. The measurements obtained by the three methods are compared. The results show good agreement, with a distinct linear relationship and significant positive correlation coefficients. It was concluded that the CAS method could reliably be used to rapidly determine uranium in geological samples. PMID:10800739

  3. Improvements in pentosan polysulfate sodium quality assurance using fingerprint electropherograms.

    PubMed

    Schirm, B; Benend, H; Wätzig, H

    2001-04-01

    Complex samples from polymer production, plant extracts or biotechnology mixtures can be characterized by fingerprints. Currently, the standard approach for sample characterization employs near-infrared (NIR) spectroscopy fingerprinting. Up to now, however, fingerprints obtained by chromatography or electrophoresis could only be visually evaluated. This type of inspection is very labor-intensive and difficult to validate. In order to transfer the use of fingerprints from spectroscopy to electrophoresis, spectra-like properties must be obtained through a complete alignment of the electropherograms. This has been achieved by interpolation and wavelet filtering of the baseline signal in the present work. The resulting data have been classified by several algorithms. The methods under survey include self-organizing maps (SOMs), artificial neural networks (ANNs), soft independent modeling of class analogy (SIMCA) and k-nearest neighbors (KNNs). In order to test the performance of this combined approach in practice, it was applied to the quality assurance of pentosan polysulfate (PPS). A recently developed capillary electrophoresis (CE) method using indirect UV detection was employed in these studies [1]. All algorithms were well capable of classifying the examined PPS test batches. Even minor variations in the PPS composition, not perceptible by visual inspection, could be automatically detected. The whole method has been validated by classifying various (n = 400) unknown PPS quality assurance samples, which have been correctly identified without exception.
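
Of the classifiers compared, KNN is the simplest to sketch. A minimal illustration on aligned fingerprint vectors, with hypothetical "pass"/"fail" batch labels and toy two-dimensional data standing in for full electropherogram fingerprints:

```python
import numpy as np

def knn_classify(train: np.ndarray, labels: list, query: np.ndarray, k: int = 3):
    """k-nearest-neighbour majority vote on aligned fingerprint vectors,
    using Euclidean distance (one of the classifiers compared in the study)."""
    d = np.linalg.norm(train - query, axis=1)
    nearest = [labels[i] for i in np.argsort(d)[:k]]
    return max(set(nearest), key=nearest.count)

# Hypothetical aligned fingerprints and QA labels.
train = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 0.9]])
labels = ["pass", "pass", "fail", "fail"]
print(knn_classify(train, labels, np.array([0.05, 0.05]), k=3))  # pass
```

The alignment step the paper describes (interpolation plus wavelet filtering) matters precisely because distance-based voting like this is meaningless unless corresponding peaks occupy the same vector positions.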

  4. Systems and methods for separating particles and/or substances from a sample fluid

    DOEpatents

    Mariella, Jr., Raymond P.; Dougherty, George M.; Dzenitis, John M.; Miles, Robin R.; Clague, David S.

    2016-11-01

    Systems and methods for separating particles and/or toxins from a sample fluid. A method according to one embodiment comprises simultaneously passing a sample fluid and a buffer fluid through a chamber such that a fluidic interface is formed between the sample fluid and the buffer fluid as the fluids pass through the chamber, the sample fluid having particles of interest therein; applying a force to the fluids for urging the particles of interest to pass through the interface into the buffer fluid; and substantially separating the buffer fluid from the sample fluid.

  5. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air Part 1: Sorbent-based air monitoring options.

    PubMed

    Woolfenden, Elizabeth

    2010-04-16

    Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Target compounds range in volatility from acetylene and freons to phthalates and PCBs and include apolar, polar and reactive species. Airborne vapour concentrations will vary depending on the nature of the location, nearby pollution sources, weather conditions, etc. Levels can range from low percent concentrations in stack and vent emissions to low part per trillion (ppt) levels in ultra-clean outdoor locations. Hundreds, even thousands of different compounds may be present in any given atmosphere. GC is commonly used in combination with mass spectrometry (MS) detection, especially for environmental monitoring or for screening uncharacterised workplace atmospheres. Given the complexity and variability of organic vapours in air, no one sampling approach suits every monitoring scenario. A variety of different sampling strategies and sorbent media have been developed to address specific applications. Key sorbent-based examples include: active (pumped) sampling onto tubes packed with one or more sorbents held at ambient temperature; diffusive (passive) sampling onto sorbent tubes/cartridges; on-line sampling of air/gas streams into cooled sorbent traps; and transfer of air samples from containers (canisters, Tedlar bags, etc.) into cooled sorbent focusing traps. Whichever sampling approach is selected, subsequent analysis almost always involves either solvent extraction or thermal desorption (TD) prior to GC(/MS) analysis. The overall performance of the air monitoring method will depend heavily on appropriate selection of key sampling and analytical parameters. This comprehensive review of air monitoring using sorbent tubes/traps is divided into two parts: (1) sorbent-based air sampling options; (2) sorbent selection and other aspects of optimizing sorbent-based air monitoring methods. The paper presents

  6. The impact of particle size selective sampling methods on occupational assessment of airborne beryllium particulates.

    PubMed

    Sleeth, Darrah K

    2013-05-01

    In 2010, the American Conference of Governmental Industrial Hygienists (ACGIH) formally changed its Threshold Limit Value (TLV) for beryllium from a 'total' particulate sample to an inhalable particulate sample. This change may have important implications for workplace air sampling of beryllium. A history of particle size-selective sampling methods, with a special focus on beryllium, will be provided. The current state of the science on inhalable sampling will also be presented, including a look to the future at what new methods or technology may be on the horizon. This includes new sampling criteria focused on particle deposition in the lung, proposed changes to the existing inhalable convention, as well as how the issues facing beryllium sampling may help drive other changes in sampling technology.

  7. Total nitrogen determination of various sample types: a comparison of the Hach, Kjeltec, and Kjeldahl methods.

    PubMed

    Watkins, K L; Veum, T L; Krause, G F

    1987-01-01

    Conventional Kjeldahl analysis with modifications, Kjeltec analysis with block digestion and semiautomated distillation, and the Hach method for determining nitrogen (N) were compared using a wide range of samples. Twenty different sample types were ground and mixed. Each sample type was divided into 5 subsamples, which were analyzed for N by each of the 3 methods. Differences (P less than 0.05) among the 3 N determination methods were detected in 5 of the 20 N sources analyzed. The mean N content over all 20 samples was higher with Kjeldahl analysis (P less than 0.05) than with Kjeltec, while Hach analysis produced intermediate results. Results also indicated that the Hach procedure had the greatest ability to detect differences in N content among sample types, being more sensitive than either other method (P less than 0.05).

  8. Improved Butanol-Methanol (BUME) Method by Replacing Acetic Acid for Lipid Extraction of Biological Samples.

    PubMed

    Cruz, Mutya; Wang, Miao; Frisch-Daiello, Jessica; Han, Xianlin

    2016-07-01

    Extraction of lipids from biological samples is a critical step in lipidomics, especially for shotgun lipidomics, where lipid extracts are directly infused into a mass spectrometer. The butanol-methanol (BUME) extraction method was originally developed to extract lipids from plasma samples with 1% acetic acid. Considering that some lipids are sensitive to acidic environments, we modified this protocol by replacing acetic acid with lithium chloride solution and extended the modified extraction to tissue samples. Although no significant reduction of plasmalogen levels was found in the acidic BUME extracts of rat heart samples, the modified method was established to extract various tissue samples, including rat liver, heart, and plasma. Essentially identical profiles of the majority of lipid classes were obtained from the extracts of the modified BUME and traditional Bligh-Dyer methods. However, it was found that neither the original nor the modified BUME method was suitable for measuring 4-hydroxyalkenal species in biological samples. PMID:27245345

  9. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; McAlister, Daniel R.

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.
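
The record gives the yield but not the measurement equation. A standard hedged form for a yield-corrected gamma-spectrometry result is sketched below; all numbers are purely illustrative (not from the paper), and the Ba-133 tracer recovery supplies the chemical yield term:

```python
def yield_corrected_activity(net_counts: float, count_time_s: float,
                             efficiency: float, gamma_intensity: float,
                             chem_yield: float, sample_mass_kg: float) -> float:
    """Yield-corrected specific activity (Bq/kg) from a gamma peak:
    A = N / (t * eff * I_gamma * Y * m), where Y is the fraction of the
    Ba-133 tracer recovered through the chemical separation."""
    return net_counts / (count_time_s * efficiency * gamma_intensity
                         * chem_yield * sample_mass_kg)

# Illustrative inputs only: 5000 net counts in 3600 s, 5% detection
# efficiency, an assumed gamma emission probability of 0.0356,
# 85% tracer recovery, 0.5 kg of sample.
a = yield_corrected_activity(5000, 3600, 0.05, 0.0356, 0.85, 0.5)
print(round(a, 1))
```

Dividing by the measured yield is what makes the reported 80-90% recoveries matter: without that correction, losses in the difficult brine matrix would bias the 226Ra result low.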

  10. Quality assurance for gamma knives

    SciTech Connect

    Jones, E.D.; Banks, W.W.; Fischer, L.E.

    1995-09-01

    This report describes and summarizes the results of a quality assurance (QA) study of the Gamma Knife, a nuclear medical device used for the gamma irradiation of intracranial lesions. Focus was on the physical aspects of QA and did not address issues that are essentially medical, such as patient selection or prescription of dose. A risk-based QA assessment approach was used. Sample programs for quality control and assurance are included. The use of the Gamma Knife was found to conform to existing standards and guidelines concerning radiation safety and quality control of external beam therapies (shielding, safety reviews, radiation surveys, interlock systems, exposure monitoring, good medical physics practices, etc.) and to be compliant with NRC teletherapy regulations. There are, however, current practices for the Gamma Knife not covered by existing, formalized regulations, standards, or guidelines. These practices have been adopted by Gamma Knife users and continue to be developed with further experience. Some of these have appeared in publications or presentations and are slowly finding their way into recommendations of professional organizations.

  11. Optical method for the characterization of laterally-patterned samples in integrated circuits

    DOEpatents

    Maris, Humphrey J.

    2001-01-01

    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
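
The echo timing in such picosecond-ultrasonics measurements translates to a layer thickness through the material's sound velocity. A one-line hedged illustration (the film material and delay are assumptions, not from the patent; aluminum's longitudinal sound velocity is about 6.42 nm/ps):

```python
def layer_thickness_nm(echo_delay_ps: float, sound_velocity_nm_per_ps: float) -> float:
    """Thickness from the round-trip time of the strain pulse: d = v * t / 2."""
    return sound_velocity_nm_per_ps * echo_delay_ps / 2.0

# Illustrative: a 30 ps echo in an aluminum film (v ≈ 6.42 nm/ps).
print(round(layer_thickness_nm(30.0, 6.42), 2))  # 96.3
```

The factor of two reflects the round trip: the strain pulse travels down to the buried interface and back before it modulates the probe reflection.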

  12. A LITERATURE REVIEW OF WIPE SAMPLING METHODS FOR CHEMICAL WARFARE AGENTS AND TOXIC INDUSTRIAL CHEMICALS

    EPA Science Inventory

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, we...

  13. Sampling Methods and the Accredited Population in Athletic Training Education Research

    ERIC Educational Resources Information Center

    Carr, W. David; Volberding, Jennifer

    2009-01-01

    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study; first to describe the incidence and types of sampling methods used in athletic training education research, and second to clearly define the…

  14. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  15. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  16. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  17. 40 CFR 80.8 - Sampling methods for gasoline, diesel fuel, fuel additives, and renewable fuels.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of the Federal Register under 5 U.S.C. 552(a) and 1 CFR part 51. To enforce any edition other than... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Sampling methods for gasoline, diesel... Provisions § 80.8 Sampling methods for gasoline, diesel fuel, fuel additives, and renewable fuels....

  18. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  19. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  20. Evaluation of beef trim sampling methods for detection of Shiga toxin-producing Escherichia coli (STEC)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Presence of Shiga toxin-producing Escherichia coli (STEC) is a major concern in ground beef. Several methods for sampling beef trim prior to grinding are currently used in the beef industry. The purpose of this study was to determine the efficacy of the sampling methods for detecting STEC in beef ...

  1. Application of a Permethrin Immunosorbent Assay Method to Residential Soil and Dust Samples

    EPA Science Inventory

    A low-cost, high throughput bioanalytical screening method was developed for monitoring cis/trans-permethrin in dust and soil samples. The method consisted of a simple sample preparation procedure [sonication with dichloromethane followed by a solvent exchange into methanol:wate...

  2. THE INFLUENCE OF PHYSICAL FACTORS ON COMPARATIVE PERFORMANCE OF SAMPLING METHODS IN LARGE RIVERS

    EPA Science Inventory

    In 1999, we compared five existing benthic macroinvertebrate sampling methods used in boatable rivers. Each sampling protocol was performed at each of 60 sites distributed among four rivers in the Ohio River drainage basin. Initial comparison of methods using key macroinvertebr...

  3. Viability of Actinobacillus pleuropneumoniae in frozen pig lung samples and comparison of different methods of direct diagnosis in fresh samples.

    PubMed

    Gutierrez, C B; Rodriguez Barbosa, J I; Gonzalez, O R; Tascon, R I; Rodriguez Ferri, E F

    1992-04-01

    A comparative study on different methods of diagnosis of Actinobacillus pleuropneumoniae from both fresh and frozen pig lungs is described. A total of 196 lung tissues with pneumonic lesions were examined for culture isolation on chocolate blood agar, as well as for antigen detection by means of the coagglutination test, the immunodiffusion test and the indirect ELISA. These samples were subsequently frozen for 1 yr and then recultured. A. pleuropneumoniae was recovered from fresh lung specimens in 30 cases (15.3%) and from frozen samples in only two cases (0.9%). This difference in isolation rates demonstrates that prolonged freezing had an adverse effect on the viability of this organism in lung samples. A. pleuropneumoniae detection was positive in 134 samples (68.4%) by at least one of the immunological techniques examined. The indirect ELISA was the most sensitive and specific test, with antigen detected in 125 lungs (63.8%). In comparison with the coagglutination and immunodiffusion tests, the sensitivities of the indirect ELISA were 95.8 and 93.7%, and the specificities were 67.0 and 63.4%, respectively.
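
    The sensitivity and specificity figures quoted above come from 2x2 agreement tables between pairs of tests. A minimal sketch of how such figures are computed (the counts below are hypothetical toy values, not the study's data):

    ```python
    def sensitivity_specificity(tp, fn, tn, fp):
        """Sensitivity and specificity from a 2x2 agreement table."""
        sensitivity = tp / (tp + fn)  # positives detected among reference positives
        specificity = tn / (tn + fp)  # negatives detected among reference negatives
        return sensitivity, specificity

    # Hypothetical counts comparing a new test against a reference test.
    sens, spec = sensitivity_specificity(tp=90, fn=10, tn=60, fp=40)
    print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
    ```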

  4. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.

    1982-01-01

    A two phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretations of LANDSAT images and panchromatic aerial photographs considered as the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two phase design was 157% when compared with a one phase aerial photographs sample.
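
    The two-phase (double sampling) design described here pairs a large, cheap phase-1 sample (visual LANDSAT interpretation) with a small phase-2 subsample carrying ground truth (aerial photographs). A minimal ratio-estimator sketch with hypothetical toy data; the study's actual estimator and segment data are not reproduced here:

    ```python
    import random
    import statistics

    def two_phase_ratio_estimate(phase1_landsat, phase2_pairs):
        """Double-sampling ratio estimator of mean cultivated area per segment.

        phase1_landsat: LANDSAT-interpreted areas for the large, cheap sample.
        phase2_pairs:   (landsat, photo) tuples for the subsample that also
                        has aerial-photograph ground truth.
        """
        # Ratio of true (photo) to proxy (LANDSAT) area in the subsample.
        r = sum(photo for _, photo in phase2_pairs) / sum(ls for ls, _ in phase2_pairs)
        # Correct the cheap phase-1 mean by that ratio.
        return r * statistics.mean(phase1_landsat)

    # Hypothetical toy data: LANDSAT under-reads the photo truth by ~10%.
    random.seed(0)
    phase1 = [random.uniform(50, 150) for _ in range(100)]
    phase2 = [(ls, ls * 1.10) for ls in random.sample(phase1, 20)]
    print(round(two_phase_ratio_estimate(phase1, phase2), 1))
    ```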

  5. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2010-08-24

    A method of determining beryllium or a beryllium compound thereof in a sample includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.
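
    Quantification by fluorescence of this kind typically proceeds through a linear calibration curve built from standards of known concentration; the sample's concentration is then read off by inverting the fitted line. A hedged sketch with hypothetical standards (the patent gives no calibration data):

    ```python
    def fit_line(xs, ys):
        """Ordinary least-squares slope and intercept for a calibration curve."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    def beryllium_conc(signal, slope, intercept):
        """Invert the calibration line: fluorescence signal -> concentration."""
        return (signal - intercept) / slope

    # Hypothetical standards: concentration (ug/L) vs fluorescence intensity.
    stds_conc = [0.0, 1.0, 2.0, 4.0, 8.0]
    stds_sig = [2.0, 12.0, 22.0, 42.0, 82.0]  # perfectly linear toy data
    slope, intercept = fit_line(stds_conc, stds_sig)
    print(beryllium_conc(32.0, slope, intercept))  # → 3.0
    ```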

  6. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2006-10-31

    A method of determining beryllium or a beryllium compound thereof in a sample includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  7. Map showing locations of samples dated by radiocarbon methods in the San Francisco Bay region

    USGS Publications Warehouse

    Wright, Robert H.

    1971-01-01

    The potential value of a radiocarbon date is diminished, however, if adequate site data are not taken with the sample and do not accompany the date in publication.  At a minimum, published dates should include an accurate location for the dated sample, type of material dated and method of dating, nature of the site, depth below surface (or other accurately defined datum) of date sample, stratigraphy of material overlying date sample, and the significance of the data in the study.

  8. Universal nucleic acids sample preparation method for cells, spores and their mixture

    DOEpatents

    Bavykin, Sergei

    2011-01-18

    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e. spores). Unlike prior art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample; isolates, labels, and fragments nucleic acids; and purifies labeled samples from excess dye.

  9. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals].

    PubMed

    Kong, Qin; Chen, Lei; Wang, Ling

    2012-05-01

    In order to solve the problem that standard samples of non-metallic minerals are not satisfactory for practical X-ray fluorescence spectrometry (XRF) analysis with pressed powder pellets, a method was studied for preparing sub-standard samples from standard samples of non-metallic minerals and for determining how well they adapt to the analysis of mineral powder samples, taking the K-feldspar ore in Ebian-Wudu, Sichuan as an example. Based on characteristic analysis of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, combined with the principle that sub-standard samples and unknown samples should be the same or similar, the experiment developed the method for preparing sub-standard samples: the two kinds of samples should contain the same minerals and similar chemical components, suit the same mineral processing, and be suitable for constructing a working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with classical chemical methods, which indicates that this method is accurate.

  10. An improved regulatory sampling method for mapping and representing plant disease from a limited number of samples.

    PubMed

    Luo, W; Pietravalle, S; Parnell, S; van den Bosch, F; Gottwald, T R; Irey, M S; Parker, S R

    2012-06-01

    A key challenge for plant pathologists is to develop efficient methods to describe spatial patterns of disease spread accurately from a limited number of samples. Knowledge of disease spread is essential for informing and justifying plant disease management measures. A mechanistic modelling approach is adopted for disease mapping which is based on disease dispersal gradients and consideration of host pattern. The method is extended to provide measures of uncertainty for the estimates of disease at each host location. In addition, improvements have been made to increase computational efficiency by better initialising the disease status of unsampled hosts and speeding up the optimisation process of the model parameters. These improvements facilitate the practical use of the method by providing information on: (a) mechanisms of pathogen dispersal, (b) distance and pattern of disease spread, and (c) prediction of infection probabilities for unsampled hosts. Two data sets of disease observations, Huanglongbing (HLB) of citrus and strawberry powdery mildew, were used to evaluate the performance of the new method for disease mapping. The results showed that our method gave better estimates of precision for unsampled hosts, compared to both the original method and spatial interpolation. This enables decision makers to understand the spatial aspects of disease processes, and thus formulate regulatory actions accordingly to enhance disease control. PMID:22664065
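
    A dispersal-gradient model of this sort assigns each unsampled host an infection probability that decays with distance from known infections. The sketch below uses an exponential kernel with independently combining risks; the kernel form and parameter values are illustrative assumptions, not the paper's fitted model:

    ```python
    import math

    def dist(a, b):
        """Euclidean distance between two host coordinates."""
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def infection_probability(host, infected_hosts, beta=1.0, strength=0.5):
        """P(infection) at an unsampled host under an exponential dispersal kernel.

        Each known infection contributes a force of infection that decays
        exponentially with distance; contributions combine as independent risks.
        beta is the kernel length scale, strength a per-source scaling factor.
        """
        force = sum(strength * math.exp(-dist(host, s) / beta)
                    for s in infected_hosts)
        return 1.0 - math.exp(-force)

    # A host adjacent to two infections is at higher risk than a distant one.
    sources = [(0.0, 0.0), (1.0, 0.0)]
    print(infection_probability((0.5, 0.0), sources))
    print(infection_probability((10.0, 0.0), sources))
    ```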

  11. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012

    USGS Publications Warehouse

    Zuellig, Robert E.; Bruce, James F.; Stogner, Robert W.; Brown, Krystal D.

    2014-01-01

    The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine if sampling method and sample timing resulted in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program that target single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within a sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.
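
    Community similarity between two invertebrate samples is commonly summarized with the Bray-Curtis dissimilarity on taxon counts; the specific metric used in the study is not stated here, so the following is an illustrative sketch with hypothetical counts:

    ```python
    def bray_curtis_dissimilarity(a, b):
        """Bray-Curtis dissimilarity between two taxon-count vectors.

        0.0 means identical communities; 1.0 means no shared abundance.
        """
        numerator = sum(abs(x - y) for x, y in zip(a, b))
        denominator = sum(x + y for x, y in zip(a, b))
        return numerator / denominator

    # Hypothetical taxon counts from two concurrently collected samples.
    site_a = [10, 0, 5, 3]
    site_b = [8, 2, 5, 1]
    print(bray_curtis_dissimilarity(site_a, site_b))
    ```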

  12. Molecular cancer classification using a meta-sample-based regularized robust coding method

    PubMed Central

    2014-01-01

    Motivation Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. Results In this paper we present the meta-sample-based regularized robust coding classification (MRRCC), a novel effective cancer classification technique that combines the idea of the meta-sample-based clustering method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples, and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Conclusions Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension reduction based methods. PMID:25473795
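
    The coding step can be sketched in a deliberately simplified form: use one meta-sample per class (here just the class mean, rather than MRRCC's extracted meta-samples and robust coding), fit the test profile to each by least squares, and assign the class whose meta-sample reconstructs it with the smallest l2 residual. All data below are toy values:

    ```python
    import math

    def mean_vec(samples):
        """Elementwise mean of a list of equal-length profiles."""
        n = len(samples)
        return [sum(col) / n for col in zip(*samples)]

    def residual(x, meta):
        """Least-squares code of x on one meta-sample; return ||residual||_2."""
        a = sum(xi * mi for xi, mi in zip(x, meta)) / sum(mi * mi for mi in meta)
        return math.sqrt(sum((xi - a * mi) ** 2 for xi, mi in zip(x, meta)))

    def classify(x, train_by_class):
        """Assign x to the class whose meta-sample reconstructs it best."""
        return min(train_by_class,
                   key=lambda c: residual(x, mean_vec(train_by_class[c])))

    # Toy two-class expression profiles (one meta-sample per class = class mean).
    train = {
        "tumor":  [[5.0, 1.0, 0.9], [4.8, 1.2, 1.1]],
        "normal": [[1.0, 4.9, 5.1], [0.9, 5.2, 4.8]],
    }
    print(classify([4.5, 1.1, 1.0], train))  # → tumor
    ```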

  13. [Sample preparation methods for chromatographic analysis of organic components in atmospheric particulate matter].

    PubMed

    Hao, Liang; Wu, Dapeng; Guan, Yafeng

    2014-09-01

    The determination of organic composition in atmospheric particulate matter (PM) is of great importance in understanding how PM affects human health, environment, climate, and ecosystems. Organic components are also the scientific basis for emission source tracking, PM regulation and risk management. Therefore, the molecular characterization of the organic fraction of PM has become one of the priority research issues in the field of environmental analysis. Due to the extreme complexity of PM samples, chromatographic methods have been the chief choice. The common procedure for the analysis of organic components in PM includes several steps: sample collection on fiber filters, sample preparation (transforming the sample into a form suitable for chromatographic analysis), and analysis by chromatographic methods. Among these steps, the sample preparation methods largely determine the throughput and the data quality. Solvent extraction methods followed by sample pretreatment (e.g. pre-separation, derivatization, pre-concentration) have long been used for PM sample analysis, while thermal desorption methods have mainly focused on the analysis of non-polar organic components in PM. In this paper, the sample preparation methods prior to chromatographic analysis of organic components in PM are reviewed comprehensively, and the corresponding merits and limitations of each method are also briefly discussed.

  14. Effluent monitoring Quality Assurance Project Plan for radioactive airborne emissions data. Revision 2

    SciTech Connect

    Frazier, T.P.

    1995-12-01

    This Quality Assurance Project Plan addresses the quality assurance requirements for compiling Hanford Site radioactive airborne emissions data. These data will be reported to the U.S. Environmental Protection Agency, the US Department of Energy, and the Washington State Department of Health. Effluent Monitoring performs compliance assessments on radioactive airborne sampling and monitoring systems. This Quality Assurance Project Plan is prepared in compliance with interim guidelines and specifications. Topics include: project description; project organization and management; quality assurance objectives; sampling procedures; sample custody; calibration procedures; analytical procedures; monitoring and reporting criteria; data reduction, verification, and reporting; internal quality control; performance and system audits; corrective actions; and quality assurance reports.

  15. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection.

    PubMed

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao

    2015-07-28

    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel "melting temperature (Tm) mapping method" for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, more than 100 bacterial species can be identified. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a "match" or "broad match" with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment.
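
    Conceptually, identification from seven primer sets amounts to matching a 7-value Tm profile against a reference database and declining to identify when no entry matches well enough. The scoring rule, threshold, and database entries below are illustrative assumptions; the paper's actual Difference Value computation is not reproduced here:

    ```python
    def tm_match(sample_tms, database, threshold=1.5):
        """Match a 7-primer Tm profile against a reference database.

        Returns the best-matching species, or None when the smallest mean
        absolute Tm difference exceeds `threshold` (profile unsuitable
        for identification).
        """
        def score(ref):
            return sum(abs(a - b) for a, b in zip(sample_tms, ref)) / len(ref)
        best = min(database, key=lambda sp: score(database[sp]))
        return best if score(database[best]) <= threshold else None

    db = {  # hypothetical reference Tm profiles (degrees C)
        "E. coli":   [82.1, 79.4, 84.0, 80.2, 83.3, 78.9, 81.5],
        "S. aureus": [80.0, 81.2, 82.5, 79.0, 84.1, 80.3, 79.8],
    }
    print(tm_match([82.0, 79.5, 84.1, 80.0, 83.2, 79.0, 81.6], db))  # → E. coli
    ```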

  16. Quality assurance of specialised treatment of eating disorders using large-scale Internet-based collection systems: methods, results and lessons learned from designing the Stepwise database.

    PubMed

    Birgegård, Andreas; Björck, Caroline; Clinton, David

    2010-01-01

    Computer-based quality assurance of specialist eating disorder (ED) care is a possible way of meeting demands for evaluating the real-life effectiveness of treatment, in a large-scale, cost-effective and highly structured way. The Internet-based Stepwise system combines clinical utility for patients and practitioners, and provides research-quality naturalistic data. Stepwise was designed to capture relevant variables concerning EDs and general psychiatric status, and the database can be used for both clinical and research purposes. The system comprises semi-structured diagnostic interviews, clinical ratings and self-ratings, automated follow-up schedules, as well as administrative functions to facilitate registration compliance. As of June 2009, the system is in use at 20 treatment units and comprises 2776 patients. Diagnostic distribution (including subcategories of eating disorder not otherwise specified) and clinical characteristics are presented, as well as data on registration compliance. Obstacles and keys to successful implementation of the Stepwise system are discussed, including possible gains and on-going challenges inherent in large-scale, Internet-based quality assurance.

  18. A case-base sampling method for estimating recurrent event intensities.

    PubMed

    Saarela, Olli

    2016-10-01

    Case-base sampling provides an alternative to risk set sampling based methods to estimate hazard regression models, in particular when absolute hazards are also of interest in addition to hazard ratios. The case-base sampling approach results in a likelihood expression of the logistic regression form, but instead of categorized time, such an expression is obtained through sampling of a discrete set of person-time coordinates from all follow-up data. In this paper, in the context of a time-dependent exposure such as vaccination, and a potentially recurrent adverse event outcome, we show that the resulting partial likelihood for the outcome event intensity has the asymptotic properties of a likelihood. We contrast this approach to self-matched case-base sampling, which involves only within-individual comparisons. The efficiency of the case-base methods is compared to that of standard methods through simulations, suggesting that the information loss due to sampling is minimal.
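
    The sampling step can be sketched for the simplest case: an intercept-only model with a constant hazard, where the case-base estimate reduces to events per person-time. The function below is an illustrative sketch, not the paper's estimator for time-dependent exposures or recurrent events:

    ```python
    import math
    import random

    def case_base_rate(n_events, total_person_time, base_size, seed=0):
        """Constant event-rate estimate via case-base sampling (intercept-only).

        The base series is `base_size` person-moments drawn uniformly from
        follow-up; the case series is one moment per event. In an
        intercept-only logistic model the MLE of the intercept is
        log(n_case / n_base), and the rate is recovered through the
        sampling offset log(base_size / total_person_time).
        """
        random.seed(seed)
        # In a full model each sampled moment would carry covariate values;
        # here the sampled times do not enter the intercept-only estimate.
        base_series = [random.uniform(0.0, total_person_time)
                       for _ in range(base_size)]
        alpha = math.log(n_events / len(base_series))            # intercept MLE
        offset = math.log(len(base_series) / total_person_time)  # sampling rate
        return math.exp(alpha + offset)  # simplifies to n_events / T

    # Toy data: 12 events over 600 person-years.
    print(round(case_base_rate(12, 600.0, base_size=100), 6))  # → 0.02
    ```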

  19. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    NASA Astrophysics Data System (ADS)

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; Jenson, D. D.; Olson, J. E.; Vockenhuber, C.; Watrous, M. G.

    2015-10-01

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  20. Sampling/analytical method evaluation for ethylene oxide emission and control-unit efficiency determinations

    SciTech Connect

    Steger, J.; Gergen, W.; Margeson, J.H.

    1988-05-01

    Radian Corporation, assisting the Environmental Monitoring Systems Laboratory, Environmental Protection Agency, Research Triangle Park, North Carolina, performed a field evaluation of a method for sampling and analyzing ethylene oxide (EO) in the vent stream from a sterilization chamber and a dilute-acid scrubber. The utility of the sampling method for measuring the efficiency of the control unit was also evaluated. The evaluated sampling and analysis procedure used semi-continuous direct sampling with on-line gas chromatographic analysis. Laboratory studies of the sampling method prior to the field test showed that semi-continuous direct sampling was capable of measuring EO emissions to within 11% of the expected value with a between-trial precision of 5%.
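
    Accuracy "within 11% of the expected value" and "between-trial precision of 5%" correspond to percent bias of the trial mean and percent relative standard deviation across trials. A minimal sketch with hypothetical trial data (the study's raw measurements are not given here):

    ```python
    import statistics

    def percent_bias(measured_mean, expected):
        """Signed percent deviation of the measured mean from the expected value."""
        return 100.0 * (measured_mean - expected) / expected

    def relative_std_dev(values):
        """Between-trial precision as percent relative standard deviation."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical EO concentrations (ppm) from repeated trials at a known level.
    trials = [95.0, 100.0, 105.0, 98.0, 102.0]
    expected = 110.0
    print(round(percent_bias(statistics.mean(trials), expected), 1))  # → -9.1
    print(round(relative_std_dev(trials), 1))  # → 3.8
    ```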

  1. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    SciTech Connect

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; Jenson, D. D.; Olson, J. E.; Vockenhuber, C.; Watrous, M. G.

    2015-03-25

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Furthermore, precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  2. Assessment of dust sampling methods for the study of cultivable-microorganism exposure in stables.

    PubMed

    Normand, Anne-Cécile; Vacheyrou, Mallory; Sudre, Bertrand; Heederik, Dick J J; Piarroux, Renaud

    2009-12-01

    Studies have shown a link between living on a farm, exposure to microbial components (e.g., endotoxins or beta-d-glucans), and a lower risk for allergic diseases and asthma. Due to the lack of validated sampling methods, studies of asthma and atopy have not relied on exposure assessment based on culture techniques. Our objective was therefore to compare several dust sampling methods for the detection of cultivable-microorganism exposure in stables. Sixteen French farms were sampled using four different methods: (i) active air sampling using a pump, (ii) passive dust sampling with a plastic box, (iii) dust sampling with an electrostatic dust fall collector (wipe), and (iv) dust sampling using a spatula to collect dust already settled on a windowsill. The results showed that collection of settled dust samples with either plastic boxes or wipes was reproducible (pairwise correlations, 0.72 and 0.73, respectively) and resulted in highly correlated results (pairwise correlation between the two methods, 0.82). We also found that settled dust samples collected with a plastic box correctly reflected the composition of the samples collected in the air of the stable when there was no farmer activity. A loss of microbial diversity was observed when dust was kept for 3 months at room temperature. We therefore conclude that measurement of viable microorganisms within a reasonable time frame gives an accurate representation of the microbial composition of stable air.

  3. Modified shifted angular spectrum method for numerical propagation at reduced spatial sampling rates.

    PubMed

    Ritter, André

    2014-10-20

    The shifted angular spectrum method allows a reduction of the number of samples required for numerical off-axis propagation of scalar wave fields. In this work, a modification of the shifted angular spectrum method is presented. It allows a further reduction of the spatial sampling rate for certain wave fields. We calculate the benefit of this method for spherical waves. Additionally, a working implementation is presented showing the example of a spherical wave propagating through a circular aperture. PMID:25401659

  4. The quality assurance liaison: Combined technical and quality assurance support

    NASA Astrophysics Data System (ADS)

    Bolivar, S. L.; Day, J. L.

    1993-03-01

    The role of the quality assurance liaison, the responsibilities of this position, and the evolutionary changes in duties over the last six years are described. The role of the quality assurance liaison has had a very positive impact on the Los Alamos Yucca Mountain Site Characterization (YW) quality assurance program. Having both technical and quality assurance expertise, the quality assurance liaisons are able to facilitate communications with scientists on quality assurance issues and requirements, thereby generating greater productivity in scientific investigations. The quality assurance liaisons help ensure that the scientific community knows and implements existing requirements, is aware of new or changing regulations, and is able to conduct scientific work within Project requirements. The influence of the role of the quality assurance liaison can be measured by an overall improvement in attitude of the staff regarding quality assurance requirements and improved job performance, as well as a decrease in deficiencies identified during both internal and external audits and surveillances. This has resulted in a more effective implementation of quality assurance requirements.

  5. Method optimization and quality assurance in speciation analysis using high performance liquid chromatography with detection by inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Larsen, Erik H.

    1998-02-01

    Achievement of optimum selectivity, sensitivity and robustness in speciation analysis using high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) detection requires that each instrumental component is selected and optimized with a view to the ideal operating characteristics of the entire hyphenated system. An isocratic HPLC system, which employs an aqueous mobile phase with organic buffer constituents, is well suited for introduction into the ICP-MS because of the stability of the detector response and the high degree of analyte sensitivity attained. Anion and cation exchange HPLC systems, which meet these requirements, were used for the separation of selenium and arsenic species in crude extracts of biological samples. Furthermore, the signal-to-noise ratios obtained for these incompletely ionized elements in the argon ICP were enhanced by a factor of four by continuously introducing carbon as methanol via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory effects from organic solvents) and in the ICP-MS (spectroscopic interferences), and their prevention, are also discussed. The optimized anion and cation exchange HPLC-ICP-MS systems were used for arsenic speciation in contaminated ground water and in an in-house shrimp reference sample. For verification, HPLC coupled with tandem mass spectrometry with electrospray ionization was additionally used for arsenic speciation in the shrimp sample. With this analytical technique, the HPLC retention time in combination with mass analysis of the molecular ions and their collision-induced fragments provides almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing a mass balance of the analytes in each fraction of the extraction procedure, by recovery of spikes, and by employing and comparing independent techniques. The urgent need for …

  6. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish

    USGS Publications Warehouse

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.

    2005-01-01

    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.
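The agreement statistic quoted above (Spearman coefficient 0.80, P < 0.001) can be reproduced in form with `scipy.stats.spearmanr` applied to binary detection outcomes. The per-group results below are invented for illustration, not the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-group outcomes (1 = Y. ruckeri cultured, 0 = not) for the
# loop and pooled sampling methods applied to the same 12 groups of fish.
loop   = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1])
pooled = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1])

rho, p_value = spearmanr(loop, pooled)   # agreement between the two methods
```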

  7. Towards Run-time Assurance of Advanced Propulsion Algorithms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Schierman, John D.; Schlapkohl, Thomas; Chicatelli, Amy

    2014-01-01

    This paper covers the motivation and rationale for investigating the application of run-time assurance methods as a potential means of providing safety assurance for advanced propulsion control systems. Certification is becoming increasingly infeasible for such systems using current verification practices. Run-time assurance systems hold the promise of certifying these advanced systems by continuously monitoring the state of the feedback system during operation and reverting to a simpler, certified system if anomalous behavior is detected. The discussion will also cover initial efforts underway to apply a run-time assurance framework to NASA's model-based engine control approach. Preliminary experimental results are presented and discussed.

  8. Simple method to measure power density entering a plane biological sample at millimeter wavelengths.

    PubMed

    Shen, Z Y; Birenbaum, L; Chu, A; Motzkin, S; Rosenthal, S; Sheng, K M

    1987-01-01

    A simple method for measuring microwave power density is described. It is applicable to situations where exposure of samples in the near field of a horn is necessary. A transmitted power method is used to calibrate the power density entering the surface of the sample. Once the calibration is available, the power density is known in terms of the incident and reflected powers within the waveguide. The calibration has been carried out for liquid samples in a quartz cell. Formulas for calculating specific absorption rate (SAR) are derived in terms of the power density and the complex dielectric constant of the sample. An error analysis is also given.
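As a rough sketch of the kind of SAR calculation the abstract describes (the paper gives the exact formulas; the relation below is the standard surface-SAR expression for a plane wave entering a lossy half-space, and the permittivity and power-density values are illustrative):

```python
import cmath
import math

def surface_sar(S0, eps_r, freq_hz, rho=1000.0):
    """Surface SAR (W/kg) from the power density S0 (W/m^2) entering a lossy
    half-space with complex relative permittivity eps_r = eps' - j*eps''
    (e^{jwt} convention) and mass density rho (kg/m^3), using
    SAR(z) = (2*alpha/rho) * S0 * exp(-2*alpha*z), evaluated at z = 0."""
    c = 2.998e8
    n = cmath.sqrt(eps_r)                              # complex refractive index
    alpha = 2 * math.pi * freq_hz / c * abs(n.imag)    # field attenuation, 1/m
    return 2 * alpha * S0 / rho

# Illustrative water-like permittivity at a millimeter-wave frequency.
sar0 = surface_sar(S0=10.0, eps_r=20 - 25j, freq_hz=35e9)
```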

  9. RAPID FUSION METHOD FOR DETERMINATION OF PLUTONIUM ISOTOPES IN LARGE RICE SAMPLES

    SciTech Connect

    Maxwell, S.

    2013-03-01

    A new rapid fusion method for the determination of plutonium in large rice samples has been developed at the Savannah River National Laboratory (Aiken, SC, USA) that can be used to determine very low levels of plutonium isotopes in rice. The recent accident at Fukushima Nuclear Power Plant in March 2011 reinforces the need to have rapid, reliable radiochemical analyses for radionuclides in environmental and food samples. Public concern regarding foods, particularly foods such as rice in Japan, highlights the need for analytical techniques that will allow very large sample aliquots of rice to be used for analysis so that very low levels of plutonium isotopes may be detected. The new method to determine plutonium isotopes in large rice samples utilizes a furnace ashing step, a rapid sodium hydroxide fusion method, a lanthanum fluoride matrix removal step, and a column separation process with TEVA Resin cartridges. The method can be applied to rice sample aliquots as large as 5 kg. Plutonium isotopes can be determined using alpha spectrometry or inductively-coupled plasma mass spectrometry (ICP-MS). The method showed high chemical recoveries and effective removal of interferences. The rapid fusion technique is a rugged sample digestion method that ensures that any refractory plutonium particles are effectively digested. The MDA for a 5 kg rice sample using alpha spectrometry is 7E-5 mBq g(-1). The method can easily be adapted for use by ICP-MS to allow detection of plutonium isotopic ratios.
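The minimum detectable activity (MDA) quoted above is conventionally computed with Currie's formula; a hedged sketch, with invented counting parameters rather than the report's actual values:

```python
import math

def currie_mda(background_counts, count_time_s, efficiency, chem_yield, mass_g):
    """Currie's minimum detectable activity (Bq/g):
    MDA = (2.71 + 4.65*sqrt(B)) / (t * eff * yield * m)."""
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (
        count_time_s * efficiency * chem_yield * mass_g)

# Invented counting parameters for a 5 kg aliquot counted for four days.
mda_bq_per_g = currie_mda(background_counts=2, count_time_s=4 * 86400,
                          efficiency=0.30, chem_yield=0.90, mass_g=5000)
```

Note how the very large (5 kg) aliquot drives the MDA down: the sample mass appears directly in the denominator.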

  10. Rapid method to determine actinides and 89/90Sr in limestone and marble samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; Utsey, Robin C.; Sudowe, Ralf; McAlister, Daniel R.

    2016-04-12

    A new method for the determination of actinides and radiostrontium in limestone and marble samples has been developed that utilizes a rapid sodium hydroxide fusion to digest the sample. Following rapid pre-concentration steps to remove sample matrix interferences, the actinides and 89/90Sr are separated using extraction chromatographic resins and measured radiometrically. The advantages of sodium hydroxide fusion versus other fusion techniques will be discussed. Lastly, this approach has a sample preparation time for limestone and marble samples of <4 hours.

  11. DEVELOPMENT AND FIELD IMPLEMENTATION OF AN IMPROVED METHOD FOR HEADSPACE GAS SAMPLING OF TRANSURANIC WASTE DRUMS

    SciTech Connect

    Polley, M.; Ankrom, J.; Wickland, T.; Warren, J.

    2003-02-27

    A fast, safe, and cost-effective method for obtaining headspace gas samples has been developed and implemented at Los Alamos National Laboratory (LANL). A sample port is installed directly into a drum lid using a pneumatic driver, allowing sampling with a side-port needle. Testing has shown that the sample port can be installed with no release of radioactive material. Use of this system at LANL has significantly reduced the time required for sampling, and eliminates the need for many safety precautions previously used. The system has significantly improved productivity and lowered radiation exposure and cost.

  12. Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer.

    PubMed

    Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro

    2015-01-01

    We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs. PMID:26819909

  15. A green method for the determination of cocaine in illicit samples.

    PubMed

    Pérez-Alfonso, Clara; Galipienso, Nieves; Garrigues, Salvador; de la Guardia, Miguel

    2014-04-01

    Direct determination of cocaine in untreated seized samples has been made based on diffuse reflectance measurements of the near infrared (NIR) radiation through samples contained inside standard glass vials. The method used a series of samples previously analyzed by the reference gas chromatography method to build a partial least squares calibration model, which was validated using an independent set of samples. The use of a general model for samples containing from 11.38% to 86.44% (w/w) cocaine was based on the use of spectral ranges from 12500.7 to 10128.6, 9339.8 to 6967.7 and 5388.3 to 4597.6 cm(-1), with prior first derivative and vector normalization data pre-processing, and provided a root mean square error of prediction (RMSEP) of 4.0% (w/w) with a residual prediction deviation (RPD) of 3.9% (w/w), based on the use of 8 latent variables, 34 samples for calibration and an independent set of 44 samples for validation. The aforementioned results could be improved by considering two separate models, one for highly concentrated bulk samples and another for samples diluted with cutting agents. Additionally, a new set of batch samples with cocaine concentrations from 60% to 84% was evaluated by using the developed method. PMID:24607706
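The two figures of merit reported above, RMSEP and RPD, are straightforward to compute from a validation set; the reference and predicted concentrations below are invented for illustration:

```python
import numpy as np

# RMSEP and RPD for an independent validation set; the reference (GC) and
# NIR-predicted cocaine contents below are invented, in % (w/w).
y_ref  = np.array([12.1, 25.4, 38.0, 51.2, 63.9, 75.5, 84.2])
y_pred = np.array([14.0, 22.8, 40.1, 49.0, 66.5, 72.9, 86.0])

rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))  # root mean square error of prediction
rpd = np.std(y_ref, ddof=1) / rmsep              # residual prediction deviation
```

An RPD well above 1 indicates the model predicts much better than simply quoting the mean of the reference values.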

  16. Quantifying ChIP-seq data: a spiking method providing an internal reference for sample-to-sample normalization.

    PubMed

    Bonhoure, Nicolas; Bounova, Gergana; Bernasconi, David; Praz, Viviane; Lammers, Fabienne; Canella, Donatella; Willis, Ian M; Herr, Winship; Hernandez, Nouria; Delorenzi, Mauro

    2014-07-01

    Chromatin immunoprecipitation followed by deep sequencing (ChIP-seq) experiments are widely used to determine, within entire genomes, the occupancy sites of any protein of interest, including, for example, transcription factors, RNA polymerases, or histones with or without various modifications. In addition to allowing the determination of occupancy sites within one cell type and under one condition, this method allows, in principle, the establishment and comparison of occupancy maps in various cell types, tissues, and conditions. Such comparisons require, however, that samples be normalized. Widely used normalization methods that include a quantile normalization step perform well when factor occupancy varies at a subset of sites, but may miss uniform genome-wide increases or decreases in site occupancy. We describe a spike adjustment procedure (SAP) that, unlike commonly used normalization methods intervening at the analysis stage, entails an experimental step prior to immunoprecipitation. A constant, low amount from a single batch of chromatin of a foreign genome is added to the experimental chromatin. This "spike" chromatin then serves as an internal control to which the experimental signals can be adjusted. We show that the method improves similarity between replicates and reveals biological differences including global and largely uniform changes.
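The spike adjustment the abstract describes reduces, computationally, to scaling each sample's signal by its spike-genome read count; a minimal sketch with invented numbers:

```python
import numpy as np

# Spike-in scaling: reads mapping to the foreign ("spike") genome give a
# per-sample scale factor applied to the experimental-genome signal.
# All numbers below are invented for illustration.
spike_reads = np.array([180_000, 90_000, 120_000])       # one entry per sample
signal = np.array([[10.0, 4.0, 6.0],                     # rows: samples,
                   [ 5.0, 2.0, 3.0],                     # columns: genomic sites
                   [ 8.0, 3.0, 4.5]])

scale = spike_reads.min() / spike_reads                  # normalize to smallest spike count
normalized = signal * scale[:, None]
```

Because the spike chromatin comes from a single batch added in constant amount, differences in its recovered read counts reflect technical variation, which is exactly what the scale factor removes.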

  17. An emergency radiobioassay method for 226Ra in human urine samples.

    PubMed

    Sadi, Baki B; Li, Chunsheng; Kramer, Gary H

    2012-08-01

    A new radioanalytical method was developed for rapid determination of (226)Ra in human urine samples. The method is based on organic removal and decolourisation of a urine sample by a polymeric (acrylic ester) solid phase sorbent material followed by extraction and preconcentration of (226)Ra in an organic solvent using a dispersive liquid-liquid microextraction technique. Radiometric measurement of (226)Ra was carried out using a liquid scintillation counting instrument. The minimum detectable activity for the method (0.15 Bq l(-1)) is lower than the required sensitivity of 0.2 Bq l(-1) for (226)Ra in human urine samples as defined in the requirements for radiation emergency bioassay techniques for the public and first responders based on the dose threshold for possible medical attention recommended by the International Commission on Radiological Protection (ICRP). The accuracy (expressed as relative bias, B(r)) and repeatability of the method (expressed as relative precision, S(B)) evaluated at the reference level (2 Bq l(-1)) were found to be -4.5 and 2.6 %, respectively. The sample turnaround time was <5 h for a single urine sample and <20 h for a batch of six urine samples. With the fast sample turnaround time combined with the potential to carry out the analysis in a field deployable mobile laboratory, the newly developed method can be used for emergency radiobioassay of (226)Ra in human urine samples following a radiological or nuclear accident.

  18. Evaluation of gravimetric methods for dissoluble matter in extracts of environmental samples

    SciTech Connect

    Lafleur, A.L.; Monchamp, P.A.; Plummer, E.F.; Kruzel, E.L.

    1986-01-01

    A number of gravimetric methods were evaluated for the determination of dissolved matter in solvent extracts of combustion samples. The methods described included thermogravimetric analysis, weighing after evaporation under nitrogen, and a microscale evaporation method developed in this study. A well characterized combustion sample, known to consist primarily of alkylated bicyclic and tricyclic aromatic compounds, served as a reference material. Results for the three methods are presented and compared. Although the thermogravimetric analyzer was found to be accurate and versatile, a good compromise between cost, time and accuracy was provided by the microscale evaporation method.

  19. Quality assurance program plan for low-level waste at the WSCF Laboratory

    SciTech Connect

    Morrison, J.A.

    1994-11-01

    The purpose of this document is to provide guidance for the implementation of the Quality Assurance Program Plan (QAPP) for the management of low-level waste at the Waste Sampling and Characterization Facility (WSCF) Laboratory Complex as required by WHC-CM-4-2, Quality Assurance Manual, which is based on Quality Assurance Program Requirements for Nuclear Facilities, NQA-1 (ASME).

  20. A stochastic optimization method to estimate the spatial distribution of a pathogen from a sample.

    PubMed

    Parnell, S; Gottwald, T R; Irey, M S; Luo, W; van den Bosch, F

    2011-10-01

    Information on the spatial distribution of plant disease can be utilized to implement efficient and spatially targeted disease management interventions. We present a pathogen-generic method to estimate the spatial distribution of a plant pathogen using a stochastic optimization process which is epidemiologically motivated. Based on an initial sample, the method simulates the individual spread processes of a pathogen between patches of host to generate optimized spatial distribution maps. The method was tested on data sets of Huanglongbing of citrus and was compared with a kriging method from the field of geostatistics using the well-established kappa statistic to quantify map accuracy. Our method produced accurate maps of disease distribution with kappa values as high as 0.46 and was able to outperform the kriging method across a range of sample sizes based on the kappa statistic. As expected, map accuracy improved with sample size but there was a high amount of variation between different random sample placements (i.e., the spatial distribution of samples). This highlights the importance of sample placement on the ability to estimate the spatial distribution of a plant pathogen and we thus conclude that further research into sampling design and its effect on the ability to estimate disease distribution is necessary. PMID:21916625
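The kappa statistic used above to score map accuracy is Cohen's kappa; for two binary patch maps it can be computed directly (the example maps are invented):

```python
import numpy as np

def cohens_kappa(truth, estimate):
    """Cohen's kappa for two binary maps (e.g. true vs. estimated
    infected/healthy status of host patches)."""
    truth, estimate = np.asarray(truth), np.asarray(estimate)
    po = np.mean(truth == estimate)                    # observed agreement
    pe = (np.mean(truth) * np.mean(estimate)
          + (1 - np.mean(truth)) * (1 - np.mean(estimate)))  # chance agreement
    return (po - pe) / (1 - pe)

# Invented 10-patch example: 8/10 agreement with balanced classes.
truth    = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
estimate = [1, 0, 1, 0, 0, 0, 1, 1, 0, 1]
kappa = cohens_kappa(truth, estimate)   # kappa == 0.6 here
```

Kappa discounts the agreement expected by chance, which is why it is preferred over raw percent agreement when disease incidence is low.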

  1. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    SciTech Connect

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-09-30

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using the United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (S&GRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the S&GRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. The WSCF lab therefore investigated the TOX analysis to determine the cause of the outlier data points. Two causes were found to contribute to out-of-trend TOX data: (1) the presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration. A parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tube with increasing inorganic chloride concentration. (2) During the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate. The presence of this residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more effectively …

  2. Nuclear and conventional methods for soil determination in sugar cane industry. Validity of sampling procedure.

    PubMed

    Fernandes, E A; Bacchi, M A

    1994-01-01

    The performances of the scandium and ash methods were compared for assessing the soil content of sugar cane loads, with emphasis on common sampling drawbacks. Both methods are adequate for this determination under controlled conditions. The scandium method demonstrated better analytical characteristics, since it is free from interference by the cane matrix, which decreases the accuracy of the ash method under normal mill conditions.

  3. Methods, compounds and systems for detecting a microorganism in a sample

    DOEpatents

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.

    2016-09-06

    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets, and in particular methods for identification of primers suitable for detection of target microorganisms; related polynucleotides, sets of polynucleotides, and compositions; and related methods and systems for detection and/or identification of microorganisms in a sample.

  4. Reliability assurance for regulation of advanced reactors

    SciTech Connect

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1991-01-01

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  6. A comparison of microscopic and spectroscopic identification methods for analysis of microplastics in environmental samples.

    PubMed

    Song, Young Kyoung; Hong, Sang Hee; Jang, Mi; Han, Gi Myung; Rani, Manviri; Lee, Jongmyoung; Shim, Won Joon

    2015-04-15

    The analysis of microplastics in various environmental samples requires the identification of microplastics from natural materials. The identification technique lacks a standardized protocol. Herein, stereomicroscope and Fourier transform infrared spectroscope (FT-IR) identification methods for microplastics (<1mm) were compared using the same samples from the sea surface microlayer (SML) and beach sand. Fragmented microplastics were significantly (p<0.05) underestimated and fiber was significantly overestimated using the stereomicroscope both in the SML and beach samples. The total abundance by FT-IR was higher than by microscope both in the SML and beach samples, but they were not significantly (p>0.05) different. Depending on the number of samples and the microplastic size range of interest, the appropriate identification method should be determined; selecting a suitable identification method for microplastics is crucial for evaluating microplastic pollution. PMID:25682567

  8. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    DOEpatents

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert

    2010-05-11

    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  9. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminths pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids involve discrete steps; a separate sample is therefore processed independently to quantify each group of pathogens. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing a simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize the performance of the glycine-based eluent under various procedural factors, such as solids-to-eluent ratio, stir time, and centrifugation conditions. Lastly, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked in duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method, compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required to process biosolids samples for different groups of pathogens; it is less affected by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  10. Sampling Methods for Detection and Monitoring of the Asian Citrus Psyllid (Hemiptera: Psyllidae).

    PubMed

    Monzo, C; Arevalo, H A; Jones, M M; Vanaclocha, P; Croxton, S D; Qureshi, J A; Stansly, P A

    2015-06-01

    The Asian citrus psyllid (ACP), Diaphorina citri Kuwayama, is a key pest of citrus due to its role as vector of citrus greening disease or "huanglongbing." ACP monitoring is considered an indispensable tool for management of vector and disease. In the present study, datasets collected between 2009 and 2013 from 245 citrus blocks were used to evaluate precision, sensitivity for detection, and efficiency of five sampling methods. The number of samples needed to reach a 0.25 standard error-to-mean ratio was estimated using Taylor's power law and used to compare precision among sampling methods. Detection sensitivity and time expenditure (cost) of stem-tap and other sampling methods conducted consecutively at the same location were also compared. Stem-tap sampling was the most efficient sampling method when ACP densities were moderate to high and served as the basis for comparison with all other methods. Protocols that grouped trees near randomly selected locations across the block were more efficient than sampling trees at random across the block. Sweep net sampling was similar to stem-taps in number of captures per sampled unit, but less precise at any ACP density. Yellow sticky traps were 14 times more sensitive than stem-taps but much more time consuming and thus less efficient except at very low population densities. Visual sampling was efficient for detecting and monitoring ACP at low densities. Suction sampling was time consuming and taxing but the most sensitive of all methods for detection of sparse populations. This information can be used to optimize ACP monitoring efforts. PMID:26313984
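    The sample-size calculation described in this abstract follows directly from Taylor's power law, which relates sample variance to mean density as s² = a·x̄^b. Setting the standard error-to-mean ratio to a target D and solving for n gives n = a·x̄^(b-2)/D². A minimal sketch of that calculation follows; the coefficients a and b used below are illustrative placeholders, not values from the study:

```python
import math

def taylor_sample_size(mean_density, a, b, d=0.25):
    """Samples needed to reach a target SE/mean ratio d, given
    Taylor's power law s^2 = a * mean^b.
    Derivation: SE/mean = sqrt(s^2 / n) / mean = d
                =>  n = a * mean^(b - 2) / d^2
    """
    return math.ceil(a * mean_density ** (b - 2) / d ** 2)

# Illustrative coefficients only (not taken from the study):
n = taylor_sample_size(mean_density=2.0, a=3.0, b=1.5)
print(n)  # required number of stem-tap samples at this density
```

    Note that when b < 2, as is typical for aggregated insect populations, the required sample size falls as mean density rises, which is consistent with the abstract's finding that stem-tap sampling is most efficient at moderate to high ACP densities.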

  11. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
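    Acceptance sampling by variables, as recommended here over attributes sampling, decides lot acceptance from measured values rather than pass/fail counts. A minimal sketch of the standard one-sided k-method is shown below; the data, specification limit, and acceptability constant k are hypothetical, and a real plan would derive n and k from the target producer's and consumer's risks:

```python
from statistics import mean, stdev

def accept_lot(measurements, lsl, k):
    """One-sided variables acceptance sampling (k-method):
    accept the lot if (xbar - LSL) / s >= k, where xbar and s are
    the sample mean and standard deviation."""
    xbar = mean(measurements)
    s = stdev(measurements)
    return (xbar - lsl) / s >= k

# Hypothetical strength measurements with lower spec limit 100.0
# and an assumed acceptability constant k = 1.5:
data = [108.0, 112.0, 110.0, 109.0, 111.0]
print(accept_lot(data, lsl=100.0, k=1.5))
```

    The attraction of the variables approach, as the abstract notes, is efficiency: because each measurement carries magnitude information, far fewer samples are needed than with attributes (pass/fail) plans for the same protection.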

  12. Fast egg collection method greatly improves randomness of egg sampling in Drosophila melanogaster.

    PubMed

    Schou, Mads Fristrup

    2013-01-01

    When obtaining samples for population genetic studies, it is essential that the sampling is random. For Drosophila, one of the crucial steps in sampling experimental flies is the collection of eggs. Here an egg collection method is presented, which randomizes the eggs in a water column and diminishes environmental variance. This method was compared with a traditional egg collection method where eggs are collected directly from the medium. Within each method the observed and expected standard deviations of egg-to-adult viability were compared, whereby the difference in the randomness of the samples between the two methods was assessed. The method presented here was superior to the traditional method. Only 14% of the samples had a standard deviation higher than expected, as compared with 58% in the traditional method. To reduce bias in the estimation of the variance and the mean of a trait and to obtain a representative collection of genotypes, the method presented here is strongly recommended when collecting eggs from Drosophila.
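    The comparison of observed versus expected standard deviations described above can be sketched as follows. Under random (binomial) sampling of eggs, the expected SD of egg-to-adult viability across replicates is sqrt(p(1-p)/n); an observed SD above that expectation suggests non-random, clustered collection. The function names below are illustrative, not from the paper:

```python
import math

def expected_viability_sd(p, n_eggs):
    """Expected SD of egg-to-adult viability (a proportion) across
    replicate samples of n_eggs eggs, assuming eggs are a random
    (binomial) draw from the population."""
    return math.sqrt(p * (1 - p) / n_eggs)

def exceeds_expected(observed_sd, p, n_eggs):
    """Flag a sample whose observed SD exceeds the binomial
    expectation, suggesting non-random egg collection."""
    return observed_sd > expected_viability_sd(p, n_eggs)

# Hypothetical numbers: mean viability 0.5, 100 eggs per replicate
print(expected_viability_sd(0.5, 100))  # expected SD under randomness
```

    By this criterion, the paper's 14% versus 58% exceedance rates correspond to the fraction of samples flagged by a check like `exceeds_expected` under each collection method.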

  13. 40 CFR 792.35 - Quality assurance unit.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 33 2013-07-01 2013-07-01 false Quality assurance unit. 792.35 Section 792.35 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL... monitoring each study to assure management that the facilities, equipment, personnel, methods,...

  14. Ontario's Quality Assurance Framework: A Critical Response

    ERIC Educational Resources Information Center

    Heap, James

    2013-01-01

    Ontario's Quality Assurance Framework (QAF) is reviewed and found not to meet all five criteria proposed for a strong quality assurance system focused on student learning. The QAF requires a statement of student learning outcomes and a method and means of assessing those outcomes, but it does not require that data on achievement of intended…

  15. 42 CFR 431.53 - Assurance of transportation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Assurance of transportation. 431.53 Section 431.53... Requirements § 431.53 Assurance of transportation. A State plan must— (a) Specify that the Medicaid agency will ensure necessary transportation for recipients to and from providers; and (b) Describe the methods...

  16. 42 CFR 431.53 - Assurance of transportation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Assurance of transportation. 431.53 Section 431.53... Requirements § 431.53 Assurance of transportation. A State plan must— (a) Specify that the Medicaid agency will ensure necessary transportation for recipients to and from providers; and (b) Describe the methods...

  17. 40 CFR 160.35 - Quality assurance unit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... LABORATORY PRACTICE STANDARDS Organization and Personnel § 160.35 Quality assurance unit. (a) A testing... standard operating procedures were made without proper authorization and documentation. (6) Review the final study report to assure that such report accurately describes the methods and standard...

  18. Quality assurance of absorbed energy in Charpy impact test

    NASA Astrophysics Data System (ADS)

    Rocha, C. L. F.; Fabricio, D. A. K.; Costa, V. M.; Reguly, A.

    2016-07-01

    To ensure quality and comply with standard requirements, an intralaboratory study involving two operators was performed for Charpy impact tests. The results, based on ANOVA (Analysis of Variance) and Normalized Error statistical techniques, indicated that execution of the tests is appropriate, as the implemented quality assurance methods showed acceptable results.
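    The Normalized Error technique mentioned above compares two results, each carrying an expanded uncertainty U, via E_n = (x₁ - x₂)/sqrt(U₁² + U₂²), with |E_n| ≤ 1 conventionally taken as acceptable agreement. A minimal sketch follows; the absorbed-energy values and uncertainties are hypothetical, not from the study:

```python
import math

def normalized_error(x_ref, u_ref, x_lab, u_lab):
    """Normalized error E_n between a result and a reference value,
    each with expanded uncertainty U. |E_n| <= 1.0 is conventionally
    taken as acceptable agreement."""
    return (x_lab - x_ref) / math.sqrt(u_lab ** 2 + u_ref ** 2)

# Hypothetical Charpy absorbed-energy results (J) from two operators:
en = normalized_error(x_ref=27.0, u_ref=1.5, x_lab=28.0, u_lab=2.0)
print(abs(en) <= 1.0)  # within the conventional acceptance criterion
```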

  19. Product assurance policies and procedures for flight dynamics software development

    NASA Technical Reports Server (NTRS)

    Perry, Sandra; Jordan, Leon; Decker, William; Page, Gerald; Mcgarry, Frank E.; Valett, Jon

    1987-01-01

    The product assurance policies and procedures necessary to support flight dynamics software development projects for Goddard Space Flight Center are presented. The quality assurance and configuration management methods and tools for each phase of the software development life cycles are described, from requirements analysis through acceptance testing; maintenance and operation are not addressed.

  20. Development and Evaluation of a Micro- and Nanoscale Proteomic Sample Preparation Method

    SciTech Connect

    Wang, Haixing H.; Qian, Weijun; Mottaz, Heather M.; Clauss, Therese R.W.; Anderson, David J.; Moore, Ronald J.; Camp, David G.; Khan, Arshad H.; Sforza, Daniel M.; Pallavicini, Maria; Smith, Desmond J.; Smith, Richard D.

    2005-10-05

    Efficient and effective sample preparation of micro- and nano-scale (microgram and nanogram) clinical specimens for proteomic applications is often difficult due to losses during the processing steps. Herein we describe a simple "single-tube" preparation protocol appropriate for small proteomic samples using the organic co-solvent trifluoroethanol (TFE). TFE facilitates both protein extraction and protein denaturation without requiring a separate cleanup step, thus minimizing sample loss. The performance of the TFE method was initially evaluated by comparing it to traditional detergent-based methods in relatively large-scale sample processing using human breast cancer cells and mouse brain tissue. The results demonstrated that the TFE protocol provided results comparable to the traditional detergent-based protocols for larger samples (milligrams), based on both sample recovery and peptide/protein identification. The effectiveness of this protocol for micro- and nano-scale sample processing was then evaluated for the extraction of proteins/peptides and shown effective for small mouse brain tissue samples (~20 μg total protein content) and also for samples of ~5,000 human breast cancer MCF-7 cells (~500 ng total protein content), where the detergent-based methods were ineffective due to losses during cleanup and transfer steps.