Science.gov

Sample records for assurance sampling method

  1. Specified assurance level sampling procedure

    SciTech Connect

    Willner, O.

    1980-11-01

    In the nuclear industry, design specifications for certain quality characteristics require that the final product be inspected by a sampling plan which can demonstrate product conformance to stated assurance levels. The Specified Assurance Level (SAL) Sampling Procedure has been developed to permit the direct selection of attribute sampling plans which can meet commonly used assurance levels. The SAL procedure contains sampling plans which yield the minimum sample size at stated assurance levels. The SAL procedure also provides sampling plans with acceptance numbers ranging from 0 to 10, thus making available to the user a wide choice of plans, all designed to comply with a stated assurance level.
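The selection logic behind such attribute plans can be sketched numerically: for a given acceptance number c and a stated assurance level, the minimum sample size is the smallest n at which the probability of accepting a lot at the limiting defect fraction falls below the allowed risk. A minimal illustration (function names and example parameters are hypothetical, not taken from the SAL tables):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def min_sample_size(c, p0, assurance):
    """Smallest n such that a plan accepting on <= c defects demonstrates,
    with the stated assurance, that the defect fraction is below p0
    (i.e. P(accept | defect fraction p0) <= 1 - assurance)."""
    n = c + 1
    while binom_cdf(c, n, p0) > 1 - assurance:
        n += 1
    return n

# e.g. 95% assurance that the defect fraction is below 10%, acceptance number 0
print(min_sample_size(0, 0.10, 0.95))  # -> 29
```

Raising the acceptance number buys a more forgiving plan at the price of a larger sample, which is the trade-off the SAL tables lay out.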

  2. Field trial of applicability of lot quality assurance sampling survey method for rapid assessment of prevalence of active trachoma.

    PubMed Central

    Myatt, Mark; Limburg, Hans; Minassian, Darwin; Katyola, Damson

    2003-01-01

    OBJECTIVE: To test the applicability of lot quality assurance sampling (LQAS) for the rapid assessment of the prevalence of active trachoma. METHODS: Prevalence of active trachoma in six communities was found by examining all children aged 2-5 years. Trial surveys were conducted in these communities. A sampling plan appropriate for classifying communities with prevalences ≤20% and ≥40% was applied to the survey data. Operating characteristic and average sample number curves were plotted, and screening test indices were calculated. The ability of LQAS to provide a three-class classification system was investigated. FINDINGS: Ninety-six trial surveys were conducted. All communities with prevalences ≤20% and ≥40% were identified correctly. The method discriminated between communities with prevalences ≤30% and >30%, with sensitivity of 98% (95% confidence interval (CI) = 88.2-99.9%), specificity of 84.4% (CI = 69.9-93.0%), positive predictive value of 87.7% (CI = 75.7-94.5%), negative predictive value of 97.4% (CI = 84.9-99.9%), and accuracy of 91.7% (CI = 83.8-96.1%). Agreement between the three prevalence classes and survey classifications was 84.4% (CI = 75.2-90.7%). The time needed to complete the surveys was consistent with the need to complete a survey in one day. CONCLUSION: Lot quality assurance sampling provides a method of classifying communities according to the prevalence of active trachoma. It merits serious consideration as a replacement for the currently used trachoma rapid assessment method. It may be extended to provide a multi-class classification method. PMID:14997240
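The operating characteristic curve mentioned above follows directly from the binomial model: for a plan with sample size n and decision rule d, the probability of classifying a community as low prevalence is the probability of observing fewer than d cases. A sketch with hypothetical plan parameters (the abstract does not state the exact n and d used):

```python
from math import comb

def prob_classified_low(n, d, p):
    """P(fewer than d active-trachoma cases in a sample of n children
    when true prevalence is p) -- one point on the OC curve."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d))

# Hypothetical plan: examine 50 children, classify "high" on >= 16 cases
for p in (0.10, 0.20, 0.30, 0.40, 0.50):
    print(f"prevalence {p:.0%}: P(classified low) = {prob_classified_low(50, 16, p):.3f}")
```

Plotting this probability across true prevalences gives the OC curve; a good plan keeps it near 1 below the lower threshold and near 0 above the upper one.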

  3. Comparing two survey methods of measuring health-related indicators: Lot Quality Assurance Sampling and Demographic Health Surveys.

    PubMed

    Anoke, Sarah C; Mwai, Paul; Jeffery, Caroline; Valadez, Joseph J; Pagano, Marcello

    2015-12-01

    Two common methods used to measure indicators for health programme monitoring and evaluation are the demographic and health surveys (DHS) and lot quality assurance sampling (LQAS); each has different strengths. We report on both methods when utilised in comparable situations. We compared 24 indicators in south-west Uganda, where data for prevalence estimations were collected independently for the two methods in 2011 (LQAS: n = 8876; DHS: n = 1200). Data were stratified (e.g. by gender and age), resulting in 37 comparisons. We used a two-sample two-sided Z-test of proportions to compare both methods. The average difference between LQAS and DHS for the 37 estimates was 0.062 (SD = 0.093; median = 0.039). The average difference among the 21 failures to reject equality of proportions was 0.010 (SD = 0.041; median = 0.009); among the 16 rejections, it was 0.130 (SD = 0.010, median = 0.118). Seven of the 16 rejections exhibited absolute differences of <0.10, which are clinically (or managerially) not significant; five had differences >0.10 and <0.20 (mean = 0.137, SD = 0.031), and four had differences >0.20 (mean = 0.261, SD = 0.083). There was 75.7% agreement across the two surveys. Both methods yield regional results, but only LQAS provides information at lower administrative levels (e.g. the district level), where managerial action is taken. The cost advantage and localisation make LQAS feasible to conduct more frequently, and provide the possibility of real-time health outcomes monitoring. © 2015 The Authors. Tropical Medicine & International Health Published by John Wiley & Sons Ltd.
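The comparison statistic used above is the standard pooled two-sample z-test of proportions; a compact version (illustrative only, not the authors' code, and the example counts are made up):

```python
from math import sqrt, erf

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided two-sample z-test for equality of proportions,
    using the pooled standard error. Returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical indicator measured by both surveys
z, p = two_prop_ztest(300, 1000, 200, 1000)
print(f"z = {z:.2f}, p = {p:.4g}")
```

As in the paper, a statistical rejection need not be managerially significant; the absolute difference in proportions still has to be judged on its own scale.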

  4. Quality Assurance for Water Sampling.

    DTIC Science & Technology

    1986-02-01

    ...in meaningful (precise) data. SAMPLE ACQUISITION: Groundwater samples can be contaminated by material and/or equipment used to install the...samples must be shipped according to Department of Transportation (DOT) standards. Groundwater and wastewater samples are not considered hazardous...Extraction; *40 Days After Extraction. Radiological Tests (Alpha, Beta and Radium): P, G; HNO3 to pH <2; 6 Months. NOTES: P = Polyethylene, G = Glass...

  5. Comparative validation study to demonstrate the equivalence of a minor modification to AOAC Official Method 2005.05 Assurance GDS Shiga Toxin Genes (O157) method to the reference culture method: 375 gram sample size.

    PubMed

    Feldsine, Philip T; Montgomery-Fullerton, Megan; Roa, Nerie; Kaur, Mandeep; Kerr, David E; Lienau, Andrew H; Jucker, Markus

    2013-01-01

    The Assurance GDS Shiga Toxin Genes (O157), AOAC Official Method 2005.05, has been modified to include a larger sample size of 375 g. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Ninety samples and controls, representing three foods, were analyzed. Results show no statistically detectable difference between the Assurance GDS Escherichia coli O157:H7 assay and the reference culture methods for the detection of E. coli O157:H7, other than the low level of inoculation for leaf lettuce, for which the GDS gave noticeably higher recovery [difference in probability of detection between candidate methods (dPODc = +0.45)]. There were also suggestions of moderate differences (dPODc = +0.15 to +0.20) for ground beef and the high level of leaf lettuce, but the study size was too small to detect differences of this size. Results showed that the Assurance GDS Shiga Toxin Genes (O157) method is equivalent to the reference culture methods for the detection of Shiga toxigenic E. coli O157:H7.

  6. Comparative validation study to demonstrate the equivalence of a minor modification to AOAC Official Method 2005.04 Assurance GDS E. coli O157:H7 method to the reference culture method: 375 gram sample size.

    PubMed

    Feldsine, Philip T; Montgomery-Fullerton, Megan; Roa, Nerie; Kaur, Mandeep; Lienau, Andrew H; Jucker, Markus; Kerr, David E

    2013-01-01

    The Assurance GDS Escherichia coli (E. coli) O157:H7, AOAC Official Method 2005.04, has been modified to include a larger sample size of 375 g. A methods comparison study was conducted to demonstrate the equivalence of this modification to the reference culture method. Ninety samples and controls, representing three foods, were analyzed. Results show no statistically detectable difference between the Assurance GDS E. coli O157:H7 assay and the reference culture methods for the detection of E. coli O157:H7, other than the low level of inoculation for leaf lettuce, for which the GDS gave noticeably higher recovery [difference in probability of detection between candidate methods (dPODc = +0.45)]. There were also suggestions of moderate differences (dPODc = +0.15 to +0.20) for ground beef and the high level of leaf lettuce, but the study size was too small to detect differences of this size. Results showed that the Assurance GDS E. coli O157:H7 method is equivalent to reference culture methods for the detection of E. coli O157:H7.

  7. Sampling for assurance of future reliability

    NASA Astrophysics Data System (ADS)

    Klauenberg, Katy; Elster, Clemens

    2017-02-01

    Ensuring measurement trueness, compliance with regulations and conformity with standards are key tasks in metrology which are often considered at the time of an inspection. Current practice does not always verify quality after or between inspections, calibrations, laboratory comparisons, conformity assessments, etc. Statistical models describing behavior over time may ensure reliability, i.e. they may give the probability of functioning, compliance or survival until some future point in time. It may not always be possible or economic to inspect a whole population of measuring devices or other units. Selecting a subset of the population according to statistical sampling plans and inspecting only these allows conclusions about the quality of the whole population with a certain confidence. Combining these issues of sampling and aging raises questions such as: how many devices need to be inspected, and at least how many of them must conform, so that one can be sure that more than 100p% of the population will comply until the next inspection? This research aims to raise awareness and offer a simple answer to such time- and sample-based quality statements in metrology and beyond. Reliability demonstration methods, such as the prevailing Weibull binomial model, quantify the confidence in future reliability on the basis of a sample. We adapt the binomial model to be applicable to sampling without replacement and simplify the Weibull model so that sampling plans may be determined on the basis of existing ISO standards. Provided the model is suitable, no additional information and no software are needed; and yet, the consumer is protected against future failure. We establish new sampling plans for utility meter surveillance, which are required by a recent modification of German law. These sampling plans are given in tables similar to the previous ones, which demonstrates their suitability for everyday use.
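The adaptation to sampling without replacement can be sketched with the hypergeometric distribution: given a finite population of N devices and a sample of n showing at most c nonconforming units, one can bound, at a stated confidence, how many nonconforming units the whole population may contain. This is a simplified sketch of the finite-population idea only, not the paper's Weibull model; function names are illustrative:

```python
from math import comb

def hyper_cdf(c, N, D, n):
    """P(X <= c) for X = nonconforming units in a sample of n drawn
    without replacement from N units, D of which are nonconforming.
    (math.comb returns 0 when k > n, which handles the edge terms.)"""
    return sum(comb(D, k) * comb(N - D, n - k) for k in range(c + 1)) / comb(N, n)

def max_nonconforming(N, n, c, confidence):
    """Largest nonconforming count D consistent with observing at most c
    nonconforming units in the sample: any larger count would make the
    observation improbable (probability <= 1 - confidence)."""
    alpha = 1 - confidence
    D = c  # the population contains at least the c units observed
    while D + 1 <= N and hyper_cdf(c, N, D + 1, n) > alpha:
        D += 1
    return D

# Inspect 10 of 100 meters, none fail: how many nonconforming meters
# can the population contain at 95% confidence?
print(max_nonconforming(100, 10, 0, 0.95))
```

Note the contrast with the binomial (with-replacement) bound: for small populations the hypergeometric calculation gives a tighter statement from the same sample.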

  8. Chesapeake Bay coordinated split sample program annual report, 1990-1991: Analytical methods and quality assurance workgroup of the Chesapeake Bay program monitoring subcommittee

    SciTech Connect

    Not Available

    1991-01-01

    The Chesapeake Bay Program (CBP) is a federal-state partnership with a goal of restoring the Chesapeake Bay. Its ambient water quality monitoring programs, started in 1984, sample over 150 monitoring stations once or twice a month. Due to the size of the Bay watershed (64,000 square miles) and the cooperative nature of the CBP, these monitoring programs involve 10 different analytical laboratories. The Chesapeake Bay Coordinated Split Sample Program (CSSP), initiated in 1988, assesses the comparability of the water quality results from these laboratories. The report summarizes CSSP results for 1990 and 1991, its second and third full years of operation. The CSSP has two main objectives: identifying parameters with low inter-organization agreement, and estimating measurement system variability. The identification of parameters with low agreement is used as part of the overall Quality Assurance program. Laboratory and program personnel use the information to investigate possible causes of the differences, and take action to increase agreement if possible. Later CSSP results will document any improvements in inter-organization agreement. The variability estimates are most useful to data analysts and modelers who need confidence estimates for monitoring data.

  9. Authentication Assurance Level Application to the Inventory Sampling Measurement System

    SciTech Connect

    Devaney, Mike M.; Kouzes, Richard T.; Hansen, Randy R.; Geelhood, Bruce D.

    2001-09-06

    This document concentrates on the identification of a standardized assessment approach for the verification of security functionality in specific equipment, the Inspection Sampling Measurement System (ISMS) being developed for MAYAK. Specifically, an Authentication Assurance Level 3 is proposed to be reached in authenticating the ISMS.

  10. Measurement assurance program for LSC analyses of tritium samples

    SciTech Connect

    Levi, G.D. Jr.; Clark, J.P.

    1997-05-01

    Liquid Scintillation Counting (LSC) for tritium is done on 600 to 800 samples daily as part of a contamination control program at the Savannah River Site's Tritium Facilities. The tritium results from the LSCs are used: to release items as radiologically clean; to establish radiological control measures for workers; and to characterize waste. The following sample matrices are analyzed for tritium: filter paper smears, aqueous, oil, oily rags, ethylene glycol, ethyl alcohol, freon and mercury. Routine and special causes of variation in standards, counting equipment, environment, operators, counting times, samples, activity levels, etc. produce uncertainty in the LSC measurements. A comprehensive analytical process measurement assurance program such as JTIPMAP™ has been implemented. The process measurement assurance program is being used to quantify and control many of the sources of variation and provide accurate estimates of the overall measurement uncertainty associated with the LSC measurements. The paper will describe LSC operations, process improvements, quality control and quality assurance programs along with future improvements associated with the implementation of the process measurement assurance program.

  11. Choosing a Cluster Sampling Design for Lot Quality Assurance Sampling Surveys.

    PubMed

    Hund, Lauren; Bedrick, Edward J; Pagano, Marcello

    2015-01-01

    Lot quality assurance sampling (LQAS) surveys are commonly used for monitoring and evaluation in resource-limited settings. Recently several methods have been proposed to combine LQAS with cluster sampling for more timely and cost-effective data collection. For some of these methods, the standard binomial model can be used for constructing decision rules as the clustering can be ignored. For other designs, considered here, clustering is accommodated in the design phase. In this paper, we compare these latter cluster LQAS methodologies and provide recommendations for choosing a cluster LQAS design. We compare technical differences in the three methods and determine situations in which the choice of method results in a substantively different design. We consider two different aspects of the methods: the distributional assumptions and the clustering parameterization. Further, we provide software tools for implementing each method and clarify misconceptions about these designs in the literature. We illustrate the differences in these methods using vaccination and nutrition cluster LQAS surveys as example designs. The cluster methods are not sensitive to the distributional assumptions but can result in substantially different designs (sample sizes) depending on the clustering parameterization. However, none of the clustering parameterizations used in the existing methods appears to be consistent with the observed data, and, consequently, choice between the cluster LQAS methods is not straightforward. Further research should attempt to characterize clustering patterns in specific applications and provide suggestions for best-practice cluster LQAS designs on a setting-specific basis.

  12. Extending cluster lot quality assurance sampling designs for surveillance programs.

    PubMed

    Hund, Lauren; Pagano, Marcello

    2014-07-20

    Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate.
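A common first-order way to accommodate clustering in the design phase is to inflate the simple-random-sample size by the design effect DEFF = 1 + (m − 1)·ICC. The papers above develop more refined, setting-specific procedures, but the heuristic conveys the idea; the parameter values below are illustrative:

```python
import math

def cluster_lqas_size(n_srs, m, icc):
    """Inflate an SRS-based LQAS sample size n_srs for a two-stage cluster
    design with clusters of size m and intracluster correlation icc.
    Returns (total sample size, number of clusters)."""
    deff = 1 + (m - 1) * icc
    # round before ceil to dodge floating-point fuzz (e.g. 114.000000000001)
    n = math.ceil(round(n_srs * deff, 6))
    return n, math.ceil(n / m)

# e.g. an SRS plan of n = 60, sampling 10 children per village, ICC 0.1
print(cluster_lqas_size(60, 10, 0.1))
```

The sensitivity noted in the comparison paper shows up here directly: the total sample size scales linearly with the assumed ICC, so the clustering parameterization can dominate the design.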

  13. Illinois' Forests, 2005: Statistics, Methods, and Quality Assurance

    Treesearch

    Susan J. Crocker; Charles J. Barnett; Mark A. Hatfield

    2013-01-01

    The first full annual inventory of Illinois' forests was completed in 2005. This report contains 1) descriptive information on methods, statistics, and quality assurance of data collection, 2) a glossary of terms, 3) tables that summarize quality assurance, and 4) a core set of tabular estimates for a variety of forest resources. A detailed analysis of inventory...

  14. Japanese Society for Laboratory Hematology flow cytometric reference method of determining the differential leukocyte count: external quality assurance using fresh blood samples.

    PubMed

    Kawai, Y; Nagai, Y; Ogawa, E; Kondo, H

    2017-04-01

    To provide target values for the manufacturers' survey of the Japanese Society for Laboratory Hematology (JSLH), accurate standard data from healthy volunteers were needed for the five-part differential leukocyte count. To obtain such data, JSLH required an antibody panel that achieved high specificity (particularly for mononuclear cells) using simple gating procedures. We developed a flow cytometric method for determining the differential leukocyte count (JSLH-Diff) and validated it by comparison with the flow cytometric differential leukocyte count of the International Council for Standardization in Haematology (ICSH-Diff) and the manual differential count obtained by microscopy (Manual-Diff). First, the reference laboratory performed an imprecision study of JSLH-Diff and ICSH-Diff, as well as performing comparisons among JSLH-Diff, Manual-Diff, and ICSH-Diff. Then two reference laboratories and seven participating laboratories performed imprecision and accuracy studies of JSLH-Diff, Manual-Diff, and ICSH-Diff. Simultaneously, six manufacturers' laboratories provided their own representative values by using automated hematology analyzers. The precision of both the JSLH-Diff and ICSH-Diff methods was adequate. Comparison by the reference laboratory showed that all correlation coefficients, slopes and intercepts obtained by the JSLH-Diff, ICSH-Diff, and Manual-Diff methods conformed to the criteria. When the imprecision and accuracy of JSLH-Diff were assessed at seven laboratories, the CV% for lymphocytes, neutrophils, monocytes, eosinophils, and basophils was 0.5~0.9%, 0.3~0.7%, 1.7~2.6%, 3.0~7.9%, and 3.8~10.4%, respectively. More than 99% of CD45-positive leukocytes were identified as normal leukocytes by JSLH-Diff. When the JSLH-Diff method was validated by comparison with Manual-Diff and ICSH-Diff, it showed good performance as a reference method. © 2016 John Wiley & Sons Ltd.

  15. Assure

    Integrated Risk Information System (IRIS)

    Assure; CASRN 76578-14-8. Human health assessment information on a chemical substance is included in the IRIS database only after a comprehensive review of toxicity data, as outlined in the IRIS assessment development process. Sections I (Health Hazard Assessments for Noncarcinogenic Effects)...

  16. Optimization of single plate-serial dilution spotting (SP-SDS) with sample anchoring as an assured method for bacterial and yeast cfu enumeration and single colony isolation from diverse samples.

    PubMed

    Thomas, Pious; Sekhar, Aparna C; Upreti, Reshmi; Mujawar, Mohammad M; Pasha, Sadiq S

    2015-12-01

    We propose a simple technique for bacterial and yeast cfu estimations from diverse samples with no prior idea of viable counts, designated as single plate-serial dilution spotting (SP-SDS), with the prime recommendation of sample anchoring (10⁰ stocks). For pure cultures, serial dilutions were prepared from a 0.1 OD (10⁰) stock, and 20 μl aliquots of six dilutions (10¹-10⁶) were applied as 10-15 micro-drops in six sectors over agar-gelled medium in 9-cm plates. For liquid samples, 10⁰-10⁵ dilutions, and for colloidal suspensions and solid samples (10% w/v), 10¹-10⁶ dilutions were used. Following incubation, at least one dilution level yielded 6-60 cfu per sector, comparable to the standard method involving 100 μl samples. Tested on diverse bacteria, composite samples and Saccharomyces cerevisiae, SP-SDS offered wider applicability over alternative methods like drop-plating and track-dilution for cfu estimation, single colony isolation and culture purity testing, particularly suiting low-resource settings.
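The cfu arithmetic behind dilution spotting is simple: a count from a countable sector is scaled by the spotted volume and the dilution factor. A sketch assuming the protocol's 20 µl spots (the function name is illustrative, not from the paper):

```python
def cfu_per_ml(colonies, dilution_exponent, volume_ul=20):
    """cfu/ml of the undiluted (10^0) stock, given a colony count from a
    volume_ul aliquot of the 10^dilution_exponent-fold dilution."""
    # scale count to 1 ml (1000 ul), then undo the dilution
    return colonies * 10 ** dilution_exponent * 1000 / volume_ul

# 30 colonies in the 10^4 sector -> 1.5e7 cfu/ml in the stock
print(cfu_per_ml(30, 4))
```

The 6-60 cfu-per-sector window quoted above is what keeps one sector countable: below it the estimate is noisy, above it colonies merge.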

  17. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and provide tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  18. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. These methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were also investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. These methods were also successfully applied to the second prepreg system.

  19. Measurement assurance program for FTIR analyses of deuterium oxide samples

    SciTech Connect

    Johnson, S.R.; Clark, J.P.

    1997-01-01

    Analytical chemistry measurements require an installed criterion-based assessment program to identify and control sources of error. This program should also gauge the uncertainty about the data. A self-assessment was performed of long-established quality control practices against the characteristics of a comprehensive measurement assurance program. Opportunities for improvement were identified. This paper discusses the efforts to transform quality control practices into a complete measurement assurance program. The resulting program heightened the laboratory's confidence in the data it generated by providing real-time statistical information to control and determine measurement quality.

  20. Using lot quality assurance sampling to improve immunization coverage in Bangladesh.

    PubMed Central

    Tawfik, Y.; Hoque, S.; Siddiqi, M.

    2001-01-01

    OBJECTIVE: To determine areas of low vaccination coverage in five cities in Bangladesh (Chittagong, Dhaka, Khulna, Rajshahi, and Syedpur). METHODS: Six studies using lot quality assurance sampling were conducted between 1995 and 1997 by Basic Support for Institutionalizing Child Survival and the Bangladesh National Expanded Programme on Immunization. FINDINGS: BCG vaccination coverage was acceptable in all lots studied; however, the proportion of lots rejected because coverage of measles vaccination was low ranged from 0% of lots in Syedpur to 12% in Chittagong and 20% in Dhaka's zones 7 and 8. The proportion of lots rejected because an inadequate number of children in the sample had been fully vaccinated varied from 11% in Syedpur to 30% in Dhaka. Additionally, analysis of aggregated, weighted immunization coverage showed that there was high BCG vaccination coverage (the first administered vaccine) and low measles vaccination coverage (the last administered vaccine), indicating a high drop-out rate, ranging from 14% in Syedpur to 36% in Dhaka's zone 8. CONCLUSION: In Bangladesh, where resources are limited, results from surveys using lot quality assurance sampling enabled managers of the National Expanded Programme on Immunization to identify areas with poor vaccination coverage. Those areas were targeted to receive focused interventions to improve coverage. Since this sampling method requires only a small sample size and is easy for staff to use, it is feasible for routine monitoring of vaccination coverage. PMID:11436470

  21. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    PubMed Central

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-01-01

    OBJECTIVE: To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. METHODS: The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. FINDINGS: Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. CONCLUSION: When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease. PMID:16283052

  22. [Quality assurance in geriatric rehabilitation: approaches and methods].

    PubMed

    Deckenbach, B; Borchelt, M; Steinhagen-Thiessen, E

    1997-08-01

    Quality assurance issues had gained significance in geriatric rehabilitation even before the provisions of the 5th Book of the Social Code came into force. While the surgical specialties have already gathered several years of experience, in particular with external quality assurance, suitable concepts and methods for the new geriatric rehabilitation specialty are still in the early stages of development. Proven methods from the industrial and service sectors, such as auditing, monitoring and quality circles, can in principle be drawn on for devising geriatric rehabilitation quality assurance schemes. These in particular need to take into account the multiple factors influencing the course and outcome of rehabilitation entailed by multimorbidity and multi-drug use; the eminent role of the social environment; therapeutic interventions by a multidisciplinary team; and the multi-dimensional nature of rehabilitation outcomes. Moreover, the specific conditions of geriatric rehabilitation require the development not only of quality standards unique to this domain but also of quality assurance procedures specific to geriatrics. Along with a number of other methods, standardized geriatric assessment will play a crucial role in this respect.

  23. Estimation after classification using lot quality assurance sampling: corrections for curtailed sampling with application to evaluating polio vaccination campaigns.

    PubMed

    Olives, Casey; Valadez, Joseph J; Pagano, Marcello

    2014-03-01

    To assess the bias incurred when curtailment of lot quality assurance sampling (LQAS) is ignored, to present unbiased estimators, to consider the impact of cluster sampling by simulation, and to apply our method to published polio immunization data from Nigeria. We present estimators of coverage when using two kinds of curtailed LQAS strategies: semicurtailed and curtailed. We study the proposed estimators with independent and clustered data using three field-tested LQAS designs for assessing polio vaccination coverage, with samples of size 60 and decision rules of 9, 21 and 33, and compare them to biased maximum likelihood estimators. Lastly, we present estimates of polio vaccination coverage from previously published data in 20 local government authorities (LGAs) from five Nigerian states. Simulations illustrate substantial bias if one ignores the curtailed sampling design. Proposed estimators show no bias. Clustering does not affect the bias of these estimators. Across simulations, standard errors show signs of inflation as clustering increases. Neither sampling strategy nor LQAS design influences estimates of polio vaccination coverage in the 20 Nigerian LGAs. When coverage is low, semicurtailed LQAS strategies considerably reduce the sample size required to make a decision. Curtailed LQAS designs further reduce the sample size when coverage is high. The results presented dispel the misconception that curtailed LQAS data are unsuitable for estimation. These findings augment the utility of LQAS as a tool for monitoring vaccination efforts by demonstrating that unbiased estimation using curtailed designs is not only possible, but that these designs also reduce the sample size. © 2014 John Wiley & Sons Ltd.
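The direction of the bias is easy to demonstrate by simulation: under a semicurtailed design that stops as soon as the lot must fail, the last unit inspected is always a failure, which pulls the naive estimate (vaccinated/inspected) downward. A toy sketch whose parameters echo the n = 60 plans mentioned above, but which is illustrative only and not the authors' estimator:

```python
import random

def naive_curtailed_mean(p, n=60, d=33, trials=50000, seed=7):
    """Mean of the naive coverage estimate vaccinated/inspected under a
    semicurtailed LQAS plan: sampling stops once the unvaccinated count
    exceeds n - d, because the lot can no longer meet decision rule d."""
    rng = random.Random(seed)
    fail_limit = n - d
    total = 0.0
    for _ in range(trials):
        vacc = unvacc = 0
        while vacc + unvacc < n and unvacc <= fail_limit:
            if rng.random() < p:
                vacc += 1
            else:
                unvacc += 1
        total += vacc / (vacc + unvacc)
    return total / trials

# At true coverage 30%, the naive estimate averages below 0.30
print(naive_curtailed_mean(0.30))
```

Correcting for the stopping rule, as the paper's estimators do, removes this bias while keeping the sample-size savings of curtailment.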

  4. Chapter 5: Quality assurance/quality control in stormwater sampling

    USDA-ARS?s Scientific Manuscript database

    Sampling the quality of stormwater presents unique challenges because stormwater flow is relatively short-lived with drastic variability. Furthermore, storm events often occur with little advance warning, outside conventional work hours, and under adverse weather conditions. Therefore, most stormwat...

  5. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  6. Evaluation of a clinically intuitive quality assurance method

    NASA Astrophysics Data System (ADS)

    Norris, H.; Thomas, A.; Oldham, M.

    2013-06-01

    There is a pressing need for clinically intuitive quality assurance methods that report metrics of relevance to the likely impact on tumor control and normal tissue injury. This paper presents a preliminary investigation into the accuracy of a novel "transform method" which enables a clinically relevant analysis through dose-volume-histograms (DVHs) and dose overlays on the patient's CT data. The transform method was tested by inducing a series of known mechanical and delivery errors onto simulated 3D dosimetry measurements of six different head-and-neck IMRT treatment plans. Accuracy was then examined through the comparison of the transformed patient dose distributions and the known actual patient dose distributions through dose-volume histograms and normalized dose difference analysis. Through these metrics, the transform method was found to be highly accurate in predicting measured patient dose distributions for these types of errors.
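
    For readers unfamiliar with the DVH metric used above: a cumulative dose-volume histogram reports the fraction of a structure's voxels receiving at least each dose level. A toy sketch follows (the voxel doses are invented; this is not the paper's transform method):

```python
# Toy illustration of a cumulative dose-volume histogram (DVH): for each
# dose level, report the fraction of structure voxels receiving at least
# that dose. Voxel doses below are invented.
def cumulative_dvh(doses_gy, dose_levels_gy):
    n = len(doses_gy)
    return [sum(d >= level for d in doses_gy) / n for level in dose_levels_gy]

# invented voxel doses (Gy) for a small structure
voxels = [10.2, 30.5, 45.0, 50.1, 52.3, 54.0, 55.2, 60.7]
levels = [0, 20, 40, 50, 60]
print(cumulative_dvh(voxels, levels))  # [1.0, 0.875, 0.75, 0.625, 0.125]
```

Comparing such curves for the transformed and the known dose distributions is one of the accuracy checks described in the abstract.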

  7. Validity of Lot Quality Assurance Sampling to optimize falciparum malaria surveys in low-transmission areas.

    PubMed

    Rabarijaona, L; Rakotomanana, F; Ranaivo, L; Raharimalala, L; Modiano, D; Boisier, P; De Giorgi, F; Raveloson, N; Jambou, R

    2001-01-01

    To control the reappearance of malaria in the Madagascan highlands, indoor house-spraying of DDT was conducted from 1993 until 1998. Before the end of the insecticide-spraying programme, a surveillance system was set up to allow rapid identification of new malaria epidemics. When the number of suspected clinical malaria cases notified to the surveillance system exceeds a predetermined threshold, a parasitological survey is carried out in the community to confirm whether or not transmission of falciparum malaria is increasing. Owing to the low specificity of the surveillance system, this confirmation stage is essential to guide the activities of the control programme. For this purpose, Lot Quality Assurance Sampling (LQAS), which usually requires smaller sample sizes, seemed to be a valuable alternative to conventional survey methods. In parallel to a conventional study of Plasmodium falciparum prevalence carried out in 1998, we investigated the ability of LQAS to rapidly classify zones according to a predetermined prevalence level. Two prevalence thresholds (5% and 15%) were tested using various sampling plans. A plan (36, 2), under which a community is classified as exceeding the threshold when at least 2 individuals in a random sample of 36 are found to be positive, enabled us to classify a community correctly with a sensitivity of 100% and a specificity of 94%. LQAS is an effective tool for rapid assessment of falciparum malaria prevalence when monitoring malaria transmission.
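
    The operating characteristic of a plan such as (36, 2) follows directly from binomial tail probabilities. A minimal sketch (the prevalence values below are illustrative evaluation points, not necessarily the ones used in the study):

```python
# Operating characteristic of the LQAS plan (36, 2): a community is flagged
# as exceeding the prevalence threshold when at least 2 positives are found
# among 36 sampled individuals.
from math import comb

def prob_flagged(p, n=36, d=2):
    """P(at least d positives in n draws) under binomial sampling."""
    return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(d))

for p in (0.01, 0.05, 0.15):
    print(f"prevalence {p:.0%}: flagged with probability {prob_flagged(p):.3f}")
```

A community at 15% prevalence is flagged almost certainly, while one at 1% is flagged only rarely, which is consistent with the high sensitivity and specificity reported in the abstract.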

  8. Evaluation of a Standardized Method of Quality Assurance in Mental Health Records: A Pilot Study

    ERIC Educational Resources Information Center

    Bradshaw, Kelsey M.; Donohue, Bradley; Fayeghi, Jasmine; Lee, Tiffany; Wilks, Chelsey R.; Ross, Brendon

    2016-01-01

    The widespread adoption of research-supported treatments by mental health providers has facilitated empirical development of quality assurance (QA) methods. Research in this area has focused on QA systems aimed at assuring the integrity of research-supported treatment implementation, while examination of QA systems to assure appropriate…

  10. Transition Path Sampling Methods

    NASA Astrophysics Data System (ADS)

    Dellago, C.; Bolhuis, P. G.; Geissler, P. L.

    Transition path sampling, based on statistical mechanics in trajectory space, is a set of computational methods for the simulation of rare events in complex systems. In this chapter we give an overview of these techniques and describe their statistical mechanical basis as well as their application.

  11. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2017-03-07

    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  12. Quality Assurance in Online Content Literacy Methods Courses

    ERIC Educational Resources Information Center

    Marsh, Josephine P.; Lammers, Jayne C.; Alvermann, Donna E.

    2012-01-01

    As institutions offer more online courses in their teacher certification and literacy master's programs, research is needed to address issues of quality assurance in online instruction. This multicase study qualitatively analyzes elements for addressing quality assurance in the implementation of an online content literacy teacher education course…

  13. An assessment of Lot Quality Assurance Sampling to evaluate malaria outcome indicators: extending malaria indicator surveys

    PubMed Central

    Biedron, Caitlin; Pagano, Marcello; Hedt, Bethany L; Kilian, Albert; Ratcliffe, Amy; Mabunda, Samuel; Valadez, Joseph J

    2010-01-01

    Background Large investments and increased global prioritization of malaria prevention and treatment have resulted in greater emphasis on programme monitoring and evaluation (M&E) in many countries. Many countries currently use large multistage cluster sample surveys to monitor malaria outcome indicators on a regional and national level. However, these surveys often mask local-level variability important to programme management. Lot Quality Assurance Sampling (LQAS) has played a valuable role for local-level programme M&E. If incorporated into these larger surveys, it would provide a comprehensive M&E plan at little, if any, extra cost. Methods The Mozambique Ministry of Health conducted a Malaria Indicator Survey (MIS) in June and July 2007. We applied LQAS classification rules to the 345 sampled enumeration areas to demonstrate identifying high- and low-performing areas with respect to two malaria program indicators—‘household possession of any bednet’ and ‘household possession of any insecticide-treated bednet (ITN)’. Results As shown by the MIS, no province in Mozambique achieved the 70% coverage target for household possession of bednets or ITNs. By applying LQAS classification rules to the data, we identify 266 of the 345 enumeration areas as having bednet coverage severely below the 70% target. An additional 73 were identified with low ITN coverage. Conclusions This article demonstrates the feasibility of integrating LQAS into multistage cluster sampling surveys and using these results to support a comprehensive national, regional and local programme M&E system. Furthermore, in the recommendations we outline how to integrate the Large Country-LQAS design into macro-surveys while still obtaining results available through current sampling practices. PMID:20139435
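
    A hypothetical sketch of the kind of LQAS classification rule applied here. The household counts and the 10% risk level are invented for illustration; the survey's actual per-area sample sizes and decision rules are not reproduced.

```python
# Classify an enumeration area as below the 70% bednet-coverage target when
# the number of covered households does not exceed a decision value d, where
# d is chosen so that a truly on-target (70%) area is misclassified with
# probability at most alpha. Counts below are invented.
from math import comb

def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def decision_value(n, target=0.70, alpha=0.10):
    """Largest d with P(X <= d | p = target) <= alpha (or -1 if none)."""
    d = -1
    while binom_cdf(d + 1, n, target) <= alpha:
        d += 1
    return d

# (households sampled, households owning a bednet) for three invented areas
areas = {"EA-01": (12, 5), "EA-02": (12, 10), "EA-03": (15, 8)}
for name, (n, covered) in areas.items():
    verdict = "below target" if covered <= decision_value(n) else "cannot reject target"
    print(name, verdict)
```

With 12 households the decision value works out to 5, and with 15 households to 7, so EA-01 is classified below target while the other two invented areas are not.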

  14. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

    This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time consuming. The purpose of such a statistical method is to provide options in addition to the "all or nothing" options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.
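
    The balance between risk and inspection cost in attribute sampling plans of this kind can be sketched with binomial tail probabilities. The following is an assumed illustration: the 10% error-rate limit and 95% assurance level are invented parameters, not values from the report.

```python
# Smallest sample size n such that a data package with a true discrepancy
# rate at the limit p would still pass inspection (at most c discrepancies
# found) with probability no greater than 1 - assurance.
from math import comb

def accept_prob(n, c, p):
    """P(at most c discrepancies in a sample of n) at error rate p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

def min_sample_size(c, p_limit, assurance=0.95):
    n = c + 1
    while accept_prob(n, c, p_limit) > 1 - assurance:
        n += 1
    return n

for c in (0, 1, 2):
    print(f"c = {c}: n = {min_sample_size(c, p_limit=0.10)}")  # 29, 46, 61
```

Raising the acceptance number c buys tolerance for isolated discrepancies at the cost of a larger sample, which is exactly the risk-versus-inspection-cost trade-off the report describes.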

  15. Modified aerospace reliability and quality assurance method for wind turbines

    NASA Technical Reports Server (NTRS)

    Klein, W. E.

    1980-01-01

    The safety, reliability, and quality assurance (SR&QA) approach developed for the first large wind turbine generator project is described. The SR&QA approach was used to assure that the machine would not be hazardous to the public or operating personnel, would operate unattended on a utility grid, would demonstrate reliable operation and would help establish the quality assurance and maintainability requirements for future wind turbine projects. A modified failure modes and effects analysis during the design phase, minimal hardware inspections during parts fabrication, and three simple documents to control activities during machine construction and operation were presented.

  16. QUALITY ASSURANCE PROGRAM FOR WET DEPOSITION SAMPLING AND CHEMICAL ANALYSES FOR THE NATIONAL TRENDS NETWORK.

    USGS Publications Warehouse

    Schroder, LeRoy J.; Malo, Bernard A.; ,

    1985-01-01

    The purpose of the National Trends Network is to delineate the major inorganic constituents in the wet deposition in the United States. The approach chosen to monitor the Nation's wet deposition is to install approximately 150 automatic sampling devices with at least one collector in each state. Samples are collected at one week intervals, removed from collectors, and transported to an analytical laboratory for chemical analysis. The quality assurance program has divided wet deposition monitoring into 5 parts: (1) Sampling site selection, (2) sampling device, (3) sample container, (4) sample handling, and (5) laboratory analysis. Each of these five components is being examined using existing designs or new designs. Each existing or proposed sampling site is visited and a criteria audit is performed.

  17. 42 CFR 440.260 - Methods and standards to assure quality of services.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    42 Public Health 4 (2010-10-01). Methods and standards to assure quality of services. 440.260 Section 440.260 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH... and Limits Applicable to All Services § 440.260 Methods and standards to assure quality of services...

  18. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  20. Triangle Area Water Supply Monitoring Project, October 1988 Through September 2001, North Carolina-Description of the Water-Quality Network, Sampling and Analysis Methods, and Quality-Assurance Practices

    DTIC Science & Technology

    2004-01-01

    The focus of phase III was to assess the occurrence of Cryptosporidium parvum oocysts and Giardia lamblia cysts in raw-water supplies. In phase IV, the special focus was on the collection of more samples during high-flow

  1. Sampling system and method

    SciTech Connect

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

    An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together. The tubing bundle is periodically secured to the wireline using a clamp.

  2. HDR quality assurance methods for personal digital assistants.

    PubMed

    Astrahan, Melvin A

    2004-01-01

    An important component of every clinical high-dose-rate (HDR) brachytherapy program is quality assurance (QA). One of the QA recommendations of the AAPM TG59 report is an independent verification of the results of treatment planning. It is desirable for the verification procedure to be as quick and easy to perform as possible and yet to have a high probability of detecting significant errors. The objective of this work is to describe the dosimetric methods and software developed to implement a departmental HDR QA program using personal digital assistants (PDAs). Verification of MammoSite treatment plans is presented as a practical example. PDAs that run the PalmOS were selected for their low cost and popularity among health care professionals. General-purpose applications that estimate the total dwell time of an HDR implant were developed for linear sources and for planar and volume implants. This value can then be compared to the total dwell time calculated by the primary treatment planning system. The software incorporates the Paterson-Parker (PP) radium tables and the Greenfield-Tichman-Norman (GTN) version of the Quimby radium tables, which have been modified to a form more convenient for HDR calculations. A special purpose application based on the AAPM TG43 formalism was developed for the MammoSite breast applicator. For QA calculations perpendicular to the center of a single Iridium-192 (192Ir) HDR source, as exemplified by MammoSite treatments, linearly interpolating the PP or GTN tables is equivalent to applying the TG43 formalism at distances up to 5 cm from the source axis. The MammoSite-specific software also offers the option to calculate dosimetry based on the balloon volume. The PDA clock/calendar permits the software to automatically account for source decay. The touch-sensitive screen allows the familiar tabular format to be maintained while minimizing the effort required for calculations. The PP and GTN radium implant tables are easily modified to a form
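
    The source-decay correction mentioned in the abstract is simple to sketch. The following is a hypothetical illustration: the calibration dwell time and the dates are invented, and only the 192Ir half-life is a physical constant.

```python
# Decay correction of the kind the PDA software automates: the total dwell
# time needed to deliver the same dose grows as the 192Ir source decays
# (half-life about 73.83 days). Numbers below are illustrative.
from datetime import date

IR192_HALF_LIFE_DAYS = 73.83

def corrected_dwell_time(dwell_at_cal_s, cal_date, treatment_date):
    """Scale a dwell time computed at source calibration to the treatment day."""
    elapsed_days = (treatment_date - cal_date).days
    return dwell_at_cal_s * 2.0 ** (elapsed_days / IR192_HALF_LIFE_DAYS)

t = corrected_dwell_time(300.0, date(2004, 1, 1), date(2004, 2, 14))
print(f"{t:.1f} s")  # about 50% longer, 44 days after calibration
```

A device clock makes this correction automatic, which is the convenience the abstract attributes to the PDA calendar.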

  3. Data quality assessment in the routine health information system: an application of the Lot Quality Assurance Sampling in Benin.

    PubMed

    Glèlè Ahanhanzo, Yolaine; Ouendo, Edgard-Marius; Kpozèhouen, Alphonse; Levêque, Alain; Makoutodé, Michel; Dramaix-Wilmet, Michèle

    2015-09-01

    Health information systems in developing countries are often faulted for the poor quality of the data generated and for the insufficient means implemented to improve system performance. This study examined data quality in the Routine Health Information System in Benin in 2012 and carried out a cross-sectional evaluation of the quality of the data using the Lot Quality Assurance Sampling method. The results confirm the insufficient quality of the data based on three criteria: completeness, reliability and accuracy. However, differences can be seen as the shortcomings are less significant for financial data and for immunization data. The method is simple, fast and can be proposed for current use at operational level as a data quality control tool during the production stage.

  4. Sediment laboratory quality-assurance project: studies of methods and materials

    USGS Publications Warehouse

    Gordon, J.D.; Newland, C.A.; Gray, J.R.

    2001-01-01

    In August 1996 the U.S. Geological Survey initiated the Sediment Laboratory Quality-Assurance project. The Sediment Laboratory Quality-Assurance project is part of the National Sediment Laboratory Quality-Assurance program. This paper addresses the findings of the sand/fine separation analysis completed for the single-blind reference sediment-sample project and differences in reported results between two different analytical procedures. From the results it is evident that an incomplete separation of fine- and sand-size material commonly occurs, resulting in the classification of some of the fine-size material as sand-size material. Electron microscopy analysis supported the hypothesis that the negative bias for fine-size material and the positive bias for sand-size material is largely due to aggregation of some of the fine-size material into sand-size particles and adherence of fine-size material to the sand-size grains. Electron microscopy analysis showed that preserved river water, which was low in dissolved solids and specific conductance and had a neutral pH, showed less aggregation and adhesion than preserved river water that was higher in dissolved solids and specific conductance with a basic pH. Bacteria were also found growing in the matrix, which may enhance fine-size material aggregation through their adhesive properties. Differences between sediment-analysis methods were also investigated as part of this study. Suspended-sediment concentration results obtained from one participating laboratory that used a total-suspended solids (TSS) method had greater variability and larger negative biases than results obtained when this laboratory used a suspended-sediment concentration method. When TSS methods were used to analyze the reference samples, the median suspended-sediment concentration percent difference was -18.04 percent. When the laboratory used a suspended-sediment concentration method, the median suspended-sediment concentration percent difference was -2

  5. Quality assurance guidance for field sampling and measurement assessment plates in support of EM environmental sampling and analysis activities

    SciTech Connect

    Not Available

    1994-05-01

    This document is one of several guidance documents developed by the US Department of Energy (DOE) Office of Environmental Restoration and Waste Management (EM). These documents support the EM Analytical Services Program (ASP) and are based on applicable regulatory requirements and DOE Orders. They address requirements in DOE Orders by providing guidance that pertains specifically to environmental restoration and waste management sampling and analysis activities. DOE 5700.6C Quality Assurance (QA) defines policy and requirements to establish QA programs ensuring that risks and environmental impacts are minimized and that safety, reliability, and performance are maximized. This is accomplished through the application of effective management systems commensurate with the risks imposed by the facility and the project. Every organization supporting EM's environmental sampling and analysis activities must develop and document a QA program. Management of each organization is responsible for appropriate QA program implementation, assessment, and improvement. The collection of credible and cost-effective environmental data is critical to the long-term success of remedial and waste management actions performed at DOE facilities. Only well established and management supported assessment programs within each EM-support organization will enable DOE to demonstrate data quality. The purpose of this series of documents is to offer specific guidance for establishing an effective assessment program for EM's environmental sampling and analysis (ESA) activities.

  6. Multitumor "sausage" blocks in immunohistochemistry. Simplified method of preparation, practical uses, and roles in quality assurance.

    PubMed

    Miller, R T; Groothuis, C L

    1991-08-01

    This report describes a simplified method for preparing multitumor sausage blocks (MTSBs) for use in immunohistochemical procedures. Rather than relying on previously processed paraffin blocks as a source of material, this procedure involves procuring tissue at the time of gross specimen examination. The tissue is processed along with routine surgical pathologic material, and the paraffinized samples are placed in "storage cassettes" for easy cataloging and storage. Thin strips are cut from the tissue in the "storage cassettes" and combined by dripping liquid paraffin onto them while they are rolled between the thumbs and forefingers, somewhat like making a cigarette. This results in a tissue "log." Transverse sections of the "log" are embedded in paraffin blocks and used as MTSBs. Practical uses of MTSBs are discussed, and their role in quality assurance is stressed.

  7. An assessment of Lot Quality Assurance Sampling to evaluate malaria outcome indicators: extending malaria indicator surveys.

    PubMed

    Biedron, Caitlin; Pagano, Marcello; Hedt, Bethany L; Kilian, Albert; Ratcliffe, Amy; Mabunda, Samuel; Valadez, Joseph J

    2010-02-01

    Large investments and increased global prioritization of malaria prevention and treatment have resulted in greater emphasis on programme monitoring and evaluation (M&E) in many countries. Many countries currently use large multistage cluster sample surveys to monitor malaria outcome indicators on a regional and national level. However, these surveys often mask local-level variability important to programme management. Lot Quality Assurance Sampling (LQAS) has played a valuable role for local-level programme M&E. If incorporated into these larger surveys, it would provide a comprehensive M&E plan at little, if any, extra cost. The Mozambique Ministry of Health conducted a Malaria Indicator Survey (MIS) in June and July 2007. We applied LQAS classification rules to the 345 sampled enumeration areas to demonstrate identifying high- and low-performing areas with respect to two malaria program indicators: 'household possession of any bednet' and 'household possession of any insecticide-treated bednet (ITN)'. As shown by the MIS, no province in Mozambique achieved the 70% coverage target for household possession of bednets or ITNs. By applying LQAS classification rules to the data, we identify 266 of the 345 enumeration areas as having bednet coverage severely below the 70% target. An additional 73 were identified with low ITN coverage. This article demonstrates the feasibility of integrating LQAS into multistage cluster sampling surveys and using these results to support a comprehensive national, regional and local programme M&E system. Furthermore, in the recommendations we outline how to integrate the Large Country-LQAS design into macro-surveys while still obtaining results available through current sampling practices.

  8. A Method for Evaluating Quality Assurance Needs in Radiation Therapy

    SciTech Connect

    Huq, M. Saiful Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Ibbott, Geoffrey S.; Medin, Paul M.; Mundt, Arno; Mutic, Sassa; Palta, Jatinder R.; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2008-05-01

    The increasing complexity of modern radiation therapy planning and delivery techniques challenges traditional prescriptive quality control and quality assurance programs that ensure safety and reliability of treatment planning and delivery systems under all clinical scenarios. Until now quality management (QM) guidelines published by concerned organizations (e.g., American Association of Physicists in Medicine [AAPM], European Society for Therapeutic Radiology and Oncology [ESTRO], International Atomic Energy Agency [IAEA]) have focused on monitoring functional performance of radiotherapy equipment by measurable parameters, with tolerances set at strict but achievable values. In the modern environment, however, the number and sophistication of possible tests and measurements have increased dramatically. There is a need to prioritize QM activities in a way that will strike a balance between being reasonably achievable and optimally beneficial to patients. A systematic understanding of possible errors over the course of a radiation therapy treatment and the potential clinical impact of each is needed to direct limited resources in such a way to produce maximal benefit to the quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and is developing a framework for designing QM activities, and hence allocating resources, based on estimates of clinical outcome, risk assessment, and failure modes. The report will provide guidelines on risk assessment approaches with emphasis on failure mode and effect analysis (FMEA) and an achievable QM program based on risk analysis. Examples of applying FMEA to intensity-modulated radiation therapy and high-dose-rate brachytherapy are presented. Recommendations on how to apply this new approach to individual clinics and further research and development will also be discussed.

  9. A method for evaluating quality assurance needs in radiation therapy.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Medin, Paul M; Mundt, Arno; Mutic, Sassa; Palta, Jatinder R; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2008-01-01

    The increasing complexity of modern radiation therapy planning and delivery techniques challenges traditional prescriptive quality control and quality assurance programs that ensure safety and reliability of treatment planning and delivery systems under all clinical scenarios. Until now quality management (QM) guidelines published by concerned organizations (e.g., American Association of Physicists in Medicine [AAPM], European Society for Therapeutic Radiology and Oncology [ESTRO], International Atomic Energy Agency [IAEA]) have focused on monitoring functional performance of radiotherapy equipment by measurable parameters, with tolerances set at strict but achievable values. In the modern environment, however, the number and sophistication of possible tests and measurements have increased dramatically. There is a need to prioritize QM activities in a way that will strike a balance between being reasonably achievable and optimally beneficial to patients. A systematic understanding of possible errors over the course of a radiation therapy treatment and the potential clinical impact of each is needed to direct limited resources in such a way to produce maximal benefit to the quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and is developing a framework for designing QM activities, and hence allocating resources, based on estimates of clinical outcome, risk assessment, and failure modes. The report will provide guidelines on risk assessment approaches with emphasis on failure mode and effect analysis (FMEA) and an achievable QM program based on risk analysis. Examples of applying FMEA to intensity-modulated radiation therapy and high-dose-rate brachytherapy are presented. Recommendations on how to apply this new approach to individual clinics and further research and development will also be discussed.
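
    The FMEA ranking at the heart of this risk-based approach can be sketched briefly. The failure modes and 1-10 scores below are invented examples, not TG-100 recommendations.

```python
# Minimal FMEA sketch: each potential failure mode gets 1-10 scores for
# occurrence (O), severity (S), and lack of detectability (D); the risk
# priority number RPN = O * S * D directs QM effort to the highest-ranked
# modes. Failure modes and scores are invented.
failure_modes = [
    # (description, O, S, D)
    ("wrong patient plan loaded", 2, 9, 4),
    ("MLC leaf position miscalibrated", 4, 6, 5),
    ("dose grid too coarse in optimization", 5, 4, 3),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, o, s, d in ranked:
    print(f"RPN {o * s * d:3d}  {desc}")
```

In this invented example the miscalibrated-MLC mode ranks first (RPN 120) despite not having the highest severity, which is the kind of prioritization the abstract argues for.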

  10. Use of Lot Quality Assurance Sampling to Ascertain Levels of Drug Resistant Tuberculosis in Western Kenya

    PubMed Central

    Cohen, Ted; Zignol, Matteo; Nyakan, Edwin; Hedt-Gauthier, Bethany L.; Gardner, Adrian; Kamle, Lydia; Injera, Wilfred; Carter, E. Jane

    2016-01-01

    Objective To classify the prevalence of multi-drug resistant tuberculosis (MDR-TB) in two different geographic settings in western Kenya using the Lot Quality Assurance Sampling (LQAS) methodology. Design The prevalence of drug resistance was classified among treatment-naïve smear positive TB patients in two settings, one rural and one urban. These regions were classified as having high or low prevalence of MDR-TB according to a static, two-way LQAS sampling plan selected to classify high resistance regions at greater than 5% resistance and low resistance regions at less than 1% resistance. Results This study classified both the urban and rural settings as having low levels of TB drug resistance. Out of the 105 patients screened in each setting, two patients were diagnosed with MDR-TB in the urban setting and one patient was diagnosed with MDR-TB in the rural setting. An additional 27 patients were diagnosed with a variety of mono- and poly-resistant strains. Conclusion Further drug resistance surveillance using LQAS may help identify the levels and geographical distribution of drug resistance in Kenya and may have applications in other countries in the African Region facing similar resource constraints. PMID:27167381
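
    The error probabilities of a two-way plan with n = 105 can be checked from binomial tails. This is a back-of-envelope sketch: the candidate decision values are assumptions, and the study's exact design parameters beyond n = 105 and the 1%/5% thresholds are not reproduced.

```python
# Two-way LQAS plan check: classify a region as high MDR-TB prevalence when
# at least d resistant cases are found among n = 105 patients. alpha is the
# risk of calling a truly 1% region "high"; beta is the risk of calling a
# truly 5% region "low". Candidate d values are illustrative.
from math import comb

def binom_cdf(k, n, p):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

n = 105
for d in (2, 3, 4):
    alpha = 1 - binom_cdf(d - 1, n, 0.01)  # misclassify a 1% region as high
    beta = binom_cdf(d - 1, n, 0.05)       # misclassify a 5% region as low
    print(f"d = {d}: alpha = {alpha:.3f}, beta = {beta:.3f}")
```

With d = 3 both error probabilities come out just under 10% at the two design points, which illustrates how such a plan trades off the two misclassification risks.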

  11. Equivalence of assurance Gold Enzyme Immunoassay for visual or instrumental detection of motile and nonmotile Salmonella in all foods to AOAC culture method: collaborative study.

    PubMed

    Feldsine, P T; Mui, L A; Forgey, R L; Kerr, D E

    2000-01-01

    Six foods representative of a wide variety of processed, dried powder processed, and raw food types were analyzed by the Assurance Gold Salmonella Enzyme Immunoassay (EIA) and the AOAC INTERNATIONAL culture method. Paired samples of each food type were simultaneously analyzed; one sample by the Assurance method and one by the AOAC culture method. The results for the Assurance method were read visually and instrumentally with a microplate reader. A total of 24 laboratories representing federal government agencies and private industry, in the United States and Canada, participated in this collaborative study. Food types were inoculated with species of Salmonella with the exception of raw ground chicken, which was naturally contaminated. No statistical differences (p < 0.05) were observed between Assurance Gold Salmonella EIA with either visual or instrumental interpretation and the AOAC culture method for any inoculation level of any food type or naturally contaminated food. The Assurance visual and instrumental options of reading sample reactions produced the same results for 1277 of the 1296 samples and controls analyzed.

  12. Experience with Formal Methods techniques at the Jet Propulsion Laboratory from a quality assurance perspective

    NASA Technical Reports Server (NTRS)

    Kelly, John C.; Covington, Rick

    1993-01-01

    Recent experience with Formal Methods (FM) in the Software Quality Assurance Section at the Jet Propulsion Lab is presented. An integrated Formal Method process is presented to show how related existing requirements analysis and FM techniques complement one another. Example application of FM techniques such as formal specifications and specification animators are presented. The authors suggest that the quality assurance organization is a natural home for the Formal Methods specialist, whose expertise can then be used to best advantage across a range of projects.

  14. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of 0.91 to 0.92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about 0.99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.

  15. Transuranic waste characterization sampling and analysis methods manual

    SciTech Connect

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  16. Quality Assurance Program Plan for the Waste Sampling and Characterization Facility

    SciTech Connect

    Grabbe, R.R.

    1995-03-02

    The objective of this Quality Assurance Plan is to provide quality assurance (QA) guidance, implementation of regulatory QA requirements, and quality control (QC) specifications for analytical services. This document follows the Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP) and additional federal [10 Code of Federal Regulations (CFR) 830.120] QA requirements that HASQAP does not cover. This document describes how the laboratory implements QA requirements to meet federal and state requirements, specifies the default QC specifications, and identifies the procedural information that governs how the laboratory operates. In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. This document also covers QA elements that are required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Product Plans (QAMS-005) from the Environmental Protection Agency (EPA). A QA index is provided in Appendix A.

  17. Formulating priorities for quality assurance activity. Description of a method and its application.

    PubMed

    Williamson, J W

    1978-02-13

    Quality assurance activity seems to have had little documented impact in terms of improving patient health or reducing care costs. One reason may be the lack of a practical and effective decision process for selecting priority areas where improvement of health or any other target outcome will most likely be achieved. This article describes a structured procedure for meeting this need. In addition, results of 14 years of quality assurance experience with structured and nonstructured topic selection procedures in 23 multispecialty group clinics and their associated hospitals are briefly reviewed. On the basis of this experience it is suggested that this priority method is both feasible and practical and can be recommended for application to most quality assurance systems. It is especially suited for planning medical care evaluation studies of the Professional Standards Review Organizations or the performance evaluation projects of the Joint Commission on the Accreditation of Hospitals.

  18. Statistical considerations for plot design, sampling procedures, analysis, and quality assurance of ozone injury studies

    Treesearch

    Michael Arbaugh; Larry Bednar

    1996-01-01

    The sampling methods used to monitor ozone injury to ponderosa and Jeffrey pines depend on the objectives of the study, geographic and genetic composition of the forest, and the source and composition of air pollutant emissions. By using a standardized sampling methodology, it may be possible to compare conditions within local areas more accurately, and to apply the...

  19. Kansas's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; W. Keith Moser; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Kansas's forests was completed in 2005 after 8,868 plots were selected and 468 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of Kansas inventory is presented...

  20. South Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Ronald J. Piva; Charles J. Barnett

    2011-01-01

    The first full annual inventory of South Dakota's forests was completed in 2005 after 8,302 plots were selected and 325 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the South Dakota...

  1. Nebraska's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; Dacia M. Meneguzzo; Charles J. Barnett

    2011-01-01

    The first full annual inventory of Nebraska's forests was completed in 2005 after 8,335 plots were selected and 274 forested plots were visited and measured. This report includes detailed information on forest inventory methods, and data quality estimates. Tables of various important resource statistics are presented. Detailed analysis of the inventory data are...

  2. North Dakota's forests, 2005: statistics, methods, and quality assurance

    Treesearch

    Patrick D. Miles; David E. Haugen; Charles J. Barnett

    2011-01-01

    The first full annual inventory of North Dakota's forests was completed in 2005 after 7,622 plots were selected and 164 forested plots were visited and measured. This report includes detailed information on forest inventory methods and data quality estimates. Important resource statistics are included in the tables. A detailed analysis of the North Dakota...

  3. Quality assurance flood source and method of making

    DOEpatents

    Fisher, Darrell R [Richland, WA; Alexander, David L [West Richland, WA; Satz, Stanley [Surfside, FL

    2002-12-03

    Disclosed is an improved flood source, and a method of making the same, which emits an evenly distributed flow of energy from a gamma-emitting radionuclide dispersed throughout the volume of the flood source. The flood source is formed by filling a bottom pan with a mix of epoxy resin and cobalt-57, preferably at 10 to 20 millicuries, and then adding a hardener. The pan is secured to a flat, level surface to prevent the pan from warping and to act as a heat sink for removal of heat from the pan during the curing of the resin-hardener mixture.

  4. Quality assurance manual plutonium liquid scintillation methods and procedures

    SciTech Connect

    Romero, L.

    1997-01-01

    Nose swipe analysis is a very important tool for Radiation Protection personnel. It is a fast and accurate method for (1) determining whether a worker has been exposed to airborne plutonium contamination and (2) identifying the area where there has been a possible plutonium release. Liquid scintillation analysis techniques have been effectively applied to accurately determine the plutonium alpha activity on nose swipe media. Whatman-40 paper and Q-Tips are the only two media that have been evaluated and can be used for nose swipe analysis. Presently, only Q-Tips are used by Group HSE-1 Radiation Protection personnel. However, both swipe media will be discussed in this report.

  5. Sampling methods for phlebotomine sandflies.

    PubMed

    Alexander, B

    2000-06-01

    A review is presented of methods for sampling phlebotomine sandflies (Diptera: Psychodidae). Among approximately 500 species of Phlebotominae so far described, mostly in the New World genus Lutzomyia and the Old World genus Phlebotomus, about 10% are known vectors of Leishmania parasites or other pathogens. Despite being small and fragile, sandflies have a wide geographical range with species occupying a considerable diversity of ecotopes and habitats, from deserts to humid forests, so that suitable methods for collecting them are influenced by environmental conditions where they are sought. Because immature phlebotomines occupy obscure terrestrial habitats, it is difficult to find their breeding sites. Therefore, most trapping methods and sampling procedures focus on sandfly adults, whether resting or active. The diurnal resting sites of adult sandflies include tree holes, buttress roots, rock crevices, houses, animal shelters and burrows, from which they may be aspirated directly or trapped after being disturbed. Sandflies can be collected during their periods of activity by interception traps, or by using attractants such as bait animals, CO2 or light. The method of trapping used should: (a) be suited to the habitat and area to be surveyed, (b) take into account the segment of the sandfly population to be sampled (species, sex and reproduction condition) and (c) yield specimens of appropriate condition for the study objectives (e.g. identification of species present, population genetics or vector implication). Methods for preservation and transportation of sandflies to the laboratory also depend on the objectives of a particular study and are described accordingly.

  6. Elaborating transition interface sampling methods

    SciTech Connect

    van Erp, Titus S.; Bolhuis, Peter G.

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.

  7. Elaborating transition interface sampling methods

    NASA Astrophysics Data System (ADS)

    van Erp, Titus S.; Bolhuis, Peter G.

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.

  8. A Survey of Software Quality Assurance Methods and an Evaluation of Software Quality Assurance at Fleet Material Support Office.

    DTIC Science & Technology

    1982-12-01

    ... in Information Systems from the Naval Postgraduate School, December 1982. ... within their company structure will also be discussed. In Chapter 2, the authors will list and identify current trends and the state of the art ... The cost of a quality assurance function is very difficult to estimate or even measure. William Perry, an author of extensive material ...

  9. Quality assurance

    SciTech Connect

    Gillespie, B.M.; Gleckler, B.P.

    1995-06-01

    This section of the 1994 Hanford Site Environmental Report summarizes the quality assurance and quality control practices of Hanford Site environmental monitoring and surveillance programs. Samples are analyzed according to documented standard analytical procedures. This section discusses specific measures taken to ensure quality in project management, sample collection, and analytical results.

  10. Sampling Plan Development in Support of DLA’s Quality Assurance Laboratory Testing Program

    DTIC Science & Technology

    1991-09-01

    Objective 2 states "Develop and implement initiatives for continuously improving the quality of products and services delivered to our customers." Task 6 of ... of products provided to the military services, the Defense Logistics Agency (DLA) embarked on a comprehensive plan for enhancing its Quality Assurance ... technically sound and appropriate for supporting the DoDIG's audit recommendation for laboratory testing.

  11. Asbestos-containing materials in school buildings: Bulk-sample analysis quality-assurance program. Bulk sample rounds 16, 17 and 18

    SciTech Connect

    Starner, K.K.; Perkins, R.L.; Harvey, B.W.; Westbrook, S.H.

    1990-02-01

    The report presents the performance results of laboratories participating in the sixteenth, seventeenth and eighteenth rounds of the Bulk Sample Analysis Quality Assurance Program sponsored by the United States Environmental Protection Agency, (EPA). Round 16 of the program operated along the guidelines established in previous rounds and was a voluntary quality assurance program. The Asbestos Hazard Emergency Response Act of 1986 (AHERA), directed the National Institute of Standards and Technology (NIST) to establish and maintain a laboratory accreditation program for bulk sample analysis of asbestos. The program began in October 1988 by evaluating enrolled polariscope laboratories in the interim prior to the initiation of the National Voluntary Laboratory Accreditation Program (NVLAP) for bulk asbestos laboratories, sponsored by NIST.

  12. A method for critical software event execution reliability in high assurance systems

    SciTech Connect

    Kidd, M.E.C.

    1997-03-01

    This paper presents a method for Critical Software Event Execution Reliability (Critical SEER). The Critical SEER method is intended for high assurance software that operates in an environment where transient upsets could occur, causing a disturbance of the critical software event execution order, which could cause safety or security hazards. The method has a finite-automata-based module that watches (hence SEER) and tracks the critical events and ensures they occur in the proper order, or else a fail-safe state is forced. This method is applied during the analysis, design and implementation phases of software engineering.
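The order-watching automaton described above can be sketched in a few lines. The class name, event names, and return values below are illustrative assumptions, not the paper's implementation:

```python
# Minimal sketch of an order-enforcing event watcher in the spirit of
# Critical SEER (names and API are illustrative, not from the paper).
class EventOrderSeer:
    """Watches a stream of critical events and forces a fail-safe
    state if any event arrives out of the required order."""

    def __init__(self, required_order):
        self.required = list(required_order)
        self.state = 0          # index of the next expected event
        self.fail_safe = False

    def observe(self, event):
        if self.fail_safe:
            return "FAIL_SAFE"
        if self.state >= len(self.required):
            return "DONE"       # sequence already completed
        if event == self.required[self.state]:
            self.state += 1
            return "DONE" if self.state == len(self.required) else "OK"
        self.fail_safe = True   # out-of-order event: force fail-safe state
        return "FAIL_SAFE"

# Events in the proper order complete normally.
seer = EventOrderSeer(["arm", "verify", "fire"])
assert [seer.observe(e) for e in ["arm", "verify", "fire"]] == ["OK", "OK", "DONE"]

# A transient upset that reorders events latches the fail-safe state.
seer2 = EventOrderSeer(["arm", "verify", "fire"])
assert seer2.observe("fire") == "FAIL_SAFE"
```

Once the fail-safe state latches, every subsequent event is refused, mirroring the paper's requirement that a disturbed execution order forces a safe halt rather than continued operation.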

  13. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, David R.

    1998-01-01

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis.

  14. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, D.R.

    1998-02-03

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis. 3 figs.

  15. HPV DNA testing of the residual sample of liquid-based Pap test: utility as a quality assurance monitor.

    PubMed

    Zuna, R E; Moore, W; Dunn, S T

    2001-03-01

    HPV DNA testing of the residual sample volume of liquid-based Pap tests has been recommended as a way to determine the appropriate follow-up for women who have equivocal results in routine clinical screening. A major aspect of quality assurance in the cytopathology laboratory consists of correlation of smear interpretation with biopsy or conization results as mandated by CLIA '88. However, the use of histology as the gold standard suffers from similar problems of subjectivity and sampling as the Pap smear. In this study we explore the potential use of HPV DNA testing of the residual volume from the ThinPrep Pap Test (Cytyc Corporation, Boxborough, Massachusetts) as a substitute gold standard in quality assurance monitoring of a cervical cytology screening program. The residual samples from 397 ThinPrep Pap cases were retrospectively analyzed for high-risk HPV DNA using the Hybrid Capture II technique. Sensitivity (71.8%), specificity (86.5%), and predictive values of positive (77.1%) and negative (82.9%) ThinPrep Pap interpretations were calculated on the basis of HPV DNA results for 266 cases classed as either squamous intraepithelial lesion (SIL) or negative. Overall, there was agreement between the two tests in 80.8% of cases (Cohen's kappa = 0.59). The percentage of HPV DNA-positive cases interpreted as atypical squamous cells of uncertain significance (ASCUS) was 43.7%, and the percentage of negative cases was 17.1%. We believe that this approach is an objective adjunct to the traditional quality assurance protocol, with the added benefit that it includes cases interpreted as negative, as well as abnormal cases that do not come to biopsy.
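The screening indices reported above all follow from a standard 2 x 2 table of test result against gold standard. A minimal sketch follows; the cell counts used in the example are a hypothetical reconstruction chosen to be consistent with the reported percentages, not data published in the paper:

```python
def screening_indices(tp, fn, fp, tn):
    """Sensitivity, specificity, PPV, NPV, observed agreement and
    Cohen's kappa for a 2x2 table of test result vs gold standard."""
    n = tp + fn + fp + tn
    sens = tp / (tp + fn)                 # true positives / gold positives
    spec = tn / (tn + fp)                 # true negatives / gold negatives
    ppv = tp / (tp + fp)                  # predictive value of a positive test
    npv = tn / (tn + fn)                  # predictive value of a negative test
    po = (tp + tn) / n                    # observed agreement
    # chance agreement: product of marginal proportions, summed over classes
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, po, kappa

# Hypothetical counts consistent with the reported indices (n = 266).
sens, spec, ppv, npv, po, kappa = screening_indices(tp=74, fn=29, fp=22, tn=141)
```

Running this with the reconstructed counts reproduces the abstract's figures: sensitivity 71.8%, specificity 86.5%, PPV 77.1%, NPV 82.9%, agreement 80.8% and kappa 0.59.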

  16. Methods for Quality-Assurance Review of Water-Quality Data in New Jersey

    DTIC Science & Technology

    2003-01-01

    ... traditional measure of the effect of an organic waste load on the oxygen levels of a receiving body of water. In other words, it is a measure of ... U.S. Department of the Interior, U.S. Geological Survey, Methods for Quality-Assurance Review of Water-Quality Data in New Jersey ...

  17. Direct method for second-order sensitivity analysis of modal assurance criterion

    NASA Astrophysics Data System (ADS)

    Lei, Sheng; Mao, Kuanmin; Li, Li; Xiao, Weiwei; Li, Bin

    2016-08-01

    A Lagrange direct method is proposed to calculate the second-order sensitivity of modal assurance criterion (MAC) values of undamped systems. The eigenvalue problem and the normalizations of the eigenvectors, augmented by Lagrange multipliers, are used as the constraints of the Lagrange functional. Once the Lagrange multipliers are determined, the sensitivities of the MAC values can be evaluated directly. The Lagrange direct method is accurate, efficient and easy to implement. A simply supported beam is utilized to check the accuracy of the proposed method. A frame is adopted to validate the predictive capacity of the first- and second-order sensitivities of MAC values. It is shown that the computational cost of the proposed method can be remarkably reduced in comparison with that of the indirect method, without loss of accuracy.
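The modal assurance criterion whose sensitivities the paper differentiates is itself a simple normalized inner product of two mode-shape vectors. A minimal sketch of the MAC value itself (not the paper's Lagrange direct sensitivity method):

```python
import numpy as np

def mac(phi_i, phi_j):
    """Modal assurance criterion between two real mode-shape vectors:
    MAC = |phi_i^T phi_j|^2 / ((phi_i^T phi_i) * (phi_j^T phi_j)).
    Returns 1.0 for perfectly correlated shapes, 0.0 for orthogonal ones."""
    num = np.dot(phi_i, phi_j) ** 2
    den = np.dot(phi_i, phi_i) * np.dot(phi_j, phi_j)
    return num / den

# MAC is scale-invariant: a rescaled copy of a shape still gives 1.0.
phi = np.array([1.0, 2.0, 3.0])
assert np.isclose(mac(phi, 2.5 * phi), 1.0)

# Orthogonal shapes give 0.0.
assert np.isclose(mac(np.array([1.0, 0.0]), np.array([0.0, 1.0])), 0.0)
```

Because MAC depends on the eigenvectors, which in turn depend on the design parameters, its derivatives require eigenvector sensitivities; that chain is what the Lagrange direct method evaluates without computing eigenvector derivatives explicitly.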

  18. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series, serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. The second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. The first and second containers are then separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  19. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries.

    PubMed

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W

    2014-11-01

    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus.

  20. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.

  1. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  2. A method to enhance 2D ion chamber array patient specific quality assurance for IMRT.

    PubMed

    Diaz Moreno, Rogelio Manuel; Venencia, Daniel; Garrigo, Edgardo; Pipman, Yakov

    2016-11-21

    Gamma index comparison has been established as a method for patient-specific quality assurance in IMRT. Detector arrays can replace radiographic film systems to record 2D dose distributions and fulfill quality assurance requirements. These electronic devices have spatial resolution disadvantages with respect to films. This handicap can be partially overcome with a multiple-acquisition sequence of adjacent 2D dose distributions. The influence of the detector spatial response can also be taken into account by convolving the calculated dose with the detector spatial response. A methodology that employs both approaches could allow for enhancements of the quality assurance procedure. Thirty-five beams from different step-and-shoot IMRT plans were delivered on a phantom. 2D dose distributions were recorded with a PTW-729 ion chamber array for individual beams, following the multiple-acquisition methodology. 2D dose distributions were also recorded on radiographic films. Dose distributions measured with films and with the PTW-729 array were processed with the software RITv5.2 for Gamma index comparison with calculated doses. The calculated dose was also convolved with the ion chamber 2D response, and the Gamma index comparison with the 2D dose distribution measured with the PTW-729 array was repeated. With radiographic films compared against the calculated dose, 3.7 ± 2.7% of points surpassed the accepted Gamma index, with a minimum of 0.67 and a maximum of 13.27. With the PTW-729 multiple-acquisition methodology compared against the calculated dose, 4.1 ± 1.3% of points surpassed the accepted Gamma index, with a minimum of 1.44 and a maximum of 11.26. With the PTW-729 multiple-acquisition methodology compared against the convolved calculated dose, 2.7 ± 1.3% of points surpassed the accepted Gamma index, with a minimum of 0.42 and a maximum of 5.75. The results obtained in this work suggest that comparing merged adjacent dose distributions with the convolved calculated dose enhances patient-specific quality assurance for IMRT.
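The Gamma index comparison underlying this abstract can be sketched as a brute-force search over the evaluated distribution for each reference point. The 3%/3 mm criteria below are common clinical defaults assumed for illustration, not values stated in the abstract:

```python
import numpy as np

def gamma_index(dose_eval, dose_ref, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
    """Brute-force global gamma index on two matching 2D dose grids.

    dd:     dose-difference criterion, fraction of the reference maximum
    dta_mm: distance-to-agreement criterion in mm
    Returns the gamma map; points with gamma <= 1 pass the test.
    """
    ny, nx = dose_ref.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    dmax = dose_ref.max()
    gamma = np.empty_like(dose_ref, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            # squared distance (mm^2) from every evaluated point
            dist2 = ((ys - iy) ** 2 + (xs - ix) ** 2) * spacing_mm ** 2
            # squared dose difference to the reference point
            ddiff2 = (dose_eval - dose_ref[iy, ix]) ** 2
            g2 = dist2 / dta_mm ** 2 + ddiff2 / (dd * dmax) ** 2
            gamma[iy, ix] = np.sqrt(g2.min())
    return gamma

# Identical distributions pass everywhere with gamma = 0.
ref = np.outer(np.hanning(20), np.hanning(20))
assert np.allclose(gamma_index(ref.copy(), ref), 0.0)
```

The percentage of points with gamma above 1 corresponds to the "points surpassing the accepted Gamma index" reported in the abstract; convolving the calculated dose with the detector response before this comparison is what reduced that percentage in the study.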

  3. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample is disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds and minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.

  4. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample is disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds and minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.

  5. Private sector delivery of health services in developing countries: a mixed-methods study on quality assurance in social franchises

    PubMed Central

    2013-01-01

    Background Across the developing world, health care services are most often delivered in the private sector, and social franchising has emerged, over the past decade, as an increasingly popular method of private sector health care delivery. Social franchising aims to strengthen business practices through economies of scale: branding clinics and purchasing drugs in bulk at wholesale prices. While quality is one of the established goals of social franchising, there is no published documentation of how quality levels might be set in the context of franchised private providers, nor what quality assurance measures can or should exist within social franchises. The aim of this study was to better understand the quality assurance systems currently utilized in social franchises, and to determine if there are shared standards for practice or quality outcomes that exist across programs. Methods The study included three data sources and levels of investigation: 1) Self-reported program data; 2) Scoping telephone interviews; and 3) In-depth field interviews and clinic visits. Results Social franchises conceive of quality assurance not as an independent activity, but rather as a goal that is incorporated into all areas of franchise operations, including recruitment, training, monitoring of provider performance, monitoring of client experience and the provision of feedback. Conclusions These findings are the first evidence to support the 2002 conceptual model of social franchising, which proposed that the assurance of quality was one of the three core goals of all social franchises. However, while quality is important to franchise programs, quality assurance systems overall are not reflective of the evidence to date on quality measurement or quality improvement best practices. Future research in this area is needed to better understand the details of quality assurance systems as applied in social franchise programs, the process by which quality assurance becomes a part of the

  6. Field Sampling Plan/Quality Assurance Project Plan Volume I of III

    EPA Pesticide Factsheets

    This document contains procedures related to the collection and analysis of soil, sediment, groundwater, surface water, air and biota samples at GE’s Pittsfield, Massachusetts facility and at other areas.

  7. Quality assurance of monoclonal antibody pharmaceuticals based on their charge variants using microchip isoelectric focusing method.

    PubMed

    Kinoshita, Mitsuhiro; Nakatsuji, Yuki; Suzuki, Shigeo; Hayakawa, Takao; Kakehi, Kazuaki

    2013-09-27

    Monoclonal antibody (mAb) pharmaceuticals are much more complex than small-molecule drugs, and this complexity raises challenging questions for regulatory evaluation. Although heterogeneity in mAbs based on their charge variants has mainly been evaluated using gel-based isoelectric focusing (IEF) methods, recent developments in capillary electrophoresis and microchip electrophoresis have made it possible to assess this heterogeneity in an easier and more rapid manner. In the present paper, we customized imaged microchip isoelectric focusing (mIEF) for the analysis of mAbs and compared the customized version with the conventional capillary isoelectric focusing (cIEF) method, finding that mIEF is considerably easier to operate while offering resolving power comparable to that of cIEF. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Assuring Quality in Education Evaluation.

    ERIC Educational Resources Information Center

    Trochim, William M. K.; Visco, Ronald J.

    1986-01-01

    A number of quality assurance educational evaluation methods are illustrated. Evaluation data obtained from the Providence, Rhode Island, school district are used. The methods are: (1) from auditing, internal control; (2) from accounting, double bookkeeping; and (3) from industrial quality control, acceptance sampling and cumulative percentage…

  10. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small fraction of high-degree vertices can carry most of the structural information of a complex network. Both proposed methods sample high-degree nodes efficiently, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first method is developed on the basis of the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new methods, we compare them with existing sampling methods on three commonly used synthetic networks (scale-free, random, and small-world) and on two real networks. The experimental results show that the two proposed methods recover the true network structure characteristics, as reflected by the clustering coefficient, Bonacich centrality, and average path length, much better than the existing methods, especially when the sampling rate is low.
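    The degree-first idea behind the improved snowball method can be illustrated with a toy sketch (this is not the authors' algorithm; the graph, function name, and budget are invented for illustration):

```python
import heapq

def degree_biased_snowball(graph, seed, budget):
    """Snowball sampling that always expands the highest-degree frontier
    node first; a toy sketch of degree-prioritized sampling, not the
    paper's algorithm."""
    sampled = set()
    frontier = [(-len(graph[seed]), seed)]  # max-heap via negated degree
    seen = {seed}
    while frontier and len(sampled) < budget:
        _, node = heapq.heappop(frontier)
        sampled.add(node)
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                heapq.heappush(frontier, (-len(graph[nbr]), nbr))
    return sampled

# Toy adjacency-list graph in which node 0 is the hub.
g = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}
sample = degree_biased_snowball(g, seed=1, budget=3)  # the hub is reached early
```

    Even starting from a peripheral node, the hub enters the sample as soon as it appears on the frontier, which is the property that makes low sampling rates viable.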

  11. AOAC Official MethodSM Matrix Extension Validation Study of Assurance GDSTM for the Detection of Salmonella in Selected Spices.

    PubMed

    Feldsine, Philip; Kaur, Mandeep; Shah, Khyati; Immerman, Amy; Jucker, Markus; Lienau, Andrew

    2015-01-01

    Assurance GDSTM for Salmonella Tq has been validated according to the AOAC INTERNATIONAL Methods Committee Guidelines for Validation of Microbiological Methods for the detection of Salmonella in selected foods and on environmental surfaces (Official Method of AnalysisSM 2009.03, Performance Tested MethodSM No. 050602). The method also completed AFNOR validation (following the ISO 16140 standard) compared to the reference method EN ISO 6579. For AFNOR, GDS was given a scope covering all human food, animal feed stuff, and environmental surfaces (Certificate No. TRA02/12-01/09). Results showed that Assurance GDS for Salmonella (GDS) has high sensitivity and is equivalent to the reference culture methods for the detection of motile and non-motile Salmonella. As part of the aforementioned validations, inclusivity, exclusivity, stability, and ruggedness studies were also conducted. Assurance GDS has 100% inclusivity and exclusivity among the 100 Salmonella serovars and 35 non-Salmonella organisms analyzed. To add to the scope of the Assurance GDS for Salmonella method, a matrix extension study was conducted, following the AOAC guidelines, to validate the application of the method to selected spices, specifically curry powder, cumin powder, and chili powder, for the detection of Salmonella.

  12. Storm Event Sampling in the Sinclair and Dyes Inlet Watershed: FY2005 Quality Assurance Project Plan

    DTIC Science & Technology

    2005-01-18

    Daily CVAA maintenance: soda lime, check and change (checked daily, changed weekly); reagents (SnCl, 3% HNO3, rinse water), check and change (checked daily) ... the mouth of Gorst Creek (GC-M) will only be sampled at low tide, if possible ... Station (P1, P2, and P3), along the Port Orchard waterfront (BJ-EST, SN12), the mouth of the Port Washington Narrows (DY01), ambient waters in the middle

  13. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
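    A minimal nested-sampling loop makes the "central problem" concrete: at each step a new point must be drawn from the prior restricted to likelihoods above the current threshold. The sketch below uses naive rejection sampling on a toy 1-D Gaussian likelihood with a uniform prior; all numerical choices are illustrative assumptions, not taken from the paper:

```python
import math
import random

def loglike(x):
    # Toy Gaussian log-likelihood centred at 0.5 (illustrative only)
    return -0.5 * ((x - 0.5) / 0.1) ** 2

def nested_sampling(n_live=50, n_iter=200, seed=1):
    rng = random.Random(seed)
    live = [rng.random() for _ in range(n_live)]   # uniform prior on [0, 1]
    log_weights = []
    for i in range(n_iter):
        worst = min(live, key=loglike)
        l_star = loglike(worst)
        # prior mass shrinks geometrically: X_i ~ exp(-i / n_live),
        # so each shell carries roughly X_i / n_live of prior mass
        log_weights.append(l_star - i / n_live - math.log(n_live))
        # the "central problem": draw from the prior restricted to L > L*
        while True:                                # naive rejection sampling
            x = rng.random()
            if loglike(x) > l_star:
                break
        live[live.index(worst)] = x
    # log-evidence via log-sum-exp over the shell weights
    m = max(log_weights)
    return m + math.log(sum(math.exp(w - m) for w in log_weights))

logz = nested_sampling()   # true value is about log(0.1 * sqrt(2*pi)) ~ -1.38
```

    Rejection sampling becomes exponentially expensive as the restricted region shrinks, which is exactly why more sophisticated prior-sampling schemes, such as those developed in this work, are needed.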

  14. On the use of certified reference materials for assuring the quality of results for the determination of mercury in environmental samples.

    PubMed

    Bulska, Ewa; Krata, Agnieszka; Kałabun, Mateusz; Wojciechowski, Marcin

    2017-03-01

    This work focused on the development and validation of methodologies for the accurate determination of mercury in environmental samples and its further application for the preparation and certification of new reference materials (RMs). Two certified RMs, ERM-CC580 (inorganic matrix) and ERM-CE464 (organic matrix), were used for the evaluation of digestion conditions assuring the quantitative recovery of mercury. These conditions were then used for the digestion of new candidates for the environmental RMs: bottom sediment (M_2 BotSed), herring tissue (M_3 HerTis), cormorant tissue (M_4 CormTis), and codfish muscle (M_5 CodTis). Cold vapor atomic absorption spectrometry (CV AAS) and inductively coupled plasma mass spectrometry (ICP MS) were used for the measurement of mercury concentration in all RMs. In order to validate and assure the accuracy of results, isotope dilution mass spectrometry (IDMS) was applied as a primary method of measurement, assuring the traceability of obtained values to the SI units: the mole, the kilogram, and the second. Results obtained by IDMS using the n((200)Hg)/n((202)Hg) ratio, with estimated combined uncertainty, were as follows: (916 ± 41)/[4.5 %] ng g(-1) (M_2 BotSed), (236 ± 14)/[5.9 %] ng g(-1) (M_3 HerTis), (2252 ± 54)/[2.4 %] ng g(-1) (M_4 CormTis), and (303 ± 15)/[4.9 %] ng g(-1) (M_5 CodTis), respectively. Different detection techniques and quantification strategies (external calibration, standard addition, isotope dilution) were applied in order to improve the quality of the analytical results. The good agreement (within less than 2.5 %) between obtained results and those derived from the Inter-laboratory Comparison, executed by the Institute of Nuclear Chemistry and Technology (Warsaw, Poland) on the same sample matrices, further validated the analytical procedures developed in this study, as well as the concentration of mercury in all four new RMs. Although the developed protocol enabling the metrological

  15. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India

    PubMed Central

    Piot, Bram; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal

    2010-01-01

    Objectives This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. Methods The lot quality assurance sampling (LQAS) method was used for the assessment of the geographical coverage and quality of coverage of condoms in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. Results A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. Conclusion LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets. PMID:20167732
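    The classification logic behind LQAS can be sketched numerically: for a sample of n units and a decision value d, the binomial distribution gives both misclassification risks, and one can scan for the smallest plan meeting an error target. The 20%/40% thresholds, the 10% target, and the function names below are illustrative assumptions, not parameters from this study:

```python
from math import comb

def binom_cdf(k, n, p):
    # P(X <= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_errors(n, d, p_low=0.20, p_high=0.40):
    """For the rule 'classify HIGH if more than d positives among n':
    alpha = P(classify HIGH | true prevalence p_low),
    beta  = P(classify LOW  | true prevalence p_high)."""
    alpha = 1 - binom_cdf(d, n, p_low)
    beta = binom_cdf(d, n, p_high)
    return alpha, beta

def smallest_plan(max_n=100, target=0.10):
    # Scan for the smallest sample size n (with some decision value d)
    # that keeps both misclassification risks at or below the target.
    for n in range(1, max_n + 1):
        for d in range(n + 1):
            alpha, beta = lqas_errors(n, d)
            if alpha <= target and beta <= target:
                return n, d, alpha, beta
    return None

plan = smallest_plan()
```

    The small sample sizes such a scan produces are what make LQAS attractive for routine monitoring with simple field procedures.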

  16. A Method of Separation Assurance for Instrument Flight Procedures at Non-Radar Airports

    NASA Technical Reports Server (NTRS)

    Conway, Sheila R.; Consiglio, Maria

    2002-01-01

    A method to provide automated air traffic separation assurance services during approach to or departure from a non-radar, non-towered airport environment is described. The method is constrained by provision of these services without radical changes or ambitious investments in current ground-based technologies. The proposed procedures are designed to grant access to a large number of airfields that currently have no or very limited access under Instrument Flight Rules (IFR), thus increasing mobility with minimal infrastructure investment. This paper primarily addresses a low-cost option for airport and instrument approach infrastructure, but is designed to be an architecture from which a more efficient, albeit more complex, system may be developed. A functional description of the capabilities in the current NAS infrastructure is provided. Automated terminal operations and procedures are introduced. Rules of engagement and the operations are defined. Results of preliminary simulation testing are presented. Finally, application of the method to more terminal-like operations, and major research areas, including necessary piloted studies, are discussed.

  17. Subrandom methods for multidimensional nonuniform sampling

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
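    A seed-free subrandom schedule of the kind discussed can be generated from the additive golden-ratio (low-discrepancy) sequence. This unweighted 1-D sketch is a simplification for illustration; real NMR schedules are weighted and often multidimensional, and the grid size here is arbitrary:

```python
def subrandom_schedule(grid_size, n_points):
    """Seed-independent 1-D nonuniform sampling schedule drawn from the
    additive golden-ratio sequence (an unweighted, illustrative sketch)."""
    phi_conj = 0.6180339887498949   # golden-ratio conjugate, 1/phi
    picked = []
    x = 0.0
    while len(picked) < n_points:
        x = (x + phi_conj) % 1.0    # irrational rotation on [0, 1)
        idx = int(x * grid_size)
        if idx not in picked:       # skip collisions on the Nyquist grid
            picked.append(idx)
    return sorted(picked)

schedule = subrandom_schedule(grid_size=64, n_points=16)
```

    Because the sequence is deterministic, the same schedule is produced on every run, removing the seed-number dependence that complicates pseudorandom schedules.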

  18. Lot quality assurance sampling for monitoring coverage and quality of a targeted condom social marketing programme in traditional and non-traditional outlets in India.

    PubMed

    Piot, Bram; Mukherjee, Amajit; Navin, Deepa; Krishnan, Nattu; Bhardwaj, Ashish; Sharma, Vivek; Marjara, Pritpal

    2010-02-01

    This study reports on the results of a large-scale targeted condom social marketing campaign in and around areas where female sex workers are present. The paper also describes the method that was used for the routine monitoring of condom availability in these sites. The lot quality assurance sampling (LQAS) method was used for the assessment of the geographical coverage and quality of coverage of condoms in target areas in four states and along selected national highways in India, as part of Avahan, the India AIDS initiative. A significant general increase in condom availability was observed in the intervention area between 2005 and 2008. High coverage rates were gradually achieved through an extensive network of pharmacies and particularly of non-traditional outlets, whereas traditional outlets were instrumental in providing large volumes of condoms. LQAS is seen as a valuable tool for the routine monitoring of the geographical coverage and of the quality of delivery systems of condoms and of health products and services in general. With a relatively small sample size, easy data collection procedures and simple analytical methods, it was possible to inform decision-makers regularly on progress towards coverage targets.

  19. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.

  20. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.
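    The digital discrimination step these two patent records describe might look, in highly simplified form, like a threshold-based run detector over the digitized trace; the trace values and threshold below are invented for illustration:

```python
def discriminate(samples, threshold):
    """Keep only runs of digitized samples that rise above a noise
    threshold; each run stands in for one detected event (a sketch,
    not the patented discriminator)."""
    events, current = [], []
    for s in samples:
        if s > threshold:
            current.append(s)
        elif current:
            events.append(current)
            current = []
    if current:
        events.append(current)
    return events

# Digitized detector trace: baseline noise near 2 counts with two pulses.
trace = [2, 1, 2, 9, 14, 8, 2, 1, 3, 2, 11, 6, 2]
events = discriminate(trace, threshold=4)   # -> [[9, 14, 8], [11, 6]]
```

    Sampling fast enough that each pulse yields several digital values, as the abstracts describe, is what lets a purely digital rule like this separate events from baseline.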

  1. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    SciTech Connect

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  2. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  4. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    SciTech Connect

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine if the added cost of underwater sampling for the sole purpose of worker dose reduction is justified. Initial planning for sludge sampling included container, settler, and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP sludge underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation compares and contrasts the present above-water sampling method with the underwater method planned by the Sludge Treatment Project (STP) and determines whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while keeping worker doses As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing, and personnel retraining.

  5. Uniform sampling table method and its applications: establishment of a uniform sampling method.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Wang, Wei

    2013-01-01

    A novel uniform sampling method is proposed in this paper. According to the requirements of uniform sampling, we derive the properties that must be met by analyzing the distribution of samples. On this basis, the proposed uniform sampling method is demonstrated and evaluated rigorously by mathematical means such as inference. Uniform sampling tables with respect to Cn(t2) and Cn(t3) are established. Furthermore, one-dimensional and multidimensional uniform sampling methods are proposed. The proposed method, guided by uniform design theory, offers both simplicity of use and good representativeness of the whole sample.

  6. Monitoring maternal, newborn, and child health interventions using lot quality assurance sampling in Sokoto State of northern Nigeria

    PubMed Central

    Abegunde, Dele; Orobaton, Nosa; Shoretire, Kamil; Ibrahim, Mohammed; Mohammed, Zainab; Abdulazeez, Jumare; Gwamzhi, Ringpon; Ganiyu, Akeem

    2015-01-01

    Background Maternal mortality ratio and infant mortality rate are as high as 1,576 per 100,000 live births and 78 per 1,000 live births, respectively, in Nigeria's northwestern region, where Sokoto State is located. Using applicable monitoring indicators for tracking progress in the UN/WHO framework on continuum of maternal, newborn, and child health care, this study evaluated the progress of Sokoto toward achieving the Millennium Development Goals (MDGs) 4 and 5 by December 2015. The changes in outcomes in 2012–2013 associated with maternal and child health interventions were assessed. Design We used baseline and follow-up lot quality assurance sampling (LQAS) data obtained in 2012 and 2013, respectively. In each of the surveys, data were obtained from 437 households sampled from 19 LQAS locations in each of the 23 local government areas (LGAs). The composite state-level coverage estimates of the respective indicators were aggregated from estimated LGA coverage estimates. Results None of the nine indicators associated with the continuum of maternal, neonatal, and child care satisfied the recommended 90% coverage target for achieving MDGs 4 and 5. Similarly, the average state coverage estimates were lower than national coverage estimates. Marginal improvements in coverage were obtained in the demand for family planning satisfied, antenatal care visits, postnatal care for mothers, and exclusive breast-feeding. Antibiotic treatment for acute pneumonia increased significantly by 12.8 percentage points. The majority of the LGAs were classifiable as low-performing, high-priority areas for intensified program intervention. Conclusions Despite the limited time left in the countdown to December 2015, Sokoto State, Nigeria, is not on track to achieving the MDG 90% coverage of indicators tied to the continuum of maternal and child care, to reduce maternal and childhood mortality by a third by 2015. Targeted health system investments at the primary care level remain a

  7. Methods of analysis for toxic elements in food products. 3. Limit of determination of methods for assuring safety

    SciTech Connect

    Skurikhin, I.M.

    1989-03-01

    To evaluate the suitability of the analytical methods used in determining food safety, a new metrological characteristic, "MQS", is suggested. MQS is defined as the absolute minimum quantity in micrograms of a substance that can be determined in a test solution (solubilized test portion). MQS accounts for two factors: (a) the necessity of a reliable determination of the ML (maximum permitted level, i.e., regulatory tolerance), and (b) the optimum quantity of the test portion of a food product to be analyzed; it thus assists in evaluating the suitability of a method for assuring food safety. The MQS values for the 8 toxic elements in any food are: As, 3; Cd, 0.5; Cu, 20; Fe, 50; Hg, 0.2; Pb, 4; Sn, 200; and Zn, 100 micrograms. To characterize the applicability of any given method, the specific minimum limit of determination, MQSM, must be established for each method. The method in question may be used to control food safety only if MQSM is less than MQS. MQSM values are given for the common polarographic and colorimetric methods for determining these elements.
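    The decision rule in this abstract (a method is acceptable only if its own limit of determination, MQSM, falls below the element's MQS) can be captured directly; the MQS table follows the abstract, while the example MQSM values are hypothetical:

```python
# Suggested MQS limits (micrograms per test solution) from the abstract.
MQS = {"As": 3, "Cd": 0.5, "Cu": 20, "Fe": 50, "Hg": 0.2,
       "Pb": 4, "Sn": 200, "Zn": 100}

def method_acceptable(element, mqsm):
    """A method may be used for food-safety control only if its own
    limit of determination MQSM is below the element's MQS."""
    return mqsm < MQS[element]

ok = method_acceptable("Hg", 0.05)         # a sensitive mercury method passes
too_coarse = method_acceptable("Cd", 2.0)  # 2 ug >> 0.5 ug, so it fails
```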

  8. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  9. Fluidics platform and method for sample preparation

    DOEpatents

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  10. Alternate calibration method of radiochromic EBT3 film for quality assurance verification of clinical radiotherapy treatments

    NASA Astrophysics Data System (ADS)

    Park, Soah; Kang, Sei-Kwon; Cheong, Kwang-Ho; Hwang, Taejin; Yoon, Jai-Woong; Koo, Taeryool; Han, Tae Jin; Kim, Haeyoung; Lee, Me Yeon; Bae, Hoonsik; Kim, Kyoung Ju

    2016-07-01

EBT3 film is utilized as a dosimetry quality assurance tool for the verification of clinical radiotherapy treatments. In this work, we suggest a percentage-depth-dose (PDD) calibration method that can calibrate several EBT3 film pieces together at different dose levels, because photon beams deliver different dose levels at different depths along the beam axis. We investigated the feasibility of the film PDD calibration method based on PDD data and compared the results with those from the traditional film calibration method. Photon beams at 6 MV were delivered to EBT3 film pieces for both calibration methods. For the PDD-based calibration, the film pieces were placed on solid phantoms at the depth of maximum dose (dmax) and at depths of 3, 5, 8, 12, 17, and 22 cm, and a photon beam was delivered twice, at 100 cGy and 400 cGy, to extend the calibration dose range under the same conditions. Fourteen film pieces were irradiated at doses ranging from approximately 30 to 400 cGy for both film calibrations to maintain consistency. The film pieces were located at the center of the scan bed of an Epson 1680 flatbed scanner in the parallel orientation. Intensity-modulated radiation therapy (IMRT) plans were created, and their dose distributions were delivered to the film. The dose distributions for the traditional method and those for the PDD-based calibration method were evaluated using a gamma analysis. The PDD dose values obtained using a CC13 ion chamber and those obtained using an FC65-G Farmer chamber, measured at the depths of interest, produced very similar results. With the test criterion of 1% dose agreement at 1 mm, the passing rates for the four cases of the three IMRT plans were essentially identical. The traditional and the PDD-based calibrations provided similar plan verification results. We also describe another alternative for calibrating EBT3 films, i.e., a PDD-based calibration method that provides an easy and time-saving approach.
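The calibration step described above, mapping scanner readings at known dose levels to a smooth dose-response curve, can be illustrated generically. This is a minimal sketch assuming synthetic film-response data and a simple polynomial fit of dose versus net optical density; the paper's actual fitting function and scanner-correction steps are not specified here.

```python
import numpy as np

def net_od(i_unexposed, i_exposed):
    """Net optical density from scanner pixel intensities."""
    return np.log10(i_unexposed / i_exposed)

def fit_calibration(net_ods, doses, deg=3):
    """Least-squares polynomial mapping net OD -> dose (cGy)."""
    return np.polyfit(net_ods, doses, deg)

def dose_from_od(coeffs, net_ods):
    """Apply the fitted calibration to measured net ODs."""
    return np.polyval(coeffs, net_ods)

# hypothetical calibration points spanning the ~30-400 cGy range
doses = np.array([30.0, 60.0, 100.0, 150.0, 200.0, 300.0, 400.0])
ods = 0.4 * (1.0 - np.exp(-doses / 250.0))  # toy saturating film response
coeffs = fit_calibration(ods, doses)
recovered = dose_from_od(coeffs, ods)
```

The same fitted curve would then be applied to film pieces exposed at unknown doses; the PDD-based variant simply generates the calibration points from two exposures read at several depths.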

  11. Dynamic Method for Identifying Collected Sample Mass

    NASA Technical Reports Server (NTRS)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
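The central idea, estimating mass as a computation from force measurements and a known input profile, reduces in the simplest Gaussian-noise case to least squares. The sketch below uses a hypothetical one-degree-of-freedom model F_i = m * a_i + noise; G-Sample's actual estimator involves a full spacecraft dynamics model, thruster firings, and flexible modes, none of which are reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_mass(accel, force):
    """Maximum-likelihood mass estimate for F_i = m * a_i + Gaussian
    noise; under this model the MLE reduces to ordinary least squares."""
    return float(np.dot(accel, force) / np.dot(accel, accel))

# hypothetical thruster-induced acceleration profile (m/s^2) and the
# matching noisy force-sensor readings for a 0.250 kg sample
m_true = 0.250
accel = 0.1 + 0.05 * np.sin(np.linspace(0.0, 4.0 * np.pi, 500))
force = m_true * accel + rng.normal(0.0, 5.0e-5, accel.size)  # ~50 uN noise
m_hat = estimate_mass(accel, force)
```

Note how errors in the assumed acceleration (thrust) profile propagate almost directly into the mass estimate, which is the sensitivity the abstract highlights.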

  12. Innovative methods for inorganic sample preparation

    SciTech Connect

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized.

  13. Identification methods for Legionella from environmental samples.

    PubMed

    Bartie, C; Venter, S N; Nel, L H

    2003-03-01

Laboratories responsible for Legionella diagnostics around the world use a number of different culturing methods, of non-equivalent sensitivities and specificities, to detect Legionella species in environmental samples. Specific countries usually standardize and use one approved method; for example, laboratories in Australia use the Australian Standard (AS) method and those in Europe the International Standard (ISO) method. However, no standard culturing method has been established in South Africa to date. As a result, there is uncertainty about the true prevalence and the most common species of Legionella present in the South African environment. In an attempt to provide guidelines for the development of a standard method specific to South Africa, the ISO, AS, and a most-probable-number method were evaluated and compared. In addition, the effect of sample re-incubation with autochthonous amoebae on culture outcome was studied. Samples were collected from four environments, representing industrial water, mine water, and biofilm. The samples were concentrated by membrane filtration, divided into three portions, and cultured without pretreatment, after acid treatment, and after heat treatment on four culture media, namely alphaBCYE, BMPA, MWY, and GVPC agar. A selective approach, incorporating heat treatment but not acid treatment, combined with culture on alphaBCYE and GVPC or MWY, was most appropriate for legionellae detection in the samples evaluated. Legionellae were cultured from 82% of the environmental samples we evaluated. In 54% of the samples tested, legionellae were present in numbers equal to or exceeding 10(2) colony-forming units per milliliter (cfu/ml). Legionella pneumophila serogroups (SGs) 1-14 were the most prevalent and were present as a single SG, or a combination of two or more SGs, in a number of the samples tested. Re-incubation of sample concentrates with autochthonous amoebae improved the culturability of legionellae in 50% of cultures on alpha

  14. New methods for sampling sparse populations

    Treesearch

    Anna Ringvall

    2007-01-01

    To improve surveys of sparse objects, methods that use auxiliary information have been suggested. Guided transect sampling uses prior information, e.g., from aerial photographs, for the layout of survey strips. Instead of being laid out straight, the strips will wind between potentially more interesting areas. 3P sampling (probability proportional to prediction) uses...

  15. Experience with modified aerospace reliability and quality assurance method for wind turbines

    NASA Technical Reports Server (NTRS)

    Klein, W. E.

    1982-01-01

The safety, reliability, and quality assurance (SR&QA) approach assures that the machine is not hazardous to the public or operating personnel, can operate unattended on a utility grid, demonstrates reliable operation, and helps establish the quality assurance and maintainability requirements for future wind turbine projects. The approach consisted of a modified failure modes and effects analysis (FMEA) during the design phase, minimal hardware inspection during parts fabrication, and three simple documents to control activities during machine construction and operation. Five years' experience shows that this low-cost approach works well enough that it should be considered by others for similar projects.

  16. The rank product method with two samples.

    PubMed

    Koziol, James A

    2010-11-05

    Breitling et al. (2004) introduced a statistical technique, the rank product method, for detecting differentially regulated genes in replicated microarray experiments. The technique has achieved widespread acceptance and is now used more broadly, in such diverse fields as RNAi analysis, proteomics, and machine learning. In this note, we extend the rank product method to the two sample setting, provide distribution theory attending the rank product method in this setting, and give numerical details for implementing the method.
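The rank product statistic itself is straightforward to compute: rank each gene within each replicate, then take the geometric mean of its ranks across replicates. A minimal sketch on a toy fold-change matrix; the permutation-based significance assessment of Breitling et al., and Koziol's two-sample extension, are omitted.

```python
import numpy as np

def rank_product(X):
    """Rank-product statistic per gene (rows) across replicates
    (columns); rank 1 = largest value within a replicate."""
    n_genes, n_reps = X.shape
    order = np.argsort(-X, axis=0)             # descending within columns
    ranks = np.empty_like(order)
    for j in range(n_reps):
        ranks[order[:, j], j] = np.arange(1, n_genes + 1)
    return np.exp(np.log(ranks).mean(axis=1))  # geometric mean of ranks

# toy fold-change matrix: 5 genes x 3 replicates
X = np.array([
    [4.0, 3.5, 5.0],   # consistently the most up-regulated gene
    [0.9, 1.1, 1.0],
    [1.2, 0.8, 1.1],
    [1.0, 1.3, 0.7],
    [2.0, 1.9, 2.2],   # consistently second
])
rp = rank_product(X)
```

A gene ranked first in every replicate attains the minimum possible rank product of 1, which is why small rank products flag consistent differential regulation.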

  17. Chemicals of emerging concern in water and bottom sediment in Great Lakes areas of concern, 2010 to 2011-Collection methods, analyses methods, quality assurance, and data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Foreman, William T.; Furlong, Edward T.; Smith, Steven G.

    2012-01-01

The U.S. Geological Survey (USGS) cooperated with the U.S. Environmental Protection Agency and the U.S. Fish and Wildlife Service on a study to identify the occurrence of chemicals of emerging concern (CECs) in water and bottom-sediment samples collected during 2010–11 at sites in seven areas of concern (AOCs) throughout the Great Lakes. Study sites include tributaries to the Great Lakes in AOCs located near Duluth, Minn.; Green Bay, Wis.; Rochester, N.Y.; Detroit, Mich.; Toledo, Ohio; Milwaukee, Wis.; and Ashtabula, Ohio. This report documents the collection methods, analyses methods, and quality-assurance data and analyses, and provides the data for this study. Water and bottom-sediment samples were analyzed at the USGS National Water Quality Laboratory in Denver, Colo., for a broad suite of CECs. During this study, 135 environmental and 23 field duplicate samples of surface water and wastewater effluent, 10 field blank water samples, and 11 field spike water samples were collected and analyzed. Sixty-one of the 69 wastewater indicator chemicals (laboratory method 4433) analyzed were detected at concentrations ranging from 0.002 to 11.2 micrograms per liter. Twenty-eight of the 48 pharmaceuticals (research method 8244) analyzed were detected at concentrations ranging from 0.0029 to 22.0 micrograms per liter. Ten of the 20 steroid hormones and sterols analyzed (research method 4434) were detected at concentrations ranging from 0.16 to 10,000 nanograms per liter. During this study, 75 environmental, 13 field duplicate, and 9 field spike samples of bottom sediment were collected and analyzed for a wide variety of CECs. Forty-seven of the 57 wastewater indicator chemicals (laboratory method 5433) analyzed were detected at concentrations ranging from 0.921 to 25,800 nanograms per gram. Seventeen of the 20 steroid hormones and sterols (research method 6434) analyzed were detected at concentrations ranging from 0.006 to 8,921 nanograms per gram. Twelve of

  18. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  19. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
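The core idea, replacing a seeded pseudorandom draw with a deterministic low-discrepancy sequence, can be sketched with a golden-ratio additive recurrence mapped through an exponential weighting on the Nyquist grid. The grid size and decay constant below are illustrative assumptions, not the schedules benchmarked in the paper.

```python
import numpy as np

PHI_FRAC = (np.sqrt(5.0) - 1.0) / 2.0  # fractional part of the golden ratio

def subrandom_schedule(grid_size, n_points, t2=0.3):
    """Seed-free NUS schedule: a golden-ratio additive recurrence
    (a 1-D subrandom sequence) mapped through the inverse CDF of an
    exponential exp(-t/T2) weighting on the Nyquist grid."""
    t = np.arange(grid_size) / grid_size
    weights = np.exp(-t / t2)
    cdf = np.cumsum(weights) / weights.sum()
    picked = set()
    n = 1
    while len(picked) < n_points:
        u = (n * PHI_FRAC) % 1.0               # deterministic point in [0, 1)
        picked.add(int(np.searchsorted(cdf, u)))
        n += 1
    return sorted(picked)

# 32 of 128 grid points, reproducible with no random seed at all
sched = subrandom_schedule(128, 32)
```

Because the sequence is deterministic, the same schedule is produced on every run, which is exactly the seed-independence property motivating the paper.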

  20. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    SciTech Connect

    Oldham, Mark; Thomas, Andrew; O'Daniel, Jennifer; Juang, Titania; Ibbott, Geoffrey; Adamovics, John; Kirkpatrick, John P.

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom, and to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and their clinical significance is presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on the patient
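The passing rates quoted above come from a standard gamma analysis: each measured point is scored by the minimum combined dose-difference and distance-to-agreement metric over the reference distribution. Below is a minimal 1-D global-gamma sketch; clinical tools such as the ones named in the abstract operate on 2-D/3-D grids with interpolation and are not reproduced here.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing, dose_tol=0.03, dta=2.0):
    """1-D global gamma analysis: each measured point is scored by the
    minimum combined dose-difference / distance-to-agreement metric
    over the whole reference profile. spacing and dta are in mm;
    dose_tol is a fraction of the reference maximum (3%, 2 mm here)."""
    x = np.arange(ref.size) * spacing
    dmax = ref.max()
    gammas = np.empty(meas.size)
    for i in range(meas.size):
        dd = (meas[i] - ref) / (dose_tol * dmax)  # dose-difference term
        dx = (x[i] - x) / dta                     # distance term
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return float(np.mean(gammas <= 1.0))

# identical reference and measurement -> every point passes
profile = np.exp(-0.5 * ((np.arange(100) - 50.0) / 10.0) ** 2)
rate = gamma_pass_rate(profile, profile, spacing=1.0)
```

A point passes when its gamma value is at most 1, so the pass rate is the fraction of points meeting the combined criterion.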

  1. Comparison of sampling methods for urine cultures.

    PubMed

    Unlü, Hayriye; Sardan, Yeşim Cetinkaya; Ulker, Saadet

    2007-01-01

To compare the efficacy and cost of conventional and alternative sampling methods for urine cultures. An experimental study with two replications was carried out in a 900-bed university hospital in Ankara, Turkey. The sample was 160 hospitalized female patients on urology and obstetrics and gynaecology wards, who were asked to give urine specimens between September 10, 2000, and September 1, 2001. The authors informed the patients about the study first and then obtained two samples from each patient under their observation. The number of specimens was 320. Statistical methods were descriptive. The rates of contamination and significant growth, respectively, were 4.4% and 7.5% for the conventional method and 5.6% and 10% for the alternative method. The cost per culture was 2,588,257 TL (2.10 USD) for the conventional method and 57,021 TL (0.05 USD) for the alternative method. The cost difference was statistically significant. The two methods yielded similar results, but the alternative method was less expensive.

  2. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  3. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  4. A Review of Quality Assurance Methods to Assist Professional Record Keeping: Implications for Providers of Interpersonal Violence Treatment

    PubMed Central

    Bradshaw, Kelsey M.; Donohue, Brad; Wilks, Chelsey

    2014-01-01

Errors have been found to occur frequently in the management of case records within mental health service systems. In cases involving interpersonal violence, such errors have been found to negatively impact service implementation and lead to significant trauma and fatalities. In an effort to ensure adherence to specified standards of care, quality assurance (QA) programs have been developed to monitor and enhance service implementation. These programs have generally been successful in facilitating record management. However, these systems are rarely disseminated and not well integrated. Therefore, within the context of interpersonal violence, we provide an extensive review of evidence-supported record-keeping practices and of methods to assist in assuring these practices are implemented with adherence. PMID:24976786

  5. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  6. Safety and quality assurance of chemotherapeutic preparations in a hospital production unit: acceptance sampling plan and economic impact.

    PubMed

    Paci, A; Borget, I; Mercier, L; Azar, Y; Desmaris, R P; Bourget, P

    2012-06-01

The opportunity to apply a sampling plan was evaluated, and costs were computed by a micro-costing study. In 2003, a sampling plan was defined to reduce the number of chemotherapy quality controls while preserving the same level of quality. Recent qualitative and quantitative changes led us to define a second sampling plan, supplemented by an economic evaluation to determine the cost and cost-savings of quality control. The study considers preparations produced during four semesters, classified into three groups. The first includes drugs produced in fewer than 200 batches a semester. Group 2 includes those for which preparation lots would have been rejected twice among these four semesters. Group 3 includes those that would have been accepted (≥3 'acceptable' lots). A single sampling plan by attributes was applied to this group, with an acceptance quality level of 1.65% and a lot tolerance percent defective below 5%. A micro-costing study was conducted on quality control, from sampling to validation of the results. Among 39 cytotoxic drugs, 11 were sampled, which made it possible to avoid a mean of 17,512 control assays per year. Each batch of the 28 non-sampled drugs was, however, analyzed. Costs were estimated at 2.98€ and 5.25€ per control assay, depending on the analytical method. The savings from the application of the sampling plans were 153,207€ over 6 years. The sampling plan maintained constancy in the number of controls and the level of quality with significant cost-savings, despite a substantial increase in drugs to assay and in the number of preparations produced.
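A single sampling plan by attributes of the kind described can be derived from the binomial operating-characteristic (OC) curve: choose the smallest sample size n and acceptance number c meeting risk targets at the AQL and LTPD. The AQL (1.65%) and LTPD (5%) below come from the abstract; the producer and consumer risk levels (5% and 10%) are assumed conventional defaults, not values stated in the paper.

```python
from math import comb

def p_accept(n, c, p):
    """Probability a lot with defective fraction p is accepted by a
    single sampling plan (inspect n items, accept if defects <= c)."""
    return sum(comb(n, d) * p ** d * (1.0 - p) ** (n - d) for d in range(c + 1))

def find_plan(aql=0.0165, ltpd=0.05, alpha=0.05, beta=0.10, max_n=1000):
    """Smallest sample size n (with acceptance number c) such that the
    producer's risk at the AQL is at most alpha and the consumer's
    risk at the LTPD is at most beta."""
    for n in range(1, max_n + 1):
        for c in range(n + 1):
            if p_accept(n, c, aql) >= 1.0 - alpha:
                if p_accept(n, c, ltpd) <= beta:
                    return n, c
                break  # raising c only raises acceptance at the LTPD too
    return None

plan = find_plan()
```

For a fixed n, acceptance probability rises with c, so only the smallest c meeting the producer condition needs to be checked against the consumer condition; this keeps the search fast.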

  7. Sparse Sampling Methods In Multidimensional NMR

    PubMed Central

    Mobli, Mehdi; Maciejewski, Mark W.; Schuyler, Adam D.; Stern, Alan S.; Hoch, Jeffrey C.

    2014-01-01

Although the discrete Fourier transform played an enabling role in the development of modern NMR spectroscopy, it suffers from a well-known difficulty in providing high-resolution spectra from short data records. In multidimensional NMR experiments, so-called indirect time dimensions are sampled parametrically, with each instance of evolution times along the indirect dimensions sampled via separate one-dimensional experiments. The time required to conduct multidimensional experiments is directly proportional to the number of indirect evolution times sampled. Despite remarkable advances in resolution with increasing magnetic field strength, multiple dimensions remain essential for resolving individual resonances in NMR spectra of biological macromolecules. Conventional Fourier-based methods of spectrum analysis limit the resolution that can be practically achieved in the indirect dimensions. Nonuniform or sparse data collection strategies, together with suitable non-Fourier methods of spectrum analysis, enable high-resolution multidimensional spectra to be obtained. Although some of these approaches were first employed in NMR more than two decades ago, it is only relatively recently that they have been widely adopted. Here we describe the current practice of sparse sampling methods and prospects for further development of the approach to improve resolution and sensitivity and shorten experiment time in multidimensional NMR. While sparse sampling is particularly promising for multidimensional NMR, the basic principles could apply to other forms of multidimensional spectroscopy. PMID:22481242

  8. Turbidity threshold sampling: Methods and instrumentation

    Treesearch

    Rand Eads; Jack Lewis

    2001-01-01

    Traditional methods for determining the frequency of suspended sediment sample collection often rely on measurements, such as water discharge, that are not well correlated to sediment concentration. Stream power is generally not a good predictor of sediment concentration for rivers that transport the bulk of their load as fines, due to the highly variable routing of...

  9. Sampling methods for terrestrial amphibians and reptiles.

    Treesearch

    Paul Stephen Corn; R. Bruce. Bury

    1990-01-01

    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  10. Actinide recovery method -- Large soil samples

    SciTech Connect

    Maxwell, S.L. III

    2000-04-25

There is a need to measure actinides in environmental samples with ever lower detection limits, requiring larger sample sizes. This analysis is adversely affected by sample-matrix interferences, which make analyzing soil samples above five grams very difficult. A new Actinide-Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides from large soil samples. Diphonix Resin (Eichrom Industries), a 1994 R&D 100 winner, is used to preconcentrate the actinides from large soil samples; the actinides bind powerfully to the resin's diphosphonic acid groups. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin, which effectively eliminates interfering matrix components from the soil matrix. The microwave-digestion technique is more effective and less tedious than catalyzed hydrogen peroxide digestions of the resin or digestion of diphosphonic stripping agents such as HEDPA. After resin digestion, the actinides are recovered in a small volume of nitric acid, which can be loaded onto small extraction chromatography columns, such as TEVA Resin, U-TEVA Resin, or TRU Resin (Eichrom Industries). Small, selective extraction columns do not generate large volumes of liquid waste and provide consistent tracer recoveries after soil-matrix elimination.

  11. Filmless methods for quality assurance of Tomotherapy using ArcCHECK.

    PubMed

    Yang, B; Wong, W K R; Geng, H; Lam, W W; Ho, Y W; Kwok, W M; Cheung, K Y; Yu, S K

    2017-01-01

    Tomotherapy delivers an intensity-modulated radiation therapy (IMRT) treatment by the synchronization of gantry rotation, multileaf collimator (MLC), and couch movement. This dynamic nature makes quality assurance (QA) important and challenging. The purpose of this study is to develop methodologies using an ArcCHECK for accurate QA measurements of the gantry angle and speed, MLC synchronization and leaf open time, couch translation per gantry rotation, couch speed and uniformity, and constancy of the longitudinal beam profile for a Tomotherapy unit. Four test plans recommended by AAPM Task Group 148 (TG148) and the manufacturer were chosen for this study. Helical and static star-shot tests are used for checking that the leaves opened at the expected gantry angles. Another helical test verifies that the couch traveled the expected distance per gantry rotation. The final test checks the couch-speed constancy with a static gantry. ArcCHECK can record the detector signal every 50 ms as a movie file and has a virtual inclinometer for gantry-angle measurement. These features made the measurement of gantry angle and speed, MLC synchronization and leaf open time, and longitudinal beam profile possible. A shaping parameter was defined to facilitate locating the beam center during plan delivery, which was thereafter used to calculate the couch translation per gantry rotation and the couch speed. The full width at half maximum (FWHM) was calculated for each measured longitudinal beam profile and then used to evaluate the couch-speed uniformity. Furthermore, a mean longitudinal profile was obtained for a constancy check of the field width. The machine trajectory log data were also collected for comparison. In-house programs were developed in MATLAB to process both the ArcCHECK and machine log data. The deviation of our measurement results from the log data for gantry angle was calculated to be less than 0.4°. The percentage differences between measured and planned
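The couch-speed uniformity check above rests on extracting the FWHM of each measured longitudinal profile. This is a minimal sketch of FWHM extraction by linear interpolation at the two half-maximum crossings, tested on a synthetic Gaussian profile; the authors' MATLAB processing of ArcCHECK movie data is not reproduced.

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a sampled profile, with linear
    interpolation at the two half-maximum crossings."""
    half = y.max() / 2.0
    above = y >= half
    i_first = int(np.argmax(above))                    # first sample above half
    i_last = len(y) - 1 - int(np.argmax(above[::-1]))  # last sample above half

    def cross(i_lo, i_hi):
        # x-position where the profile crosses `half` between two samples
        return x[i_lo] + (half - y[i_lo]) * (x[i_hi] - x[i_lo]) / (y[i_hi] - y[i_lo])

    return cross(i_last, i_last + 1) - cross(i_first - 1, i_first)

# synthetic Gaussian profile: FWHM should be ~2.3548 * sigma
x = np.linspace(-50.0, 50.0, 2001)
sigma = 8.0
y = np.exp(-0.5 * (x / sigma) ** 2)
width = fwhm(x, y)
```

Comparing the FWHM of successive profiles against a baseline profile is then a direct constancy check of field width and couch speed.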

  12. Quality Assurance.

    ERIC Educational Resources Information Center

    Massachusetts Career Development Inst., Springfield.

    This booklet is one of six texts from a workplace literacy curriculum designed to assist learners in facing the increased demands of the workplace. The booklet contains five sections that cover the following topics: (1) importance of reliability; (2) meaning of quality assurance; (3) historical development of quality assurance; (4) statistical…

  13. Annual Quality Assurance Conference Abstracts by Barbara Marshik

    EPA Pesticide Factsheets

25th Annual Quality Assurance Conference. Abstracts: "Material and Process Conditions for Successful Use of Extractive Sampling Techniques and Certification Methods" and "Errors in the Analysis of NMHC and VOCs in CNG-Based Engine Emissions" by Barbara Marshik

  14. Constrained sampling method for analytic continuation

    NASA Astrophysics Data System (ADS)

    Sandvik, Anders W.

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S =1 /2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.
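The basic ingredients of such stochastic analytic continuation, a spectrum parametrized as many delta functions, a chi-squared fit to imaginary-time data, and Metropolis updates restricted to a bounded frequency window, can be sketched as below. This is an illustrative toy with synthetic data and an assumed T=0 exponential kernel, not Sandvik's full constrained algorithm (no optimal-bound search or fixed-peak-count machinery is implemented).

```python
import numpy as np

rng = np.random.default_rng(1)

def g_model(omegas, taus, amp):
    """G(tau) = sum_i amp * exp(-tau * omega_i): the T=0 kernel applied
    to a spectrum of equal-weight delta functions."""
    return amp * np.exp(-np.outer(taus, omegas)).sum(axis=1)

def chi2(omegas, taus, amp, g_data, sigma):
    """Goodness of fit of the delta-function spectrum to the data."""
    return np.sum(((g_model(omegas, taus, amp) - g_data) / sigma) ** 2)

# synthetic "QMC" data generated from a known two-peak spectrum
taus = np.linspace(0.1, 4.0, 40)
true_peaks = np.array([0.5, 1.5])
g_data = 0.5 * np.exp(-np.outer(taus, true_peaks)).sum(axis=1)
sigma = 1.0e-4

# sample 30 equal-weight delta functions constrained to [0, 3]
n_delta, theta = 30, 1.0
amp = 1.0 / n_delta
omegas = rng.uniform(0.0, 3.0, n_delta)
c = c0 = chi2(omegas, taus, amp, g_data, sigma)
for _ in range(5000):
    i = rng.integers(n_delta)
    trial = omegas.copy()
    trial[i] = np.clip(trial[i] + rng.normal(0.0, 0.1), 0.0, 3.0)
    c_new = chi2(trial, taus, amp, g_data, sigma)
    # Metropolis step at sampling temperature theta
    if c_new < c or rng.random() < np.exp((c - c_new) / (2.0 * theta)):
        omegas, c = trial, c_new
```

The clipping to [0, 3] plays the role of the spectral-weight bounds; in the actual method the averaged positions of the sampled delta functions (and their entropy control) yield the final spectrum.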

  15. Actinide Recovery Method for Large Soil Samples

    SciTech Connect

    Maxwell, S.L. III; Nichols, S.

    1998-11-01

A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides in very large soil samples. Diphonix Resin(r) is used to eliminate soil-matrix interferences and preconcentrate actinides after soil leaching or soil fusion. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin(r). After the resin digestion, the actinides are recovered in a small volume of nitric acid, which can be easily loaded onto small extraction-chromatography columns, such as TEVA Resin(r), U-TEVA Resin(r), or TRU Resin(r) (Eichrom Industries). This method enables the application of small, selective extraction columns to recover actinides from very large soil samples with high selectivity, consistent tracer recoveries, and minimal liquid waste.

  16. Constrained sampling method for analytic continuation.

    PubMed

    Sandvik, Anders W

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S=1/2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.

  17. Methods for Sampling of Airborne Viruses

    PubMed Central

    Verreault, Daniel; Moineau, Sylvain; Duchaine, Caroline

    2008-01-01

    Summary: To better understand the underlying mechanisms of aerovirology, accurate sampling of airborne viruses is fundamental. The sampling instruments commonly used in aerobiology have also been used to recover viruses suspended in the air. We reviewed over 100 papers to evaluate the methods currently used for viral aerosol sampling. Differentiating infections caused by direct contact from those caused by airborne dissemination can be a very demanding task given the wide variety of sources of viral aerosols. While epidemiological data can help to determine the source of the contamination, direct data obtained from air samples can provide very useful information for risk assessment purposes. Many types of samplers have been used over the years, including liquid impingers, solid impactors, filters, electrostatic precipitators, and many others. The efficiencies of these samplers depend on a variety of environmental and methodological factors that can affect the integrity of the virus structure. The aerodynamic size distribution of the aerosol also has a direct effect on sampler efficiency. Viral aerosols can be studied under controlled laboratory conditions, using biological or nonbiological tracers and surrogate viruses, which are also discussed in this review. Lastly, general recommendations are made regarding future studies on the sampling of airborne viruses. PMID:18772283

  18. SOIL AND SEDIMENT SAMPLING METHODS

    EPA Pesticide Factsheets

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER, NERL conducts research to improve its capability to characterize Superfund, RCRA, LUST, oil spill, and brownfield sites more accurately, precisely, and efficiently, and to improve its risk-based decision making. Among the many research programs and tasks being performed at ESD-LV, research is being conducted on improving soil and sediment sampling techniques and on improving the sampling and handling of volatile organic compound (VOC) contaminated soils. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Current approaches and devices can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the device and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate the soil VOC concentrations and, therefore, to greatly underestimate the ecological

  19. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz [Livermore, CA; Langlois, Richard G [Livermore, CA; Venkateswaran, Kodumudi S [Round Rock, TX

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman.RTM. probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as, Cy3.TM., as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM.TM. on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA.TM., on the 5' end.

  20. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman.RTM. probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as, Cy3.TM., as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA, on the 5' end.

  1. Methods, quality assurance, and data for assessing atmospheric deposition of pesticides in the Central Valley of California

    USGS Publications Warehouse

    Zamora, Celia; Majewski, Michael S.; Foreman, William T.

    2013-01-01

    The U.S. Geological Survey monitored atmospheric deposition of pesticides in the Central Valley of California during two studies in 2001 and 2002–04. The 2001 study sampled wet deposition (rain) and storm-drain runoff in the Modesto, California, area during the orchard dormant-spray season to examine the contribution of pesticide concentrations to storm runoff from rainfall. In the 2002–04 study, the number and extent of collection sites in the Central Valley were increased to determine the areal distribution of organophosphate insecticides and other pesticides, and also five more sample types were collected. These were dry deposition, bulk deposition, and three sample types collected from a soil box: aqueous phase in runoff, suspended sediment in runoff, and surficial-soil samples. This report provides concentration data and describes methods and quality assurance of sample collection and laboratory analysis for pesticide compounds in all samples collected from 16 sites. Each sample was analyzed for 41 currently used pesticides and 23 pesticide degradates, including oxygen analogs (oxons) of 9 organophosphate insecticides. Analytical results are presented by sample type and study period. The median concentrations of both chlorpyrifos and diazinon sampled at four urban (0.067 micrograms per liter [μg/L] and 0.515 μg/L, respectively) and four agricultural sites (0.079 μg/L and 0.583 μg/L, respectively) during a January 2001 storm event in and around Modesto, Calif., were nearly identical, indicating that the overall atmospheric burden in the region appeared to be fairly similar during the sampling event. Comparisons of median concentrations in the rainfall to those in the McHenry storm-drain runoff showed that, for some compounds, rainfall contributed a substantial percentage of the concentration in the runoff; for other compounds, the concentrations in rainfall were much greater than in the runoff. For example, diazinon concentrations in rainfall were about

  2. Quality assurance of nursing web sites: development and implications of the ALEU method.

    PubMed

    Cambil-Martín, Jacobo; Flynn, Maria; Villaverde-Gutiérrez, Carmen

    2011-09-01

    This article presents a study that evaluated the physical accessibility, readability, and usability of Spanish nursing Web sites and discusses the quality assurance issues raised, which are relevant to the wider nursing community. The Internet is recognized as an important source of health information for both nurses and the general public. Although it makes health-related information universally available, the wide variation in the overall quality of health Web sites is problematic. This raises many questions for the nursing profession: about what constitutes a good-quality Web site, about the nature of the information that nurses are finding and using to support their professional education, research, and clinical practice, and about the impact that Internet information ultimately has on health interactions and nursing care. The process of completing this small study showed that it is possible to usefully assess dimensions of Web site quality and suggested that it may be feasible to develop tools to help nurses evaluate national and international nursing Web sites. More research is needed to understand how nurses use the Internet to support their everyday professional practices, but the development and application of international Web site quality assurance tools may be important for maintaining professional nursing standards in the Internet age.

  3. Well purge and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.; Gustafson, Gregg S.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  4. Well purge and sample apparatus and method

    DOEpatents

    Schalla, R.; Smith, R.M.; Hall, S.H.; Smart, J.E.; Gustafson, G.S.

    1995-10-24

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion. 8 figs.

  5. Quality assurance of metabolomics.

    PubMed

    Bouhifd, Mounir; Beger, Richard; Flynn, Thomas; Guo, Lining; Harris, Georgina; Hogberg, Helena; Kaddurah-Daouk, Rima; Kamp, Hennicke; Kleensang, Andre; Maertens, Alexandra; Odwin-DaCosta, Shelly; Pamies, David; Robertson, Donald; Smirnova, Lena; Sun, Jinchun; Zhao, Liang; Hartung, Thomas

    2015-01-01

    Metabolomics promises a holistic phenotypic characterization of biological responses to toxicants. This technology is based on advanced chemical analytical tools with reasonable throughput, including mass spectrometry and NMR. Quality assurance, however - from experimental design, sample preparation, metabolite identification, to bioinformatics data-mining - is urgently needed to assure both quality of metabolomics data and reproducibility of biological models. In contrast to microarray-based transcriptomics, where consensus on quality assurance and reporting standards has been fostered over the last two decades, quality assurance of metabolomics is only now emerging. Regulatory use in safety sciences, and even proper scientific use of these technologies, demand quality assurance. In an effort to promote this discussion, an expert workshop discussed the quality assurance needs of metabolomics. The goals for this workshop were 1) to consider the challenges associated with metabolomics as an emerging science, with an emphasis on its application in toxicology and 2) to identify the key issues to be addressed in order to establish and implement quality assurance procedures in metabolomics-based toxicology. Consensus has still to be achieved regarding best practices to make sure sound, useful, and relevant information is derived from these new tools.

  6. Different methods for volatile sampling in mammals

    PubMed Central

    Möller, Manfred; Marcillo, Andrea; Einspanier, Almuth; Weiß, Brigitte M.

    2017-01-01

    Previous studies showed that olfactory cues are important for mammalian communication. However, many specific compounds that convey information between conspecifics are still unknown. To understand mechanisms and functions of olfactory cues, olfactory signals such as volatile compounds emitted from individuals need to be assessed. Sampling of animals with and without scent glands was typically conducted using cotton swabs rubbed over the skin or fur and analysed by gas chromatography-mass spectrometry (GC-MS). However, this method has various drawbacks, including a high level of contaminations. Thus, we adapted two methods of volatile sampling from other research fields and compared them to sampling with cotton swabs. To do so we assessed the body odor of common marmosets (Callithrix jacchus) using cotton swabs, thermal desorption (TD) tubes and, alternatively, a mobile GC-MS device containing a thermal desorption trap. Overall, TD tubes comprised most compounds (N = 113), with half of those compounds being volatile (N = 52). The mobile GC-MS captured the fewest compounds (N = 35), of which all were volatile. Cotton swabs contained an intermediate number of compounds (N = 55), but very few volatiles (N = 10). Almost all compounds found with the mobile GC-MS were also captured with TD tubes (94%). Hence, we recommend TD tubes for state of the art sampling of body odor of mammals or other vertebrates, particularly for field studies, as they can be easily transported, stored and analysed with high performance instruments in the lab. Nevertheless, cotton swabs capture compounds which still may contribute to the body odor, e.g. after bacterial fermentation, while profiles from mobile GC-MS include only the most abundant volatiles of the body odor. PMID:28841690
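    The sampler comparison reported above (e.g., 94% of the mobile GC-MS compounds also captured in TD tubes) amounts to a set-overlap calculation between detected-compound lists. A minimal sketch with hypothetical compound names (placeholders, not data from the study):

```python
# Hypothetical compound sets per sampling method; names are illustrative
# placeholders, not data from the study.
td_tubes    = {"hexanal", "limonene", "squalene", "nonanal", "toluene"}
mobile_gcms = {"hexanal", "limonene", "nonanal", "toluene", "styrene"}
cotton_swab = {"squalene", "hexanal"}

def pct_shared(found, reference):
    """Percentage of 'found' compounds that were also captured by 'reference'."""
    return 100.0 * len(found & reference) / len(found)

print(pct_shared(mobile_gcms, td_tubes))   # overlap of mobile GC-MS with TD tubes
```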

  7. Different methods for volatile sampling in mammals.

    PubMed

    Kücklich, Marlen; Möller, Manfred; Marcillo, Andrea; Einspanier, Almuth; Weiß, Brigitte M; Birkemeyer, Claudia; Widdig, Anja

    2017-01-01

    Previous studies showed that olfactory cues are important for mammalian communication. However, many specific compounds that convey information between conspecifics are still unknown. To understand mechanisms and functions of olfactory cues, olfactory signals such as volatile compounds emitted from individuals need to be assessed. Sampling of animals with and without scent glands was typically conducted using cotton swabs rubbed over the skin or fur and analysed by gas chromatography-mass spectrometry (GC-MS). However, this method has various drawbacks, including a high level of contaminations. Thus, we adapted two methods of volatile sampling from other research fields and compared them to sampling with cotton swabs. To do so we assessed the body odor of common marmosets (Callithrix jacchus) using cotton swabs, thermal desorption (TD) tubes and, alternatively, a mobile GC-MS device containing a thermal desorption trap. Overall, TD tubes comprised most compounds (N = 113), with half of those compounds being volatile (N = 52). The mobile GC-MS captured the fewest compounds (N = 35), of which all were volatile. Cotton swabs contained an intermediate number of compounds (N = 55), but very few volatiles (N = 10). Almost all compounds found with the mobile GC-MS were also captured with TD tubes (94%). Hence, we recommend TD tubes for state of the art sampling of body odor of mammals or other vertebrates, particularly for field studies, as they can be easily transported, stored and analysed with high performance instruments in the lab. Nevertheless, cotton swabs capture compounds which still may contribute to the body odor, e.g. after bacterial fermentation, while profiles from mobile GC-MS include only the most abundant volatiles of the body odor.

  8. Conflict Prevention and Separation Assurance Method in the Small Aircraft Transportation System

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.; Carreno, Victor A.; Williams, Daniel M.; Munoz, Cesar

    2005-01-01

    A multilayer approach to the prevention of conflicts due to the loss of aircraft-to-aircraft separation which relies on procedures and on-board automation was implemented as part of the SATS HVO Concept of Operations. The multilayer system gives pilots support and guidance during the execution of normal operations and advance warning for procedure deviations or off-nominal operations. This paper describes the major concept elements of this multilayer approach to separation assurance and conflict prevention and provides the rationale for its design. All the algorithms and functionality described in this paper have been implemented in an aircraft simulation in the NASA Langley Research Center's Air Traffic Operations Lab and on the NASA Cirrus SR22 research aircraft.

  9. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax{reg_sign} and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent), that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb{reg_sign} 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb{reg_sign} in the back tube and Tenax{reg_sign} in the two front tubes to avoid analytical difficulties associated with the analysis of the sequential bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds, that are listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as a source of test compounds. Statistical tests of the comparability of methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.
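    The abstract does not specify which statistical test underlies the compound-by-compound comparison; one plausible sketch is a paired t statistic on recoveries from matched trains of the two methods. All numbers below are invented for illustration:

```python
import math
import statistics

# Hypothetical paired recoveries (%) for one compound from matched trains
# sampling the same certified gas: standard VOST vs. modified VOST.
vost     = [88.0, 91.0, 86.0, 90.0]
modified = [89.0, 90.5, 87.5, 88.5]

diffs = [a - b for a, b in zip(vost, modified)]
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)
t = mean_d / (sd_d / math.sqrt(len(diffs)))   # paired t statistic, df = 3
print(round(t, 2))
```

    A |t| below the critical value for the chosen significance level would be consistent with the "statistically equivalent" conclusion reported for most compounds.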

  10. Selected quality assurance data for water samples collected by the US Geological Survey, Idaho National Engineering Laboratory, Idaho, 1980 to 1988

    USGS Publications Warehouse

    Wegner, S.J.

    1989-01-01

    Multiple water samples from 115 wells and 3 surface water sites were collected between 1980 and 1988 for the ongoing quality assurance program at the Idaho National Engineering Laboratory. The reported results from the six laboratories involved were analyzed for agreement using descriptive statistics. The constituents and properties included: tritium, plutonium-238, plutonium-239, -240 (undivided), strontium-90, americium-241, cesium-137, total dissolved chromium, selected dissolved trace metals, sodium, chloride, nitrate, selected purgeable organic compounds, and specific conductance. Agreement could not be calculated for purgeable organic compounds, trace metals, some nitrates and blank sample analyses because analytical uncertainties were not consistently reported. However, differences between results for most of these data were calculated. The blank samples were not analyzed for differences. The laboratory results analyzed using descriptive statistics showed a median agreement between all useable data pairs of 95%. (USGS)
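    A common way to quantify "agreement" between replicate laboratory results, when analytical uncertainties are reported, is to check whether each pair differs by less than a multiple of the combined standard uncertainty. The report's exact criterion is not stated here; this is a generic sketch with invented values:

```python
import math

def results_agree(x1, s1, x2, s2, k=2.0):
    """Two results agree if their difference is within k times the combined
    standard uncertainty (one common criterion; the report's rule may differ)."""
    return abs(x1 - x2) <= k * math.sqrt(s1 ** 2 + s2 ** 2)

# Invented duplicate results: (value1, uncertainty1, value2, uncertainty2)
pairs = [(10.2, 0.5, 9.8, 0.4), (3.1, 0.1, 4.0, 0.2), (0.52, 0.05, 0.49, 0.06)]
agree = [results_agree(*p) for p in pairs]
print(100.0 * sum(agree) / len(agree))   # percent agreement across pairs
```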

  11. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    PubMed

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to standardize the quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling with WS-CRDS analysis as an example, this work systematically introduces the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass-bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, as well as the data-grading quality markers and the data fitting and interpolation method. Finally, using this method, the CO2 sampling and observation data from atmospheric background stations in 3 typical regions were processed and their concentration variation characteristics analyzed, indicating that the method captures the influences of regional and local environmental factors on the observation results and reflects the characteristics of natural and human activities in an objective and accurate way.
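    The data fitting and interpolation step is not detailed in the abstract; background CO2 records are commonly decomposed into a polynomial trend plus seasonal harmonics. A minimal least-squares sketch on synthetic monthly data (all values invented):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 6, 1 / 12.0)   # monthly samples over 6 years (years as units)
# Synthetic record: baseline + linear growth + annual cycle + noise (ppm).
co2 = 395 + 2.2 * t + 3.0 * np.sin(2 * np.pi * t) + rng.normal(0, 0.3, t.size)

# Design matrix: constant, linear trend, one annual harmonic
# (higher polynomial orders and more harmonics can be added the same way).
X = np.column_stack([np.ones_like(t), t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, co2, rcond=None)
trend_growth = coef[1]           # fitted growth rate, ppm per year
print(round(trend_growth, 2))
```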

  12. Assurance Cases

    DTIC Science & Technology

    2015-01-26

    2015 Carnegie Mellon University. Maturity of Assurance Case Technology: developed in the late 90s in Europe; used for safety cases in Europe for over 20 years ... The UK Ministry of Defence requires generation of a compelling case to support claims that specific safety requirements are met: “The safety case ... assurance case consistent with 15026-2. Developed to help organize and structure safety cases in a readily reviewable form; used successfully for over a

  13. A DOE manual: DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; Fadeff, S.K.; Sklarew, D.S.; McCulloch, M.; Mong, G.M.; Riley, R.G.; Thomas, B.L.

    1994-08-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a guidance/methods document supporting environmental restoration (ER) and waste management (WM) (collectively referred to as EM) sampling and analysis activities at US Department of Energy (DOE) sites. DOE Methods is intended to supplement existing guidance documents (e.g., the US Environmental Protection Agency's Test Methods for Evaluating Solid Waste, SW-846), which apply to low-level or non-radioactive samples, to address the complexities of waste and environmental samples encountered at DOE sites. The document contains quality assurance (QA), quality control (QC), safety, sampling, organic analysis, inorganic analysis, and radio-analytical guidance as well as sampling and analytical methods. It is updated every six months (April and October) with additional methods. As of April 1994, DOE Methods contained 3 sampling and 39 analytical methods. It is anticipated that between 10 and 20 new methods will be added in October 1994. All methods are either peer reviewed and contain performance data, or are included as draft methods.

  14. Quality Assurance Through Reimbursement

    PubMed Central

    Shaughnessy, Peter W.; Kurowski, Bettina

    1982-01-01

    Quality assurance and reimbursement programs normally function separately in the health care field. This paper reviews objectives and certain conceptual issues associated with each type of program. Its primary intent is to summarize substantive and operational topics which must be addressed if quality of care is to be enhanced through reimbursement. The focus is on methods for integrating quality assurance and reimbursement. The final section presents topics for future research. PMID:6807939

  15. The interlaboratorial quality assurance program for blood lead determination. An evaluation of methods and results.

    PubMed

    Morisi, G; Patriarca, M; Taggi, F

    1989-01-01

    For over five years a national program, promoted by a working group of the Istituto Superiore di Sanità (Italian National Institute of Health), has been active in Italy for quality control of blood lead determinations. The program is based on the adoption by the laboratories of the same known-titre materials for internal quality control, and on participation in periodic collaborative exercises for external quality evaluation. The promoting laboratory prepared the control samples, verified their homogeneity and stability, and distributed them to the laboratories following a randomized procedure; it then provided a preliminary elaboration of the results (precision, difference from the median, distribution) after each exercise and carried out a global evaluation of the performance of each laboratory after at least one year of activity in the program, using parametric (regression analysis) and non-parametric (evaluation of the results against pre-determined acceptability criteria) statistical methods. After four years of activity, the results show that the adopted scheme and procedures proved adequate. The study of the regression parameters between the results of each laboratory and the medians of the results of all laboratories confirmed the validity of the graphic criterion adopted, also yielding specific information on the relative contribution of the different kinds of error (systematic, constant and/or proportional, and random) to the global error. Furthermore, the proportion of laboratories with "good level" performance (i.e., acceptable results in at least 80% of the examined samples) increased from approximately 30% in the first phase to approximately 50% in the fourth phase.
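    The regression approach described (each laboratory's results against the all-laboratory medians) separates error components: a slope different from 1 indicates proportional systematic error, a nonzero intercept indicates constant systematic error, and residual scatter reflects random error. A sketch with invented numbers:

```python
import numpy as np

# Hypothetical collaborative exercise: consensus medians vs. one
# laboratory's results (values are illustrative, e.g. ug/dL).
median = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
lab    = np.array([11.5, 22.0, 32.8, 43.0, 53.6])

slope, intercept = np.polyfit(median, lab, 1)
# slope != 1     -> proportional systematic error
# intercept != 0 -> constant systematic error
# residual scatter around the fit -> random error
print(round(slope, 3), round(intercept, 2))
```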

  16. System and Method for Isolation of Samples

    NASA Technical Reports Server (NTRS)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)

    2014-01-01

    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  17. Adaptive quality assurance of the product development process of additive manufacturing with modern 3D data evaluation methods

    NASA Astrophysics Data System (ADS)

    Kroll, Julia; Botta, Sabine; Breuninger, Jannis; Verl, Alexander

    2013-03-01

    In this paper, the possibilities of modern 3D data evaluation for metrology and quality assurance are presented for the plastic laser sintering process, an additive manufacturing process. We exploit the advantages of computed tomography and of 3D focus variation at all stages of the production process to increase the quality of the resulting products. With CT and 3D focus variation, modern quality assurance and metrology have state-of-the-art instruments that allow non-destructive, complete, and accurate measurement of parts. These metrological methods can therefore be used at many stages of the product development process for non-destructive quality control. In this work, studies and evaluations of 3D data, and the conclusions drawn for relevant quality criteria, are presented. Additionally, new developments and implementations for adapting the evaluation results for quality prediction, comparison, and correction are described, showing how adequate process control can be achieved with the help of modern 3D metrology techniques. The focus is on optimizing laser-sintered components with regard to their quality requirements so that functionality during production can be guaranteed and quantified.

  18. Evaluation of proposed hardness assurance method for bipolar linear circuits with enhanced low dose rate sensitivity (ELDRS)

    SciTech Connect

    Pease, R.L.; Gehlhausen, M.; Krieg, J.; Titus, J.; Turflinger, T.; Emily, D.; Cohn, L.

    1998-12-01

    Data are presented on several low dose rate sensitive bipolar linear circuits to evaluate a proposed hardness assurance method. The circuits include primarily operational amplifiers and voltage comparators with a variety of sensitive components and failure modes. The proposed method, presented in 1997, includes an option between a low dose rate test at 10 mrad(Si)/s and room temperature and a 100 °C elevated temperature irradiation test at a moderate dose rate. The results of this evaluation demonstrate that a 10 mrad(Si)/s test is able (in all but one case) to bound the worst case response within a factor of 2. For the moderate dose rate, 100 °C test the worst case response is within a factor of 3 for 8 of 11 circuits, and for some circuits the test overpredicts the low dose rate response. The irradiation bias used for these tests often represents a more degrading bias condition than would be encountered in a typical space system application.

  19. Transuranic Waste Characterization Quality Assurance Program Plan

    SciTech Connect

    1995-04-30

    This quality assurance plan identifies the data necessary, and techniques designed to attain the required quality, to meet the specific data quality objectives associated with the DOE Waste Isolation Pilot Plant (WIPP). This report specifies sampling, waste testing, and analytical methods for transuranic wastes.

  20. Ensuring trial validity by data quality assurance and diversification of monitoring methods.

    PubMed

    Baigent, Colin; Harrell, Frank E; Buyse, Marc; Emberson, Jonathan R; Altman, Douglas G

    2008-01-01

    Errors in the design, the conduct, the data collection process, and the analysis of a randomized trial have the potential to affect not only the safety of the patients in the trial, but also, through the introduction of bias, the safety of future patients. Trial monitoring, defined broadly to include methods of oversight which begin when the study is designed and continue until it is reported in a publication, has a role to play in eliminating such errors. On-site monitoring can be extremely inefficient for the identification of errors most likely to compromise patient safety or bias study results. However, a variety of other monitoring strategies offer alternatives to on-site monitoring. Each new trial should conduct a risk assessment to identify the optimal means of monitoring, taking into account the likely sources of error, their consequences for patients, the study's validity, and the available resources. Trial management committees should consider central statistical monitoring a key aspect of such monitoring. The systematic application of this approach would be likely to lead to tangible benefits, and resources that are currently wasted on inefficient on-site monitoring could be diverted to increasing trial sample sizes or conducting more trials.
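    Central statistical monitoring of the kind advocated above can be as simple as flagging sites whose summary statistics deviate from the all-site consensus. A toy sketch using a robust median/MAD rule (the threshold, data, and variable are illustrative assumptions, not from the article):

```python
import statistics

# Hypothetical per-site means of some key trial variable; site "D" deviates.
site_means = {"A": 5.1, "B": 4.9, "C": 5.3, "D": 9.8, "E": 5.0, "F": 4.8}

values = list(site_means.values())
centre = statistics.median(values)
# Median absolute deviation: a robust measure of between-site spread.
spread = statistics.median(abs(v - centre) for v in values)

# Flag sites far from the consensus relative to the typical spread.
flagged = [s for s, v in site_means.items() if abs(v - centre) > 5 * spread]
print(flagged)
```

    In practice such checks run over many variables at once (digit preference, variances, correlations), but the principle is the same: unusual sites stand out against the cross-site distribution.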

  1. Substantial improvements in performance indicators achieved in a peripheral blood mononuclear cell cryopreservation quality assurance program using single donor samples.

    PubMed

    Dyer, Wayne B; Pett, Sarah L; Sullivan, John S; Emery, Sean; Cooper, David A; Kelleher, Anthony D; Lloyd, Andrew; Lewin, Sharon R

    2007-01-01

    Storage of high-quality cryopreserved peripheral blood mononuclear cells (PBMC) is often a requirement for multicenter clinical trials and requires a reproducibly high standard of practice. A quality assurance program (QAP) was established to assess an Australia-wide network of laboratories in the provision of high-quality PBMC (determined by yield, viability, and function), using blood taken from single donors (human immunodeficiency virus [HIV] positive and HIV negative) and shipped to each site for preparation and cryopreservation of PBMC. The aim of the QAP was to provide laboratory accreditation for participation in clinical trials and cohort studies which require preparation and cryopreservation of PBMC and to assist all laboratories to prepare PBMC with a viability of >80% and yield of >50% following thawing. Many laboratories failed to reach this standard on the initial QAP round. Interventions to improve performance included telephone interviews with the staff at each laboratory, two annual wet workshops, and direct access to a senior scientist to discuss performance following each QAP round. Performance improved substantially in the majority of sites that initially failed the QAP (P = 0.002 and P = 0.001 for viability and yield, respectively). In a minority of laboratories, there was no improvement (n = 2), while a high standard was retained at the laboratories that commenced with adequate performance (n = 3). These findings demonstrate that simple interventions and monitoring of PBMC preparation and cryopreservation from multiple laboratories can significantly improve performance and contribute to maintenance of a network of laboratories accredited for quality PBMC fractionation and cryopreservation.
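    The QAP target described above (post-thaw viability above 80%, yield above 50%) is a simple per-preparation pass/fail rule; a trivial sketch with invented run values:

```python
def qap_pass(viability_pct, yield_pct):
    """QAP target from the abstract: post-thaw viability > 80% and yield > 50%."""
    return viability_pct > 80.0 and yield_pct > 50.0

# Hypothetical (viability %, yield %) results for three cryopreservation runs.
runs = [(92.0, 71.0), (85.0, 44.0), (78.0, 60.0)]
print([qap_pass(v, y) for v, y in runs])
```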

  2. Quality assurance program plan for radionuclide airborne emissions monitoring

    SciTech Connect

    Boom, R.J.

    1995-03-01

    This Quality Assurance Program Plan identifies quality assurance program requirements and addresses the various Westinghouse Hanford Company organizations and their particular responsibilities with regard to sample and data handling of airborne emissions. The Hanford Site radioactive airborne emissions requirements are defined in the National Emission Standards for Hazardous Air Pollutants (NESHAP), Code of Federal Regulations, Title 40, Part 61, Subpart H (EPA 1991a). Reporting of the emissions to the US Department of Energy is performed in compliance with the requirements of US Department of Energy, Richland Operations Office Order 5400.1, General Environmental Protection Program (DOE-RL 1988). This Quality Assurance Program Plan is prepared in accordance with the requirements of QAMS-004/80, Guidelines and Specifications for Preparing Quality Assurance Program Plans (EPA 1983). Title 40 CFR Part 61, Appendix B, Method 114, Quality Assurance Methods (EPA 1991b) specifies the quality assurance requirements and that a program plan should be prepared to meet the requirements of this regulation. This Quality Assurance Program Plan identifies NESHAP responsibilities and describes how the Westinghouse Hanford Company Environmental, Safety, Health, and Quality Assurance Division will verify that the methods are properly implemented.

  3. Log sampling methods and software for stand and landscape analyses.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...
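Of the sampling methods named above, the line-intersect method has a standard volume estimator (Van Wagner's formula, volume per unit area = pi^2 * sum(d^2) / (8L)). A minimal sketch, assuming log diameters are recorded in centimetres where each log crosses the transect; the function name and unit conversions are illustrative, not from the cited paper:

```python
import math

def lis_volume_per_hectare(diameters_cm, transect_length_m):
    """Van Wagner's line-intersect estimator of downed-log volume.

    diameters_cm: log diameters (cm) measured where each log crosses
    the transect line; transect_length_m: total transect length (m).
    Returns estimated volume in cubic metres per hectare.
    """
    sum_d2_m2 = sum((d / 100.0) ** 2 for d in diameters_cm)  # cm -> m
    v_per_m2 = math.pi ** 2 * sum_d2_m2 / (8.0 * transect_length_m)
    return v_per_m2 * 10_000  # 10,000 m^2 of ground per hectare
```

For example, a single 10 cm log intersected on a 100 m transect yields roughly 1.23 cubic metres per hectare.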

  4. Understanding and Evaluating Assurance Cases

    NASA Technical Reports Server (NTRS)

    Rushby, John; Xu, Xidong; Rangarajan, Murali; Weaver, Thomas L.

    2015-01-01

    Assurance cases are a method for providing assurance for a system by giving an argument to justify a claim about the system, based on evidence about its design, development, and tested behavior. In comparison with assurance based on guidelines or standards (which essentially specify only the evidence to be produced), the chief novelty in assurance cases is provision of an explicit argument. In principle, this can allow assurance cases to be more finely tuned to the specific circumstances of the system, and more agile than guidelines in adapting to new techniques and applications. The first part of this report (Sections 1-4) provides an introduction to assurance cases. Although this material should be accessible to all those with an interest in these topics, the examples focus on software for airborne systems, traditionally assured using the DO-178C guidelines and its predecessors. A brief survey of some existing assurance cases is provided in Section 5. The second part (Section 6) considers the criteria, methods, and tools that may be used to evaluate whether an assurance case provides sufficient confidence that a particular system or service is fit for its intended use. An assurance case cannot provide unequivocal "proof" for its claim, so much of the discussion focuses on the interpretation of such less-than-definitive arguments, and on methods to counteract confirmation bias and other fallibilities in human reasoning.

  5. Time efficient methods for scanning a fluorescent membrane with a fluorescent microscopic imager for the quality assurance of food

    NASA Astrophysics Data System (ADS)

    Lerm, Steffen; Holder, Silvio; Schellhorn, Mathias; Brückner, Peter; Linß, Gerhard

    2013-05-01

    An important part of the quality assurance of meat is the estimation of germs in the meat exudate. The kind and the number of germs in the meat determine the medical risk for the consumer. State-of-the-art analyses of meat are incubator test procedures. The main disadvantages of such incubator tests are their time consumption, the necessary equipment, and the need for specially skilled employees. These factors result in high inspection costs. For this reason a new method for quality assurance is needed, one that combines low detection limits with short analysis times. One approach to such a method is fluorescence microscopic imaging. The germs in the meat exudate are captured in special membranes by antigen-antibody reactions. The germ-typical signature can then be enhanced with fluorescent chemical markers instead of culturing the germs. Each fluorescent marker either binds to a free germ or runs off the membrane. An image processing system is used to count the fluorescent particles; each fluorescent spot should be a marker bound to a germ. Because germs are very small objects, the image processing system needs a high optical magnification, which in turn leads to a small field of view and a small depth of focus. For these reasons the whole area of the membrane has to be scanned in three dimensions. To minimize the time consumption, the optimal scan path has to be found. This optimization problem is constrained by characteristics of the hardware and is presented in this paper. The traversing range in each direction, the step width, the velocity, the shape of the inspection volume, and the field of view all influence the optimal path for scanning the membrane.

  6. Guidance for characterizing explosives contaminated soils: Sampling and selecting on-site analytical methods

    SciTech Connect

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.

    1996-09-01

    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling due to the detonation potential. Characterization of explosives-contaminated sites is particularly difficult due to the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist, including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of samples, and extracting larger samples. On-site analytical methods are essential to more economical and improved characterization. On-site methods might suffer in terms of precision and accuracy, but this is more than offset by the increased number of samples that can be run. While verification using a standard analytical procedure should be part of any quality assurance program, reducing the number of samples analyzed by the more expensive methods can result in significantly reduced costs. Often 70 to 90% of the soil samples analyzed during an explosives site investigation do not contain detectable levels of contamination. Two basic types of on-site analytical methods are in wide use for explosives in soil, colorimetric and immunoassay. Colorimetric methods generally detect broad classes of compounds such as nitroaromatics or nitramines, while immunoassay methods are more compound specific. Since TNT or RDX is usually present in explosives-contaminated soils, the use of procedures designed to detect only these or similar compounds can be very effective.

  7. Relationship of Indoor, Outdoor and Personal Air (RIOPA) study: study design, methods and quality assurance/control results.

    PubMed

    Weisel, Clifford P; Zhang, Junfeng; Turpin, Barbara J; Morandi, Maria T; Colome, Steven; Stock, Thomas H; Spektor, Dalia M; Korn, Leo; Winer, Arthur; Alimokhtari, Shahnaz; Kwon, Jaymin; Mohan, Krishnan; Harrington, Robert; Giovanetti, Robert; Cui, William; Afshar, Masoud; Maberti, Silvia; Shendell, Derek

    2005-03-01

    The Relationship of Indoor, Outdoor and Personal Air (RIOPA) Study was undertaken to evaluate the contribution of outdoor sources of air toxics, as defined in the 1990 Clean Air Act Amendments, to indoor concentrations and personal exposures. The concentrations of 18 volatile organic compounds (VOCs), 17 carbonyl compounds, and fine particulate matter mass (PM2.5) were measured using 48-h outdoor, indoor, and personal air samples collected simultaneously. PM2.5 mass, as well as several component species (elemental carbon, organic carbon, polycyclic aromatic hydrocarbons, and elements), was also measured; only PM2.5 mass is reported here. Questionnaires were administered to characterize homes, neighborhoods, and personal activities that might affect exposures. The air exchange rate was also measured in each home. Homes in close proximity (<0.5 km) to sources of air toxics were preferentially (2:1) selected for sampling. Approximately 100 non-smoking households in each of Elizabeth, NJ, Houston, TX, and Los Angeles, CA were sampled (100, 105, and 105, respectively), with second visits performed at 84, 93, and 81 homes, respectively. VOC samples were collected at all homes, carbonyls at 90%, and PM2.5 at 60% of the homes. Personal samples were collected from nonsmoking adults and a portion of children living in the target homes. This manuscript provides the RIOPA study design and quality control and assurance data. The results from the RIOPA study can potentially provide information on the influence of ambient sources on indoor air concentrations and exposure for many air toxics and will furnish an opportunity to evaluate exposure models for these compounds.

  8. Statistical sampling methods for soils monitoring

    Treesearch

    Ann M. Abbott

    2010-01-01

    Development of the best sampling design to answer a research question should be an interactive venture between the land manager or researcher and statisticians, and is the result of answering various questions. A series of questions that can be asked to guide the researcher in making decisions that will arrive at an effective sampling plan are described, and a case...

  9. Quality assurance for in vitro alternative test methods: quality control issues in test kit production.

    PubMed

    Rispin, Amy; Harbell, John W; Klausner, Mitchell; Jordan, Foster T; Coecke, Sandra; Gupta, Kailash; Stitzel, Katherine

    2004-06-01

    In vitro toxicology methods are being adopted by regulatory agencies worldwide. Many of these methods have been validated by using proprietary materials, often in the form of test kits. Guidelines for the use of Good Laboratory Practice methods for in vitro methods have been proposed. However, users of the data from these methods also need to be reassured that the proprietary materials and the test kits will provide consistent, good quality data over time, not just during the validation process. This paper presents an overview of the methods currently used by representatives of kit manufacturers and contract testing laboratories to ensure that the results from methods that utilise test kits are reproducible over time and across different types of test materials. This information will be valuable as a basis for future discussion on the need for formalised oversight of the quality of these materials.

  10. SU-E-T-438: Commissioning of An In-Vivo Quality Assurance Method Using the Electronic Portal Imaging Device

    SciTech Connect

    Morin, O; Held, M; Pouliot, J

    2014-06-01

    Purpose: Patient-specific pre-treatment quality assurance (QA) using arrays of detectors or film has been the standard approach to assure that the correct treatment is delivered to the patient. This QA approach is expensive and labor intensive, and it does not guarantee or document that all remaining fractions were treated properly. The purpose of this abstract is to commission and evaluate the performance of a commercially available in-vivo QA software package that uses the electronic portal imaging device (EPID) to record the daily treatments. Methods: The EPIgray V2.0.2 platform (Dosisoft), whose machine model predicts dose by comparing ratios of tissue-maximum ratio (TMR) with the EPID signal, was commissioned for an Artiste (Siemens Oncology Care Systems) and a Truebeam (Varian Medical Systems) linear accelerator following the vendor's instructions. The systems were then tested on three different phantoms (a homogeneous stack of solid water, an anthropomorphic head, and a pelvis) and on a library of patient cases. Simple and complex fields were delivered at different exposures and for different gantry angles. The effects of table attenuation and EPID sagging were evaluated. Gamma analysis was used to compare the measured dose to the predicted dose for complex clinical IMRT cases. Results: Commissioning of the EPIgray system for two photon energies took 8 hours. The difference between the planned dose and the dose measured with EPIgray was better than 3% for all phantom scenarios tested. Preliminary results on patients demonstrate that an accuracy of 5% is achievable in high-dose regions for both 3DCRT and IMRT. Large discrepancies (>5%) were observed near metallic structures or air cavities and in low-dose areas. Flat-panel sagging was visible and accounted for in the EPIgray model. Conclusion: The accuracy achieved by EPIgray is sufficient to document the safe delivery of complex IMRT treatments. Future work will evaluate EPIgray for VMAT and high-dose-rate deliveries. This work is supported by Dosisoft, Cachan, France.
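The gamma analysis used in the Results above can be illustrated with a minimal 1D sketch under global dose normalization; the 3%/3 mm defaults and all function names below are illustrative assumptions, not EPIgray's implementation:

```python
import math

def gamma_1d(ref, meas, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Per-point 1D gamma index between a reference (predicted) and a
    measured dose profile, using global dose normalization.

    For each reference point, search all measured points for the
    minimum combined dose-difference / distance-to-agreement metric.
    """
    d_max = max(ref)  # global normalization dose
    gammas = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, dm in enumerate(meas):
            dose_term = (dm - dr) / (dose_tol * d_max)
            dist_term = (j - i) * spacing_mm / dist_tol_mm
            best = min(best, math.hypot(dose_term, dist_term))
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Fraction of points with gamma <= 1, the usual pass criterion."""
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

Identical profiles give gamma = 0 everywhere and a 100% pass rate; clinical tools add interpolation and 2D/3D search, omitted here for brevity.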

  11. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  12. Method Description, Quality Assurance, Environmental Data, and other Information for Analysis of Pharmaceuticals in Wastewater-Treatment-Plant Effluents, Streamwater, and Reservoirs, 2004-2009

    USGS Publications Warehouse

    Phillips, Patrick J.; Smith, Steven G.; Kolpin, Dana W.; Zaugg, Steven D.; Buxton, Herbert T.; Furlong, Edward T.

    2010-01-01

    Wastewater-treatment-plant (WWTP) effluents are a demonstrated source of pharmaceuticals to the environment. During 2004-09, a study was conducted to identify pharmaceutical compounds in effluents from WWTPs (including two that receive substantial discharges from pharmaceutical formulation facilities), streamwater, and reservoirs. The methods used to determine and quantify concentrations of seven pharmaceuticals are described. In addition, the report includes information on pharmaceuticals formulated or potentially formulated at the two pharmaceutical formulation facilities that provide substantial discharge to two of the WWTPs, and potential limitations to these data are discussed. The analytical methods used to provide data on the seven pharmaceuticals (including opioids, muscle relaxants, and other pharmaceuticals) in filtered water samples also are described. Data are provided on method performance, including spike data, method detection limit results, and an estimation of precision. Quality-assurance data for sample collection and handling are included. Quantitative data are presented for the seven pharmaceuticals in water samples collected at WWTP discharge points, from streams, and at reservoirs. Occurrence data also are provided for 19 pharmaceuticals that were qualitatively identified. Flow data at selected WWTPs and streams are presented. During 2004-09, 35 to 38 effluent samples were collected from each of three WWTPs in New York and analyzed for seven pharmaceuticals. Two WWTPs (NY2 and NY3) receive substantial inflows (greater than 20 percent of plant flow) from pharmaceutical formulation facilities (PFF) and one (NY1) receives no PFF flow. Samples of effluents from 23 WWTPs across the United States were analyzed once for these pharmaceuticals as part of a national survey. Maximum pharmaceutical effluent concentrations for the national survey and NY1 effluent samples were generally less than 1 ug/L. Four pharmaceuticals (methadone, oxycodone

  13. Fast identification of microplastics in complex environmental samples by a thermal degradation method.

    PubMed

    Dümichen, Erik; Eisentraut, Paul; Bannick, Claus Gerhard; Barthel, Anne-Kathrin; Senz, Rainer; Braun, Ulrike

    2017-05-01

    In order to determine the relevance of microplastic particles in various environmental media, comprehensive investigations are needed. However, no analytical method exists for fast identification and quantification. At present, optical spectroscopy methods like IR and Raman imaging are used. Due to their time-consuming procedures and uncertain extrapolation, reliable monitoring is difficult. For polymer analysis, pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS) is a standard method; however, due to a limited sample amount of about 0.5 mg, it is not suited to analysis of complex sample mixtures like environmental samples. Therefore, we developed a new thermoanalytical method as a first step toward identifying microplastics in environmental samples. A sample amount of about 20 mg, which assures the homogeneity of the sample, is subjected to complete thermal decomposition. The specific degradation products of the respective polymer are adsorbed on a solid-phase adsorber and subsequently analyzed by thermal desorption gas chromatography mass spectrometry. For certain identification, the specific degradation products for the respective polymer were selected first. Afterwards, real environmental samples from aquatic (three different rivers) and terrestrial (biogas plant) systems were screened for microplastics. Mainly polypropylene (PP), polyethylene (PE), and polystyrene (PS) were identified in the samples from the biogas plant, and PE and PS in those from the rivers. However, this was only the first step, and quantification measurements will follow.

  14. Use of Lot quality assurance sampling surveys to evaluate community health worker performance in rural Zambia: a case of Luangwa district.

    PubMed

    Mwanza, Moses; Zulu, Japhet; Topp, Stephanie M; Musonda, Patrick; Mutale, Wilbroad; Chilengi, Roma

    2017-04-17

    The Better Health Outcomes through Mentoring and Assessment (BHOMA) project is a cluster randomized controlled trial aimed at reducing age-standardized mortality rates in three rural districts through involvement of Community Health Workers (CHWs), Traditional Birth Attendants (TBAs), and Neighborhood Health Committees (NHCs). CHWs conduct quarterly surveys on all households using a questionnaire that captures key health events occurring within their catchment population. In order to validate contact with households, we utilize the Lot Quality Assurance Sampling (LQAS) methodology. In this study, we report experiences of applying the LQAS approach to monitor performance of CHWs in Luangwa District. Between April 2011 and December 2013, seven health facilities in Luangwa district were enrolled into the BHOMA project. The health facility catchment areas were divided into 33 geographic zones. Quality assurance was performed each quarter by randomly selecting zones representing about 90% of enrolled catchment areas, from which 19 households per zone were also randomly identified. The surveys were conducted by CHW supervisors who had been trained on using the LQAS questionnaire. Information collected included household identity number (ID), whether the CHW visited the household, duration of the most recent visit, and what health information was discussed during the CHW visit. The threshold for success was set at 75% household outreach by CHWs in each zone. There are 4,616 total households in the 33 zones. This yielded a target of 32,212 household visits by community health workers during the 7 survey rounds. Based on the set cutoff point for passing the surveys (at least 75% households confirmed as visited), only one team of CHWs at Luangwa high school failed to reach the target during round 1 of the surveys; all the teams otherwise registered successful visits in all the surveys. We have employed the LQAS methodology for assurance that quarterly surveys were
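The LQAS design sketched in this abstract (19 sampled households per zone, a 75% coverage target) reduces to a binomial decision rule. A minimal sketch; the decision threshold of 13 is an assumed illustration, not the study's published cutoff:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def zone_passes(confirmed_visits, n=19, decision_threshold=13):
    """Classify a zone as passing if at least `decision_threshold` of
    the n sampled households confirm a CHW visit (threshold assumed)."""
    return confirmed_visits >= decision_threshold

# Risk of wrongly failing a zone whose true coverage is exactly the
# 75% target, under the assumed threshold of 13:
alpha = binom_cdf(12, 19, 0.75)
```

Plotting alpha against candidate thresholds is how an LQAS sampling plan's operating characteristic curve is typically chosen.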

  15. Quality-assurance audits for the ARB (Air Resources Board)-sponsored Carbonaceous Species Methods Comparison Study at Citrus College, Glendora, California, August 12-21, 1986. Final report

    SciTech Connect

    Countess, R.J.

    1987-09-21

    A series of quality-assurance tasks were performed in support of the Air Resources Board-sponsored Carbonaceous Species Methods Comparison Study, conducted at Citrus College in Glendora, CA, in August 1986. This report summarizes the quality assurance efforts for the study, which included: (1) flow rate audits for all samplers deployed in the nine-day field study; (2) preparation and supply of carbonaceous reference materials for an interlaboratory round-robin study; and (3) analysis of the reference materials, as well as 20% of the ambient particulate samples collected by each of the study participants, for both organic and elemental carbon. The final task was performed to assess the influence of the samplers upon collected particulate carbon.

  16. U.S. Geological Survey nutrient preservation experiment; nutrient concentration data for surface-, ground-, and municipal-supply water samples and quality-assurance samples

    USGS Publications Warehouse

    Patton, Charles J.; Truitt, Earl P.

    1995-01-01

    This report is a compilation of analytical results from a study conducted at the U.S. Geological Survey National Water Quality Laboratory (NWQL) in 1992 to assess the effectiveness of three field treatment protocols to stabilize nutrient concentrations in water samples stored for about 1 month at 4°C. Field treatments tested were chilling; adjusting sample pH to less than 2 with sulfuric acid and chilling; and adding 52 milligrams of mercury(II) chloride per liter of sample and chilling. Field treatments of samples collected for determination of ammonium, nitrate plus nitrite, nitrite, dissolved Kjeldahl nitrogen, orthophosphate, and dissolved phosphorus included 0.45-micrometer membrane filtration. Only total Kjeldahl nitrogen and total phosphorus were determined in unfiltered samples. Data reported here pertain to water samples collected in April and May 1992 from 15 sites within the continental United States. Also included in this report are analytical results for nutrient concentrations in synthetic reference samples that were analyzed concurrently with real samples.

  17. Quality assurance and quality control in light stable isotope laboratories: a case study of Rio Grande, Texas, water samples.

    PubMed

    Coplen, Tyler B; Qi, Haiping

    2009-06-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition, within analytical uncertainty, for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) preparing, for each medium to be analysed, two local reference materials substantially different in isotopic composition, to encompass the range in isotopic composition expected in the laboratory, and calibrating them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing local laboratory isotopic reference materials among sample unknowns daily (internationally distributed isotopic reference materials can be ordered at three-year intervals and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material); this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, and (8) providing an audit trail in the laboratory for analytical results; this trail commonly will be in electronic format and might include a laboratory
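Step (6), normalising raw measurements to two local reference materials that have themselves been calibrated to international scales, is a two-point linear mapping. A minimal sketch; the function name and the delta values in the test are illustrative, not from the case study:

```python
def normalize_delta(delta_raw, raw_lo, raw_hi, true_lo, true_hi):
    """Two-point linear normalization of a raw delta measurement onto
    the international scale.

    raw_lo/raw_hi: measured values of the two local reference
    materials; true_lo/true_hi: their calibrated (accepted) values.
    """
    slope = (true_hi - true_lo) / (raw_hi - raw_lo)
    return true_lo + (delta_raw - raw_lo) * slope
```

The two reference materials should bracket the isotopic range of the unknowns so the mapping interpolates rather than extrapolates.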

  18. Quality assurance and quality control in light stable isotope laboratories: A case study of Rio Grande, Texas, water samples

    USGS Publications Warehouse

    Coplen, T.B.; Qi, H.

    2009-01-01

    New isotope laboratories can achieve the goal of reporting the same isotopic composition within analytical uncertainty for the same material analysed decades apart by (1) writing their own acceptance testing procedures and putting them into their mass spectrometric or laser-based isotope-ratio equipment procurement contract, (2) requiring a manufacturer to demonstrate acceptable performance using all sample ports provided with the instrumentation, (3) for each medium to be analysed, prepare two local reference materials substantially different in isotopic composition to encompass the range in isotopic composition expected in the laboratory and calibrated them with isotopic reference materials available from the International Atomic Energy Agency (IAEA) or the US National Institute of Standards and Technology (NIST), (4) using the optimum storage containers (for water samples, sealing in glass ampoules that are sterilised after sealing is satisfactory), (5) interspersing among sample unknowns local laboratory isotopic reference materials daily (internationally distributed isotopic reference materials can be ordered at three-year intervals, and can be used for elemental analyser analyses and other analyses that consume less than 1 mg of material) - this process applies to H, C, N, O, and S isotope ratios, (6) calculating isotopic compositions of unknowns by normalising isotopic data to that of local reference materials, which have been calibrated to internationally distributed isotopic reference materials, (7) reporting results on scales normalised to internationally distributed isotopic reference materials (where they are available) and providing to sample submitters the isotopic compositions of internationally distributed isotopic reference materials of the same substance had they been analysed with unknowns, (8) providing an audit trail in the laboratory for analytical results - this trail commonly will be in electronic format and might include a laboratory

  19. Quality Assurance using Outlier Detection on an Automatic Segmentation Method for the Cerebellar Peduncles.

    PubMed

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H; Prince, Jerry L

    2016-02-27

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods for the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method, supervised classification, was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers were trained using the designed features and evaluated with leave-one-out cross-validation: linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC). Results show that LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.
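The box-whisker screen described here flags cases falling outside Tukey's fences. A minimal sketch of that univariate rule; the k = 1.5 multiplier is the conventional default, assumed here rather than taken from the paper:

```python
def iqr_outliers(values, k=1.5):
    """Flag values outside Tukey's box-whisker fences
    [Q1 - k*IQR, Q3 + k*IQR], where IQR = Q3 - Q1."""
    xs = sorted(values)

    def quantile(q):
        # Linear interpolation between adjacent order statistics.
        pos = q * (len(xs) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(xs) - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo_fence or v > hi_fence]
```

Applied per feature, this yields the univariate non-parametric failure screen; the supervised classifiers in the second method would instead consume all features jointly.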

  20. Quality Assurance using Outlier Detection on an Automatic Segmentation Method for the Cerebellar Peduncles

    PubMed Central

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2017-01-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods. PMID:28203039

  1. Quality assurance using outlier detection on an automatic segmentation method for the cerebellar peduncles

    NASA Astrophysics Data System (ADS)

    Li, Ke; Ye, Chuyang; Yang, Zhen; Carass, Aaron; Ying, Sarah H.; Prince, Jerry L.

    2016-03-01

    Cerebellar peduncles (CPs) are white matter tracts connecting the cerebellum to other brain regions. Automatic segmentation methods of the CPs have been proposed for studying their structure and function. Usually the performance of these methods is evaluated by comparing segmentation results with manual delineations (ground truth). However, when a segmentation method is run on new data (for which no ground truth exists) it is highly desirable to efficiently detect and assess algorithm failures so that these cases can be excluded from scientific analysis. In this work, two outlier detection methods aimed at assessing the performance of an automatic CP segmentation algorithm are presented. The first one is a univariate non-parametric method using a box-whisker plot. We first categorize automatic segmentation results of a dataset of diffusion tensor imaging (DTI) scans from 48 subjects as either a success or a failure. We then design three groups of features from the image data of nine categorized failures for failure detection. Results show that most of these features can efficiently detect the true failures. The second method—supervised classification—was employed on a larger DTI dataset of 249 manually categorized subjects. Four classifiers—linear discriminant analysis (LDA), logistic regression (LR), support vector machine (SVM), and random forest classification (RFC)—were trained using the designed features and evaluated using a leave-one-out cross validation. Results show that the LR performs worst among the four classifiers and the other three perform comparably, which demonstrates the feasibility of automatically detecting segmentation failures using classification methods.

  2. System and method for extracting a sample from a surface

    DOEpatents

    Van Berkel, Gary; Covey, Thomas

    2015-06-23

    A system and method is disclosed for extracting a sample from a sample surface. A sample is provided and a sample surface receives the sample which is deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid on the sample, the liquid dissolving the sample to form a dissolved sample material, and the one or more devices are configured to extract the dissolved sample material from the sample surface.

  3. ANALYTICAL METHODS AND QUALITY ASSURANCE CRITERIA FOR LC/ES/MS DETERMINATION OF PFOS IN FISH

    EPA Science Inventory

    PFOS, perfluorooctanesulfonate, has recently received much attention from environmental researchers. Previous analytical methods were based upon complexing with a strong ion-pairing reagent and extraction into MTBE. Detection was done on a concentrate using negative ion LC/ES/MS/...

  4. In-Place Nondestructive Evaluation Methods for Quality Assurance of Building Materials.

    DTIC Science & Technology

    1982-03-01

    Nondestructive Inspection and Quality Control, 8th Edition (American Society of Metals, 1976); F. W. Dunn, "Magnetic Particle Inspection Fundamentals," Lesson 1, Fundamentals of...

  5. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Treesearch

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  6. Evaporation from weighing precipitation gauges: impacts on automated gauge measurements and quality assurance methods

    NASA Astrophysics Data System (ADS)

    Leeper, R. D.; Kochendorfer, J.

    2015-06-01

    Evaporation from a precipitation gauge can cause errors in the amount of measured precipitation. For automated weighing-bucket gauges, the World Meteorological Organization (WMO) suggests the use of evaporative suppressants and frequent observations to limit these biases. However, the use of evaporation suppressants is not always feasible due to environmental hazards and the added cost of maintenance, transport, and disposal of the gauge additive. In addition, research has suggested that evaporation prior to precipitation may affect precipitation measurements from auto-recording gauges operating at sub-hourly frequencies. For further evaluation, a field campaign was conducted to monitor evaporation and its impacts on the quality of precipitation measurements from gauges used at U.S. Climate Reference Network (USCRN) stations. Two Geonor gauges were collocated, with one gauge using an evaporative suppressant (referred to as Geonor-NonEvap) and the other with no suppressant (referred to as Geonor-Evap) to evaluate evaporative losses and evaporation biases on precipitation measurements. From June to August, evaporative losses from the Geonor-Evap gauge exceeded accumulated precipitation, with an average loss of 0.12 mm h-1. The impact of evaporation on precipitation measurements was sensitive to the choice of calculation method. In general, the pairwise method that utilized a longer time series to smooth out sensor noise was more sensitive to gauge evaporation (-4.6% bias with respect to control) than the weighted-average method that calculated depth change over a smaller window (<+1% bias). These results indicate that while climate and gauge design affect gauge evaporation rates, computational methods also influence the magnitude of evaporation biases on precipitation measurements. This study can be used to advance quality assurance (QA) techniques used in other automated networks to mitigate the impact of evaporation biases on precipitation measurements.

  7. Evaporation from weighing precipitation gauges: impacts on automated gauge measurements and quality assurance methods

    NASA Astrophysics Data System (ADS)

    Leeper, R. D.; Kochendorfer, J.

    2014-12-01

    The effects of evaporation on precipitation measurements have been understood to bias total precipitation lower. For automated weighing-bucket gauges, the World Meteorological Organization (WMO) suggests the use of evaporative suppressants with frequent observations. However, the use of evaporation suppressants is not always feasible due to environmental hazards and the added cost of maintenance, transport, and disposal of the gauge additive. In addition, research has suggested that evaporation prior to precipitation may affect precipitation measurements from auto-recording gauges operating at sub-hourly frequencies. For further evaluation, a field campaign was conducted to monitor evaporation and its impacts on the quality of precipitation measurements from gauges used at US Climate Reference Network (USCRN) stations. Collocated Geonor gauges with (nonEvap) and without (evap) an evaporative suppressant were compared to evaluate evaporative losses and evaporation biases on precipitation measurements. From June to August, evaporative losses from the evap gauge exceeded accumulated precipitation, with an average loss of 0.12 mm h-1. However, the impact of evaporation on precipitation measurements was sensitive to calculation methods. In general, methods that utilized a longer time series to smooth out sensor noise were more sensitive to gauge evaporation (-4.6% bias with respect to control) than methods computing depth change without smoothing (< +1% bias). These results indicate that while climate and gauge design affect gauge evaporation rates, computational methods can influence the magnitude of evaporation bias on precipitation measurements. It is hoped this study will advance QA techniques that mitigate the impact of evaporation biases on precipitation measurements from other automated networks.
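The calculation-method sensitivity described in the two abstracts above can be illustrated with a toy bucket-depth record. This is only a sketch under invented numbers, not the USCRN algorithms: computing precipitation as the positive depth change over a longer differencing window folds more between-event evaporation into the total than a short window does.

```python
# Synthetic bucket-depth record (mm), one sample per 5 min:
# steady evaporation of 0.01 mm per interval, plus a single 2.0 mm rain event.
EVAP = 0.01
depth = [10.0]
for t in range(1, 13):
    d = depth[-1] - EVAP
    if t == 7:
        d += 2.0  # the rain event
    depth.append(d)

def precip_windowed(series, window):
    """Sum positive depth changes computed over `window`-sample differences."""
    total = 0.0
    for i in range(window, len(series), window):
        delta = series[i] - series[i - window]
        if delta > 0:
            total += delta
    return total

short = precip_windowed(depth, 1)  # one evaporation interval folded in
long_ = precip_windowed(depth, 6)  # six evaporation intervals folded in
print(round(short, 3), round(long_, 3))  # -> 1.99 1.94 (true total: 2.0 mm)
```

The longer window under-reports the 2.0 mm event by more, mirroring the abstracts' finding that the smoothing method was more sensitive to gauge evaporation.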

  8. Sci—Sat AM: Stereo — 05: The Development of Quality Assurance Methods for Trajectory based Cranial SRS Treatments

    SciTech Connect

    Wilson, B; Duzenli, C; Gete, E; Teke, T

    2014-08-15

    The goal of this work was to develop and validate non-planar linac beam trajectories defined by the dynamic motion of the gantry, couch, jaws, collimator and MLCs. This was conducted on the Varian TrueBeam linac by taking advantage of the linac's advanced control features in a non-clinical mode (termed Developer Mode). In this work, we present quality assurance methods that we have developed to test for the positional and temporal accuracy of the linac's moving components. The first QA method focuses on the coordination of couch and gantry. For this test, we developed a cylindrical phantom which has a film insert. Using this phantom we delivered a plan with dynamic motion of the couch and gantry. We found the mean absolute deviation of the entrance position from its expected value to be 0.5 mm, with a standard deviation of 0.5 mm. This was within the tolerances set by the machine's mechanical accuracy and the setup accuracy of the phantom. We also present an altered picket fence test which has added dynamic and simultaneous rotations of the couch and the collimator. While the test was shown to be sensitive enough to discern errors of 1° and greater, we were unable to identify any errors in the coordination of the linac's collimator and couch. When operating under normal conditions, the Varian TrueBeam linac was able to pass both tests and is within tolerances acceptable for complex trajectory based treatments.

  9. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or...

  10. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official...

  11. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises.

    PubMed

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B; Pereira, Nuno Sousa; Behrman, Jere

    2012-05-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization's Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples' statistical properties.
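The role of accurate probability weights highlighted above can be sketched with a Horvitz-Thompson estimate on invented numbers (this is not the paper's simulation): dividing each sampled value by its true inclusion probability roughly recovers the population total, while substituting equal weights biases the estimate when the sample over-represents large units.

```python
# Illustrative only: a tiny "population" of MSEs (e.g. employees per firm),
# unequal inclusion probabilities favoring larger firms, and one drawn sample.
population = [4, 1, 3, 2, 5, 1, 2, 6, 1, 3]
incl_prob = [0.8, 0.2, 0.6, 0.4, 0.9, 0.2, 0.4, 0.9, 0.2, 0.6]
sampled = [0, 2, 4, 7, 9]  # indices actually drawn

def ht_total(values, probs, idx):
    """Horvitz-Thompson estimator: sum of y_i / pi_i over the sample."""
    return sum(values[i] / probs[i] for i in idx)

true_total = sum(population)  # 28
ht = ht_total(population, incl_prob, sampled)
# "Inaccurate weights": pretend every unit had the same inclusion probability.
naive = ht_total(population, [len(sampled) / len(population)] * len(population), sampled)
print(true_total, round(ht, 2), naive)  # -> 28 27.22 42.0
```

With the correct weights the estimate (27.22) sits near the true total of 28; the equal-weight estimate (42.0) is badly inflated because the sample happened to favor large firms, which is the kind of distortion the EPI-style shortcuts risk.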

  12. Analytical methods, quality assurance and quality control used in the Greenland AMAP programme.

    PubMed

    Asmund, G; Cleemann, M

    2000-01-17

    The majority of analytical results in the Greenland AMAP (Arctic Monitoring and Assessment Programme) have been produced by laboratories that participate regularly in performance studies. This makes it possible to judge the quality of the results based on objective measurements made by independent assessors. AMAP laboratories participated while analysing the AMAP samples in the QUASIMEME laboratory performance study programme, in the 'Interlaboratory Comparison Program' organised by Le Centre de Toxicologie du Québec, in a toxaphene intercomparison study organised by The Food Research Division of Health Canada, and in an International Atomic Energy Agency Intercomparison exercise. The relative errors of the trace analyses, i.e. the relative deviation of the result obtained by the AMAP laboratory from the assigned value, are in most cases less than the 25% which is regarded as acceptable by QUASIMEME. Usually the errors, especially for trace elements, are less than 12.5%, while errors for trace organics below 1 microgram kg-1 may rise to 50% or more. This study covers the period 1993 to 1998 for trace elements and one or more years from the period 1994-1996 for trace organics.

  13. A method for sampling waste corn

    USGS Publications Warehouse

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability can be essential to studies concerning food density and food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) are discussed.

  14. Remedial investigation sampling and analysis plan for J-Field, Aberdeen Proving Ground, Maryland: Volume 2, Quality Assurance Project Plan

    SciTech Connect

    Prasad, S.; Martino, L.; Patton, T.

    1995-03-01

    J-Field encompasses about 460 acres at the southern end of the Gunpowder Neck Peninsula in the Edgewood Area of APG (Figure 2.1). Since World War II, the Edgewood Area of APG has been used to develop, manufacture, test, and destroy chemical agents and munitions. These materials were destroyed at J-Field by open burning and open detonation (OB/OD). For the purposes of this project, J-Field has been divided into eight geographic areas or facilities that are designated as areas of concern (AOCs): the Toxic Burning Pits (TBP), the White Phosphorus Burning Pits (WPP), the Riot Control Burning Pit (RCP), the Robins Point Demolition Ground (RPDG), the Robins Point Tower Site (RPTS), the South Beach Demolition Ground (SBDG), the South Beach Trench (SBT), and the Prototype Building (PB). The scope of this project is to conduct a remedial investigation/feasibility study (RI/FS) and ecological risk assessment to evaluate the impacts of past disposal activities at the J-Field site. Sampling for the RI will be carried out in three stages (I, II, and III) as detailed in the FSP. A phased approach will be used for the J-Field ecological risk assessment (ERA).

  15. SU-E-J-126: Respiratory Gating Quality Assurance: A Simple Method to Achieve Millisecond Temporal Resolution

    SciTech Connect

    McCabe, B; Wiersma, R

    2014-06-01

    Purpose: Low temporal latency between a gating on/off signal and a linac beam on/off during respiratory gating is critical for patient safety. Although a measurement of temporal lag is recommended by AAPM Task Group 142 for commissioning and annual quality assurance, there currently exists no published method. Here we describe a simple, inexpensive, and reliable method to precisely measure gating lag at millisecond resolutions. Methods: A Varian Real-time Position Management™ (RPM) gating simulator with rotating disk was modified with a resistive flex sensor (Spectra Symbol) attached to the gating box platform. A photon diode was placed at machine isocenter. Output signals of the flex sensor and diode were monitored with a multichannel oscilloscope (Tektronix™ DPO3014). Qualitative inspection of the gating window/beam-on synchronicity was made by setting the linac to beam on/off at end-expiration, and the oscilloscope's temporal window to 100 ms to visually examine if the on/off timing was within the recommended 100-ms tolerance. Quantitative measurements were made by saving the signal traces and analyzing in MATLAB™. The on and off of the beam signal were located and compared to the expected gating window (e.g. 40% to 60%). Four gating cycles were measured and compared. Results: On a Varian TrueBeam™ STx linac with RPM gating software, the average difference in synchronicity at beam on and off for four cycles was 14 ms (3 to 30 ms) and 11 ms (2 to 32 ms), respectively. For a Varian Clinac™ 21EX the average difference at beam on and off was 127 ms (122 to 133 ms) and 46 ms (42 to 49 ms), respectively. The uncertainty in the synchrony difference was estimated at ±6 ms. Conclusion: This new gating QA method is easy to implement and allows for fast qualitative inspection and quantitative measurements for commissioning and TG-142 annual QA measurements.
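The trace-analysis step (locating the beam on/off edges in the saved diode signal and comparing them to the expected gating window) can be sketched as below. The trace, threshold, and window values are hypothetical, and the paper's actual analysis was done in MATLAB; this only shows the threshold-crossing idea.

```python
FS = 1000  # hypothetical sampling rate: 1000 samples/s, i.e. 1 ms resolution

def edge_times_ms(trace, threshold, fs=FS):
    """Return (on_ms, off_ms) for the first rising and falling threshold crossings."""
    on = next(i for i, v in enumerate(trace) if v >= threshold)
    off = next(i for i in range(on, len(trace)) if trace[i] < threshold)
    return on * 1000 // fs, off * 1000 // fs

# Invented diode trace: dark for 120 ms, beam on for 200 ms, dark again.
trace = [0.0] * 120 + [1.0] * 200 + [0.0] * 80
on_ms, off_ms = edge_times_ms(trace, 0.5)

gate_on_ms, gate_off_ms = 110, 310  # expected gating window (hypothetical)
print(on_ms - gate_on_ms, off_ms - gate_off_ms)  # -> 10 10 (latencies in ms)
```

Averaging such per-cycle latencies over several gating cycles gives the synchronicity differences reported in the Results.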

  16. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    PubMed

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-03

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods, when possible, and alternate methodologies only after careful consideration with method validation studies; (2) lot control and prescreening of sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping conditions to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.
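A minimal sketch of what such a CSV electronic data deliverable (EDD) might look like, one row per sample result so it stays machine-readable for QA checks. The field names and values here are hypothetical, not an NCS specification:

```python
import csv
import io

# Invented sample results; a real EDD would follow the study's field dictionary.
rows = [
    {"sample_id": "TW-001", "analyte": "lead", "result": "0.8", "units": "ug/L"},
    {"sample_id": "TW-002", "analyte": "lead", "result": "1.2", "units": "ug/L"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["sample_id", "analyte", "result", "units"])
writer.writeheader()
writer.writerows(rows)
edd = buf.getvalue()
print(edd.splitlines()[0])  # -> sample_id,analyte,result,units
```

A fixed header row like this is what lets downstream QA scripts validate each deliverable automatically, which is the point of requiring a "useable electronic format" alongside the PDF data package.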

  17. Printed Circuit Board Quality Assurance

    NASA Technical Reports Server (NTRS)

    Sood, Bhanu

    2016-01-01

    PCB Assurance Summary: PCB assurance activities are informed by risk in the context of the Project. Lessons are being applied across Projects for continuous improvement. Newer component technologies and smaller/high-pitch devices lead to tighter and more demanding PCB designs, identifying new research areas: new materials, designs, structures, and test methods.

  18. Exploration and Sampling Methods for Borrow Areas

    DTIC Science & Technology

    1990-12-01

    environmentally sound coastal project designs. The current state of knowledge regarding geological indicators of subaqueous... This report discusses the equipment and techniques that are used in coastal marine and lacustrine environments to locate and characterize potential... because the fill material used was unstable in the beach environment and rapidly washed away. More recently, methods for specifying fill material based

  19. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  20. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontal configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This method for loading, when taken in conjunction with horizontal microchannels, allows much reduced sample volumes and a means of sample stacking to greatly reduce the amount of sample required. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of this method is in preparation of the input of the separation channel, the physical sample introduction, and subsequent removal of excess material. By this method, sample volumes of 100 nanoliters to 2 microliters have been used successfully, compared to the typical 5 microliters of sample required by the prior separation/analysis method.

  1. Microfluidic DNA sample preparation method and device

    DOEpatents

    Krulevitch, Peter A.; Miles, Robin R.; Wang, Xiao-Bo; Mariella, Raymond P.; Gascoyne, Peter R. C.; Balch, Joseph W.

    2002-01-01

    Manipulation of DNA molecules in solution has become an essential aspect of genetic analyses used for biomedical assays, the identification of hazardous bacterial agents, and in decoding the human genome. Currently, most of the steps involved in preparing a DNA sample for analysis are performed manually and are time, labor, and equipment intensive. These steps include extraction of the DNA from spores or cells, separation of the DNA from other particles and molecules in the solution (e.g. dust, smoke, cell/spore debris, and proteins), and separation of the DNA itself into strands of specific lengths. Dielectrophoresis (DEP), a phenomenon whereby polarizable particles move in response to a gradient in electric field, can be used to manipulate and separate DNA in an automated fashion, considerably reducing the time and expense involved in DNA analyses, as well as allowing for the miniaturization of DNA analysis instruments. These applications include direct transport of DNA, trapping of DNA to allow for its separation from other particles or molecules in the solution, and the separation of DNA into strands of varying lengths.

  2. Surface Sampling Methods for Bacillus anthracis Spore Contamination

    PubMed Central

    Hein, Misty J.; Taylor, Lauralynn; Curwin, Brian D.; Kinnes, Gregory M.; Seitz, Teresa A.; Popovic, Tanja; Holmes, Harvey T.; Kellum, Molly E.; McAllister, Sigrid K.; Whaley, David N.; Tupin, Edward A.; Walker, Timothy; Freed, Jennifer A.; Small, Dorothy S.; Klusaritz, Brian; Bridges, John H.

    2002-01-01

    During an investigation conducted December 17–20, 2001, we collected environmental samples from a U.S. postal facility in Washington, D.C., known to be extensively contaminated with Bacillus anthracis spores. Because methods for collecting and analyzing B. anthracis spores have not yet been validated, our objective was to compare the relative effectiveness of sampling methods used for collecting spores from contaminated surfaces. Comparison of wipe, wet and dry swab, and HEPA vacuum sock samples on nonporous surfaces indicated good agreement between results with HEPA vacuum and wipe samples. However, results from HEPA vacuum sock and wipe samples agreed poorly with the swab samples. Dry swabs failed to detect spores >75% of the time they were detected by wipe and HEPA vacuum samples. Wipe samples collected after HEPA vacuum samples and HEPA vacuum samples after wipe samples indicated that neither method completely removed spores from the sampled surfaces. PMID:12396930

  3. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing...

  4. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural...

  5. Method for sampling sub-micron particles

    DOEpatents

    Gay, Don D.; McMillan, William G.

    1985-01-01

    Apparatus and method steps for collecting sub-micron sized particles include a collection chamber and cryogenic cooling. The cooling is accomplished by coil tubing carrying nitrogen in liquid form, with the liquid nitrogen changing to the gas phase before exiting from the collection chamber in the tubing. Standard filters are used to filter out particles of diameter greater than or equal to 0.3 microns; however, the present invention is used to trap particles of less than 0.3 micron in diameter. A blower draws air to said collection chamber through a filter which filters particles with diameters greater than or equal to 0.3 micron. The air is then cryogenically cooled so that moisture and sub-micron sized particles in the air condense into ice on the coil. The coil is then heated so that the ice melts, and the liquid is then drawn off and passed through a Buchner funnel where the liquid is passed through a Nuclepore membrane. A vacuum draws the liquid through the Nuclepore membrane, with the Nuclepore membrane trapping sub-micron sized particles therein. The Nuclepore membrane is then covered on its top and bottom surfaces with sheets of Mylar® and the assembly is then crushed into a pellet. This effectively traps the sub-micron sized particles for later analysis.

  6. Methods for collecting algal samples as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Porter, Stephen D.; Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic algae (periphyton) and phytoplankton communities are characterized in the U.S. Geological Survey's National Water-Quality Assessment Program as part of an integrated physical, chemical, and biological assessment of the Nation's water quality. This multidisciplinary approach provides multiple lines of evidence for evaluating water-quality status and trends, and for refining an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. Water quality can be characterized by evaluating the results of qualitative and quantitative measurements of the algal community. Qualitative periphyton samples are collected to develop a list of taxa present in the sampling reach. Quantitative periphyton samples are collected to measure algal community structure within selected habitats. These samples of benthic algal communities are collected from natural substrates, using the sampling methods that are most appropriate for the habitat conditions. Phytoplankton samples may be collected in large nonwadeable streams and rivers to meet specific program objectives. Estimates of algal biomass (chlorophyll content and ash-free dry mass) also are optional measures that may be useful for interpreting water-quality conditions. A nationally consistent approach provides guidance on site, reach, and habitat selection, as well as information on methods and equipment for qualitative and quantitative sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data locally, regionally, and nationally.

  7. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  8. Simple quality assurance method of dynamic tumor tracking with the gimbaled linac system using a light field.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Hayata, Masahiro; Tsuda, Shintaro; Yamada, Kiyoshi; Nagata, Yasushi

    2016-09-08

    We proposed a simple visual method for evaluating the dynamic tumor tracking (DTT) accuracy of a gimbal mechanism using a light field. A single photon beam was set with a field size of 30 × 30 mm2 at a gantry angle of 90°. The center of a cube phantom was set up at the isocenter of a motion table, and 4D modeling was performed based on the tumor and infrared (IR) marker motion. After 4D modeling, the cube phantom was replaced with a sheet of paper, which was placed perpendicularly, and a light field was projected on the sheet of paper. The light field was recorded using a web camera in a treatment room that was as dark as possible. Calculated images from each image obtained using the camera were summed to compose a total summation image. Sinusoidal motion sequences were produced by moving the phantom with a fixed amplitude of 20 mm and different breathing periods of 2, 4, 6, and 8 s. The light field was projected on the sheet of paper under three conditions: with the moving phantom and DTT based on the motion of the phantom, with the moving phantom and non-DTT, and with a stationary phantom for comparison. The values of tracking errors using the light field were 1.12 ± 0.72, 0.31 ± 0.19, 0.27 ± 0.12, and 0.15 ± 0.09 mm for breathing periods of 2, 4, 6, and 8 s, respectively. The tracking accuracy showed dependence on the breathing period. We proposed a simple quality assurance (QA) process for the tracking accuracy of a gimbal mechanism system using a light field and web camera. Our method can assess the tracking accuracy using a light field without irradiation and clearly visualize distributions like film dosimetry.

  9. Simple quality assurance method of dynamic tumor tracking with the gimbaled linac system using a light field.

    PubMed

    Miura, Hideharu; Ozawa, Shuichi; Hayata, Masahiro; Tsuda, Shintaro; Yamada, Kiyoshi; Nagata, Yasushi

    2016-09-01

    We proposed a simple visual method for evaluating the dynamic tumor tracking (DTT) accuracy of a gimbal mechanism using a light field. A single photon beam was set with a field size of 30×30 mm2 at a gantry angle of 90°. The center of a cube phantom was set up at the isocenter of a motion table, and 4D modeling was performed based on the tumor and infrared (IR) marker motion. After 4D modeling, the cube phantom was replaced with a sheet of paper, which was placed perpendicularly, and a light field was projected on the sheet of paper. The light field was recorded using a web camera in a treatment room that was as dark as possible. Calculated images from each image obtained using the camera were summed to compose a total summation image. Sinusoidal motion sequences were produced by moving the phantom with a fixed amplitude of 20 mm and different breathing periods of 2, 4, 6, and 8 s. The light field was projected on the sheet of paper under three conditions: with the moving phantom and DTT based on the motion of the phantom, with the moving phantom and non-DTT, and with a stationary phantom for comparison. The values of tracking errors using the light field were 1.12±0.72, 0.31±0.19, 0.27±0.12, and 0.15±0.09 mm for breathing periods of 2, 4, 6, and 8 s, respectively. The tracking accuracy showed dependence on the breathing period. We proposed a simple quality assurance (QA) process for the tracking accuracy of a gimbal mechanism system using a light field and web camera. Our method can assess the tracking accuracy using a light field without irradiation and clearly visualize distributions like film dosimetry. PACS number(s): 87.56 Fc, 87.55.Qr.
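
The frame-summation step described in the abstract above can be sketched in a few lines; this is an illustrative reconstruction only (the camera capture and per-frame calculation steps are not reproduced, and the toy frames below are invented):

```python
import numpy as np

def compose_summation_image(frames):
    """Sum a sequence of grayscale camera frames into a single
    total summation image, as in the light-field tracking QA."""
    total = np.zeros_like(frames[0], dtype=np.float64)
    for frame in frames:
        total += frame
    return total

# Synthetic example: three 4x4 frames with a bright spot moving
# one pixel per frame, mimicking a projected light field.
frames = [np.zeros((4, 4)) for _ in range(3)]
for i, f in enumerate(frames):
    f[1, i] = 100.0
summed = compose_summation_image(frames)
```

In the summation image, a well-tracked light field piles intensity onto the same pixels, while tracking errors smear it along the motion path.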

  10. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.
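
The accuracy figures quoted above (mean accuracies as percentages of the true spiked concentrations) follow the standard spike-recovery calculation; a minimal sketch with hypothetical numbers, not the study's data:

```python
def percent_recovery(measured_ng_per_l, spiked_ng_per_l):
    """Accuracy expressed as a percentage of the true spiked
    concentration, as in spike-recovery QA checks."""
    return 100.0 * measured_ng_per_l / spiked_ng_per_l

# Hypothetical spiked sample: a 50 ng/L spike measured at 42 ng/L.
r = percent_recovery(42.0, 50.0)  # → 84.0
```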

  11. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W [West Richland, WA; Wise, Barry M [Manson, WA

    2002-01-01

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  12. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W.; Wise, Barry M.

    2003-08-12

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  13. Validation of a treatment plan-based calibration method for 2D detectors used for treatment delivery quality assurance

    SciTech Connect

    Olch, Arthur J.; Whitaker, Matthew L.

    2010-08-15

    Purpose: Dosimetry using film, CR, electronic portal imaging, or other 2D detectors requires calibration of the raw image data to obtain dose. Typically, a series of known doses are given to the detector, the raw signal for each dose is obtained, and a calibration curve is created. This calibration curve is then applied to the measured raw signals to convert them to dose. With the advent of IMRT, film dosimetry for quality assurance has become a routine and labor intensive part of the physicist's day. The process of calibrating the film or other 2D detector takes time and additional film or images for performing the calibration, and comes with its own source of errors. This article studies a new methodology for the relative dose calibration of 2D imaging detectors especially useful for IMRT QA, which relies on the treatment plan dose image to provide the dose information which is paired with the raw QA image data after registration of the two images (plan-based calibration). Methods: Validation of the accuracy and robustness of the method is performed on ten IMRT cases performed using EDR2 film with conventional and plan-based calibration. Also, for each of the ten cases, a 5 mm registration error was introduced and the Gamma analysis was reevaluated. In addition, synthetic image tests were performed to test the limits of the method. The Gamma analysis is used as a measure of dosimetric agreement between plan and film for the clinical cases and a dose difference metric for the synthetic cases. Results: The QA image calibrated by the plan-based method was found to more accurately match the treatment plan doses than the conventionally calibrated films and also to reveal dose errors more effectively when a registration error was introduced. 
When synthetic acquired images were systematically studied, localized and randomly placed dose errors were correctly identified without excessive falsely passing or falsely failing pixels, unless the errors were concentrated in a
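
The Gamma analysis used above as the agreement metric combines a dose-difference criterion with a distance-to-agreement criterion (3%, 3 mm in this study). A simplified 1D global-gamma sketch, not the authors' implementation:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm,
             dose_crit=0.03, dist_crit_mm=3.0):
    """Simplified 1D global gamma index: for each reference point,
    the minimum combined dose/distance discrepancy over all
    evaluated points. gamma <= 1 means the point passes."""
    x = np.arange(len(ref_dose)) * spacing_mm
    d_max = ref_dose.max()
    gammas = []
    for i, d_ref in enumerate(ref_dose):
        dd = (eval_dose - d_ref) / (dose_crit * d_max)  # dose term
        dx = (x - x[i]) / dist_crit_mm                  # distance term
        gammas.append(np.sqrt(dd ** 2 + dx ** 2).min())
    return np.array(gammas)

# Identical distributions pass trivially (gamma == 0 everywhere).
ref = np.array([0.0, 50.0, 100.0, 50.0, 0.0])
g = gamma_1d(ref, ref.copy(), spacing_mm=1.0)
# A uniform 2% dose scaling stays within the 3% criterion.
g_scaled = gamma_1d(ref, ref * 1.02, spacing_mm=1.0)
```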

  14. Methods for collecting benthic invertebrate samples as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, Thomas F.; Gurtz, Martin E.; Meador, Michael R.

    1993-01-01

    Benthic invertebrate communities are evaluated as part of the ecological survey component of the U.S. Geological Survey's National Water-Quality Assessment Program. These biological data are collected along with physical and chemical data to assess water-quality conditions and to develop an understanding of the factors that affect water-quality conditions locally, regionally, and nationally. The objectives of benthic invertebrate community characterizations are to (1) develop for each site a list of taxa within the associated stream reach and (2) determine the structure of benthic invertebrate communities within selected habitats of that reach. A nationally consistent approach is used to achieve these objectives. This approach provides guidance on site, reach, and habitat selection and methods and equipment for qualitative multihabitat sampling and semi-quantitative single habitat sampling. Appropriate quality-assurance and quality-control guidelines are used to maximize the ability to analyze data within and among study units.

  15. Assuring quality.

    PubMed

    Eaton, K A; Reynolds, P A; Mason, R; Cardell, R

    2008-08-09

    All those involved in education have a strong motivation to ensure that all its aspects, including content and teaching practice, are of the highest standard. This paper describes how agencies such as the Quality Assurance Agency for Higher Education (QAA) and the General Dental Council (GDC) have established frameworks and specifications to monitor the quality of education provided in dental schools and other institutes that provide education and training for dentists and dental care professionals (DCPs). It then considers quality issues in programme and course development, techniques for assessing the quality of education, including content and presentation, and the role of students. It goes on to review the work that has been done in developing quality assessment for distance learning in dentistry. It concludes that, to date, much of the work on quality applies to education as a whole and that the assessment of the quality of e-learning in dentistry is in its infancy.

  16. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
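
Three of the probability sampling strategies named above can be illustrated with a short sketch; the toy population, stratum labels, and sample sizes are invented for illustration:

```python
import random

random.seed(1)
population = list(range(1, 101))  # toy sampling frame of 100 patient IDs

# Simple random sampling: every element has an equal chance.
simple = random.sample(population, k=10)

# Systematic sampling: every k-th element after a random start.
k = len(population) // 10
start = random.randrange(k)
systematic = population[start::k]

# Stratified sampling: sample proportionally within each stratum.
strata = {"group_a": population[:40], "group_b": population[40:]}
stratified = [pid for group in strata.values()
              for pid in random.sample(group, k=len(group) // 10)]
```

Cluster and multi-stage sampling follow the same pattern but randomize over groups (e.g. clinics) before, or instead of, individuals.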

  17. Laser-based water equilibration method for δ18O determination of water samples

    NASA Astrophysics Data System (ADS)

    Mandic, Magda; Smajgl, Danijela; Stoebener, Nils

    2017-04-01

    Determination of δ18O by the water equilibration method, using mass spectrometers equipped with an equilibration unit or a Gas Bench, has been established for many years. The development of laser spectrometers now extends these methods and makes it possible to apply different technologies not only in the laboratory but also in the field. The Thermo Scientific™ Delta Ray™ Isotope Ratio Infrared Spectrometer (IRIS) analyzer with the Universal Reference Interface (URI) Connect and Teledyne Cetac ASX-7100 offers high precision and sample throughput. It employs optical spectroscopy for continuous measurement of isotope ratio values and concentration of carbon dioxide in ambient air, and also for analysis of discrete samples from vials, syringes, bags, or other user-provided sample containers. Test measurements confirming the precision and accuracy of the method for determining δ18O in water samples were carried out in the Thermo Fisher application laboratory with three laboratory standards, namely ANST, Ocean II, and HBW. All laboratory standards were previously calibrated against the international reference materials VSMOW2 and SLAP2 to assure the accuracy of the isotopic values of the water. With the method presented in this work, the achieved repeatability and accuracy are 0.16‰ and 0.71‰, respectively, which fulfill the requirements of the regulatory method for wine and must after equilibration with CO2.
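
The δ18O values reported above use the standard delta notation relative to VSMOW. A sketch of that conversion; the VSMOW 18O/16O ratio below is the commonly cited value and is assumed here, not taken from this abstract:

```python
R_VSMOW = 0.0020052  # commonly cited 18O/16O ratio of VSMOW (assumed)

def delta18O_permil(r_sample, r_standard=R_VSMOW):
    """delta-18O in per mil (‰) relative to the reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# A sample ratio equal to the standard gives delta = 0 per mil.
d = delta18O_permil(R_VSMOW)  # → 0.0
```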

  18. Validation of a treatment plan-based calibration method for 2D detectors used for treatment delivery quality assurance.

    PubMed

    Olch, Arthur J; Whitaker, Matthew L

    2010-08-01

    Dosimetry using film, CR, electronic portal imaging, or other 2D detectors requires calibration of the raw image data to obtain dose. Typically, a series of known doses are given to the detector, the raw signal for each dose is obtained, and a calibration curve is created. This calibration curve is then applied to the measured raw signals to convert them to dose. With the advent of IMRT, film dosimetry for quality assurance has become a routine and labor intensive part of the physicist's day. The process of calibrating the film or other 2D detector takes time and additional film or images for performing the calibration, and comes with its own source of errors. This article studies a new methodology for the relative dose calibration of 2D imaging detectors especially useful for IMRT QA, which relies on the treatment plan dose image to provide the dose information which is paired with the raw QA image data after registration of the two images (plan-based calibration). Validation of the accuracy and robustness of the method is performed on ten IMRT cases performed using EDR2 film with conventional and plan-based calibration. Also, for each of the ten cases, a 5 mm registration error was introduced and the Gamma analysis was reevaluated. In addition, synthetic image tests were performed to test the limits of the method. The Gamma analysis is used as a measure of dosimetric agreement between plan and film for the clinical cases and a dose difference metric for the synthetic cases. The QA image calibrated by the plan-based method was found to more accurately match the treatment plan doses than the conventionally calibrated films and also to reveal dose errors more effectively when a registration error was introduced. When synthetic acquired images were systematically studied, localized and randomly placed dose errors were correctly identified without excessive falsely passing or falsely failing pixels, unless the errors were concentrated in a majority of pixels in a

  19. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  20. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  1. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  2. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  3. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 2 2014-04-01 2014-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  4. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  5. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  6. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  7. Marshall Islands radioassay quality assurance program: an overview

    SciTech Connect

    Conrado, C.L.; Hamilton, T.F.; Kehl, S.R.; Robison, W.L.; Stoker, A.C.

    1998-09-01

    The Lawrence Livermore National Laboratory has developed an extensive quality assurance program to provide high quality data and assessments in support of the Marshall Islands Dose Assessment and Radioecology Program. Our quality assurance objectives begin with the premise of providing integrated and cost-effective program support (to meet wide-ranging programmatic needs, scientific peer review, litigation defense, and public confidence) and continue through the design and implementation of large-scale field programs, sampling and sample preparation, radiometric and chemical analyses, documentation of quality assurance/quality control practices, exposure assessments, and dose/risk assessments until publication. The basic structure of our radioassay quality assurance/quality control program can be divided into four essential elements: (1) sample and data integrity control; (2) instrument validation and calibration; (3) method performance testing, validation, development and documentation; and (4) periodic peer review and on-site assessments. While our quality assurance objectives are tailored towards a single research program and the evaluation of major exposure pathways/critical radionuclides pertinent to the Marshall Islands, we have attempted to develop quality assurance practices that are consistent with proposed criteria designed for laboratory accre

  8. Quality assurance in acid precipitation measurements

    SciTech Connect

    Campbell, S.; Scott, H.

    1985-06-01

    The growing interest in acid deposition has led to a proliferation of laboratories engaged in such studies. High-level quality assurance (QA) procedures are required for each program to standardize the diverse measurement methods in use and to determine the validity of differences in measurements widely separated in space and time. Both in-laboratory (quality control) and external (quality assurance) procedures are required. A complete QA program for acid precipitation measurements must address program objectives; site selection and operation; operator selection and training; sample collection, handling, and analyses; and data checking, storage, retrieval, and transmission. Objective criteria must be developed for detecting adulterated samples and invalid data. Appropriate laboratory and field blanks must be collected and analyzed. Standard techniques (sample spiking, replicate analysis of standards and samples) should ensure the reliability of analytical results. Relevant quality assurance data, including analytical detection limits, blank values, and the variability of replicate determinations, must be supplied with each data transmittal. Experimental information should be available upon request. The measurement of the pH of dilute solutions such as rain is particularly difficult; differences as large as 0.3 pH unit may be observed in replicate analyses of the same sample using different electrode types. Laboratory results are presented demonstrating typical variability to be expected in the collection, storage, and analysis of rainwater for major ions, including hydrogen ion. 15 references, 4 tables.

  9. Sampling methods in Clinical Research; an Educational Review.

    PubMed

    Elfil, Mohamed; Negida, Ahmed

    2017-01-01

    Clinical research usually involves patients with a certain disease or a condition. The generalizability of clinical research findings is based on multiple factors related to the internal and external validity of the research methods. The main methodological issue that influences the generalizability of clinical research findings is the sampling method. In this educational article, we are explaining the different sampling methods in clinical research.

  10. Pilot quality assurance programme for plasma metanephrines.

    PubMed

    Pillai, Dilo; Callen, Shaw

    2010-03-01

    Up to 2007 there was no formal external quality assurance programme for plasma free metanephrines. A pilot programme was conceived by the AACB (Australian Association of Clinical Biochemists) Working Party on biogenic amines. With support from the AACB and Royal College of Pathologists of Australasia Quality Assurance programmes, a pilot study was developed. Data from this study are presented for the first time. Twelve lyophilized plasma samples were distributed to 15 centres. Samples were spiked with metanephrine (metadrenaline), normetanephrine (normetadrenaline) and 3-methoxytyramine, all derived from human urine. Concentrations were arranged in a linear relationship. The analytes were present at six levels and samples were duplicated. High-pressure liquid chromatography and tandem mass spectrometry methods showed acceptable precision but in general enzyme immunoassay displayed a higher degree of imprecision as well as a negative bias. Differences in calibration and matrix effects are likely to have been responsible for the discrepancy between chromatographic and immunoassay methods. These differences need to be further examined although efforts at standardization between different methods have been hampered by the lack of a universal calibrator for plasma metanephrines. Meanwhile, a laboratory's performance characteristics can be monitored and enhanced by participation in suitable external quality assurance programmes.

  11. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  12. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery
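
The layered, consecutively executed checks described above can be sketched as a short pipeline that reports the first failing layer. The check names, tolerances, and data fields below are placeholders, not the authors' actual tests:

```python
def run_swiss_cheese(frame, reference, checks):
    """Run independent checks in sequence; report the first failing
    layer so the origin of the error is directly indicated."""
    for name, check in checks:
        if not check(frame, reference):
            return (False, name)
    return (True, None)

# Placeholder layers (the real sequence: aperture check, output
# normalization, global image alignment, gamma and pixel-intensity
# deviation checks).
checks = [
    ("aperture",      lambda f, r: f["in_field"]),
    ("normalization", lambda f, r: abs(f["output"] - r["output"]) < 0.02),
    ("alignment",     lambda f, r: abs(f["shift_mm"]) < 1.0),
]

ok, failed = run_swiss_cheese(
    {"in_field": True, "output": 1.00, "shift_mm": 0.3},
    {"output": 1.01}, checks)  # passes all layers
```

Running the layers in order of increasing cost means cheap gross-error checks screen each 2° frame before the more expensive gamma evaluation.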

  13. Product assurance management and software product assurance

    NASA Technical Reports Server (NTRS)

    Schneider, C.; Borycki, G.; Panaroni, P.; Surbone, M.; Borcz, R.; Beddow, A. J.

    1991-01-01

    The evolution of software assurance is discussed. The definition and implementation of standards are considered. It is recommended that requirements be clarified at the start of a project. The need for quality assurance in hardware is identified as the coming trend in the production of high cost single units which call for eradication of all errors during the early stages of development. The need to apply quality assurance throughout the whole mission is stressed. The dangers of overpricing product assurance services are stressed.

  14. Access to Education for Orphans and Vulnerable Children in Uganda: A Multi-District, Cross-Sectional Study Using Lot Quality Assurance Sampling from 2011 to 2013.

    PubMed

    Olanrewaju, Ayobami D; Jeffery, Caroline; Crossland, Nadine; Valadez, Joseph J

    2015-01-01

    This study estimates the proportion of Orphans and Vulnerable Children (OVC) attending school in 89 districts of Uganda from 2011 to 2013 and investigates the factors influencing OVC access to education among this population. This study used secondary survey data from OVCs aged 5-17 years, collected using Lot Quality Assurance Sampling in 87 Ugandan districts over a 3-year period (2011-2013). Estimates of OVC school attendance were determined for the yearly time periods. Logistic regression was used to investigate the factors influencing OVC access to education. 19,354 children aged 5-17 were included in the analysis. We estimated that 79.1% (95% CI: 78.5%-79.7%) of OVCs attended school during the 3-year period. Logistic regression revealed the odds of attending school were lower among OVCs from Western (OR 0.88; 95% CI: 0.79-0.99) and Northern (OR 0.64; 95% CI: 0.56-0.73) regions compared to the Central region. Female OVCs had significantly higher odds of attending school (OR 1.09; 95% CI: 1.02-1.17) compared to their male counterparts. When adjusting for all variables simultaneously, we found the odds of school attendance reduced by 12% between 2011 and 2012 among all OVCs (OR 0.88; 95% CI: 0.81-0.97). Our findings reinforce the need to provide continuing support to OVC in Uganda, ensuring they have the opportunity to attain an education. The data indicate important regional and gender variation that needs to be considered for support strategies and in social policy. The results suggest the need for greater local empowerment to address the needs of OVCs. We recommend further research to understand why OVC access to education and attendance varies between regions, improvement of district-level mapping of OVC access to education, and further study to understand the particular factors impacting the lower school attendance of male OVCs.
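
The odds ratios and confidence intervals reported above come from logistic regression; for a single binary predictor they reduce to the 2x2-table odds ratio with a log-scale confidence interval. A sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a * d) / (b * c)

def ci95_log_or(a, b, c, d):
    """95% CI via the standard error of the log odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))

# Hypothetical counts: 700 of 900 female OVCs attend school
# versus 650 of 900 male OVCs.
or_fm = odds_ratio(700, 200, 650, 250)
lo, hi = ci95_log_or(700, 200, 650, 250)
```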

  15. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
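
The synchronizer described above derives the sampling clock from the machine's rotational frequency, so each revolution yields the same number of samples regardless of speed. A minimal sketch (function and parameter names are assumptions, not from the patent):

```python
def sampling_frequency(rotation_hz, samples_per_revolution):
    """Sampling control frequency locked to shaft rotation."""
    return rotation_hz * samples_per_revolution

def sample_times(rotation_hz, samples_per_revolution, n_revolutions):
    """Timestamps of synchronized samples over n revolutions."""
    fs = sampling_frequency(rotation_hz, samples_per_revolution)
    n = samples_per_revolution * n_revolutions
    return [i / fs for i in range(n)]

# A 50 Hz shaft sampled 8 times per revolution -> 400 Hz sampling clock.
fs = sampling_frequency(50.0, 8)  # → 400.0
times = sample_times(50.0, 8, 2)
```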

  16. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPAK for eigen-analysis and from LINPAK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any
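
One of the capabilities listed above, matrix exponentiation, can be illustrated with a scaling-and-squaring Taylor sketch; this shows the technique generically and is not SAMSAN's actual algorithm:

```python
import numpy as np

def expm_taylor(A, n_terms=20, squarings=8):
    """Matrix exponential via scaling and squaring with a truncated
    Taylor series: exp(A) = (exp(A / 2**s)) ** (2**s)."""
    A = np.asarray(A, dtype=float) / (2 ** squarings)
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, n_terms + 1):
        term = term @ A / k          # A**k / k!
        result = result + term
    for _ in range(squarings):       # undo the scaling by squaring
        result = result @ result
    return result

# Nilpotent example with a known closed form:
# exp([[0, 1], [0, 0]]) = [[1, 1], [0, 1]].
E = expm_taylor([[0.0, 1.0], [0.0, 0.0]])
```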

  18. College Quality Assurance Assurances. Mendip Papers 020.

    ERIC Educational Resources Information Center

    Sallis, E.; Hingley, P.

    This paper discusses the increasing interest in quality assurance in British education, including its measurement and management through the introduction of a quality assurance system. The reasons for and benefits of beginning a quality assurance system are discussed, along with questions of what constitutes quality, whether it is quality in fact…

  19. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...

  1. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  2. Sampling bee communities using pan traps: alternative methods increase sample size

    USDA-ARS?s Scientific Manuscript database

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation have encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  3. Sampling methods for amphibians in streams in the Pacific Northwest.

    Treesearch

    R. Bruce Bury; Paul Stephen. Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  4. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...
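    A stratified random sampling design of the kind described above can be sketched generically as follows. This is an illustrative Python sketch, not the authors' procedure; the strata names, sizes, and proportional-allocation rule are our assumptions.

```python
import random

def stratified_sample(strata, n_total, seed=0):
    """Proportionally allocate n_total draws across strata and sample
    without replacement within each stratum."""
    rng = random.Random(seed)
    pop = sum(len(units) for units in strata.values())
    sample = {}
    for name, units in strata.items():
        # at least one draw per stratum, otherwise proportional to size
        k = max(1, round(n_total * len(units) / pop))
        sample[name] = rng.sample(units, min(k, len(units)))
    return sample

# hypothetical enumeration units (e.g. households) in two strata
villages = {"north": list(range(100)), "south": list(range(50))}
s = stratified_sample(villages, 30)
# "north" receives 20 draws and "south" 10, in proportion to size
```

In practice the stratification variable (region, settlement type) and the handling of incompletely enumerated strata are the hard design questions; the allocation arithmetic itself is as simple as above.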

  5. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shape and size by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation by comparison with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.
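    Two of the evaluation indexes above, relative accuracy and reproducibility RSD, are simple descriptive statistics. A minimal Python sketch with hypothetical pill-weight data (the values below are not from the paper):

```python
import statistics

def rsd_percent(values):
    """Reproducibility as relative standard deviation: 100 * s / mean."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

def relative_accuracy_percent(measured_mean, reference):
    """Relative accuracy of a measured mean against a reference value."""
    return 100.0 * (1.0 - abs(measured_mean - reference) / reference)

weights = [0.250, 0.252, 0.249, 0.251]  # g; hypothetical pill weights
rsd = rsd_percent(weights)                                   # ~0.52 %
acc = relative_accuracy_percent(statistics.fmean(weights), 0.250)  # ~99.8 %
```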

  6. SU-E-I-60: Quality Assurance Testing Methods and Customized Phantom for Magnetic Resonance Imaging and Spectroscopy

    SciTech Connect

    Song, K-H; Lee, D-W; Choe, B-Y

    2015-06-15

    Purpose: The objectives of this study are to develop a magnetic resonance imaging and spectroscopy (MRI-MRS) fused phantom along with the inserts for metabolite quantification and to conduct quantitative analysis and evaluation of the layered vials of brain-mimicking solution for quality assurance (QA) performance, according to the localization sequence. Methods: The outer cylindrical phantom body is made of acrylic materials. The section other than where the inner vials are located was filled with copper sulfate and diluted with water so as to reduce the T1 relaxation time. Sodium chloride was included to provide conductivity similar to the human body. All measurements of MRI and MRS were made using a 3.0 T scanner (Achiva Tx 3.0 T; Philips Medical Systems, Netherlands). The MRI scan parameters were as follows: (1) spin echo (SE) T1-weighted image: repetition time (TR), 500 ms; echo time (TE), 20 ms; matrix, 256×256; field of view (FOV), 250 mm; gap, 1 mm; number of signal averages (NSA), 1; (2) SE T2-weighted image: TR, 2,500 ms; TE, 80 ms; matrix, 256×256; FOV, 250 mm; gap, 1 mm; NSA, 1; 23 slice images were obtained with a slice thickness of 5 mm. The water signal of each volume of interest was suppressed by variable pulse power and optimized relaxation delays (VAPOR) applied before the scan. By applying a point-resolved spectroscopy sequence, the MRS scan parameters were as follows: voxel size, 0.8×0.8×0.8 cm^3; TR, 2,000 ms; TE, 35 ms; NSA, 128. Results: Using the fused phantom, the results of measuring MRI factors were: geometric distortion, <2% and ±2 mm; image intensity uniformity, 83.09±1.33%; percent-signal ghosting, 0.025±0.004; low-contrast object detectability, 27.85±0.80. In addition, the signal-to-noise ratio of N-acetyl-aspartate was consistently high (42.00±5.66). Conclusion: The MRI-MRS QA factors obtained simultaneously using the phantom can facilitate evaluation of both images and spectra, and provide guidelines for obtaining MRI and MRS QA

  7. Analytical quality assurance in veterinary drug residue analysis methods: matrix effects determination and monitoring for sulfonamides analysis.

    PubMed

    Hoff, Rodrigo Barcellos; Rübensam, Gabriel; Jank, Louise; Barreto, Fabiano; Peralba, Maria do Carmo Ruaro; Pizzolato, Tânia Mara; Silvia Díaz-Cruz, M; Barceló, Damià

    2015-01-01

    In residue analysis of veterinary drugs in foodstuffs, matrix effects are one of the most critical points. This work presents a discussion of approaches used to estimate, minimize, and monitor matrix effects in bioanalytical methods. Qualitative and quantitative methods for estimating matrix effects, such as post-column infusion, slope-ratio analysis, calibration curves (mathematical and statistical analysis), and control chart monitoring, are discussed using real data. Matrix effects varied over a wide range depending on the analyte and the sample preparation method: pressurized liquid extraction for liver samples showed matrix effects from 15.5 to 59.2%, while an ultrasound-assisted extraction provided values from 21.7 to 64.3%. The influence of the matrix was also evaluated: for sulfamethazine analysis, signal losses ranged from -37 to -96% for fish and eggs, respectively. Advantages and drawbacks are also discussed in the context of a proposed workflow for matrix effects assessment, applied to real data from sulfonamide residue analysis.
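    The slope-ratio approach mentioned above compares the calibration slope obtained in matrix extract against the slope in neat solvent. A common convention expresses the matrix effect as (slope_matrix / slope_solvent - 1) × 100, with negative values indicating ion suppression. The sketch below is illustrative; the slope values are hypothetical, not the paper's data.

```python
def matrix_effect_percent(slope_matrix, slope_solvent):
    """Slope-ratio estimate of matrix effect: negative = suppression,
    positive = enhancement, zero = no matrix effect."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# hypothetical calibration slopes for a sulfonamide in egg extract vs solvent
me = matrix_effect_percent(0.62, 1.0)  # roughly -38, i.e. ~38% suppression
```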

  8. The PARTI Architecture Assurance

    DTIC Science & Technology

    2008-10-01

    example safety critical system, 2, Issues Guidance Papers (IGPs) that further explain key concepts or requirements of the STANDARD, The guidance...Organisation (2009) IGP-001: Guidance Notes for Project Offices. Published as part of [20]. 4. Defence Material Organisation (2009) IGP-002: Methods for Safety...Architecture Analysis. Published as part of [20]. 5. Defence Material Organisation (2009) IGP-003: Methods for Design Assurance. Published as part of

  9. In-depth analysis of sampling optimization methods

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Kim, Myoungsoo; Habets, Boris; Buhl, Stefan; Guhlemann, Steffen; Rößiger, Martin; Bellmann, Enrico; Kim, Seop

    2016-03-01

    High order overlay and alignment models require good coverage of overlay or alignment marks on the wafer. But dense sampling plans are not possible for throughput reasons. Therefore, sampling plan optimization has become a key issue. We analyze the different methods for sampling optimization and discuss the different knobs to fine-tune the methods to constraints of high volume manufacturing. We propose a method to judge sampling plan quality with respect to overlay performance, run-to-run stability and dispositioning criteria using a number of use cases from the most advanced lithography processes.

  10. Configurations and calibration methods for passive sampling techniques.

    PubMed

    Ouyang, Gangfeng; Pawliszyn, Janusz

    2007-10-19

    Passive sampling technology has developed very quickly in the past 15 years, and is widely used for the monitoring of pollutants in different environments. The design and quantification of passive sampling devices require an appropriate calibration method. Current calibration methods that exist for passive sampling, including equilibrium extraction, linear uptake, and kinetic calibration, are presented in this review. A number of state-of-the-art passive sampling devices that can be used for aqueous and air monitoring are introduced according to their calibration methods.
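    The calibration regimes named above (equilibrium extraction, linear uptake, kinetic calibration) all follow from a first-order uptake model: sampler mass grows toward equilibrium, and while uptake is still near-linear the time-weighted average concentration can be recovered from an assumed sampling rate. The Python sketch below is illustrative only; the symbols (c_air, R_s, k_e) and all numbers are our assumptions, not values from the review.

```python
import math

def sampler_mass(c_air, rate, k_e, t):
    """First-order uptake toward equilibrium:
    m(t) = (c_air * rate / k_e) * (1 - exp(-k_e * t))."""
    return c_air * rate / k_e * (1.0 - math.exp(-k_e * t))

def twa_concentration_linear(mass, rate, t):
    """Linear-uptake (kinetic) calibration: C ~ m / (R_s * t),
    valid only while t << 1 / k_e."""
    return mass / (rate * t)

# early in the deployment, uptake is near-linear and the estimate is close
m = sampler_mass(c_air=2.0, rate=0.5, k_e=0.01, t=5.0)
c_est = twa_concentration_linear(m, rate=0.5, t=5.0)  # ~1.95, close to 2.0
```

At long deployment times the linear approximation underestimates the true concentration, which is why equilibrium or kinetic (desorption-corrected) calibration is used instead.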

  11. Engineering Study of 500 ML Sample Bottle Transportation Methods

    SciTech Connect

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates all available methods for transportation of 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules and analyzes and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  12. Investigating Test Equating Methods in Small Samples through Various Factors

    ERIC Educational Resources Information Center

    Asiret, Semih; Sünbül, Seçil Ömür

    2016-01-01

    In this study, the aim was to compare equating methods for the random groups design in small samples across factors such as sample size, difference in difficulty between forms, and guessing parameter. Moreover, which method gives better results under which conditions was also investigated. In this study, 5,000 dichotomous simulated data…

  13. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  14. Rapid method for sampling metals for materials identification

    NASA Technical Reports Server (NTRS)

    Higgins, L. E.

    1971-01-01

    Nondamaging process similar to electrochemical machining is useful in obtaining metal samples from places inaccessible to conventional sampling methods or where methods would be hazardous or contaminating to specimens. Process applies to industries where metals or metal alloys play a vital role.

  15. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn

    1994-01-01

    An analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined in 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River, and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.
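    The recovery figures quoted above are spike recoveries: the share of a known fortified amount that the method actually measures. As a small illustration (hypothetical numbers, not the study's data):

```python
def recovery_percent(measured, background, spiked):
    """Spike recovery: fraction of the added analyte actually measured,
    after subtracting any native (background) level."""
    return 100.0 * (measured - background) / spiked

# hypothetical: sample fortified at 0.10 ug/L, 0.085 ug/L found, no background
r = recovery_percent(measured=0.085, background=0.0, spiked=0.10)  # ~85%
```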

  16. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    2001-01-01

    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  17. Software Assurance Using Structured Assurance Case Models

    PubMed Central

    Rhodes, Thomas; Boland, Frederick; Fong, Elizabeth; Kass, Michael

    2010-01-01

    Software assurance is an important part of the software development process to reduce risks and ensure that the software is dependable and trustworthy. Software defects and weaknesses can often lead to software errors and failures and to exploitation by malicious users. Testing, certification and accreditation have been traditionally used in the software assurance process to attempt to improve software trustworthiness. In this paper, we examine a methodology known as a structured assurance model, which has been widely used for assuring system safety, for its potential application to software assurance. We describe the structured assurance model and examine its application and use for software assurance. We identify strengths and weaknesses of this approach and suggest areas for further investigation and testing. PMID:27134787

  18. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  20. Evaluation of Common Methods for Sampling Invertebrate Pollinator Assemblages: Net Sampling Out-Perform Pan Traps

    PubMed Central

    Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  2. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    SciTech Connect

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-10-13

    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single-medium single-pass composite: a single cellulose sponge samples multiple coupons with a single pass per coupon; 2) single-medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes per coupon; and 3) multi-medium post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single-medium compositing for both clean and grime-coated materials. RE with the PSC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  3. Do Too Many Rights Make a Wrong? A Qualitative Study of the Experiences of a Sample of Malaysian and Singapore Private Higher Education Providers in Transnational Quality Assurance

    ERIC Educational Resources Information Center

    Lim, Fion Choon Boey

    2010-01-01

    Assuring the quality of transnational education has been an endeavour of increasing importance in the internationalisation of higher education but is also increasingly challenging given the involvement of many stakeholders. This paper focuses on the experiences of and challenges faced by private tertiary education providers in Malaysia and…

  5. Tomotherapy treatment plan quality assurance: The impact of applied criteria on passing rate in gamma index method

    SciTech Connect

    Bresciani, Sara; Di Dia, Amalia; Maggio, Angelo; Cutaia, Claudia; Miranti, Anna; Infusino, Erminia; Stasi, Michele

    2013-12-15

    Purpose: Pretreatment patient plan verification with gamma index (GI) metric analysis is standard procedure for intensity modulated radiation therapy (IMRT) treatment. The aim of this paper is to evaluate the variability of the local and global gamma index obtained during standard pretreatment quality assurance (QA) measurements for plans performed with a Tomotherapy unit. The QA measurements were performed with a 3D diode array, using variable passing criteria: 3%/3 mm, 2%/2 mm, 1%/1 mm, each with both local and global normalization. Methods: The authors analyzed the pretreatment QA results for 73 verifications; 37 were prostate cancer plans, 16 were head and neck plans, and 20 were other clinical sites. All plans were treated using the Tomotherapy Hi-Art System. Pretreatment QA plans were performed with the commercially available 3D diode array ArcCHECK™. This device has 1386 diodes arranged in a helical geometry spaced 1 cm apart. The dose measurements were acquired on the ArcCHECK™ and then compared with the calculated dose using the standard gamma analysis method. The gamma passing rate (%GP), defined as the percentage of points satisfying the condition GI < 1, was calculated for different criteria (3%/3 mm, 2%/2 mm, 1%/1 mm) and for both global and local normalization. In the case of the local normalization method, the authors set three dose difference threshold (DDT) values of 2, 3, and 5 cGy. Dose difference threshold is defined as the minimum absolute dose error considered in the analysis when using local normalization. Low-dose thresholds (TH) of 5% and 10% were also applied and analyzed. Results: Performing a paired t-test, the authors determined that the gamma passing rate is independent of the threshold values for all of the adopted criteria (5%TH vs 10%TH, p > 0.1). Our findings showed that mean %GPs for local (or global) normalization for the entire study group were 93% (98%), 84% (92%), and 66% (61%) for 3%/3 mm, 2%/2 mm, and 1%/1 mm criteria
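    The gamma passing rate can be illustrated with a deliberately simplified one-dimensional global gamma calculation: for each reference point, take the minimum over the measured profile of the combined dose-difference and distance-to-agreement metric. This is a sketch only; clinical analysis (and the ArcCHECK software) works in 3D with interpolation.

```python
import math

def gamma_1d(ref, meas, dx, dose_tol, dist_tol):
    """Simplified 1-D global gamma. dose_tol is a percentage of the
    reference maximum; dist_tol and dx share the same length unit.
    Assumes max(ref) > 0."""
    d_max = max(ref)
    gammas = []
    for i, r in enumerate(ref):
        best = float("inf")
        for j, m in enumerate(meas):
            dd = (m - r) / (dose_tol / 100.0 * d_max)  # dose difference term
            dr = (j - i) * dx / dist_tol               # distance term
            best = min(best, math.hypot(dd, dr))
        gammas.append(best)
    return gammas

def passing_rate(gammas):
    """%GP: percentage of points with gamma < 1."""
    return 100.0 * sum(g < 1.0 for g in gammas) / len(gammas)

ref = [0.0, 0.5, 1.0, 0.5, 0.0]
meas = [0.0, 0.52, 1.01, 0.49, 0.0]
rate = passing_rate(gamma_1d(ref, meas, dx=1.0, dose_tol=3.0, dist_tol=3.0))
```

With local normalization the dose-difference term would instead be scaled by a percentage of the local reference dose (subject to the DDT floor), which is why local passing rates are systematically lower.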

  6. [Medical quality assurance today].

    PubMed

    Schäfer, Robert D

    2008-01-01

    Both the quality and performance of health systems are strongly influenced by the number and qualification of professional staff. Quality assurance programs help to analyse the causes of medical malpractice. On the basis of experience gained from established Quality Assurance Programs (QAP) in the North Rhine area since 1982, various aspects of the efficiency of these programs are discussed. The implementation of legal regulations making these programs mandatory is criticised, not only for its bureaucratic effect but also for the attempt to exclude professional experts from the interpretation of results. It is recommended that these regulations be liberalized in order to facilitate improvement of methods and participation of the medical profession.

  7. Orientation sampling for dictionary-based diffraction pattern indexing methods

    NASA Astrophysics Data System (ADS)

    Singh, S.; De Graef, M.

    2016-12-01

    A general framework for dictionary-based indexing of diffraction patterns is presented. A uniform sampling method of orientation space using the cubochoric representation is introduced and used to derive an empirical relation between the average disorientation between neighboring sampling points and the number of grid points sampled along the semi-edge of the cubochoric cube. A method to uniformly sample misorientation iso-surfaces is also presented. This method is used to show that the dot product serves as a proxy for misorientation. Furthermore, it is shown that misorientation iso-surfaces in Rodrigues space are quadratic surfaces. Finally, using the concept of Riesz energies, it is shown that the sampling method results in a near optimal covering of orientation space.
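    The observation that the dot product serves as a proxy for misorientation follows from the quaternion relation omega = 2*arccos(|q1 . q2|): a larger absolute dot product means a smaller misorientation angle. A minimal sketch (crystal symmetry operators, which a full treatment must apply, are ignored here):

```python
import math

def misorientation_deg(q1, q2):
    """Misorientation angle between two unit quaternions,
    omega = 2 * acos(|q1 . q2|). Crystal symmetry is not applied."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))
    return math.degrees(2.0 * math.acos(min(1.0, dot)))

qa = (1.0, 0.0, 0.0, 0.0)  # identity orientation
# half-angle 5 degrees about x, i.e. a 10-degree rotation
qb = (math.cos(math.radians(5)), math.sin(math.radians(5)), 0.0, 0.0)
angle = misorientation_deg(qa, qb)  # 10 degrees
```

Because acos is monotone, ranking neighbor pairs by |q1 . q2| ranks them by misorientation without evaluating the inverse cosine, which is what makes the dot product useful as a fast proxy in dictionary lookups.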

  8. The use of microbead-based spoligotyping for Mycobacterium tuberculosis complex to evaluate the quality of the conventional method: Providing guidelines for Quality Assurance when working on membranes

    PubMed Central

    2011-01-01

    Background: The classical spoligotyping technique, relying on membrane reverse line-blot hybridization of the spacers of the Mycobacterium tuberculosis CRISPR locus, is used worldwide (598 references in PubMed on April 8th, 2011). However, until now no inter-laboratory quality control study had been undertaken to validate this technique. We analyzed the quality of membrane-based spoligotyping by comparing it to the recently introduced and highly robust microbead-based spoligotyping. Nine hundred and twenty-seven isolates were analyzed, totaling 39,861 data points. Samples were received from 11 international laboratories with a worldwide distribution. Methods: The high-throughput microbead-based spoligotyping was performed on CTAB and thermolyzate DNA extracted from isolated Mycobacterium tuberculosis complex (MTC) strains coming from the genotyping participating centers. Information regarding how the classical spoligotyping method was performed by each center was available. Genotype discriminatory analyses were carried out by comparing the spoligotypes obtained by both methods. The non-parametric Mann-Whitney U homogeneity test and the Spearman rank correlation test were performed to validate the observed results. Results: Seven of the 11 laboratories (63%) perfectly typed more than 90% of isolates, three scored between 80% and 90%, and a single center was under 80%, reaching only 51% concordance. However, this was mainly due to discordance in a single spacer, likely reflecting a non-functional probe on the membrane used. The centers using thermolyzate DNA performed as well as centers using the more extended CTAB extraction procedure. Few centers shared the same problematic spacers, and these problematic spacers were scattered over the whole CRISPR locus (mostly spacers 15, 14, 18, 37, 39, and 40). Conclusions: We confirm that classical spoligotyping is a robust method with generally high reliability in most centers.
The applied DNA extraction procedure (CTAB or thermolyzate) did not

  9. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality-assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  10. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.
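The analysis step described above, inferring the degree of polarization of the exiting beam, can be sketched in a few lines. This is a minimal illustration, not the patented method: the beam composition, analyzer angles, and the max/min intensity formula are assumptions for the sketch.

```python
import numpy as np

def degree_of_polarization(intensities):
    """Estimate the degree of (linear) polarization from photodetector
    readings taken while rotating the output polarizer:
    DOP = (I_max - I_min) / (I_max + I_min)."""
    i_max, i_min = max(intensities), min(intensities)
    return (i_max - i_min) / (i_max + i_min)

# Simulated detector readings for a partially polarized beam after a
# scattering sample: Malus's law plus an unpolarized floor (hypothetical values).
angles = np.deg2rad(np.arange(0, 180, 5))
polarized, unpolarized = 0.7, 0.3
readings = polarized * np.cos(angles) ** 2 + unpolarized / 2

print(round(degree_of_polarization(readings), 3))  # -> 0.7
```

A lower degree of polarization after the sample then indicates stronger depolarization by scattering, which is the property the patent exploits.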

  11. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
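The three recording methods compared in this simulation study can be sketched in a small program. The parameters below are illustrative, not the study's actual grid, and the scoring rules are the standard definitions of each method.

```python
import random

def simulate(method, obs_period=600, interval=10, n_events=20, event_dur=5, seed=1):
    """Scatter fixed-duration events over a timeline, score each interval with
    the chosen recording method, and compare the estimated with the true
    proportion of time the target behavior occurred."""
    random.seed(seed)
    occupied = [False] * obs_period            # 1-second resolution timeline
    for _ in range(n_events):
        start = random.randrange(obs_period - event_dur)
        for t in range(start, start + event_dur):
            occupied[t] = True
    true_prop = sum(occupied) / obs_period

    scored = 0
    n_intervals = obs_period // interval
    for i in range(n_intervals):
        window = occupied[i * interval:(i + 1) * interval]
        if method == "momentary":              # check only the final instant
            scored += window[-1]
        elif method == "partial":              # any occurrence scores the interval
            scored += any(window)
        elif method == "whole":                # event must span the whole interval
            scored += all(window)
    return true_prop, scored / n_intervals

for m in ("momentary", "partial", "whole"):
    true_p, est_p = simulate(m)
    print(f"{m:9s} true={true_p:.3f} est={est_p:.3f} error={est_p - true_p:+.3f}")
```

By construction, partial-interval recording can only overestimate and whole-interval recording can only underestimate the true proportion, which is consistent with the previously reported characteristics the study confirms.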

  12. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances which play a crucial role in controlling the plant development, growth and environment response. With the development of the chromatography and mass spectroscopy technique, chromatographic analytical method has become a widely used way for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones including acidic plant hormones & basic plant hormones, brassinosteroids and plant polypeptides, the sample preparation methods are reviewed in sequence especially the recently developed methods. The review includes novel methods, devices, extractive materials and derivative reagents for sample preparation of phytohormones analysis. Especially, some related works of our group are included. At last, the future developments in this field are also prospected.

  13. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we test its validity with populations that are not covered by a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The study shows the utility of this type of sampling when the population is accessible but no sampling frame exists. However, the sample obtained is not statistically representative of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.
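One widely used estimator for RDS data (the RDS-II or Volz-Heckathorn estimator; the abstract does not say which estimator this study used) weights each respondent by the inverse of their reported network degree, since well-connected people are more likely to be recruited. A minimal sketch with invented data:

```python
def rds_ii_estimate(outcomes, degrees):
    """RDS-II (Volz-Heckathorn) estimator: weight each respondent by the
    inverse of their reported network degree to correct for the higher
    inclusion probability of well-connected people."""
    inv = [1.0 / d for d in degrees]
    return sum(w for y, w in zip(outcomes, inv) if y) / sum(inv)

# Hypothetical recruitment chain: outcome 1 = reports the behavior of interest,
# degree = self-reported network size within the target population.
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
degrees  = [20, 5, 10, 40, 8, 4, 25, 10]
print(round(rds_ii_estimate(outcomes, degrees), 3))   # weighted prevalence
print(sum(outcomes) / len(outcomes))                  # naive proportion: 0.5
```

In this toy sample the behavior is concentrated among high-degree respondents, so the weighted estimate (about 0.24) falls well below the naive proportion (0.5), illustrating why raw chain-referral proportions can mislead.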

  14. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
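The core GIS step, drawing uniform random survey points inside a digitized study-area boundary, can be sketched with rejection sampling. The boundary coordinates and the point-in-polygon routine below are illustrative assumptions, not the paper's actual workflow or data.

```python
import random

def point_in_polygon(pt, poly):
    """Even-odd ray-casting test; poly is a list of (lon, lat) vertices."""
    x, y = pt
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

def random_survey_points(poly, n, seed=0):
    """Uniform rejection sampling inside the study-area polygon; each accepted
    point would become a GPS waypoint for household selection in the field."""
    random.seed(seed)
    xs, ys = zip(*poly)
    pts = []
    while len(pts) < n:
        p = (random.uniform(min(xs), max(xs)), random.uniform(min(ys), max(ys)))
        if point_in_polygon(p, poly):
            pts.append(p)
    return pts

# Hypothetical study-area boundary digitized from satellite imagery
boundary = [(-90.6, 14.6), (-90.4, 14.6), (-90.35, 14.75), (-90.5, 14.85), (-90.65, 14.7)]
waypoints = random_survey_points(boundary, 30)
print(len(waypoints), all(point_in_polygon(p, boundary) for p in waypoints))
```

In practice the waypoints would be loaded into GPS units and the nearest household to each point selected, which is where the field-condition challenges the authors describe arise.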

  15. [Weighted estimation methods for multistage sampling survey data].

    PubMed

    Hou, Xiao-Yan; Wei, Yong-Yue; Chen, Feng

    2009-06-01

    Multistage sampling techniques are widely applied in cross-sectional epidemiological studies, yet methods based on an independence assumption are still used to analyze such complex survey data. This paper introduces the application of weighted estimation methods to complex survey data. A brief overview of the basic theory is given, and a practical analysis illustrates the weighted estimation algorithm on stratified two-stage cluster sampling data. For multistage sampling survey data, weighted estimation can provide unbiased point estimates and more reasonable variance estimates, and thus support proper statistical inference by correcting for clustering, stratification, and unequal-probability effects.
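The weighting idea can be sketched as follows: each unit's weight is the inverse of its inclusion probability, and the design-weighted (Hajek) mean replaces the naive mean. The two-stratum design and numbers below are a hypothetical example, not the paper's data, and the paper's variance estimators are not reproduced here.

```python
def weighted_mean(values, weights):
    """Design-weighted (Hajek) estimator of a population mean: each sampling
    weight is the inverse of the unit's overall inclusion probability."""
    return sum(w * y for y, w in zip(values, weights)) / sum(weights)

# Hypothetical stratified two-stage sample; weight = 1 / (P(cluster) * P(person|cluster)).
# Stratum A: 2 of 10 clusters sampled, 5 of 50 people per cluster -> w = (10/2)*(50/5) = 50
# Stratum B: 2 of 4 clusters sampled,  5 of 20 people per cluster -> w = (4/2)*(20/5)  = 8
values  = [1, 0, 1, 1, 0,   0, 0, 1, 0, 0]
weights = [50] * 5 + [8] * 5

print(round(weighted_mean(values, weights), 3))   # design-weighted prevalence
print(round(sum(values) / len(values), 3))        # naive estimate: 0.4
```

Because stratum A is under-sampled relative to its population share, its respondents carry much larger weights, pulling the weighted prevalence (about 0.54) well away from the naive 0.4.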

  16. Integration of sample analysis method (SAM) for polychlorinated biphenyls

    SciTech Connect

    Monagle, M.; Johnson, R.C.

    1996-05-01

    A completely integrated Sample Analysis Method (SAM) has been tested as part of the Contaminant Analysis Automation program. The SAM system was tested for polychlorinated biphenyl samples using five Standard Laboratory Modules™: two Soxtec™ modules, a high volume concentrator module, a generic materials handling module, and the gas chromatographic module. With over 300 samples completed within the first phase of the validation, recovery and precision data were comparable to manual methods. Based on experience derived from the first evaluation of the automated system, efforts are underway to improve sample recoveries and integrate a sample cleanup procedure. In addition, initial work in automating the extraction of semivolatile samples using this system will also be discussed.

  17. Sampling and sample preparation methods for determining concentrations of mycotoxins in foods and feeds.

    PubMed

    2012-01-01

    Sample variation is often the largest error in determining concentrations of mycotoxins in food commodities. The worldwide safety evaluation of mycotoxins requires sampling plans that give acceptably accurate values for the levels of contamination in specific batches or lots of a commodity. Mycotoxin concentrations show a skewed or uneven distribution in foods and feeds, especially in whole kernels (or nuts), so it is extremely difficult to collect a sample that accurately represents the mean batch concentration. Sample variance studies and sampling plans have been published for select mycotoxins such as aflatoxin, fumonisin, and deoxynivalenol, emphasizing the importance of sample selection, sample size, and the number of incremental samples. For meaningful data to be generated from surveillance studies, representative samples should be collected from carefully selected populations (batches or lots) of food that, in turn, should be representative of clearly defined locations (e.g. a country, a region within a country). Although sampling variability is unavoidable, it is essential that the precision of the sampling plan be clearly defined and be considered acceptable by those responsible for interpreting and reporting the surveillance data. The factors influencing variability are detailed here, with reference to both major mycotoxins and major commodities. Sampling of large bag stacks, bulk shipments, and domestic supplies are all discussed. Sampling plans currently accepted in international trade are outlined. Acceptance sampling plans and the variabilities that affect operating characteristic curves of such plans are also detailed. The constraints and issues related to the sampling of harvested crops within subsistence farming areas are also discussed in this chapter, as are the essential rules of sample labelling and storage. The chapter concludes with a short section on sample preparation methods.
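The operating characteristic curves mentioned above follow directly from the binomial distribution for a single attribute acceptance sampling plan. The plan parameters below (n = 10 incremental samples, acceptance number c = 1) are an invented illustration, not a plan from this chapter.

```python
from math import comb

def oc_curve_point(n, c, p):
    """Probability of accepting a lot under a single attribute sampling plan
    (n incremental samples, acceptance number c) when the true fraction of
    contaminated units is p: P(accept) = sum_{d=0}^{c} C(n,d) p^d (1-p)^(n-d)."""
    return sum(comb(n, d) * p**d * (1 - p)**(n - d) for d in range(c + 1))

# Hypothetical plan: test 10 increments, accept the lot if at most 1 exceeds the limit.
for p in (0.01, 0.05, 0.10, 0.20):
    print(f"true contamination rate {p:.2f} -> P(accept) = {oc_curve_point(10, 1, p):.3f}")
```

Plotting P(accept) against p traces the OC curve; the skewed spatial distribution of mycotoxins means the effective per-increment "defect" probability is itself highly variable, which is why sample size and increment count dominate the plan's real performance.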

  18. Chain Sampling

    DTIC Science & Technology

    1972-08-01

    Ray Heathcock, Advanced Techniques Branch, Plans and Programs Analysis Division, Directorate for Product Assurance, U.S. Army Missile Command, Redstone Arsenal. ... The Directorate for Product Assurance has established a rather unique computer program for handling a variety of chain sampling schemes and is available for ...

  19. Capillary microextraction: A new method for sampling methamphetamine vapour.

    PubMed

    Nair, M V; Miskelly, G M

    2016-11-01

    Clandestine laboratories pose a serious health risk to first responders, investigators, decontamination companies, and the public who may be inadvertently exposed to methamphetamine and other chemicals used in its manufacture. Therefore there is an urgent need for reliable methods to detect and measure methamphetamine at such sites. The most common method for determining methamphetamine contamination at former clandestine laboratory sites is selected surface wipe sampling, followed by analysis with gas chromatography-mass spectrometry (GC-MS). We are investigating the use of sampling for methamphetamine vapour to complement such wipe sampling. In this study, we report the use of capillary microextraction (CME) devices for sampling airborne methamphetamine, and compare their sampling efficiency with a previously reported dynamic SPME method. The CME devices consisted of PDMS-coated glass filter strips inside a glass tube. The devices were used to dynamically sample methamphetamine vapour in the range of 0.42-4.2 μg m(-3), generated by a custom-built vapour dosing system, for 1-15 min, and methamphetamine was analysed using a GC-MS fitted with a ChromatoProbe thermal desorption unit. The devices showed good reproducibility (RSD < 15%), and a curvilinear pre-equilibrium relationship between sampling time and peak area, which can be utilised for calibration. Under identical sampling conditions, the CME devices were approximately 30 times more sensitive than the dynamic SPME method. The CME devices could be stored for up to 3 days after sampling prior to analysis. Consecutive sampling of methamphetamine and its isotopic substitute, d9-methamphetamine, showed no competitive displacement. This suggests that CME devices, pre-loaded with an internal standard, could be a feasible method for sampling airborne methamphetamine at former clandestine laboratories.

  20. A random spatial sampling method in a rural developing nation.

    PubMed

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C

    2014-04-10

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.

  1. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
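The simplest approach the authors describe, inflating an individually randomized sample size by a design effect, can be sketched directly. This assumes the textbook design effect DE = 1 + (m − 1)ρ for equal cluster sizes m and intracluster correlation ρ; the trial numbers are invented.

```python
from math import ceil

def crt_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized per-arm sample size by the design
    effect DE = 1 + (m - 1) * ICC, then convert to whole clusters per arm."""
    design_effect = 1 + (cluster_size - 1) * icc
    n_clustered = ceil(n_individual * design_effect)
    clusters_per_arm = ceil(n_clustered / cluster_size)
    return n_clustered, clusters_per_arm

# Hypothetical trial: 128 participants per arm suffice under individual
# randomization; clusters of m = 20 with ICC = 0.05.
n, k = crt_sample_size(128, 20, 0.05)
print(n, k)  # -> 250 13
```

Even a modest ICC of 0.05 nearly doubles the required sample here (design effect 1.95), which is why the review's more refined methods, handling unequal cluster sizes, attrition, and covariates, matter in practice.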

  2. Optimized method for dissolved hydrogen sampling in groundwater.

    PubMed

    Alter, Marcus D; Steiof, Martin

    2005-06-01

    Dissolved hydrogen concentrations are used to characterize redox conditions of contaminated aquifers. The currently accepted and recommended bubble strip method for hydrogen sampling (Wiedemeier et al., 1998) requires relatively long sampling times and immediate field analysis. In this study we present methods for optimized sampling and for sample storage. The bubble strip sampling method was examined for various flow rates, bubble sizes (headspace volume in the sampling bulb), and two different H2 concentrations, and the results were compared to a theoretical equilibration model. Turbulent flow in the sampling bulb was optimized for gas transfer by reducing the inlet diameter. Extraction with a 5 mL headspace volume and flow rates higher than 100 mL/min resulted in 95-100% equilibrium within 10-15 min. To investigate storage, gas samples from the sampling bulb were kept in headspace vials for varying periods. Hydrogen samples (4.5 ppmv, corresponding to 3.5 nM in the liquid phase) could be stored for up to 48 h and 72 h with recovery rates of 100.1±2.6% and 94.6±3.2%, respectively. These results are promising and demonstrate the possibility of storage for 2-3 days before laboratory analysis. The optimized method was tested at a field site contaminated with chlorinated solvents. Duplicate gas samples were stored in headspace vials and analyzed after 24 h. Concentrations were measured in the range of 2.5-8.0 nM, corresponding to known concentrations in reduced aquifers.
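The equilibration behavior can be sketched with a simple first-order model in which the headspace bubble approaches gas-water equilibrium at a rate proportional to the flow rate and inversely proportional to the headspace volume. The lumped rate constant below is purely illustrative, not the paper's fitted model.

```python
from math import exp

def fraction_equilibrated(t_min, flow_ml_min, headspace_ml, k=0.01):
    """First-order approach to gas-water equilibrium in a bubble-strip sampler:
    F(t) = 1 - exp(-k * Q / V * t). k is an illustrative lumped mass-transfer
    constant, not a value fitted to the study's data."""
    return 1 - exp(-k * flow_ml_min / headspace_ml * t_min)

# With a 5 mL headspace and 100 mL/min flow, how quickly does F approach 95%?
for t in (5, 10, 15):
    print(f"t = {t:2d} min -> {fraction_equilibrated(t, 100, 5):.1%} of equilibrium")
```

Under any such exponential model, halving the headspace or doubling the flow rate shortens the time to a given equilibration fraction proportionally, which is consistent with the direction of the optimizations reported above.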

  3. Methods for collection and analysis of water samples

    USGS Publications Warehouse

    Rainwater, Frank Hays; Thatcher, Leland Lincoln

    1960-01-01

    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  4. A comprehensive comparison of perpendicular distance sampling methods for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2013-01-01

    Many new methods for sampling down coarse woody debris have been proposed in the last dozen or so years. One of the most promising in terms of field application, perpendicular distance sampling (PDS), has several variants that have been progressively introduced in the literature. In this study, we provide an overview of the different PDS variants and comprehensive...

  5. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time average of two or more Legendre polynomials is zero.
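Assuming the polynomials in question are P2(cos θ) and P4(cos θ), as in double rotation, the required axis angles are the angles at which these polynomials vanish; a quick numerical check:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_zero_angles(order):
    """Spinner-axis angles (degrees, first quadrant) at which the
    order-n Legendre polynomial P_n(cos(theta)) vanishes."""
    coeffs = [0] * order + [1]           # basis vector selecting P_n
    roots = legendre.legroots(coeffs)    # zeros of P_n in cos(theta)
    return sorted(np.degrees(np.arccos(r)) for r in roots if 0 <= r <= 1)

print([round(a, 2) for a in legendre_zero_angles(2)])  # magic angle ~54.74
print([round(a, 2) for a in legendre_zero_angles(4)])  # DOR angles ~30.56, ~70.12
```

Spinning about a single axis at 54.74° cancels only the P2 term; sweeping or nesting the axis between the P4 zeros as well is what lets this class of apparatus average away both anisotropic terms.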

  6. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics are measurements of the quality of software and help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any can be implemented in their software assurance life cycle process.

  8. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  9. Adaptive cluster sampling: An efficient method for assessing inconspicuous species

    Treesearch

    Andrea M. Silletti; Joan Walker

    2003-01-01

    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  10. Field Evaluation of Personal Sampling Methods for Multiple Bioaerosols

    PubMed Central

    Wang, Chi-Hsun; Chen, Bean T.; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols. PMID:25799419

  11. Nominal Weights Mean Equating: A Method for Very Small Samples

    ERIC Educational Resources Information Center

    Babcock, Ben; Albano, Anthony; Raymond, Mark

    2012-01-01

    The authors introduced nominal weights mean equating, a simplified version of Tucker equating, as an alternative for dealing with very small samples. The authors then conducted three simulation studies to compare nominal weights mean equating to six other equating methods under the nonequivalent groups anchor test design with sample sizes of 20,…

  13. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  14. INTERVAL SAMPLING METHODS AND MEASUREMENT ERROR: A COMPUTER SIMULATION

    PubMed Central

    Wirth, Oliver; Slaven, James; Taylor, Matthew A.

    2015-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method’s inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. PMID:24127380
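The three scoring rules compared in this simulation can be sketched in a few lines of Python; the observation period, interval length, and event list below are illustrative assumptions, not the study's actual parameters:

```python
def overlaps(s, e, events):
    """True if any (start, end) event overlaps the interval [s, e)."""
    return any(a < e and b > s for a, b in events)

def covers(s, e, events):
    """True if a single event spans the whole interval [s, e)."""
    return any(a <= s and b >= e for a, b in events)

def occurring(t, events):
    """True if the behavior is occurring at instant t."""
    return any(a <= t < b for a, b in events)

def simulate(method, events, period=100, interval=10):
    """Score one observation session; returns percent of intervals scored."""
    n = period // interval
    hits = 0
    for j in range(n):
        s, e = j * interval, (j + 1) * interval
        if method == "MTS":    # momentary time sampling: look only at interval end
            hit = occurring(e - 1e-9, events)
        elif method == "PIR":  # partial-interval: any occurrence scores the interval
            hit = overlaps(s, e, events)
        else:                  # "WIR", whole-interval: behavior must span the interval
            hit = covers(s, e, events)
        hits += hit
    return 100.0 * hits / n
```

With events totalling 22% of the period, PIR overestimates and WIR underestimates, which is the qualitative error pattern such simulations quantify.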

  15. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  16. On the sampling method of the JSZ-4 Doppler receiver.

    NASA Astrophysics Data System (ADS)

    Cha, D.-Y.; Huang, K.-Y.

The authors discuss the properties of the JSZ-4 Doppler receiver and the problem of optimal recording. It is shown that the original sampling method loses information. A procedure for improvement is proposed.

  17. Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site

    DTIC Science & Technology

    2011-08-01

STATISTICAL VERIFICATION AND REMEDIATION SAMPLING METHODS (200837). August 2011. Pacific Northwest National Laboratory. Brent Pulsipher. ... Statistical Verification Sampling Methods in VSP, August 2011: 6.2.1 Transect Survey Design and Parameter Settings.

  18. Separation methods for taurine analysis in biological samples.

    PubMed

    Mou, Shifen; Ding, Xiaojing; Liu, Yongjian

    2002-12-05

    Taurine plays an important role in a variety of physiological functions, pharmacological actions and pathological conditions. Many methods for taurine analysis, therefore, have been reported to monitor its levels in biological samples. This review discusses the following techniques: sample preparation; separation and determination methods including high-performance liquid chromatography, gas chromatography, ion chromatography, capillary electrophoresis and hyphenation procedures. It covers articles published between 1990 and 2001.

  19. On-capillary sample cleanup method for the electrophoretic determination of carbohydrates in juice samples.

    PubMed

    Morales-Cid, Gabriel; Simonet, Bartolomé M; Cárdenas, Soledad; Valcárcel, Miguel

    2007-05-01

On many occasions, sample treatment is a critical step in electrophoretic analysis. As an alternative to batch procedures, a new strategy is presented in this work with a view to developing an on-capillary sample cleanup method. The strategy is based on partially filling the capillary with carboxylated single-walled carbon nanotubes (c-SWNTs). The nanoparticles retain interferences from the matrix, allowing the determination and quantification of carbohydrates (viz. glucose, maltose and fructose). The precision of the method for the analysis of real samples ranged from 5.3 to 6.4%. The proposed method was compared with a method, also validated in this work, based on batch filtration of the juice sample through diatomaceous earth followed by electrophoretic determination; the RSD for that method ranged from 5.1 to 6.0%. The results obtained by the two methods were statistically comparable, demonstrating the accuracy and effectiveness of the proposed method. Electrophoretic separation of the carbohydrates was achieved using 200 mM borate solution as a buffer at pH 9.5 and applying 15 kV; during separation, the capillary temperature was kept constant at 40 degrees C. For the on-capillary cleanup method, a solution containing 50 mg/L of c-SWNTs prepared in 300 mM borate solution at pH 9.5 was introduced into the capillary for 60 s just before sample introduction. For the electrophoretic analysis of samples cleaned in batch with diatomaceous earth, it is also recommended to introduce a 300 mM borate solution into the capillary just before the sample, as it enhances the sensitivity and electrophoretic resolution.

  20. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

Nuclear reaction analysis techniques are usually based on the relative method, i.e. on the use of activation cross sections together with standard samples. In order to validate nuclear data for cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the constituent concentrations of certified samples of animal blood, milk and hay. In this analysis the absolute method is used: the sample is irradiated and the induced activity is subsequently measured. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. The absolute method allows a measurement as accurate as the relative method, and our results show that the values it yields are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
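The fundamental activation equation referred to above can be illustrated with a hedged sketch that inverts the induced activity to recover an element mass without a standard. All nuclide parameters below are placeholders, and the simple saturation/decay/counting factors assume a single-gamma measurement:

```python
import math

def element_mass(counts, eff, gamma_yield, sigma_b, flux,
                 t_irr, t_decay, t_count, half_life, atomic_mass):
    """Invert the basic activation relation A0 = N*sigma*phi*S*D to
    recover the mass (g) of the target element (absolute NAA sketch)."""
    NA = 6.02214076e23
    lam = math.log(2) / half_life
    sigma = sigma_b * 1e-24                                  # barns -> cm^2
    S = 1 - math.exp(-lam * t_irr)                           # saturation during irradiation
    D = math.exp(-lam * t_decay)                             # decay before counting
    C = (1 - math.exp(-lam * t_count)) / (lam * t_count)     # decay during counting
    A0 = counts / (eff * gamma_yield * t_count * C)          # disintegration rate at count start
    N = A0 / (sigma * flux * S * D)                          # number of target atoms
    return N * atomic_mass / NA
```

A round trip (compute counts forward from a known mass, then invert) recovers the mass to machine precision, which is the sense in which no standard sample is needed.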

  1. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    PubMed

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

Filter paper was found to attract Oncomelania quadrasi in water in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube sampler (85 cm2 cross-sectional area) and a filter paper sampler (20 x 20 cm). Each sheet of filter paper was placed close to the spot where a tube sample was taken and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was repeated four times. The correlation between the number of snails collected by the tube and by the filter paper was studied. The ratio of the snail counts by the tube sampler to those by the filter paper was 1.18, and a loose correlation was observed between the counts of the two methods (correlation coefficient r = 0.6502). The regression lines for three experiments were Y = 0.77X + 1.6 and X = 0.55Y + 1.35, where Y is the number of snails collected by tube sampling and X the number collected on the filter paper. The type of snail distribution across the 30 samples was nearly the same for both methods: all sampling data fit the negative binomial distribution, with the constant k in (q - p)^-k varying widely, from 0.5775 to 5.9186. In each experiment k was larger for tube sampling than for filter paper sampling, indicating that the uneven distribution of snails on the soil surface is more conspicuous with filter paper sampling.
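Regression lines and correlation coefficients of the kind reported above come from ordinary least squares on the paired counts. A minimal sketch (the data below are made up for illustration, not the study's counts):

```python
def ols(xs, ys):
    """Ordinary least-squares slope and intercept for paired counts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

def pearson_r(xs, ys):
    """Correlation coefficient between the two samplers' counts."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5
```

Fitting Y on X and X on Y gives two different lines, which is why the abstract reports both regression formulas.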

  2. Quality of plasma sampled by different methods for multiple blood sampling in mice.

    PubMed

    Christensen, S D; Mikkelsen, L F; Fels, J J; Bodvarsdóttir, T B; Hansen, A K

    2009-01-01

For oral glucose tolerance tests (OGTT) in mice, multiple blood samples need to be taken within a few hours from conscious animals. Today, a number of essential parameters may be analysed on very small amounts of plasma, thus reducing the number of animals to be used. It is, however, crucial to obtain high-quality plasma or serum in order to avoid increased data variation and thereby increased group sizes. The aim of this study was to find the most valid and reproducible method of blood withdrawal when performing OGTT. Four methods, i.e. amputation of the tail tip, lateral tail incision, puncture of the tail tip and periorbital puncture, were selected for testing at 21 degrees C and 30 degrees C after a pilot study. For each method, four blood samples were drawn from C57BL/6 mice at 30 min intervals. The presence of clots was registered, haemolysis was monitored spectrophotometrically at 430 nm, and it was noted whether it was possible to achieve 30-50 microL of blood. Furthermore, a small amount of extra blood was sampled before and after the four samplings to test whether the sampling induced a change in blood glucose over the 90 min test period. All methods yielded acceptable amounts of plasma. Clots were observed in a small number of samples, with no significant differences between the methods. Periorbital puncture did not lead to any haemolysed samples at all, and lateral tail incision resulted in only a few haemolysed samples, while puncture or amputation of the tail tip induced haemolysis in a significant number of samples. All methods, except puncture of the tail tip, influenced blood glucose. Periorbital puncture resulted in a dramatic increase in blood glucose of up to 3.5 mmol/L, indicating that it is stressful. Although lateral tail incision also had some impact on blood glucose, it seems to be the method of choice for OGTT, as it is likely to produce a clot-free non-haemolysed sample, while periorbital sampling, although producing a

  3. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  4. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  5. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

When there is no sampling frame for a certain group, or when the group fears that making its membership public would bring social stigma, we say the population is hidden. Such populations are difficult to survey because response rates are low and members are often not honest in their responses when conventional probability sampling is used. The only alternative known to address the problems of earlier approaches such as snowball sampling is respondent-driven sampling (RDS), developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondents; this characteristic allows probability sampling when surveying a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this study, the bias of the chain-referral recruitment in RDS tends to diminish as the sample gets bigger, and the estimates stabilize as the waves progress. The final sample can therefore be effectively independent of the initial seeds once a certain sample size is secured, even if the seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key informant sampling and ethnographic surveys, and it deserves to be applied to a variety of domestic cases.
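A toy simulation of the chain-referral mechanism in RDS on a known network can illustrate how waves propagate from seeds; the network, coupon count, and seed choice below are illustrative assumptions, not the study's simulation design:

```python
import random

def rds_sample(network, seeds, coupons, target, rng=random):
    """Toy RDS: each sampled respondent hands `coupons` referrals to
    randomly chosen, not-yet-recruited neighbours (one Markov-chain
    step per wave) until `target` respondents have been sampled."""
    sample, frontier, recruited = [], list(seeds), set(seeds)
    while frontier and len(sample) < target:
        person = frontier.pop(0)            # next respondent in wave order
        sample.append(person)
        candidates = [p for p in network[person] if p not in recruited]
        for p in rng.sample(candidates, min(coupons, len(candidates))):
            recruited.add(p)
            frontier.append(p)
    return sample
```

Running this repeatedly from different seeds on the same network is one way to check how quickly the sample composition forgets the seeds.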

  6. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  7. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  8. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    1996-01-01

    The present invention provides methods and systems for detecting a labeled marker on a sample located on a support. The imaging system comprises a body for immobilizing the support, an excitation radiation source and excitation optics to generate and direct the excitation radiation at the sample. In response, labeled material on the sample emits radiation which has a wavelength that is different from the excitation wavelength, which radiation is collected by collection optics and imaged onto a detector which generates an image of the sample.

  9. Soil separator and sampler and method of sampling

    DOEpatents

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  10. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into digital domain for further processing and storage.
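In discrete form, the integrator stage described in this patent abstract amounts to summing the offset-corrected photodiode samples over the measurement window. A minimal sketch with hypothetical units and sample values:

```python
def integrated_fluorescence(signal_mV, offset_mV, dt_s):
    """Numerically integrate (photodiode signal - electronic offset)
    over the measurement window, mimicking the analog integrator
    that precedes digitization. Returns mV*s."""
    return sum((v - offset_mV) * dt_s for v in signal_mV)
```

Subtracting the offset before integrating is what suppresses background fluorescence and constant measurement-system noise: a background-only trace integrates to roughly zero.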

  11. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J.

    2017-06-27

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into digital domain for further processing and storage.

  12. Extending the alias Monte Carlo sampling method to general distributions

    SciTech Connect

    Edwards, A.L.; Rathkopf, J.A. ); Smidt, R.K. )

    1991-01-07

The alias method is a Monte Carlo sampling technique that offers significant advantages over more traditional methods: it equals the accuracy of table lookup and the speed of equally probable bins. The original formulation of the method sampled from discrete distributions and was easily extended to histogram distributions. We have extended the method further to applications more germane to Monte Carlo particle transport codes: continuous distributions. This paper presents the alias method as originally derived and our extensions to simple continuous distributions represented by piecewise linear functions. We also present a method to interpolate accurately between distributions tabulated at points other than the point of interest. We present timing studies that demonstrate the method's increased efficiency over table lookup and show further speedup achieved through vectorization. 6 refs., 12 figs., 2 tabs.
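The discrete case of the alias method can be sketched in a few lines (here in Vose's formulation; the paper's extension to piecewise-linear continuous distributions is not reproduced). Table construction is O(n) and each draw costs one uniform index plus one comparison:

```python
import random

def build_alias(probs):
    """Vose's alias method: O(n) table construction for O(1) draws
    from a discrete distribution."""
    n = len(probs)
    scaled = [p * n for p in probs]
    prob, alias = [0.0] * n, [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]
    large = [i for i, p in enumerate(scaled) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        prob[s], alias[s] = scaled[s], l
        scaled[l] = (scaled[l] + scaled[s]) - 1.0   # donate excess to the small bin
        (small if scaled[l] < 1.0 else large).append(l)
    for leftover in large + small:                  # numerical leftovers fill their bin
        prob[leftover] = 1.0
    return prob, alias

def alias_draw(prob, alias, rng=random):
    """One O(1) draw: pick a bin uniformly, then keep it or take its alias."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

Each bin holds at most two outcomes, which is why accuracy matches table lookup while the per-draw cost matches equally probable bins.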

  13. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes across each coupon (SM-MPC); and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p< 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  14. Tests of a comparative method of dating plutonium samples

    NASA Astrophysics Data System (ADS)

    West, D.

    1987-04-01

Tests of a comparative method of dating plutonium samples have been carried out using 241Pu in aqueous solution. The six samples were of known ages (between 0.25 and 15 yr) and, with one exception, the measured ages obtained using particular samples as standards agreed with the stated ages. In one case the agreement was better than 1% in age. Mixed-oxide fuel pins were also intercompared; in this case a sample of known age was obtained only with some difficulty. Comparison using this sample and an older one gave the same value (within ±1%) for the separation date of the unknown sample on three occasions over a three-year period.

  15. COMPARISON OF MACROINVERTEBRATE SAMPLING METHODS FOR NONWADEABLE STREAMS

    EPA Science Inventory

    The bioassessment of nonwadeable streams in the United States is increasing, but methods for these systems are not as well developed as for wadeable streams. In this study, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those us...

  16. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  18. Comparison of three different sampling methods for canine skin lipids.

    PubMed

    Angelbeck-Schulze, Mandy; Stahl, Jessica; Brodesser, Susanne; Rohn, Karl; Naim, Hassan; Hewicker-Trautwein, Marion; Kietzmann, Manfred; Bäumer, Wolfgang; Mischke, Reinhard

    2013-04-01

    Epidermal lipids are of major interest in dermatological research, especially in canine atopic dermatitis. Owing to the existence of several sampling methods, the interpretation of study results is often complicated. This study aimed to compare three different sampling methods and to establish a minimally invasive method for collecting canine epidermal lipids. Skin samples from five dogs with no obvious skin abnormalities were taken from the caudal back and the inguinal region postmortem. Samples consisted of heat-separated epidermis of three skin biopsies, three scrapes and three skin scrubs. Lipids were analysed by high-performance thin-layer chromatography; the resulting bands were identified by using corresponding standards, retardation factors and mass spectrometry. The influences of the sampling method, the body site and the ceramide standards were investigated. Between body sites, significant differences were found for cholesterol sulphate, cholesteryl esters and triglycerides. Significant differences between sampling methods were detected for all lipid fractions except for cholesterol sulphate and glucosylceramides within the lipid profile, and for at least four ceramide classes within the ceramide profile. The most obvious discrepancies were found between heat-separated epidermis and skin scrub. The reproducibility was high for scraping and skin scrub, but was lowest for heat-separated epidermis. Furthermore, this study revealed a marked influence of ceramide standards on the results regarding the ceramide profile. Scraping and skin scrub are comparably suitable methods for skin lipid sampling, whereas the analysis of heat-separated epidermis may not be the method of first choice. © 2013 The Authors. Veterinary Dermatology © 2013 ESVD and ACVD.

  19. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  20. A multi-dimensional sampling method for locating small scatterers

    NASA Astrophysics Data System (ADS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-11-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method.
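The MUSIC-type indicator underlying such methods can be sketched for the classical point-scatterer case; the array geometry, scatterer positions, and Born-approximation response matrix below are illustrative assumptions, and the paper's multi-dimensional combinatorial sampling refinement is not reproduced:

```python
import numpy as np

k = 2 * np.pi                                   # wavenumber (unit wavelength)

def g(points, x):
    """Background Green's function vector exp(ikr)/r from a trial
    location x to every array element."""
    d = np.linalg.norm(points - np.asarray(x, dtype=float), axis=1)
    return np.exp(1j * k * d) / d

# a linear transceiver array and two point scatterers (assumed geometry)
array_pts = np.column_stack([np.linspace(-5, 5, 21),
                             np.full(21, 10.0),
                             np.zeros(21)])
scatterers = [[1.0, 2.0, 0.0], [-2.0, 1.0, 0.0]]

# Born-approximation multistatic response matrix K = G diag(tau) G^T
G = np.column_stack([g(array_pts, s) for s in scatterers])
K = G @ np.diag([1.0, 0.8]) @ G.T

# signal subspace = leading left singular vectors (one per scatterer)
U, sv, _ = np.linalg.svd(K)
Us = U[:, :2]

def indicator(x):
    """MUSIC-type indicator: large where g(x) lies in the signal
    subspace, i.e. at the scatterer locations."""
    v = g(array_pts, x)
    v = v / np.linalg.norm(v)
    resid = v - Us @ (Us.conj().T @ v)          # projection onto noise subspace
    return 1.0 / np.linalg.norm(resid)
```

Evaluating the indicator on a grid produces sharp peaks at the scatterer positions; the MDSM of the paper builds a more noise-robust indicator from the stable part of this subspace on combinatorial sampling nodes.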

  1. Efficiency of snake sampling methods in the Brazilian semiarid region.

    PubMed

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z

    2013-09-01

The choice of sampling methods is a crucial step in every field survey in herpetology, and in countries where time and financial support are limited the choice is critical. The methods used to sample snakes often lack objective criteria, with tradition apparently carrying more weight than evidence, so studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area of Brazil, evaluating the cost-benefit of each method in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in every aspect evaluated, and they were not complementary to the other methods in terms of species abundance or assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  2. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    SciTech Connect

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and by methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which removed about twice as much residue as dry wipes. Criteria at 10 CFR 850.30 and .31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of appropriate wetting agents for the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced removal efficiency such as methanol when surface contamination includes oil mist residue.

  3. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  4. Methodological Development On Conditional Sampling Method: Application To Nox Fluxes Measured During The Escompte Campaign

    NASA Astrophysics Data System (ADS)

    Fotiadi, A.; Lohou, F.; Serça, D.; Druilhet, A.; Laville, P.; Bouchou, P.; Lopez, A.

    Surface fluxes of reactive nitrogen oxides (NOx = NO + NO2) are essential to quantify their net impact on the nitrogen and ozone budgets of the atmospheric boundary layer. To establish their sources and sinks accurately, specific measurement methods have to be developed that take into account sensor characteristics (e.g. time response). The most direct method to measure energy and gas fluxes is the Eddy Correlation (EC) method, based on the covariance between the vertical wind velocity (w) fluctuations and the scalar (X) fluctuations. The EC method requires fast-response sensors that are not available for many trace gases (such as NOx). The Relaxed Eddy Accumulation (REA), or conditional sampling, technique was proposed as an alternative to overcome this problem. A system for conditional sampling at the field scale was developed and applied to determine NOx fluxes in different Mediterranean ecosystems in the framework of the ESCOMPTE experimental campaign (June-July 2001). To assure the accuracy of the flux calculation, a methodological approach to data analysis has been developed, based on the statistical characteristics, internal structure and spectral analysis of the turbulent functions. It allows us to establish data selection criteria related to homogeneity, stationarity and turbulence characterisation. These criteria, which concern statistical characteristics of w recorded in real time during the sampling period, have been related to existing stability conditions. Assuming similarity between the 'slow' scalar related to the REA method and the 'fast' scalars related to EC (e.g. H2O, CO2, O3), other criteria based on covariance convergence can also be established to improve the quality of the REA measurements. Indeed, data analysis shows that the H2O, CO2 and O3 functions are highly correlated (correlation coefficient on the order of 0.9 in absolute value), which confirms the similarity assumption.
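
    The REA technique described above replaces the fast covariance of EC with conditional averages of the scalar in updrafts and downdrafts: F = b * sigma_w * (C_up - C_down), with an empirical coefficient b of roughly 0.56. The sketch below is illustrative only, using synthetic series rather than the campaign's data:

```python
import numpy as np

def rea_flux(w, c, b=0.56):
    """Relaxed Eddy Accumulation flux estimate.

    w : vertical wind velocity series (m/s)
    c : scalar concentration series sampled synchronously with w
    b : empirical coefficient (~0.56 under near-neutral conditions)
    """
    w = np.asarray(w, dtype=float) 
    c = np.asarray(c, dtype=float)
    w = w - w.mean()                 # work with fluctuations about the mean
    sigma_w = w.std()
    c_up = c[w > 0].mean()           # scalar accumulated during updrafts
    c_down = c[w < 0].mean()         # scalar accumulated during downdrafts
    return b * sigma_w * (c_up - c_down)

# Synthetic check: a scalar positively correlated with w gives an upward flux,
# close to the eddy-covariance value cov(w, c).
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 10_000)
c = 400.0 + 50.0 * w + rng.normal(0.0, 1.0, 10_000)
print(rea_flux(w, c))
```

    For a jointly Gaussian (w, c) the REA estimate tracks the EC covariance to within the uncertainty of the b coefficient, which is why covariance convergence of fast scalars can serve as a quality criterion.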

  5. Sample preparation methods for determination of drugs of abuse in hair samples: A review.

    PubMed

    Vogliardi, Susanna; Tucci, Marianna; Stocchero, Giulia; Ferrara, Santo Davide; Favretto, Donata

    2015-02-01

    Hair analysis has assumed increasing importance in the determination of substances of abuse, both in clinical and forensic toxicology investigations. Hair analysis offers particular advantages over other biological matrices (blood and urine), including a larger window of detection, ease of collection and sample stability. In the present work, an overview of sample preparation techniques for the determination of substances of abuse in hair is provided, specifically regarding the principal steps in hair sample treatment: decontamination, extraction and purification. For this purpose, a survey of publications found in the MEDLINE database from 2000 to date was conducted. The most widely consumed substances of abuse and psychotropic drugs were considered. Trends in the simplification of hair sample preparation, washing procedures and cleanup methods are discussed. Alternative sample extraction techniques, such as headspace solid-phase microextraction (HS-SPME), supercritical fluid extraction (SFE) and molecularly imprinted polymers (MIP), are also reported.

  6. Compliance Assurance Monitoring

    EPA Pesticide Factsheets

    Compliance assurance monitoring is intended to provide a reasonable assurance of compliance with applicable requirements under the Clean Air Act for large emission units that rely on pollution control device equipment to achieve compliance.

  7. Quality Assurance Project Plan

    SciTech Connect

    Holland, R. C.

    1998-06-01

    This Quality Assurance Project Plan documents the quality assurance activities for the Wastewater/Stormwater/Groundwater and Environmental Surveillance Programs. This QAPP was prepared in accordance with DOE guidance on compliance with 10 CFR 830.120.

  8. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    SciTech Connect

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

    Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed, mixed with mineral grains under controlled conditions, and compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g. packed moist sand, drained sand, moistened silica gel, other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005]. In another version of this technique, a specified quantity of gas

  9. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    DTIC Science & Technology

    2014-07-01

    purification process. Several purification methods were preprogrammed into the instrument, and all of the necessary reagents were supplied as prefilled ...attached to syringes that filter samples as a means of DNA isolation. Some advantages to this kit were that it required very few materials and was...fairly quick, if used with small amounts of sample. However, this kit was ideally used only with a small quantity at one time because of the syringe

  10. Characterizing lentic freshwater fish assemblages using multiple sampling methods.

    PubMed

    Fischer, Jesse R; Quist, Michael C

    2014-07-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48-1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  11. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  12. Fluidics platform and method for sample preparation and analysis

    SciTech Connect

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analysis can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  13. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  14. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  15. A proficiency test system to improve performance of milk analysis methods and produce reference values for component calibration samples for infrared milk analysis.

    PubMed

    Wojciechowski, Karen L; Melilli, Caterina; Barbano, David M

    2016-08-01

    Our goal was to determine the feasibility of combining proficiency testing, analytical method quality-assurance system, and production of reference samples for calibration of infrared milk analyzers to achieve a more efficient use of resources and reduce costs while maximizing analytical accuracy within and among milk payment-testing laboratories. To achieve this, we developed and demonstrated a multilaboratory combined proficiency testing and analytical method quality-assurance system as an approach to evaluate and improve the analytical performance of methods. A set of modified milks was developed and optimized to serve multiple purposes (i.e., proficiency testing, quality-assurance and method improvement, and to provide reference materials for calibration of secondary testing methods). Over a period of years, the approach has enabled the group of laboratories to document improved analytical performance (i.e., reduced within- and between-laboratory variation) of chemical reference methods used as the primary reference for calibration of high-speed electronic milk-testing equipment. An annual meeting of the laboratory technicians allows for review of results and discussion of each method and provides a forum for communication of experience and techniques that are of value to new analysts in the group. The monthly proficiency testing sample exchanges have the added benefit of producing all-laboratory mean reference values for a set of 14 milks that can be used for calibration, evaluation, and troubleshooting of calibration adjustment issues on infrared milk analyzers.

  16. Chemicals of emerging concern in water and bottom sediment in the Great Lakes Basin, 2012: collection methods, analytical methods, quality assurance, and study data

    USGS Publications Warehouse

    Lee, Kathy E.; Langer, Susan K.; Menheer, Michael A.; Hansen, Donald S.; Foreman, William T.; Furlong, Edward T.; Jorgenson, Zachary G.; Choy, Steven J.; Moore, Jeremy N.; Banda, JoAnn; Gefell, Daniel J.

    2015-01-01

    During this study, 53 environmental samples, 4 field duplicate samples, and 8 field spike samples of bottom sediment and laboratory matrix-spike samples were analyzed for a wide variety of CECs at the USGS National Water Quality Laboratory using laboratory schedule 5433 for wastewater indicators; research method 6434 for steroid hormones, sterols, and bisphenol A; and research method 9008 for human-use pharmaceuticals and antidepressants. Forty of the 57 chemicals analyzed using laboratory schedule 5433 had detectable concentrations ranging from 1 to 49,000 micrograms per kilogram. Fourteen of the 20 chemicals analyzed using research method 6434 had detectable concentrations ranging from 0.04 to 24,940 nanograms per gram. Ten of the 20 chemicals analyzed using research method 9008 had detectable concentrations ranging from 0.59 to 197.5 micrograms per kilogram. Five of the 11 chemicals analyzed using research method 9008 had detectable concentrations ranging from 1.16 to 25.0 micrograms per kilogram.

  17. A Review of Methods for Detecting Melamine in Food Samples.

    PubMed

    Lu, Yang; Xia, Yinqiang; Liu, Guozhen; Pan, Mingfei; Li, Mengjuan; Lee, Nanju Alice; Wang, Shuo

    2017-01-02

    Melamine is a synthetic chemical used in the manufacture of resins, pigments, and superplasticizers. Human beings can be exposed to melamine through various sources such as migration from related products into foods, pesticide contamination, and illegal addition to foods. Toxicity studies suggest that prolonged consumption of melamine could lead to the formation of kidney stones or even death. Therefore, reliable and accurate detection methods are essential to prevent human exposure to melamine. Sample preparation is of critical importance, since it could directly affect the performance of analytical methods. Some methods for the detection of melamine include instrumental analysis, immunoassays, and sensor methods. In this paper, we have summarized the state-of-the-art methods used for food sample preparation as well as the various detection techniques available for melamine. Combinations of multiple techniques and new materials used in the detection of melamine have also been reviewed. Finally, future perspectives on the applications of microfluidic devices have also been provided.

  18. RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-07-17

    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  19. Compressive sampling in computed tomography: Method and application

    NASA Astrophysics Data System (ADS)

    Hu, Zhanli; Liang, Dong; Xia, Dan; Zheng, Hairong

    2014-06-01

    Since Donoho and Candes et al. published their groundbreaking work on compressive sampling or compressive sensing (CS), CS theory has attracted a lot of attention and become a hot topic, especially in biomedical imaging. Specifically, some CS based methods have been developed to enable accurate reconstruction from sparse data in computed tomography (CT) imaging. In this paper, we will review the progress in CS based CT from aspects of three fundamental requirements of CS: sparse representation, incoherent sampling and reconstruction algorithm. In addition, some potential applications of compressive sampling in CT are introduced.
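
    The sparse-reconstruction requirement the review discusses can be illustrated with the classic iterative shrinkage-thresholding algorithm (ISTA) for the lasso problem. This is a generic compressive-sensing toy example, not a CT reconstruction; all dimensions and parameters are arbitrary choices for the demonstration:

```python
import numpy as np

def ista(A, y, lam=0.01, n_iter=2000):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L    # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Recover a 4-sparse signal of length 200 from only 50 random measurements.
rng = np.random.default_rng(1)
n, m, k = 200, 50, 4
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random Gaussian sensing matrix
y = A @ x_true                              # incoherent measurements
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

    The three ingredients the review organizes itself around appear directly here: sparsity of x_true, incoherent (random Gaussian) sampling in A, and an L1-regularized reconstruction algorithm.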

  20. Comparison of pigment content of paint samples using spectrometric methods.

    PubMed

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-15

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples were prepared: one containing a red and the other a green pigment. Each set consisted of 13 samples differing gradually in pigment concentration. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared based on Vis spectrometry according to colour theory. It was found that the samples are homogeneous (the parameter measuring colour similarity, ΔE, was below 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each subsequent one a logarithmic function.
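
    The ΔE quoted in the abstract is the CIE colour difference; in its simplest (CIE76) form it is the Euclidean distance between two colours in CIELAB space, and ΔE < 3 is the homogeneity criterion used above. A minimal sketch with hypothetical L*a*b* readings (not the study's measurements):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.dist(lab1, lab2)

# Hypothetical L*, a*, b* coordinates for two red paint samples
sample_a = (45.1, 58.3, 30.2)
sample_b = (44.8, 59.0, 30.9)
print(delta_e76(sample_a, sample_b))  # ~1.03, below the ΔE < 3 threshold
```

    Later ΔE formulas (CIE94, CIEDE2000) weight lightness and chroma differently, but the CIE76 distance is the form usually meant when a single ΔE threshold such as 3 is quoted.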

  1. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    PubMed

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed.

  2. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
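
    One standard way to exploit cheap sensitivity derivatives for variance reduction, in the spirit described above, is to subtract the zero-mean linearization of the response as a control variate: since E[∇f(μ)·(X − μ)] = 0, the corrected samples keep the same mean but lose the first-order variance. The sketch below uses a toy response function standing in for an expensive analysis code; the specific f and distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    """Toy response standing in for an expensive analysis code."""
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

def grad_f(x):
    """Sensitivity derivatives of f, here available analytically."""
    return np.stack([np.cos(x[..., 0]), x[..., 1]], axis=-1)

mu = np.array([0.3, 1.0])                 # nominal input
n = 2000
x = mu + 0.1 * rng.normal(size=(n, 2))    # input uncertainty around mu

plain = f(x)                              # plain Monte Carlo samples
# Control variate: the linearization grad_f(mu)·(x - mu) has zero mean,
# so subtracting it leaves the estimate unbiased but removes the dominant
# (first-order) component of the variance.
cv = plain - (x - mu) @ grad_f(mu)

print(plain.var(), cv.var())
```

    The residual variance of cv is of second order in the input spread, which is the mechanism behind the order-of-magnitude accuracy gains reported in the abstract for both Monte Carlo and stratified sampling.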

  3. Two-dimensional signal reconstruction: The correlation sampling method

    SciTech Connect

    Roman, H. E.

    2007-12-15

    An accurate approach for reconstructing a time-dependent two-dimensional signal from non-synchronized time series recorded at points located on a grid is discussed. The method, denoted as correlation sampling, improves the standard conditional sampling approach commonly employed in the study of turbulence in magnetoplasma devices. Its implementation is illustrated in the case of an artificial time-dependent signal constructed using a fractal algorithm that simulates a fluctuating surface. A statistical method is also discussed for distinguishing coherent (i.e., collective) from purely random (noisy) behavior for such two-dimensional fluctuating phenomena.

  4. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    SciTech Connect

    Chady, T.

    2004-02-26

    In this paper the magnetic leakage flux and eddy current methods were used to evaluate changes in material properties caused by stress. Seven samples made of ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate the level of applied stress. A strong correlation between the amount of applied stress and the maximum amplitude of the derivative was confirmed.

  5. Recording 2-D Nutation NQR Spectra by Random Sampling Method

    PubMed Central

    Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-01-01

    The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution. PMID:20949121

  6. Recording 2-D Nutation NQR Spectra by Random Sampling Method.

    PubMed

    Glotova, Olga; Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-10-01

    The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution.

  7. Multinational Quality Assurance

    ERIC Educational Resources Information Center

    Kinser, Kevin

    2011-01-01

    Multinational colleges and universities pose numerous challenges to the traditional models of quality assurance that are designed to validate domestic higher education. When institutions cross international borders, at least two quality assurance protocols are involved. To guard against fraud and abuse, quality assurance in the host country is…

  8. A comparison of sampling methods for examining the laryngeal microbiome

    PubMed Central

    Hanshew, Alissa S.; Jetté, Marie E.; Tadayon, Stephanie; Thibeault, Susan L.

    2017-01-01

    Shifts in healthy human microbial communities have now been linked to disease in numerous body sites. Noninvasive swabbing remains the sampling technique of choice in most locations; however, it is not well known if this method samples the entire community, or only those members that are easily removed from the surface. We sought to compare the communities found via swabbing and biopsied tissue in true vocal folds, a location that is difficult to sample without causing potential damage and impairment to tissue function. A secondary aim of this study was to determine if swab sampling of the false vocal folds could be used as proxy for true vocal folds. True and false vocal fold mucosal samples (swabbed and biopsied) were collected from six pigs and used for 454 pyrosequencing of the V3–V5 region of the 16S rRNA gene. Most of the alpha and beta measures of diversity were found to be significantly similar between swabbed and biopsied tissue samples. Similarly, the communities found in true and false vocal folds did not differ considerably. These results suggest that samples taken via swabs are sufficient to assess the community, and that samples taken from the false vocal folds may be used as proxies for the true vocal folds. Assessment of these techniques opens an avenue to less traumatic means to explore the role microbes play in the development of diseases of the vocal folds, and perhaps the rest of the respiratory tract. PMID:28362810
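
    Alpha-diversity comparisons like those above are commonly based on the Shannon index, H' = -Σ p_i ln p_i, computed over taxon abundances. A minimal sketch with hypothetical OTU counts (not the study's data) for a swab and a biopsy of the same site:

```python
import math

def shannon(counts):
    """Shannon alpha-diversity H' = -sum(p_i * ln p_i) over taxon counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical OTU counts from a swab vs. a biopsy of the same vocal fold
swab   = [120, 80, 40, 10, 5]
biopsy = [110, 85, 45, 12, 3]
print(round(shannon(swab), 3), round(shannon(biopsy), 3))
```

    Similar alpha-diversity values for paired samples, as in this toy case, are the kind of evidence the study uses to argue that swabs capture essentially the same community as biopsied tissue.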

  9. NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2007-08-28

    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  10. An improved gas chromatography screening method for doping substances using triple quadrupole mass spectrometry, with an emphasis on quality assurance.

    PubMed

    De Brabanter, Nik; Van Gansbeke, Wim; Geldof, Lore; Van Eenoo, Peter

    2012-11-01

    A GC-QqQ-MS method was developed for the detection of over 150 compounds from different classes (steroids, narcotics, stimulants, β-blockers, β-2-agonists and hormone antagonists) in a qualitative way. In the quantitative part, the traditional steroid profile with the most important endogenous steroids is expanded with six minor metabolites, which further improves the detection and identification of endogenous steroid abuse. In addition to these, norandrosterone, salbutamol and the major metabolite of cannabis are also quantified. Methods developed for anti-doping purposes should be subjected to the highest level of quality. Here, the addition of a combination of (deuterated) internal standards allows for an accurate quality control of every single step of the methodology: hydrolysis efficiency, derivatization efficiency and microbiological degradation are monitored in every single sample. Additionally, special attention is paid to the relationships between parameters indicating degradation by micro-organisms and the reliability of the steroid profile. The impact of the degradation is studied by evaluation of the quantities and percentages of 5α-androstane-3,17-dione and 5β-androstane-3,17-dione. The concept of measurement uncertainty was introduced for the evaluation of relative abundances of mass-to-charge ratios and the obtained ranges were compared with the World Anti-Doping Agency regulations on tolerance windows for relative ion intensities. The results indicate that the approaches are similar.

  11. A molecular method to assess Phytophthora diversity in environmental samples.

    PubMed

    Scibetta, Silvia; Schena, Leonardo; Chimento, Antonio; Cacciola, Santa O; Cooke, David E L

    2012-03-01

    Current molecular detection methods for the genus Phytophthora are specific to a few key species rather than the whole genus, and this is a recognized weakness of protocols for ecological studies and international plant health legislation. In the present study a molecular approach was developed to detect Phytophthora species in soil and water samples using novel sets of genus-specific primers designed against the internal transcribed spacer (ITS) regions. Two different rDNA primer sets were tested: one assay amplified a long product including the ITS1, 5.8S and ITS2 regions (LP) and the other a shorter product including the ITS1 only (SP). Both assays specifically amplified products from Phytophthora species without cross-reaction with the related Pythium s. lato; however, the SP assay proved more sensitive and reliable. The method was validated using woodland soil and stream water from Invergowrie, Scotland. On-site use of a knapsack sprayer and in-line water filters proved more rapid and effective than centrifugation for sampling Phytophthora propagules. A total of 15 different Phytophthora phylotypes were identified which clustered within the reported ITS-clades 1, 2, 3, 6, 7 and 8. The range and type of the sequences detected varied from sample to sample, and up to three and five different Phytophthora phylotypes were detected within a single sample of soil or water, respectively. The most frequently detected sequences were related to members of ITS-clade 6 (i.e. P. gonapodyides-like). The new method proved very effective at discriminating multiple species in a given sample and can also detect as yet unknown species. The reported primers and methods will prove valuable for ecological studies, biosecurity and commercial plant, soil or water (e.g. irrigation water) testing as well as the wider metagenomic sampling of this fascinating component of microbial pathogen diversity.

  12. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-08-27

    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May 2008. A new rapid column separation method was used for analysis of actinides and {sup 90}Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved: less than 3 hours for determination of {sup 90}Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a {approx}100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinide and {sup 90}Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples, and rugged methods are essential. Extremely high levels of {sup 210}Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced {sup 210}Po removal step, which will be described.

  13. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  14. Proteome Analysis of Human Perilymph using an Intraoperative Sampling Method.

    PubMed

    Schmitt, Heike Andrea; Pich, Andreas; Schröder, Anke; Scheper, Verena; Lilli, Giorgio; Reuter, Günter; Lenarz, Thomas

    2017-03-10

    Knowledge about the etiology and pathophysiology of sensorineural hearing loss (SNHL) is still very limited. The project aims to improve understanding of different types of SNHL through proteome analysis of human perilymph. Sampling of perilymph was established during inner ear surgeries (cochlear implant and vestibular schwannoma surgeries), and the safety of the sampling method was confirmed by pure tone audiometry. An in-depth shotgun proteomics approach was performed to identify cochlear proteins and the individual proteome in the perilymph of patients. This method enables the identification and quantification of the protein composition of perilymph. The proteome of 41 collected perilymph samples with volumes of 1-12 µl was analyzed by data-dependent acquisition, resulting in 878 detected protein groups overall. At least 203 protein groups were identified solely in perilymph, not in reference samples (serum, cerebrospinal fluid), displaying a protein pattern specific to perilymph. Samples were grouped according to patient age and type of surgery, leading to the identification of some proteins specific to particular subgroups. Proteins with different abundances between sample groups were classified by gene ontology annotations. The identified proteins might be used to develop tools for non-invasive inner ear diagnostics and to elucidate molecular profiles of SNHL.

  15. Assessment of a sequential extraction method to evaluate mercury mobility and geochemistry in solid environmental samples.

    PubMed

    Fernández-Martínez, Rodolfo; Rucandio, Isabel

    2013-11-01

    The development of a sequential extraction method for mercury in solid environmental samples is presented. The scheme recognizes and quantifies four major phase associations of mercury: "labile mercury species", "Hg bound to humic and fulvic complexes", "elemental Hg and Hg bound to crystalline oxides" and "Hg sulfide and refractory species". Model solids were used in this study to evaluate different extracting solutions and to determine optimum extraction conditions. Sequential and single-step extractions were conducted to evaluate the interaction among the successive steps. Different variables such as extractant concentration, time, temperature and number of extractions were optimized for each stage when necessary. The selectivity of each extraction step was verified through experiments with natural and synthetic matrices of specific Hg-bearing phases. The suitability of the proposed method was evaluated using four certified reference materials from different Hg sources, with different physicochemical properties and total Hg contents (from 0.3 µg g(-1) to 33 µg g(-1)). Recovery of total Hg by the sum of fractions in reference materials showed that the accuracy of the method ranges from 85 percent to 105 percent.

  16. Blue noise sampling method based on mixture distance

    NASA Astrophysics Data System (ADS)

    Qin, Hongxing; Hong, XiaoYang; Xiao, Bin; Zhang, Shaoting; Wang, Guoyin

    2014-11-01

    Blue noise sampling is a core component for a large number of computer graphics applications such as imaging, modeling, animation, and rendering. However, most existing methods concentrate on preserving spatial-domain properties such as density and anisotropy while ignoring feature preservation. To address this problem, we present a new distance metric called mixture distance for blue noise sampling, which is a combination of geodesic and feature distances. Based on mixture distance, the blue noise property and features can be preserved by controlling the ratio of the geodesic distance to the feature distance. With the intention of meeting different requirements from various applications, an adaptive adjustment for parameters is also proposed to achieve a balance between the preservation of features and spatial properties. Finally, an implementation on a graphics processing unit is introduced to improve the efficiency of computation. The efficacy of the method is demonstrated by the results of image stippling, surface sampling, and remeshing.
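    The mixture-distance idea can be sketched as a weighted blend of a spatial term and a feature term driving greedy farthest-point sampling. This is an illustrative sketch, not the paper's implementation: plain Euclidean distance stands in for geodesic distance, the scalar feature and the weight `alpha` are invented for the example.

```python
import numpy as np

def mixture_distance(p, q, fp, fq, alpha):
    """Blend a spatial distance (Euclidean stand-in for geodesic
    distance) with a feature distance, weighted by alpha in [0, 1]."""
    d_geo = np.linalg.norm(p - q)   # spatial term
    d_feat = abs(fp - fq)           # feature term (e.g. curvature difference)
    return alpha * d_geo + (1.0 - alpha) * d_feat

def farthest_point_sample(points, features, k, alpha=0.7):
    """Greedy farthest-point sampling under the mixture distance,
    a common way to approximate a blue-noise distribution."""
    n = len(points)
    chosen = [0]                    # arbitrary seed point
    min_d = np.full(n, np.inf)      # distance to nearest chosen point
    for _ in range(k - 1):
        last = chosen[-1]
        for i in range(n):
            d = mixture_distance(points[i], points[last],
                                 features[i], features[last], alpha)
            min_d[i] = min(min_d[i], d)
        chosen.append(int(np.argmax(min_d)))  # farthest remaining point
    return chosen

# Toy example: 2-D points, each carrying one scalar feature value.
rng = np.random.default_rng(0)
pts = rng.random((200, 2))
feats = rng.random(200)
sample = farthest_point_sample(pts, feats, k=10)
print(len(sample))  # 10
```

    Raising `alpha` favors the even spatial spacing of classic blue noise; lowering it pulls samples toward feature discontinuities, mirroring the adjustable ratio described in the abstract.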

  17. Source sampling and analysis guidance: A methods directory

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Baughman, K.W.; James, R.H.; Spafford, R.B.

    1991-01-01

    Sampling and analytical methodologies are needed by EPA and industry for testing stationary sources for specific organic compounds such as those listed under the Resource Conservation and Recovery Act (RCRA) Appendix 8 and Appendix 9 and the Clean Air Act of 1990. A computerized directory, the Problem POHC Reference Directory, has been developed that supplies information on available field sampling and analytical methodology for each compound on those lists. Existing EPA methods are referenced if applicable, along with their validation status. At present, the database is strongly oriented toward combustion sources. The database may be searched on the basis of several parameters, including name, Chemical Abstracts Service (CAS) number, physical properties, thermal stability, combustion rank, or general problem areas in sampling or analysis. The methods directory is menu driven and requires no programming ability; however, some familiarity with dBASE III+ would be helpful.

  18. Method and apparatus for sampling low-yield wells

    DOEpatents

    Last, George V.; Lanigan, David C.

    2003-04-15

    An apparatus and method for collecting a sample from a low-yield well or perched aquifer includes a pump and a controller responsive to water level sensors for filling a sample reservoir. The controller activates the pump to fill the reservoir when the water level in the well reaches a high level, as indicated by the sensors. The controller deactivates the pump when the water level reaches a lower level, as indicated by the sensors. The controller repeatedly activates and deactivates the pump until the sample reservoir is filled to a desired volume, as indicated by a reservoir sensor. At the beginning of each activation cycle, the controller optionally can purge an initial quantity of water prior to filling the sample reservoir. The reservoir can be substantially devoid of air, and the pump is a low volumetric flow rate pump. Both the pump and the reservoir can be located either inside or outside the well.
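    The fill cycle described above amounts to hysteresis control between the two level sensors. A minimal sketch, assuming a toy well model whose level recharges while idle and draws down while pumping; all names and numbers below are hypothetical, not from the patent:

```python
def fill_sample_reservoir(read_level, pump_to_reservoir, reservoir_full,
                          high_level, low_level, purge=None):
    """Hysteresis control sketch: pump only while the well holds enough
    water, cycling until the reservoir sensor reports the target volume."""
    while not reservoir_full():
        # Wait for the well to recover to the high-level sensor.
        if read_level() < high_level:
            continue
        if purge:
            purge()  # optionally discard an initial slug of water
        # Pump until the water falls to the low-level sensor.
        while read_level() > low_level and not reservoir_full():
            pump_to_reservoir()
    return True

class WellSim:
    """Toy low-yield well: pumping lowers the level; idling recharges it."""
    def __init__(self):
        self.level = 10.0       # water level, arbitrary units
        self.reservoir = 0.0    # liters collected

    def read_level(self):
        self.level = min(self.level + 0.5, 10.0)  # slow recharge per poll
        return self.level

    def pump(self):
        self.level -= 1.0       # drawdown per pump cycle
        self.reservoir += 0.25  # liters transferred per cycle

well = WellSim()
fill_sample_reservoir(well.read_level, well.pump,
                      lambda: well.reservoir >= 1.0,
                      high_level=8.0, low_level=4.0)
print(round(well.reservoir, 2))  # 1.0
```

    The two thresholds keep a low-yield well from being pumped dry: sampling proceeds in short bursts paced by the well's own recovery rate.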

  19. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  20. A General Linear Method for Equating with Small Samples

    ERIC Educational Resources Information Center

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…

  1. METHODS FOR THE ANALYSIS OF CARPET SAMPLES FOR ASBESTOS

    EPA Science Inventory

    Assessing asbestos fiber contamination in a carpet is complicated by the nature of the carpeting – because of the pile’s rough surface and thickness, samples cannot be collected directly from carpet for analysis by TEM. Two indirect methods are currently used by laboratories when...

  3. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating...

  4. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating...

  6. Performance of sampling methods to estimate log characteristics for wildlife.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton

    2004-01-01

    Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...

  7. Comparison of several analytical methods for the determination of tin in geochemical samples as a function of tin speciation

    USGS Publications Warehouse

    Kane, J.S.; Evans, J.R.; Jackson, J.C.

    1989-01-01

    Accurate and precise determinations of tin in geological materials are needed for fundamental studies of tin geochemistry, and for tin prospecting purposes. Achieving the required accuracy is difficult because of the different matrices in which Sn can occur (i.e. sulfides, silicates and cassiterite), and because of the variability of literature values for Sn concentrations in geochemical reference materials. We have evaluated three methods for the analysis of samples for Sn concentration: graphite furnace atomic absorption spectrometry (HGA-AAS) following iodide extraction, inductively coupled plasma atomic emission spectrometry (ICP-OES), and energy-dispersive X-ray fluorescence (EDXRF) spectrometry. Two of these methods (HGA-AAS and ICP-OES) required sample decomposition either by acid digestion or fusion, while the third (EDXRF) was performed directly on the powdered sample. Analytical details of all three methods, their potential errors, and the steps necessary to correct these errors were investigated. Results showed that similar accuracy was achieved from all methods for unmineralized samples, which contain no known Sn-bearing phase. For mineralized samples, which contain Sn-bearing minerals, either cassiterite or stannous sulfides, only EDXRF and fusion ICP-OES methods provided acceptable accuracy. This summary of our study provides information which helps to assure correct interpretation of data bases for underlying geochemical processes, regardless of method of data collection and its inherent limitations. © 1989.

  8. Passive sampling methods for contaminated sediments: scientific rationale supporting use of freely dissolved concentrations.

    PubMed

    Mayer, Philipp; Parkerton, Thomas F; Adams, Rachel G; Cargill, John G; Gan, Jay; Gouin, Todd; Gschwend, Philip M; Hawthorne, Steven B; Helm, Paul; Witt, Gesine; You, Jing; Escher, Beate I

    2014-04-01

    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree ) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive uptake into benthic organisms and exchange with the overlying water column. Consequently, Cfree provides a more relevant dose metric than total sediment concentration. Recent developments in PSMs have significantly improved our ability to reliably measure even very low levels of Cfree . Application of PSMs in sediments is preferably conducted in the equilibrium regime, where freely dissolved concentrations in the sediment are well-linked to the measured concentration in the sampler via analyte-specific partition ratios. The equilibrium condition can then be assured by measuring a time series or a single time point using passive samplers with different surface to volume ratios. Sampling in the kinetic regime is also possible and generally involves the application of performance reference compounds for the calibration. Based on previous research on hydrophobic organic contaminants, it is concluded that Cfree allows a direct assessment of 1) contaminant exchange and equilibrium status between sediment and overlying water, 2) benthic bioaccumulation, and 3) potential toxicity to benthic organisms. Thus, the use of PSMs to measure Cfree provides an improved basis for the mechanistic understanding of fate and transport processes in sediments and has the potential to significantly improve risk assessment and management of contaminated sediments.
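    The equilibrium relationship the abstract describes, Cfree linked to the measured sampler concentration via an analyte-specific partition ratio, reduces to a one-line calculation. The kinetic-regime correction shown below is a simplified sketch assuming first-order, isotropic exchange calibrated by a performance reference compound (PRC); the numbers are invented for illustration.

```python
def cfree_equilibrium(c_sampler, k_pw):
    """At equilibrium, the freely dissolved concentration is the
    sampler concentration divided by the sampler-water partition ratio."""
    return c_sampler / k_pw

def cfree_kinetic(c_sampler, k_pw, prc_fraction_lost):
    """Kinetic-regime sketch: a PRC pre-loaded into the sampler calibrates
    the degree of equilibration. Assumes first-order, isotropic exchange,
    so the analyte's degree of equilibration equals the PRC's
    fractional loss (a simplifying assumption, not a prescribed rule)."""
    degree_of_equilibration = prc_fraction_lost
    return c_sampler / (k_pw * degree_of_equilibration)

# Example: 50e-6 g analyte per kg polymer, log Kpw = 4 (L/kg).
c = cfree_equilibrium(50e-6, 10**4)
print(c)  # 5e-09 g/L, i.e. 5 ng/L freely dissolved

# Same sampler at 50% equilibration, inferred from PRC loss:
print(cfree_kinetic(50e-6, 10**4, 0.5))  # 1e-08 g/L
```

    The surface-to-volume-ratio check mentioned in the abstract serves the same purpose as the PRC: confirming that the measured sampler concentration really reflects equilibrium before the simple division is applied.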

  9. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small. Therefore, finding links between samples is more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
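    The final similarity-assessment step, Pearson correlation between impurity profiles, can be sketched as below. The peak-area vectors are hypothetical, and the sketch assumes retention times have already been shift-corrected so that peaks are matched position by position, as the data-processing modules in the paper do.

```python
import numpy as np

def pearson_similarity(profile_a, profile_b):
    """Pearson correlation between two impurity-peak-area vectors,
    assuming peaks are already matched by corrected retention time."""
    a = np.asarray(profile_a, float)
    b = np.asarray(profile_b, float)
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical peak-area profiles for three seized samples.
s1 = [120, 45, 300, 18, 60, 5]
s2 = [118, 47, 295, 20, 58, 6]    # same batch: nearly identical pattern
s3 = [10, 200, 30, 150, 5, 90]    # unrelated batch

print(pearson_similarity(s1, s2) > 0.99)   # True: common origin likely
print(pearson_similarity(s1, s3) > 0.99)   # False
```

    The 0.99 threshold mirrors the paper's observation that samples from the same case correlated above 0.99 after retention-time and response correction.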

  10. Sample Selected Averaging Method for Analyzing the Event Related Potential

    NASA Astrophysics Data System (ADS)

    Taguchi, Akira; Ono, Youhei; Kimura, Tomoaki

    The event related potential (ERP) is often measured through the oddball task, in which subjects are given a “rare stimulus” and a “frequent stimulus”. Measured ERPs are analyzed by the averaging technique. In the results, the amplitude of the ERP P300 becomes large when the “rare stimulus” is given. However, the measured ERPs include trials that lack the original ERP features. Thus, it is necessary to reject unsuitable measured ERPs before applying the averaging technique. In this paper, we propose a rejection method for unsuitable measured ERPs for use with the averaging technique. Moreover, we combine the proposed method with Woody's adaptive filter method.
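    A sample-selected average of the kind proposed might look like the sketch below. The amplitude-based rejection rule is an illustrative stand-in chosen for simplicity, not the paper's actual criterion, and the toy ERP data are invented.

```python
import numpy as np

def selected_average(trials, amp_max=3.0):
    """Average only trials whose peak amplitude stays within bounds.
    The simple amplitude-rejection rule is an illustrative assumption;
    the paper's own selection criterion may differ."""
    trials = np.asarray(trials, float)
    keep = trials[np.abs(trials).max(axis=1) <= amp_max]
    return keep.mean(axis=0), len(keep)

# Toy ERPs: a P300-like bump plus noise, and one large artifact trial.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 100)
p300 = np.exp(-((t - 0.3) ** 2) / 0.005)             # template component
good = [p300 + 0.3 * rng.standard_normal(100) for _ in range(9)]
artifact = [5.0 * rng.standard_normal(100)]          # no ERP, large noise
avg, n_kept = selected_average(good + artifact)
print(n_kept)  # 9: the artifact trial is rejected before averaging
```

    Averaging only the retained trials keeps the P300 bump from being buried by artifact trials, which is the motivation for rejecting unsuitable ERPs before the conventional average.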

  11. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  12. Comparison of aquatic macroinvertebrate samples collected using different field methods

    USGS Publications Warehouse

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

    Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey-National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine if the four different field methods that were used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  13. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
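    The category-based importance sampling can be illustrated with a simplified four-category version (the paper uses eight). Each sample point drawn from a category is weighted by the ratio of the category's true area fraction to its prescribed sampling probability, which keeps the estimate unbiased while concentrating points where the process rate matters. All fractions, rates, and densities below are invented for the example.

```python
import numpy as np

def importance_estimate(rates, cat_fractions, sample_densities, n, rng):
    """Estimate a grid-box-averaged process rate by importance sampling
    over subgrid categories. cat_fractions are the true area fractions;
    sample_densities are the modeler-prescribed probabilities of drawing
    a point from each category (oversampling the important regions)."""
    cats = rng.choice(len(rates), size=n, p=sample_densities)
    # Importance weight: true fraction / sampling probability.
    w = np.array(cat_fractions)[cats] / np.array(sample_densities)[cats]
    f = np.array(rates)[cats]
    return float(np.mean(w * f))

# Four illustrative categories: clear, cloud-only, rain-only, cloud+rain.
fractions = [0.5, 0.3, 0.1, 0.1]   # true area fractions of the grid box
rates = [0.0, 0.0, 2.0, 5.0]       # process rate active only where it rains
exact = sum(f * r for f, r in zip(fractions, rates))   # 0.7, the true average
densities = [0.1, 0.1, 0.4, 0.4]   # oversample the rainy categories

rng = np.random.default_rng(0)
est = importance_estimate(rates, fractions, densities, 4000, rng)
print(abs(est - exact) < 0.1)  # True: estimate is close to the exact average
```

    Because the two rainy categories carry all of the variance, drawing 80% of the points from the 20% of the grid box they occupy reduces sampling error relative to uniform draws, which is the effect the abstract reports for the rain-evaporation region.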

  14. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, to analyze samples of interest for geomicrobiology, and for the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices following manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for the preparation of posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  15. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  17. A method for sampling microbial aerosols using high altitude balloons.

    PubMed

    Bryan, N C; Stewart, M; Granger, D; Guzik, T G; Christner, B C

    2014-12-01

    Owing to the challenges posed to microbial aerosol sampling at high altitudes, very little is known about the abundance, diversity, and extent of microbial taxa in the Earth-atmosphere system. To directly address this knowledge gap, we designed, constructed, and tested a system that passively samples aerosols during ascent through the atmosphere while tethered to a helium-filled latex sounding balloon. The sampling payload weighs ~2.7 kg and comprises an electronics box and three sampling chambers (one serving as a procedural control). Each chamber is sealed with retractable doors that can be commanded to open and close at designated altitudes. The payload is deployed together with radio beacons that transmit GPS coordinates (latitude, longitude and altitude) in real time for tracking and recovery. A cut mechanism separates the payload string from the balloon at any desired altitude, returning all equipment safely to the ground on a parachute. When the chambers are opened, aerosol sampling is performed using the Rotorod® collection method (40 rods per chamber), with each rod passing through 0.035 m3 of air per km of altitude sampled. Based on quality control measurements, the collection of ~100 cells rod(-1) provided a 3-sigma confidence level of detection. The payload system described can be mated with any type of balloon platform and provides a tool for characterizing the vertical distribution of microorganisms in the troposphere and stratosphere.
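    The stated sampling rate (0.035 m3 of air per rod per km of ascent) and the ~100-cell detection threshold together imply a minimum detectable concentration per rod, which is a simple division; the 30 km flight altitude in the example is an assumed value, not from the paper.

```python
def min_detectable_concentration(altitude_km, volume_per_km=0.035,
                                 detection_cells=100):
    """Smallest cell concentration (cells per m3) detectable with
    3-sigma confidence by one rod, given the stated 0.035 m3/km
    sampling rate and the ~100-cell-per-rod detection threshold."""
    sampled_volume = volume_per_km * altitude_km   # m3 swept by one rod
    return detection_cells / sampled_volume

# A rod flown through an assumed 30 km of ascent:
print(round(min_detectable_concentration(30.0)))  # 95 cells per m3
```

    The calculation makes the design trade-off explicit: sensitivity improves linearly with the altitude range over which a chamber stays open, so narrow altitude bands can only resolve relatively dense microbial layers.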

  18. Measurement of radon potential from soil using a special method of sampling

    NASA Astrophysics Data System (ADS)

    Cosma, Constantin; Papp, Botond; Moldovan, Mircea; Cosma, Victor; Cindea, Ciprian; Suciu, Liviu; Apostu, Adelina

    2010-10-01

    Soil radon gas and/or its exhalation rate are used as indicators for some applications, such as uranium exploration, indoor radon concentration, seismic activity, location of subsurface faults, etc., and also in the studies where the main interest is the field verification of radon transport models. This work proposes a versatile method for the soil radon sampling using a special manner of pumping. The soil gas is passed through a column of charcoal by using passive pumping. A plastic bottle filled with water is coupled to an activated charcoal column and the flow of water through an adjustable hole made at the bottom of bottle assures a controlled gas flow from the soil. The results obtained for the activity of activated charcoal are in the range of 20-40 kBq/m3, for a depth of approximately 0.8 m. The results obtained by this method were confirmed by simultaneous measurements using LUK 3C device for soil radon measurements. Possible applications for the estimation of radon soil potential are discussed.
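
    The passive-pumping principle lends itself to a back-of-the-envelope check: the volume of soil gas drawn through the charcoal column equals the volume of water that drains from the bottle. In the sketch below the bottle volume and drain time are hypothetical illustration values; only the 20-40 kBq/m3 concentration range comes from the text.

```python
# Passive pumping: soil gas drawn through the charcoal replaces the water
# leaving the bottle, so gas volume sampled = water volume drained.
# Bottle volume and drain time are hypothetical, not values from the study.
def mean_gas_flow_mL_per_min(bottle_volume_mL: float, drain_time_min: float) -> float:
    """Average soil-gas flow equals the average water outflow rate."""
    return bottle_volume_mL / drain_time_min

flow = mean_gas_flow_mL_per_min(2000.0, 60.0)   # ~33.3 mL/min for a 2 L bottle
sampled_gas_m3 = 2000.0 * 1e-6                  # 0.002 m^3 of soil gas sampled
# At a soil-gas radon concentration of 30 kBq/m^3 (mid-range of the reported
# 20-40 kBq/m^3), the charcoal would collect roughly:
collected_Bq = 30e3 * sampled_gas_m3            # ~60 Bq
```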

  19. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY SOIL SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.; Noyes, G.

    2009-11-09

    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides in soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinides in soil results were reported within 4-5 hours with excellent quality.

  20. Analytical Quality by Design in pharmaceutical quality assurance: Development of a capillary electrophoresis method for the analysis of zolmitriptan and its impurities.

    PubMed

    Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra

    2015-11-01

    A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn, highlighting important interactions between buffer concentration and pH, and the gained information was merged into sweet spot plots. The design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performance to be assured within a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept were a substantial increase in knowledge of the analytical system, obtained through multivariate techniques, and the achievement of analytical quality assurance, derived from the probability-based definition of the DS. The developed method was finally validated and applied to the analysis of ZOL tablets.
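
    The probability-based design-space idea can be illustrated with a small Monte Carlo sketch: fluctuations of the working point are propagated through a fitted response surface, and the design space is the region where the probability of meeting the specification stays high. The quadratic model coefficients below are invented for illustration; only the working point (138 mM phosphate buffer, pH 2.74) comes from the abstract.

```python
import random

# Hedged sketch of a probability-based design space (DS). The response
# surface is HYPOTHETICAL; it is not the model fitted in the paper.
def resolution(conc_mM: float, pH: float) -> float:
    """Hypothetical Box-Behnken-style quadratic model for a critical resolution."""
    return 0.9 + 0.008 * conc_mM + 0.25 * (pH - 2.0) - 0.00002 * (conc_mM - 130.0) ** 2

def prob_resolution_ok(conc_mM, pH, spec=1.5, n=5000, sd_conc=3.0, sd_pH=0.05):
    """Monte Carlo estimate of P(resolution >= spec) under small random
    fluctuations of the working point; high-probability points form the DS."""
    rng = random.Random(0)
    hits = sum(
        resolution(rng.gauss(conc_mM, sd_conc), rng.gauss(pH, sd_pH)) >= spec
        for _ in range(n)
    )
    return hits / n

p = prob_resolution_ok(138.0, 2.74)   # probability of meeting the spec at the working point
```

    Scanning `prob_resolution_ok` over a grid of (concentration, pH) points and keeping those above a chosen probability threshold is the essence of the probability-based DS definition.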

  1. Spanish Multicenter Normative Studies (NEURONORMA Project): methods and sample characteristics.

    PubMed

    Peña-Casanova, Jordi; Blesa, Rafael; Aguilar, Miquel; Gramunt-Fombuena, Nina; Gómez-Ansón, Beatriz; Oliva, Rafael; Molinuevo, José Luis; Robles, Alfredo; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Alfonso, Verónica; Sol, Josep M

    2009-06-01

    This paper describes the methods and sample characteristics of a series of Spanish normative studies (The NEURONORMA project). The primary objective of our research was to collect normative and psychometric information on a sample of people aged over 49 years. The normative information was based on a series of selected, but commonly used, neuropsychological tests covering attention, language, visuo-perceptual abilities, constructional tasks, memory, and executive functions. A sample of 356 community dwelling individuals was studied. Demographics, socio-cultural, and medical data were collected. Cognitive normality was validated via informants and a cognitive screening test. Norms were calculated for midpoint age groups. Effects of age, education, and sex were determined. The use of these norms should improve neuropsychological diagnostic accuracy in older Spanish subjects. These data may also be of considerable use for comparisons with other normative studies. Limitations of these normative data are also commented on.

  2. Evaluation of sample preservation methods for poultry manure.

    PubMed

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P < 0.0001) of time for refrigeration was found on uric acid nitrogen and ammonia nitrogen. In experiment 2, the total Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d, but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen analyses.

  3. High assurance SPIRAL

    NASA Astrophysics Data System (ADS)

    Franchetti, Franz; Sandryhaila, Aliaksei; Johnson, Jeremy R.

    2014-06-01

    In this paper we introduce High Assurance SPIRAL to solve the last mile problem for the synthesis of high assurance implementations of controllers for vehicular systems that are executed in today's and future embedded and high performance embedded system processors. High Assurance SPIRAL is a scalable methodology to translate a high level specification of a high assurance controller into a highly resource-efficient, platform-adapted, verified control software implementation for a given platform in a language like C or C++. High Assurance SPIRAL proves that the implementation is equivalent to the specification written in the control engineer's domain language. Our approach scales to problems involving floating-point calculations and provides highly optimized synthesized code. It is possible to estimate the available headroom to enable assurance/performance trade-offs under real-time constraints, and to synthesize multiple implementation variants to make attacks harder. At the core of High Assurance SPIRAL is the Hybrid Control Operator Language (HCOL), which leverages advanced mathematical constructs expressing the controller specification to provide high quality translation capabilities. Combined with a verified/certified compiler, High Assurance SPIRAL provides a comprehensive solution to the efficient synthesis of verifiable high assurance controllers. We demonstrate High Assurance SPIRAL's capability by co-synthesizing proofs and implementations for attack detection and sensor spoofing algorithms and deploying the code as ROS nodes on the Landshark unmanned ground vehicle and on a Synthetic Car in a real-time simulator.

  4. Quality assurance program plan for radionuclide airborne emissions monitoring

    SciTech Connect

    Boom, R.J.

    1995-12-01

    This Quality Assurance Program Plan identifies quality assurance program requirements and addresses the various Westinghouse Hanford Company organizations and their particular responsibilities with regard to the sample and data handling of radiological airborne emissions. This Quality Assurance Program Plan is prepared in accordance with written requirements.

  5. EPA'S COASTAL 2000 MONITORING PROGRAM IN THE NORTHEAST U.S.: CONSISTENCY IN METHODS AND QUALITY ASSURANCE

    EPA Science Inventory

    As part of EPA's national Coastal 2000 effort to estimate the ecological condition of our Nation's estuarine resources, sampling of the estuaries of the northeast United States (Delaware to Maine) began in the summer of 2000. Samples and data were collected to determine water qua...

  6. Quality assurance in bariatric surgery.

    PubMed

    Rendon, Stewart E; Pories, Walter J

    2005-08-01

    Quality assurance is a function that exists in manufacturing, engineering, and the service industry. Bariatric surgery is an undertaking with a special form of consumer product and service. In this day of limited resources and significant value exchanges among stakeholders (ie, patients, surgeons, third-party payers), the goal of the bariatric community is to deliver quality outcomes with safety, efficacy, and efficiency. The American Society for Bariatric Surgery and the Surgical Review Corporation, in conjunction with the bariatric community, will use quality assurance methods to produce quality outcomes that will satisfy the value exchanges of all stakeholders.

  7. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  9. A direct method for e-cigarette aerosol sample collection.

    PubMed

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need for intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53 mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of what is likely the exact substance being delivered to the lungs.

  10. Harmonisation of microbial sampling and testing methods for distillate fuels

    SciTech Connect

    Hill, G.C.; Hill, E.C.

    1995-05-01

    Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality, based on numbers of viable microbial colony forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. This paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, and in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory based and on-site, are discussed.

  11. Quality Assurance Project Plan Development Tool

    EPA Pesticide Factsheets

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  12. Data-collection methods, quality-assurance data, and site considerations for total dissolved gas monitoring, lower Columbia River, Oregon and Washington, 2000

    USGS Publications Warehouse

    Tanner, Dwight Q.; Johnston, Matthew W.

    2001-01-01

    Excessive total dissolved gas pressure can cause gas-bubble trauma in fish downstream from dams on the Columbia River. In cooperation with the U.S. Army Corps of Engineers, the U.S. Geological Survey collected data on total dissolved gas pressure, barometric pressure, water temperature, and probe depth at eight stations on the lower Columbia River from the John Day forebay (river mile 215.6) to Camas (river mile 121.7) in water year 2000 (October 1, 1999, to September 30, 2000). These data are in the databases of the U.S. Geological Survey and the U.S. Army Corps of Engineers. Methods of data collection, review, and processing, and quality-assurance data are presented in this report.
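
    The monitored quantities combine into the standard derived statistic, percent total dissolved gas (TDG) saturation: TDG pressure relative to ambient barometric pressure. A minimal sketch; the readings are hypothetical, and the 110% figure is a commonly cited gas-bubble-trauma criterion, not a value from this report.

```python
# Percent TDG saturation from the two monitored pressures. Readings are
# hypothetical illustration values, not data from the stations described.
def tdg_percent_saturation(tdg_pressure_mmHg: float, barometric_mmHg: float) -> float:
    """Percent saturation = 100 * (total dissolved gas pressure / barometric pressure)."""
    return 100.0 * tdg_pressure_mmHg / barometric_mmHg

pct = tdg_percent_saturation(836.0, 760.0)   # -> approximately 110%
```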

  13. Ontario's Quality Assurance Framework: A Critical Response

    ERIC Educational Resources Information Center

    Heap, James

    2013-01-01

    Ontario's Quality Assurance Framework (QAF) is reviewed and found not to meet all five criteria proposed for a strong quality assurance system focused on student learning. The QAF requires a statement of student learning outcomes and a method and means of assessing those outcomes, but it does not require that data on achievement of intended…

  15. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  16. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  17. Recent advances in sample preparation techniques for effective bioanalytical methods.

    PubMed

    Kole, Prashant Laxman; Venkatesh, Gantala; Kotecha, Jignesh; Sheshala, Ravi

    2011-01-01

    This paper reviews recent developments in bioanalytical sample preparation techniques and gives an update on basic principles, theory, applications and possibilities for automation, with a comparative discussion of the advantages and limitations of each technique. Conventional liquid-liquid extraction (LLE), protein precipitation (PP) and solid-phase extraction (SPE) techniques are now considered methods of the past. The last decade has witnessed a rapid development of novel sample preparation techniques in bioanalysis. Developments in SPE techniques, such as selective sorbents, and in the overall approach to SPE, such as hybrid SPE and molecularly imprinted polymer SPE, have been addressed. Considerable literature has been published in the area of solid-phase micro-extraction and its different versions, e.g. stir bar sorptive extraction, and their application in the development of selective and sensitive bioanalytical methods. Techniques such as dispersive solid-phase extraction, disposable pipette extraction and micro-extraction by packed sorbent offer a variety of extraction phases and provide unique advantages to bioanalytical methods. On-line SPE utilizing column-switching techniques is rapidly gaining acceptance in bioanalytical applications. PP sample preparation techniques such as PP filter plates/tubes offer many advantages, like the removal of phospholipids and proteins in plasma/serum. Newer approaches to conventional LLE techniques (salting-out LLE) are also covered in this review article.

  18. Employer-Led Quality Assurance

    ERIC Educational Resources Information Center

    Tyszko, Jason A.

    2017-01-01

    Recent criticism of higher education accreditation has prompted calls for reform and sparked interest in piloting alternative quality assurance methods that better address student learning and employment outcomes. Although this debate has brought much needed attention to improving the outcomes of graduates and safeguarding federal investment in…

  19. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. Frequency of suspended sediment sampling is very low along the river

  20. Simplified sample preparation method for triclosan and methyltriclosan determination in biota and foodstuff samples.

    PubMed

    Canosa, P; Rodríguez, I; Rubí, E; Ramil, M; Cela, R

    2008-04-25

    An improved method for the determination of triclosan (TCS) and methyltriclosan (MTCS) in fish and foodstuff samples is presented. Analytes were simultaneously extracted and purified using the matrix solid-phase dispersion (MSPD) technique, and then selectively determined by gas chromatography with tandem mass spectrometry (GC-MS/MS). Several combinations of dispersants, clean-up co-sorbents and extraction solvents were tested in order to obtain lipid-free extracts and quantitative recoveries for TCS and MTCS. Under optimised conditions, 0.5 g samples were dispersed using 1.5 g of neutral silica in a mortar with a pestle, and transferred to a polypropylene cartridge containing 3 g of silica impregnated with 10% of sulphuric acid (SiO2-H2SO4, 10%, w/w). Analytes were recovered with 10 mL of dichloromethane whereas lipids were oxidized in the layer of acidic silica. The extract was concentrated to dryness and re-constituted with 1 mL of ethyl acetate. Then, a fraction of 0.5 mL was mixed with 50 microL of N-methyl-N-(tert-butyldimethylsilyl)trifluoroacetamide (MTBSTFA) and injected in the GC-MS/MS system. The developed method provided absolute recoveries between 77 and 120% for different samples spiked at the low ng g(-1) level, quantification limits in the range of 1-2 ng g(-1) and a considerable simplicity in comparison with previously developed sample preparation approaches. Experiments carried out placing sliced food samples in direct contact with TCS-treated kitchenware surfaces showed the capability of the biocide to migrate into foodstuffs.

  1. Rapid separation method for actinides in emergency air filter samples.

    PubMed

    Maxwell, Sherrod L; Culligan, Brian K; Noyes, Gary W

    2010-12-01

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified (90)Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinide and (90)Sr in air filter results were reported in less than 4 h with excellent quality. Copyright 2010 Elsevier Ltd. All rights reserved.

  2. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    SciTech Connect

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified (90)Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinide and (90)Sr in air filter results were reported in ~4 hours with excellent quality.

  3. Cool walking: a new Markov chain Monte Carlo sampling method.

    PubMed

    Brown, Scott; Head-Gordon, Teresa

    2003-01-15

    Effective relaxation processes for difficult systems like proteins or spin glasses require special simulation techniques that permit barrier crossing to ensure ergodic sampling. Numerous adaptations of the venerable Metropolis Monte Carlo (MMC) algorithm have been proposed to improve its sampling efficiency, including various hybrid Monte Carlo (HMC) schemes, and methods designed specifically for overcoming quasi-ergodicity problems such as Jump Walking (J-Walking), Smart Walking (S-Walking), Smart Darting, and Parallel Tempering. We present an alternative to these approaches that we call Cool Walking, or C-Walking. In C-Walking two Markov chains are propagated in tandem, one at a high (ergodic) temperature and the other at a low temperature. Nonlocal trial moves for the low temperature walker are generated by first sampling from the high-temperature distribution, then performing a statistical quenching process on the sampled configuration to generate a C-Walking jump move. C-Walking needs only one high-temperature walker, satisfies detailed balance, and offers the important practical advantage that the high and low-temperature walkers can be run in tandem with minimal degradation of sampling due to the presence of correlations. To make the C-Walking approach more suitable to real problems we decrease the required number of cooling steps by attempting to jump at intermediate temperatures during cooling. We further reduce the number of cooling steps by utilizing "windows" of states when jumping, which improves acceptance ratios and lowers the average number of cooling steps. We present C-Walking results with comparisons to J-Walking, S-Walking, Smart Darting, and Parallel Tempering on a one-dimensional rugged potential energy surface in which the exact normalized probability distribution is known. C-Walking shows superior sampling as judged by two ergodic measures.
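
    A simplified sketch in the spirit of C-Walking, not the authors' implementation: a high-temperature walker explores a rugged 1-D potential, and its quenched configurations are offered as nonlocal jump moves to a low-temperature walker. The potential, temperatures, and quench schedule are invented for illustration, and the plain Metropolis acceptance used for the jump only approximates the paper's detailed-balance construction.

```python
import math
import random

rng = random.Random(42)

def U(x: float) -> float:
    """Hypothetical rugged 1-D potential: harmonic well plus oscillations."""
    return 0.5 * x * x + 1.5 * math.cos(5.0 * x)

def metropolis_step(x: float, T: float, step: float) -> float:
    """One local Metropolis Monte Carlo move at temperature T."""
    y = x + rng.uniform(-step, step)
    if rng.random() < math.exp(min(0.0, -(U(y) - U(x)) / T)):
        return y
    return x

def quench(x: float, T_hi: float, T_lo: float, n_levels: int = 8, steps: int = 20) -> float:
    """Statistically cool a high-T configuration toward the low-T distribution."""
    for i in range(1, n_levels + 1):
        T = T_hi + (T_lo - T_hi) * i / n_levels
        for _ in range(steps):
            x = metropolis_step(x, T, 0.3)
    return x

T_hi, T_lo = 5.0, 0.2
x_hi, x_lo = 2.0, 2.0
samples = []
for it in range(2000):
    x_hi = metropolis_step(x_hi, T_hi, 1.0)       # ergodic high-T walker
    if it % 20 == 0:                              # occasional nonlocal jump proposal
        cand = quench(x_hi, T_hi, T_lo)
        if rng.random() < math.exp(min(0.0, -(U(cand) - U(x_lo)) / T_lo)):
            x_lo = cand
    else:
        x_lo = metropolis_step(x_lo, T_lo, 0.3)   # ordinary local move
    samples.append(x_lo)
```

    The point of the quenched jump is visible in the sample stream: local moves alone would leave the low-temperature walker trapped in the basin where it started, while accepted jumps carry it over barriers it could not cross at T = 0.2.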

  4. A Study to Develop a Method of Assessing Military Hospital Health Care Delivery Performance for Use in a Quality Assurance Program.

    DTIC Science & Technology

    1981-05-01

    assurance, were high costs and resistance by physicians and nurses to "cook book" review. Farrington, et al., stated that the process had become a game...to care, and impersonal treatment by the staff as the hospital's major problems. These problem areas relate to quality assurance structure rather than

  5. Riverland ERA cleanup sampling and analysis plan

    SciTech Connect

    Heiden, C.E.

    1993-07-01

    This report describes the Riverland Expedited Response Action taking place at the Hanford Reservation. Characterization of potential waste sites within the Riverland ERA boundaries was conducted in October and November 1992. This sampling and analysis plan contains two parts: The field sampling plan (Part 1) and the quality assurance project plan (Part 2). The field sampling plan describes the activities to be performed, defines sample designation, and identifies sample analysis to be performed. The quality assurance project plan establishes data quality objectives, defines analytical methods and procedures and documentation requirements, and provides established technical procedures to be used for field sampling and measurement. The quality assurance project plan details all quality assurance/quality control procedures to be followed to ensure that usable and defensible data are collected.

  6. Small satellite product assurance

    NASA Astrophysics Data System (ADS)

    Demontlivault, J.; Cadelec, Jacques

    1993-01-01

    In order to increase interest in small satellites, their cost must be reduced; reducing the product assurance costs induced by quality requirements is a major objective. For a logical approach, small satellites are classified into three main categories: satellites for experimental operations with a short lifetime; operational satellites manufactured in small mass with long lifetime requirements; and operational satellites (long lifetime required) of which only a few models are produced. The various product assurance requirements are examined for each satellite category: general requirements for the space approach, reliability, electronic components, materials and processes, quality assurance, documentation, tests, and management. An ideal product assurance system integrates quality teams and engineering teams.

  7. A Novel Method for Sampling Alpha-Helical Protein Backbones

    DOE R&D Accomplishments Database

    Fain, Boris; Levitt, Michael

    2001-01-01

    We present a novel technique for sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%-82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.

  8. A time domain sampling method for inverse acoustic scattering problems

    NASA Astrophysics Data System (ADS)

    Guo, Yukun; Hömberg, Dietmar; Hu, Guanghui; Li, Jingzhi; Liu, Hongyu

    2016-06-01

    This work concerns the inverse scattering problems of imaging unknown/inaccessible scatterers by transient acoustic near-field measurements. Based on the analysis of the migration method, we propose efficient and effective sampling schemes for imaging small and extended scatterers from knowledge of time-dependent scattered data due to incident impulsive point sources. Though the inverse scattering problems are known to be nonlinear and ill-posed, the proposed imaging algorithms are totally "direct", involving only integral calculations on the measurement surface. Theoretical justifications are presented and numerical experiments are conducted to demonstrate the effectiveness and robustness of our methods. In particular, the proposed static imaging functionals enhance the performance of the total focusing method (TFM), and the dynamic imaging functionals show behavior analogous to time reversal inversion but without solving time-dependent wave equations.

  9. [A membrane filter sampling method for determining microbial air pollution].

    PubMed

    Cherneva, P; Kiranova, A

    1996-01-01

    The method is a contribution to the assessment of exposure and to checking compliance with standards for organic dusts. It covers both the sampling procedure and the analytical technique for determining the concentration of microbial pollution of the air. It is based on filtering a quantity of air through a membrane filter, which is then processed to cultivate microbial colonies on its surface. The results are expressed as the number of microbial colonies per unit volume of air. The method makes it possible to select and vary the filtered volume of air, to determine the respirable fraction, to determine personal exposure, and to determine microbial pollution simultaneously with other important particulate pollutants of the air (metals, fibres and others).
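The conversion from a colony count on the filter to a concentration per unit volume of air is simple arithmetic. A minimal sketch; the flow rate and duration below are hypothetical illustration values, not figures from the study:

```python
def cfu_per_m3(colony_count: int, flow_l_per_min: float, minutes: float) -> float:
    """Convert a colony count on a membrane filter to CFU per cubic metre of air.

    flow_l_per_min: pump flow rate in litres per minute (assumed known).
    minutes: sampling duration in minutes.
    """
    litres_sampled = flow_l_per_min * minutes
    m3_sampled = litres_sampled / 1000.0  # 1 m^3 = 1000 L
    return colony_count / m3_sampled

# Example: 42 colonies after sampling at 2 L/min for 50 min -> 100 L = 0.1 m^3
print(cfu_per_m3(42, 2.0, 50.0))  # 420.0
```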

  10. Nonuniform sampling of urodynamic signals: a comparison of different methods.

    PubMed

    Kocjan, T; van Mastrigt, R

    1994-01-01

    Several different techniques for urodynamic signal compression have been proposed in the last few years. Using these techniques it is possible to reduce the requirements for digital storage or transmission. There are a number of applications in diagnostic and ambulatory urodynamics where it is essential to use such techniques. The purpose of this study is to compare different techniques of urodynamic data compression: the so-called FAN, voltage-triggered, two-point projection and second-difference methods. The comparison between the methods is based on 65 pressure, 46 uroflow and 18 surface electromyogram signals. The reduction ratio achieved for different allowable errors between the original and compressed signals is calculated and compared for the different techniques. Results show that it is possible to store urodynamic signals accurately at a low sampling rate, with the FAN and voltage-triggered methods appearing superior to the rest.
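The FAN method compared above is a classic first-order compression scheme: a sample is retained only when the signal leaves the converging "fan" of slopes that would keep linear interpolation within a tolerance. The sketch below is a generic textbook implementation of that idea for illustration, not the code evaluated in the study:

```python
def fan_compress(signal, eps):
    """FAN first-order compression: retain only the samples needed so that
    linear interpolation between retained samples stays within +/- eps of
    every original sample.  Returns (indices, values) of retained samples."""
    if len(signal) < 2:
        return list(range(len(signal))), list(signal)
    kept = [0]
    i0, y0 = 0, signal[0]                      # current fan anchor
    hi, lo = float("inf"), float("-inf")       # admissible slope corridor
    for n in range(1, len(signal)):
        dt = n - i0
        slope = (signal[n] - y0) / dt
        if slope > hi or slope < lo:
            # Point n falls outside the fan: store the previous sample
            # and restart the fan from it.
            kept.append(n - 1)
            i0, y0 = n - 1, signal[n - 1]
            hi, lo = float("inf"), float("-inf")
            dt = 1
        # Narrow the fan with the +/- eps corridor around the current point.
        hi = min(hi, (signal[n] + eps - y0) / dt)
        lo = max(lo, (signal[n] - eps - y0) / dt)
    if kept[-1] != len(signal) - 1:
        kept.append(len(signal) - 1)
    return kept, [signal[i] for i in kept]
```

The reduction ratio the authors report corresponds here to `len(signal) / len(kept)` for a given allowable error `eps`.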

  11. Novel method for pairing wood samples in choice tests.

    PubMed

    Oberst, Sebastian; Evans, Theodore A; Lai, Joseph C S

    2014-01-01

    Choice tests are a standard method to determine preferences in bio-assays, e.g. for food types and food additives such as bait attractants and toxicants. Choice between food additives can be determined only when the food substrate is sufficiently homogeneous. This is difficult to achieve for wood-eating organisms as wood is a highly variable biological material, even within a tree species due to the age of the tree (e.g. sapwood vs. heartwood), and components therein (sugar, starch, cellulose and lignin). The current practice to minimise variation is to use wood from the same tree, yet the variation can still be large and the quantity of wood from one tree may be insufficient. We used wood samples of identical volume from multiple sources, measured three physical properties (dry weight, moisture absorption and reflected light intensity), then ranked and clustered the samples using fuzzy c-means clustering. A reverse analysis of the clustered samples found a high correlation between their physical properties and their source of origin. This suggested approach allows a quantifiable, consistent, repeatable, simple and quick method to maximize control over similarity of wood used in choice tests.
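Fuzzy c-means, used above to cluster samples on three measured physical properties, can be sketched in a few lines of NumPy. This is the standard textbook algorithm, not the authors' pipeline; the fuzzifier `m`, iteration count, and synthetic data are illustrative assumptions:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means.  X: (n_samples, n_features).
    Returns (centers, U) where U[i, j] is the membership of sample i in cluster j."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted cluster means
        # Squared distances from each sample to each center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        d2 = np.maximum(d2, 1e-12)             # avoid division by zero
        # Standard membership update: U_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1)).
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

With wood-sample data, `X` would hold the three normalised property measurements and `U.argmax(axis=1)` gives the hard cluster assignment used for pairing.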

  12. Novel Method for Pairing Wood Samples in Choice Tests

    PubMed Central

    Oberst, Sebastian; Evans, Theodore A.; Lai, Joseph C. S.

    2014-01-01

    Choice tests are a standard method to determine preferences in bio-assays, e.g. for food types and food additives such as bait attractants and toxicants. Choice between food additives can be determined only when the food substrate is sufficiently homogeneous. This is difficult to achieve for wood-eating organisms as wood is a highly variable biological material, even within a tree species due to the age of the tree (e.g. sapwood vs. heartwood), and components therein (sugar, starch, cellulose and lignin). The current practice to minimise variation is to use wood from the same tree, yet the variation can still be large and the quantity of wood from one tree may be insufficient. We used wood samples of identical volume from multiple sources, measured three physical properties (dry weight, moisture absorption and reflected light intensity), then ranked and clustered the samples using fuzzy c-means clustering. A reverse analysis of the clustered samples found a high correlation between their physical properties and their source of origin. This suggested approach allows a quantifiable, consistent, repeatable, simple and quick method to maximize control over similarity of wood used in choice tests. PMID:24551173

  13. Hand held sample tube manipulator, system and method

    DOEpatents

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH

    2001-01-01

    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, the inlet end of the plunger mechanism is adapted for connection to a gas supply; a first seal is disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal is disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  14. Field evaluation of endotoxin air sampling assay methods.

    PubMed

    Thorne, P S; Reynolds, S J; Milton, D K; Bloebaum, P D; Zhang, X; Whitten, P; Burmeister, L F

    1997-11-01

    This study tested the importance of filter media, extraction and assay protocol, and bioaerosol source on the determination of endotoxin under field conditions in swine and poultry confinement buildings. Multiple simultaneous air samples were collected using glass fiber (GF) and polycarbonate (PC) filters, and these were assayed using two methods in two separate laboratories: an endpoint chromogenic Limulus amebocyte lysate (LAL) assay (QCL) performed in water and a kinetic chromogenic LAL assay (KQCL) performed in buffer with resistant-parallel line estimation analysis (KLARE). In addition, two aqueous filter extraction methods were compared in the QCL assay: 120 min extraction at 22 degrees C with vigorous shaking and 30 min extraction at 68 degrees C with gentle rocking. These extraction methods yielded endotoxin activities that were not significantly different and were very highly correlated. Reproducibility of endotoxin determinations from duplicate air sampling filters was very high (Cronbach alpha all > 0.94). When analyzed by the QCL method, GF filters yielded significantly higher endotoxin activity than PC filters. QCL and KLARE methods gave similar estimates for endotoxin activity from PC filters; however, GF filters analyzed by the QCL method yielded significantly higher endotoxin activity estimates, suggesting enhancement of the QCL assay or inhibition of the KLARE assay with GF filters. Correlation between QCL-GF and QCL-PC was high (r = 0.98) while that between KLARE-GF and KLARE-PC was moderate (r = 0.68). Analysis of variance demonstrated that assay methodology, filter-type, barn-type, and interactions between assay and filter-type and between assay and barn-type were important factors influencing endotoxin exposure assessment.

  15. Methods for parasitic protozoans detection in the environmental samples.

    PubMed

    Skotarczak, B

    2009-09-01

    The environmental route of transmission of many parasitic protozoa and their potential for producing large numbers of transmissive stages constitute persistent threats to public and veterinary health. Conventional and new immunological and molecular methods make it possible to assess the occurrence, prevalence, levels and sources of waterborne protozoa. Concentration, purification, and detection are the three key steps in all methods that have been approved for routine monitoring of waterborne cysts and oocysts. These steps have been optimized to such an extent that low levels of naturally occurring (oo)cysts of protozoa can be efficiently recovered from water. Ten years have passed since the United States Environmental Protection Agency (USEPA) introduced methods 1622 and 1623 and used them to concentrate and detect the oocysts of Cryptosporidium and cysts of Giardia in water samples. Nevertheless, the methods still require study and improvement. Pre-PCR processing procedures have been developed, and are still being improved, to remove or reduce the effects of PCR inhibitors. Progress in molecular methods allows more precise distinction of species and simultaneous detection of several parasites; however, these methods are still not routinely used and need standardization. Standardized methods are required to maximize public health surveillance.

  16. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics.

  17. Well fluid isolation and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. A seal may be positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Purged well fluid is stored in a riser above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  18. [Wound microbial sampling methods in surgical practice, imprint techniques].

    PubMed

    Chovanec, Z; Veverková, L; Votava, M; Svoboda, J; Peštál, A; Doležel, J; Jedlička, V; Veselý, M; Wechsler, J; Čapov, I

    2012-12-01

    A wound is damage to tissue. The process of healing is influenced by many systemic and local factors. The most crucial, and the most discussed, local factor in wound healing is infection. Surgical site infection in the wound is caused by micro-organisms. This has been known for many years; however, the conditions leading to the occurrence of infection have not yet been sufficiently described. Correct sampling technique, correct storage, transportation, evaluation, and valid interpretation of these data are very important in clinical practice. There are many methods for microbiological sampling, but the best one has not yet been identified and validated. We aim to discuss the problem with a focus on the imprint technique.

  19. Miniaturized sample preparation method for determination of amphetamines in urine.

    PubMed

    Nishida, Manami; Namera, Akira; Yashiki, Mikio; Kimura, Kojiro

    2004-07-16

    A simple and miniaturized sample preparation method for determination of amphetamines in urine was developed using on-column derivatization and gas chromatography-mass spectrometry (GC-MS). Urine was directly applied to the extraction column that was pre-packed with Extrelut and sodium carbonate. Amphetamine (AP) and methamphetamine (MA) in urine were adsorbed on the surface of Extrelut. AP and MA were then converted to a free base and derivatized to N-propoxycarbonyl derivatives using propylchloroformate on the column. Pentadeuterated MA was used as an internal standard. The recoveries of AP and MA from urine were 100 and 102%, respectively. The calibration curves showed linearity in the range of 0.50-50 microg/mL for AP and MA in urine. When urine samples containing two different concentrations (0.50 and 5.0 microg/mL) of AP and MA were determined, the intra-day and inter-day coefficients of variation were 1.4-7.7%. This method was applied to 14 medico-legal cases of MA intoxication. The results were compared and a good agreement was obtained with a HPLC method.

  20. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    NASA Astrophysics Data System (ADS)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agriculturally applied chemicals through the vadose zone is a major cause of the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and the effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly-structured silty clay loam (Hudson series) soil. Soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters, all installed at 45 to 105 cm depth below the ground surface. A reasonable worst-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. The herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxyacetic acid) were chosen as model compounds. A chloride (KCl) tracer was used to determine the spatial and temporal distribution of non-reactive solute and water, as well as a basis for determining the retardation in pesticide movement. Results show that observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity-driven samplers when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns that were similar to pan samplers. At small plot scale, tile line samplers tended to underestimate solute concentration because of water dilution around the samplers. The porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. The wick sampler had the least

  1. Method optimization for fecal sample collection and fecal DNA extraction.

    PubMed

    Mathay, Conny; Hamot, Gael; Henry, Estelle; Georges, Laura; Bellora, Camille; Lebrun, Laura; de Witt, Brian; Ammerlaan, Wim; Buschart, Anna; Wilmes, Paul; Betsou, Fay

    2015-04-01

    This is the third in a series of publications presenting formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks. We report here optimization of a stool processing protocol validated for fitness-for-purpose in terms of downstream DNA-based analyses. Stool collection was initially optimized in terms of sample input quantity and supernatant volume using canine stool. Three DNA extraction methods (PerkinElmer MSM I®, Norgen Biotek All-In-One®, MoBio PowerMag®) and six collection container types were evaluated with human stool in terms of DNA quantity and quality, DNA yield, and its reproducibility by spectrophotometry, spectrofluorometry, and quantitative PCR, DNA purity, SPUD assay, and 16S rRNA gene sequence-based taxonomic signatures. The optimal MSM I protocol involves a 0.2 g stool sample and 1000 μL supernatant. The MSM I extraction was superior in terms of DNA quantity and quality when compared to the other two methods tested. Optimal results were obtained with plain Sarstedt tubes (without stabilizer, requiring immediate freezing and storage at -20°C or -80°C) and Genotek tubes (with stabilizer and RT storage) in terms of DNA yields (total, human, bacterial, and double-stranded) according to spectrophotometry and spectrofluorometry, with low yield variability and good DNA purity. No inhibitors were identified at 25 ng/μL. The protocol was reproducible in terms of DNA yield among different stool aliquots. We validated a stool collection method suitable for downstream DNA metagenomic analysis. DNA extraction with the MSM I method using Genotek tubes was considered optimal, with simple logistics in terms of collection and shipment and offers the possibility of automation. Laboratories and biobanks should ensure protocol conditions are systematically recorded in the scope of accreditation.

  2. Drum plug piercing and sampling device and method

    DOEpatents

    Counts, Kevin T [Aiken, SC

    2011-04-26

    An apparatus and method for piercing a drum plug of a drum in order to sample and/or vent gases that may accumulate in a space of the drum is provided. The drum is not damaged and can be reused since the pierced drum plug can be subsequently replaced. The apparatus includes a frame that is configured for engagement with the drum. A cylinder actuated by a fluid is mounted to the frame. A piercer is placed into communication with the cylinder so that actuation of the cylinder causes the piercer to move in a linear direction so that the piercer may puncture the drum plug of the drum.

  3. A GPU code for analytic continuation through a sampling method

    NASA Astrophysics Data System (ADS)

    Nordström, Johan; Schött, Johan; Locht, Inka L. M.; Di Marco, Igor

    We here present a code for performing analytic continuation of fermionic Green's functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  4. Sample Size for Assessing Agreement between Two Methods of Measurement by Bland-Altman Method.

    PubMed

    Lu, Meng-Jie; Zhong, Wei-Hua; Liu, Yu-Xiu; Miao, Hua-Zhang; Li, Yong-Chang; Ji, Mu-Huo

    2016-11-01

    The Bland-Altman method has been widely used for assessing agreement between two methods of measurement. However, the problem of sample size estimation for it remains unsolved. We propose a new method of sample size estimation for Bland-Altman agreement assessment. According to the Bland-Altman method, the conclusion on agreement is made based on the width of the confidence interval for the LOAs (limits of agreement) in comparison to a predefined clinical agreement limit. Under the theory of statistical inference, formulae for sample size estimation are derived, which depend on the predetermined levels of α and β, the mean and the standard deviation of differences between the two measurements, and the predefined limits. With this new method, sample sizes are calculated under different parameter settings which occur frequently in method comparison studies, and Monte-Carlo simulation is used to obtain the corresponding powers. The results of the Monte-Carlo simulation showed that the achieved powers coincide with the pre-determined level of power, thus validating the correctness of the method. This method of sample size estimation can be applied in the Bland-Altman method to assess agreement between two methods of measurement.
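The decision rule described (both confidence bounds of the LOAs must fall inside the predefined clinical limit) lends itself to a Monte-Carlo power check like the one the authors use for validation. The sketch below uses the common large-sample normal approximation for the LOA standard error and illustrative parameters; it is not the formulae derived in the paper:

```python
import numpy as np

def ba_agreement_power(n, mu, sd, delta, n_sim=2000, seed=1):
    """Monte-Carlo power of concluding agreement by the Bland-Altman rule:
    both 95% CI bounds of the limits of agreement fall inside +/- delta.

    Uses the large-sample approximation SE(LOA) = s*sqrt(1/n + 1.96^2/(2(n-1)))
    and a normal critical value; mu, sd describe the true differences."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        d = rng.normal(mu, sd, n)
        mean, s = d.mean(), d.std(ddof=1)
        se_loa = s * np.sqrt(1.0 / n + 1.96**2 / (2.0 * (n - 1)))
        upper = mean + 1.96 * s + 1.96 * se_loa   # upper CI bound of upper LOA
        lower = mean - 1.96 * s - 1.96 * se_loa   # lower CI bound of lower LOA
        if upper < delta and lower > -delta:
            hits += 1
    return hits / n_sim
```

Sweeping `n` upward until the returned power reaches the target (e.g. 0.80) gives a simulation-based sample size, which is the quantity the paper's closed-form formulae estimate directly.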

  5. Introduction: Cybersecurity and Software Assurance Minitrack

    SciTech Connect

    Burns, Luanne; George, Richard; Linger, Richard C

    2015-01-01

    Modern society is dependent on software systems of remarkable scope and complexity. Yet methods for assuring their security and functionality have not kept pace. The result is persistent compromises and failures despite best efforts. Cybersecurity methods must work together for situational awareness, attack prevention and detection, threat attribution, minimization of consequences, and attack recovery. Because defective software cannot be secure, assurance technologies must play a central role in cybersecurity approaches. There is increasing recognition of the need for rigorous methods for cybersecurity and software assurance. The goal of this minitrack is to develop science foundations, technologies, and practices that can improve the security and dependability of complex systems.

  6. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, J.F.

    1996-10-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants. 2 figs.

  7. Eigenvector method for umbrella sampling enables error analysis

    NASA Astrophysics Data System (ADS)

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-08-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.
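The recombination step described above (combining per-window data via an eigenproblem) can be illustrated on a toy 1-D problem where each biased window can be sampled exactly, so only the recombination is at issue. This is a schematic of the general eigenvector idea with all parameters invented for illustration; it is not the authors' algorithm or error estimator:

```python
import numpy as np

# Toy setup: target density ~ exp(-x^2/2), harmonic umbrella biases
# psi_j(x) = exp(-k (x - c_j)^2 / 2).  Bias + target is Gaussian, so each
# window can be sampled exactly and we can focus on recombination.
rng = np.random.default_rng(0)
k, centers = 4.0, np.linspace(-2.0, 2.0, 5)
n_per_window = 20000

def psi(x, c):
    return np.exp(-k * (x - c) ** 2 / 2.0)

# Biased density in window i is Gaussian with precision 1 + k and
# mean k*c_i/(1 + k).  Build the overlap matrix F from window samples:
# F[i, j] = average over window i of psi_j / sum_k psi_k (row-stochastic).
var = 1.0 / (1.0 + k)
F = np.zeros((len(centers), len(centers)))
for i, ci in enumerate(centers):
    x = rng.normal(k * ci / (1.0 + k), np.sqrt(var), n_per_window)
    denom = sum(psi(x, c) for c in centers)
    for j, cj in enumerate(centers):
        F[i, j] = np.mean(psi(x, cj) / denom)

# Window weights z solve the eigenproblem z = z F (left eigenvector with
# eigenvalue 1); power iteration suffices since F is stochastic.
z = np.ones(len(centers)) / len(centers)
for _ in range(500):
    z = z @ F
    z /= z.sum()

free_energy = -np.log(z)  # relative free energy of each window
```

For this toy problem the weights are known in closed form (z_j proportional to exp(-k c_j^2 / (2(1 + k)))), which makes the eigenvector estimate easy to check.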

  8. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, John F.

    1996-01-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants.

  9. Eigenvector method for umbrella sampling enables error analysis

    PubMed Central

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-01-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912

  10. Eigenvector method for umbrella sampling enables error analysis.

    PubMed

    Thiede, Erik H; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R

    2016-08-28

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.

  11. Personal sampling of airborne particles: method performance and data quality.

    PubMed

    Janssen, N A; Hoek, G; Harssema, H; Brunekreef, B

    1998-01-01

A study of personal exposure to respirable particles (PM10) and fine particles (FP) was conducted in groups of 50-70-year-old adults and primary school children in the Netherlands. Four to eight personal measurements per subject were conducted, on weekdays only. Averaging time was 24 hours. Method performance was evaluated regarding compliance, flow, weighing procedure, field blanks and co-located operation of the personal samplers with stationary methods. Furthermore, the possibility that subjects change their behavior due to the wearing of personal sampling equipment was studied by comparing time activity on days of personal sampling with time activity on other weekdays. Compliance was high; 95% of the subjects who agreed to continue participating after the first measurement successfully completed the study, and, except for the first two days of FP sampling, over 90% of all personal measurements were successful. All pre- and post-sampling flow readings were within 10% of the required flow rate of 4 L/min. For PM10, precision of the gravimetric analyses was 2.8 micrograms/m3 and 0.7 micrograms/m3 for filters weighed on an analytical balance and a microbalance, respectively. The detection limit was 10.8 micrograms/m3 and 8.6 micrograms/m3, respectively. For FP, weighing precision was 0.4 micrograms/m3 and the detection limit was 5.3 micrograms/m3. All measurements were above the detection limit. Co-located operation of the personal sampler with stationary samplers gave highly correlated concentrations (R > 0.90). Outdoor PM10 concentrations measured with the personal sampler were on average 4% higher than with a Sierra Anderson (SA) inlet and 9% higher than with a PM10 Harvard Impactor (HI). With the FP cyclone, 6% higher classroom concentrations were measured compared to a PM2.5 HI. Adults spent significantly less time outdoors (0.5 hour) and more time at home (0.9 hour) on days of personal sampling compared to other weekdays. For children no significant differences in time
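Detection limits of this kind are conventionally derived from field blanks as three times their standard deviation, divided by the sampled air volume. A minimal sketch, using the abstract's 4 L/min over 24 hours but hypothetical blank masses:

```python
import statistics as st

# hypothetical field-blank masses in micrograms (assumed values)
blanks_ug = [2.1, -0.8, 1.5, 0.4, -1.2, 2.7, 0.9, -0.3]
air_volume_m3 = 4 / 1000 * 60 * 24   # 4 L/min for 24 h = 5.76 m^3

sd_blank = st.stdev(blanks_ug)
lod_ug_m3 = 3 * sd_blank / air_volume_m3   # common 3-sigma-of-blanks convention
print(round(lod_ug_m3, 2))
```

The same blank series also yields the weighing precision figures the abstract reports, by the analogous standard-deviation calculation.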

  12. Software Assurance Competency Model

    DTIC Science & Technology

    2013-03-01

    2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...for specific projects. L5: Analyze assurance technologies and contribute to the development of new ones. Assured Software Development L1

  13. Why Quality Assurance?

    PubMed Central

    Alexander, Leslie L.; Lewis, Nathan

    1981-01-01

    Quality assurance programs in radiology are essential and each radiologist must be committed to exert comprehensive efforts toward excellent quality control. Identification and evaluation of a radiological problem, corrective action, and good record keeping are essential features of a well-managed quality assurance program. This paper discusses the background and impact of these programs on providing safe radiologic services to patients. PMID:7218367

  14. Applicability Comparison of Methods for Acid Generation Assessment of Rock Samples

    NASA Astrophysics Data System (ADS)

    Oh, Chamteut; Ji, Sangwoo; Yim, Giljae; Cheong, Youngwook

    2014-05-01

Minerals containing various forms of sulfur can generate AMD (Acid Mine Drainage) or ARD (Acid Rock Drainage) when exposed to air and/or water, with serious effects on the ecosystem and even on humans. To minimize the hazards of acid drainage, it is necessary to assess in advance the acid generation possibility of rocks and to estimate the amount of acid generation. Because of its relatively simple and effective experimental procedure, the method of combining the results of ABA (Acid Base Accounting) and NAG (Net Acid Generation) tests has been commonly used in determining acid drainage conditions. The simplicity and effectiveness of this method, however, derive from sweeping assumptions of simplified chemical reactions, which often leads to samples being classified as UC (Uncertain) and thus requiring additional experimental or field data to be reclassified properly. This paper, therefore, attempts to find the reasons that cause samples to be classified as UC and to suggest a new series of experiments by which such samples can be reclassified appropriately. Study precedents on evaluating potential acid generation and neutralization capacity were reviewed, and as a result three individual experiments were selected in light of their applicability and compatibility, minimizing unnecessary interference among experiments. The proposed experiments include sulfur speciation, ABCC (Acid Buffering Characteristic Curve), and Modified NAG, which are improved versions of the existing Total S, ANC (Acid Neutralizing Capacity), and NAG tests, respectively. To assure the applicability of the experiments, 36 samples from 19 sites with diverse geologies, field properties, and weathering conditions were collected. The samples were subjected to the existing experiments, and as a result 14 samples which either were classified as UC or could serve as a comparison group were selected. Afterwards, the selected samples were used to conduct the suggested
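The combined ABA/NAG screening logic that produces the UC class can be sketched as follows. The 30.6 conversion factor (kg H2SO4 per tonne per %S) and the NAG pH 4.5 / NAPP 0 cut-offs are the common AMIRA-style conventions, not values taken from this paper:

```python
def classify(total_s_pct, anc, nag_ph):
    """Classify a rock sample by combined ABA/NAG criteria (AMIRA-style).
    MPA and ANC are in kg H2SO4 per tonne; thresholds are conventional."""
    mpa = 30.6 * total_s_pct          # maximum potential acidity from total S
    napp = mpa - anc                  # net acid producing potential
    if napp > 0 and nag_ph < 4.5:
        return "PAF"                  # potentially acid forming
    if napp <= 0 and nag_ph >= 4.5:
        return "NAF"                  # non-acid forming
    return "UC"                       # tests disagree: uncertain

print(classify(2.0, 10.0, 3.1))   # high S, low ANC, acidic NAG -> PAF
print(classify(0.1, 40.0, 7.2))   # low S, ample ANC -> NAF
print(classify(1.5, 80.0, 3.8))   # NAPP negative but acidic NAG -> UC
```

The third case illustrates the paper's motivation: when NAPP and NAG pH disagree, the simplified assumptions give no verdict, and refined tests such as sulfur speciation or ABCC are needed.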

  15. SU-F-P-28: A Method to Maximize the Noncoplanar Beam Orientations and Assure the Beam Delivery Clearance for Stereotactic Body Radiation Therapy (SBRT)

    SciTech Connect

    Zhu, J

    2016-06-15

    Purpose: To develop a method that maximizes the noncoplanar beam orientations and assures the beam delivery clearance for SBRT, and thereby optimizes dose conformality to the target, increases dose sparing of critical normal organs, and reduces hot spots in the body. Methods: An SBRT body frame (Elekta, Stockholm, Sweden) was used for patient immobilization and target localization. The SBRT body frame has CT fiducials on its side frames. After the patient’s CT scan, the radiation treatment isocenter was defined and its coordinates relative to the body frame were calculated in the treatment planning process. Meanwhile, initial beam orientations were designed based on the patient’s target and critical organ anatomy. The body frame was placed on the linear accelerator couch and positioned to the calculated isocenter. The initially designed beam orientations were manually checked by tuning the body frame position on the couch and the gantry and couch angles. The finalized beam orientations were entered into the treatment plan for dosimetric calculations. Results: Without the patient present, an optimal set of beam orientations was designed and validated. The radiation treatment plan was optimized with guaranteed delivery clearance. Conclusion: The developed method is beneficial and effective in SBRT treatment planning for individual patients. It maximizes the achievable noncoplanar beam orientation space, thereby optimizing the treatment plan for a specific patient, and it eliminates the risk that a plan must be modified because of gantry-couch collision during patient setup.

  16. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  17. Methods to maximise recovery of environmental DNA from water samples

    PubMed Central

    Gleeson, Dianne; Lintermans, Mark

    2017-01-01

    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24hours but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3–5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure. PMID:28604830

  18. Assuring NASA's Safety and Mission Critical Software

    NASA Technical Reports Server (NTRS)

    Deadrick, Wesley

    2015-01-01

    What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety and mission critical software processes and products. Independence: 3 Key parameters: Technical Independence; Managerial Independence; Financial Independence. NASA IV&V perspectives: Will the system's software: Do what it is supposed to do?; Not do what it is not supposed to do?; Respond as expected under adverse conditions?. Systems Engineering: Determines if the right system has been built and that it has been built correctly. IV&V Technical Approaches: Aligned with IEEE 1012; Captured in a Catalog of Methods; Spans the full project lifecycle. IV&V Assurance Strategy: The IV&V Project's strategy for providing mission assurance; Assurance Strategy is driven by the specific needs of an individual project; Implemented via an Assurance Design; Communicated via Assurance Statements.

  19. Evaluation of field sampling and preservation methods for strontium-90 in ground water at the Idaho National Engineering Laboratory, Idaho

    USGS Publications Warehouse

    Cecil, L.D.; Knobel, L.L.; Wegner, S.J.; Moore, L.L.

    1989-01-01

    Water from four wells completed in the Snake River Plain aquifer was sampled as part of the U.S. Geological Survey's quality assurance program to evaluate the effect of filtration and preservation methods on strontium-90 concentrations in groundwater at the Idaho National Engineering Laboratory. Water from each well was filtered through either a 0.45-micrometer membrane or a 0.1-micrometer membrane filter; unfiltered samples also were collected. Two sets of filtered and two sets of unfiltered samples were collected; one set of each was preserved in the field with reagent-grade hydrochloric acid and the other was not acidified. For water from wells with strontium-90 concentrations at or above the reporting level, 94% or more of the strontium-90 is in true solution or in colloidal particles smaller than 0.1 micrometer. These results suggest that within-laboratory reproducibility for strontium-90 in groundwater at the INEL is not significantly affected by changes in the filtration and preservation methods used for sample collection. (USGS)

  20. Quality Assurance in Higher Education: Proposals for Consultation.

    ERIC Educational Resources Information Center

    Higher Education Funding Council for England, Bristol.

    This document sets out for consultation proposals for a revised method for quality assurance of teaching and learning in higher education. The proposals cover: (1) the objectives and principles of quality assurance; (2) an approach to quality assurance based on external audit principles; (3) the collection and publication of information; (4)…

  2. Internal Quality Assurance System and Its Implementation in Kaunas College

    ERIC Educational Resources Information Center

    Misiunas, Mindaugas

    2007-01-01

    The article discusses the internal system of quality assurance and its implementation methods in Kaunas College. The issues of quality assurance are reviewed in the context of the European higher education area covering the three levels: European, national and institutional. The importance of quality assurance and its links with external…

  3. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, development of miniaturized analytical space flight instruments that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid chromatography-mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds with natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e. salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  4. Measurement of atmospheric mercury species with manual sampling and analysis methods in a case study in Indiana

    USGS Publications Warehouse

    Risch, M.R.; Prestbo, E.M.; Hawkins, L.

    2007-01-01

    Ground-level concentrations of three atmospheric mercury species were measured using manual sampling and analysis to provide data for estimates of mercury dry deposition. Three monitoring stations were operated simultaneously during winter, spring, and summer 2004, adjacent to three mercury wet-deposition monitoring stations in northern, central, and southern Indiana. The monitoring locations differed in land-use setting and annual mercury-emissions level from nearby sources. A timer-controlled air-sampling system that contained a three-part sampling train was used to isolate reactive gaseous mercury, particulate-bound mercury, and elemental mercury. The sampling trains were exchanged every 6 days, and the mercury species were quantified in a laboratory. A quality-assurance study indicated the sampling trains could be held at least 120 h without a significant change in reactive gaseous or particulate-bound mercury concentrations. The manual sampling method was able to provide valid mercury concentrations in 90 to 95% of samples. Statistical differences in mercury concentrations were observed during the project. Concentrations of reactive gaseous and elemental mercury were higher in the daytime samples than in the nighttime samples. Concentrations of reactive gaseous mercury were higher in winter than in summer and were highest at the urban monitoring location. The results of this case study indicated manual sampling and analysis could be a reliable method for measurement of atmospheric mercury species and has the capability for supplying representative concentrations in an effective manner from a long-term deposition-monitoring network. ?? 2007 Springer Science+Business Media B.V.

  5. Passive sampling methods for contaminated sediments: Scientific rationale supporting use of freely dissolved concentrations

    PubMed Central

    Mayer, Philipp; Parkerton, Thomas F; Adams, Rachel G; Cargill, John G; Gan, Jay; Gouin, Todd; Gschwend, Philip M; Hawthorne, Steven B; Helm, Paul; Witt, Gesine; You, Jing; Escher, Beate I

    2014-01-01

    Passive sampling methods (PSMs) allow the quantification of the freely dissolved concentration (Cfree) of an organic contaminant even in complex matrices such as sediments. Cfree is directly related to a contaminant's chemical activity, which drives spontaneous processes including diffusive uptake into benthic organisms and exchange with the overlying water column. Consequently, Cfree provides a more relevant dose metric than total sediment concentration. Recent developments in PSMs have significantly improved our ability to reliably measure even very low levels of Cfree. Application of PSMs in sediments is preferably conducted in the equilibrium regime, where freely dissolved concentrations in the sediment are well-linked to the measured concentration in the sampler via analyte-specific partition ratios. The equilibrium condition can then be assured by measuring a time series or a single time point using passive samplers with different surface to volume ratios. Sampling in the kinetic regime is also possible and generally involves the application of performance reference compounds for the calibration. Based on previous research on hydrophobic organic contaminants, it is concluded that Cfree allows a direct assessment of 1) contaminant exchange and equilibrium status between sediment and overlying water, 2) benthic bioaccumulation, and 3) potential toxicity to benthic organisms. Thus, the use of PSMs to measure Cfree provides an improved basis for the mechanistic understanding of fate and transport processes in sediments and has the potential to significantly improve risk assessment and management of contaminated sediments. Integr Environ Assess Manag 2014;10:197–209. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288295
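In the equilibrium regime the abstract describes, Cfree follows directly from the concentration in the sampler and the analyte-specific sampler-water partition ratio. A minimal sketch; the sampler volume and log Ksw below are hypothetical:

```python
def cfree_ng_per_l(mass_ng, polymer_vol_ml, log_ksw):
    """Freely dissolved concentration from an equilibrium passive sampler:
    Cfree = Csampler / Ksw, with Ksw the analyte-specific partition ratio."""
    c_sampler = mass_ng / (polymer_vol_ml / 1000.0)   # ng per litre of polymer
    return c_sampler / 10 ** log_ksw

# hypothetical numbers: 50 ng analyte in 0.1 mL of polymer, log Ksw = 4.0
print(cfree_ng_per_l(50.0, 0.1, 4.0))   # -> 50.0 ng/L freely dissolved
```

Checking equilibrium with samplers of different surface-to-volume ratios, as the abstract suggests, amounts to verifying that this calculation returns the same Cfree for each sampler geometry.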

  6. Creating and Evaluating a Hypertext System of Documenting Analytical Test Methods in a Chemical Plant Quality Assurance Laboratory.

    ERIC Educational Resources Information Center

    White, Charles E., Jr.

    The purpose of this study was to develop and implement a hypertext documentation system in an industrial laboratory and to evaluate its usefulness by participative observation and a questionnaire. Existing word-processing test method documentation was converted directly into a hypertext format or "hyperdocument." The hyperdocument was designed and…

  7. Metrology: Measurement Assurance Program Guidelines

    NASA Technical Reports Server (NTRS)

    Eicke, W. G.; Riley, J. P.; Riley, K. J.

    1995-01-01

    The 5300.4 series of NASA Handbooks for Reliability and Quality Assurance Programs have provisions for the establishment and utilization of a documented metrology system to control measurement processes and to provide objective evidence of quality conformance. The intent of these provisions is to assure consistency and conformance to specifications and tolerances of equipment, systems, materials, and processes procured and/or used by NASA, its international partners, contractors, subcontractors, and suppliers. This Measurement Assurance Program (MAP) guideline has the specific objectives to: (1) ensure the quality of measurements made within NASA programs; (2) establish realistic measurement process uncertainties; (3) maintain continuous control over the measurement processes; and (4) ensure measurement compatibility among NASA facilities. The publication addresses MAP methods as applied within and among NASA installations and serves as a guide to: control measurement processes at the local level (one facility); conduct measurement assurance programs in which a number of field installations are joint participants; and conduct measurement integrity (round robin) experiments in which a number of field installations participate to assess the overall quality of particular measurement processes at a point in time.
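A classic tool for the "continuous control over the measurement processes" objective is a check-standard control chart: repeated measurements of a stable artifact are tested against 3-sigma limits derived from a baseline history. A minimal sketch with hypothetical readings, not a procedure taken from the handbook:

```python
import statistics as st

# hypothetical check-standard history (same artifact, repeated measurements)
baseline = [10.02, 9.98, 10.01, 9.99, 10.00, 10.03, 9.97, 10.00]
mean, sd = st.mean(baseline), st.stdev(baseline)
ucl, lcl = mean + 3 * sd, mean - 3 * sd        # 3-sigma control limits

def in_control(reading):
    """Flag a new check-standard reading against the control limits."""
    return lcl <= reading <= ucl

print(in_control(10.02), in_control(10.25))
```

The same comparison, applied across facilities measuring a circulated artifact, is the essence of the round-robin measurement integrity experiments the guideline describes.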

  8. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  9. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  10. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  11. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  12. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  13. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
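The binary-data clustering that the first study simulates can be sketched in a few lines. This toy k-means with deterministic farthest-point seeding is an illustration only, not the authors' procedure (they also evaluated hierarchical clustering and latent class analysis):

```python
def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans_binary(rows, k, iters=25):
    """Plain k-means on 0/1 code vectors; squared Euclidean distance
    reduces to Hamming distance on binary data. Farthest-point seeding
    keeps this toy example deterministic."""
    cents = [list(rows[0])]
    while len(cents) < k:
        cents.append(list(max(rows, key=lambda r: min(dist2(r, c) for c in cents))))
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for r in rows:
            groups[min(range(k), key=lambda c: dist2(r, cents[c]))].append(r)
        cents = [[sum(col) / len(g) for col in zip(*g)] if g else cents[c]
                 for c, g in enumerate(groups)]
    return [min(range(k), key=lambda c: dist2(r, cents[c])) for r in rows]

# two synthetic interview "code profiles": codes 0-2 present vs codes 3-5 present
rows = [[1, 1, 1, 0, 0, 0]] * 25 + [[0, 0, 0, 1, 1, 1]] * 25
labels = kmeans_binary(rows, 2)
print(labels[0] != labels[-1])   # the two profiles land in different clusters
```

With 50 rows, as in the article's smallest simulated samples, the two code profiles separate cleanly; real coded interviews would of course be noisier.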

  14. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  15. European guidelines for quality assurance in colorectal cancer screening and diagnosis. First Edition--Principles of evidence assessment and methods for reaching recommendations.

    PubMed

    Minozzi, S; Armaroli, P; Segnan, N

    2012-09-01

    Multidisciplinary, evidence-based guidelines for quality assurance in colorectal cancer screening and diagnosis have been developed by experts in a project coordinated by the International Agency for Research on Cancer. The full guideline document covers the entire process of population-based screening. It consists of 10 chapters and over 250 recommendations, graded according to the strength of the recommendation and the supporting evidence. The 450-page guidelines and the extensive evidence base have been published by the European Commission. The principles of evidence assessment and methods for reaching recommendations are presented here to promote international discussion and collaboration by making the principles and methods used in developing the guidelines known to a wider professional and scientific community. Following this methodology in the future updating of the guidelines has the potential to enhance the control of colorectal cancer through improvement in the quality and effectiveness of the screening process, including multidisciplinary diagnosis and management of the disease. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  17. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitoring remedy effectiveness, and evaluating risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. © 2014

  18. Improved transition path sampling methods for simulation of rare events.

    PubMed

    Chopra, Manan; Malshe, Rohit; Reddy, Allam S; de Pablo, J J

    2008-04-14

    The free energy surfaces of a wide variety of systems encountered in physics, chemistry, and biology are characterized by the existence of deep minima separated by numerous barriers. One of the central aims of recent research in computational chemistry and physics has been to determine how transitions occur between deep local minima on rugged free energy landscapes, and transition path sampling (TPS) Monte-Carlo methods have emerged as an effective means for numerical investigation of such transitions. Many of the shortcomings of TPS-like approaches generally stem from their high computational demands. Two new algorithms are presented in this work that improve the efficiency of TPS simulations. The first algorithm uses biased shooting moves to render the sampling of reactive trajectories more efficient. The second algorithm is shown to substantially improve the accuracy of the transition state ensemble by introducing a subset of local transition path simulations in the transition state. The system considered in this work consists of a two-dimensional rough energy surface that is representative of numerous systems encountered in applications. When taken together, these algorithms provide gains in efficiency of over two orders of magnitude when compared to traditional TPS simulations.
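    The shooting-move idea behind TPS can be illustrated with a toy sketch (not the authors' biased-shooting algorithm, and a 1D double well standing in for their 2D rough surface): pick a timeslice on an existing reactive path, regenerate the remainder of the trajectory with fresh noise, and accept only if the new path still connects the reactant basin A to the product basin B. This is the one-way forward-shooting variant appropriate for stochastic dynamics.

    ```python
    import math
    import random

    random.seed(7)

    # Overdamped Langevin dynamics on the double well U(x) = (x^2 - 1)^2.
    DT, D, T = 0.01, 0.5, 200            # time step, diffusion constant, path length
    NOISE = math.sqrt(2.0 * D * DT)

    def force(x):                        # -dU/dx
        return -4.0 * x * (x * x - 1.0)

    def step(x):
        return x + force(x) * DT + NOISE * random.gauss(0.0, 1.0)

    in_A = lambda x: x < -0.8            # reactant basin
    in_B = lambda x: x > 0.8             # product basin

    # Unphysical but reactive initial path (straight line from A to B);
    # repeated shooting moves relax it toward genuine dynamical pathways.
    path = [-1.0 + 2.0 * i / T for i in range(T + 1)]

    n_accept = 0
    for _ in range(50):
        i = random.randrange(1, T)       # shooting point
        trial = path[:i + 1]
        x = path[i]
        for _ in range(T - i):           # regenerate the tail with fresh noise
            x = step(x)
            trial.append(x)
        if in_A(trial[0]) and in_B(trial[-1]):   # accept only reactive paths
            path = trial
            n_accept += 1
    ```

    The acceptance test is what keeps the Monte Carlo walk confined to the reactive path ensemble; the paper's first algorithm biases the choice of shooting point to make this step more efficient.
    
    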

  19. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
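    The tension between the two objectives can be seen in a small Monte Carlo sketch (illustrative only, not one of the report's five techniques): with n = 10 samples, the naive sample min/max interval almost never bounds the true 0.025-0.975 percentile range, while a tolerance-style interval mean ± K·s, with K taken from standard two-sided normal tolerance tables (~95% content, 95% confidence, n = 10), bounds it reliably at the cost of extra width.

    ```python
    import random
    import statistics

    random.seed(42)

    N, TRIALS = 10, 2000
    MU, SIGMA = 5.0, 2.0
    q_lo = MU - 1.959964 * SIGMA    # true 0.025 percentile of N(MU, SIGMA)
    q_hi = MU + 1.959964 * SIGMA    # true 0.975 percentile

    K = 3.379   # textbook two-sided normal tolerance factor for n = 10

    cover_minmax = cover_tol = 0
    for _ in range(TRIALS):
        x = [random.gauss(MU, SIGMA) for _ in range(N)]
        m, s = statistics.mean(x), statistics.stdev(x)
        # (a) naive: sample min/max as interval endpoints
        if min(x) <= q_lo and max(x) >= q_hi:
            cover_minmax += 1
        # (b) conservative: mean +/- K*s tolerance-style interval
        if m - K * s <= q_lo and m + K * s >= q_hi:
            cover_tol += 1

    cov_a = cover_minmax / TRIALS   # coverage of the naive interval
    cov_b = cover_tol / TRIALS      # coverage of the tolerance-style interval
    ```

    The second objective (minimal over-estimation) is what separates good sparse-data methods: K must be large enough to bound the range reliably, yet no larger.
    
    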

  20. Process measurement assurance program

    SciTech Connect

    Pettit, R.B.

    1996-05-01

    This paper describes a new method for determining, improving, and controlling the measurement process errors (or measurement uncertainty) of a measurement system used to monitor product as it is manufactured. The method is called the Process Measurement Assurance Program (PMAP). It integrates metrology early into the product realization process and is a step beyond statistical process control (SPC), which monitors only the product. In this method, a control standard is used to continuously monitor the status of the measurement system. Analysis of the control standard data allows determination of the measurement error inherent in the product data and allows one to separate the variability in the manufacturing process from variability in the measurement process. These errors can then be associated with either the measurement equipment, variability of the measurement process, operator bias, or local environmental effects. Another goal of PMAP is to determine appropriate re-calibration intervals for the measurement system, which may be significantly longer or shorter than the interval typically assigned by the calibration organization.
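    The separation of manufacturing variability from measurement variability rests on a simple variance decomposition: observed product variance is the sum of process variance and measurement variance, and repeated readings of a stable control standard expose the measurement term alone. A minimal sketch of that idea (a simulation under assumed variances, not the PMAP procedure itself):

    ```python
    import random
    import statistics

    random.seed(1)

    TRUE_PROC_STD = 2.0     # real part-to-part variation in the product
    TRUE_MEAS_STD = 0.5     # measurement-system error
    N = 5000

    # Product readings mix process and measurement variation ...
    product = [random.gauss(10.0, TRUE_PROC_STD) + random.gauss(0.0, TRUE_MEAS_STD)
               for _ in range(N)]
    # ... while repeated readings of a stable control standard see only measurement error.
    control = [random.gauss(10.0, TRUE_MEAS_STD) for _ in range(N)]

    var_obs = statistics.variance(product)
    var_meas = statistics.variance(control)
    # var_obs = var_process + var_measurement, so subtracting recovers the process term.
    proc_std_est = (var_obs - var_meas) ** 0.5
    ```

    In practice the control-standard data would also be charted over time, so drift in the measurement system shows up before it contaminates conclusions about the process.
    
    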

  1. SU-E-T-570: New Quality Assurance Method Using Motion Tracking for 6D Robotic Couches

    SciTech Connect

    Cheon, W; Cho, J; Ahn, S; Han, Y; Choi, D

    2015-06-15

    Purpose: To accommodate geometrically accurate patient positioning, a robotic couch that is capable of 6 degrees of freedom has been introduced. However, conventional couch QA methods are not sufficient to achieve the necessary accuracy of tests. Therefore, we have developed a camera-based motion detection and geometry calibration system for couch QA. Methods: Employing a Visual-Tracking System (VTS, BonitaB10, Vicon, UK) which tracks infrared reflective (IR) markers, camera calibration was conducted using a 5.7 × 5.7 × 5.7 cm³ cube with IR markers attached at each corner. After positioning a robotic couch at the origin with the cube on the table top, 3D coordinates of the cube's eight corners were acquired by VTS in the VTS coordinate system. Next, positions in reference coordinates (room coordinates) were assigned using the known relation between each point. Finally, camera calibration was completed by finding a transformation matrix between the VTS and reference coordinate systems and by applying a pseudo-inverse matrix method. After the calibration, the accuracy of linear and rotational motions as well as couch sagging could be measured by analyzing the continuously acquired data of the cube while the couch moves to a designated position. Accuracy of the developed software was verified through comparison with measurement data from a laser tracker (FARO, Lake Mary, USA) for a robotic couch installed for proton therapy. Results: The VTS system could track couch motion accurately and measured position in room coordinates. The VTS measurements and laser tracker data agreed within 1% of difference for linear and rotational motions. Also, because the program analyzes motion in three dimensions, it can compute couch sagging. Conclusion: The developed QA system provides submillimeter/degree accuracy, which fulfills high-end couch QA. This work was supported by the National Research Foundation of Korea funded by Ministry of Science, ICT & Future Planning. (2013M2A2A

  2. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.

  3. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  4. Software quality assurance handbook

    SciTech Connect

    Not Available

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  5. A simple capacitive method to evaluate ethanol fuel samples

    NASA Astrophysics Data System (ADS)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-02-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water either during the distillation process or by fraudulent adulteration is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformational aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentrations, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.
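    The physical basis for the measurement is that water (relative permittivity ~80) is far more polarizable than ethanol (~24.5), so the capacitance of a liquid-filled parallel-plate cell rises with water content. A simplified back-of-envelope sketch (linear volume-fraction mixing and assumed plate geometry; the actual device additionally has a 15 nm Al2O3 dielectric in series, which this ignores):

    ```python
    EPS0 = 8.854e-12          # vacuum permittivity, F/m
    EPS_ETHANOL = 24.5        # relative permittivity of ethanol (~20 C, approximate)
    EPS_WATER = 80.1          # relative permittivity of water (~20 C, approximate)
    A = 1.0e-6                # plate area, m^2 (1 mm^2, assumed for illustration)
    D = 1.0e-5                # plate separation, m (10 um, assumed for illustration)

    def capacitance(water_frac):
        """Parallel-plate capacitance C = eps0 * eps_mix * A / d with a linear
        volume-fraction mixing rule for the ethanol/water permittivity."""
        eps_mix = (1.0 - water_frac) * EPS_ETHANOL + water_frac * EPS_WATER
        return EPS0 * eps_mix * A / D

    # Capacitance curve from pure ethanol (0% vol. water) to pure water (100% vol.)
    curve = [(f / 10.0, capacitance(f / 10.0)) for f in range(11)]
    ```

    Real ethanol/water mixtures deviate from linear mixing, which is one reason the paper calibrates the sensor against gas chromatography rather than relying on a formula.
    
    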

  6. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.

    2005-01-01

    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in the retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our on-going research related to the radiative properties of the martian polar caps, we have begun the development of a martian OSS model with the goal of using it to perform self-consistent atmospheric corrections necessary to retrieve caps emissivity from the Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  7. A simple capacitive method to evaluate ethanol fuel samples

    PubMed Central

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-01-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water either during the distillation process or by fraudulent adulteration is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformational aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentrations, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few. PMID:28240312

  8. MARKOV CHAIN MONTE CARLO POSTERIOR SAMPLING WITH THE HAMILTONIAN METHOD

    SciTech Connect

    K. HANSON

    2001-02-01

    The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is introduced for each parameter of the target pdf. In analogy to a physical system, a Hamiltonian H is defined as a kinetic energy involving the momenta plus a potential energy φ, where φ is minus the logarithm of the target pdf. Hamiltonian dynamics allows one to move along trajectories of constant H, taking large jumps in the parameter space with relatively few evaluations of φ and its gradient. The Hamiltonian algorithm alternates between picking a new momentum vector and following such trajectories. The efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs is shown to remain constant at around 7% for up to several hundred dimensions. The Hamiltonian method handles correlations among the variables much better than the standard Metropolis algorithm. A new test, based on the gradient of φ, is proposed to measure the convergence of the MCMC sequence.
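    The alternation the abstract describes (draw fresh momenta, follow an approximate constant-H trajectory, then accept or reject on the energy error) can be sketched for the same isotropic Gaussian test case, with φ(q) = ½Σq² and leapfrog integration (a generic HMC sketch, not the paper's implementation):

    ```python
    import math
    import random

    random.seed(3)

    DIM = 5                                   # isotropic Gaussian target in 5 dimensions

    def U(q):                                 # potential energy: minus log target pdf
        return 0.5 * sum(x * x for x in q)

    def grad_U(q):                            # for this target, grad U(q) = q
        return q[:]

    def hmc_step(q, eps=0.15, L=10):
        p = [random.gauss(0.0, 1.0) for _ in range(DIM)]     # fresh momenta
        H0 = U(q) + 0.5 * sum(x * x for x in p)
        qn, pn = q[:], p[:]
        g = grad_U(qn)
        pn = [pi - 0.5 * eps * gi for pi, gi in zip(pn, g)]  # leapfrog: half kick
        for step in range(L):
            qn = [qi + eps * pi for qi, pi in zip(qn, pn)]   # drift
            g = grad_U(qn)
            kick = eps if step < L - 1 else 0.5 * eps        # full kicks, half at end
            pn = [pi - kick * gi for pi, gi in zip(pn, g)]
        H1 = U(qn) + 0.5 * sum(x * x for x in pn)
        # Metropolis accept/reject on the leapfrog energy error Delta H = H1 - H0
        if random.random() < math.exp(min(0.0, H0 - H1)):
            return qn, 1
        return q, 0

    q, accepted, draws = [0.0] * DIM, 0, []
    for _ in range(2000):
        q, a = hmc_step(q)
        accepted += a
        draws.extend(q)

    acc_rate = accepted / 2000
    ```

    Because the leapfrog integrator nearly conserves H, the acceptance rate stays high even for long jumps, which is exactly the advantage over a random-walk Metropolis proposal.
    
    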

  9. Investigation of Presage 3D Dosimetry as a Method of Clinically Intuitive Quality Assurance and Comparison to a Semi-3D Delta4 System

    NASA Astrophysics Data System (ADS)

    Crockett, Ethan Van

    The need for clinically intuitive metrics for patient-specific quality assurance in radiation therapy has been well-documented (Zhen, Nelms et al. 2011). A novel transform method has been shown to be effective at converting full-density 3D dose measurements made in a phantom to dose values in the patient geometry, enabling comparisons using clinically intuitive metrics such as dose-volume histograms (Oldham et al. 2011). This work investigates the transform method and compares its calculated dose-volume histograms (DVHs) to DVH values calculated by a Delta4 QA device (ScandiDos), marking the first comparison of a true 3D system to a semi-3D device using clinical metrics. Measurements were made using Presage 3D dosimeters, which were read out by an in-house optical-CT scanner. Three patient cases were chosen for the study: one head-and-neck VMAT treatment and two spine IMRT treatments. The transform method showed good agreement with the planned dose values for all three cases. Furthermore, the transformed DVHs adhered to the planned dose with more accuracy than the Delta4 DVHs. The similarity between the Delta4 DVHs and the transformed DVHs, however, was greater for one of the spine cases than it was for the head-and-neck case, implying that the accuracy of the Delta4 Anatomy software may vary from one treatment site to another. Overall, the transform method, which incorporates data from full-density 3D dose measurements, provides clinically intuitive results that are more accurate and consistent than the corresponding results from a semi-3D Delta4 system.
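    The clinically intuitive metric at the center of this comparison, the cumulative dose-volume histogram, is straightforward to compute from per-voxel dose samples: for each dose level, record the fraction of the volume receiving at least that dose. A minimal sketch with hypothetical voxel doses (equal voxel volumes assumed; not the Presage or Delta4 software):

    ```python
    def cumulative_dvh(doses, n_bins=100):
        """Cumulative DVH: fraction of volume receiving at least each dose level.

        `doses` holds one dose sample per voxel (equal voxel volumes assumed).
        """
        d_max = max(doses)
        n = len(doses)
        levels = [d_max * i / n_bins for i in range(n_bins + 1)]
        return [(d, sum(1 for v in doses if v >= d) / n) for d in levels]

    def dose_to_volume(dvh, volume_frac):
        """D_x metric: highest dose level still covering `volume_frac` of the volume."""
        return max(d for d, frac in dvh if frac >= volume_frac)

    # Hypothetical voxel doses (Gy) for a small phantom region
    doses = [60.0, 59.5, 58.0, 61.2, 30.0, 45.0, 62.0, 60.5, 57.5, 40.0]
    dvh = cumulative_dvh(doses)
    d95 = dose_to_volume(dvh, 0.95)   # dose received by at least 95% of the volume
    d50 = dose_to_volume(dvh, 0.50)
    ```

    The transform method's contribution is upstream of this step: it maps the phantom-measured dose distribution into the patient geometry so that DVHs like these can be computed per structure.
    
    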

  10. [Novel quality assurance method in oncology: the two-level, multi-disciplinary and oncotherapy oncology team system].

    PubMed

    Mangel, László; Kövér, Erika; Szilágyi, István; Varga, Zsuzsanna; Bércesi, Eva; Nagy, Zsuzsanna; Holcz, Tibor; Karádi, Oszkár; Farkas, Róbert; Csák, Szilvia; Csere, Tibor; Kásler, Miklós

    2012-12-16

    By now, therapy decisions taken by a multi-disciplinary oncology team have become routine in cancer care worldwide. However, multi-disciplinary oncology teams face increasing difficulties in keeping abreast of the fast development of oncology science, rising expectations, and financial considerations. Improperly controlled decision mechanisms, a permanent lack of time, and a shortage of professionals are also hindering factors. One way out might be to transform the staff meetings and discussions of physicians in the oncology departments and provide them with administrative, legal and decision-making credentials corresponding to those of the multi-disciplinary oncology team. The new form, the oncotherapy oncoteam, might be able to decide on the optimal individual treatment after prior consultation with the patient. The oncotherapy oncoteam is also suited to carrying out the training and tasks of a cancer centre and, by diminishing the psychological burden on doctors, it contributes to improved patient care. This study presents the two-level multi-disciplinary and oncotherapy oncology team system at the University of Pécs, including a detailed analysis of the considerations above.

  11. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  12. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
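    The underlying sampling trick is inverse-transform sampling: once the quantile function Q is approximated from the sorted sample, Q(U) with U uniform on (0, 1) reproduces the sample's distribution. A sketch using piecewise-linear interpolation as a stand-in for the paper's cubic B-spline and rational-spline fits:

    ```python
    import random
    import statistics

    random.seed(11)

    def make_quantile_fn(sample):
        """Piecewise-linear interpolant of the empirical quantile function
        (a simple stand-in for cubic B-spline / rational-spline approximations)."""
        xs = sorted(sample)
        n = len(xs)
        # plotting positions u_i = (i + 0.5)/n for the sorted order statistics
        us = [(i + 0.5) / n for i in range(n)]

        def q(u):
            if u <= us[0]:
                return xs[0]
            if u >= us[-1]:
                return xs[-1]
            for i in range(n - 1):                 # find the bracketing pair
                if us[i] <= u <= us[i + 1]:
                    t = (u - us[i]) / (us[i + 1] - us[i])
                    return xs[i] + t * (xs[i + 1] - xs[i])
        return q

    data = [random.gauss(0.0, 1.0) for _ in range(200)]
    q = make_quantile_fn(data)

    # Inverse-transform sampling: feed uniform variates through the quantile function.
    resampled = [q(random.random()) for _ in range(1000)]
    ```

    A spline fit improves on this sketch in the tails and between knots, which is where the paper reports its accuracy gains for multimodal and skewed samples.
    
    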

  13. Sampling and Analysis Plan for Ground-Water Monitoring of Wells Near the Metropolitan Utilities District’s Platte River West Well Field Near Wann, Nebraska: Part I, Field Sampling Plan and Part II, Quality Assurance Project Plan

    DTIC Science & Technology

    2005-01-01

    Sample documentation, such as bottle lot numbers as received from supplier; • Sample transportation information, including the name of the...stabilization is achieved; • Sample preservation procedures, including the HCL lot numbers ; and • For QC blank samples, the manufacturer’s lot numbers of the...obtain from OWQRL for Hach acid cartridges of certain lot numbers — default value is 1.00) Vs = volume of sample, in milliliters For samples with pH

  14. Probing methane hydrate nucleation through the forward flux sampling method.

    PubMed

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, by combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of the topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows explicitly computing hydrate nucleation rates and obtaining an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution for the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate.
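    The FFS machinery itself is easiest to see on a toy system (a biased 1D random walk standing in for the hydrate order parameter λ, not the paper's mW model): measure the flux of trajectories out of basin A through the first interface, then estimate the conditional probability of advancing from each interface to the next before falling back, and multiply to obtain the rate of the rare A-to-B transition.

    ```python
    import random

    random.seed(5)

    # Toy dynamics: step +1 with probability 0.45, else -1, so "B" at x = 15 is rare.
    P_UP = 0.45
    INTERFACES = [5, 10, 15]     # lambda_1 .. lambda_n, with basin A at x = 0

    def walk_step(x):
        return x + 1 if random.random() < P_UP else x - 1

    # Stage 0: long run in basin A to measure the flux through the first interface
    # and harvest crossing configurations.
    x, crossings, stored, above = 0, 0, [], False
    STEPS = 200_000
    for _ in range(STEPS):
        x = walk_step(x)
        if x < 0:
            x = 0                      # reflecting wall keeps the walker in the basin
        if x >= INTERFACES[0] and not above:
            crossings += 1
            stored.append(x)           # configuration at the first crossing
            above = True
        elif x < INTERFACES[0]:
            above = False
    flux0 = crossings / STEPS

    # Stages 1..n-1: from each interface, estimate P(reach next interface before A).
    probs = []
    for lam_next in INTERFACES[1:]:
        successes, trials, next_stored = 0, 400, []
        for _ in range(trials):
            x = random.choice(stored)  # fire a trial from a stored configuration
            while 0 < x < lam_next:
                x = walk_step(x)
            if x >= lam_next:
                successes += 1
                next_stored.append(x)
        probs.append(successes / trials)
        stored = next_stored

    rate = flux0                       # rate = flux * product of conditional probabilities
    for p in probs:
        rate *= p
    ```

    The abstract's convergence caveat maps onto the `stored` lists here: if the harvested configurations at an interface are not representative, every later stage inherits the bias.
    
    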

  15. Analytical results, database management and quality assurance for analysis of soil and groundwater samples collected by cone penetrometer from the F and H Area seepage basins

    SciTech Connect

    Boltz, D.R.; Johnson, W.H.; Serkiz, S.M.

    1994-10-01

    The Quantification of Soil Source Terms and Determination of the Geochemistry Controlling Distribution Coefficients (Kd values) of Contaminants at the F- and H-Area Seepage Basins (FHSB) study was designed to generate site-specific contaminant transport factors for contaminated groundwater downgradient of the Basins. The experimental approach employed in this study was to collect soil and its associated porewater from contaminated areas downgradient of the FHSB. Samples were collected over a wide range of geochemical conditions (e.g., pH, conductivity, and contaminant concentration) and were used to describe the partitioning of contaminants between the aqueous phase and soil surfaces at the site. The partitioning behavior may be used to develop site-specific transport factors. This report summarizes the analytical procedures and results for both soil and porewater samples collected as part of this study and the database management of these data.

  16. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
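    The "basic" two-stage scheme can be simulated to show the trade-off the study quantifies: nearly the same pass/fail classification as the fixed-size scheme, at a much smaller average sample size. The sample sizes and decision thresholds below are hypothetical stand-ins, not the Welfare Quality values or the study's actual rules.

    ```python
    import random

    random.seed(9)

    FIXED_N = 60          # stand-in for the Welfare Quality herd-size-based sample
    THRESH = 0.15         # hypothetical pass/fail prevalence threshold

    def sample_lame(p, n):
        """Score n cows on a farm with true lameness prevalence p."""
        return sum(1 for _ in range(n) if random.random() < p)

    def fixed_scheme(p):
        return sample_lame(p, FIXED_N) / FIXED_N > THRESH, FIXED_N

    def basic_sequential(p):
        half = FIXED_N // 2
        lame = sample_lame(p, half)
        prev = lame / half
        if prev >= 0.25:                  # clearly bad: stop early (assumed cut-off)
            return True, half
        if prev <= 0.05:                  # clearly good: stop early (assumed cut-off)
            return False, half
        lame += sample_lame(p, half)      # otherwise score the second half
        return lame / FIXED_N > THRESH, FIXED_N

    # Simulate farms with well-separated low and high lameness prevalences.
    farms = [0.02] * 500 + [0.35] * 500
    agree, total_n = 0, 0
    for p in farms:
        bad_fixed, _ = fixed_scheme(p)
        bad_seq, n = basic_sequential(p)
        agree += bad_fixed == bad_seq
        total_n += n

    avg_n = total_n / len(farms)          # well below FIXED_N thanks to early stopping
    agreement = agree / len(farms)
    ```

    The "cautious" and severity-aware schemes in the paper refine the early-stopping rules; the structure of the simulation stays the same.
    
    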

  17. Healthcare Software Assurance

    PubMed Central

    Cooper, Jason G.; Pauley, Keith A.

    2006-01-01

    Software assurance is a rigorous, lifecycle phase-independent set of activities which ensure completeness, safety, and reliability of software processes and products. This is accomplished by guaranteeing conformance to all requirements, standards, procedures, and regulations. These assurance processes are even more important when coupled with healthcare software systems, embedded software in medical instrumentation, and other healthcare-oriented life-critical systems. The current Food and Drug Administration (FDA) regulatory requirements and guidance documentation do not address certain aspects of complete software assurance activities. In addition, the FDA’s software oversight processes require enhancement to include increasingly complex healthcare systems such as Hospital Information Systems (HIS). The importance of complete software assurance is introduced, current regulatory requirements and guidance discussed, and the necessity for enhancements to the current processes shall be highlighted. PMID:17238324

  18. EMC Compliance Assurance Monitoring

    EPA Pesticide Factsheets

    The Compliance Assurance Monitoring, or CAM, rule is designed to satisfy the requirements for monitoring and compliance certification in the Part 70 operating permits program and Title VII of the 1990 Clean Air Act Amendments

  19. The evaluation methods of sampling rate performance in GNSS receiver

    NASA Astrophysics Data System (ADS)

    Ke, Ting; Hu, Xiulin; Liu, Yuqi; Ran, Yihang

    2009-12-01

This paper investigates the effect of sampling rate on the time discrimination of PRN code in GNSS, proposes an innovative performance evaluation criterion for the actual time discrimination of the noncommensurate sampling technique, and develops an algorithm to obtain this criterion quickly. Computer simulations verify the correctness of the proposed fast algorithm. The algorithm can be adopted in all PRN-code-ranging-based applications to choose the "better" sampling rate, which can achieve better time discrimination performance at a lower sampling rate.

  20. Processes and procedures for a worldwide biological samples distribution; product assurance and logistic activities to support the mice drawer system tissue sharing event

    NASA Astrophysics Data System (ADS)

    Benassai, Mario; Cotronei, Vittorio

The Mice Drawer System (MDS) is a scientific payload developed by the Italian Space Agency (ASI); it hosted 6 mice on the International Space Station (ISS) and re-entered on November 28, 2009 with STS-129 at KSC. Linked to the MDS experiment, a Tissue Sharing Program (TSP) was developed in order to make available to 16 Payload Investigators (PI) (located in the USA, Canada, the EU - Italy, Belgium and Germany - and Japan) the biological samples coming from the mice. ALTEC SpA (a PPP owned by ASI, TAS-I and local institutions) was responsible for supporting the logistics aspects of the MDS samples for the first MDS mission, in the frame of the Italian Space Agency (ASI) OSMA program (OSteoporosis and Muscle Atrophy). The TSP resulted in a complex scenario, as ASI progressively extended the original OSMA Team to researchers from other ASI programs and from other Agencies (ESA, NASA, JAXA). The science coordination was performed by the University of Genova (UNIGE). ALTEC managed the whole logistic process with the support of a specialized freight forwarder agent during all shipping operation phases. ALTEC formalized all the steps from the handover of samples by the dissection Team to the packaging and shipping process in a dedicated procedure. ALTEC approached all the work in a structured way, performing: A study of the aspects connected to international shipments of biological samples. A cooperative work with UNIGE/ASI/PIs to identify all the needs of the various researchers and their compatibility. A complete revision and integration of shipment requirements (addresses, temperatures, samples, materials and so on). A complete definition of the final shipment scenario in terms of boxes, content, refrigerant and requirements. A formal approach to identification and selection of the most suited and specialized Freight Forwarder. A clear identification of all the processes from sample dissection by PI Team, sample processing, freezing, tube preparation

  1. Software Safety Assurance of Programmable Logic

    NASA Technical Reports Server (NTRS)

    Berens, Kalynnda

    2002-01-01

Programmable Logic (PLC, FPGA, ASIC) devices are hybrids - hardware devices that are designed and programmed like software. As such, they fall in an assurance gray area. Programmable Logic is usually tested and verified as hardware, and the software aspects are ignored, potentially leading to safety or mission success concerns. The objective of this proposal is to first determine where and how Programmable Logic (PL) is used within NASA and document the current methods of assurance. Once that is known, raise awareness of the PL software aspects within the NASA engineering community and provide guidance for the use and assurance of PL from a software perspective.

  3. Final Technical Plan, Including the Final Sampling and Analysis Plan, Final Quality Assurance Project Plan, Fort Douglas, Environmental Investigation/Alternatives Analysis

    DTIC Science & Technology

    1991-09-01

Fort Douglas were hastily constructed primarily of logs or adobe. In the 1870s, most of the original buildings were replaced with locally quarried red...detected or suspected, you must report releases to the Executive Secretary, Solid and Hazardous Wastes Committee at 801-538-6170. If...silica gel, the extract is analyzed by infrared spectrophotometry. Proposed Soil Method and Reference: The proposed method of analysis for soil is

  4. RAVEN Quality Assurance Activities

    SciTech Connect

    Cogliati, Joshua Joseph

    2015-09-01

    This report discusses the quality assurance activities needed to raise the Quality Level of Risk Analysis in a Virtual Environment (RAVEN) from Quality Level 3 to Quality Level 2. This report also describes the general RAVEN quality assurance activities. For improving the quality, reviews of code changes have been instituted, more parts of testing have been automated, and improved packaging has been created. For upgrading the quality level, requirements have been created and the workflow has been improved.

  5. Performance assurance program plan

    SciTech Connect

    Rogers, B.H.

    1997-11-06

B and W Protec, Inc. (BWP) is responsible for implementing the Performance Assurance Program for the Project Hanford Management Contract (PHMC) in accordance with DOE Order 470.1, Safeguards and Security Program (DOE 1995a). The Performance Assurance Program applies to safeguards and security (SAS) systems and their essential components (equipment, hardware, administrative procedures, Protective Force personnel, and other personnel) in direct support of Category I and II special nuclear material (SNM) protection. Performance assurance includes several Hanford Site activities that conduct performance, acceptance, operability, effectiveness, and validation tests. These activities encompass areas of training, exercises, quality assurance, conduct of operations, total quality management, self assessment, classified matter protection and control, emergency preparedness, and corrective actions tracking and trending. The objective of the Performance Assurance Program is to capture the critical data of the tests, training, etc., in a cost-effective, manageable program that reflects the overall effectiveness of the program while minimizing operational impacts. To aid in achieving this objective, BWP will coordinate the Performance Assurance Program for Fluor Daniel Hanford, Inc. (FDH) and serve as the central point for data collection.

  6. Developing new extension of GafChromic RTQA2 film to patient quality assurance field using a plan-based calibration method

    NASA Astrophysics Data System (ADS)

    Peng, Jiayuan; Zhang, Zhen; Wang, Jiazhou; Xie, Jiang; Chen, Junchao; Hu, Weigang

    2015-10-01

GafChromic RTQA2 film is a type of radiochromic film designed for light field and radiation field alignment. The aim of this study is to extend the application of RTQA2 film to the measurement of patient-specific quality assurance (QA) fields as a 2D relative dosimeter. Pre-irradiated and post-irradiated RTQA2 films were scanned in reflection mode using a flatbed scanner. A plan-based calibration (PBC) method utilized the mapping information of the calculated dose image and the film grayscale image to create a dose versus pixel value calibration model. This model was used to calibrate the film grayscale image to the film relative dose image. The dose agreement between calculated and film dose images was analyzed by gamma analysis. To evaluate the feasibility of this method, eight clinically approved RapidArc cases (one abdominal cancer and seven head-and-neck cancer patients) were tested using this method. Moreover, three MLC gap errors and two MLC transmission errors were introduced to the eight RapidArc cases, respectively, to test the robustness of this method. The PBC method could overcome the film lot and post-exposure time variations of RTQA2 film to obtain a good 2D relative dose calibration result. The mean gamma passing rate of the eight patients was 97.90% ± 1.7%, which showed good dose consistency between calculated and film dose images. In the error test, the PBC method could over-calibrate the film, meaning some dose error in the film would be falsely corrected to keep the film dose consistent with the calculated dose image. This would then lead to a false negative result in the gamma analysis. In these cases, the derivative of the dose calibration curve would be non-monotonic, which would expose the dose abnormality. By using the PBC method, we extended the application of the more economical RTQA2 film to patient-specific QA. The robustness of the PBC method has been improved by analyzing the monotonicity of the derivative of the
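
The monotonicity check on the calibration curve's derivative, which the authors use to expose over-calibration, can be sketched as follows. The curve shapes, forward-difference derivative, and tolerance are illustrative assumptions, not the study's implementation:

```python
import math

def derivative_is_monotonic(pixel_values, doses, tol=1e-9):
    """Check whether the derivative of a dose-vs-pixel-value calibration
    curve is monotonic; a non-monotonic derivative flags a possible dose
    abnormality in the spirit of the PBC robustness test."""
    pairs = sorted(zip(pixel_values, doses))
    # Forward-difference estimate of dDose/dPixel between adjacent points.
    deriv = [(d2 - d1) / (p2 - p1)
             for (p1, d1), (p2, d2) in zip(pairs, pairs[1:])]
    diffs = [b - a for a, b in zip(deriv, deriv[1:])]
    return all(x >= -tol for x in diffs) or all(x <= tol for x in diffs)

pv = [0.2 + 0.8 * i / 19 for i in range(20)]
smooth = [10 * (1 - p) ** 2 for p in pv]                 # well-behaved curve
wiggly = [s + math.sin(40 * p) for s, p in zip(smooth, pv)]  # local abnormality
print(derivative_is_monotonic(pv, smooth), derivative_is_monotonic(pv, wiggly))
```

A smooth calibration curve passes the check, while a curve with a local wiggle fails it, exposing the kind of dose abnormality that gamma analysis alone might miss after over-calibration.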

  7. Developing new extension of GafChromic RTQA2 film to patient quality assurance field using a plan-based calibration method.

    PubMed

    Peng, Jiayuan; Zhang, Zhen; Wang, Jiazhou; Xie, Jiang; Chen, Junchao; Hu, Weigang

    2015-10-07

GafChromic RTQA2 film is a type of radiochromic film designed for light field and radiation field alignment. The aim of this study is to extend the application of RTQA2 film to the measurement of patient-specific quality assurance (QA) fields as a 2D relative dosimeter. Pre-irradiated and post-irradiated RTQA2 films were scanned in reflection mode using a flatbed scanner. A plan-based calibration (PBC) method utilized the mapping information of the calculated dose image and the film grayscale image to create a dose versus pixel value calibration model. This model was used to calibrate the film grayscale image to the film relative dose image. The dose agreement between calculated and film dose images was analyzed by gamma analysis. To evaluate the feasibility of this method, eight clinically approved RapidArc cases (one abdominal cancer and seven head-and-neck cancer patients) were tested using this method. Moreover, three MLC gap errors and two MLC transmission errors were introduced to the eight RapidArc cases, respectively, to test the robustness of this method. The PBC method could overcome the film lot and post-exposure time variations of RTQA2 film to obtain a good 2D relative dose calibration result. The mean gamma passing rate of the eight patients was 97.90% ± 1.7%, which showed good dose consistency between calculated and film dose images. In the error test, the PBC method could over-calibrate the film, meaning some dose error in the film would be falsely corrected to keep the film dose consistent with the calculated dose image. This would then lead to a false negative result in the gamma analysis. In these cases, the derivative of the dose calibration curve would be non-monotonic, which would expose the dose abnormality. By using the PBC method, we extended the application of the more economical RTQA2 film to patient-specific QA. The robustness of the PBC method has been improved by analyzing the monotonicity of the derivative of the calibration

  8. Biocatalytic spectrophotometric method to detect paracetamol in water samples.

    PubMed

    Méndez-Albores, Alia; Tarín, Cristina; Rebollar-Pérez, Georgette; Dominguez-Ramirez, Lenin; Torres, Eduardo

    2015-01-01

A biocatalytic methodology for the indirect determination of paracetamol in drinking water has been developed, based on quantifying the inhibition of laccase during the oxidation of the standard substrate ABTS (2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid)). The method displayed a fast response time (20 s) and high selectivity to paracetamol in the presence of interfering substances such as naproxen, estradiol, ketoprofen, sulfamethoxazole, and diclofenac. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.55 µM and 8.3 µM, respectively. By comparing the catalytic constants KM and kcat for ABTS oxidation in the absence and presence of various concentrations of paracetamol, a competitive-type inhibition was revealed. Moreover, the closeness of Ki to KM indicates similar binding affinities of the enzyme for ABTS and paracetamol, corroborated by docking studies. The methodology was successfully applied to real water samples, presenting interesting potential for the further development of a biosensor for paracetamol detection.
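
The competitive-type inhibition identified above follows the standard Michaelis-Menten form, in which the inhibitor raises the apparent KM while leaving Vmax unchanged. A minimal sketch, using illustrative constants rather than the study's fitted values:

```python
def competitive_rate(s, i, vmax, km, ki):
    """Michaelis-Menten rate under competitive inhibition:
    v = Vmax*[S] / (KM*(1 + [I]/Ki) + [S]).
    The inhibitor raises the apparent KM; Vmax is unchanged."""
    km_app = km * (1 + i / ki)
    return vmax * s / (km_app + s)

# Illustrative constants (not the study's values); Ki ~ KM, matching the
# reported similar binding affinities of the enzyme for ABTS and paracetamol.
vmax, km, ki = 1.0, 50.0, 50.0
v0 = competitive_rate(100.0, 0.0, vmax, km, ki)        # no inhibitor
v_inhib = competitive_rate(100.0, 100.0, vmax, km, ki)  # with inhibitor
print(round(v0, 4), round(v_inhib, 4))
```

The drop in rate at a fixed substrate concentration is what the assay reads out; at saturating substrate both rates converge to Vmax, the hallmark of competitive inhibition.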

  9. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    PubMed

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®) 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®) Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. A cleaning method to minimize contaminant luminescence signal of empty sample carriers using off-the-shelf chemical agents.

    PubMed

    Kazakis, Nikolaos A; Kitis, George; Tsirliganis, Nestor C

    2014-11-01

    Signals acquired during thermoluminescence or optically stimulated luminescence measurements must be completely free of any spurious and/or contamination signals to assure the credibility of the results, especially during exploratory research investigating the luminescence behavior of new materials. Experiments indicate that such unwanted signals may also stem from new (unused) and used empty sample carriers, namely cups and discs, which are widely used for such measurements, probably due to contamination from a fluorite and/or silica-related source. Fluorite and/or silicone oil appear to be the most likely sources of contamination, thus, their removal, along with any other possible source that exhibits undesirable luminescence behavior, is necessary. Conventional cleaning methods fail to eliminate such contaminants from empty cups and discs. In this work a new cleaning method is proposed incorporating off-the-shelf chemical agents. Results of thermoluminescence measurements highlight the efficiency of the new cleaning process, since it can completely remove any observed contaminants from both new and used sample carriers, of various shapes and/or materials. Consequently their signal is minimized even at relatively high beta-doses, where it is prominent, resulting in a clean and only sample-attributed signal.

  11. Assuring Life in Composite Systems

    NASA Technical Reports Server (NTRS)

Chamis, Christos C.

    2008-01-01

    A computational simulation method is presented to assure life in composite systems by using dynamic buckling of smart composite shells as an example. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 9% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivities results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load. The uncertainties in the electric field strength and smart material volume fraction have moderate effects and thereby in the assured life of the shell.

  12. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  13. A comparison of four gravimetric fine particle sampling methods.

    PubMed

    Yanosky, J D; MacIntosh, D L

    2001-06-01

    A study was conducted to compare four gravimetric methods of measuring fine particle (PM2.5) concentrations in air: the BGI, Inc. PQ200 Federal Reference Method PM2.5 (FRM) sampler; the Harvard-Marple Impactor (HI); the BGI, Inc. GK2.05 KTL Respirable/Thoracic Cyclone (KTL); and the AirMetrics MiniVol (MiniVol). Pairs of FRM, HI, and KTL samplers and one MiniVol sampler were collocated and 24-hr integrated PM2.5 samples were collected on 21 days from January 6 through April 9, 2000. The mean and standard deviation of PM2.5 levels from the FRM samplers were 13.6 and 6.8 microg/m3, respectively. Significant systematic bias was found between mean concentrations from the FRM and the MiniVol (1.14 microg/m3, p = 0.0007), the HI and the MiniVol (0.85 microg/m3, p = 0.0048), and the KTL and the MiniVol (1.23 microg/m3, p = 0.0078) according to paired t test analyses. Linear regression on all pairwise combinations of the sampler types was used to evaluate measurements made by the samplers. None of the regression intercepts was significantly different from 0, and only two of the regression slopes were significantly different from 1, that for the FRM and the MiniVol [beta1 = 0.91, 95% CI (0.83-0.99)] and that for the KTL and the MiniVol [beta1 = 0.88, 95% CI (0.78-0.98)]. Regression R2 terms were 0.96 or greater between all pairs of samplers, and regression root mean square error terms (RMSE) were 1.65 microg/m3 or less. These results suggest that the MiniVol will underestimate measurements made by the FRM, the HI, and the KTL by an amount proportional to PM2.5 concentration. Nonetheless, these results indicate that all of the sampler types are comparable if approximately 10% variation on the mean levels and on individual measurement levels is considered acceptable and the actual concentration is within the range of this study (5-35 microg/m3).
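
The study's comparison of collocated samplers rests on two standard computations: the mean paired difference (systematic bias) and an ordinary least-squares fit y = b0 + b1·x between sampler pairs. A minimal sketch, with synthetic PM2.5-like values standing in for the study's measurements:

```python
import statistics as st

def paired_bias_and_regression(x, y):
    """Compare two collocated samplers: mean paired difference
    (systematic bias) plus an OLS fit y = b0 + b1*x."""
    bias = st.mean(a - b for a, b in zip(x, y))
    mx, my = st.mean(x), st.mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    b1 = sxy / sxx                 # regression slope
    b0 = my - b1 * mx              # regression intercept
    return bias, b0, b1

# Synthetic 24-hr PM2.5 pairs (microg/m3), reference vs. low-cost sampler.
frm =     [8.0, 10.5, 13.6, 17.2, 21.0, 25.4, 30.1]
minivol = [7.2,  9.6, 12.3, 15.6, 19.1, 23.2, 27.5]
bias, b0, b1 = paired_bias_and_regression(frm, minivol)
print(f"bias={bias:.2f}, intercept={b0:.2f}, slope={b1:.2f}")
```

A slope below 1 with an intercept near 0 is the pattern the study reports: the MiniVol underestimates the FRM by an amount proportional to concentration.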

  14. 30 CFR 14.8 - Quality assurance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... order to assure that the finished conveyor belt will meet the flame-resistance test— (1) Flame test a sample of each batch, lot, or slab of conveyor belts; or (2) Flame test or inspect a sample of each batch or lot of the materials that contribute to the flame-resistance characteristic. (b)...

  15. Evaluation of sample preservation methods for space mission

    NASA Technical Reports Server (NTRS)

    Schubert, W.; Rohatgi, N.; Kazarians, G.

    2002-01-01

For interplanetary spacecraft that will travel to destinations where future life-detection experiments may be conducted or samples are to be returned to Earth, relevant samples from the spacecraft and cleanrooms should be archived and preserved for evaluation at a future date.

  16. [Current methods for preparing samples on working with hematology analyzers].

    PubMed

    Tsyganova, A V; Pogorelov, V M; Naumova, I N; Kozinets, G I; Antonov, V S

    2011-03-01

The paper raises the problem of preparing samples in hematology. It considers the importance of the preanalytical stage in hematological studies. The use of disposable vacuum blood collection systems is shown to solve the problem of standardizing the blood sampling procedure. The benefits of using closed-tube hematology analyzers are also considered.

  17. An automated method of sample preparation of biofluids using pierceable caps to eliminate the uncapping of the sample tubes during sample transfer.

    PubMed

    Teitz, D S; Khan, S; Powell, M L; Jemal, M

    2000-09-11

    Biological samples are normally collected and stored frozen in capped tubes until analysis. To obtain aliquots of biological samples for analysis, the sample tubes have to be thawed, uncapped, samples removed and then recapped for further storage. In this paper, we report an automated method of sample transfer devised to eliminate the uncapping and recapping process. This sampling method was incorporated into an automated liquid-liquid extraction procedure of plasma samples. Using a robotic system, the plasma samples were transferred directly from pierceable capped tubes into microtubes contained in a 96-position block. The aliquoted samples were extracted with methyl-tert-butyl ether in the same microtubes. The supernatant organic layers were transferred to a 96-well collection plate and evaporated to dryness. The dried extracts were reconstituted and injected from the same plate for analysis by liquid chromatography with tandem mass spectrometry.

  18. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    PubMed

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.

  19. Validated Test Method 5030C: Purge-and-Trap for Aqueous Samples

    EPA Pesticide Factsheets

This method describes a purge-and-trap procedure for the analysis of volatile organic compounds in aqueous samples and water-miscible liquid samples. It also describes the analysis of high-concentration soil and waste sample extracts prepared in Method 5035.

  20. Stability of nitrate-ion concentrations in simulated deposition samples used for quality-assurance activities by the U.S. Geological Survey

    USGS Publications Warehouse

    Willoughby, T.C.; See, R.B.; Schroder, L.J.

    1989-01-01

    Three experiments were conducted to determine the stability of nitrate-ion concentrations in simulated deposition samples. In the four experiment-A solutions, nitric acid provided nitrate-ion concentrations ranging from 0.6 to 10.0 mg/L and that had pH values ranging from 3.8 to 5.0. In the five experiment-B solutions, sodium nitrate provided nitrate-ion concentrations ranging from 0.5 to 3.0 mg/L. The pH was adjusted to about 4.5 for each of the solutions by addition of sulfuric acid. In the four experiment-C solutions, nitric acid provided nitrate-ion concentrations ranging from 0.5 to 3.0 mg/L. Major cation and anion concentrations were added to each solution to simulate natural deposition. Aliquots were removed from the 13 original solutions and analyzed by ion chromatography about once a week for 100 days to determine if any changes occurred in nitrate-ion concentrations throughout the study period. No substantial changes were observed in the nitrate-ion concentrations in solutions that had initial concentrations below 4.0 mg/L in experiments A and B, although most of the measured nitrate-ion concentrations for the 100-day study were below the initial concentrations. In experiment C, changes in nitrate-ion concentrations were much more pronounced; the measured nitrate-ion concentrations for the study period were less than the initial concentrations for 62 of the 67 analyses. (USGS)
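
A stability check of the kind used in this study, comparing each weekly measurement against the initial concentration, might be sketched as follows. The series and tolerance below are invented for illustration, not the USGS data:

```python
def stability_summary(initial, measurements, tol=0.10):
    """Summarize a stability time series: fraction of measurements below
    the initial concentration, and whether any measurement drifts more
    than `tol` (as a fraction) from the initial value."""
    below = sum(m < initial for m in measurements) / len(measurements)
    max_drift = max(abs(m - initial) / initial for m in measurements)
    return below, max_drift, max_drift <= tol

# Hypothetical nitrate-ion concentrations (mg/L) measured over a study
# period, against an initial concentration of 1.0 mg/L.
series = [0.98, 0.97, 0.99, 0.96, 0.98, 0.97, 1.00, 0.95]
below, drift, stable = stability_summary(1.0, series)
print(f"{below:.2f} below initial, max drift {drift:.2%}, stable={stable}")
```

This mirrors the study's two observations: most measurements fall below the initial concentration, yet the overall drift stays small enough to call the solutions stable.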